algorithmic bias

42 results


pages: 370 words: 112,809

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future by Orly Lobel

2021 United States Capitol attack, 23andMe, Ada Lovelace, affirmative action, Airbnb, airport security, Albert Einstein, algorithmic bias, Amazon Mechanical Turk, augmented reality, barriers to entry, basic income, Big Tech, bioinformatics, Black Lives Matter, Boston Dynamics, Charles Babbage, choice architecture, computer vision, Computing Machinery and Intelligence, contact tracing, coronavirus, corporate social responsibility, correlation does not imply causation, COVID-19, crowdsourcing, data science, David Attenborough, David Heinemeier Hansson, deep learning, deepfake, digital divide, digital map, Elon Musk, emotional labour, equal pay for equal work, feminist movement, Filter Bubble, game design, gender pay gap, George Floyd, gig economy, glass ceiling, global pandemic, Google Chrome, Grace Hopper, income inequality, index fund, information asymmetry, Internet of things, invisible hand, it's over 9,000, iterative process, job automation, Lao Tzu, large language model, lockdown, machine readable, machine translation, Mark Zuckerberg, market bubble, microaggression, Moneyball by Michael Lewis explains big data, natural language processing, Netflix Prize, Network effects, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, occupational segregation, old-boy network, OpenAI, openstreetmap, paperclip maximiser, pattern recognition, performance metric, personalized medicine, price discrimination, publish or perish, QR code, randomized controlled trial, remote working, risk tolerance, robot derives from the Czech word robota Czech, meaning slave, Ronald Coase, Salesforce, self-driving car, sharing economy, Sheryl Sandberg, Silicon Valley, social distancing, social intelligence, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, surveillance capitalism, tech worker, TechCrunch disrupt, The Future of Employment, TikTok, Turing test, universal basic income, Wall-E, warehouse automation, women in the workforce, work culture , you are the product

To paraphrase the study co-author Joanna Bryson, AI is no fairy godmother—it’s just an extension of our culture.8 So, in the absence of fairy godmothers, researchers are taking the lead in addressing these problems to build better machines. A growing number of computer scientists have committed to making machine learning fairer and more equal and are developing algorithms that would mitigate bias. One type of debiasing algorithm sorts out words that are inherently gendered (such as “daughter,” “mother,” “king,” or “brother”) from those that are not (say, “computer programmer,” “nurse,” “doctor,” or “homemaker”). The algorithm can thereby extract the bias from the data, reducing analogies that create stereotypical connections about traits, abilities, and activities. The programmer can thus constrain the algorithm’s learning process.
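To make the description concrete, here is a minimal sketch of that kind of debiasing, in the spirit of the word-embedding work the passage draws on: estimate a gender direction from explicitly gendered word pairs, then project that component out of words that should be gender-neutral. The dictionary `vectors`, the function names, and the word lists are illustrative assumptions, not the researchers' actual code.

import numpy as np

def gender_direction(vectors, pairs=(("she", "he"), ("woman", "man"), ("daughter", "son"))):
    """Average the difference vectors of explicitly gendered word pairs."""
    diffs = [vectors[a] - vectors[b] for a, b in pairs]
    d = np.mean(diffs, axis=0)
    return d / np.linalg.norm(d)

def neutralize(vectors, neutral_words, direction):
    """Remove the gender component from words that should not carry gender."""
    debiased = dict(vectors)
    for word in neutral_words:
        v = vectors[word]
        debiased[word] = v - np.dot(v, direction) * direction
    return debiased

# Illustrative usage: occupation words lose their gender component, while
# inherently gendered words ("mother", "king") are left untouched.
# debiased = neutralize(vectors, ["programmer", "nurse", "doctor", "homemaker"],
#                       gender_direction(vectors))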

When the application of technology fails—as with racially biased facial recognition algorithms, for instance—these failures can be better exposed than past wrongs that were hidden from sight and not recorded. And failures can often be better corrected than our human fallibilities. We can scale success and learn from experimentation. Throughout the book, we will uncover so much to celebrate: tech communities—in research, business, and the public sector—developing algorithms that detect bias and discrimination in everyday workplace and social settings; software designed to help employers close pay gaps; bots that detect early signs of a propensity to harass and allow victims to report harassment anonymously; digital platforms transforming the images used in media, politics, and marketing to foster more diverse and empowering representations of women and people of color; a lively feminist debate on whether and how sex bots can mitigate trafficking and sex crimes and liberate desire and difference; digital health data collection and analysis that can reverse thousands of years of biased research; and so much more.

In a study published in Science, a machine trained to read through Google News articles learned to make associations between words. Without being guided in any way, the program came to associate male and female names with different types of careers and different kinds of emotions.7 Bias creeps into algorithms in this way because bias is baked into the language of our culture—because our societies are unequal. Once a machine is trained—namely, once it has read through thousands of news articles—it will exhibit stereotypes to a troubling degree. It will complete the sentences it’s fed in a discriminatory way, stereotypically predicting professions, behaviors, and treatment.
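The kind of association the study measured can be quantified roughly as follows: compare how strongly one set of target words (say, female names) leans toward one attribute set (family words) versus another (career words), relative to a second set of targets. This is a loose sketch of a WEAT-style test, not the study's actual code; `vectors` is an assumed word-embedding lookup and the word lists are placeholders.

import numpy as np

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def association(word, attrs_a, attrs_b, vectors):
    """Mean similarity to attribute set A minus mean similarity to attribute set B."""
    sim_a = np.mean([cosine(vectors[word], vectors[a]) for a in attrs_a])
    sim_b = np.mean([cosine(vectors[word], vectors[b]) for b in attrs_b])
    return sim_a - sim_b

def association_gap(targets_x, targets_y, attrs_a, attrs_b, vectors):
    """Positive values mean targets X lean toward A and targets Y toward B."""
    x = np.mean([association(w, attrs_a, attrs_b, vectors) for w in targets_x])
    y = np.mean([association(w, attrs_a, attrs_b, vectors) for w in targets_y])
    return x - y

# Illustrative usage with placeholder word lists:
# gap = association_gap(["amy", "lisa"], ["john", "paul"],
#                       ["home", "family"], ["career", "salary"], vectors)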


pages: 346 words: 97,890

The Road to Conscious Machines by Michael Wooldridge

Ada Lovelace, AI winter, algorithmic bias, AlphaGo, Andrew Wiles, Anthropocene, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, Bletchley Park, Boeing 747, British Empire, call centre, Charles Babbage, combinatorial explosion, computer vision, Computing Machinery and Intelligence, DARPA: Urban Challenge, deep learning, deepfake, DeepMind, Demis Hassabis, don't be evil, Donald Trump, driverless car, Elaine Herzberg, Elon Musk, Eratosthenes, factory automation, fake news, future of work, gamification, general purpose technology, Geoffrey Hinton, gig economy, Google Glasses, intangible asset, James Watt: steam engine, job automation, John von Neumann, Loebner Prize, Minecraft, Mustafa Suleyman, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, P = NP, P vs NP, paperclip maximiser, pattern recognition, Philippa Foot, RAND corporation, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stephen Hawking, Steven Pinker, strong AI, technological singularity, telemarketer, Tesla Model S, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, trolley problem, Turing machine, Turing test, universal basic income, Von Neumann architecture, warehouse robotics

But the good news is that at least there are signs that governments are willing to try. Algorithmic Bias We might hope that AI systems would be free of the prejudices and biases that plague the human world, but I’m afraid that is not the case. Over the past decade, as machine learning systems have been rolled out into more and more areas of application, we have begun to understand how automated decision-making systems can exhibit algorithmic bias. It is now a major research field, with many groups struggling to understand the problems it raises, and how to avoid it. Algorithmic bias, as the name suggests, is concerned with situations in which a computer program – not just AI systems, but any computer program – exhibits bias of some form in its decision-making.

This takes us into a consideration of how AI will affect the nature of work, and the possibility that AI-governed employment will be alienating. This, in turn, leads us to consider the impact that the use of AI technologies might have on human rights, and the possibility of lethal autonomous weapons. We’ll then consider the emergence of algorithmic bias, and the issues surrounding lack of diversity in AI, as well as the phenomena of fake news and fake AI. Employment and Unemployment ‘Robots will take our jobs. We’d better plan now, before it’s too late.’ –– Guardian, 2018 After the Terminator narrative, probably the most widely discussed and widely feared aspect of AI is how it will affect the future of work and, in particular, the potential it has to put people out of work.

For example, imagine that, in the banking example above, the key feature of the data that you chose to train your program on was … racial origin. Then it would be no surprise if the resulting program made hopelessly biased decisions about who should get a mortgage. (You don’t think a bank would be stupid enough to do something like this, do you? Just you wait.) Algorithmic bias is a particularly prominent issue at present because, as we saw, one feature of the current wave of AI systems is that they are ‘black boxes’: they cannot explain or rationalize the decisions they make in the way that a person can. This problem is exacerbated if we place too much trust in the systems we build – and there is anecdotal evidence that we do exactly that with AI systems.
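The point can be made concrete with a toy example: even when the protected attribute itself is excluded from the training data, a correlated stand-in feature lets the model reproduce the biased pattern. Everything below, including the feature names and the synthetic "historical" approvals, is fabricated purely for illustration and does not model any real lender.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)                    # protected attribute (0 or 1)
postcode = group * 0.9 + rng.normal(0, 0.3, n)   # proxy feature correlated with group
income = rng.normal(50, 10, n)                   # legitimate feature
# Historical approvals depend on income and, unfairly, on group membership.
approved = ((income > 45) & ((group == 1) | (rng.random(n) < 0.3))).astype(int)

X = np.column_stack([income, postcode])          # the group column is excluded
model = LogisticRegression().fit(X, approved)
pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted approval rate {pred[group == g].mean():.2f}")
# The approval-rate gap persists because the proxy feature encodes group membership.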


pages: 345 words: 92,063

Power, for All: How It Really Works and Why It's Everyone's Business by Julie Battilana, Tiziana Casciaro

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, Abraham Maslow, affirmative action, agricultural Revolution, Albert Einstein, algorithmic bias, Andy Rubin, Asperger Syndrome, benefit corporation, Big Tech, BIPOC, Black Lives Matter, blood diamond, Boris Johnson, British Empire, call centre, Cass Sunstein, classic study, clean water, cognitive dissonance, collective bargaining, conceptual framework, coronavirus, COVID-19, CRISPR, deep learning, different worldview, digital rights, disinformation, Elon Musk, Erik Brynjolfsson, fake news, feminist movement, fundamental attribution error, future of work, George Floyd, gig economy, Greta Thunberg, hiring and firing, impact investing, income inequality, informal economy, Intergovernmental Panel on Climate Change (IPCC), invention of movable type, Jeff Bezos, job satisfaction, Joshua Gans and Andrew Leigh, Mahatma Gandhi, means of production, mega-rich, meritocracy, meta-analysis, Milgram experiment, moral hazard, Naomi Klein, Nelson Mandela, Occupy movement, Panopticon Jeremy Bentham, principal–agent problem, profit maximization, Ralph Waldo Emerson, ride hailing / ride sharing, Salesforce, School Strike for Climate, Second Machine Age, shareholder value, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, social distancing, Social Justice Warrior, Social Responsibility of Business Is to Increase Its Profits, Steven Pinker, surveillance capitalism, tacit knowledge, tech worker, the scientific method, The Wisdom of Crowds, TikTok, Tim Cook: Apple, transatlantic slave trade, union organizing, zero-sum game

House Committee on Science, Space, and Technology, she outlined key priorities related to AI, including the need to halt both governmental and commercial use of facial recognition in sensitive social and political contexts until the risks are fully studied and adequate regulations, such as biometric privacy laws and assessments to vet algorithms for bias, are in place.72 The latter will be particularly challenging, not only in terms of deciding where responsibility for assessing algorithms for bias should rest, but also because what constitutes a fair algorithm is a complex question, one that engineers, computer scientists, and legal scholars on the front lines of ethical AI development are asking with increasing urgency.73 But we do know some things about how algorithms function and where we are better equipped to exercise oversight.

The Workers Disagree,” Vox, November 21, 2018, https://www.vox.com/2018/11/21/18105719/google-walkout-real-change-organizers-protest-discrimination-kara-swisher-recode-decode-podcast. 72 Artificial Intelligence, Societal and Ethical Implications, Before the United States House of Representatives Committee on Science, Space, and Technology, 116th Cong. (2019) (statement of Meredith Whittaker, cofounder and codirector of AI Now Institute). 73 For example, Genie Barton, Nicol Turner-Lee, and Paul Resnick, “Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms,” Brookings, May 22, 2019, https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/; Sorelle A. Friedler et al., “A Comparative Study of Fairness Enhancing Interventions in Machine Learning,” in Proceedings of the Conference on Fairness, Accountability, and Transparency (2019): 329–38; Solon Barocas, Moritz Hardt, and Arvind Narayanan, Fairness and Machine Learning (fairmlbook.org, 2019). 74 Cynthia Dwork, “Skewed or Rescued?

In 2021, Australia passed a law requiring social media companies to pay for the journalism appearing on their platforms, despite their protestations—a landmark step toward restoring a measure of power for public-interest journalism.69 Judicial systems are also beginning to hold tech companies to account for algorithmic bias. In a watershed 2021 lawsuit brought by delivery riders against food app Deliveroo, a court in Bologna, Italy, ruled that even if an algorithm discriminates against workers unintentionally, a company can still be held liable and be forced to pay damages.70 It isn’t easy to regulate complex technologies that tend to evolve rapidly, however.


pages: 290 words: 73,000

Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble

A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, Alvin Toffler, Black Lives Matter, borderless world, cloud computing, conceptual framework, critical race theory, crowdsourcing, data science, desegregation, digital divide, disinformation, Donald Trump, Edward Snowden, fake news, Filter Bubble, Firefox, Future Shock, Gabriella Coleman, gamification, Google Earth, Google Glasses, housing crisis, illegal immigration, immigration reform, information retrieval, information security, Internet Archive, Jaron Lanier, John Perry Barlow, military-industrial complex, Mitch Kapor, Naomi Klein, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, PageRank, performance metric, phenotype, profit motive, Silicon Valley, Silicon Valley ideology, Snapchat, the long tail, Tim Cook: Apple, union organizing, women in the workforce, work culture , yellow journalism

Helen Nissenbaum, a professor of media, culture, and communication and computer science at New York University, has written with Lucas Introna, a professor of organization, technology, and ethics at the Lancaster University Management School, about how search engines bias information toward the most powerful online. Their work was corroborated by Alejandro Diaz, who wrote his dissertation at Stanford on sociopolitical bias in Google’s products. Kate Crawford and Tarleton Gillespie, two researchers at Microsoft Research New England, have written extensively about algorithmic bias, and Crawford recently coorganized a summit with the White House and New York University for academics, industry, and activists concerned with the social impact of artificial intelligence in society. At that meeting, I participated in a working group on artificial-intelligence social inequality, where tremendous concern was raised about deep-machine-learning projects and software applications, including concern about furthering social injustice and structural racism.

The library practitioner Matthew Reidsma gave a recent gift to the profession when he blogged about library discovery systems, or search interfaces, that are just as troubled as commercial interfaces. In his blog post, he details the limitations of databases, the kinds of gender biases that are present in discovery tools, and how little innovation has been brought to bear in resolving some of the contradictions we know about.35 Figure 5.2. A call to the profession to address algorithmic bias in library discovery systems by Matthew Reidsma attempts to influence the field of information studies. Source: Reidsma, 2016. I sought to test the call that Reidsma made to the profession to interrogate library information management tools by conducting searches in a key library database.

., and Madden, M. (2015, March). Americans’ Privacy Strategies Post-Snowden. Pew Research Center. Retrieved from www.pewinternet.org. Rajagopal, I., and Bojin, N. (2002). Digital Representation: Racism on the World Wide Web. First Monday, 7(10). Retrieved from www.firstmonday.org. Reidsma, M. (2016, March 11). Algorithmic Bias in Library Discovery Systems. Matthew Reidsma’s blog. Retrieved from http://matthew.reidsrow.com/articles/173. Rifkin, J. (1995). The End of Work: The Decline of the Global Labor Force and the Dawn of the Post-Market Era. New York: Putnam. Rifkin, J. (2000). The Age of Access: The New Culture of Hypercapitalism, Where All of Life Is a Paid-For Experience.


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

"World Economic Forum" Davos, Ada Lovelace, adjacent possible, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, algorithmic management, AlphaGo, Amazon Mechanical Turk, animal electricity, autonomous vehicles, behavioural economics, Black Swan, Brexit referendum, British Empire, Cambridge Analytica, cellular automata, Charles Babbage, citizen journalism, Claude Shannon: information theory, combinatorial explosion, Computing Machinery and Intelligence, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, fake news, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Geoffrey Hinton, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, Linda problem, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, post-truth, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, Stuart Kauffman, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, warehouse robotics, women in the workforce, Yochai Benkler

Understood: Women’s Hearts Are Victims of a System that Is Ill-Equipped to Diagnose, Treat and Support Them: Heart & Stroke 2018 Heart Report. www.heartandstroke.ca/-/media/pdf-files/canada/2018-heart-month/hs_2018-heart-report_en.ashx?la=en&hash=B7E7C6225111EB4AECEE7EE729BFC050E2643082 Chapter 8 1 Frederick S. Lane, 2009, American Privacy: The 400-Year History of Our Most Contested Right. Boston: Beacon Press. 2 Aaron Pressman, 2018, How to Fight the Growing Scourge of Algorithmic Bias in AI. Fortune, http://fortune.com/2018/09/14/fight-algorithmic-bias-joy-buolamwini/ 3 Mark Frauenfelder, 2017, Racist Soap Dispenser. BoingBoing, https://boingboing.net/2017/08/16/racist-soap-dispenser.html 4 Jana Kasperkevic, 2015, Google Says Sorry for Racist Auto-Tag in Photo App. Guardian, www.theguardian.com/technology/2015/jul/01/google-sorry-racist-auto-tag-photo-app 5 Tom Simonite, 2018, When It Comes to Gorillas, Google Photos Remains Blind.

People pick the features; it’s extremely difficult for those people to avoid biases in their selection, and all feature-based models of complex phenomena contain biases. Therefore, the amplification Eubanks suggests is not just about biases in data, it’s about the whole presumption of algorithmic understanding of complex human issues. Algorithmic bias is certainly a huge and hidden civil rights issue, because it will not be the wealthy who will have their job or loan applications assessed by a computer (or their child care, mortgage or medical insurance). It is not the white-collar criminal who will be evaluated en masse by an algorithm for parole, or white-collar suburbs that will be targeted as areas that must be patrolled by police cars and surveillance in a search for potential financial crimes.

While Wikipedia defines ‘fake news’ as a synonym for news satire, more recently it has become at worst a condemnation of real human journalism, and at best an umbrella term for the uncontrolled explosion of misleading information dispersed on social media. Mark Zuckerberg has pledged to make Facebook impede fake news. But it’s unclear how, as in the past his company has eliminated human editors and curators in favour of algorithms, ironically to ensure less bias in their presentation of news. As we’ve seen before, being able to determine what is true and what is fake is not a goal that algorithms can achieve easily. Furthermore, the war on fake news misses the point. Regardless of whether algorithms present ‘true’ or ‘fake’ news, they will still be working towards their primary directive: the maximization of value.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, AlphaGo, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, behavioural economics, Bletchley Park, blockchain, Boston Dynamics, brain emulation, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, complexity theory, computer vision, Computing Machinery and Intelligence, connected car, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, fake news, Flash crash, full employment, future of work, Garrett Hardin, Geoffrey Hinton, Gerolamo Cardano, Goodhart's law, Hans Moravec, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, luminiferous ether, machine readable, machine translation, Mark Zuckerberg, multi-armed bandit, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, OpenAI, openstreetmap, P = NP, paperclip maximiser, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, trolley problem, Turing machine, Turing test, universal basic income, uranium enrichment, vertical integration, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, web application, zero-sum game

Although this sounds admirable in principle, it remains to be seen—at least at the time of writing—how much impact this will have in practice. It is often so much easier, faster, and cheaper to leave the decisions to the machine. One reason for all the concern about automated decisions is the potential for algorithmic bias—the tendency of machine learning algorithms to produce inappropriately biased decisions about loans, housing, jobs, insurance, parole, sentencing, college admission, and so on. The explicit use of criteria such as race in these decisions has been illegal for decades in many countries and is prohibited by Article 9 of the GDPR for a very wide range of applications.

The EU’s GDPR is often said to provide a general “right to an explanation” for any automated decision,38 but the actual language of Article 14 merely requires meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. At present, it is unknown how courts will enforce this clause. It’s possible that the hapless consumer will just be handed a description of the particular deep learning algorithm used to train the classifier that made the decision. Nowadays, the likely causes of algorithmic bias lie in the data rather than in the deliberate malfeasance of corporations. In 2015, Glamour magazine reported a disappointing finding: “The first female Google image search result for ‘CEO’ appears TWELVE rows down—and it’s Barbie.” (There were some actual women in the 2018 results, but most of them were models portraying CEOs in generic stock photos, rather than actual female CEOs; the 2019 results are somewhat better.)

Fortunately, a good deal of attention has been paid to the problem of removing inadvertent bias from machine learning algorithms, and there are now methods that produce unbiased results according to several plausible and desirable definitions of fairness.39 The mathematical analysis of these definitions of fairness shows that they cannot be achieved simultaneously and that, when enforced, they result in lower prediction accuracy and, in the case of lending decisions, lower profit for the lender. This is perhaps disappointing, but at least it makes clear the trade-offs involved in avoiding algorithmic bias. One hopes that awareness of these methods and of the issue itself will spread quickly among policy makers, practitioners, and users. If handing authority over individual humans to machines is sometimes problematic, what about authority over lots of humans? That is, should we put machines in political and management roles?
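For reference, the "plausible and desirable definitions of fairness" at issue are usually formalized along the following lines; these are standard statements from the fairness literature rather than quotations from the book. When the base rates of the outcome differ between groups, calibration and equalized odds cannot both hold for an imperfect predictor, which is one version of the incompatibility described above.

% Notation: \hat{Y} is the decision, Y the true outcome, A the group attribute,
% S the model's score (align* assumes the amsmath package).
\begin{align*}
\text{Demographic parity:} \quad & P(\hat{Y}=1 \mid A=a) = P(\hat{Y}=1 \mid A=b) \\
\text{Equalized odds:} \quad & P(\hat{Y}=1 \mid Y=y, A=a) = P(\hat{Y}=1 \mid Y=y, A=b), \quad y \in \{0,1\} \\
\text{Calibration:} \quad & P(Y=1 \mid S=s, A=a) = P(Y=1 \mid S=s, A=b) \quad \text{for every score } s
\end{align*}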


pages: 161 words: 39,526

Applied Artificial Intelligence: A Handbook for Business Leaders by Mariya Yao, Adelyn Zhou, Marlene Jia

Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, artificial general intelligence, autonomous vehicles, backpropagation, business intelligence, business process, call centre, chief data officer, cognitive load, computer vision, conceptual framework, data science, deep learning, DeepMind, en.wikipedia.org, fake news, future of work, Geoffrey Hinton, industrial robot, information security, Internet of things, iterative process, Jeff Bezos, job automation, machine translation, Marc Andreessen, natural language processing, new economy, OpenAI, pattern recognition, performance metric, price discrimination, randomized controlled trial, recommendation engine, robotic process automation, Salesforce, self-driving car, sentiment analysis, Silicon Valley, single source of truth, skunkworks, software is eating the world, source of truth, sparse data, speech recognition, statistical model, strong AI, subscription business, technological singularity, The future is already here

Finally, the use of technology—including AI, predictive analytics, automation, and social media bots—can have far-ranging social impact. AI can be used for illegal surveillance, propaganda, deception, and social manipulation. * * * (29) Yao, M. (2017, November 2). Fighting Algorithmic Bias & Homogenous Thinking in AI. TopBots. Retrieved from http://www.topbots.com/fighting-homogenous-thinking-algorithmic-bias-ai/ (30) Westervelt, E. (Contributor). (2017, August 18). Did A Bail Reform Algorithm Contribute To This San Francisco Man’s Murder? [Radio Broadcast Episode]. In Carline Watson (Executive Producer), All Things Considered.


pages: 506 words: 133,134

The Lonely Century: How Isolation Imperils Our Future by Noreena Hertz

"Friedman doctrine" OR "shareholder theory", Airbnb, airport security, algorithmic bias, Asian financial crisis, autism spectrum disorder, Bernie Sanders, Big Tech, big-box store, Broken windows theory, call centre, Capital in the Twenty-First Century by Thomas Piketty, car-free, Cass Sunstein, centre right, conceptual framework, Copley Medal, coronavirus, correlation does not imply causation, COVID-19, dark matter, deindustrialization, Diane Coyle, digital divide, disinformation, Donald Trump, driverless car, emotional labour, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fake news, Fellow of the Royal Society, future of work, gender pay gap, gentrification, gig economy, Gordon Gekko, greed is good, Greta Thunberg, happiness index / gross national happiness, housing crisis, illegal immigration, independent contractor, industrial robot, Jane Jacobs, Jeff Bezos, Jeremy Corbyn, Jessica Bruder, job automation, job satisfaction, karōshi / gwarosa / guolaosi, Kevin Roose, knowledge economy, labor-force participation, lockdown, longitudinal study, low interest rates, low skilled workers, Lyft, Mark Zuckerberg, mass immigration, means of production, megacity, meta-analysis, move fast and break things, Network effects, new economy, Pepto Bismol, QWERTY keyboard, Ray Oldenburg, remote working, rent control, RFID, robo advisor, Ronald Reagan, Salesforce, San Francisco homelessness, Second Machine Age, Shoshana Zuboff, side hustle, Silicon Valley, Skype, Snapchat, social distancing, Social Responsibility of Business Is to Increase Its Profits, SoftBank, Steve Jobs, surveillance capitalism, TaskRabbit, tech worker, The Death and Life of Great American Cities, The Future of Employment, The Great Good Place, the long tail, The Wealth of Nations by Adam Smith, TikTok, Tim Cook: Apple, Uber and Lyft, uber lyft, urban planning, Wall-E, warehouse automation, warehouse robotics, WeWork, work culture , working poor, workplace surveillance

CHAPTER EIGHT: The Digital Whip 1 Robert Booth, ‘Unilever saves on recruiters by using AI to assess job interviews’, Guardian, 25 October 2019, https://www.theguardian.com/technology/2019/oct/25/unilever-saves-on-recruiters-by-using-ai-to-assess-job-interviews; The Harvey Nash HR Survey 2019, https://www.harveynash.com/hrsurvey/full-report/charts/#summary. 2 ‘HireVue surpasses ten million video interviews completed worldwide’, HireVue, 21 May 2019, https://www.hirevue.com/press-release/hirevue-surpasses-ten-million-video-interviews-completed-worldwide. 3 ‘EPIC Files Complaint with FTC about Employment Screening Firm HireVue’, Electronic Privacy Information Center, 6 November 2019, https://epic.org/2019/11/epic-files-complaint-with-ftc.html; see full complaint at https://epic.org/privacy/ftc/hirevue/EPIC_FTC_HireVue_Complaint.pdf. 4 Loren Larsen, ‘HireVue Assessments and Preventing Algorithmic Bias’, HireVue, 22 June 2018, https://www.hirevue.com/blog/hirevue-assessments-and-preventing-algorithmic-bias; cf. Emma Leech, ‘The perils of AI recruitment’, New Statesman, 14 August 2019, https://tech.newstatesman.com/emerging-technologies/ai-recruitment-algorithms-bias; Julius Schulte, ‘AI-assisted recruitment is biased. Here’s how to make it more fair’, World Economic Forum, 9 May 2019, https://www.weforum.org/agenda/2019/05/ai-assisted-recruitment-is-biased-heres-how-to-beat-it/. 5 Drew Harwell, ‘A face-scanning algorithm increasingly decides whether you deserve the job’, Washington Post, 6 November 2019, https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/. 6 Reuters, ‘Amazon ditched AI recruiting tool that favoured men for technical jobs’, Guardian, 11 October 2018, https://www.theguardian.com/technology/2018/oct/10/amazon-hiring-ai-gender-bias-recruiting-engine. 7 Kuba Krys et al., ‘Be Careful Where You Smile: Culture Shapes Judgments of Intelligence and Honesty of Smiling Individuals’, Journal of Nonverbal Behavior 40 (2016), 101–16, https://doi.org/10.1007/s10919-015-0226-4. these assumptions, reflected in proverbs and stereotypes, are now backed up by quantitative analysis of 44 countries. 8 The broad theory holds that in countries with more historical diversity – i.e. populated by a large proportion of immigrants who may not share languages or cultural norms – smiles are more expected and used as social currency; see Khazan, ‘Why Americans smile so much’, The Atlantic, 3 May 2017, https://www.theatlantic.com/science/archive/2017/05/why-americans-smile-so-much/524967/. 9 The retail giant’s eventual closure of its German stores was related, analysts speculate, to its inability to adapt to different cultural expectations; Mark Landler and Michael Barbaro, ‘Wal-Mart Finds That Its Formula Doesn’t Fit Every Culture’, New York Times, 2 August 2006, https://www.nytimes.com/2006/08/02/business/worldbusiness/02walmart.html; see also Khazan, ‘Why Americans smile so much’. 10 Implied in this statement on its website under FAQ for interviewees: ‘Customer-facing jobs, like a bank teller role, do require a degree of friendliness and attention to other people.

Platforms like Uber, Fiverr and TaskRabbit that use ratings should be mandated to audit their mechanisms, identify potential biases and recalibrate accordingly. Moreover, an ‘appeals process’ must be guaranteed, so that those who rely on these platforms to make a decent living are able to contest ratings that they consider unfair. Whilst algorithmic bias is considerably harder to address, we can certainly screen for it much better than has been done to date. Ironically, there is perhaps even a role for algorithms here in monitoring and identifying such biases.61 More fundamentally, the choices that underpin the algorithm – how the data is collected, its code and the decision-making heuristics that parse the data – need to be made transparent, so that redress and recourse are possible.
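A screen of that kind could be sketched as follows: compare the rating distributions workers in different groups receive and flag gaps too large to be explained by chance. The table layout and column names here are assumptions, and a real audit would also need to control for confounders such as job type and tenure.

import numpy as np
import pandas as pd

def rating_gap_audit(df, n_permutations=10_000, seed=0):
    """Return the observed mean-rating gap between two groups and an approximate p-value."""
    rng = np.random.default_rng(seed)
    a, b = df["group"].unique()  # assumes exactly two groups in the data
    observed = (df.loc[df["group"] == a, "rating"].mean()
                - df.loc[df["group"] == b, "rating"].mean())
    ratings = df["rating"].to_numpy()
    labels = df["group"].to_numpy()
    # Permutation test: how often does a random relabelling produce a gap this large?
    count = 0
    for _ in range(n_permutations):
        shuffled = rng.permutation(labels)
        gap = ratings[shuffled == a].mean() - ratings[shuffled == b].mean()
        if abs(gap) >= abs(observed):
            count += 1
    return observed, count / n_permutations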


Artificial Whiteness by Yarden Katz

affirmative action, AI winter, algorithmic bias, AlphaGo, Amazon Mechanical Turk, autonomous vehicles, benefit corporation, Black Lives Matter, blue-collar work, Californian Ideology, Cambridge Analytica, cellular automata, Charles Babbage, cloud computing, colonial rule, computer vision, conceptual framework, Danny Hillis, data science, David Graeber, deep learning, DeepMind, desegregation, Donald Trump, Dr. Strangelove, driverless car, Edward Snowden, Elon Musk, Erik Brynjolfsson, European colonialism, fake news, Ferguson, Missouri, general purpose technology, gentrification, Hans Moravec, housing crisis, income inequality, information retrieval, invisible hand, Jeff Bezos, Kevin Kelly, knowledge worker, machine readable, Mark Zuckerberg, mass incarceration, Menlo Park, military-industrial complex, Nate Silver, natural language processing, Nick Bostrom, Norbert Wiener, pattern recognition, phenotype, Philip Mirowski, RAND corporation, recommendation engine, rent control, Rodney Brooks, Ronald Reagan, Salesforce, Seymour Hersh, Shoshana Zuboff, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Skype, speech recognition, statistical model, Stephen Hawking, Stewart Brand, Strategic Defense Initiative, surveillance capitalism, talking drums, telemarketer, The Signal and the Noise by Nate Silver, W. E. B. Du Bois, Whole Earth Catalog, WikiLeaks

Some protesters are holding masks imprinted with Amazon founder Jeff Bezos’s face. Photograph by NWDC Resistance. Yet these are familiar uses of computing that do not fall under the fundable rubric of “AI” that the AI expert industry is after. And activists haven’t framed these issues around “AI,” nor around phrases such as “algorithmic accountability” or “algorithmic bias” invoked by critical AI experts. Instead, activists have focused on how technical systems fit into the political system that fuels deportations and detentions, as outlined in Mijente’s report Who’s Behind ICE?. The report highlights the role of “cloud computing” in enabling data interoperability across agencies.

abolition, 143, 148, 247nn37–38 Action Technologies, 290n29 Adam, Alison, 6, 30, 99–100, 103, 187, 207 adaptation, of AI, 10, 65, 128, 167, 178–80 Afghanistan, 58, 209, 219 Agre, Philip, 50, 198–99, 248n84, 249n85, 291n32, 291nn33–34, 299n79 AI Magazine, 23, 56–57, 249n88 AI Now, 79–80, 135, 138–43, 147–49, 151, 261nn41–43, 272n19, 274n37, 276n43 air force, 54–55, 191, 211, 251n102 AI: The Tumultuous History of the Quest for Artificial Intelligence, 51–52, 249n89, 250n90 Algeria, 15 algorithmic accountability, 125, 129, 132 algorithmic bias, 132, 135, 138–40, 169, 175–76, 178–80, 272n22, 284n61, 285n65 algorithmic sentencing, 129 Allen, Theodore, 279n24 Allende, Salvador, 193 AlphaGo, 62, 106, 108 Amazon (company): and Amazon Mechanical Turk (AMT), 116–17; and facial recognition, 176; roles in carceral state, 131–33, 149, 151, 179; and platform companies, 254n5; and the rebranding of AI, 255n12, 256n15, 257n16, 272n19, 276n43 Amazon (place), 82, 86 American Civil Liberties Union (ACLU), 141, 143 American exceptionalism, 60–61 American Psychological Association, 21 analytic epistemology, 188–189, 192, 287n6.

See also rationalistic tradition anthropometry, 7, 160, 278n19, 279n26 AnyVision, 276n45 apartheid, 114 Apple, 68, 71, 74, 285n65 Armer, Paul, 244n37 army, 44, 133, 219, 251n104, 295n55, 299n77 Arpaio, Joe, 179 artificial intelligence (AI): alternatives to, 11, 14, 185–88, 203–10; autonomy of, 52–53, 58–59, 247n72; coining of term, 3, 22–23, 241nn8–9; dissenting views of, 33–34, 44–48, 193–94; expert industry around, 2–3, 9–10, 66–74; as a foil for neoliberalism, 70–78, 119–22, 166, 226–27; “hype” about, 33, 208, 238n17, 255n14; and models of the self, 5–7, 10, 27–32, 154, 157, 165–67; as a site of imperial struggle, 60–61; individual and institutional investment in, 7–11, 20, 79–81, 154, 167–72, 182, 229; militaristic frame of, 35–38, 45–47, 50–59, 245n44; nebulous and shifting character of, 2–3, 5–6, 10, 22–27, 33–38, 42, 45, 52–55, 63–70, 154–55, 164–67; springs and winters of, 4, 48; sponsorship of, 24–26, 59; rebranding of, 12, 65–70, 80–81; relation to psychology and biology, 5, 21; “technical” narratives about, 2–4, 66–68, 261n43; as a technology of whiteness, 8–11, 155, 172, 181–82 artificial life (ALife): and bio-inspired computing, 210, 296n60; and racial narratives, 159, 277n12–14; relation to AI, 159, 277n11 Artificial War, 212–13 ARPA, 24, 96, 249n85; and Behavioral Sciences Program, 98, 265n11 Association for the Advancement of Artificial Intelligence (AAAI), 5, 23, 37–38 Association for Computing Machinery (ACM), 23 Atlantic, The, 61, 104 automata studies, 241nn8–9 automation: and automated systems, 141, 169–70; of industrial production, 25–26, 34, 74–75, 78; of warfare, 43–44, 57–59 autopoiesis, 194–96: and neoliberalism, 201–2; and war, 213 Azoulay, Ariella Aïsha, 300n1 Azoulay, Roger, 15, 300n1 Baldwin, James, 93 Barbrook, Richard, 156 Baudrillard, Jean, 151 Baupost Group, 263n57 Beer, Stafford, 25, 193, 290n30 Belfer Center, 260n40 Bell Labs, 22 Bell, Trudy E., 246n58 behaviorism, 119, 121–22, 166, 267n46, 269n49, 269n51 Benjamin, Ruha, 285n67 Berkman Klein Center for Internet & Society, 66, 260n39, 261n42 bias. See algorithmic bias big data, 12, 67–71, 79–80, 254n3 Black Lives Matter, 138, 284n64 Black in AI, 173–74 Black radical tradition, 173, 178, 181 Black Skin, White Masks, 175, 180–81 Blackstone Group, 81–84, 86, 88 blood quantum, 162, 279n26 Blumenbach, Johann F., 160, 278n18 Boden, Margaret, 259n24 Boeing, 260n39 Bolsonaro, Jair, 82–83 Bostrom, Nick, 76 Brand, Stewart, 293n43 Brazil, 82–83, 86 Bridges, Ruby, 113, 115 Brooks, Fred P., 270n54 Brooks, Rodney, 44, 207–8, 295nn52–53, 296n60 Brown, Michael, 137 Buolamwini, Joy, 175–80, 284n61, 285n65, 285n67 Bush, George W., 58, 131, 209, 216 California Institute of Technology, 242n19 Cameron, Andy, 156 capitalism: and anticapitalism, 45; and capitalist visions of AI experts, 9–10, 35, 65, 74–78, 81–88.


pages: 533

Future Politics: Living Together in a World Transformed by Tech by Jamie Susskind

3D printing, additive manufacturing, affirmative action, agricultural Revolution, Airbnb, airport security, algorithmic bias, AlphaGo, Amazon Robotics, Andrew Keen, Apollo Guidance Computer, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, Bertrand Russell: In Praise of Idleness, Big Tech, bitcoin, Bletchley Park, blockchain, Boeing 747, brain emulation, Brexit referendum, British Empire, business process, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, cashless society, Cass Sunstein, cellular automata, Citizen Lab, cloud computing, commons-based peer production, computer age, computer vision, continuation of politics by other means, correlation does not imply causation, CRISPR, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, digital divide, digital map, disinformation, distributed ledger, Donald Trump, driverless car, easy for humans, difficult for computers, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Erik Brynjolfsson, Ethereum, ethereum blockchain, Evgeny Morozov, fake news, Filter Bubble, future of work, Future Shock, Gabriella Coleman, Google bus, Google X / Alphabet X, Googley, industrial robot, informal economy, intangible asset, Internet of things, invention of the printing press, invention of writing, Isaac Newton, Jaron Lanier, John Markoff, Joseph Schumpeter, Kevin Kelly, knowledge economy, Large Hadron Collider, Lewis Mumford, lifelogging, machine translation, Metcalfe’s law, mittelstand, more computing power than Apollo, move fast and break things, natural language processing, Neil Armstrong, Network effects, new economy, Nick Bostrom, night-watchman state, Oculus Rift, Panopticon Jeremy Bentham, pattern recognition, payday loans, Philippa Foot, post-truth, power law, price discrimination, price mechanism, RAND corporation, ransomware, Ray Kurzweil, Richard Stallman, ride hailing / ride sharing, road to serfdom, Robert Mercer, Satoshi Nakamoto, Second Machine Age, selection bias, self-driving car, sexual politics, sharing economy, Silicon Valley, Silicon Valley startup, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart contracts, Snapchat, speech recognition, Steve Bannon, Steve Jobs, Steve Wozniak, Steven Levy, tech bro, technological determinism, technological singularity, technological solutionism, the built environment, the Cathedral and the Bazaar, The Structural Transformation of the Public Sphere, The Wisdom of Crowds, Thomas L Friedman, Tragedy of the Commons, trolley problem, universal basic income, urban planning, Watson beat the top human players on Jeopardy!, work culture , working-age population, Yochai Benkler

‘Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings’. arXiv, 21 Jul. 2016 <https://arxiv.org/pdf/1607.06520.pdf> (accessed 3 Dec. 2017). Bonchi, Francesco, Carlos Castillo, and Sara Hajian. ‘Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining’. KDD 2016 Tutorial. <http://francescobonchi.com/tutorial-algorithmic-bias.pdf> (accessed 3 Dec. 2017). Booth, Robert. ‘Facebook Reveals News Feed Experiment to Control Emotions’. The Guardian, 30 Jun. 2014 <https://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds> (accessed 11 Dec. 2017).

This is just one way of assessing algorithmic injustice. One of the tasks for political theorists will be to find more. ‘Algorithmic Discrimination’ Different types of algorithmic injustice are sometimes lumped together under the name ‘algorithmic discrimination’. I avoid this term, along with the term algorithmic bias, because it can lead to confusion. Discrimination is a subtle concept with at least three acceptable meanings. The first is neutral, referring to the process of drawing distinctions between one thing and another. (If I say you are a highly discriminating art critic, I am praising your acuity and not calling you a bigot.)

Google and the Perpetuation of Stereotypes via Autocomplete Search Forms’, Critical Discourse Studies 10, no. 2 (2013) <http://www.tandfonline.com/doi/full/10.1080/17405904.2012.744320?scroll=top&needAccess=true> (accessed 3 December 2017). 14. Francesco Bonchi, Carlos Castillo, and Sara Hajian, ‘Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining’, KDD 2016 Tutorial <http://francescobonchi.com/tutorial-algorithmic-bias.pdf> (accessed 3 December 2017). 15. Tom Slee, What’s Yours is Mine: Against the Sharing Economy (New York and London: OR Books, 2015), 94. 16. Slee, What’s Yours is Mine, 95. 17.


pages: 307 words: 88,180

AI Superpowers: China, Silicon Valley, and the New World Order by Kai-Fu Lee

"World Economic Forum" Davos, AI winter, Airbnb, Albert Einstein, algorithmic bias, algorithmic trading, Alignment Problem, AlphaGo, artificial general intelligence, autonomous vehicles, barriers to entry, basic income, bike sharing, business cycle, Cambridge Analytica, cloud computing, commoditize, computer vision, corporate social responsibility, cotton gin, creative destruction, crony capitalism, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, deskilling, Didi Chuxing, Donald Trump, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, fake news, full employment, future of work, general purpose technology, Geoffrey Hinton, gig economy, Google Chrome, Hans Moravec, happiness index / gross national happiness, high-speed rail, if you build it, they will come, ImageNet competition, impact investing, income inequality, informal economy, Internet of things, invention of the telegraph, Jeff Bezos, job automation, John Markoff, Kickstarter, knowledge worker, Lean Startup, low skilled workers, Lyft, machine translation, mandatory minimum, Mark Zuckerberg, Menlo Park, minimum viable product, natural language processing, Neil Armstrong, new economy, Nick Bostrom, OpenAI, pattern recognition, pirate software, profit maximization, QR code, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, risk tolerance, Robert Mercer, Rodney Brooks, Rubik’s Cube, Sam Altman, Second Machine Age, self-driving car, sentiment analysis, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, SoftBank, Solyndra, special economic zone, speech recognition, Stephen Hawking, Steve Jobs, strong AI, TED Talk, The Future of Employment, Travis Kalanick, Uber and Lyft, uber lyft, universal basic income, urban planning, vertical integration, Vision Fund, warehouse robotics, Y Combinator

As public policy and personal values blend, we should really take the time to study new experiments in defining and measuring progress, such as Bhutan’s decision to pursue “Gross National Happiness” as a key development indicator. Finally, our governments will need to consistently look to one another in evaluating thorny new tradeoffs in data privacy, digital monopolies, online security, and algorithmic bias. In tackling these issues, we can learn much from comparing the different approaches taken by regulators in Europe, the United States, and China. While Europe has opted for a more heavy-handed approach (fining Google, for example, for antitrust and trying to wrest control over data away from the technology companies), China and the United States have given these companies greater leeway, letting technology and markets develop before intervening on the margins.

Index A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z A Africa, 138, 139, 169 age of data, 14, 18, 56 age of implementation Chinese entrepreneurs and, 16, 18, 25 Chinese government and, 18 data and, 17, 20, 55, 80 deep learning and, 13–14, 143 going light vs. going heavy, 71 AGI (artificial general intelligence), 140–44 AI. See artificial intelligence (AI) AI engineers, 14 Airbnb, 39, 49, 73 AI revolution deep learning and, 5, 25, 92, 94, 143 economic impact of, 151–52 speed of, 152–55 AI winters, 6–7, 8, 9, 10 algorithmic bias, 229 algorithms, AI AI revolution and, 152–53 computing power and, 14, 56 credit and, 112–13 data and, 14, 17, 56, 138 fake news detection by, 109 intelligence sharing and, 87 legal applications for, 115–16 medical diagnosis and, 114–15 as recommendation engines, 107–8 robot reporting, 108 white-collar workers and, 167, 168 Alibaba Amazon compared to, 109 Chinese startups and, 58 City Brain, 93–94, 117, 124, 228 as dominant AI player, 83, 91, 93–94 eBay and, 34–35 financial services spun off from, 73 four waves of AI and, 106, 107, 109 global markets and, 137 grid approach and, 95 Microsoft Research Asia and, 89 mobile payments transition, 76 New York Stock Exchange debut, 66–67 online purchasing and, 68 success of, 40 Tencent’s “Pearl Harbor attack” on, 60–61 Wang Xing and, 24 Alipay, 35, 60, 69, 73–74, 75, 112, 118 Alphabet, 92–93 AlphaGo, 1–4, 5, 6, 11, 199 AlphaGo Zero, 90 Altman, Sam, 207 Amazon Alibaba compared to, 109 Chinese market and, 39 data captured by, 77 as dominant AI player, 83, 91 four waves of AI and, 106 grid approach and, 95 innovation mentality at, 33 monopoly of e-commerce, 170 online purchasing and, 68 Wang Xing and, 24 warehouses, 129–30 Amazon Echo, 117, 127 Amazon Go, 163, 213 Anderson, Chris, 130 Andreesen Horowitz, 70 Ant Financial, 73 antitrust laws, 20, 28, 171, 229 Apollo project, 135 app constellation model, 70 Apple, 33, 75, 117, 126, 143, 177, 184 Apple Pay, 75, 76 app-within-an-app model, 59 ARM (British firm), 96 Armstrong, Neil, 3 artificial general intelligence (AGI), 140–44 artificial intelligence (AI) introduction to, ix–xi See also China; deep learning; economy and AI; four waves of AI; global AI story; human coexistence with AI; new world order artificial superintelligence.


The Ethical Algorithm: The Science of Socially Aware Algorithm Design by Michael Kearns, Aaron Roth

23andMe, affirmative action, algorithmic bias, algorithmic trading, Alignment Problem, Alvin Roth, backpropagation, Bayesian statistics, bitcoin, cloud computing, computer vision, crowdsourcing, data science, deep learning, DeepMind, Dr. Strangelove, Edward Snowden, Elon Musk, fake news, Filter Bubble, general-purpose programming language, Geoffrey Hinton, Google Chrome, ImageNet competition, Lyft, medical residency, Nash equilibrium, Netflix Prize, p-value, Pareto efficiency, performance metric, personalized medicine, pre–internet, profit motive, quantitative trading / quantitative finance, RAND corporation, recommendation engine, replication crisis, ride hailing / ride sharing, Robert Bork, Ronald Coase, self-driving car, short selling, sorting algorithm, sparse data, speech recognition, statistical model, Stephen Hawking, superintelligent machines, TED Talk, telemarketer, Turing machine, two-sided market, Vilfredo Pareto

By making this distinction, the algorithm is able to “subtract off” the bias in the data associated with nongendered words, thus reducing analogy completions like the one in the paper’s title, while still preserving “correct” analogies like “Man is to king as woman is to queen.” These are the themes of this chapter: scientific notions of algorithmic (and human) bias and discrimination, how to detect and measure them, how to design fairer algorithmic solutions—and what the costs of fairness might be to predictive accuracy and other important objectives, just as we examined the costs to accuracy of differential privacy. We will eventually show how such costs can be made quantitative in the form of what are known as Pareto curves specifying the theoretical and empirical trade-offs between fairness and accuracy.
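A curve of that kind can be traced empirically with a sketch along these lines: sweep a decision threshold for each group, record the (unfairness, error) pair for every combination, and keep the undominated points. The arrays `scores`, `labels`, and `group`, and the use of the demographic-parity gap as the unfairness measure, are assumptions made for illustration; this is not the authors' own method.

import numpy as np
from itertools import product

def pareto_curve(scores, labels, group, thresholds=np.linspace(0.1, 0.9, 17)):
    """Empirical fairness/accuracy frontier over per-group decision thresholds."""
    points = []
    for t0, t1 in product(thresholds, thresholds):
        pred = np.where(group == 0, scores >= t0, scores >= t1).astype(int)
        error = float(np.mean(pred != labels))
        # Unfairness measured here as the demographic-parity gap; other
        # definitions of fairness could be swapped in.
        unfairness = abs(pred[group == 0].mean() - pred[group == 1].mean())
        points.append((unfairness, error))
    # Keep only points not weakly dominated on both axes by a different point.
    frontier = [p for p in points
                if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]
    return sorted(set(frontier))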

See also societal norms and values Hunch, 52–53, 68 hybrid models, 76–77 hypotheses, 153–54 IBM Research, 132 image classifiers, 181–82 image-creating algorithms, 132–33 ImageNet, 146–51, 160–63, 165 image recognition, 145–49, 162–63, 165–67, 175 incentives and algorithmic game theory, 101 and correlated equilibrium, 114–15 and differential privacy, 56 and differing notions of fairness, 85 and ethical issues of optimization, 189–90 and investing scams, 140–41 and medical residency hiring, 128 and navigation problems, 111–15 and p-hacking, 144–45 and scientific research, 136, 144–45 and user preferences, 97 income discrimination, 88–89 incriminating information, 40–45 individual preferences, 115–17 inference, 33, 51 Infinite Jest (Wallace), 118, 120 informational challenges, 104 input bias, 91 insurance rates, 38 intelligence explosion, 185–88 intentionality, 7 internal design of algorithms, 131 Internal Revenue Service (IRS), 49–50 Internet and availability of data, 66–67 and commerce, 64 Internet-derived data, 6–7 and theoretical computer science field, 13 Internet Movie Database (IMDB), 25–26 interpretability of outputs and algorithmic morality, 175–77 and current state of ethics research, 170–75 and General Data Protection Regulation, 15 and goals of ethics research, 170–71 investing scams, 137–41 Ioannidis, John, 143 iPhones, 47–48, 195 Jeopardy, 180 jump balls, 99–100 Kaggle, 146–47 k-anonymity, 27–30, 44 kidney donation matching, 130 King, Stephen, 118 k-nearest neighbors algorithm, 173 Kubrick, Stanley, 100 large datasets, 100–101, 151, 155 LA Weekly, 94 law enforcement access to data, 54–56 laws and regulations algorithms as regulatory measure, 16–17 bans on data uses, 39 and concerns about algorithm use, 3–4 and correlated equilibrium, 113 and differential privacy, 47–48 and “fairness gerrymandering,” 86–87 and fairness vs. 
accuracy of models, 77–78 and forbidden inputs, 66–67 and interpretability of model outputs, 172, 174 and investing scams, 138, 140–41 and navigation problems, 107 recent efforts to address machine learning issues, 14–15 and scale issues, 192 and scope of topics covered, 19–21 and shortcomings of anonymization methods, 24 and theoretical computer science field, 13–14 and the US Census, 49–50 Learner and Regulator game, 89 learning process, formal, 38–39 LeCun, Yann, 133 Legg, Shane, 179 lending and creditworthiness and algorithmic bias, 62 and algorithmic violations of fairness and privacy, 96 benefits of machine learning, 191–92 and concerns about algorithm use, 3 and criticisms of ethical algorithm efforts, 193–94 and “fairness gerrymandering,” 86–87 and interpretability of model outputs, 171–72, 174 “merit” and algorithmic fairness, 72–74, 84 and Pareto frontiers, 82–86 random lending, 69–71 and statistical parity, 69–72, 84 and unique challenges of algorithms, 8 Lewis, Amanda, 94–95, 97 linear models, 173 loan applications, 171–72, 174 local differential privacy, 46–47 local trust model, 47 location data and broken anonymity, 2–3 and scope of topics covered, 19–20 and shortcomings of anonymization methods, 23, 27–29 logistic regression, 173 Loken, Eric, 159 lung cancer research, 34–36, 39, 51, 53–54 lying, 111–12 machine learning and adaptive data analysis, 160–62 and aggregate data, 30 and algorithmic game theory, 101 and algorithmic morality, 176–77 and algorithms as regulatory measure, 16–17 contrasted with human learning, 6–7 criminal justice applications, 3 and dangers of adaptive data analysis, 151, 153–54 and data collection bias, 90–93 and differential privacy, 38–39, 52 and discrimination, 96 and echo chamber equilibrium, 124 and “fairness gerrymandering,” 87–90 and forbidden inputs, 68 generative adversarial network (GAN), 133 and Google’s “word embedding” model, 58 and image recognition competition, 145–49, 165–67 and interpretability of model outputs, 171–74 and limitations of differential privacy, 51 and “merit” in algorithmic fairness, 74 and Pareto frontier, 82 and product recommendation algorithms, 122–23 recent laws and regulations, 15 and scope of topics covered, 18–19 self-play in, 131–34 and theoretical computer science, 13 and unique challenges of algorithms, 7–11 and weaknesses of aggregate data, 31 and weaknesses of encryption, 32 and word embedding, 57–63 macroscopic efficiency, 112 Magrin, Lisa, 2–3 Manhattan Project, 17 markets, 177–78 Marriott, 32 Massachusetts Institute of Technology (MIT), 23 matching markets, 126–30 mathematical constraints, 37–38 mathematical literacy, 172.


pages: 288 words: 86,995

Rule of the Robots: How Artificial Intelligence Will Transform Everything by Martin Ford

AI winter, Airbnb, algorithmic bias, algorithmic trading, Alignment Problem, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, artificial general intelligence, Automated Insights, autonomous vehicles, backpropagation, basic income, Big Tech, big-box store, call centre, carbon footprint, Chris Urmson, Claude Shannon: information theory, clean water, cloud computing, commoditize, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, data is the new oil, data science, deep learning, deepfake, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Elon Musk, factory automation, fake news, fulfillment center, full employment, future of work, general purpose technology, Geoffrey Hinton, George Floyd, gig economy, Gini coefficient, global pandemic, Googley, GPT-3, high-speed rail, hype cycle, ImageNet competition, income inequality, independent contractor, industrial robot, informal economy, information retrieval, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jeff Bezos, job automation, John Markoff, Kiva Systems, knowledge worker, labor-force participation, Law of Accelerating Returns, license plate recognition, low interest rates, low-wage service sector, Lyft, machine readable, machine translation, Mark Zuckerberg, Mitch Kapor, natural language processing, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, Ocado, OpenAI, opioid epidemic / opioid crisis, passive income, pattern recognition, Peter Thiel, Phillips curve, post scarcity, public intellectual, Ray Kurzweil, recommendation engine, remote working, RFID, ride hailing / ride sharing, Robert Gordon, Rodney Brooks, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, Silicon Valley startup, social distancing, SoftBank, South of Market, San Francisco, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, superintelligent machines, TED Talk, The Future of Employment, The Rise and Fall of American Growth, the scientific method, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Uber and Lyft, uber lyft, universal basic income, very high income, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y Combinator

Suppose, however, that the past hiring decisions upon which the algorithm is trained reflect some degree of overt or subconscious racism or sexism on the part of hiring managers. In that case, the machine learning system will automatically pick up that bias as it goes through its normal training process. There is no nefarious intent on the part of the creators of the algorithm; the bias exists in the training data. The result would be a system that perpetuated, or perhaps even amplified, existing human biases and would be demonstrably unfair to people of color or women. Something very similar to this happened at Amazon in 2018 when the company halted development of a machine learning system because it was shown to be biased against women when screening resumes for technical positions.
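To make the mechanism concrete, here is a minimal sketch with entirely invented data and feature names (experience, is_woman), showing how a model trained on historically biased hiring labels reproduces that bias; the real Amazon system inferred gender indirectly from proxies in resumes rather than from an explicit label, but the effect is the same.

```python
# Minimal sketch with invented data (not Amazon's system): a classifier trained on
# historically biased hiring labels learns to reproduce that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
experience = rng.normal(5, 2, n)      # years of experience, same distribution for everyone
is_woman = rng.integers(0, 2, n)      # hypothetical group label: 0 = man, 1 = woman

# Past hiring decisions: driven by experience, but with a penalty applied to women,
# standing in for the "overt or subconscious" bias of past hiring managers.
past_score = 0.8 * experience - 1.5 * is_woman + rng.normal(0, 1, n)
hired = (past_score > 4).astype(int)

model = LogisticRegression().fit(np.column_stack([experience, is_woman]), hired)

# Two candidates with identical experience, differing only in the group label:
candidates = np.array([[5.0, 0.0], [5.0, 1.0]])
print(model.predict_proba(candidates)[:, 1])  # the learned model scores the woman lower
```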

AN EXISTENTIAL THREAT FROM SUPERINTELLIGENCE AND THE “CONTROL PROBLEM”

The AI risk that transcends all others is the possibility that machines with superhuman intelligence might someday wrest themselves from our direct control and pursue a course of action that ultimately presents an existential threat to humanity. Security issues, weaponization and algorithmic bias all pose immediate or near-term dangers. These are concerns that we clearly need to be addressing right now—before it is too late. An existential threat from superintelligence, however, is far more speculative and almost certainly lies decades—or perhaps even a century or more—in the future.


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

"Friedman doctrine" OR "shareholder theory", Ada Lovelace, AI winter, air gap, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Andy Rubin, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Bayesian statistics, behavioural economics, Bernie Sanders, Big Tech, bioinformatics, Black Lives Matter, blockchain, Bretton Woods, business intelligence, Cambridge Analytica, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, CRISPR, cross-border payments, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, disinformation, distributed ledger, don't be evil, Donald Trump, Elon Musk, fail fast, fake news, Filter Bubble, Flynn Effect, Geoffrey Hinton, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Herman Kahn, high-speed rail, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, machine translation, Mark Zuckerberg, Menlo Park, move fast and break things, Mustafa Suleyman, natural language processing, New Urbanism, Nick Bostrom, one-China policy, optical character recognition, packet switching, paperclip maximiser, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, Recombinant DNA, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Salesforce, Sand Hill Road, Second Machine Age, self-driving car, seminal paper, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, surveillance capitalism, technological singularity, The Coming Technological Singularity, the long tail, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

“Why the Future Doesn’t Need Us.” Wired, April 1, 2000. http://www.wired.com/wired/archive/8.04/joy.html. Kelly, K. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Viking, 2016. Kirkpatrick, K. “Battling Algorithmic Bias.” Communications of the ACM 59, no. 10 (2016): 16–17. https://cacm.acm.org/magazines/2016/10/207759-battling-algorithmic-bias/abstract. Knight, W. “AI Fight Club Could Help Save Us from a Future of Super-Smart Cyberattacks.” MIT Technology Review, July 20, 2017. https://www.technologyreview.com/s/608288/ai-fight-club-could-help-save-us-from-a-future-of-super-smart-cyberattacks/.


System Error by Rob Reich

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, 2021 United States Capitol attack, A Declaration of the Independence of Cyberspace, Aaron Swartz, AI winter, Airbnb, airport security, Alan Greenspan, Albert Einstein, algorithmic bias, AlphaGo, AltaVista, artificial general intelligence, Automated Insights, autonomous vehicles, basic income, Ben Horowitz, Berlin Wall, Bernie Madoff, Big Tech, bitcoin, Blitzscaling, Cambridge Analytica, Cass Sunstein, clean water, cloud computing, computer vision, contact tracing, contact tracing app, coronavirus, corporate governance, COVID-19, creative destruction, CRISPR, crowdsourcing, data is the new oil, data science, decentralized internet, deep learning, deepfake, DeepMind, deplatforming, digital rights, disinformation, disruptive innovation, Donald Knuth, Donald Trump, driverless car, dual-use technology, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Fairchild Semiconductor, fake news, Fall of the Berlin Wall, Filter Bubble, financial engineering, financial innovation, fulfillment center, future of work, gentrification, Geoffrey Hinton, George Floyd, gig economy, Goodhart's law, GPT-3, Hacker News, hockey-stick growth, income inequality, independent contractor, informal economy, information security, Jaron Lanier, Jeff Bezos, Jim Simons, jimmy wales, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, Lean Startup, linear programming, Lyft, Marc Andreessen, Mark Zuckerberg, meta-analysis, minimum wage unemployment, Monkeys Reject Unequal Pay, move fast and break things, Myron Scholes, Network effects, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, NP-complete, Oculus Rift, OpenAI, Panopticon Jeremy Bentham, Parler "social media", pattern recognition, personalized medicine, Peter Thiel, Philippa Foot, premature optimization, profit motive, quantitative hedge fund, race to the bottom, randomized controlled trial, recommendation engine, Renaissance Technologies, Richard Thaler, ride hailing / ride sharing, Ronald Reagan, Sam Altman, Sand Hill Road, scientific management, self-driving car, shareholder value, Sheryl Sandberg, Shoshana Zuboff, side project, Silicon Valley, Snapchat, social distancing, Social Responsibility of Business Is to Increase Its Profits, software is eating the world, spectrum auction, speech recognition, stem cell, Steve Jobs, Steven Levy, strong AI, superintelligent machines, surveillance capitalism, Susan Wojcicki, tech billionaire, tech worker, techlash, technoutopianism, Telecommunications Act of 1996, telemarketer, The Future of Employment, TikTok, Tim Cook: Apple, traveling salesman, Triangle Shirtwaist Factory, trolley problem, Turing test, two-sided market, Uber and Lyft, uber lyft, ultimatum game, union organizing, universal basic income, washing machines reduced drudgery, Watson beat the top human players on Jeopardy!, When a measure becomes a target, winner-take-all economy, Y Combinator, you are the product

They had not deliberately inserted that bias or actively programmed a “sexist algorithm.” Yet gender bias had crept in. The team attempted to adjust the code to neutralize the bias, but they could not rid the tool of all potential discrimination. After years of effort, Amazon decided to scrap its vision for the tool altogether and disband the team responsible for it. The Amazon case is a revealing one and sparks specific questions we should ask about the rise of automated decision-making tools: If one of the most powerful companies in the world can’t successfully build an algorithmic tool free of bias, can anyone? When new technologies are deployed to aid in or replace human decision-making, to what standard of objectivity should we hold their automated replacements?

In transforming how we teach technologists, we must also pay attention to whom we are teaching. Since new technologies encode the needs, perspectives, and values of those who create (and fund) them, it’s no surprise that the lack of diversity in tech companies is part of the problem. It helps explain algorithmic bias, an inattention to the ways that surveillance has been misused, a lack of concern for the distributional harms caused by automation, and the proliferation of hate speech online. The movement afoot among companies and funders to recruit, support, and retain a diverse field in technology is long overdue.

See artificial intelligence “Aims of Education” (Allen), xi Albrecht, Jan Philipp, 143 Alciné, Jacky, 86 algorithmic amplification of potentially harmful speech, 219 algorithmic decision-making, 79–109 accountability, 99–101 auditing, 105, 107–9 bias elimination, 80–82, 86–87, 101, 105–6 in criminal justice system, 94–99 defining fairness, 89–94 designing fair algorithms, 87–94 emphasis on producing efficient solutions, 11 engineers provide high-level descriptions of their models, 107–8 governing algorithms, 103–7 human bias introduced by data, 80–82, 83, 86, 101–3 job applicants gaming the system, 83–84, 102–3 key ingredients in the governance of, 105–7 machines that learn, 82–87, 109 transparency and auditing requirements, 105, 107–9 for “Traveling Salesperson Problem,” 12–13 Algorithmic Justice League, 241 Algorithms to Live By (Christian and Griffiths), 14 Allen, Danielle, xi, 70 Amazon, 4–5, 64–65, 79–82, 180.


pages: 444 words: 117,770

The Coming Wave: Technology, Power, and the Twenty-First Century's Greatest Dilemma by Mustafa Suleyman

"World Economic Forum" Davos, 23andMe, 3D printing, active measures, Ada Lovelace, additive manufacturing, agricultural Revolution, AI winter, air gap, Airbnb, Alan Greenspan, algorithmic bias, Alignment Problem, AlphaGo, Alvin Toffler, Amazon Web Services, Anthropocene, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, ASML, autonomous vehicles, backpropagation, barriers to entry, basic income, benefit corporation, Big Tech, biodiversity loss, bioinformatics, Bletchley Park, Blitzscaling, Boston Dynamics, business process, business process outsourcing, call centre, Capital in the Twenty-First Century by Thomas Piketty, ChatGPT, choice architecture, circular economy, classic study, clean tech, cloud computing, commoditize, computer vision, coronavirus, corporate governance, correlation does not imply causation, COVID-19, creative destruction, CRISPR, critical race theory, crowdsourcing, cryptocurrency, cuban missile crisis, data science, decarbonisation, deep learning, deepfake, DeepMind, deindustrialization, dematerialisation, Demis Hassabis, disinformation, drone strike, drop ship, dual-use technology, Easter island, Edward Snowden, effective altruism, energy transition, epigenetics, Erik Brynjolfsson, Ernest Rutherford, Extinction Rebellion, facts on the ground, failed state, Fairchild Semiconductor, fear of failure, flying shuttle, Ford Model T, future of work, general purpose technology, Geoffrey Hinton, global pandemic, GPT-3, GPT-4, hallucination problem, hive mind, hype cycle, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, Internet of things, invention of the wheel, job automation, John Maynard Keynes: technological unemployment, John von Neumann, Joi Ito, Joseph Schumpeter, Kickstarter, lab leak, large language model, Law of Accelerating Returns, Lewis Mumford, license plate recognition, lockdown, machine readable, Marc Andreessen, meta-analysis, microcredit, move 37, Mustafa Suleyman, mutually assured destruction, new economy, Nick Bostrom, Nikolai Kondratiev, off grid, OpenAI, paperclip maximiser, personalized medicine, Peter Thiel, planetary scale, plutocrats, precautionary principle, profit motive, prompt engineering, QAnon, quantum entanglement, ransomware, Ray Kurzweil, Recombinant DNA, Richard Feynman, Robert Gordon, Ronald Reagan, Sam Altman, Sand Hill Road, satellite internet, Silicon Valley, smart cities, South China Sea, space junk, SpaceX Starlink, stealth mode startup, stem cell, Stephen Fry, Steven Levy, strong AI, synthetic biology, tacit knowledge, tail risk, techlash, techno-determinism, technoutopianism, Ted Kaczynski, the long tail, The Rise and Fall of American Growth, Thomas Malthus, TikTok, TSMC, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, uranium enrichment, warehouse robotics, William MacAskill, working-age population, world market for maybe five computers, zero day

Talking about the ethics of machine learning systems is a world away from, say, the technical safety of synthetic bio. These discussions happen in isolated, echoey silos. They rarely break out. Yet I believe they are aspects of what amounts to the same phenomenon; they all aim to address different aspects of the same wave. It’s not enough to have dozens of separate conversations about algorithmic bias or bio-risk or drone warfare or the economic impact of robotics or the privacy implications of quantum computing. It completely underplays how interrelated both causes and effects are. We need an approach that unifies these disparate conversations, encapsulating all those different dimensions of risk, a general-purpose concept for this general-purpose revolution.

In 2022 the White House released a blueprint for an AI Bill of Rights with five core principles “to help guide the design, development, and deployment of artificial intelligence and other automated systems so that they protect the rights of the American public.” Citizens should, it says, be protected from unsafe and ineffective systems and algorithmic bias. No one should be forced to subject themselves to AI. Everyone has the right to say no. Efforts like this should be widely supported and quickly implemented. However, policy makers’ imaginations will need to match the scope of technology. Government needs to go further. For understandable reasons, we don’t let any business build or operate nuclear reactors in any way they see fit.


pages: 276 words: 81,153

Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles – the Algorithms That Control Our Lives by David Sumpter

affirmative action, algorithmic bias, AlphaGo, Bernie Sanders, Brexit referendum, Cambridge Analytica, classic study, cognitive load, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, data science, DeepMind, Demis Hassabis, disinformation, don't be evil, Donald Trump, Elon Musk, fake news, Filter Bubble, Geoffrey Hinton, Google Glasses, illegal immigration, James Webb Space Telescope, Jeff Bezos, job automation, Kenneth Arrow, Loebner Prize, Mark Zuckerberg, meta-analysis, Minecraft, Nate Silver, natural language processing, Nelson Mandela, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, post-truth, power law, prediction markets, random walk, Ray Kurzweil, Robert Mercer, selection bias, self-driving car, Silicon Valley, Skype, Snapchat, social contagion, speech recognition, statistical model, Stephen Hawking, Steve Bannon, Steven Pinker, TED Talk, The Signal and the Noise by Nate Silver, traveling salesman, Turing test

Moreover, if we consider the individuals who weren’t shown the advert, we could be said to have discriminated against males. One in 11 of the men who didn’t see the advert were interested in the job, while only one in 27 of the women who didn’t see the advert were interested. Our new algorithm has a calibration bias that favours females.

Table 6.3: Revised breakdown of the men and women shown an advert for our revised (thought experiment) Facebook campaign.

Unfairness is like those whack-a-mole games at the fairground where the mole keeps popping up in different places. You hammer it down in one place and another one comes out somewhere else.
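To make the arithmetic behind that calibration comparison explicit, here is a small sketch; the counts are invented purely to reproduce the quoted one-in-11 and one-in-27 rates and are not the book's actual Table 6.3.

```python
# Hypothetical counts chosen to match the quoted rates (not the book's Table 6.3):
# among the people who were *not* shown the advert, compare interest rates by gender.
not_shown = {
    "men":   {"interested": 10, "total": 110},   # roughly 1 in 11
    "women": {"interested": 10, "total": 270},   # roughly 1 in 27
}

for group, counts in not_shown.items():
    rate = counts["interested"] / counts["total"]
    print(f"{group}: {rate:.3f} of those not shown the advert were interested")

# The gap between these two rates is the calibration bias described in the text:
# equalising one fairness measure makes unfairness pop up somewhere else.
```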

The unsupervised algorithms looking at what we write are not programmed to be prejudiced. When we look at what they have learnt about us, they simply reflect the prejudice of the social world we live in. I also thought back to my discussion with Michal Kosinski. Michal had been very enthusiastic about the possibility of algorithms eliminating bias. And, as he predicted, researchers were already proposing tools for extracting information about applicants’ qualities and experience from their CVs.9 One Danish start-up, Relink, is using techniques similar to GloVe to summarise cover letters and match applicants to jobs. But looking more deeply at how the GloVe model works, I had found good reason to be cautious about this approach.
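As a rough illustration of the kind of check being described, the sketch below loads pretrained GloVe vectors through the gensim downloader (an assumed toolchain, not necessarily what Sumpter or Relink used) and compares how occupation words sit relative to gendered pronouns.

```python
# Rough sketch: compare how GloVe places occupation words relative to "he" and "she".
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")   # downloads the pretrained vectors on first run

for occupation in ["engineer", "programmer", "nurse", "receptionist"]:
    sim_he = vectors.similarity(occupation, "he")
    sim_she = vectors.similarity(occupation, "she")
    print(f"{occupation:>13}: sim(he)={sim_he:.2f}  sim(she)={sim_she:.2f}")

# Occupations that the training text associates with one gender end up closer to that
# pronoun: the vectors reflect the prejudice of the text they were trained on.
```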


pages: 252 words: 72,473

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil

Affordable Care Act / Obamacare, Alan Greenspan, algorithmic bias, Bernie Madoff, big data - Walmart - Pop Tarts, call centre, Cambridge Analytica, carried interest, cloud computing, collateralized debt obligation, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, data science, disinformation, electronic logging device, Emanuel Derman, financial engineering, Financial Modelers Manifesto, Glass-Steagall Act, housing crisis, I will remember that I didn’t make the world, and it doesn’t satisfy my equations, Ida Tarbell, illegal immigration, Internet of things, late fees, low interest rates, machine readable, mass incarceration, medical bankruptcy, Moneyball by Michael Lewis explains big data, new economy, obamacare, Occupy movement, offshore financial centre, payday loans, peer-to-peer lending, Peter Thiel, Ponzi scheme, prediction markets, price discrimination, quantitative hedge fund, Ralph Nader, RAND corporation, real-name policy, recommendation engine, Rubik’s Cube, Salesforce, Sharpe ratio, statistical model, tech worker, Tim Cook: Apple, too big to fail, Unsafe at Any Speed, Upton Sinclair, Watson beat the top human players on Jeopardy!, working poor

If you plot year-to-year scores on a chart: Gary Rubinstein, “Analyzing Released NYC Value-Added Data Part 2,” Gary Rubinstein’s Blog, February 28, 2012, http://garyrubinstein.teachforus.org/2012/02/28/analyzing-released-nyc-value-added-data-part2/. as the computer scientist Cynthia Dwork has noted: Claire Cain Miller, “Algorithms and Bias: Q. and A. with Cynthia Dwork,” New York Times, August 10, 2015, www.nytimes.com/2015/08/11/upshot/algorithms-and-bias-q-and-a-with-cynthia-dwork.html. Web Transparency and Accountability Project: Elizabeth Dwoskin, “How Social Bias Creeps into Web Technology,” Wall Street Journal, August 21, 2015, www.wsj.com/articles/computers-are-showing-their-biases-and-tech-firms-are-concerned-1440102894.


pages: 245 words: 83,272

Artificial Unintelligence: How Computers Misunderstand the World by Meredith Broussard

"Susan Fowler" uber, 1960s counterculture, A Declaration of the Independence of Cyberspace, Ada Lovelace, AI winter, Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, autonomous vehicles, availability heuristic, barriers to entry, Bernie Sanders, Big Tech, bitcoin, Buckminster Fuller, Charles Babbage, Chris Urmson, Clayton Christensen, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data science, deep learning, Dennis Ritchie, digital map, disruptive innovation, Donald Trump, Douglas Engelbart, driverless car, easy for humans, difficult for computers, Electric Kool-Aid Acid Test, Elon Musk, fake news, Firefox, gamification, gig economy, global supply chain, Google Glasses, Google X / Alphabet X, Greyball, Hacker Ethic, independent contractor, Jaron Lanier, Jeff Bezos, Jeremy Corbyn, John Perry Barlow, John von Neumann, Joi Ito, Joseph-Marie Jacquard, life extension, Lyft, machine translation, Mark Zuckerberg, mass incarceration, Minecraft, minimum viable product, Mother of all demos, move fast and break things, Nate Silver, natural language processing, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, One Laptop per Child (OLPC), opioid epidemic / opioid crisis, PageRank, Paradox of Choice, payday loans, paypal mafia, performance metric, Peter Thiel, price discrimination, Ray Kurzweil, ride hailing / ride sharing, Ross Ulbricht, Saturday Night Live, school choice, self-driving car, Silicon Valley, Silicon Valley billionaire, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, TechCrunch disrupt, Tesla Model S, the High Line, The Signal and the Noise by Nate Silver, theory of mind, traumatic brain injury, Travis Kalanick, trolley problem, Turing test, Uber for X, uber lyft, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, women in the workforce, work culture , yottabyte

Index Abacus, 75 Ability beliefs, 83 Academy at Palumbo, 56–57 Ackerman, Arlene, 58–59 Activism, cyberspace, 82–83 Adair, Bill, 45 AI Now Institute, 194–195 AirBnB, 168 Albrecht, Steve, 159 Alda, Alan, 70 Alexa, 38–39, 72 Alexander, Michelle, 159 Algorithmic accountability reporting, 7, 43–44, 65–66 Algorithms bias in, 44, 150, 155–157, 195 defined, 7, 94 elevator, 157 function of, 43–44 risk, 44, 155–156 tic-tac-toe, 34 Alphabet, 96 AlphaGo, 33–37 Amazon, 115, 158 Analytical engine, 76 Anarcho-capitalism, 83 Anderson, C. W., 46–47 Angwin, Julia, 154–156 App hackathons, 165–174 Apple Watch, 157 Artificial intelligence (AI) beginnings, 69–73 expert systems, 52–53, 179 fantasy of, 132 in film, 31, 32, 198 foundations of, 9 future of, 194–196 games and, 33–37 general, 10–11, 32 narrow, 10–11, 32–33, 97 popularity of, 90 real vs. imagined, 31–32 research, women in, 158 sentience challenge in, 129 Asimov, Isaac, 71 Assembly language, 24 Association for Computing Machinery (ACM), 145 Astrolabe, 76 Asymmetry, positive, 28 Automation technology, 176–177 Autopilot, 121 Availability heuristic, 96 Babbage, Charles, 76–77 Bailiwick (Broussard), 182–185, 190–191, 193 Barlow, John Perry, 82–83 Bell Labs, 13 Bench, Shane, 84 Ben Franklin Racing Team (Little Ben), 122–127 Berkman Klein Center (Harvard), 195 Berners-Lee, Tim, 4–5, 47 Bezos, Jeff, 73, 115 Bias in algorithms, 44, 150, 155–157 in algorithms, racial, 44, 155–156 genius myth and, 83–84 programmers and, 155–158 in risk ratings, 44, 155–156 in STEM fields, 83–84 Bill & Melinda Gates Foundation, 60–61, 157 Bipartisan Campaign Reform Act, 180 Bitcoin, 159 Bizannes, Elias, 165, 166, 171 Blow, Charles, 95 Boggs, David, 67–68 Boole, George, 77 Boolean algebra, 77 Borden, Brisha, 154–155 Borsook, Paulina, 82 Bowhead Systems Management, 137 boyd, danah, 195 Bradley, Earl, 43 Brains 19–20, 95, 128–129, 132, 144, 150 Brand, Stewart, 5, 29, 70, 73, 81–82 Brin, Sergei, 72, 151 Brown, Joshua D., 140, 142 Bump, Philip, 186 Burroughs, William S., 77 Burroughs, William Seward, 77 Calculation vs. 
consciousness, 37 Cali-Fame, 186 California, drug use in, 158–159 Cameron, James, 95 Campaign finance, 177–186, 191 Čapek, Karel, 129 Caprio, Mike, 170–171 Carnegie Mellon University, autonomous vehicle research ALVINN, 131 University Racing Team (Boss), 124, 126–127, 130–131 Cars deaths associated with, 136–138, 146 distracted driving of, 146 human-centered design for, 147 Cars, self-driving 2005 Grand Challenge, 123–124 2007 Grand Challenge, 122–127 algorithms in, 139 artificial intelligence in, 129–131, 133 deaths in, 140 driver-assistance technology from, 135, 146 economics of, 147 experiences in, 121–123, 125–126, 128 fantasy of, 138, 142, 146 GPS hacking, 139 LIDAR guidance system, 139 machine ethics, 144–145, 147 nausea in, 121–123 NHTSA categories for, 134 problems/limitations, 138–140, 142–146 research funding, 133 SAE standards for levels of automation, 134–135 safety, 136–137, 140–142, 143, 146 sentience in, 132 Uber’s use of, 139 Udacity open-source car competition, 135 Waymo technology, 136 CERN, 4–5 Cerulo, Karen A., 28 Chess, 33 Children’s Online Privacy Protection Act (COPPA), 63–64 Chinese Room argument, 38 Choxi, Heteen, 122 Christensen, Clayton, 163 Chrome, 25, 26 Citizens United, 177, 178, 180 Clarke, Arthur C., 71–72 Client-server model, 27 Clinkenbeard, John, 172 Cloud computing, 26, 52, 196 Cohen, Brian, 56–57 Collins, John, 117 Common Core State Standards, 60–61 Communes, 5, 10 Computer ethics, 144–145 Computer Go, 34–36 Computers assumptions about vs. reality of, 8 components, identifying, 21–22 consciousness, 17 early, 196–199 human, 77–78, 198 human brains vs., 19–20, 128–129, 132, 144, 150 human communication vs., 169–170 human mind vs., 38 imagination, 128 limitations, 6–7, 27–28, 37–39 memory, 131 modern-day, development of, 75–79 operating systems, 24–25 in schools, 63–65 sentience, 17, 129 Computer science bias in, 79 ethical training, 145 explaining the world through, 118 women in, 5 Consciousness vs. 
calculation, 37 Constants in programming, 88 Content-management system (CMS), 26 Cooper, Donna, 58 Copeland, Jack, 74–75 Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), 44, 155–156 Cortana, 72 Counterculture, 5, 81–82 Cox, Amanda, 41–42 Crawford, Kate, 194 Crime reporting, 154–155 CTB/McGraw-Hill, 53 Cumberbatch, Benedict, 74 Cyberspace activism, 82–83 DarkMarket, 159 Dark web, 82 Data on campaign finance, 178–179 computer-generated, 18–19 defined, 18 dirty, 104 generating, 18 people and, 57 social construction of, 18 unreasonable effectiveness of, 118–119, 121, 129 Data & Society, 195 DataCamp, 96 Data density theory, 169 Data journalism, 6, 43–47, 196 Data Journalism Awards, 196 Data journalism stories cost-benefit of, 47 on inflation, 41–42 Parliament members’ expenses, 46 on police speeding, 43 on police stops of people of color, 43 price discrimination, 46 on sexual abuse by doctors, 42–43 Data Privacy Lab (Harvard), 195 Data Recognition Corporation (DRC), 53 Datasets in machine learning, 94–95 Data visualizations, 41–42 Deaths distracted driving accidents, 146 from poisoning, 137 from road accidents, 136–138 in self-driving cars, 140 Decision making computational, 12, 43, 150 data-driven, 119 machine learning and, 115–116, 118–119 subjective, 150 Deep Blue (IBM), 33 Deep learning, 33 Defense Advanced Research Projects Agency (DARPA) Grand Challenge, 123, 131, 133, 164 Desmond, Matthew, 115 Detroit race riots story, 44 Dhondt, Rebecca, 58 Diakopoulos, Nicholas, 46 Difference engine, 76 Differential pricing and race, 116 Digital age, 193 Digital revolution, 193–194 Dinakar, Karthik, 195 Django, 45, 89 DocumentCloud, 52, 196 Domino’s, 170 Drone technology, 67–68 Drug marketplace, online, 159–160 Drug use, 80–81, 158–160 Duncan, Arne, 51 Dunier, Mitchell, 115 Edison, Thomas, 77 Education change, implementing in, 62–63 Common Core State Standards, 60–61 competence bar in, 150 computers in schools, 63–65 equality in, 77–78 funding, 60 supplies, availability of, 58 technochauvinist solutions for, 63 textbook availability, 53–60 unpredictability in, 62 18F, 178–179 Electronic Frontier Foundation, 82 Elevators, 156–157 Eliza, 27–28 Emancipation Proclamation, 78 Engelbart, Doug, 25, 80–81 Engineers, ethical training, 145 ENIAC, 71, 194, 196–199 Equality in education, 77–78 techno hostility toward, 83 technological, creating, 87 technology vs., 115, 156 for women, 5, 77–78, 83–85, 158 Essa, Irfan, 46 Ethics, 144–145, 147 EveryBlock, 46 Expertise, cognitive fallacies associated, 83 Expert systems, 52–53, 179 Facebook, 70, 83, 152, 158, 197 Facial recognition, 157 Fact checking, 45–46 Fake news, 154 Family Educational Rights and Privacy Act (FERPA), 63–64 FEC, McCutcheon v., 180 FEC, Speechnow.org v., 180 FEC.gov, 178–179 Film, AI in, 31, 32, 198 FiveThirtyEight.com, 47 Foote, Tully, 122–123, 125 Ford Motor Company, 140 Fowler, Susan, 74 Fraud campaign finance, 180 Internet advertising, 153–154 Free press, role of, 44 Free speech, 82 Fuller, Buckminster, 74 Futurists, 89–90 Games, AI and, 33–37 Gates, Bill, 61 Gates, Melinda, 157–158 Gawker, 83 Gender equality, hostility toward, 83 Gender gap, 5, 84–85, 115, 158 Genius, cult of, 75 Genius myth, 83–84 Ghost-in-the-machine fallacy, 32, 39 Giffords, Gabby, 19–20 GitHub, 135 Go, 33–37 Good Old-Fashioned Artificial Intelligence (GOFAI), 10 Good vs. 
popular, 149–152, 160 Google, 72 Google Docs, 25 Google Maps API, 46 Google Street View, 131 Google X, 138, 151, 158 Government campaign finance, 177–186, 191 cyberspace activism, antigovernment ideology, 82–83 tech hostility toward, 82–83 Graphical user interface (GUI), 25, 72 Greyball, 74 Guardian, 45, 46 Hackathons, 165–174 Hackers, 69–70, 82, 153–154, 169, 173 Halevy, Alon, 119 Hamilton, James T., 47 Harley, Mike, 140 Harris, Melanie, 58–59 Harvard, Andrew, 184 Harvard University Berkman Klein Center, 195 Data Privacy Lab, 195 mathematics department, 84 “Hello, world” program, 13–18 Her, 31 Hern, Alex, 159 Hernandez, Daniel, Jr., 19 Heuristics, 95–96 Hillis, Danny, 73 Hippies, 5, 82 HitchBOT, 69 Hite, William, 58 Hoffman, Brian, 159 Holovaty, Adrian, 45–46 Home Depot, 46, 115, 155 Hooke, Robert, 88 Houghton Mifflin Harcourt (HMH) HP, 157 Hugo, Christoph von, 145 Human-centered design, 147, 177 Human computers, 77–78, 198 Human error, 136–137 Human-in-the-loop systems, 177, 179, 187, 195 Hurst, Alicia, 164 Illinois quarter, 153–154 Imagination, 89–90, 128 Imitation Game, The (film), 74 Information industry, annual pay, 153 Injury mortality, 137 Innovation computational, 25 disruptive, 163, 171 funding, 172–173 hackathons and, 166 Instacart, 171 Intelligence in machine learning Interestingness threshold, 188 International Foundation for Advanced Study, 81 Internet advertising model, 151 browsers, 25, 26 careers, annual pay rates, 153 core values, 150 drug marketplace, 159–160 early development of the, 5, 81 fraud, 153–154 online communities, technolibertarianism in culture of, 82–83 rankings, 72, 150–152 Internet Explorer, 25 Internet pioneers, inspiration for, 5, 81–82 Internet publishing industry, annual pay, 153 Internet search, 72, 150–152 Ito, Joi, 147, 195 Jacquard, Joseph Marie, 76 Java, 89 JavaScript, 89 Jobs, Steve, 25, 70, 72, 80, 81 Jones, Paul Tudor, 187–188 Journalism.


pages: 606 words: 157,120

To Save Everything, Click Here: The Folly of Technological Solutionism by Evgeny Morozov

"World Economic Forum" Davos, 3D printing, algorithmic bias, algorithmic trading, Amazon Mechanical Turk, An Inconvenient Truth, Andrew Keen, augmented reality, Automated Insights, behavioural economics, Berlin Wall, big data - Walmart - Pop Tarts, Buckminster Fuller, call centre, carbon footprint, Cass Sunstein, choice architecture, citizen journalism, classic study, cloud computing, cognitive bias, creative destruction, crowdsourcing, data acquisition, Dava Sobel, digital divide, disintermediation, Donald Shoup, driverless car, East Village, en.wikipedia.org, Evgeny Morozov, Fall of the Berlin Wall, Filter Bubble, Firefox, Francis Fukuyama: the end of history, frictionless, future of journalism, game design, gamification, Gary Taubes, Google Glasses, Ian Bogost, illegal immigration, income inequality, invention of the printing press, Jane Jacobs, Jean Tirole, Jeff Bezos, jimmy wales, Julian Assange, Kevin Kelly, Kickstarter, license plate recognition, lifelogging, lolcat, lone genius, Louis Pasteur, machine readable, Mark Zuckerberg, market fundamentalism, Marshall McLuhan, moral panic, Narrative Science, Nelson Mandela, Nicholas Carr, packet switching, PageRank, Parag Khanna, Paul Graham, peer-to-peer, Peter Singer: altruism, Peter Thiel, pets.com, placebo effect, pre–internet, public intellectual, Ray Kurzweil, recommendation engine, Richard Thaler, Ronald Coase, Rosa Parks, self-driving car, Sheryl Sandberg, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Slavoj Žižek, smart meter, social graph, social web, stakhanovite, Steve Jobs, Steven Levy, Stuxnet, surveillance capitalism, systems thinking, technoutopianism, TED Talk, the built environment, The Chicago School, The Death and Life of Great American Cities, the medium is the message, The Nature of the Firm, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Thomas L Friedman, transaction costs, Twitter Arab Spring, urban decay, urban planning, urban sprawl, Vannevar Bush, warehouse robotics, WikiLeaks, work culture , Yochai Benkler

EdgeRank chooses which news items—from the thousands shared by your friends—you should see when you log into the site. According to Bucher’s research, Facebook wants to feed us stuff with high meme potential; thus, it studies what kinds of stories—from which friends? on which subjects?—users tend to click on most often. Bucher even writes of “the algorithmic bias towards making those stories that signify engagement more visible than those that do not.” There is nothing wrong with this attitude per se, but as with Twitter, it does embody a certain vision of how public life should function and what it should reward, and it does make meme manufacturing easier.

Algorithmic Power and the Threat of Invisibility on Facebook,” New Media & Society 14, no. 7 (2012), available at http://nms.sagepub.com/content/14/7/1164; Bucher, “The Friendship Assemblage: Investigating Programmed Sociality on Facebook,” Television & New Media, August 2012, http://tvn.sagepub.com/content/early/2012/08/14/1527476412452800.abstract; and Bucher, “A Technicity of Attention: How Software Makes Sense,’” Culture Machine 13 (2012), http://culturemachine.net/index.php/cm/article/viewArticle/470. 158 “the algorithmic bias towards making those stories”: Bucher, “Algorithmic Power.” 158 Media scholar C. W. Anderson: C. W. Anderson, “Deliberative, Agonistic, and Algorithmic Audiences: Journalism’s Vision of Its Public in an Age of Audience Transparency,” International Journal of Communication 5 (2011): 529–547. 159 “If something is a total bummer”: quoted in Ryan Holiday, Trust Me, I’m Lying, 62. 159 “The economics of the web”: ibid., 62. 159 “efficiencies and the new breadth of artists”: Christopher Steiner, Automate This: How Algorithms Came to Rule Our World (New York: Portfolio Hardcover, 2012), 86. 160 historians of science Peter Galison and Lorraine Daston: Lorraine J.


pages: 661 words: 156,009

Your Computer Is on Fire by Thomas S. Mullaney, Benjamin Peters, Mar Hicks, Kavita Philip

"Susan Fowler" uber, 2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, AlphaGo, AltaVista, Amazon Mechanical Turk, Amazon Web Services, American Society of Civil Engineers: Report Card, An Inconvenient Truth, Asilomar, autonomous vehicles, Big Tech, bitcoin, Bletchley Park, blockchain, Boeing 737 MAX, book value, British Empire, business cycle, business process, Californian Ideology, call centre, Cambridge Analytica, carbon footprint, Charles Babbage, cloud computing, collective bargaining, computer age, computer vision, connected car, corporate governance, corporate social responsibility, COVID-19, creative destruction, cryptocurrency, dark matter, data science, Dennis Ritchie, deskilling, digital divide, digital map, don't be evil, Donald Davies, Donald Trump, Edward Snowden, en.wikipedia.org, European colonialism, fake news, financial innovation, Ford Model T, fulfillment center, game design, gentrification, George Floyd, glass ceiling, global pandemic, global supply chain, Grace Hopper, hiring and firing, IBM and the Holocaust, industrial robot, informal economy, Internet Archive, Internet of things, Jeff Bezos, job automation, John Perry Barlow, Julian Assange, Ken Thompson, Kevin Kelly, Kickstarter, knowledge economy, Landlord’s Game, Lewis Mumford, low-wage service sector, M-Pesa, Mark Zuckerberg, mass incarceration, Menlo Park, meta-analysis, mobile money, moral panic, move fast and break things, Multics, mutually assured destruction, natural language processing, Neal Stephenson, new economy, Norbert Wiener, off-the-grid, old-boy network, On the Economy of Machinery and Manufactures, One Laptop per Child (OLPC), packet switching, pattern recognition, Paul Graham, pink-collar, pneumatic tube, postindustrial economy, profit motive, public intellectual, QWERTY keyboard, Ray Kurzweil, Reflections on Trusting Trust, Report Card for America’s Infrastructure, Salesforce, sentiment analysis, Sheryl Sandberg, Silicon Valley, Silicon Valley ideology, smart cities, Snapchat, speech recognition, SQL injection, statistical model, Steve Jobs, Stewart Brand, tacit knowledge, tech worker, techlash, technoutopianism, telepresence, the built environment, the map is not the territory, Thomas L Friedman, TikTok, Triangle Shirtwaist Factory, undersea cable, union organizing, vertical integration, warehouse robotics, WikiLeaks, wikimedia commons, women in the workforce, Y2K

Germaine Moore, the suspect in the viral video case, was virtually undetectable, appearing in a video rather than a still image and having never had his photograph logged in a prior arrest database against which to match his face. Further still, as a person of color Moore remains less detectable by facial-recognition algorithms that continue to bias accuracy toward white male faces.8

Apprehending Is Embodied Sensing

To police is to apprehend—to recognize. Recognition is recall, data processing, and identification. To recognize is to bring close, to know again, to make one’s own. Recognition is a condition of proximity and closeness, or how near both beloved and despised people are presumed to be.

Part of my task in recording this history has been to convince these women that their contributions were important and that historians do value what they have to say. 13. Hicks, Programmed Inequality, 1–3. 14. For more on this, and a case study of one of the earliest examples of mainframe-era transphobic algorithmic bias, see Mar Hicks, “Hacking the Cis-tem: Transgender Citizens and the Early Digital State,” IEEE Annals of the History of Computing 41, no. 1 (January–March 2019): 20–33, https://doi.org/10.1109/MAHC.2019.2897667. 15. Hicks, Programmed Inequality, 90–93. 16. Hicks, Programmed Inequality, 93–96. 17.


pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values by Brian Christian

Albert Einstein, algorithmic bias, Alignment Problem, AlphaGo, Amazon Mechanical Turk, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, butterfly effect, Cambridge Analytica, Cass Sunstein, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, data science, deep learning, DeepMind, Donald Knuth, Douglas Hofstadter, effective altruism, Elaine Herzberg, Elon Musk, Frances Oldham Kelsey, game design, gamification, Geoffrey Hinton, Goodhart's law, Google Chrome, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, hedonic treadmill, ImageNet competition, industrial robot, Internet Archive, John von Neumann, Joi Ito, Kenneth Arrow, language acquisition, longitudinal study, machine translation, mandatory minimum, mass incarceration, multi-armed bandit, natural language processing, Nick Bostrom, Norbert Wiener, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, OpenAI, Panopticon Jeremy Bentham, pattern recognition, Peter Singer: altruism, Peter Thiel, precautionary principle, premature optimization, RAND corporation, recommendation engine, Richard Feynman, Rodney Brooks, Saturday Night Live, selection bias, self-driving car, seminal paper, side project, Silicon Valley, Skinner box, sparse data, speech recognition, Stanislav Petrov, statistical model, Steve Jobs, strong AI, the map is not the territory, theory of mind, Tim Cook: Apple, W. E. B. Du Bois, Wayback Machine, zero-sum game

In one of the first articles explicitly addressing the notion of bias in computing systems, the University of Washington’s Batya Friedman and Cornell’s Helen Nissenbaum had warned that “computer systems, for instance, are comparatively inexpensive to disseminate, and thus, once developed, a biased system has the potential for widespread impact. If the system becomes a standard in the field, the bias becomes pervasive.”40 Or, as Buolamwini herself puts it, “Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.”41 After a Rhodes Scholarship at Oxford, Buolamwini came to the MIT Media Lab, and there she began working on an augmented-reality project she dubbed the “Aspire Mirror.” The idea was to project empowering or uplifting visuals onto the user’s face—making the onlooker transform into a lion, for instance.

., “Concrete Problems in AI Safety.” gives an overview of this issue, which comes up in various subsequent chapters of this book. 37. Hardt, “How Big Data Is Unfair.” 38. Jacky Alciné, personal interview, April 19, 2018. 39. Joy Buolamwini, “How I’m Fighting Bias in Algorithms,” https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms. 40. Friedman and Nissenbaum, “Bias in Computer Systems.” 41. Buolamwini, “How I’m Fighting Bias in Algorithms.” 42. Huang et al., “Labeled Faces in the Wild.” 43. Han and Jain, “Age, Gender and Race Estimation from Unconstrained Face Images.” 44. The estimate used here is 252 faces of Black women, arrived at by multiplying the proportion of women in the dataset (2,975/13,233) by the proportion of Black individuals in the dataset (1,122/13,233); numbers from Han and Jain. 45.


pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US by Rana Foroohar

"Susan Fowler" uber, "World Economic Forum" Davos, accounting loophole / creative accounting, Airbnb, Alan Greenspan, algorithmic bias, algorithmic management, AltaVista, Andy Rubin, autonomous vehicles, banking crisis, barriers to entry, behavioural economics, Bernie Madoff, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, book scanning, Brewster Kahle, Burning Man, call centre, Cambridge Analytica, cashless society, clean tech, cloud computing, cognitive dissonance, Colonization of Mars, computer age, corporate governance, creative destruction, Credit Default Swap, cryptocurrency, data is the new oil, data science, deal flow, death of newspapers, decentralized internet, Deng Xiaoping, digital divide, digital rights, disinformation, disintermediation, don't be evil, Donald Trump, drone strike, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Etonian, Evgeny Morozov, fake news, Filter Bubble, financial engineering, future of work, Future Shock, game design, gig economy, global supply chain, Gordon Gekko, Great Leap Forward, greed is good, income inequality, independent contractor, informal economy, information asymmetry, intangible asset, Internet Archive, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, junk bonds, Kenneth Rogoff, life extension, light touch regulation, low interest rates, Lyft, Mark Zuckerberg, Marshall McLuhan, Martin Wolf, Menlo Park, military-industrial complex, move fast and break things, Network effects, new economy, offshore financial centre, PageRank, patent troll, Paul Volcker talking about ATMs, paypal mafia, Peter Thiel, pets.com, price discrimination, profit maximization, race to the bottom, recommendation engine, ride hailing / ride sharing, Robert Bork, Sand Hill Road, search engine result page, self-driving car, shareholder value, sharing economy, Sheryl Sandberg, Shoshana Zuboff, side hustle, Sidewalk Labs, Silicon Valley, Silicon Valley startup, smart cities, Snapchat, SoftBank, South China Sea, sovereign wealth fund, Steve Bannon, Steve Jobs, Steven Levy, stock buybacks, subscription business, supply-chain management, surveillance capitalism, TaskRabbit, tech billionaire, tech worker, TED Talk, Telecommunications Act of 1996, The Chicago School, the long tail, the new new thing, Tim Cook: Apple, too big to fail, Travis Kalanick, trickle-down economics, Uber and Lyft, Uber for X, uber lyft, Upton Sinclair, warehouse robotics, WeWork, WikiLeaks, zero-sum game

Eventually, they decided to “stop being British,” as Shivaun put it to me, and take their case to the regulators, which is how Foundem became the lead complainant in the European Commission’s Google Search antitrust case, launched in 2009. It was led by the tough-as-nails EU competition chief, Margrethe Vestager, who eventually found against the firm in 2017. In compliance with EU law, Google was given eighteen months to figure out a way to rejigger its algorithms to eliminate bias in search. But in late 2018, the Raffs sent a letter to the commissioner, telling her that they were unpersuaded that the Google “compliance mechanism,” which depended once again on its own black box algorithmic formulas, was working. “It has now been more than a year since Google introduced its auction-based ‘remedy’ and the harm to competition, consumers, and innovation caused by Google’s illegal conduct has continued unabated,” they wrote.

“A recurring pattern has developed,” says Frank Pasquale at the University of Maryland, “in which some entity complains about a major Internet company’s practices, the company claims that its critics don’t understand how its algorithms sort and rank content, and befuddled onlookers are left to sift through rival stories in the press.” Companies should be prepared to make themselves open to algorithmic audits, as suggested by mathematician and Big Tech critic Cathy O’Neil, in case of complaints or concerns about algorithmic bias that could allow for discrimination in the workplace, healthcare, education, and so on.7 Individuals should also have their digital rights legalized. Former Wired editor John Battelle has proposed a digital bill of rights that would assign possession of data to its true owner, which is, of course, the user and generator of that data, not the company that made off with it.
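As one concrete, if very partial, illustration of what such an audit can include, the sketch below applies the classic four-fifths (80 percent) rule to hypothetical selection rates; a real audit of the kind O'Neil proposes would go well beyond this single check.

```python
# A sketch of one simple check an algorithmic audit might run (hypothetical numbers):
# the "four-fifths rule" comparing a model's selection rates across groups.
def selection_rate(selected: int, total: int) -> float:
    return selected / total

rates = {
    "group_a": selection_rate(selected=60, total=100),
    "group_b": selection_rate(selected=30, total=100),
}

reference = max(rates.values())
for group, rate in rates.items():
    ratio = rate / reference
    flag = "flag for review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, ratio to top group {ratio:.2f} -> {flag}")
```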


Big Data and the Welfare State: How the Information Revolution Threatens Social Solidarity by Torben Iversen, Philipp Rehm

23andMe, Affordable Care Act / Obamacare, algorithmic bias, barriers to entry, Big Tech, business cycle, centre right, collective bargaining, COVID-19, crony capitalism, data science, DeepMind, deindustrialization, full employment, George Akerlof, income inequality, information asymmetry, invisible hand, knowledge economy, land reform, lockdown, loss aversion, low interest rates, low skilled workers, microbiome, moral hazard, mortgage debt, Network effects, new economy, obamacare, personalized medicine, Ponzi scheme, price discrimination, principal–agent problem, profit maximization, Robert Gordon, speech recognition, subprime mortgage crisis, tail risk, The Market for Lemons, The Rise and Fall of American Growth, union organizing, vertical integration, working-age population

Journal of Economic Perspectives 12(2): 41–62. Lane, Jan-Erik. 2000. New Public Management: An Introduction. Taylor & Francis. Lane, Julia. 2020. Democratizing Our Data: A Manifesto. MIT Press. Lee, Nicol Turner, Paul Resnick, and Genie Barton. 2019. Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms. Brookings Institute. www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/. Leeuwen, Marco H. D. Van. 2016. Mutual Insurance 1550-2015: From Guild Welfare and Friendly Societies to Contemporary Micro-Insurers. Springer.


pages: 345 words: 75,660

Prediction Machines: The Simple Economics of Artificial Intelligence by Ajay Agrawal, Joshua Gans, Avi Goldfarb

Abraham Wald, Ada Lovelace, AI winter, Air France Flight 447, Airbus A320, algorithmic bias, AlphaGo, Amazon Picking Challenge, artificial general intelligence, autonomous vehicles, backpropagation, basic income, Bayesian statistics, Black Swan, blockchain, call centre, Capital in the Twenty-First Century by Thomas Piketty, Captain Sullenberger Hudson, carbon tax, Charles Babbage, classic study, collateralized debt obligation, computer age, creative destruction, Daniel Kahneman / Amos Tversky, data acquisition, data is the new oil, data science, deep learning, DeepMind, deskilling, disruptive innovation, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, financial engineering, fulfillment center, general purpose technology, Geoffrey Hinton, Google Glasses, high net worth, ImageNet competition, income inequality, information retrieval, inventory management, invisible hand, Jeff Hawkins, job automation, John Markoff, Joseph Schumpeter, Kevin Kelly, Lyft, Minecraft, Mitch Kapor, Moneyball by Michael Lewis explains big data, Nate Silver, new economy, Nick Bostrom, On the Economy of Machinery and Manufactures, OpenAI, paperclip maximiser, pattern recognition, performance metric, profit maximization, QWERTY keyboard, race to the bottom, randomized controlled trial, Ray Kurzweil, ride hailing / ride sharing, Robert Solow, Salesforce, Second Machine Age, self-driving car, shareholder value, Silicon Valley, statistical model, Stephen Hawking, Steve Jobs, Steve Jurvetson, Steven Levy, strong AI, The Future of Employment, the long tail, The Signal and the Noise by Nate Silver, Tim Cook: Apple, trolley problem, Turing test, Uber and Lyft, uber lyft, US Airways Flight 1549, Vernor Vinge, vertical integration, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, William Langewiesche, Y Combinator, zero-sum game

Sweeney, “Discrimination in Online Ad Delivery,” Communications of the ACM 56, no. 5 (2013): 44–54, https://dataprivacylab.org/projects/onlineads/. 2. Ibid. 3. “Racism Is Poisoning Online Ad Delivery, Says Harvard Professor,” MIT Technology Review, February 4, 2013, https://www.technologyreview.com/s/510646/racism-is-poisoning-online-ad-delivery-says-harvard-professor/. 4. Anja Lambrecht and Catherine Tucker, “Algorithmic Bias? An Empirical Study into Apparent Gender-Based Discrimination in the Display of STEM Career Ads” (paper presented at the NBER Summer Institute, July 2017). 5. Diane Cardwell and Libby Nelson, “The Fire Dept. Tests That Were Found to Discriminate,” New York Times, July 23, 2009, https://cityroom.blogs.nytimes.com/2009/07/23/the-fire-dept-tests-that-were-found-to-discriminate/?


pages: 301 words: 85,126

AIQ: How People and Machines Are Smarter Together by Nick Polson, James Scott

Abraham Wald, Air France Flight 447, Albert Einstein, algorithmic bias, Amazon Web Services, Atul Gawande, autonomous vehicles, availability heuristic, basic income, Bayesian statistics, Big Tech, Black Lives Matter, Bletchley Park, business cycle, Cepheid variable, Checklist Manifesto, cloud computing, combinatorial explosion, computer age, computer vision, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Donald Trump, Douglas Hofstadter, Edward Charles Pickering, Elon Musk, epigenetics, fake news, Flash crash, Grace Hopper, Gödel, Escher, Bach, Hans Moravec, Harvard Computers: women astronomers, Higgs boson, index fund, information security, Isaac Newton, John von Neumann, late fees, low earth orbit, Lyft, machine translation, Magellanic Cloud, mass incarceration, Moneyball by Michael Lewis explains big data, Moravec's paradox, more computing power than Apollo, natural language processing, Netflix Prize, North Sea oil, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, pattern recognition, Pierre-Simon Laplace, ransomware, recommendation engine, Ronald Reagan, Salesforce, self-driving car, sentiment analysis, side project, Silicon Valley, Skype, smart cities, speech recognition, statistical model, survivorship bias, systems thinking, the scientific method, Thomas Bayes, Uber for X, uber lyft, universal basic income, Watson beat the top human players on Jeopardy!, young professional

Many people, when they hear about something as shocking as a secret algorithm handing down prison sentences in a racially biased way, reach a simple conclusion: that artificial intelligence should play no role whatsoever in the criminal justice system. While we’re as shocked and angry as anyone, we think that’s the wrong conclusion. Yes, we must all fight algorithmic bias when it arises. To do that, we need constant vigilance by experts: people who know the law but who also know AI, and who are empowered to act if they see a threat to justice. But even as we acknowledge the pitfalls of using AI to help people make important decisions, and even as we echo the call for transparency and fairness to become defining values of this new age, let’s not forget that there’s also incredible potential here.


Off the Edge: Flat Earthers, Conspiracy Culture, and Why People Will Believe Anything by Kelly Weill

4chan, Albert Einstein, Alfred Russel Wallace, algorithmic bias, anti-communist, Apollo 11, Big Tech, bitcoin, Comet Ping Pong, coronavirus, COVID-19, crisis actor, cryptocurrency, disinformation, Donald Trump, Elon Musk, fake news, false flag, income inequality, Internet Archive, Isaac Newton, Johannes Kepler, Kevin Roose, Kickstarter, lockdown, Mark Zuckerberg, Mars Society, mass immigration, medical malpractice, moral panic, off-the-grid, QAnon, recommendation engine, side project, Silicon Valley, Silicon Valley startup, Skype, tech worker, Tesla Model S, TikTok, Timothy McVeigh, Wayback Machine, Y2K

When I left a Flat Earth lecture to use the bathroom at a 2018 conference, I returned to find multiple business cards for Flat Earth YouTube channels on my empty chair. While walking from one room to another, someone pushed a CD full of a Flat Earth YouTuber’s music into my hands. While earnest Flat Earthers were handing out business cards and CDs, other opportunists were circling the scene. YouTube’s algorithmic bias toward Flat Earth has led Globe Earth YouTubers to make Flat Earth films and leech off the movement’s growing notoriety. “There was a whisper that was being passed around content creators—not just ours but other people—that if you made a Flat Earth video, you would get more hits and you would get five hundred percent more comments, which track into the YouTube algorithm,” Sargent told me.


pages: 241 words: 70,307

Leadership by Algorithm: Who Leads and Who Follows in the AI Era? by David de Cremer

"Friedman doctrine" OR "shareholder theory", algorithmic bias, algorithmic management, AlphaGo, bitcoin, blockchain, business climate, business process, Computing Machinery and Intelligence, corporate governance, data is not the new oil, data science, deep learning, DeepMind, Donald Trump, Elon Musk, fake news, future of work, job automation, Kevin Kelly, Mark Zuckerberg, meta-analysis, Norbert Wiener, pattern recognition, Peter Thiel, race to the bottom, robotic process automation, Salesforce, scientific management, shareholder value, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, Stephen Hawking, The Future of Employment, Turing test, work culture , workplace surveillance , zero-sum game

Even more so, their inability to take the perspective of others makes them unable to make decisions on behalf of others; as such, they are perceived as incapable of leadership. Let us consider again Amazon’s experiment with using an algorithm to automate its recruitment process. This case taught us that the algorithm duplicated the human bias of favoring men over women for the specific software development jobs being advertised. As I just mentioned, it is not just algorithms, but humans too, that make such biased judgments. The difference is that humans are aware of the social consequences that follow from this biased practice.


pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt by Sinan Aral

Airbnb, Albert Einstein, algorithmic bias, AlphaGo, Any sufficiently advanced technology is indistinguishable from magic, AOL-Time Warner, augmented reality, behavioural economics, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, Cambridge Analytica, carbon footprint, Cass Sunstein, computer vision, contact tracing, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, cryptocurrency, data science, death of newspapers, deep learning, deepfake, digital divide, digital nomad, disinformation, disintermediation, Donald Trump, Drosophila, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Erik Brynjolfsson, experimental subject, facts on the ground, fake news, Filter Bubble, George Floyd, global pandemic, hive mind, illegal immigration, income inequality, Kickstarter, knowledge worker, lockdown, longitudinal study, low skilled workers, Lyft, Mahatma Gandhi, Mark Zuckerberg, Menlo Park, meta-analysis, Metcalfe’s law, mobile money, move fast and break things, multi-sided market, Nate Silver, natural language processing, Neal Stephenson, Network effects, performance metric, phenotype, recommendation engine, Robert Bork, Robert Shiller, Russian election interference, Second Machine Age, seminal paper, sentiment analysis, shareholder value, Sheryl Sandberg, skunkworks, Snapchat, social contagion, social distancing, social graph, social intelligence, social software, social web, statistical model, stem cell, Stephen Hawking, Steve Bannon, Steve Jobs, Steve Jurvetson, surveillance capitalism, Susan Wojcicki, Telecommunications Act of 1996, The Chicago School, the strength of weak ties, The Wisdom of Crowds, theory of mind, TikTok, Tim Cook: Apple, Uber and Lyft, uber lyft, WikiLeaks, work culture , Yogi Berra

Given that Facebook is now the largest news outlet on the planet, with an audience greater than any Western television news network, newspaper, magazine, or online publication, it’s important to consider whether its newsfeed algorithm biases our exposure to different news sources, and whether its content-curation policies favor particular political views. (I’ll discuss whether social media should be regulated like traditional media in Chapter 12, but for now, it’s important to understand how algorithmic curation works. I’ll explore the effects of algorithmic curation on bias and polarization in news consumption in detail in Chapter 10.) Newsfeeds rank content according to its relevance. Each piece of content is given a relevance score that is unique to each of us, and posts appear in our newsfeeds in decreasing order of that score. Relevance is scored by predictive models that learn what drives us to interact with a piece of content.
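To make the mechanics concrete, here is a minimal sketch of the kind of relevance ranking Aral describes. The engagement signals, weights, and post names are invented for illustration; real newsfeed models score far richer signals, and their actual weights are proprietary.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click: float    # model's estimate that this user clicks (0-1)
    predicted_comment: float  # model's estimate that this user comments (0-1)
    predicted_share: float    # model's estimate that this user shares (0-1)

# Illustrative weights only: heavier forms of interaction count for more.
WEIGHTS = {"click": 1.0, "comment": 3.0, "share": 5.0}

def relevance(post: Post) -> float:
    """Combine per-user engagement predictions into a single relevance score."""
    return (WEIGHTS["click"] * post.predicted_click
            + WEIGHTS["comment"] * post.predicted_comment
            + WEIGHTS["share"] * post.predicted_share)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Sort candidate posts into decreasing order of personalized relevance."""
    return sorted(candidates, key=relevance, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("local_news_story", 0.20, 0.02, 0.01),
        Post("friend_photo", 0.60, 0.15, 0.05),
        Post("outrage_meme", 0.50, 0.30, 0.25),
    ])
    for post in feed:
        print(f"{post.post_id}: relevance={relevance(post):.2f}")
```

Whatever the model predicts we will interact with most floats to the top, which is why the question of what those predictive models optimize for matters so much for bias and polarization.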

A National Commission on Technology and Democracy In researching this book, I watched hours of congressional testimony by tech executives like Mark Zuckerberg, Jack Dorsey, Sundar Pichai, and Susan Wojcicki. I watched testimony on privacy, antitrust, election manipulation, data protection, algorithmic bias, and the role of social media in vaccine hesitancy, free speech, political bias, filter bubbles, and fake news. I got one overwhelming feeling from watching congressmen and -women question tech executives: we need more experts leading the way. Charting our technological future will be complex, technical, and nuanced.


pages: 390 words: 109,519

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media by Tarleton Gillespie

4chan, A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, algorithmic management, AltaVista, Amazon Mechanical Turk, borderless world, Burning Man, complexity theory, conceptual framework, crowdsourcing, deep learning, do what you love, Donald Trump, drone strike, easy for humans, difficult for computers, Edward Snowden, eternal september, fake news, Filter Bubble, Gabriella Coleman, game design, gig economy, Google Glasses, Google Hangouts, hiring and firing, Ian Bogost, independent contractor, Internet Archive, Jean Tirole, John Gruber, Kickstarter, Mark Zuckerberg, mass immigration, Menlo Park, Minecraft, moral panic, multi-sided market, Netflix Prize, Network effects, pattern recognition, peer-to-peer, power law, real-name policy, recommendation engine, Rubik’s Cube, Salesforce, sharing economy, Silicon Valley, Skype, slashdot, Snapchat, social graph, social web, Steve Jobs, Stewart Brand, TED Talk, Telecommunications Act of 1996, two-sided market, WikiLeaks, Yochai Benkler

See freedom of speech flagging, (i), (ii), (iii); limited vocabulary of, (i), (ii); can be gamed, (i), (ii); and the drag queen controversy, (i); as strategically valuable for platforms, (i); as labor, (i); who actually does it, (i); as data that could be offered back to users, (i) Flickr (Yahoo): community guidelines, (i), (ii), (iii); approach to moderation (i), (ii), (iii); and the NIPSA guideline, (i); response to international obligations, (i) Flyverbom, Mikkel, (i)n4 Foursquare (social networking platform), (i), (ii), (iii) Free the Nipple, (i) “freedom from porn” (i), (ii) freedom of speech: and the regulation of traditional media, (i), (ii), (iii); and global values, (i), (ii), (iii); as a commitment of platforms, (i), (ii), (iii), (iv), (v), (vi), (vii), (viii); and the early web, (i); and cyberporn, (i); and private intermediaries, (i), (ii), (iii), (iv); and defamation, (i); and digital copyright, (i) Friendster (social networking platform), (i) Gadde, Vitaya (Twitter), (i) game worlds, moderation of, (i), (ii) Gamergate, (i), (ii), (iii) gatekeepers, (i), (ii), (iii) Gawker, (i) #gay, blocked by Tumblr, (i), (ii) Geiger, Stuart, (i) gender, (i), (ii), (iii), (iv) Google: and political content, (i), (ii); community guidelines, (i), (ii), (iii), (iv), (v)n24; approach to moderation, (i), (ii), (iii); and algorithmic bias, (i); and automatic detection, (i); moderation of the Autocomplete function, (i), (ii)n31; and filtering, (i); and fake news / Russian ad controversies, (i), (ii)n40 Google Glass, (i), (ii) Google Image Search, (i), (ii) Google Maps, (i) Google Play (app store), (i), (ii) Google Translate, (i) Google Trends, (i) Google+, (i), (ii), (iii) graphic content, rules regarding, (i) Gray, Mary, (i), (ii)n105 Gregoire, Carolyn, (i) Grindr (dating app), (i), (ii) The Guardian, (i), (ii), (iii) Hansen, Espen Egil, (i) harassment: and Twitter, (i), (ii), (iii), (iv), (v)n2, (vi)n5; before social media, (i); rules regarding, (i), (ii); and flagging, (i), (ii); automated detection of, (i), (ii) Harvey, Del (Twitter), (i) hashing, (i) hate speech: and Twitter, (i); before social media, (i); legal obligations regarding, (i); rules regarding, (i); Apple’s “enemies” rule, (i); automated detection of, (i), (ii); and Google, (i) Heartmob, (i) Hebron, Micol, (i) Heffernan, Virginia, (i) Herrman, John, (i) Hey Facebook, Breastfeeding Is Not Obscene!


pages: 415 words: 102,982

Who’s Raising the Kids?: Big Tech, Big Business, and the Lives of Children by Susan Linn

Albert Einstein, algorithmic bias, Apple's 1984 Super Bowl advert, augmented reality, benefit corporation, Big Tech, big-box store, BIPOC, Black Lives Matter, British Empire, cashless society, clean water, coronavirus, COVID-19, delayed gratification, digital divide, digital rights, disinformation, Donald Trump, Elon Musk, en.wikipedia.org, fake news, gamification, George Floyd, Howard Zinn, impulse control, Internet of things, Isaac Newton, Jaron Lanier, Jeff Bezos, Kevin Roose, Khan Academy, language acquisition, late fees, lockdown, longitudinal study, Mark Zuckerberg, market design, meta-analysis, Minecraft, neurotypical, new economy, Nicholas Carr, planned obsolescence, plant based meat, precautionary principle, Ralph Nader, RAND corporation, randomized controlled trial, retail therapy, Ronald Reagan, Salesforce, Shoshana Zuboff, Silicon Valley, Snapchat, Steve Jobs, surveillance capitalism, techlash, theory of mind, TikTok, Tim Cook: Apple

Robb, News and America’s Kids: How Young People Perceive and Are Impacted by the News (San Francisco: Common Sense Media, 2017), 5. 29. Tate Ryan-Mosley, “How Digital Beauty Filters Perpetuate Colorism,” MIT Technology Review, August 15, 2021, www.technologyreview.com/2021/08/15/1031804/digital-beauty-filters-photoshop-photo-editing-colorism-racism. 30. Alex Hern, “Student Proves Twitter Algorithm ‘Bias’ Toward Lighter, Slimmer, Younger Faces,” The Guardian, August 10, 2021. 31. Christopher P. Barton and Kyle Somerville, Historical Racialized Toys in the United States (Guides to Historical Artifacts) (New York: Routledge, 2016), 17–18. 32. Kirsten Weir, “Raising Anti-Racist Children.


pages: 482 words: 121,173

Tools and Weapons: The Promise and the Peril of the Digital Age by Brad Smith, Carol Ann Browne

"World Economic Forum" Davos, Affordable Care Act / Obamacare, AI winter, air gap, airport security, Alan Greenspan, Albert Einstein, algorithmic bias, augmented reality, autonomous vehicles, barriers to entry, Berlin Wall, Big Tech, Bletchley Park, Blitzscaling, Boeing 737 MAX, business process, call centre, Cambridge Analytica, Celtic Tiger, Charlie Hebdo massacre, chief data officer, cloud computing, computer vision, corporate social responsibility, data science, deep learning, digital divide, disinformation, Donald Trump, Eben Moglen, Edward Snowden, en.wikipedia.org, Hacker News, immigration reform, income inequality, Internet of things, invention of movable type, invention of the telephone, Jeff Bezos, Kevin Roose, Laura Poitras, machine readable, Mark Zuckerberg, minimum viable product, national security letter, natural language processing, Network effects, new economy, Nick Bostrom, off-the-grid, operational security, opioid epidemic / opioid crisis, pattern recognition, precision agriculture, race to the bottom, ransomware, Ronald Reagan, Rubik’s Cube, Salesforce, school vouchers, self-driving car, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, Skype, speech recognition, Steve Ballmer, Steve Jobs, surveillance capitalism, tech worker, The Rise and Fall of American Growth, Tim Cook: Apple, Wargames Reagan, WikiLeaks, women in the workforce

13. Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, “Machine Bias,” ProPublica, May 23, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. 14. The article led to a lively debate about the definition of bias and how to assess the risk of it in AI algorithms. See Matthias Spielkamp, “Inspecting Algorithms for Bias,” MIT Technology Review, June 12, 2017, https://www.technologyreview.com/s/607955/inspecting-algorithms-for-bias/. 15. Joy Buolamwini, “Gender Shades,” Civic Media, MIT Media Lab, accessed November 15, 2018, https://www.media.mit.edu/projects/gender-shades/overview/. Thomas G.


pages: 586 words: 186,548

Architects of Intelligence by Martin Ford

3D printing, agricultural Revolution, AI winter, algorithmic bias, Alignment Problem, AlphaGo, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, Baxter: Rethink Robotics, Bayesian statistics, Big Tech, bitcoin, Boeing 747, Boston Dynamics, business intelligence, business process, call centre, Cambridge Analytica, cloud computing, cognitive bias, Colonization of Mars, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, CRISPR, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, fake news, Fellow of the Royal Society, Flash crash, future of work, general purpose technology, Geoffrey Hinton, gig economy, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, Hans Rosling, hype cycle, ImageNet competition, income inequality, industrial research laboratory, industrial robot, information retrieval, job automation, John von Neumann, Large Hadron Collider, Law of Accelerating Returns, life extension, Loebner Prize, machine translation, Mark Zuckerberg, Mars Rover, means of production, Mitch Kapor, Mustafa Suleyman, natural language processing, new economy, Nick Bostrom, OpenAI, opioid epidemic / opioid crisis, optical character recognition, paperclip maximiser, pattern recognition, phenotype, Productivity paradox, radical life extension, Ray Kurzweil, recommendation engine, Robert Gordon, Rodney Brooks, Sam Altman, self-driving car, seminal paper, sensor fusion, sentiment analysis, Silicon Valley, smart cities, social intelligence, sparse data, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, synthetic biology, systems thinking, Ted Kaczynski, TED Talk, The Rise and Fall of American Growth, theory of mind, Thomas Bayes, Travis Kalanick, Turing test, universal basic income, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working-age population, workplace surveillance , zero-sum game, Zipcar

One threat that is already becoming evident is the vulnerability of interconnected, autonomous systems to cyber attack or hacking. As AI becomes ever more integrated into our economy and society, solving this problem will be one of the most critical challenges we face. Another immediate concern is the susceptibility of machine learning algorithms to bias, in some cases on the basis of race or gender. Many of the individuals I spoke with emphasized the importance of addressing this issue and told of research currently underway in this area. Several also sounded an optimistic note—suggesting that AI may someday prove to be a powerful tool to help combat systemic bias or discrimination.

That said, I agree with you that bias and diversity can be treated a little more separately. For example, in terms of data bias resulting in machine learning outcome bias, a lot of academic researchers are recognizing this now and working on ways to expose that kind of bias. They’re also modifying algorithms to respond to bias and try to correct it. This exposure of the bias in products and technology, from academia to industry, is really healthy, and it keeps the industry on its toes. MARTIN FORD: You must have to deal with machine learning bias at Google. How do you address it? FEI-FEI LI: Google now has a whole group of researchers working on machine learning bias and “explainability,” because the pressure is there to tackle bias, to deliver a better product, and we want to be helping others.


The Smartphone Society by Nicole Aschoff

"Susan Fowler" uber, 4chan, A Declaration of the Independence of Cyberspace, Airbnb, algorithmic bias, algorithmic management, Amazon Web Services, artificial general intelligence, autonomous vehicles, barriers to entry, Bay Area Rapid Transit, Bernie Sanders, Big Tech, Black Lives Matter, blockchain, carbon footprint, Carl Icahn, Cass Sunstein, citizen journalism, cloud computing, correlation does not imply causation, crony capitalism, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, degrowth, Demis Hassabis, deplatforming, deskilling, digital capitalism, digital divide, do what you love, don't be evil, Donald Trump, Downton Abbey, Edward Snowden, Elon Musk, Evgeny Morozov, fake news, feminist movement, Ferguson, Missouri, Filter Bubble, financial independence, future of work, gamification, gig economy, global value chain, Google Chrome, Google Earth, Googley, green new deal, housing crisis, income inequality, independent contractor, Jaron Lanier, Jeff Bezos, Jessica Bruder, job automation, John Perry Barlow, knowledge economy, late capitalism, low interest rates, Lyft, M-Pesa, Mark Zuckerberg, minimum wage unemployment, mobile money, moral panic, move fast and break things, Naomi Klein, Network effects, new economy, Nicholas Carr, Nomadland, occupational segregation, Occupy movement, off-the-grid, offshore financial centre, opioid epidemic / opioid crisis, PageRank, Patri Friedman, peer-to-peer, Peter Thiel, pets.com, planned obsolescence, quantitative easing, Ralph Waldo Emerson, RAND corporation, Ray Kurzweil, RFID, Richard Stallman, ride hailing / ride sharing, Rodney Brooks, Ronald Reagan, Salesforce, Second Machine Age, self-driving car, shareholder value, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Sidewalk Labs, Silicon Valley, single-payer health, Skype, Snapchat, SoftBank, statistical model, Steve Bannon, Steve Jobs, surveillance capitalism, TaskRabbit, tech worker, technological determinism, TED Talk, the scientific method, The Structural Transformation of the Public Sphere, TikTok, transcontinental railway, transportation-network company, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, upwardly mobile, Vision Fund, W. E. B. Du Bois, wages for housework, warehouse robotics, WikiLeaks, women in the workforce, yottabyte

Data & Society is a nonprofit research institute “focused on the social and cultural issues arising from data-centric technological development.” For more, see Caplan et al., “Algorithmic Accountability: A Primer,” prepared for the Congressional Progressive Caucus, Tech Algorithm Briefing, “How Algorithms Perpetuate Racial Bias and Inequality,” Data & Society, April 18, 2018. 21. Diakopoulos and Friedler, “How to Hold Algorithms Accountable.” 22. The term “Green New Deal” has been around for more than a decade, but its recent usage refers to a pair of resolutions submitted to Congress by Representative Alexandria Ocasio-Cortez and Senator Ed Markey and to the broader social movement supporting the ideas articulated in the proposals.

American Civil Liberties Union, May 22, 2018. Cahill, Damien, and Martijn Konings. Neoliberalism. Medford, MA: Polity Press, 2017. Caplan, Robyn, et al. “Algorithmic Accountability: A Primer.” Report originally prepared for the Congressional Progressive Caucus’s Tech Algorithm Briefing: How Algorithms Perpetuate Racial Bias and Inequality. Data & Society, April 18, 2018. Caron, André H., and Letizia Caronia. Moving Cultures: Mobile Communication in Everyday Life. Montreal: McGill-Queen’s University Press, 2007. Carr, Nicholas. “How Smartphones Hijack Our Minds.” Wall Street Journal, October 6, 2017. ———.


pages: 499 words: 144,278

Coders: The Making of a New Tribe and the Remaking of the World by Clive Thompson

"Margaret Hamilton" Apollo, "Susan Fowler" uber, 2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 4chan, 8-hour work day, Aaron Swartz, Ada Lovelace, AI winter, air gap, Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, Andy Rubin, Asperger Syndrome, augmented reality, Ayatollah Khomeini, backpropagation, barriers to entry, basic income, behavioural economics, Bernie Sanders, Big Tech, bitcoin, Bletchley Park, blockchain, blue-collar work, Brewster Kahle, Brian Krebs, Broken windows theory, call centre, Cambridge Analytica, cellular automata, Charles Babbage, Chelsea Manning, Citizen Lab, clean water, cloud computing, cognitive dissonance, computer vision, Conway's Game of Life, crisis actor, crowdsourcing, cryptocurrency, Danny Hillis, data science, David Heinemeier Hansson, deep learning, DeepMind, Demis Hassabis, disinformation, don't be evil, don't repeat yourself, Donald Trump, driverless car, dumpster diving, Edward Snowden, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Ethereum, ethereum blockchain, fake news, false flag, Firefox, Frederick Winslow Taylor, Free Software Foundation, Gabriella Coleman, game design, Geoffrey Hinton, glass ceiling, Golden Gate Park, Google Hangouts, Google X / Alphabet X, Grace Hopper, growth hacking, Guido van Rossum, Hacker Ethic, hockey-stick growth, HyperCard, Ian Bogost, illegal immigration, ImageNet competition, information security, Internet Archive, Internet of things, Jane Jacobs, John Markoff, Jony Ive, Julian Assange, Ken Thompson, Kickstarter, Larry Wall, lone genius, Lyft, Marc Andreessen, Mark Shuttleworth, Mark Zuckerberg, Max Levchin, Menlo Park, meritocracy, microdosing, microservices, Minecraft, move 37, move fast and break things, Nate Silver, Network effects, neurotypical, Nicholas Carr, Nick Bostrom, no silver bullet, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, Oculus Rift, off-the-grid, OpenAI, operational security, opioid epidemic / opioid crisis, PageRank, PalmPilot, paperclip maximiser, pattern recognition, Paul Graham, paypal mafia, Peter Thiel, pink-collar, planetary scale, profit motive, ransomware, recommendation engine, Richard Stallman, ride hailing / ride sharing, Rubik’s Cube, Ruby on Rails, Sam Altman, Satoshi Nakamoto, Saturday Night Live, scientific management, self-driving car, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, single-payer health, Skype, smart contracts, Snapchat, social software, software is eating the world, sorting algorithm, South of Market, San Francisco, speech recognition, Steve Wozniak, Steven Levy, systems thinking, TaskRabbit, tech worker, techlash, TED Talk, the High Line, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, universal basic income, urban planning, Wall-E, Watson beat the top human players on Jeopardy!, WeWork, WikiLeaks, women in the workforce, Y Combinator, Zimmermann PGP, éminence grise

Index Aaron Swartz hackathon, ref1 Abbate, Janet, ref1, ref2, ref3, ref4 Abene, Mark, ref1 Abma, Jobert, ref1 Adams, John, ref1 Addiction by Design (Schüll), ref1 addictive behavior, as side effect of optimization, ref1 Adler, Mortimer J., ref1 Adobe, ref1 advertising, ref1, ref2 African American coders. See minority coders agile development, ref1 AI. See artificial intelligence (AI) Albright, Jonathan, ref1 Alciné, Jacky, ref1 algorithms, ref1, ref2 bias in ranking systems, ref1 scale and, ref1 algorithms challenge whiteboard interview, ref1, ref2, ref3 Algorithms of Oppression (Noble), ref1 Allen, Fran, ref1, ref2 Allen, Paul, ref1 AlphaGo, ref1, ref2 Altman, Sam, ref1, ref2 Amabile, Teresa M., ref1 Amazon, ref1, ref2, ref3 Amazons (board game), ref1 Amazon Web Services, ref1 Analytical Engine, ref1 Anderson, Tom, ref1 AND gate, ref1 Andreessen, Marc, ref1, ref2, ref3, ref4, ref5, ref6, ref7, ref8 Antisocial Media (Vaidhyanathan), ref1 Apple, ref1 Apple I, ref1 Apple iPhone, ref1, ref2 aptitude testing, ref1 architects, ref1 artificial intelligence (AI), ref1 dangers of, warnings about and debate over, ref1 de-biasing of, ref1 deep learning (See deep learning) edge cases and, ref1 expert systems, ref1 Hollywood depiction of, ref1 initial attempts to create, at Dartmouth in 1956, ref1 job listing sites, biased results in, ref1 justice system, effect of AI bias on, ref1 learning problem, ref1 neural nets (See neural nets) racism and sexism, learning of, ref1 artistic temperaments, ref1 Assembly computer language, ref1 Atwood, Jeff, ref1, ref2 Babbage, Charles, ref1, ref2 back-end code, ref1, ref2, ref3, ref4 backpropagation, ref1 “Bad Smells in Code” (Fowler and Beck), ref1 Baffler, The, ref1 Bahnken, A.


pages: 480 words: 119,407

Invisible Women by Caroline Criado Perez

"Hurricane Katrina" Superdome, Affordable Care Act / Obamacare, algorithmic bias, augmented reality, Bernie Sanders, Cambridge Analytica, collective bargaining, crowdsourcing, data science, Diane Coyle, Donald Trump, falling living standards, first-past-the-post, gender pay gap, gig economy, glass ceiling, Grace Hopper, Hacker Ethic, independent contractor, Indoor air pollution, informal economy, lifelogging, low skilled workers, mental accounting, meta-analysis, Nate Silver, new economy, obamacare, Oculus Rift, offshore financial centre, pattern recognition, phenotype, post-industrial society, randomized controlled trial, remote working, Sheryl Sandberg, Silicon Valley, Simon Kuznets, speech recognition, stem cell, Stephen Hawking, Steven Levy, tech bro, the built environment, urban planning, women in the workforce, work culture , zero-sum game

But if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices. And so by not considering the ways in which women’s lives differ from men’s, both on and offline, Gild’s coders inadvertently created an algorithm with a hidden bias against women. But that’s not even the most troubling bit. The most troubling bit is that we have no idea how bad the problem actually is. Most algorithms of this kind are kept secret and protected as proprietary code. This means that we don’t know how these decisions are being made and what biases they are hiding.

The authors of the women = homemaker paper devised a new algorithm that reduced gender stereotyping (e.g. ‘he is to doctor as she is to nurse’) by over two-thirds, while leaving gender-appropriate word associations (e.g. ‘he is to prostate cancer as she is to ovarian cancer’) intact.49 And the authors of the 2017 study on image interpretation devised a new algorithm that decreased bias amplification by 47.5%.

CHAPTER 9 A Sea of Dudes

When Janica Alvarez was trying to raise funds for her tech start-up Naya Health Inc. in 2013, she struggled to get investors to take her seriously. In one meeting, ‘investors Googled the product and ended up on a porn site. They lingered on the page and started cracking jokes’, leaving Alvarez feeling like she was ‘in the middle of a fraternity’.1 Other investors were ‘too grossed out to touch her product or pleaded ignorance’, with one male investor saying ‘I’m not touching that; that’s disgusting.’2 And what was this vile, ‘disgusting’ and incomprehensible product Alvarez was pitching?
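Returning to the debiasing result quoted at the top of this excerpt: below is a minimal sketch of the general idea behind that kind of fix, projecting a learned "gender direction" out of word vectors. The toy three-dimensional vectors are invented, and the published method involves more machinery (PCA over many word pairs, plus equalizing steps), so treat this as an illustration of the principle rather than the paper's algorithm.

```python
import numpy as np

def gender_direction(emb: dict[str, np.ndarray],
                     pairs: list[tuple[str, str]]) -> np.ndarray:
    """Estimate a gender direction from differences of gendered word pairs.

    Averaging the pair differences is a simplified stand-in for the PCA
    step used in the published hard-debiasing method.
    """
    diffs = [emb[a] - emb[b] for a, b in pairs]
    direction = np.mean(diffs, axis=0)
    return direction / np.linalg.norm(direction)

def debias(vector: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Remove the component of a word vector lying along the gender direction."""
    return vector - np.dot(vector, direction) * direction

# Toy 3-D vectors, invented purely for illustration.
emb = {
    "he":    np.array([ 1.0, 0.1, 0.0]),
    "she":   np.array([-1.0, 0.1, 0.0]),
    "nurse": np.array([-0.6, 0.8, 0.2]),  # starts out closer to "she"
}

g = gender_direction(emb, [("he", "she")])
emb["nurse"] = debias(emb["nurse"], g)
print(np.round(emb["nurse"], 3))  # the component along g is now ~0
```

The aim, as the excerpt notes, is to zero out stereotypical associations ('he is to doctor as she is to nurse') while leaving genuinely gendered ones (pregnancy, prostate cancer) alone, which the full method addresses by debiasing only words judged to be gender-neutral.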


pages: 688 words: 147,571

Robot Rules: Regulating Artificial Intelligence by Jacob Turner

"World Economic Forum" Davos, Ada Lovelace, Affordable Care Act / Obamacare, AI winter, algorithmic bias, algorithmic trading, AlphaGo, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, autonomous vehicles, backpropagation, Basel III, bitcoin, Black Monday: stock market crash in 1987, blockchain, brain emulation, Brexit referendum, Cambridge Analytica, Charles Babbage, Clapham omnibus, cognitive dissonance, Computing Machinery and Intelligence, corporate governance, corporate social responsibility, correlation does not imply causation, crowdsourcing, data science, deep learning, DeepMind, Demis Hassabis, distributed ledger, don't be evil, Donald Trump, driverless car, easy for humans, difficult for computers, effective altruism, Elon Musk, financial exclusion, financial innovation, friendly fire, future of work, hallucination problem, hive mind, Internet of things, iterative process, job automation, John Markoff, John von Neumann, Loebner Prize, machine readable, machine translation, medical malpractice, Nate Silver, natural language processing, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, nudge unit, obamacare, off grid, OpenAI, paperclip maximiser, pattern recognition, Peace of Westphalia, Philippa Foot, race to the bottom, Ray Kurzweil, Recombinant DNA, Rodney Brooks, self-driving car, Silicon Valley, Stanislav Petrov, Stephen Hawking, Steve Wozniak, strong AI, technological singularity, Tesla Model S, The Coming Technological Singularity, The Future of Employment, The Signal and the Noise by Nate Silver, trolley problem, Turing test, Vernor Vinge

– You are not permitted to modify any robot to enable it to function as a weapon.103 It remains to be seen though whether and to what extent the European Parliament’s ambitious proposals will be adopted in legislative proposals by the Commission.

4.8 Japanese Initiatives

A June 2016 Report issued by Japan’s Ministry of Internal Affairs and Communications proposed nine principles for developers of AI, which were submitted for international discussion at the G7104 and OECD:

1) Principle of collaboration—Developers should pay attention to the interconnectivity and interoperability of AI systems.
2) Principle of transparency—Developers should pay attention to the verifiability of inputs/outputs of AI systems and the explainability of their judgments.
3) Principle of controllability—Developers should pay attention to the controllability of AI systems.
4) Principle of safety—Developers should take it into consideration that AI systems will not harm the life, body, or property of users or third parties through actuators or other devices.
5) Principle of security—Developers should pay attention to the security of AI systems.
6) Principle of privacy—Developers should take it into consideration that AI systems will not infringe the privacy of users or third parties.
7) Principle of ethics—Developers should respect human dignity and individual autonomy in R&D of AI systems.
8) Principle of user assistance—Developers should take it into consideration that AI systems will support users and make it possible to give them opportunities for choice in appropriate manners.
9) Principle of accountability—Developers should make efforts to fulfill their accountability to stakeholders including users of AI systems.105

Japan emphasised that the above principles were intended to be treated as soft law, but with a view to “accelerate the participation of multistakeholders involved in R&D and utilization of AI… at both national and international levels, in the discussions towards establishing ‘AI R&D Guidelines’ and ‘AI Utilization Guidelines’”.106 Non-governmental groups in Japan have also been active: the Japanese Society for Artificial Intelligence proposed Ethical Guidelines for an Artificial Intelligence Society in February 2017, aimed at its members.107 Fumio Shimpo, a member of the Japanese Government’s Cabinet Office Advisory Board, has proposed his own Eight Principles of the Laws of Robots.108

4.9 Chinese Initiatives

In furtherance of China’s Next Generation Artificial Intelligence Development Plan,109 and as mentioned in Chapter 6, in January 2018 a division of China’s Ministry of Industry and Information Technology released a 98-page White Paper on AI Standardization (the White Paper), the contents of which comprise China’s most comprehensive analysis to date of the ethical challenges raised by AI.110 The White Paper highlights emergent ethical issues in AI including privacy,111 the Trolley Problem,112 algorithmic bias,113 transparency114 and liability for harm caused by AI.115 In terms of AI safety, the White Paper explains that: Because the achieved goals of artificial intelligence technology are influenced by its initial settings, the goal of artificial intelligence design must be to ensure that the design goals of artificial intelligence are consistent with the interests and ethics of most human beings.


pages: 569 words: 165,510

There Is Nothing for You Here: Finding Opportunity in the Twenty-First Century by Fiona Hill

2021 United States Capitol attack, active measures, Affordable Care Act / Obamacare, algorithmic bias, barriers to entry, Berlin Wall, Bernie Sanders, Big Tech, Black Lives Matter, blue-collar work, Boris Johnson, Brexit referendum, British Empire, business climate, call centre, collective bargaining, company town, coronavirus, COVID-19, crony capitalism, cuban missile crisis, David Brooks, deindustrialization, desegregation, digital divide, disinformation, Dissolution of the Soviet Union, Donald Trump, Fall of the Berlin Wall, financial independence, first-past-the-post, food desert, gender pay gap, gentrification, George Floyd, glass ceiling, global pandemic, Great Leap Forward, housing crisis, illegal immigration, imposter syndrome, income inequality, indoor plumbing, industrial cluster, industrial research laboratory, informal economy, Jeff Bezos, Jeremy Corbyn, Kickstarter, knowledge economy, lockdown, low skilled workers, Lyft, Martin Wolf, mass immigration, meme stock, Mikhail Gorbachev, new economy, oil shock, opioid epidemic / opioid crisis, Own Your Own Home, Paris climate accords, pension reform, QAnon, ransomware, restrictive zoning, ride hailing / ride sharing, Right to Buy, Ronald Reagan, self-driving car, Silicon Valley, single-payer health, statistical model, Steve Bannon, The Chicago School, TikTok, transatlantic slave trade, Uber and Lyft, uber lyft, University of East Anglia, urban decay, urban planning, Washington Consensus, WikiLeaks, Winter of Discontent, women in the workforce, working poor, Yom Kippur War, young professional

Sometimes, moreover, they find that the deck is stacked against them even when they perform exceptionally well, as low-income A-level students discovered during the UK algorithm debacle in summer 2020. Interventions in elementary and secondary education are critical in expanding the infrastructure of opportunity, but national government programs frequently miss the mark, and not only when it comes to processes such as testing and issues such as algorithmic bias. Schools in both the UK and the United States get only a fraction of their funding from the central government. K-12 institutions are dependent on local government budgets and local income tax revenues. Unlike universities, which can tap into a mix of government funding, tuition fees, private endowments, and corporate sponsorship, schools outside the private educational sphere are dependent on their immediate ecosystem.


pages: 424 words: 114,905

Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again by Eric Topol

"World Economic Forum" Davos, 23andMe, Affordable Care Act / Obamacare, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Apollo 11, artificial general intelligence, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, Big Tech, bioinformatics, blockchain, Cambridge Analytica, cloud computing, cognitive bias, Colonization of Mars, computer age, computer vision, Computing Machinery and Intelligence, conceptual framework, creative destruction, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, dark matter, data science, David Brooks, deep learning, DeepMind, Demis Hassabis, digital twin, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, fake news, fault tolerance, gamification, general purpose technology, Geoffrey Hinton, George Santayana, Google Glasses, ImageNet competition, Jeff Bezos, job automation, job satisfaction, Joi Ito, machine translation, Mark Zuckerberg, medical residency, meta-analysis, microbiome, move 37, natural language processing, new economy, Nicholas Carr, Nick Bostrom, nudge unit, OpenAI, opioid epidemic / opioid crisis, pattern recognition, performance metric, personalized medicine, phenotype, placebo effect, post-truth, randomized controlled trial, recommendation engine, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, Skinner box, speech recognition, Stephen Hawking, techlash, TED Talk, text mining, the scientific method, Tim Cook: Apple, traumatic brain injury, trolley problem, War on Poverty, Watson beat the top human players on Jeopardy!, working-age population

., “Bias Detectives: The Researchers Striving to Make Algorithms Fair,” Nature. 2018. 37. Simonite, T., “Using Artificial Intelligence to Fix Wikipedia’s Gender Problem,” Wired. 2018. 38. Miller, A. P., “Want Less-Biased Decisions? Use Algorithms,” Harvard Business Review. 2018; Thomas, R., “What HBR Gets Wrong About Algorithms and Bias,” Fast AI. 2018. 39. Adamson, A. S., and A. Smith, “Machine Learning and Health Care Disparities in Dermatology.” JAMA Dermatol, 2018. 40. Harari, Y. N., Homo Deus. 2016. New York: HarperCollins, p. 348. 41. Lee, K. F., “The Real Threat of Artificial Intelligence,” New York Times. 2017. 42.


Designing the Mind: The Principles of Psychitecture by Designing the Mind, Ryan A Bush

Abraham Maslow, adjacent possible, Albert Einstein, algorithmic bias, augmented reality, butterfly effect, carbon footprint, cognitive bias, cognitive load, correlation does not imply causation, data science, delayed gratification, deliberate practice, drug harm reduction, effective altruism, Elon Musk, en.wikipedia.org, endowment effect, fundamental attribution error, hedonic treadmill, hindsight bias, impulse control, Kevin Kelly, Lao Tzu, lifelogging, longitudinal study, loss aversion, meta-analysis, Own Your Own Home, pattern recognition, price anchoring, randomized controlled trial, Silicon Valley, Stanford marshmallow experiment, Steven Pinker, systems thinking, Walter Mischel

Under normal circumstances, the planning fallacy would be activated: You would consult your intuitions and output the conclusion “one week from now.” We’ll call this mode of reasoning the “inside view.” But if you have sufficient metacognitive awareness and familiarity with this bias, you will have the opportunity to step in, design a counter-algorithm, and rewire the bias, which we call cognitive revision. Daniel Kahneman, a psychologist and pioneer in cognitive bias research, suggests a workaround for this particular bias: Using such distributional information from other ventures similar to that being forecasted is called taking an 'outside view' and is the cure to the planning fallacy
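As a toy illustration of what 'taking the outside view' amounts to in practice: rather than consulting your intuition for a completion date, you look at how long comparable projects actually took and read your forecast off that distribution. The reference-class data and the 80th-percentile rule below are invented for the example.

```python
import statistics

def outside_view_forecast(similar_durations_days: list[float]) -> dict[str, float]:
    """Reference-class forecast: estimate from how long similar projects
    actually took, rather than from an optimistic inside-view guess."""
    ordered = sorted(similar_durations_days)
    p80_index = int(0.8 * (len(ordered) - 1))  # a more conservative planning figure
    return {"median": statistics.median(ordered), "p80": ordered[p80_index]}

# Invented reference class: how long past "one-week" projects really took.
past_projects = [9, 12, 7, 15, 10, 21, 8, 14]
print(outside_view_forecast(past_projects))  # {'median': 11.0, 'p80': 14}
```

The counter-algorithm is simply to let the distribution of past outcomes, not the inside view, supply the estimate.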

Watts adds: The whole process of nature is an integrated process of immense complexity, and it’s really impossible to tell whether anything that happens in it is good or bad — because you never know what will be the consequence of the misfortune; or, you never know what will be the consequences of good fortune. The psychological study of affective forecasting shows that we oversimplify not only the world but also our predictions of our own emotional states. Daniel Gilbert, a leading psychologist studying affective forecasting, discovered that humans share an algorithm he called impact bias, which causes us to poorly forecast how we will feel about a certain event or decision, how intense that feeling will be, and how long it will last. In other words, our internal emotion-simulator is just as flawed as our life-simulator.4 In his book, Stumbling on Happiness, Gilbert names several principles behind our shortcomings.


pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity by Daron Acemoglu, Simon Johnson

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, 4chan, agricultural Revolution, AI winter, Airbnb, airline deregulation, algorithmic bias, algorithmic management, Alignment Problem, AlphaGo, An Inconvenient Truth, artificial general intelligence, augmented reality, basic income, Bellingcat, Bernie Sanders, Big Tech, Bletchley Park, blue-collar work, British Empire, carbon footprint, carbon tax, carried interest, centre right, Charles Babbage, ChatGPT, Clayton Christensen, clean water, cloud computing, collapse of Lehman Brothers, collective bargaining, computer age, Computer Lib, Computing Machinery and Intelligence, conceptual framework, contact tracing, Corn Laws, Cornelius Vanderbilt, coronavirus, corporate social responsibility, correlation does not imply causation, cotton gin, COVID-19, creative destruction, declining real wages, deep learning, DeepMind, deindustrialization, Demis Hassabis, Deng Xiaoping, deskilling, discovery of the americas, disinformation, Donald Trump, Douglas Engelbart, Douglas Engelbart, Edward Snowden, Elon Musk, en.wikipedia.org, energy transition, Erik Brynjolfsson, European colonialism, everywhere but in the productivity statistics, factory automation, facts on the ground, fake news, Filter Bubble, financial innovation, Ford Model T, Ford paid five dollars a day, fulfillment center, full employment, future of work, gender pay gap, general purpose technology, Geoffrey Hinton, global supply chain, Gordon Gekko, GPT-3, Grace Hopper, Hacker Ethic, Ida Tarbell, illegal immigration, income inequality, indoor plumbing, industrial robot, interchangeable parts, invisible hand, Isaac Newton, Jacques de Vaucanson, James Watt: steam engine, Jaron Lanier, Jeff Bezos, job automation, Johannes Kepler, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph-Marie Jacquard, Kenneth Arrow, Kevin Roose, Kickstarter, knowledge economy, labor-force participation, land reform, land tenure, Les Trente Glorieuses, low skilled workers, low-wage service sector, M-Pesa, manufacturing employment, Marc Andreessen, Mark Zuckerberg, megacity, mobile money, Mother of all demos, move fast and break things, natural language processing, Neolithic agricultural revolution, Norbert Wiener, NSO Group, offshore financial centre, OpenAI, PageRank, Panopticon Jeremy Bentham, paperclip maximiser, pattern recognition, Paul Graham, Peter Thiel, Productivity paradox, profit maximization, profit motive, QAnon, Ralph Nader, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Robert Bork, Robert Gordon, Robert Solow, robotic process automation, Ronald Reagan, scientific management, Second Machine Age, self-driving car, seminal paper, shareholder value, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, social intelligence, Social Responsibility of Business Is to Increase Its Profits, social web, South Sea Bubble, speech recognition, spice trade, statistical model, stem cell, Steve Jobs, Steve Wozniak, strikebreaker, subscription business, Suez canal 1869, Suez crisis 1956, supply-chain management, surveillance capitalism, tacit knowledge, tech billionaire, technoutopianism, Ted Nelson, TED Talk, The Future of Employment, The Rise and Fall of American Growth, The Structural Transformation of the Public Sphere, The Wealth of Nations by Adam Smith, theory of mind, Thomas Malthus, too big to fail, total factor productivity, trade route, transatlantic slave trade, trickle-down economics, 
Turing machine, Turing test, Twitter Arab Spring, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, union organizing, universal basic income, Unsafe at Any Speed, Upton Sinclair, upwardly mobile, W. E. B. Du Bois, War on Poverty, WikiLeaks, wikimedia commons, working poor, working-age population

Slave Life in Georgia: A Narrative of the Life, Sufferings, and Escape of John Brown, a Fugitive Slave, Now in England. Edited by Louis Alexis Chamerovzow. https://docsouth.unc.edu/neh/jbrown/jbrown.html. Brown, Megan A., James Bisbee, Angela Lai, Richard Bonneau, Jonathan Nagler, and Joshua A. Tucker. 2022. “Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users.” May 25. https://ssrn.com/abstract=4114905. Brundage, Vernon Jr. 2017. “Profile of the Labor Force by Educational Attainment.” US Bureau of Labor Statistics, Spotlight on Statistics, www.bls.gov/spotlight/2017/educational-attainment-of-the-labor-force.


pages: 364 words: 99,897

The Industries of the Future by Alec Ross

"World Economic Forum" Davos, 23andMe, 3D printing, Airbnb, Alan Greenspan, algorithmic bias, algorithmic trading, AltaVista, Anne Wojcicki, autonomous vehicles, banking crisis, barriers to entry, Bernie Madoff, bioinformatics, bitcoin, Black Lives Matter, blockchain, Boston Dynamics, Brian Krebs, British Empire, business intelligence, call centre, carbon footprint, clean tech, cloud computing, collaborative consumption, connected car, corporate governance, Credit Default Swap, cryptocurrency, data science, David Brooks, DeepMind, Demis Hassabis, disintermediation, Dissolution of the Soviet Union, distributed ledger, driverless car, Edward Glaeser, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fiat currency, future of work, General Motors Futurama, global supply chain, Google X / Alphabet X, Gregor Mendel, industrial robot, information security, Internet of things, invention of the printing press, Jaron Lanier, Jeff Bezos, job automation, John Markoff, Joi Ito, Kevin Roose, Kickstarter, knowledge economy, knowledge worker, lifelogging, litecoin, low interest rates, M-Pesa, machine translation, Marc Andreessen, Mark Zuckerberg, Max Levchin, Mikhail Gorbachev, military-industrial complex, mobile money, money: store of value / unit of account / medium of exchange, Nelson Mandela, new economy, off-the-grid, offshore financial centre, open economy, Parag Khanna, paypal mafia, peer-to-peer, peer-to-peer lending, personalized medicine, Peter Thiel, precision agriculture, pre–internet, RAND corporation, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Rubik’s Cube, Satoshi Nakamoto, selective serotonin reuptake inhibitor (SSRI), self-driving car, sharing economy, Silicon Valley, Silicon Valley startup, Skype, smart cities, social graph, software as a service, special economic zone, supply-chain management, supply-chain management software, technoutopianism, TED Talk, The Future of Employment, Travis Kalanick, underbanked, unit 8200, Vernor Vinge, Watson beat the top human players on Jeopardy!, women in the workforce, work culture , Y Combinator, young professional

Because big data often relies on historical data or at least the status quo, it can easily reproduce discrimination against disadvantaged racial and ethnic minorities. The propensity models used in many algorithms can bake in a bias against someone who lived in the zip code of a low-income neighborhood at any point in his or her life. If an algorithm used by human resources companies queries your social graph and positively weighs candidates with the most existing connections to a workforce, it makes it more difficult to break in in the first place. In effect, these algorithms can hide bias behind a curtain of code. Big data is, by its nature, soulless and uncreative. It nudges us this way and that for reasons we are not meant to understand.
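A deliberately crude sketch of the kind of scoring rule Ross is describing makes the problem visible. The weights and features below are invented; the structural point is that a model rewarding existing ties to the workforce quietly penalizes anyone outside the network, while looking perfectly neutral as code.

```python
def hiring_score(skills: float, connections_to_current_staff: int) -> float:
    """Toy candidate-ranking rule with invented weights.

    Rewarding connections to the existing workforce looks neutral, but it
    systematically favors candidates who already resemble that workforce.
    """
    return skills + 0.5 * connections_to_current_staff

well_connected = hiring_score(skills=7.0, connections_to_current_staff=4)  # 9.0
outsider = hiring_score(skills=8.0, connections_to_current_staff=0)        # 8.0
print(well_connected > outsider)  # True: the less-skilled insider outranks the outsider
```

Nothing in that function mentions race, class, or zip code, yet run at scale it reproduces whoever the workforce already is, which is exactly how bias hides behind a curtain of code.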


pages: 285 words: 86,853

What Algorithms Want: Imagination in the Age of Computing by Ed Finn

Airbnb, Albert Einstein, algorithmic bias, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, bitcoin, blockchain, business logic, Charles Babbage, Chuck Templeton: OpenTable:, Claude Shannon: information theory, commoditize, Computing Machinery and Intelligence, Credit Default Swap, crowdsourcing, cryptocurrency, data science, DeepMind, disruptive innovation, Donald Knuth, Donald Shoup, Douglas Engelbart, Douglas Engelbart, Elon Musk, Evgeny Morozov, factory automation, fiat currency, Filter Bubble, Flash crash, game design, gamification, Google Glasses, Google X / Alphabet X, Hacker Conference 1984, High speed trading, hiring and firing, Ian Bogost, industrial research laboratory, invisible hand, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Markoff, Just-in-time delivery, Kickstarter, Kiva Systems, late fees, lifelogging, Loebner Prize, lolcat, Lyft, machine readable, Mother of all demos, Nate Silver, natural language processing, Neal Stephenson, Netflix Prize, new economy, Nicholas Carr, Nick Bostrom, Norbert Wiener, PageRank, peer-to-peer, Peter Thiel, power law, Ray Kurzweil, recommendation engine, Republic of Letters, ride hailing / ride sharing, Satoshi Nakamoto, self-driving car, sharing economy, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Silicon Valley startup, SimCity, Skinner box, Snow Crash, social graph, software studies, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, supply-chain management, tacit knowledge, TaskRabbit, technological singularity, technological solutionism, technoutopianism, the Cathedral and the Bazaar, The Coming Technological Singularity, the scientific method, The Signal and the Noise by Nate Silver, The Structural Transformation of the Public Sphere, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, Turing machine, Turing test, Uber and Lyft, Uber for X, uber lyft, urban planning, Vannevar Bush, Vernor Vinge, wage slave

The apparent transparency and simplicity of computational systems are leading many to see them as vehicles for unbiased decision-making. Companies like UpStart and ZestFinance view computation as a way to judge financial reliability and make loans to people who fail more traditional algorithmic tests of credit-worthiness, like credit scores.14 These systems essentially deploy algorithms to counter the bias of other algorithms, or more cynically to identify business opportunities missed by others. The companies behind these systems are relatively unusual, however, in acknowledging the ideological framing of their business plans, and explicitly addressing how their systems attempt to judge “character.”


pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI by Frank Pasquale

affirmative action, Affordable Care Act / Obamacare, Airbnb, algorithmic bias, Amazon Mechanical Turk, Anthropocene, augmented reality, Automated Insights, autonomous vehicles, basic income, battle of ideas, Bernie Sanders, Big Tech, Bill Joy: nanobots, bitcoin, blockchain, Brexit referendum, call centre, Cambridge Analytica, carbon tax, citizen journalism, Clayton Christensen, collective bargaining, commoditize, computer vision, conceptual framework, contact tracing, coronavirus, corporate social responsibility, correlation does not imply causation, COVID-19, critical race theory, cryptocurrency, data is the new oil, data science, decarbonisation, deep learning, deepfake, deskilling, digital divide, digital twin, disinformation, disruptive innovation, don't be evil, Donald Trump, Douglas Engelbart, driverless car, effective altruism, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fake news, Filter Bubble, finite state, Flash crash, future of work, gamification, general purpose technology, Google Chrome, Google Glasses, Great Leap Forward, green new deal, guns versus butter model, Hans Moravec, high net worth, hiring and firing, holacracy, Ian Bogost, independent contractor, informal economy, information asymmetry, information retrieval, interchangeable parts, invisible hand, James Bridle, Jaron Lanier, job automation, John Markoff, Joi Ito, Khan Academy, knowledge economy, late capitalism, lockdown, machine readable, Marc Andreessen, Mark Zuckerberg, means of production, medical malpractice, megaproject, meta-analysis, military-industrial complex, Modern Monetary Theory, Money creation, move fast and break things, mutually assured destruction, natural language processing, new economy, Nicholas Carr, Nick Bostrom, Norbert Wiener, nuclear winter, obamacare, One Laptop per Child (OLPC), open immigration, OpenAI, opioid epidemic / opioid crisis, paperclip maximiser, paradox of thrift, pattern recognition, payday loans, personalized medicine, Peter Singer: altruism, Philip Mirowski, pink-collar, plutocrats, post-truth, pre–internet, profit motive, public intellectual, QR code, quantitative easing, race to the bottom, RAND corporation, Ray Kurzweil, recommendation engine, regulatory arbitrage, Robert Shiller, Rodney Brooks, Ronald Reagan, self-driving car, sentiment analysis, Shoshana Zuboff, Silicon Valley, Singularitarianism, smart cities, smart contracts, software is eating the world, South China Sea, Steve Bannon, Strategic Defense Initiative, surveillance capitalism, Susan Wojcicki, tacit knowledge, TaskRabbit, technological solutionism, technoutopianism, TED Talk, telepresence, telerobotics, The Future of Employment, The Turner Diaries, Therac-25, Thorstein Veblen, too big to fail, Turing test, universal basic income, unorthodox policies, wage slave, Watson beat the top human players on Jeopardy!, working poor, workplace surveillance , Works Progress Administration, zero day

Mike Butcher, “The Robot-Recruiter Is Coming—VCV’s AI Will Read Your Face in a Job Interview,” TechCrunch, April 23, 2019, https://techcrunch.com/2019/04/23/the-robot-recruiter-is-coming-vcvs-ai-will-read-your-face-in-a-job-interview/. 2. Miranda Bogen and Aaron Rieke, Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias (Washington, DC: Upturn, 2018), https://www.upturn.org/static/reports/2018/hiring-algorithms/files/Upturn%20—%20Help%20Wanted%20-%20An%20Exploration%20of%20Hiring%20Algorithms,%20Equity%20and%20Bias.pdf. 3. There are numerous examples of person-judging technology reinscribing and reinforcing old forms of unearned privilege and unfair disadvantage.


pages: 410 words: 119,823

Radical Technologies: The Design of Everyday Life by Adam Greenfield

3D printing, Airbnb, algorithmic bias, algorithmic management, AlphaGo, augmented reality, autonomous vehicles, bank run, barriers to entry, basic income, bitcoin, Black Lives Matter, blockchain, Boston Dynamics, business intelligence, business process, Californian Ideology, call centre, cellular automata, centralized clearinghouse, centre right, Chuck Templeton: OpenTable:, circular economy, cloud computing, Cody Wilson, collective bargaining, combinatorial explosion, Computer Numeric Control, computer vision, Conway's Game of Life, CRISPR, cryptocurrency, David Graeber, deep learning, DeepMind, dematerialisation, digital map, disruptive innovation, distributed ledger, driverless car, drone strike, Elon Musk, Ethereum, ethereum blockchain, facts on the ground, fiat currency, fulfillment center, gentrification, global supply chain, global village, Goodhart's law, Google Glasses, Herman Kahn, Ian Bogost, IBM and the Holocaust, industrial robot, informal economy, information retrieval, Internet of things, Jacob Silverman, James Watt: steam engine, Jane Jacobs, Jeff Bezos, Jeff Hawkins, job automation, jobs below the API, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, John von Neumann, joint-stock company, Kevin Kelly, Kickstarter, Kiva Systems, late capitalism, Leo Hollis, license plate recognition, lifelogging, M-Pesa, Mark Zuckerberg, means of production, megacity, megastructure, minimum viable product, money: store of value / unit of account / medium of exchange, natural language processing, Network effects, New Urbanism, Nick Bostrom, Occupy movement, Oculus Rift, off-the-grid, PalmPilot, Pareto efficiency, pattern recognition, Pearl River Delta, performance metric, Peter Eisenman, Peter Thiel, planetary scale, Ponzi scheme, post scarcity, post-work, printed gun, proprietary trading, RAND corporation, recommendation engine, RFID, rolodex, Rutger Bregman, Satoshi Nakamoto, self-driving car, sentiment analysis, shareholder value, sharing economy, Shenzhen special economic zone , Sidewalk Labs, Silicon Valley, smart cities, smart contracts, social intelligence, sorting algorithm, special economic zone, speech recognition, stakhanovite, statistical model, stem cell, technoutopianism, Tesla Model S, the built environment, The Death and Life of Great American Cities, The Future of Employment, Tony Fadell, transaction costs, Uber for X, undersea cable, universal basic income, urban planning, urban sprawl, vertical integration, Vitalik Buterin, warehouse robotics, When a measure becomes a target, Whole Earth Review, WikiLeaks, women in the workforce

It will therefore have problems with accurate identification when presented with a black Camaro. Or it could suffer from the opposite problem, bias. In the context of machine learning, bias means that even after extensive training, an algorithm has failed to acquire anything essential at all about the set of target objects it’s being asked to identify. An algorithm displaying high bias is basically taking random stabs in the dark, however much confidence it may seem to be mustering in its labeling, and will without hesitation identify outright static as a house, a whale or a chair. (We should be careful to distinguish this sense of the word from its more usual, pejorative sense, in which the implicit prejudices of the party responsible for training an algorithm are reflected in its output—though that happens too, as on the notorious occasion on which a Google Images algorithm identified a picture of black people as “gorillas,” apparently because the only training images labeled “people” it had ever been provided had light skin.)6 However they might undermine an algorithm’s practical utility, or embarrass the software developers involved, errors of bias and overfitting can be corrected.
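
Greenfield's distinction between bias (underfitting) and overfitting can be seen in a tiny experiment. The sketch below is a hypothetical illustration assuming only NumPy; the sine-wave data, the polynomial degrees, and the mean-squared-error metric are illustrative choices, not anything drawn from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a smooth signal plus noise, split into training and held-out points.
x = np.linspace(0, 1, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)
x_tr, y_tr = x[::2], y[::2]
x_te, y_te = x[1::2], y[1::2]

def fit_and_score(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coefs = np.polyfit(x_tr, y_tr, degree)
    mse = lambda xs, ys: np.mean((np.polyval(coefs, xs) - ys) ** 2)
    return mse(x_tr, y_tr), mse(x_te, y_te)

for degree in (0, 3, 9):
    train_err, test_err = fit_and_score(degree)
    print(f"degree {degree}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")

# Degree 0 shows high bias: it cannot capture the sine wave at all, so it does
# badly on both sets ("stabs in the dark"). Degree 9 has enough capacity to
# chase the training noise, so its training error falls while its held-out
# error tends to creep back up (overfitting). Degree 3 tracks the signal
# without memorizing the noise.
```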


pages: 448 words: 117,325

Click Here to Kill Everybody: Security and Survival in a Hyper-Connected World by Bruce Schneier

23andMe, 3D printing, air gap, algorithmic bias, autonomous vehicles, barriers to entry, Big Tech, bitcoin, blockchain, Brian Krebs, business process, Citizen Lab, cloud computing, cognitive bias, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Heinemeier Hansson, disinformation, Donald Trump, driverless car, drone strike, Edward Snowden, Elon Musk, end-to-end encryption, fault tolerance, Firefox, Flash crash, George Akerlof, incognito mode, industrial robot, information asymmetry, information security, Internet of things, invention of radio, job automation, job satisfaction, John Gilmore, John Markoff, Kevin Kelly, license plate recognition, loose coupling, market design, medical malpractice, Minecraft, MITM: man-in-the-middle, move fast and break things, national security letter, Network effects, Nick Bostrom, NSO Group, pattern recognition, precautionary principle, printed gun, profit maximization, Ralph Nader, RAND corporation, ransomware, real-name policy, Rodney Brooks, Ross Ulbricht, security theater, self-driving car, Seymour Hersh, Shoshana Zuboff, Silicon Valley, smart cities, smart transportation, Snapchat, sparse data, Stanislav Petrov, Stephen Hawking, Stuxnet, supply-chain attack, surveillance capitalism, The Market for Lemons, Timothy McVeigh, too big to fail, Uber for X, Unsafe at Any Speed, uranium enrichment, Valery Gerasimov, Wayback Machine, web application, WikiLeaks, Yochai Benkler, zero day


Calling Bullshit: The Art of Scepticism in a Data-Driven World by Jevin D. West, Carl T. Bergstrom

airport security, algorithmic bias, AlphaGo, Amazon Mechanical Turk, Andrew Wiles, Anthropocene, autism spectrum disorder, bitcoin, Charles Babbage, cloud computing, computer vision, content marketing, correlation coefficient, correlation does not imply causation, crowdsourcing, cryptocurrency, data science, deep learning, deepfake, delayed gratification, disinformation, Dmitri Mendeleev, Donald Trump, Elon Musk, epigenetics, Estimating the Reproducibility of Psychological Science, experimental economics, fake news, Ford Model T, Goodhart's law, Helicobacter pylori, Higgs boson, invention of the printing press, John Markoff, Large Hadron Collider, longitudinal study, Lyft, machine translation, meta-analysis, new economy, nowcasting, opioid epidemic / opioid crisis, p-value, Pluto: dwarf planet, publication bias, RAND corporation, randomized controlled trial, replication crisis, ride hailing / ride sharing, Ronald Reagan, selection bias, self-driving car, Silicon Valley, Silicon Valley startup, social graph, Socratic dialogue, Stanford marshmallow experiment, statistical model, stem cell, superintelligent machines, systematic bias, tech bro, TED Talk, the long tail, the scientific method, theory of mind, Tim Cook: Apple, twin studies, Uber and Lyft, Uber for X, uber lyft, When a measure becomes a target

Sky News. July 4, 2019. McCool, John H. “Opinion: Why I Published in a Predatory Journal.” The Scientist. April 6, 2017. Merton, R. K. “Priorities in Scientific Discovery: A Chapter in the Sociology of Science.” American Sociological Review 22 (1957): 635–59. “Mortgage Algorithms Perpetuate Racial Bias in Lending, Study Finds.” Press release. University of California, Berkeley. November 13, 2018. “NASA Twins Study Confirms Preliminary Findings.” Press release. National Aeronautics and Space Administration. January 31, 2018. https://www.nasa.gov/​feature/​nasa-twins-study-confirms-preliminary-findings.


pages: 296 words: 78,631

Hello World: Being Human in the Age of Algorithms by Hannah Fry

23andMe, 3D printing, Air France Flight 447, Airbnb, airport security, algorithmic bias, algorithmic management, augmented reality, autonomous vehicles, backpropagation, Brixton riot, Cambridge Analytica, chief data officer, computer vision, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Douglas Hofstadter, driverless car, Elon Musk, fake news, Firefox, Geoffrey Hinton, Google Chrome, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, John Markoff, Mark Zuckerberg, meta-analysis, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, pattern recognition, Peter Thiel, RAND corporation, ransomware, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, Shai Danziger, Silicon Valley, Silicon Valley startup, Snapchat, sparse data, speech recognition, Stanislav Petrov, statistical model, Stephen Hawking, Steven Levy, systematic bias, TED Talk, Tesla Model S, The Wisdom of Crowds, Thomas Bayes, trolley problem, Watson beat the top human players on Jeopardy!, web of trust, William Langewiesche, you are the product

(TV show) 97–9 John Carter (film) 180 Johnson, Richard 50, 51 Jones Beach 1 Jones, Robert 13–14 judges anchoring effect 73 bail, factors for consideration 73 decision-making consistency in 51 contradictions in 52–3 differences in 52 discretion in 53 unbiased 77 judges (continued) discrimination and bias 70–1, 75 intuition and considered thought 72 lawyers’ preference over algorithms 76–7 vs machines 59–61 offenders’ preference over algorithms 76 perpetuation of bias 73 sentencing 53–4, 63 use of algorithms 63, 64 Weber’s Law 74–5 Jukebox 192 junk algorithms 200 Just Noticeable Difference 74 justice 49–78 algorithms and 54–6 justification for 77 appeals process 51 Brixton riots 49–51 by country Australia 53 Canada 54 England 54 Ireland 54 Scotland 54 United States 53, 54 Wales 54 discretion of judges 53 discrimination 70–1 humans vs machines 59–61, 62–4 hypothetical cases (UK research) 52–3 defendants appearing twice 52–3 differences in judgement 52, 53 hypothetical cases (US research) 51–2 differences in judgements 52 differences in sentencing 52 inherent injustice 77 machine bias 65–71 maximum terms 54 purpose of 77–8 re-offending 54, 55 reasonable doubt 51 rehabilitation 55 risk-assessment algorithms 56 sentencing consistency in 51 mitigating factors in 53 substantial grounds 51 Kadoodle 15–16 Kahneman, Daniel 72 Kanevsky, Dr Jonathan 93, 95 kangaroos 128 Kant, Immanuel 185 Kasparov, Gary 5-7, 202 Kelly, Frank 87 Kerner, Winifred 188–9 Kernighan, Brian x Killingbeck 145, 146 Larson, Steve 188–9 lasers 119–20 Leibniz, Gottfried 184 Leroi, Armand 186, 192–3 level 0 (driverless technology) 131 level 1 (driverless technology) 131 level 2 (driverless technology) 131, 136 careful attention 134–5 level 3 (driverless technology) 131 technical challenge 136 level 4 (driverless technology) 131 level 5 (driverless technology) 131 Li Yingyun 45 Lickel, Charles 97–8 LiDAR (Light Detection and Ranging) 119–20 life insurance 109 ‘Lockdown’ (52Metro) 177 logic 8 logical instructions 8 London Bridge 172 London School of Economics (LSE) 129 Loomis, Eric 217n38 Los Angeles Police Department 152, 155 Lucas, Teghan 161–2, 163 machine-learning algorithms 10–11 neural networks 85–6 random forests 58–9 machines art and 194 bias in 65–71 diagnostic 98–101, 110–11 domination of humans 5-6 vs humans 59–61, 62–4 paradoxical relationship with 22–3 recognising images 84–7 superior judgement of 16 symbolic dominance over humans 5-6 Magic Test 199 magical illusions 18 mammogram screenings 94, 96 manipulation 39–44 micro-manipulation 42–4 Maple, Jack 147–50 Marx, Gary 173 mastectomies 83, 84, 92, 94 maternity wards, deaths on 81 mathematical certainty 68 mathematical objects 8 McGrayne, Sharon Bertsch 122 mechanized weaving machines 2 Medicaid assistance 16–17 medical conditions, algorithms for 96–7 medical records 102–7 benefits of algorithms 106 DeepMind 104–5 disconnected 102–3 misuse of data 106 privacy 105–7 medicine 79–112 in ancient times 80 cancer diagnoses study 79–80 complexity of 103–4 diabetic retinopathy 96 diagnostic machines 98–101, 110–11 choosing between individuals and the population 111 in fifteenth-century China 81 Hippocrates and 80 magic and 80 medical records 102–6 neural networks 85–6, 95, 96, 219–20n11 in nineteenth-century Europe 81 pathology 79, 82–3 patterns in data 79–81 predicting dementia 90–2 scientific base 80 see also Watson (IBM computer) Meehl, Paul 21–2 MegaFace challenge 168–9 Mercedes 125–6 microprocessors x Millgarth 145, 146 Mills, Tamara 101–2, 103 MIT Technology Review 101 
modern inventions 2 Moses, Robert 1 movies see films music 176–80 choosing 176–8 diversity of charts 186 emotion and 189 genetic algorithms 191–2 hip hop 186 piano experiment 188–90 algorithm 188, 189–91 popularity 177, 178 quality 179, 180 terrible, success of 178–9 Music Lab 176–7, 179, 180 Musk, Elon 138 MyHeritage 110 National Geographic ­Genographic project 110 National Highway Traffic Safety Administration 135 Navlab 117 Netflix 8, 188 random forests 59 neural networks 85–6, 95, 119, 201, 219–20n11 driverless cars 117–18 in facial recognition 166–7 predicting performances of films 183 New England Journal of ­Medicine 94 New York City subway crime 147–50 anti-social behaviour 149 fare evasion 149 hotspots 148, 149 New York Police Department (NYPD) 172 New York Times 116 Newman, Paul 127–8, 130 NHS (National Health Service) computer virus in hospitals 105 data security record 105 fax machines 103 linking of healthcare records 102–3 paper records 103 prioritization of non-smokers for operations 106 nuclear war 18–19 Nun Study 90–2 obesity 106 OK Cupid 9 Ontario 169–70 openworm project 13 Operation Lynx 145–7 fingerprints 145 overruling algorithms correctly 19–20 incorrectly 20–1 Oxbotica 127 Palantir Technologies 31 Paris Auto Show (2016) 124–5 parole 54–5 Burgess’s forecasting power 55–6 violation of 55–6 passport officers 161, 164 PathAI 82 pathologists 82 vs algorithms 88 breast cancer research on corpses 92–3 correct diagnoses 83 differences of opinion 83–4 diagnosing cancerous tumours 90 sensitivity and 88 specificity and 88 pathology 79, 82 and biology 82–3 patterns in data 79–81, 103, 108 payday lenders 35 personality traits 39 advertising and 40–1 inferred by algorithm 40 research on 39–40 Petrov, Stanislav 18–19 piano experiment 188–90 pigeons 79–80 Pomerleau, Dean 118–19 popularity 177, 178, 179, 183–4 power 5–24 blind faith in algorithms 13–16 overruling algorithms 19–21 struggle between humans and algorithms 20–4 trusting algorithms 16–19 power of veto 19 Pratt, Gill 137 precision in justice 53 prediction accuracy of 66, 67, 68 algorithms vs humans 22, 59–61, 62–5 Burgess 55–6 of crime burglary 150–1 HunchLab algorithm 157–8 PredPol algorithm 152–7, 158 risk factor 152 Strategic Subject List algorithm 158 decision trees 56–8 dementia 90–2 prediction (continued) development of abnormalities 87, 95 homicide 62 of personality 39–42 of popularity 177, 178, 179, 180, 183–4 powers of 92–6 of pregnancy 29–30 re-offending criminals 55–6 recidivism 62, 63–4, 65 of successful films 180–1, 182–3, 183 superiority of algorithms 22 see also Clinical vs Statistical Prediction (Meehl); neural networks predictive text 190–1 PredPol (PREDictive POL­icing) 152–7, 158, 228–9n27 assessing locations at risk 153–4 cops on the dots 155–6 fall in crime 156 feedback loop 156–7 vs humans, test 153–4 target hardening 154–5 pregnancy prediction 29–30 prescriptive sentencing systems 53, 54 prioritization algorithms 8 prisons cost of incarceration 61 Illinois 55, 56 reduction in population 61 privacy 170, 172 false sense of 47 issues 25 medical records 105–7 overriding of 107 sale of data 36–9 probabilistic inference 124, 127 probability 8 ProPublica 65–8, 70 quality 179, 180 ‘good’ changing nature of 184 defining 184 quantifying 184–8 difficulty of 184 Washington Post experiment 185–6 racial groups COMPAS algorithm 65–6 rates of arrest 68 radar 119–20 RAND Corporation 158 random forests technique 56–9 rape 141, 142 re-offending 54 prediction of 55–6 social types of inmates 55, 56 recidivism 56, 62, 201 
rates 61 risk scores 63–4, 65 regulation of algorithms 173 rehabilitation 55 relationships 9 Republican voters 41 Rhode Island 61 Rio de Janeiro–Galeão International Airport 132 risk scores 63–4, 65 Robinson, Nicholas 49, 50, 50–1, 77 imprisonment 51 Rossmo, Kim 142–3 algorithm 145–7 assessment of 146 bomb factories 147 buffer zone 144 distance decay 144 flexibility of 146 stagnant water pools 146–7 Operation Lynx 145–7 Rotten Tomatoes website 181 Royal Free NHS Trust 222–3n48 contract with DeepMind 104–5 access to full medical histories 104–5 outrage at 104 Rubin’s vase 211n13 rule-based algorithms 10, 11, 85 Rutherford, Adam 110 Safari browser 47 Sainsbury’s 27 Salganik, Matthew 176–7, 178 Schmidt, Eric 28 School Sisters of Notre Dame 90, 91 Science magazine 15 Scunthorpe 2 search engines 14–15 experiment 14–15 Kadoodle 15–16 Semmelweis, Ignaz 81 sensitivity, principle of 87, 87–8 sensors 120 sentencing algorithms for 62–4 COMPAS 63, 64 considerations for 62–3 consistency in 51 length of 62–3 influencing 73 Weber’s Law 74–5 mitigating factors in 53 prescriptive systems 53, 54 serial offenders 144, 145 serial rapists 141–2 Sesame Credit 45–6, 168 sexual attacks 141–2 shoplifters 170 shopping habits 28, 29, 31 similarity 187 Slash X (bar) 113, 114, 115 smallpox inoculation 81 Snowden, David 90–2 social proof 177–8, 179 Sorensen, Alan 178 Soviet Union detection of enemy missiles 18 protecting air space 18 retaliatory action 19 specificity, principle of 87, 87–8 speech recognition algorithms 9 Spotify 176, 188 Spotify Discover 188 Sreenivasan, Sameet 181–2 Stammer, Neil 172 Standford University 39–40 STAT website 100 statistics 143 computational 12 modern 107 NYPD 172 Stilgoe, Jack 128–9, 130 Strategic Subject List 158 subway crime see New York City subway crime supermarkets 26–8 superstores 28–31 Supreme Court of Wisconsin 64, 217n38 swine flu 101–2 Talley, Steve 159, 162, 163–4, 171, 230n47 Target 28–31 analysing unusual data ­patterns 28–9 expectant mothers 28–9 algorithm 29, 30 coupons 29 justification of policy 30 teenage pregnancy incident 29–30 target hardening 154–5 teenage pregnancy 29–30 Tencent YouTu Lab algorithm 169 Tesco 26–8 Clubcard 26, 27 customers buying behaviour 26–7 knowledge about 27 loyalty of 26 vouchers 27 online shopping 27–8 ‘My Favourites’ feature 27–8 removal of revealing items 28 Tesla 134, 135 autopilot system 138 full autonomy 138 full self-driving hardware 138 Thiel, Peter 31 thinking, ways of 72 Timberlake, Justin 175–6 Timberlake, Justin (artist) 175–6 Tolstoy, Leo 194 TomTom sat-nav 13–14 Toyota 137, 210n13 chauffeur mode 139 guardian mode 139 trolley problem 125–6 true positives 67 Trump election campaign 41, 44 trust 17–18 tumours 90, 93–4 Twain, Mark 193 Twitter 36, 37, 40 filtering 10 Uber driverless cars 135 human intervention 135 uberPOOL 10 United Kingdom (UK) database of facial images 168 facial recognition algorithms 161 genetic tests for Huntington’s disease 110 United States of America (USA) database of facial images 168 facial recognition algorithms 161 life insurance stipulations 109 linking of healthcare ­records 103 University of California 152 University of Cambridge research on personality traits 39–40 and advertising 40–1 algorithm 40 personality predictions 40 and Twitter 40 University of Oregon 188–90 University of Texas M.


pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking by Matthew Syed

adjacent possible, agricultural Revolution, Alfred Russel Wallace, algorithmic bias, behavioural economics, Bletchley Park, Boeing 747, call centre, Cass Sunstein, classic study, cognitive load, computer age, crowdsourcing, cuban missile crisis, deep learning, delayed gratification, drone strike, Elon Musk, Erik Brynjolfsson, Fairchild Semiconductor, fake news, Ferguson, Missouri, Filter Bubble, Firefox, invention of writing, James Dyson, Jeff Bezos, knowledge economy, lateral thinking, market bubble, mass immigration, microbiome, Mitch Kapor, persistent metabolic adaptation, Peter Thiel, post-truth, Richard Thaler, Ronald Reagan, Second Machine Age, self-driving car, seminal paper, Silicon Valley, social intelligence, Steve Jobs, Steve Wozniak, Stuart Kauffman, tech worker, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, traveling salesman, vertical integration

But if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices. And so by not considering ways in which women’s lives differ from men’s, both on and offline, Gild’s coders inadvertently created an algorithm with a hidden bias against women.’ 6 https://hbr.org/2019/06/why-you-should-create-a-shadow-board-of-younger-employees 7 https://www.npr.org/2015/09/14/440215976/journalist-says-the-drone-strike-that-killed-awlaki-did-not-silence-him Footnotes 1 Collective Blindness FN1 This was partly about the fear that gay staff, particularly those who had not come out, might be subject to blackmail.


pages: 208 words: 57,602

Futureproof: 9 Rules for Humans in the Age of Automation by Kevin Roose

"World Economic Forum" Davos, adjacent possible, Airbnb, Albert Einstein, algorithmic bias, algorithmic management, Alvin Toffler, Amazon Web Services, Atul Gawande, augmented reality, automated trading system, basic income, Bayesian statistics, Big Tech, big-box store, Black Lives Matter, business process, call centre, choice architecture, coronavirus, COVID-19, data science, deep learning, deepfake, DeepMind, disinformation, Elon Musk, Erik Brynjolfsson, factory automation, fake news, fault tolerance, Frederick Winslow Taylor, Freestyle chess, future of work, Future Shock, Geoffrey Hinton, George Floyd, gig economy, Google Hangouts, GPT-3, hiring and firing, hustle culture, hype cycle, income inequality, industrial robot, Jeff Bezos, job automation, John Markoff, Kevin Roose, knowledge worker, Kodak vs Instagram, labor-force participation, lockdown, Lyft, mandatory minimum, Marc Andreessen, Mark Zuckerberg, meta-analysis, Narrative Science, new economy, Norbert Wiener, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, off-the-grid, OpenAI, pattern recognition, planetary scale, plutocrats, Productivity paradox, QAnon, recommendation engine, remote working, risk tolerance, robotic process automation, scientific management, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, social distancing, Steve Jobs, Stuart Kauffman, surveillance capitalism, tech worker, The Future of Employment, The Wealth of Nations by Adam Smith, TikTok, Travis Kalanick, Uber and Lyft, uber lyft, universal basic income, warehouse robotics, Watson beat the top human players on Jeopardy!, work culture


pages: 289 words: 87,292

The Strange Order of Things: The Biological Roots of Culture by Antonio Damasio

Albert Einstein, algorithmic bias, biofilm, business process, CRISPR, Daniel Kahneman / Amos Tversky, double helix, Gordon Gekko, invention of the wheel, invention of writing, invisible hand, job automation, mental accounting, meta-analysis, microbiome, Nick Bostrom, Norbert Wiener, pattern recognition, Peter Singer: altruism, planetary scale, post-truth, profit motive, Ray Kurzweil, Richard Feynman, self-driving car, Silicon Valley, Steven Pinker, Stuart Kauffman, Thomas Malthus

On the other hand, the public generally lacks the time and the method to convert massive amounts of information into sensible and practically usable conclusions. Moreover, the companies that manage the distribution and aggregation of the information assist the public in a dubious way: the flow of information is directed by company algorithms that, in turn, bias the presentation so as to suit a variety of financial, political, and social interests, not to mention the tastes of users so that they can continue within their own entertaining silo of opinions. One should acknowledge, in fairness, that the voices of wisdom from the past—the voices of experienced and thoughtful editors of newspapers and radio and television programs—were also biased and favored particular views of how societies should function.


pages: 225 words: 70,241

Silicon City: San Francisco in the Long Shadow of the Valley by Cary McClelland

affirmative action, Airbnb, algorithmic bias, Apple II, autonomous vehicles, barriers to entry, Black Lives Matter, Burning Man, clean water, cloud computing, cognitive dissonance, Columbine, computer vision, creative destruction, driverless car, El Camino Real, Elon Musk, Fairchild Semiconductor, full employment, gamification, gentrification, gig economy, Golden Gate Park, Google bus, Google Glasses, high net worth, housing crisis, housing justice, income inequality, John Gilmore, John Perry Barlow, Joseph Schumpeter, Loma Prieta earthquake, Lyft, mass immigration, means of production, Menlo Park, Mitch Kapor, open immigration, PalmPilot, rent control, Salesforce, San Francisco homelessness, self-driving car, sharing economy, Silicon Valley, Skype, Social Justice Warrior, Steve Jobs, Steve Wozniak, TaskRabbit, tech bro, tech worker, transcontinental railway, Travis Kalanick, Uber and Lyft, uber lyft, urban planning, vertical integration, William Shockley: the traitorous eight, young professional

Simultaneously, if this is not your discipline—and I’m not just saying, “Are you a person of color? Are you a woman?”—if this is not your discipline, you can be helpful, but you cannot lead this conversation. Because we will tell you, you cannot program your way out of this problem. There is no algorithm that will solve this. Bias continues to be embedded in the code. Many of the solutions that will bring about racial-social justice, put an end to police violence, do those kinds of things—need to have, at their root, technology. Technology, to scale. Technology, for accessibility. Technology, for low-cost solutions and low barriers to entry.


pages: 268 words: 75,850

The Formula: How Algorithms Solve All Our Problems-And Create More by Luke Dormehl

3D printing, algorithmic bias, algorithmic trading, Alvin Toffler, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, big data - Walmart - Pop Tarts, call centre, Cass Sunstein, classic study, Clayton Christensen, commoditize, computer age, death of newspapers, deferred acceptance, disruptive innovation, Edward Lorenz: Chaos theory, Erik Brynjolfsson, Evgeny Morozov, Filter Bubble, Flash crash, Florence Nightingale: pie chart, Ford Model T, Frank Levy and Richard Murnane: The New Division of Labor, fulfillment center, Google Earth, Google Glasses, High speed trading, Internet Archive, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, John Markoff, Kevin Kelly, Kodak vs Instagram, Lewis Mumford, lifelogging, machine readable, machine translation, Marshall McLuhan, means of production, Nate Silver, natural language processing, Netflix Prize, Panopticon Jeremy Bentham, Paradox of Choice, pattern recognition, price discrimination, recommendation engine, Richard Thaler, Rosa Parks, scientific management, self-driving car, sentiment analysis, Silicon Valley, Silicon Valley startup, Slavoj Žižek, social graph, speech recognition, stable marriage problem, Steve Jobs, Steven Levy, Steven Pinker, Stewart Brand, technological determinism, technological solutionism, TED Talk, the long tail, the scientific method, The Signal and the Noise by Nate Silver, upwardly mobile, Wall-E, Watson beat the top human players on Jeopardy!, Y Combinator

While Berk’s comments are designed to get actionable information to predict future criminality, one could argue that by black-boxing the inner workings of the technology, something similar has taken place with the underlying social dynamics. In other areas—particularly as relate to law—a reliance on algorithms might simply justify existing bias and lack of understanding, in the same way that the “filter bubble” effect described in Chapter 1 can result in some people not being presented with certain pieces of information, which may take the form of opportunities. “It’s not just you and I who don’t understand how these algorithms work—the engineers themselves don’t understand them entirely,” says scholar Ted Striphas.


The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic bias, algorithmic trading, AlphaGo, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, backpropagation, Baxter: Rethink Robotics, behavioural economics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, data science, deep learning, DeepMind, delayed gratification, Demis Hassabis, Dennis Ritchie, discovery of DNA, Donald Trump, Douglas Engelbart, driverless car, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, Jim Simons, John Conway, John Markoff, John von Neumann, language acquisition, Large Hadron Collider, machine readable, Mark Zuckerberg, Minecraft, natural language processing, Neil Armstrong, Netflix Prize, Norbert Wiener, OpenAI, orbital mechanics / astrodynamics, PageRank, pattern recognition, pneumatic tube, prediction markets, randomized controlled trial, Recombinant DNA, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, world market for maybe five computers, X Prize, Yogi Berra

Cowan, Manhattan Project to the Santa Fe Institute: The Memoirs of George A. Cowan (Albuquerque: University of New Mexico Press, 2010). 4. Google’s PageRank algorithm, which was invented by Google founders Larry Page and Sergey Brin, uses links to a webpage to rank the importance of pages on the Internet. It has since been elaborated with many layers of algorithms to manipulate the bias on searches. 5. A. D. I. Kramer, J. E. Guillory, and J. T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences of the United States of America 111, no. 24 (2014): 8788–8790. 6. Stuart Kauffman, The Origins of Order: Self Organization and Selection in Evolution (New York: Oxford University Press, 1993). 7.
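
Sejnowski's note describes PageRank only at a high level: a page matters if pages that matter link to it. Below is a minimal sketch of the textbook power-iteration formulation, not Google's production system; the damping factor, the toy link structure, and the function name are illustrative assumptions.

```python
import numpy as np

def pagerank(links, damping=0.85, iters=100):
    """Textbook PageRank by power iteration.

    links[i] lists the pages that page i links out to; the returned vector
    sums to 1, with higher values for pages that well-ranked pages point at.
    """
    n = len(links)
    rank = np.full(n, 1.0 / n)
    for _ in range(iters):
        new = np.full(n, (1.0 - damping) / n)
        for i, outgoing in enumerate(links):
            if outgoing:  # share this page's rank along its outlinks
                share = damping * rank[i] / len(outgoing)
                for j in outgoing:
                    new[j] += share
            else:         # dangling page: spread its rank evenly
                new += damping * rank[i] / n
        rank = new
    return rank

# Four toy pages; page 0 is linked to by all the others, so it ranks highest.
print(pagerank([[1, 2], [0], [0, 1], [0]]))
```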


pages: 428 words: 103,544

The Data Detective: Ten Easy Rules to Make Sense of Statistics by Tim Harford

Abraham Wald, access to a mobile phone, Ada Lovelace, affirmative action, algorithmic bias, Automated Insights, banking crisis, basic income, behavioural economics, Black Lives Matter, Black Swan, Bretton Woods, British Empire, business cycle, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, Charles Babbage, clean water, collapse of Lehman Brothers, contact tracing, coronavirus, correlation does not imply causation, COVID-19, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, David Attenborough, Diane Coyle, disinformation, Donald Trump, Estimating the Reproducibility of Psychological Science, experimental subject, fake news, financial innovation, Florence Nightingale: pie chart, Gini coefficient, Great Leap Forward, Hans Rosling, high-speed rail, income inequality, Isaac Newton, Jeremy Corbyn, job automation, Kickstarter, life extension, meta-analysis, microcredit, Milgram experiment, moral panic, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, opioid epidemic / opioid crisis, Paul Samuelson, Phillips curve, publication bias, publish or perish, random walk, randomized controlled trial, recommendation engine, replication crisis, Richard Feynman, Richard Thaler, rolodex, Ronald Reagan, selection bias, sentiment analysis, Silicon Valley, sorting algorithm, sparse data, statistical model, stem cell, Stephen Hawking, Steve Bannon, Steven Pinker, survivorship bias, systematic bias, TED Talk, universal basic income, W. E. B. Du Bois, When a measure becomes a target


pages: 317 words: 87,048

Other Pandemic: How QAnon Contaminated the World by James Ball

"World Economic Forum" Davos, 2021 United States Capitol attack, 4chan, Abraham Wald, algorithmic bias, Bellingcat, Bernie Sanders, Big Tech, Black Lives Matter, blockchain, Boris Johnson, Charles Babbage, cognitive dissonance, Comet Ping Pong, coronavirus, COVID-19, crowdsourcing, deepfake, deplatforming, disinformation, Donald Trump, Elon Musk, en.wikipedia.org, fake news, false flag, Gabriella Coleman, global pandemic, green transition, housing justice, informal economy, Jeffrey Epstein, Jeremy Corbyn, John Perry Barlow, Jon Ronson, Julian Assange, lab leak, lockdown, lolcat, Mark Zuckerberg, meta-analysis, Minecraft, nuclear winter, paperclip maximiser, Peter Thiel, Piers Corbyn, post-truth, pre–internet, QAnon, real-name policy, Russell Brand, side project, Silicon Valley, Silicon Valley billionaire, Snapchat, social contagion, Steve Bannon, survivorship bias, TikTok, trade route, We are Anonymous. We are Legion, WikiLeaks

With concerted time and research we may get better at helping those pulled in to conspiratorial dreamlands to wake up (sheeple).12 But for now, we can start with a few principles – the first being not to waste time arguing the details of each ‘fact’ or sign that Q or some other online messiah says is actually prophetic. Instead, we can try to start where people are and pull them back into communities outside the digital conspiracies. We can also try to make people think about the online ecosystem and their place in it – to see how a toxic mixture of algorithmic curation and online survivorship bias in communities can radicalise first your information ecosystem and then your mind. All of that is slow, laborious work, with absolutely no guarantee of success – which can only lead us to the conclusion that online, just as offline, prevention is surely better than cure. Digital Public Health The things we can do to stop people getting sick are always a lot cheaper than treating them once they get there.



pages: 472 words: 117,093

Machine, Platform, Crowd: Harnessing Our Digital Future by Andrew McAfee, Erik Brynjolfsson

"World Economic Forum" Davos, 3D printing, additive manufacturing, AI winter, Airbnb, airline deregulation, airport security, Albert Einstein, algorithmic bias, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, Andy Rubin, AOL-Time Warner, artificial general intelligence, asset light, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, backtesting, barriers to entry, behavioural economics, bitcoin, blockchain, blood diamond, British Empire, business cycle, business process, carbon footprint, Cass Sunstein, centralized clearinghouse, Chris Urmson, cloud computing, cognitive bias, commoditize, complexity theory, computer age, creative destruction, CRISPR, crony capitalism, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, Dean Kamen, deep learning, DeepMind, Demis Hassabis, discovery of DNA, disintermediation, disruptive innovation, distributed ledger, double helix, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ethereum, ethereum blockchain, everywhere but in the productivity statistics, Evgeny Morozov, fake news, family office, fiat currency, financial innovation, general purpose technology, Geoffrey Hinton, George Akerlof, global supply chain, Great Leap Forward, Gregor Mendel, Hernando de Soto, hive mind, independent contractor, information asymmetry, Internet of things, inventory management, iterative process, Jean Tirole, Jeff Bezos, Jim Simons, jimmy wales, John Markoff, joint-stock company, Joseph Schumpeter, Kickstarter, Kiva Systems, law of one price, longitudinal study, low interest rates, Lyft, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, Marc Andreessen, Marc Benioff, Mark Zuckerberg, meta-analysis, Mitch Kapor, moral hazard, multi-sided market, Mustafa Suleyman, Myron Scholes, natural language processing, Network effects, new economy, Norbert Wiener, Oculus Rift, PageRank, pattern recognition, peer-to-peer lending, performance metric, plutocrats, precision agriculture, prediction markets, pre–internet, price stability, principal–agent problem, Project Xanadu, radical decentralization, Ray Kurzweil, Renaissance Technologies, Richard Stallman, ride hailing / ride sharing, risk tolerance, Robert Solow, Ronald Coase, Salesforce, Satoshi Nakamoto, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Skype, slashdot, smart contracts, Snapchat, speech recognition, statistical model, Steve Ballmer, Steve Jobs, Steven Pinker, supply-chain management, synthetic biology, tacit knowledge, TaskRabbit, Ted Nelson, TED Talk, the Cathedral and the Bazaar, The Market for Lemons, The Nature of the Firm, the strength of weak ties, Thomas Davenport, Thomas L Friedman, too big to fail, transaction costs, transportation-network company, traveling salesman, Travis Kalanick, Two Sigma, two-sided market, Tyler Cowen, Uber and Lyft, Uber for X, uber lyft, ubercab, Vitalik Buterin, warehouse robotics, Watson beat the top human players on Jeopardy!, winner-take-all economy, yield management, zero day


pages: 444 words: 130,646

Twitter and Tear Gas: The Power and Fragility of Networked Protest by Zeynep Tufekci

"Hurricane Katrina" Superdome, 4chan, active measures, Affordable Care Act / Obamacare, algorithmic bias, AltaVista, Alvin Toffler, Andy Carvin, anti-communist, Bernie Sanders, Black Lives Matter, bread and circuses, British Empire, citizen journalism, collective bargaining, conceptual framework, context collapse, crowdsourcing, digital divide, disinformation, Donald Trump, Edward Snowden, end-to-end encryption, Evgeny Morozov, fake news, feminist movement, Ferguson, Missouri, Filter Bubble, Future Shock, gentrification, Howard Rheingold, income inequality, index card, interchangeable parts, invention of movable type, invention of writing, John Gilmore, John Perry Barlow, loose coupling, Mahatma Gandhi, Mark Zuckerberg, Menlo Park, Mikhail Gorbachev, moral hazard, moral panic, Naomi Klein, Network effects, new economy, obamacare, Occupy movement, offshore financial centre, pre–internet, race to the bottom, RAND corporation, real-name policy, ride hailing / ride sharing, Rosa Parks, sharing economy, Silicon Valley, Skype, Snapchat, Streisand effect, the strength of weak ties, The Structural Transformation of the Public Sphere, The Theory of the Leisure Class by Thorstein Veblen, Thorstein Veblen, Twitter Arab Spring, We are the 99%, WikiLeaks, Yochai Benkler

It can result in more polarization and at the same time deepen the filter bubble.44 The bias toward “Like” on Facebook promotes the echo-chamber effect, making it more likely that one sees posts one already agrees with. Of course, this builds upon the pre-existing human tendency to gravitate toward topics and positions one already agrees with—confirmation bias—which is well demonstrated in social science research. Facebook’s own studies show that the algorithm contributes to this bias by making the feed somewhat more tilted toward one’s existing views, reinforcing the echo chamber.45 Another type of bias is “comment” bias, which can promote visibility for the occasional quarrels that have garnered many comments. But how widespread are these problems, and what are their effects?
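
As a crude illustration of the mechanism Tufekci describes, here is a hypothetical engagement-weighted ranker: it is not Facebook's actual algorithm, and the posts, weights, and scoring function are invented for the example. It shows how scoring by likes and comments alone surfaces both agreeable content and quarrels.

```python
# Hypothetical engagement-weighted ranking: posts that attract likes and
# comments float to the top, whether the engagement reflects agreement
# ("Like" bias) or a quarrel in the comments ("comment" bias).
posts = [
    {"text": "post your friends already agree with", "likes": 120, "comments": 4},
    {"text": "careful dissenting analysis",           "likes": 8,   "comments": 2},
    {"text": "flame war in the comments",             "likes": 15,  "comments": 90},
]

def engagement_score(post, like_weight=1.0, comment_weight=2.0):
    return like_weight * post["likes"] + comment_weight * post["comments"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):6.1f}  {post['text']}")
```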


pages: 337 words: 103,522

The Creativity Code: How AI Is Learning to Write, Paint and Think by Marcus Du Sautoy

3D printing, Ada Lovelace, Albert Einstein, algorithmic bias, AlphaGo, Alvin Roth, Andrew Wiles, Automated Insights, Benoit Mandelbrot, Bletchley Park, Cambridge Analytica, Charles Babbage, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, data is the new oil, data science, deep learning, DeepMind, Demis Hassabis, Donald Trump, double helix, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, Flash crash, Gödel, Escher, Bach, Henri Poincaré, Jacquard loom, John Conway, Kickstarter, Loebner Prize, machine translation, mandelbrot fractal, Minecraft, move 37, music of the spheres, Mustafa Suleyman, Narrative Science, natural language processing, Netflix Prize, PageRank, pattern recognition, Paul Erdős, Peter Thiel, random walk, Ray Kurzweil, recommendation engine, Rubik’s Cube, Second Machine Age, Silicon Valley, speech recognition, stable marriage problem, Turing test, Watson beat the top human players on Jeopardy!, wikimedia commons

This bias in the data has led to a whole host of algorithms that are making unacceptable decisions: voice recognition software trained on male voices that doesn’t recognise women’s voices; image recognition software that classifies black people as gorillas; passport photo booths that tell Asians their photos are unacceptable because they have their eyes closed. In Silicon Valley, four out of five people hired in the tech industry are white males. This has led Buolamwini to set up the Algorithmic Justice League to fight bias in the data that algorithms are learning on. The legal system is also facing challenges as people are being rejected for mortgages, jobs or state benefits because of an algorithm. These people justifiably want to know why they have been turned down. But given that these algorithms are creating decision trees based on their interaction with data that is hard to unravel, justifying these decisions is not easy.
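
Du Sautoy's point about skewed training data can be reproduced with a tiny synthetic experiment. This is a hypothetical sketch assuming NumPy; the two "groups", their feature distributions, and the single-threshold classifier are invented to show the effect, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_group(n, offset):
    """Synthetic 1-D feature with a group-specific offset; the true label is
    1 exactly when the underlying score (before the offset) is positive."""
    score = rng.normal(0, 1, n)
    return score + offset, (score > 0).astype(int)

# Training data: 95% group A (offset 0.0), only 5% group B (offset 1.5).
xa, ya = make_group(950, 0.0)
xb, yb = make_group(50, 1.5)
x_train = np.concatenate([xa, xb])
y_train = np.concatenate([ya, yb])

# "Learn" the single threshold that minimizes training error.
candidates = np.linspace(-3, 5, 801)
errors = [np.mean((x_train > t).astype(int) != y_train) for t in candidates]
threshold = candidates[int(np.argmin(errors))]

# Evaluate on fresh data from each group.
for name, offset in [("group A", 0.0), ("group B", 1.5)]:
    x, y = make_group(5000, offset)
    acc = np.mean((x > threshold).astype(int) == y)
    print(f"{name}: accuracy {acc:.2f}")

# The threshold is tuned almost entirely to group A, so accuracy on the
# under-represented group B comes out markedly lower.
```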


pages: 523 words: 143,139

Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian, Tom Griffiths

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic bias, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Bayesian statistics, behavioural economics, Berlin Wall, Big Tech, Bill Duvall, bitcoin, Boeing 747, Charles Babbage, cognitive load, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, data science, David Heinemeier Hansson, David Sedaris, delayed gratification, dematerialisation, diversification, Donald Knuth, Donald Shoup, double helix, Dutch auction, Elon Musk, exponential backoff, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, fulfillment center, Garrett Hardin, Geoffrey Hinton, George Akerlof, global supply chain, Google Chrome, heat death of the universe, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, Johannes Kepler, John Nash: game theory, John von Neumann, Kickstarter, knapsack problem, Lao Tzu, Leonard Kleinrock, level 1 cache, linear programming, martingale, multi-armed bandit, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, Pierre-Simon Laplace, power law, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, Sam Altman, scientific management, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Stanford marshmallow experiment, Steve Jobs, stochastic process, Thomas Bayes, Thomas Malthus, Tragedy of the Commons, traveling salesman, Turing machine, urban planning, Vickrey auction, Vilfredo Pareto, Walter Mischel, Y Combinator, zero-sum game

In almost every domain we’ve considered, we have seen how the more real-world factors we include—whether it’s having incomplete information when interviewing job applicants, dealing with a changing world when trying to resolve the explore/exploit dilemma, or having certain tasks depend on others when we’re trying to get things done—the more likely we are to end up in a situation where finding the perfect solution takes unreasonably long. And indeed, people are almost always confronting what computer science regards as the hard cases. Up against such hard cases, effective algorithms make assumptions, show a bias toward simpler solutions, trade off the costs of error against the costs of delay, and take chances. These aren’t the concessions we make when we can’t be rational. They’re what being rational means.
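
The trade-off Christian and Griffiths describe, accepting some error to avoid an unreasonable wait, can be made concrete on the travelling-salesman problem (one of the book's own index tags). The sketch below is a hypothetical comparison of exhaustive search against a greedy nearest-neighbour heuristic; the city count, coordinates, and timing are illustrative assumptions, not the authors' code.

```python
import itertools, math, random, time

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(8)]

def tour_length(order):
    """Total length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Exact answer: try every permutation (factorial time, fine for 8 cities,
# hopeless for 80).
t0 = time.perf_counter()
best = min(itertools.permutations(range(len(cities))), key=tour_length)
exact_time = time.perf_counter() - t0

# Greedy nearest-neighbour heuristic: biased toward the simple, immediate
# choice, with no optimality guarantee, but it answers almost instantly.
t0 = time.perf_counter()
unvisited, tour = set(range(1, len(cities))), [0]
while unvisited:
    nxt = min(unvisited, key=lambda j: math.dist(cities[tour[-1]], cities[j]))
    tour.append(nxt)
    unvisited.remove(nxt)
greedy_time = time.perf_counter() - t0

print(f"exact : length {tour_length(best):.3f} in {exact_time:.3f}s")
print(f"greedy: length {tour_length(tour):.3f} in {greedy_time:.5f}s")
```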


pages: 629 words: 142,393

The Future of the Internet: And How to Stop It by Jonathan Zittrain

A Declaration of the Independence of Cyberspace, algorithmic bias, Amazon Mechanical Turk, Andy Kessler, barriers to entry, behavioural economics, book scanning, Brewster Kahle, Burning Man, c2.com, call centre, Cass Sunstein, citizen journalism, Citizen Lab, Clayton Christensen, clean water, commoditize, commons-based peer production, corporate governance, Daniel Kahneman / Amos Tversky, digital divide, disruptive innovation, distributed generation, en.wikipedia.org, end-to-end encryption, Firefox, folksonomy, Free Software Foundation, game design, Hacker Ethic, Howard Rheingold, Hush-A-Phone, illegal immigration, index card, informal economy, information security, Internet Archive, jimmy wales, John Markoff, John Perry Barlow, license plate recognition, loose coupling, mail merge, Morris worm, national security letter, old-boy network, One Laptop per Child (OLPC), OSI model, packet switching, peer-to-peer, post-materialism, pre–internet, price discrimination, profit maximization, radical decentralization, Ralph Nader, RFC: Request For Comment, RFID, Richard Stallman, Richard Thaler, risk tolerance, Robert Bork, Robert X Cringely, SETI@home, Silicon Valley, Skype, slashdot, software patent, Steve Ballmer, Steve Jobs, Ted Nelson, Telecommunications Act of 1996, the Cathedral and the Bazaar, the long tail, The Nature of the Firm, The Wisdom of Crowds, Tragedy of the Commons, web application, wikimedia commons, Yochai Benkler, zero-sum game


pages: 721 words: 197,134

Data Mining: Concepts, Models, Methods, and Algorithms by Mehmed Kantardzić

Albert Einstein, algorithmic bias, backpropagation, bioinformatics, business cycle, business intelligence, business process, butter production in bangladesh, combinatorial explosion, computer vision, conceptual framework, correlation coefficient, correlation does not imply causation, data acquisition, discrete time, El Camino Real, fault tolerance, finite state, Gini coefficient, information retrieval, Internet Archive, inventory management, iterative process, knowledge worker, linked data, loose coupling, Menlo Park, natural language processing, Netflix Prize, NP-complete, PageRank, pattern recognition, peer-to-peer, phenotype, random walk, RFID, semantic web, speech recognition, statistical model, Telecommunications Act of 1996, telemarketer, text mining, traveling salesman, web application

The method has two variants: random sampling without replacement and random sampling with replacement. Random sampling without replacement is a popular technique in which n distinct samples are selected from the N initial samples in the data set without repetition (a sample may not occur twice). The advantages of the approach are the simplicity of the algorithm and the absence of any bias in the selection. In random sampling with replacement, samples are selected from the data set such that every sample has an equal chance of being selected on each draw, no matter how often it has already been drawn; that is, any sample may be selected more than once. Random sampling is not a one-time activity in a data-mining process.
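
A minimal sketch of the two variants described above, using only Python's standard library rather than any particular data-mining toolkit; the toy population and the sample size of 10 are arbitrary choices for illustration.

```python
import random

data = list(range(1, 101))   # a toy data set of N = 100 samples

# Without replacement: n distinct samples, no repetition possible.
without = random.sample(data, 10)

# With replacement: every draw sees the full data set, so repeats can occur.
with_repl = [random.choice(data) for _ in range(10)]

print("without replacement:", sorted(without))
print("with replacement:   ", sorted(with_repl))
```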


pages: 764 words: 261,694

The Elements of Statistical Learning (Springer Series in Statistics) by Trevor Hastie, Robert Tibshirani, Jerome Friedman

algorithmic bias, backpropagation, Bayesian statistics, bioinformatics, computer age, conceptual framework, correlation coefficient, data science, G4S, Geoffrey Hinton, greed is good, higher-order functions, linear programming, p-value, pattern recognition, random walk, selection bias, sparse data, speech recognition, statistical model, stochastic process, The Wisdom of Crowds

A. 510 Wong, W. 292 Wright, G. 664, 674, 693 Wright, M. 96, 421 Wu, T. 92, 294, 583 Wyner, A. 384, 603 Yang, N. 3, 49 Yang, Y. 686, 693 Yasui, Y. 664 Yeang, C. 654, 658 Yee, T. 300 Yekutieli, Y. 693 Yu, B. 90, 91, 384 Yuan, M. 90 Zhang, H. 90, 304, 428, 455 Zhang, J. 409–412, 605 Zhang, P. 257 Zhang, T. 384 Zhao, P. 90, 91 Zhao, Y. 693 Zhu, J. 89, 98, 174, 348, 349, 385, 426, 428, 434, 610, 611, 615, 657, 661, 664, 666, 693 Zidek, J. 84 Zou, H. 72, 78, 92, 349, 385, 550, 662, 693 This is page 737 Printer: Opaque this Index L1 regularization, see Lasso Activation function, 392–395 AdaBoost, 337–346 Adaptive lasso, 92 Adaptive methods, 429 Adaptive nearest neighbor methods, 475–478 Adaptive wavelet filtering, 181 Additive model, 295–304 Adjusted response, 297 Affine set, 130 Affine-invariant average, 482, 540 AIC, see Akaike information criterion Akaike information criterion (AIC), 230 Analysis of deviance, 124 Applications abstracts, 672 aorta, 204 bone, 152 California housing, 371–372, 591 countries, 517 demographics, 379–380 document, 532 flow cytometry, 637 galaxy, 201 heart attack, 122, 146, 207 lymphoma, 674 marketing, 488 microarray, 5, 505, 532 nested spheres, 590 New Zealand fish, 375–379 nuclear magnetic resonance, 176 ozone, 201 prostate cancer, 3, 49, 61, 608 protein mass spectrometry, 664 satellite image, 470 skin of the orange, 429–432 spam, 2, 300–304, 313, 320, 328, 352, 593 vowel, 440, 464 waveform, 451 ZIP code, 4, 404, 536–539 Archetypal analysis, 554–557 Association rules, 492–495, 499– 501 738 Index Automatic relevance determination, 411 Automatic selection of smoothing parameters , 156 B-Spline, 186 Back-propagation, 392–397, 408– 409 Backfitting, 297, 391 Backward selection, 58 stepwise selection, 59 Backward pass, 396 Bagging, 282–288, 409, 587 Basis expansions and regularization, 139–189 Basis functions, 141, 186, 189, 321, 328 Batch learning, 397 Baum–Welch algorithm, 272 Bayes classifier, 21 factor, 234 methods, 233–235, 267–272 rate, 21 Bayesian, 409 Bayesian information criterion (BIC), 233 Benjamini–Hochberg method, 688 Best-subset selection, 57, 610 Between class covariance matrix, 114 Bias, 16, 24, 37, 160, 219 Bias-variance decomposition, 24, 37, 219 Bias-variance tradeoff, 37, 219 BIC, see Bayesian Information Criterion Boltzmann machines, 638–648 Bonferroni method, 686 Boosting, 337–386, 409 as lasso regression, 607–609 exponential loss and AdaBoost, 343 gradient boosting, 358 implementations, 360 margin maximization, 613 numerical optimization, 358 partial-dependence plots, 369 regularization path, 607 shrinkage, 364 stochastic gradient boosting, 365 tree size, 361 variable importance, 367 Bootstrap, 249, 261–264, 267, 271– 282, 587 relationship to Bayesian method, 271 relationship to maximum likelihood method, 267 Bottom-up clustering, 520–528 Bump hunting, see Patient rule induction method Bumping, 290–292 C5.0, 624 Canonical variates, 441 CART, see Classification and regression trees Categorical predictors, 10, 310 Censored data, 674 Classical multidimensional scaling, 570 Classification, 22, 101–137, 305– 317, 417–429 Classification and regression trees (CART), 305–317 Clique, 628 Clustering, 501–528 k-means, 509–510 agglomerative, 523–528 hierarchical, 520–528 Codebook, 515 Combinatorial algorithms, 507 Combining models, 288–290 Committee, 289, 587, 605 Comparison of learning methods, 350–352 Complete data, 276 Index Complexity parameter, 37 Computational shortcuts quadratic penalty, 659 Condensing procedure, 480 Conditional likelihood, 31 Confusion 
matrix, 301; Conjugate gradients, 396; Consensus, 285–286; Convolutional networks, 407; Coordinate descent, 92, 636, 668; COSSO, 304; Cost complexity pruning, 308; Covariance graph, 631; Cp statistic, 230; Cross-entropy, 308–310; Cross-validation, 241–245; Cubic smoothing spline, 151–153; Cubic spline, 151–153; Curse of dimensionality, 22–26
Dantzig selector, 89; Data augmentation, 276; Daubechies symmlet-8 wavelets, 176; De-correlation, 597; Decision boundary, 13–15, 21; Decision trees, 305–317; Decoder, 515, see encoder; Decomposable models, 641; Degrees of freedom in an additive model, 302 in ridge regression, 68 of a tree, 336 of smoother matrices, 153–154, 158; Delta rule, 397; Demmler-Reinsch basis for splines, 156; Density estimation, 208–215; Deviance, 124, 309; Diagonal linear discriminant analysis, 651–654; Dimension reduction, 658 for nearest neighbors, 479; Discrete variables, 10, 310–311; Discriminant adaptive nearest neighbor classifier, 475–480 analysis, 106–119 coordinates, 108 functions, 109–110; Dissimilarity measure, 503–504; Dummy variables, 10
Early stopping, 398; Effective degrees of freedom, 17, 68, 153–154, 158, 232, 302, 336; Effective number of parameters, 15, 68, 153–154, 158, 232, 302, 336; Eigenvalues of a smoother matrix, 154; Elastic net, 662; EM algorithm, 272–279 as a maximization-maximization procedure, 277 for two component Gaussian mixture, 272; Encoder, 514–515; Ensemble, 616–623; Ensemble learning, 605–624; Entropy, 309; Equivalent kernel, 156; Error rate, 219–230; Error-correcting codes, 606; Estimates of in-sample prediction error, 230; Expectation-maximization algorithm, see EM algorithm; Extra-sample error, 228
False discovery rate, 687–690, 692, 693; Feature, 1 extraction, 150 selection, 409, 658, 681–683; Feed-forward neural networks, 392–408; Fisher's linear discriminant, 106–119, 438; Flexible discriminant analysis, 440–445; Forward selection, 58 stagewise, 86, 608 stagewise additive modeling, 342 stepwise, 73; Forward pass algorithm, 395; Fourier transform, 168; Frequentist methods, 267; Function approximation, 28–36; Fused lasso, 666
Gap statistic, 519; Gating networks, 329; Gauss-Markov theorem, 51–52; Gauss-Newton method, 391; Gaussian (normal) distribution, 16; Gaussian graphical model, 630; Gaussian mixtures, 273, 463, 492, 509; Gaussian radial basis functions, 212; GBM, see Gradient boosting; GBM package, see Gradient boosting; GCV, see Generalized cross-validation; GEM (generalized EM), 277; Generalization error, 220 performance, 220; Generalized additive model, 295–304; Generalized association rules, 497–499; Generalized cross-validation, 244; Generalized linear discriminant analysis, 438; Generalized linear models, 125; Gibbs sampler, 279–280, 641 for mixtures, 280; Gini index, 309; Global Markov property, 628; Gradient Boosting, 359–361; Gradient descent, 358, 395–397; Graph Laplacian, 545; Graphical lasso, 636; Grouped lasso, 90
Haar basis function, 176; Hammersley-Clifford theorem, 629; Hard-thresholding, 653; Hat matrix, 46; Helix, 582; Hessian matrix, 121; Hidden nodes, 641–642; Hidden units, 393–394; Hierarchical clustering, 520–528; Hierarchical mixtures of experts, 329–332; High-dimensional problems, 649; Hints, 96; Hyperplane, see Separating Hyperplane
ICA, see Independent components analysis; Importance sampling, 617; In-sample prediction error, 230; Incomplete data, 332; Independent components analysis, 557–570; Independent variables, 9; Indicator response matrix, 103; Inference, 261–294; Information Fisher, 266 observed, 274; Information theory, 236, 561; Inner product, 53, 668, 670; Inputs, 10; Instability of trees, 312; Intercept, 11; Invariance manifold, 471; Invariant metric, 471; Inverse wavelet transform, 179; IRLS, see Iteratively reweighted least squares; Irreducible error, 224; Ising model, 638; ISOMAP, 572; Isometric feature mapping, 572; Iterative proportional scaling, 585; Iteratively reweighted least squares (IRLS), 121
Jensen's inequality, 293; Join tree, 629; Junction tree, 629
K-means clustering, 460, 509–514; K-medoid clustering, 515–520; K-nearest neighbor classifiers, 463; Karhunen-Loeve transformation (principal components), 66–67, 79, 534–539; Karush-Kuhn-Tucker conditions, 133, 420; Kernel classification, 670 density classification, 210 density estimation, 208–215 function, 209 logistic regression, 654 principal component, 547–550 string, 668–669 trick, 660; Kernel methods, 167–176, 208–215, 423–438, 659; Knot, 141, 322; Kriging, 171; Kruskal-Shephard scaling, 570; Kullback-Leibler distance, 561
Lagrange multipliers, 293; Landmark, 539; Laplacian, 545; Laplacian distribution, 72; LAR, see Least angle regression; Lasso, 68–69, 86–90, 609, 635, 636, 661 fused, 666; Latent factor, 674 variable, 678; Learning, 1; Learning rate, 396; Learning vector quantization, 462; Least angle regression, 73–79, 86, 610; Least squares, 11, 32; Leave-one-out cross-validation, 243; LeNet, 406; Likelihood function, 265, 273; Linear basis expansion, 139–148; Linear combination splits, 312; Linear discriminant function, 106–119; Linear methods for classification, 101–137 for regression, 43–99; Linear models and least squares, 11; Linear regression of an indicator matrix, 103; Linear separability, 129; Linear smoother, 153; Link function, 296; LLE, see Local linear embedding; Local false discovery rate, 693; Local likelihood, 205; Local linear embedding, 572; Local methods in high dimensions, 22–27; Local minima, 400; Local polynomial regression, 197; Local regression, 194, 200; Localization in time/frequency, 175; Loess (local regression), 194, 200; Log-linear model, 639; Log-odds ratio (logit), 119; Logistic (sigmoid) function, 393; Logistic regression, 119–128, 299; Logit (log-odds ratio), 119; Loss function, 18, 21, 219–223, 346; Loss matrix, 310; Lossless compression, 515; Lossy compression, 515; LVQ, see Learning Vector Quantization
Mahalanobis distance, 441; Majority vote, 337; Majorization, 294, 553; Majorize-Minimize algorithm, 294, 584; MAP (maximum aposteriori) estimate, 270; Margin, 134, 418; Market basket analysis, 488, 499; Markov chain Monte Carlo (MCMC) methods, 279; Markov graph, 627; Markov networks, 638–648; MARS, see Multivariate adaptive regression splines; MART, see Multiple additive regression trees; Maximum likelihood estimation, 31, 261, 265; MCMC, see Markov Chain Monte Carlo Methods; MDL, see Minimum description length; Mean field approximation, 641; Mean squared error, 24, 285; Memory-based method, 463; Metropolis-Hastings algorithm, 282; Minimum description length (MDL), 235; Minorization, 294, 553; Minorize-Maximize algorithm, 294, 584; Misclassification error, 17, 309; Missing data, 276, 332–333; Missing predictor values, 332–333; Mixing proportions, 214; Mixture discriminant analysis, 449–455; Mixture modeling, 214–215, 272–275, 449–455, 692; Mixture of experts, 329–332; Mixtures and the EM algorithm, 272–275; MM algorithm, 294, 584; Mode seekers, 507; Model averaging and stacking, 288; Model combination, 289; Model complexity, 221–222; Model selection, 57, 222–223, 230–231; Modified regression, 634; Monte Carlo method, 250, 495; Mother wavelet, 178; Multidimensional scaling, 570–572; Multidimensional splines, 162; Multiedit algorithm, 480; Multilayer perceptron, 400, 401; Multinomial distribution, 120; Multiple additive regression trees (MART), 361; Multiple hypothesis testing, 683–693; Multiple minima, 291, 400; Multiple outcome shrinkage and selection, 84; Multiple outputs, 56, 84, 103–106; Multiple regression from simple univariate regression, 52; Multiresolution analysis, 178; Multivariate adaptive regression splines (MARS), 321–327; Multivariate nonparametric regression, 445
Nadaraya–Watson estimate, 193; Naive Bayes classifier, 108, 210–211, 694; Natural cubic splines, 144–146; Nearest centroids, 670; Nearest neighbor methods, 463–483; Nearest shrunken centroids, 651–654, 694; Network diagram, 392; Neural networks, 389–416; Newton's method (Newton-Raphson procedure), 120–122; Non-negative matrix factorization, 553–554; Nonparametric logistic regression, 299–304; Normal (Gaussian) distribution, 16, 31; Normal equations, 12; Numerical optimization, 395–396
Object dissimilarity, 505–507; Online algorithm, 397; Optimal scoring, 445, 450–451; Optimal separating hyperplane, 132–135; Optimism of the training error rate, 228–230; Ordered categorical (ordinal) predictor, 10, 504; Ordered features, 666; Orthogonal predictors, 53; Overfitting, 220, 228–230, 364
PageRank, 576; Pairwise distance, 668; Pairwise Markov property, 628; Parametric bootstrap, 264; Partial dependence plots, 369–370; Partial least squares, 80–82, 680; Partition function, 638; Parzen window, 208; Pasting, 318; Path algorithm, 73–79, 86–89, 432; Patient rule induction method (PRIM), 317–321, 499–501; Peeling, 318; Penalization, 607, see regularization; Penalized discriminant analysis, 446–449; Penalized polynomial regression, 171; Penalized regression, 34, 61–69, 171; Penalty matrix, 152, 189; Perceptron, 392–416; Piecewise polynomials and splines, 36, 143; Posterior distribution, 268 probability, 233–235, 268; Power method, 577; Pre-conditioning, 681–683; Prediction accuracy, 329; Prediction error, 18; Predictive distribution, 268; PRIM, see Patient rule induction method; Principal components, 66–67, 79–80, 534–539, 547 regression, 79–80 sparse, 550 supervised, 674; Principal curves and surfaces, 541–544; Principal points, 541; Prior distribution, 268–272; Procrustes average, 540 distance, 539; Projection pursuit, 389–392, 565 regression, 389–392; Prototype classifier, 459–463; Prototype methods, 459–463; Proximity matrices, 503; Pruning, 308
QR decomposition, 55; Quadratic approximations and inference, 124; Quadratic discriminant function, 108, 110
Radial basis function (RBF) network, 392; Radial basis functions, 212–214, 275, 393; Radial kernel, 548; Random forest, 409, 587–604 algorithm, 588 bias, 596–601 comparison to boosting, 589 example, 589 out-of-bag (oob), 592 overfit, 596 proximity plot, 595 variable importance, 593 variance, 597–601; Rao score test, 125; Rayleigh quotient, 116; Receiver operating characteristic (ROC) curve, 317; Reduced-rank linear discriminant analysis, 113; Regression, 11–14, 43–99, 200–204; Regression spline, 144; Regularization, 34, 167–176; Regularized discriminant analysis, 112–113, 654; Relevance network, 631; Representer of evaluation, 169; Reproducing kernel Hilbert space, 167–176, 428–429; Reproducing property, 169; Responsibilities, 274–275; Ridge regression, 61–68, 650, 659; Risk factor, 122; Robust fitting, 346–350; Rosenblatt's perceptron learning algorithm, 130; Rug plot, 303; Rulefit, 623
SAM, 690–693, see Significance Analysis of Microarrays; Sammon mapping, 571; SCAD, 92; Scaling of the inputs, 398; Schwarz's criterion, 230–235; Score equations, 120, 265; Self-consistency property, 541–543; Self-organizing map (SOM), 528–534; Sensitivity of a test, 314–317; Separating hyperplane, 132–135; Separating hyperplanes, 136, 417–419; Separator, 628; Shape average, 482, 540; Shrinkage methods, 61–69, 652; Sigmoid, 393; Significance Analysis of Microarrays, 690–693; Similarity measure, see Dissimilarity measure; Single index model, 390; Singular value decomposition, 64, 535–536, 659 singular values, 535 singular vectors, 535; Sliced inverse regression, 480; Smoother, 139–156, 192–199 matrix, 153; Smoothing parameter, 37, 156–161, 198–199; Smoothing spline, 151–156; Soft clustering, 512; Soft-thresholding, 653; Softmax function, 393; SOM, see Self-organizing map; Sparse, 175, 304, 610–613, 636 additive model, 91 graph, 625, 635; Specificity of a test, 314–317; Spectral clustering, 544–547; Spline, 186 additive, 297–299 cubic, 151–153 cubic smoothing, 151–153 interaction, 428 regression, 144 smoothing, 151–156 thin plate, 165; Squared error loss, 18, 24, 37, 219; SRM, see Structural risk minimization; Stacking (stacked generalization), 290; Starting values, 397; Statistical decision theory, 18–22; Statistical model, 28–29; Steepest descent, 358, 395–397; Stepwise selection, 60; Stochastic approximation, 397; Stochastic search (bumping), 290–292; Stress function, 570–572; Structural risk minimization (SRM), 239–241; Subset selection, 57–60; Supervised learning, 2; Supervised principal components, 674–681; Support vector classifier, 417–421, 654 multiclass, 657; Support vector machine, 423–437; SURE shrinkage method, 179; Survival analysis, 674; Survival curve, 674; SVD, see Singular value decomposition; Symmlet basis, 176
Tangent distance, 471–475; Tanh activation function, 424; Target variables, 10; Tensor product basis, 162; Test error, 220–223; Test set, 220; Thin plate spline, 165; Thinning strategy, 189; Trace of a matrix, 153; Training epoch, 397; Training error, 220–223; Training set, 219–223; Tree for regression, 307–308; Tree-based methods, 305–317; Trees for classification, 308–310; Trellis display, 202
Undirected graph, 625–648; Universal approximator, 390; Unsupervised learning, 2, 485–585; Unsupervised learning as supervised learning, 495–497
Validation set, 222; Vapnik-Chervonenkis (VC) dimension, 237–239; Variable importance plot, 594; Variable types and terminology, 9; Variance, 16, 25, 37, 158–161, 219 between, 114 within, 114, 446; Variance reduction, 588; Varying coefficient models, 203–204; VC dimension, see Vapnik–Chervonenkis dimension; Vector quantization, 514–515; Voronoi regions, 510
Wald test, 125; Wavelet basis functions, 176–179 smoothing, 174 transform, 176–179; Weak learner, 383, 605; Weakest link pruning, 308; Webpages, 576; Website for book, 8; Weight decay, 398; Weight elimination, 398; Weights in a neural network, 395; Within class covariance matrix, 114, 446


pages: 918 words: 257,605

The Age of Surveillance Capitalism by Shoshana Zuboff

"World Economic Forum" Davos, algorithmic bias, Amazon Web Services, Andrew Keen, augmented reality, autonomous vehicles, barriers to entry, Bartolomé de las Casas, behavioural economics, Berlin Wall, Big Tech, bitcoin, blockchain, blue-collar work, book scanning, Broken windows theory, California gold rush, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, choice architecture, citizen journalism, Citizen Lab, classic study, cloud computing, collective bargaining, Computer Numeric Control, computer vision, connected car, context collapse, corporate governance, corporate personhood, creative destruction, cryptocurrency, data science, deep learning, digital capitalism, disinformation, dogs of the Dow, don't be evil, Donald Trump, Dr. Strangelove, driverless car, Easter island, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, facts on the ground, fake news, Ford Model T, Ford paid five dollars a day, future of work, game design, gamification, Google Earth, Google Glasses, Google X / Alphabet X, Herman Kahn, hive mind, Ian Bogost, impulse control, income inequality, information security, Internet of things, invention of the printing press, invisible hand, Jean Tirole, job automation, Johann Wolfgang von Goethe, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Kevin Roose, knowledge economy, Lewis Mumford, linked data, longitudinal study, low skilled workers, Mark Zuckerberg, market bubble, means of production, multi-sided market, Naomi Klein, natural language processing, Network effects, new economy, Occupy movement, off grid, off-the-grid, PageRank, Panopticon Jeremy Bentham, pattern recognition, Paul Buchheit, performance metric, Philip Mirowski, precision agriculture, price mechanism, profit maximization, profit motive, public intellectual, recommendation engine, refrigerator car, RFID, Richard Thaler, ride hailing / ride sharing, Robert Bork, Robert Mercer, Salesforce, Second Machine Age, self-driving car, sentiment analysis, shareholder value, Sheryl Sandberg, Shoshana Zuboff, Sidewalk Labs, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, slashdot, smart cities, Snapchat, social contagion, social distancing, social graph, social web, software as a service, speech recognition, statistical model, Steve Bannon, Steve Jobs, Steven Levy, structural adjustment programs, surveillance capitalism, technological determinism, TED Talk, The Future of Employment, The Wealth of Nations by Adam Smith, Tim Cook: Apple, two-sided market, union organizing, vertical integration, Watson beat the top human players on Jeopardy!, winner-take-all economy, Wolfgang Streeck, work culture , Yochai Benkler, you are the product