deep learning



pages: 586 words: 186,548

Architects of Intelligence by Martin Ford

3D printing, agricultural Revolution, AI winter, algorithmic bias, Alignment Problem, AlphaGo, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, Baxter: Rethink Robotics, Bayesian statistics, Big Tech, bitcoin, Boeing 747, Boston Dynamics, business intelligence, business process, call centre, Cambridge Analytica, cloud computing, cognitive bias, Colonization of Mars, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, CRISPR, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, fake news, Fellow of the Royal Society, Flash crash, future of work, general purpose technology, Geoffrey Hinton, gig economy, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, Hans Rosling, hype cycle, ImageNet competition, income inequality, industrial research laboratory, industrial robot, information retrieval, job automation, John von Neumann, Large Hadron Collider, Law of Accelerating Returns, life extension, Loebner Prize, machine translation, Mark Zuckerberg, Mars Rover, means of production, Mitch Kapor, Mustafa Suleyman, natural language processing, new economy, Nick Bostrom, OpenAI, opioid epidemic / opioid crisis, optical character recognition, paperclip maximiser, pattern recognition, phenotype, Productivity paradox, radical life extension, Ray Kurzweil, recommendation engine, Robert Gordon, Rodney Brooks, Sam Altman, self-driving car, seminal paper, sensor fusion, sentiment analysis, Silicon Valley, smart cities, social intelligence, sparse data, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, synthetic biology, systems thinking, Ted Kaczynski, TED Talk, The Rise and Fall of American Growth, theory of mind, Thomas Bayes, Travis Kalanick, Turing test, universal basic income, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working-age population, workplace surveillance , zero-sum game, Zipcar

MARTIN FORD: How does all of that thinking relate to the current overwhelming focus on deep learning? Clearly, deep neural networks have transformed AI, but lately I’ve been hearing more pushback against deep learning hype, and even some suggestions that we could be facing a new AI Winter. Is deep learning really the primary path forward, or is it just one tool in the toolbox?

JOSH TENENBAUM: What most people think of as deep learning is one tool in the toolbox, and a lot of deep learning people realize that too. The term “deep learning” has expanded beyond its original definition.

MARTIN FORD: I would define deep learning broadly as any approach using sophisticated neural networks with lots of layers, rather than using a very technical definition involving specific algorithms like backpropagation or gradient descent.

Going one step further, deep learning is where we have neural networks that have many layers. There is no required minimum for a neural network to be deep, but we would usually say that two or three layers is not a deep learning network, while four or more layers is deep learning. Some deep learning networks get up to one thousand layers or more. By having many layers in deep learning, we can represent a very complex transformation between the input and output, by a composition of much simpler transformations, each represented by one of those layers in the network. The deep learning hypothesis suggests that many layers make it easier for the learning algorithm to find a predictor, to set all the connection strengths in the network so that it does a good job.
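To make the composition-of-layers idea concrete, here is a minimal NumPy sketch (my own illustration, not drawn from the interview): each layer is a simple transformation, a matrix multiply followed by a nonlinearity, and the network's output is just the composition of those transformations. The four-layer depth and the layer sizes are arbitrary choices for the example.

```python
import numpy as np

def relu(x):
    # simple elementwise nonlinearity applied between layers
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)

# four layers of weights: 8 input features -> 16 -> 16 -> 16 -> 1 output
layer_sizes = [8, 16, 16, 16, 1]
weights = [rng.normal(size=(m, n)) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    # the whole network is a composition of simple per-layer transformations
    for w in weights[:-1]:
        x = relu(x @ w)
    return x @ weights[-1]          # final layer left linear

x = rng.normal(size=(1, 8))         # one example with 8 input features
print(forward(x, weights))          # untrained output; learning would adjust the weights
```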

MARTIN FORD: Do you think there will be a backlash against all the hype surrounding deep learning when its limitations are more widely recognized?

BARBARA GROSZ: I have survived numerous AI Winters in the past and I’ve come away from them feeling both fearful and hopeful. I’m fearful that people, once they see the limitations of deep learning, will say, “Oh, it doesn’t really work.” But I’m hopeful that, because deep learning is so powerful for so many things, and in so many areas, there won’t be an AI Winter around deep learning. I do think, however, that to avoid an AI Winter for deep learning, people in the field need to put deep learning in its correct place, and be clear about its limitations.


pages: 414 words: 109,622

Genius Makers: The Mavericks Who Brought A. I. To Google, Facebook, and the World by Cade Metz

AI winter, air gap, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, Amazon Robotics, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Big Tech, British Empire, Cambridge Analytica, carbon-based life, cloud computing, company town, computer age, computer vision, deep learning, deepfake, DeepMind, Demis Hassabis, digital map, Donald Trump, driverless car, drone strike, Elon Musk, fake news, Fellow of the Royal Society, Frank Gehry, game design, Geoffrey Hinton, Google Earth, Google X / Alphabet X, Googley, Internet Archive, Isaac Newton, Jeff Hawkins, Jeffrey Epstein, job automation, John Markoff, life extension, machine translation, Mark Zuckerberg, means of production, Menlo Park, move 37, move fast and break things, Mustafa Suleyman, new economy, Nick Bostrom, nuclear winter, OpenAI, PageRank, PalmPilot, pattern recognition, Paul Graham, paypal mafia, Peter Thiel, profit motive, Richard Feynman, ride hailing / ride sharing, Ronald Reagan, Rubik’s Cube, Sam Altman, Sand Hill Road, self-driving car, side project, Silicon Valley, Silicon Valley billionaire, Silicon Valley startup, Skype, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Ballmer, Steven Levy, Steven Pinker, tech worker, telemarketer, The Future of Employment, Turing test, warehouse automation, warehouse robotics, Y Combinator

he published what he called a trilogy of papers critiquing: Gary Marcus, “Deep Learning: A Critical Appraisal,” 2018, https://arxiv.org/abs/1801.00631; Gary Marcus, “In Defense of Skepticism About Deep Learning,” 2018, https://medium.com/@GaryMarcus/in-defense-of-skepticism-about-deep-learning-6e8bfd5ae0f1; Gary Marcus, “Innateness, AlphaZero, and Artificial Intelligence,” 2018, https://arxiv.org/abs/1801.05667.

would eventually lead to a book: Gary Marcus and Ernest Davis, Rebooting AI: Building Artificial Intelligence We Can Trust (New York: Pantheon, 2019).

he agreed that deep learning alone could not achieve true intelligence: “Artificial Intelligence Debate—Yann LeCun vs.

* * * — IN the spring of 2012, Geoff Hinton phoned Jitendra Malik, the University of California–Berkeley professor who had publicly attacked Andrew Ng over his claims that deep learning was the future of computer vision. Despite the success of deep learning with speech recognition, Malik and his colleagues questioned whether the technology would ever master the art of identifying images. And because he was someone who generally assumed that incoming calls were arriving from telemarketers trying to sell him something, it was surprising that he even picked up the phone. When he did, Hinton said: “I hear you don’t like deep learning.” Malik said this was true, and when Hinton asked why, Malik said there was no scientific evidence to back any claim that deep learning could outperform any other technology on computer vision.

Hinton pointed to recent papers showing that deep learning worked well when identifying objects on multiple benchmark tests. Malik said these datasets were too old. No one cared about them. “This is not going to convince anyone who doesn’t share your ideological predilections,” he said. So Hinton asked what would convince him. At first, Malik said deep learning would have to master a European dataset called PASCAL. “PASCAL is too small,” Hinton told him. “To make this work, we need a lot of training data.


pages: 424 words: 114,905

Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again by Eric Topol

"World Economic Forum" Davos, 23andMe, Affordable Care Act / Obamacare, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Apollo 11, artificial general intelligence, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, Big Tech, bioinformatics, blockchain, Cambridge Analytica, cloud computing, cognitive bias, Colonization of Mars, computer age, computer vision, Computing Machinery and Intelligence, conceptual framework, creative destruction, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, dark matter, data science, David Brooks, deep learning, DeepMind, Demis Hassabis, digital twin, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, fake news, fault tolerance, gamification, general purpose technology, Geoffrey Hinton, George Santayana, Google Glasses, ImageNet competition, Jeff Bezos, job automation, job satisfaction, Joi Ito, machine translation, Mark Zuckerberg, medical residency, meta-analysis, microbiome, move 37, natural language processing, new economy, Nicholas Carr, Nick Bostrom, nudge unit, OpenAI, opioid epidemic / opioid crisis, pattern recognition, performance metric, personalized medicine, phenotype, placebo effect, post-truth, randomized controlled trial, recommendation engine, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, Skinner box, speech recognition, Stephen Hawking, techlash, TED Talk, text mining, the scientific method, Tim Cook: Apple, traumatic brain injury, trolley problem, War on Poverty, Watson beat the top human players on Jeopardy!, working-age population

Bar, “Artificial Intelligence: Driving the Next Technology Cycle,” in Next Generation (Zurich: Julius Baer Group, 2017); Chollet, F., Deep Learning with Python (Shelter Island, New York: Manning, 2017); T. L. Fonseca, “What’s Happening Inside the Convolutional Neural Network? The Answer Is Convolution,” buZZrobot (2017); A. Geitgey, “Machine Learning Is Fun! Part 3: Deep Learning and Convolutional Neural Networks,” Medium (2016); Y. LeCun, Y. Bengio, and G. Hinton, “Deep Learning,” Nature (2015): 521(7553), 436–444; R. Raicea, “Want to Know How Deep Learning Works? Here’s a Quick Guide for Everyone,” Medium (2017); P. Voosen, “The AI Detectives,” Science (2017): 357(6346), 22–27.

Neural networks aren’t actually very neural. François Chollet, a deep learning expert at Google, points out in Deep Learning with Python, “There is no evidence that the brain implements anything like the learning mechanisms in use in modern deep learning models.”2 Of course, there’s no reason why machines should mimic the brain; that’s simplistic reverse-anthropomorphic thinking. And, when we see machines showing some semblance of smartness, we anthropomorphize and think that our brains are just some kind of CPU equivalent, cognitive processing units. Deep learning AI is remarkably different from and complementary to human learning.

., “Accurate and Reproducible Invasive Breast Cancer Detection in Whole-Slide Images: A Deep Learning Approach for Quantifying Tumor Extent.” Sci Rep, 2017. 7: p. 46450.
51. Ehteshami Bejnordi, B., et al., “Diagnostic Assessment of Deep Learning Algorithms for Detection of Lymph Node Metastases in Women with Breast Cancer.” JAMA, 2017. 318(22): pp. 2199–2210.
52. Golden, J. A., “Deep Learning Algorithms for Detection of Lymph Node Metastases from Breast Cancer: Helping Artificial Intelligence Be Seen.” JAMA, 2017. 318(22): pp. 2184–2186.
53. Yang, S. J., et al., “Assessing Microscope Image Focus Quality with Deep Learning.” BMC Bioinformatics, 2018. 19(1): p. 77.
54.


The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic bias, algorithmic trading, AlphaGo, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, backpropagation, Baxter: Rethink Robotics, behavioural economics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, data science, deep learning, DeepMind, delayed gratification, Demis Hassabis, Dennis Ritchie, discovery of DNA, Donald Trump, Douglas Engelbart, driverless car, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, Jim Simons, John Conway, John Markoff, John von Neumann, language acquisition, Large Hadron Collider, machine readable, Mark Zuckerberg, Minecraft, natural language processing, Neil Armstrong, Netflix Prize, Norbert Wiener, OpenAI, orbital mechanics / astrodynamics, PageRank, pattern recognition, pneumatic tube, prediction markets, randomized controlled trial, Recombinant DNA, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, world market for maybe five computers, X Prize, Yogi Berra

When you get Google to translate for you, it now uses deep learning designed by Dean’s Google Brain team. When you google a search term, deep learning helps to rank the results. When you talk to the Google assistant, it uses deep learning to recognize the words you are saying, and as it gets better at holding a conversation with you, it will be using deep learning to serve you better.

[Figure 12.9: NASCAR jacket Terry Sejnowski wore to open the 2015 NIPS Conference in Montreal. Sponsors ranged from top-tier Internet companies to financial and media companies. They all have a stake in deep learning. Courtesy of the NIPS Foundation.]

Preface

If you use voice recognition on an Android phone or Google Translate on the Internet, you have communicated with neural networks1 trained by deep learning. In the last few years, deep learning has generated enough profit for Google to cover the costs of all its futuristic projects at Google X, including self-driving cars, Google Glass, and Google Brain.2 Google was one of the first Internet companies to embrace deep learning; in 2013, it hired Geoffrey Hinton, the father of deep learning, and other companies are racing to catch up. The recent progress in artificial intelligence (AI) was made by reverse engineering brains.

All doctors will become better at diagnosing rare skin diseases with the help of deep learning.16

Deep Cancer

The detection of metastatic breast cancer in images of lymph node biopsies on slides is done by experts who make mistakes, mistakes that have deadly consequences. This is a pattern recognition problem for which deep learning should excel. And indeed, a deep learning network trained on a large dataset of slides for which ground truth was known reached an accuracy of 0.925, good but not as good as experts who achieved 0.966 on the same test set.17 However, when the predictions of deep learning were combined

[Figure 1.5: Artist’s impression of a deep learning network diagnosing a skin lesion with high accuracy, cover of February 2, 2017, issue of Nature.]


pages: 288 words: 86,995

Rule of the Robots: How Artificial Intelligence Will Transform Everything by Martin Ford

AI winter, Airbnb, algorithmic bias, algorithmic trading, Alignment Problem, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, artificial general intelligence, Automated Insights, autonomous vehicles, backpropagation, basic income, Big Tech, big-box store, call centre, carbon footprint, Chris Urmson, Claude Shannon: information theory, clean water, cloud computing, commoditize, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, data is the new oil, data science, deep learning, deepfake, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Elon Musk, factory automation, fake news, fulfillment center, full employment, future of work, general purpose technology, Geoffrey Hinton, George Floyd, gig economy, Gini coefficient, global pandemic, Googley, GPT-3, high-speed rail, hype cycle, ImageNet competition, income inequality, independent contractor, industrial robot, informal economy, information retrieval, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jeff Bezos, job automation, John Markoff, Kiva Systems, knowledge worker, labor-force participation, Law of Accelerating Returns, license plate recognition, low interest rates, low-wage service sector, Lyft, machine readable, machine translation, Mark Zuckerberg, Mitch Kapor, natural language processing, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, Ocado, OpenAI, opioid epidemic / opioid crisis, passive income, pattern recognition, Peter Thiel, Phillips curve, post scarcity, public intellectual, Ray Kurzweil, recommendation engine, remote working, RFID, ride hailing / ride sharing, Robert Gordon, Rodney Brooks, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, Silicon Valley startup, social distancing, SoftBank, South of Market, San Francisco, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, superintelligent machines, TED Talk, The Future of Employment, The Rise and Fall of American Growth, the scientific method, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Uber and Lyft, uber lyft, universal basic income, very high income, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y Combinator

John Markoff, “Scientists see promise in deep-learning programs,” New York Times, November 23, 2012, www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html.
10. Dario Amodei and Danny Hernandez, “AI and Compute,” OpenAI Blog, May 16, 2018, openai.com/blog/ai-and-compute/.
11. Will Knight, “Facebook’s head of AI says the field will soon ‘hit the wall,’” Wired, December 4, 2019, www.wired.com/story/facebooks-ai-says-field-hit-wall/.
12. Kim Martineau, “Shrinking deep learning’s carbon footprint,” MIT News, August 7, 2020, news.mit.edu/2020/shrinking-deep-learning-carbon-footprint-0807.
13.

While the central processing chip that powers your laptop computer might have two, or perhaps four, computational “cores,” a contemporary high-end GPU would likely have thousands of specialized cores, all of which can crunch numbers at high speed simultaneously. Once researchers discovered that the calculations required by deep learning applications were broadly similar to those needed to render graphics, they began to turn en masse to GPUs, which rapidly evolved into the primary hardware platform for artificial intelligence. Indeed, this transition was a key enabler of the deep learning revolution that took hold beginning in 2012. In September of that year, a team of AI researchers from the University of Toronto put deep learning on the technology industry’s radar by prevailing at the ImageNet Large Scale Visual Recognition Challenge, an important annual event focused on machine vision.
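A hedged sketch of why GPUs matter here (my own PyTorch example, not from the book): the same large matrix multiplication, the core arithmetic shared by graphics rendering and deep learning, runs across the GPU's thousands of cores simply by placing the tensors on the CUDA device when one is available.

```python
import torch

# fall back to the CPU when no CUDA-capable GPU is present
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# a large matrix multiplication, the kind of arithmetic both graphics
# rendering and deep learning rely on
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b                      # executes in parallel across the GPU's cores when available

print(device, c.shape)
```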

We’ll delve further into the history of deep learning in Chapter 4. The University of Toronto’s team used GPUs manufactured by NVIDIA, a company founded in 1993 whose business focused exclusively on designing and manufacturing state-of-the-art graphics chips. In the wake of the 2012 ImageNet competition and the ensuing widespread recognition of the powerful synergy between deep learning and GPUs, the company’s trajectory shifted dramatically, transforming it into one of the most prominent technology companies associated with the rise of artificial intelligence. Evidence of the deep learning revolution manifested directly in the company’s market value: between January 2012 and January 2020 NVIDIA’s shares soared by more than 1,500 percent.


Driverless: Intelligent Cars and the Road Ahead by Hod Lipson, Melba Kurman

AI winter, Air France Flight 447, AlphaGo, Amazon Mechanical Turk, autonomous vehicles, backpropagation, barriers to entry, butterfly effect, carbon footprint, Chris Urmson, cloud computing, computer vision, connected car, creative destruction, crowdsourcing, DARPA: Urban Challenge, deep learning, digital map, Donald Shoup, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, General Motors Futurama, Geoffrey Hinton, Google Earth, Google X / Alphabet X, Hans Moravec, high net worth, hive mind, ImageNet competition, income inequality, industrial robot, intermodal, Internet of things, Jeff Hawkins, job automation, Joseph Schumpeter, lone genius, Lyft, megacity, Network effects, New Urbanism, Oculus Rift, pattern recognition, performance metric, Philippa Foot, precision agriculture, RFID, ride hailing / ride sharing, Second Machine Age, self-driving car, Silicon Valley, smart cities, speech recognition, statistical model, Steve Jobs, technoutopianism, TED Talk, Tesla Model S, Travis Kalanick, trolley problem, Uber and Lyft, uber lyft, Unsafe at Any Speed, warehouse robotics

In the past few years, in search of deep-learning expertise, entire divisions of automotive companies have migrated to Silicon Valley. Deep learning is why software giants like Google and Baidu, already armed with expertise in managing huge banks of data and building intelligent software, are giving the once-invincible automotive giants a run for their money. Deep learning has been so revolutionary to the AI community that its reverberations are still unfolding as we write this book and will likely continue to unfold in the years ahead. Cars won’t be the only technology that’s transformed by deep learning. We predict that deep learning will change the developmental trajectory of mobile robotics in general.

To encourage third-party developers to build intelligent applications using their software tools, Google, Microsoft, and Facebook have each launched their own version of an open source deep-learning development platform. Despite its usefulness, it has taken decades for deep-learning software to develop. Like other types of machine-learning software, a deep-learning network is data driven, consuming huge amounts of visual data in the form of digital images or videos. Improved performance, enabled by the recent maturation of enabling technologies such as high-speed computers and digital cameras, isn’t the only reason deep-learning software has recently gained acceptance, however. The other reason is political.

Inside the network There are several different types of deep-learning networks for image recognition, each with its own twist on the basic architecture and unique refinements in the way the training algorithm is applied. Deep learning is a fast-growing field and new architectures and algorithms appear every few weeks. One common characteristic, however, is that deep-learning networks use a cascade of many layers of artificial neurons to “extract” features from digital images that the software then identifies and labels. Cutting-edge deep-learning networks often have more than 100 layers of artificial neurons (contrast this to Rosenblatt’s Perceptron, which had only a single layer of eight neurons). Some people believe that deep-learning networks recognize objects the way humans do, by first recognizing a particular tiny feature, then abstracting that to a broader, more abstract concept.
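As an illustration of that cascade of feature-extracting layers, here is a minimal convolutional network sketched in TensorFlow/Keras (my own example, not the authors' code, with arbitrary layer counts and sizes); production image-recognition networks stack many more layers, but the pattern of convolutions followed by a classification layer is the same.

```python
import tensorflow as tf

# a small cascade of layers: each convolution extracts progressively
# more abstract features from the raw pixels
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),          # RGB input image
    tf.keras.layers.Conv2D(32, 3, activation="relu"),    # low-level features (edges, textures)
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),    # mid-level features (parts, motifs)
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(128, 3, activation="relu"),   # higher-level features
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),     # 10 object labels (illustrative)
])
model.summary()
```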


pages: 350 words: 98,077

Artificial Intelligence: A Guide for Thinking Humans by Melanie Mitchell

Ada Lovelace, AI winter, Alignment Problem, AlphaGo, Amazon Mechanical Turk, Apple's 1984 Super Bowl advert, artificial general intelligence, autonomous vehicles, backpropagation, Bernie Sanders, Big Tech, Boston Dynamics, Cambridge Analytica, Charles Babbage, Claude Shannon: information theory, cognitive dissonance, computer age, computer vision, Computing Machinery and Intelligence, dark matter, deep learning, DeepMind, Demis Hassabis, Douglas Hofstadter, driverless car, Elon Musk, en.wikipedia.org, folksonomy, Geoffrey Hinton, Gödel, Escher, Bach, I think there is a world market for maybe five computers, ImageNet competition, Jaron Lanier, job automation, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, license plate recognition, machine translation, Mark Zuckerberg, natural language processing, Nick Bostrom, Norbert Wiener, ought to be enough for anybody, paperclip maximiser, pattern recognition, performance metric, RAND corporation, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Rodney Brooks, self-driving car, sentiment analysis, Silicon Valley, Singularitarianism, Skype, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, tacit knowledge, tail risk, TED Talk, the long tail, theory of mind, There's no reason for any individual to have a computer in his home - Ken Olsen, trolley problem, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, world market for maybe five computers

This acqui-hire instantly put Google at the forefront of deep learning. Soon after, Yann LeCun was lured away from his full-time New York University professorship by Facebook to head up its newly formed AI lab. It didn’t take long before all the big tech companies (as well as many smaller ones) were snapping up deep-learning experts and their graduate students as fast as possible. Seemingly overnight, deep learning became the hottest part of AI, and expertise in deep learning guaranteed computer scientists a large salary in Silicon Valley or, better yet, venture capital funding for their proliferating deep-learning start-up companies.

A recent AI survey paper summed it up: “Because we don’t deeply understand intelligence or know how to produce general AI, rather than cutting off any avenues of exploration, to truly make progress we should embrace AI’s ‘anarchy of methods.’”14 But since the 2010s, one family of AI methods—collectively called deep learning (or deep neural networks)—has risen above the anarchy to become the dominant AI paradigm. In fact, in much of the popular media, the term artificial intelligence itself has come to mean “deep learning.” This is an unfortunate inaccuracy, and I need to clarify the distinction. AI is a field that includes a broad set of approaches, with the goal of creating machines with intelligence. Deep learning is only one such approach. Deep learning is itself one method among many in the field of machine learning, a subfield of AI in which machines “learn” from data or from their own “experiences.”

.… There’s only a few hundred people in the world that can do that really well.”6 Actually, the number of deep-learning experts is growing quickly; many universities now offer courses in the subject, and a growing list of companies have started their own deep-learning training programs for employees. Membership in the deep-learning club can be quite lucrative. At a recent conference I attended, a leader of Microsoft’s AI product group spoke to the audience about the company’s efforts to hire young deep-learning engineers: “If a kid knows how to train five layers of neural networks, the kid can demand five figures.


pages: 307 words: 88,180

AI Superpowers: China, Silicon Valley, and the New World Order by Kai-Fu Lee

"World Economic Forum" Davos, AI winter, Airbnb, Albert Einstein, algorithmic bias, algorithmic trading, Alignment Problem, AlphaGo, artificial general intelligence, autonomous vehicles, barriers to entry, basic income, bike sharing, business cycle, Cambridge Analytica, cloud computing, commoditize, computer vision, corporate social responsibility, cotton gin, creative destruction, crony capitalism, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, deskilling, Didi Chuxing, Donald Trump, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, fake news, full employment, future of work, general purpose technology, Geoffrey Hinton, gig economy, Google Chrome, Hans Moravec, happiness index / gross national happiness, high-speed rail, if you build it, they will come, ImageNet competition, impact investing, income inequality, informal economy, Internet of things, invention of the telegraph, Jeff Bezos, job automation, John Markoff, Kickstarter, knowledge worker, Lean Startup, low skilled workers, Lyft, machine translation, mandatory minimum, Mark Zuckerberg, Menlo Park, minimum viable product, natural language processing, Neil Armstrong, new economy, Nick Bostrom, OpenAI, pattern recognition, pirate software, profit maximization, QR code, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, risk tolerance, Robert Mercer, Rodney Brooks, Rubik’s Cube, Sam Altman, Second Machine Age, self-driving car, sentiment analysis, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, SoftBank, Solyndra, special economic zone, speech recognition, Stephen Hawking, Steve Jobs, strong AI, TED Talk, The Future of Employment, Travis Kalanick, Uber and Lyft, uber lyft, universal basic income, urban planning, vertical integration, Vision Fund, warehouse robotics, Y Combinator

Those changes include the Chinese AI frenzy that AlphaGo’s matches sparked and the underlying technology that powered it to victory. AlphaGo runs on deep learning, a groundbreaking approach to artificial intelligence that has turbocharged the cognitive capabilities of machines. Deep-learning-based programs can now do a better job than humans at identifying faces, recognizing speech, and issuing loans. For decades, the artificial intelligence revolution always looked to be five years away. But with the development of deep learning over the past few years, that revolution has finally arrived. It will usher in an era of massive productivity increases but also widespread disruptions in labor markets—and profound sociopsychological effects on people—as artificial intelligence takes over human jobs across all sorts of industries.

To understand why, we must first grasp the basics of the technology and how it is set to transform our world.

A BRIEF HISTORY OF DEEP LEARNING

Machine learning—the umbrella term for the field that includes deep learning—is a history-altering technology but one that is lucky to have survived a tumultuous half-century of research. Ever since its inception, artificial intelligence has undergone a number of boom-and-bust cycles. Periods of great promise have been followed by “AI winters,” when a disappointing lack of practical results led to major cuts in funding. Understanding what makes the arrival of deep learning different requires a quick recap of how we got here. Back in the mid-1950s, the pioneers of artificial intelligence set themselves an impossibly lofty but well-defined mission: to recreate human intelligence in a machine.

After decades spent on the margins of AI research, neural networks hit the mainstream overnight, this time in the form of deep learning. That breakthrough promised to thaw the ice from the latest AI winter, and for the first time truly bring AI’s power to bear on a range of real-world problems. Researchers, futurists, and tech CEOs all began buzzing about the massive potential of the field to decipher human speech, translate documents, recognize images, predict consumer behavior, identify fraud, make lending decisions, help robots “see,” and even drive a car.

PULLING BACK THE CURTAIN ON DEEP LEARNING

So how does deep learning do this? Fundamentally, these algorithms use massive amounts of data from a specific domain to make a decision that optimizes for a desired outcome.


pages: 174 words: 56,405

Machine Translation by Thierry Poibeau

Alignment Problem, AlphaGo, AltaVista, augmented reality, call centre, Claude Shannon: information theory, cloud computing, combinatorial explosion, crowdsourcing, deep learning, DeepMind, easy for humans, difficult for computers, en.wikipedia.org, geopolitical risk, Google Glasses, information retrieval, Internet of things, language acquisition, machine readable, machine translation, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, natural language processing, Necker cube, Norbert Wiener, RAND corporation, Robert Mercer, seminal paper, Skype, speech recognition, statistical model, technological singularity, Turing test, wikimedia commons

This explains why it is impossible to manually specify all the information that would be necessary for an automatic machine translation system, but also why the translation task has remained highly challenging and computationally expensive up to the present time. In this context, deep learning provides an interesting approach that seems especially fitted for the challenges involved in improving human language processing.

An Overview of Deep Learning for Machine Translation

Deep learning achieved its first success in image recognition. Rather than using a group of predefined characteristics, deep learning generally operates from a very large set of examples (hundreds of thousands of images of faces, for example) to automatically extract the most relevant characteristics (called features in machine learning).

A very promising trend of research is however to try to get meaningful representations of the internal model calculated by the neural network, so as to better understand how the whole approach works. The deep learning approach to machine translation (or neural machine translation) has proven efficient, first, on short sentences in closely related languages, and more recently on long sentences as well as more diverse languages. Progress is very quick, and the deep learning approach can be considered a revolution for the domain, as was the statistical approach at the beginning of the 1990s. It is interesting to note that deep learning approaches spread very quickly. All the major players in the domain (Google, Bing, Facebook, Systran, etc.) are moving forward to deep learning, and 2016 saw the deployment of the first online systems based on this approach.

Deep learning, on the contrary, makes it possible, at least in theory, to learn complex characteristics fully autonomously and gradually from the data, without any prior human effort. In the case of machine translation, deep learning makes it possible to envision systems where very few elements are specified manually, the idea being to let the system infer by itself the best representation from the data. A translation system based solely on deep learning (aka “deep learning machine translation” or “neural machine translation”) thus simply consists of an “encoder” (the part of the system that analyzes the training data) and a “decoder” (the part of the system that automatically produces a translation from a given sentence, based on the data analyzed by the encoder).
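A minimal TensorFlow/Keras sketch of that encoder/decoder structure (my own illustration under assumed vocabulary and dimension sizes, not one of the deployed systems mentioned above; modern systems also add attention mechanisms, omitted here): the encoder reads the source sentence and summarizes it in its internal states, and the decoder produces the translation conditioned on that summary.

```python
import tensorflow as tf

src_vocab, tgt_vocab, dim = 8000, 8000, 256   # assumed, illustrative sizes

# encoder: reads the source sentence and summarizes it in its final states
enc_in = tf.keras.layers.Input(shape=(None,))
enc_emb = tf.keras.layers.Embedding(src_vocab, dim)(enc_in)
_, state_h, state_c = tf.keras.layers.LSTM(dim, return_state=True)(enc_emb)

# decoder: produces the target sentence, conditioned on the encoder's states
dec_in = tf.keras.layers.Input(shape=(None,))
dec_emb = tf.keras.layers.Embedding(tgt_vocab, dim)(dec_in)
dec_out, _, _ = tf.keras.layers.LSTM(
    dim, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
probs = tf.keras.layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = tf.keras.Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit([src_ids, tgt_ids_in], tgt_ids_out, ...) would train on parallel text
```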


pages: 688 words: 107,867

Python Data Analytics: With Pandas, NumPy, and Matplotlib by Fabio Nelli

Amazon Web Services, backpropagation, centre right, computer vision, data science, Debian, deep learning, DevOps, functional programming, Google Earth, Guido van Rossum, Internet of things, optical character recognition, pattern recognition, sentiment analysis, speech recognition, statistical model, web application

“Science leads us forward in knowledge, but only analysis makes us more aware”

This book is dedicated to all those who are constantly looking for awareness

Table of Contents

Chapter 1: An Introduction to Data Analysis — Data Analysis Knowledge Domains of the Data Analyst Computer Science Mathematics and Statistics Machine Learning and Artificial Intelligence Professional Fields of Application Understanding the Nature of the Data When the Data Become Information When the Information Becomes Knowledge Types of Data The Data Analysis Process Problem Definition Data Extraction Data Preparation Data Exploration/Visualization Predictive Modeling Model Validation Deployment Quantitative and Qualitative Data Analysis Open Data Python and Data Analysis Conclusions
Chapter 2: Introduction to the Python World — Python—The Programming Language Python—The Interpreter Python 2 and Python 3 Installing Python Python Distributions Using Python Writing Python Code IPython PyPI—The Python Package Index The IDEs for Python SciPy NumPy Pandas matplotlib Conclusions
Chapter 3: The NumPy Library — NumPy: A Little History The NumPy Installation Ndarray: The Heart of the Library Create an Array Types of Data The dtype Option Intrinsic Creation of an Array Basic Operations Arithmetic Operators The Matrix Product Increment and Decrement Operators Universal Functions (ufunc) Aggregate Functions Indexing, Slicing, and Iterating Indexing Slicing Iterating an Array Conditions and Boolean Arrays Shape Manipulation Array Manipulation Joining Arrays Splitting Arrays General Concepts Copies or Views of Objects Vectorization Broadcasting Structured Arrays Reading and Writing Array Data on Files Loading and Saving Data in Binary Files Reading Files with Tabular Data Conclusions
Chapter 4: The pandas Library—An Introduction — pandas: The Python Data Analysis Library Installation of pandas Installation from Anaconda Installation from PyPI Installation on Linux Installation from Source A Module Repository for Windows Testing Your pandas Installation Getting Started with pandas Introduction to pandas Data Structures The Series The DataFrame The Index Objects Other Functionalities on Indexes Reindexing Dropping Arithmetic and Data Alignment Operations Between Data Structures Flexible Arithmetic Methods Operations Between DataFrame and Series Function Application and Mapping Functions by Element Functions by Row or Column Statistics Functions Sorting and Ranking Correlation and Covariance “Not a Number” Data Assigning a NaN Value Filtering Out NaN Values Filling in NaN Occurrences Hierarchical Indexing and Leveling Reordering and Sorting Levels Summary Statistic by Level Conclusions
Chapter 5: pandas: Reading and Writing Data — I/O API Tools CSV and Textual Files Reading Data in CSV or Text Files Using RegExp to Parse TXT Files Reading TXT Files Into Parts Writing Data in CSV Reading and Writing HTML Files Writing Data in HTML Reading Data from an HTML File Reading Data from XML Reading and Writing Data on Microsoft Excel Files JSON Data The Format HDF5 Pickle—Python Object Serialization Serialize a Python Object with cPickle Pickling with pandas Interacting with Databases Loading and Writing Data with SQLite3 Loading and Writing Data with PostgreSQL Reading and Writing Data with a NoSQL Database: MongoDB Conclusions
Chapter 6: pandas in Depth: Data Manipulation — Data Preparation Merging Concatenating Combining Pivoting Removing Data Transformation Removing Duplicates Mapping Discretization and Binning Detecting and Filtering Outliers Permutation Random Sampling String Manipulation Built-in Methods for String Manipulation Regular Expressions Data Aggregation GroupBy A Practical Example Hierarchical Grouping Group Iteration Chain of Transformations Functions on Groups Advanced Data Aggregation Conclusions
Chapter 7: Data Visualization with matplotlib — The matplotlib Library Installation The IPython and IPython QtConsole The matplotlib Architecture Backend Layer Artist Layer Scripting Layer (pyplot) pylab and pyplot pyplot A Simple Interactive Chart The Plotting Window Set the Properties of the Plot matplotlib and NumPy Using the kwargs Working with Multiple Figures and Axes Adding Elements to the Chart Adding Text Adding a Grid Adding a Legend Saving Your Charts Saving the Code Converting Your Session to an HTML File Saving Your Chart Directly as an Image Handling Date Values Chart Typology Line Charts Line Charts with pandas Histograms Bar Charts Horizontal Bar Charts Multiserial Bar Charts Multiseries Bar Charts with pandas Dataframe Multiseries Stacked Bar Charts Stacked Bar Charts with a pandas Dataframe Other Bar Chart Representations Pie Charts Pie Charts with a pandas Dataframe Advanced Charts Contour Plots Polar Charts The mplot3d Toolkit 3D Surfaces Scatter Plots in 3D Bar Charts in 3D Multi-Panel Plots Display Subplots Within Other Subplots Grids of Subplots Conclusions
Chapter 8: Machine Learning with scikit-learn — The scikit-learn Library Machine Learning Supervised and Unsupervised Learning Training Set and Testing Set Supervised Learning with scikit-learn The Iris Flower Dataset The PCA Decomposition K-Nearest Neighbors Classifier Diabetes Dataset Linear Regression: The Least Square Regression Support Vector Machines (SVMs) Support Vector Classification (SVC) Nonlinear SVC Plotting Different SVM Classifiers Using the Iris Dataset Support Vector Regression (SVR) Conclusions
Chapter 9: Deep Learning with TensorFlow — Artificial Intelligence, Machine Learning, and Deep Learning Artificial Intelligence Machine Learning Is a Branch of Artificial Intelligence Deep Learning Is a Branch of Machine Learning The Relationship Between Artificial Intelligence, Machine Learning, and Deep Learning Deep Learning Neural Networks and GPUs Data Availability: Open Data Source, Internet of Things, and Big Data Python Deep Learning Python Frameworks Artificial Neural Networks How Artificial Neural Networks Are Structured Single Layer Perceptron (SLP) Multi Layer Perceptron (MLP) Correspondence Between Artificial and Biological Neural Networks TensorFlow TensorFlow: Google’s Framework TensorFlow: Data Flow Graph Start Programming with TensorFlow Installing TensorFlow Programming with the IPython QtConsole The Model and Sessions in TensorFlow Tensors Operation on Tensors Single Layer Perceptron with TensorFlow Before Starting Data To Be Analyzed The SLP Model Definition Learning Phase Test Phase and Accuracy Calculation Multi Layer Perceptron (with One Hidden Layer) with TensorFlow The MLP Model Definition Learning Phase Test Phase and Accuracy Calculation Multi Layer Perceptron (with Two Hidden Layers) with TensorFlow Test Phase and Accuracy Calculation Evaluation of Experimental Data Conclusions
Chapter 10: An Example—Meteorological Data — A Hypothesis to Be Tested: The Influence of the Proximity of the Sea The System in the Study: The Adriatic Sea and the Po Valley Finding the Data Source Data Analysis on Jupyter Notebook Analysis of Processed Meteorological Data The RoseWind Calculating the Mean Distribution of the Wind Speed Conclusions
Chapter 11: Embedding the JavaScript D3 Library in the IPython Notebook — The Open Data Source for Demographics The JavaScript D3 Library Drawing a Clustered Bar Chart The Choropleth Maps The Choropleth Map of the U.S.

This chapter gives an introductory overview of the world of deep learning and of the artificial neural networks on which its techniques are based. Furthermore, among the new Python frameworks for deep learning, you will use TensorFlow, which is proving to be an excellent tool for the research and development of deep learning analysis techniques. With this library you will see how to develop different models of neural networks that are the basis of deep learning.

Artificial Intelligence, Machine Learning, and Deep Learning

For anyone dealing with the world of data analysis, these three terms are ultimately very common on the web, in texts, and in seminars related to the subject.

The Relationship Between Artificial Intelligence, Machine Learning, and Deep Learning

To sum up, in this section you have seen that machine learning and deep learning are actually subclasses of artificial intelligence. Figure 9-1 shows a schematization of the classes in this relationship.

[Figure 9-1: Schematization of the relationship between artificial intelligence, machine learning, and deep learning]

Deep Learning

In this section, you will learn about some of the significant factors that led to the development of deep learning and read why so many steps forward have been made only in the last few years.
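The chapter goes on to build single- and multi-layer perceptrons with TensorFlow's lower-level, session-based API. As a rough preview (my own sketch with illustrative layer sizes, not the book's code), the same kind of multi-layer perceptron can be expressed with the modern tf.keras API:

```python
import tensorflow as tf

# a multi-layer perceptron: one hidden layer between input and output
mlp = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),                # e.g. flattened 28x28 images
    tf.keras.layers.Dense(128, activation="relu"),      # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),    # 10 output classes
])
mlp.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
# mlp.fit(x_train, y_train, epochs=5) once training data is loaded
```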


AI 2041 by Kai-Fu Lee, Chen Qiufan

3D printing, Abraham Maslow, active measures, airport security, Albert Einstein, AlphaGo, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, blue-collar work, Cambridge Analytica, carbon footprint, Charles Babbage, computer vision, contact tracing, coronavirus, corporate governance, corporate social responsibility, COVID-19, CRISPR, cryptocurrency, DALL-E, data science, deep learning, deepfake, DeepMind, delayed gratification, dematerialisation, digital map, digital rights, digital twin, Elon Musk, fake news, fault tolerance, future of work, Future Shock, game design, general purpose technology, global pandemic, Google Glasses, Google X / Alphabet X, GPT-3, happiness index / gross national happiness, hedonic treadmill, hiring and firing, Hyperloop, information security, Internet of things, iterative process, job automation, language acquisition, low earth orbit, Lyft, Maslow's hierarchy, mass immigration, mirror neurons, money: store of value / unit of account / medium of exchange, mutually assured destruction, natural language processing, Neil Armstrong, Nelson Mandela, OpenAI, optical character recognition, pattern recognition, plutocrats, post scarcity, profit motive, QR code, quantitative easing, Richard Feynman, ride hailing / ride sharing, robotic process automation, Satoshi Nakamoto, self-driving car, seminal paper, Silicon Valley, smart cities, smart contracts, smart transportation, Snapchat, social distancing, speech recognition, Stephen Hawking, synthetic biology, telemarketer, Tesla Model S, The future is already here, trolley problem, Turing test, uber lyft, universal basic income, warehouse automation, warehouse robotics, zero-sum game

“The Golden Elephant” explores deep learning’s stunning potential—as well as its potential pitfalls, like perpetuating bias. So how do researchers develop, train, and use deep learning? What are its limitations? How is deep learning fueled by data? Why are the Internet and finance the two most promising initial industries for AI? In what conditions does deep learning work optimally? When it does work, why does it seem to work so well? And what are the downsides and pitfalls of AI?

WHAT IS DEEP LEARNING?

Inspired by the tangled webs of neurons in our brains, deep learning constructs software layers of artificial neural networks with input and output layers.

That question about trade-offs lies at the heart of “The Golden Elephant,” which introduces the foundational AI concept of deep learning. Deep learning is a recent AI breakthrough. Among the many subfields of AI, machine learning is the field that has produced the most successful applications, and within machine learning, the biggest advance is “deep learning”—so much so that the terms “AI,” “machine learning,” and “deep learning” are sometimes used interchangeably (if imprecisely). Deep learning supercharged excitement in AI in 2016 when it powered AlphaGo’s stunning victory over a human competitor in Go, Asia’s most popular intellectual board game. After that headline-grabbing turn, deep learning became a prominent part of most commercial AI applications, and it is featured in most of the stories in AI 2041.

The figure on this page shows such a “cat recognition” deep learning neural network. During this process, deep learning is mathematically trained to maximize the value of an “objective function.” In the case of cat recognition, the objective function is the probability of correct recognitions of “cat” vs. “no cat.” Once “trained,” this deep learning network is essentially a giant mathematical equation that can be tested on images it hasn’t seen, and it will perform “inference” to determine the presence or absence of cats. The advent of deep learning pushed AI capabilities from unusable to usable for many domains.
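A small numerical illustration of the objective-function idea described above (my own sketch with made-up numbers, not from the book): training nudges the network's weights so that the probability it assigns to the correct "cat"/"no cat" answers goes up, and inference is simply running the trained network on images it has never seen.

```python
import numpy as np

# labels: 1 = "cat", 0 = "no cat"; predicted: the network's estimated cat probability
labels = np.array([1, 0, 1, 1])
predicted = np.array([0.9, 0.2, 0.6, 0.8])   # made-up outputs for illustration

# one common objective: average log-probability assigned to the correct answer
# (training adjusts the network's weights to push this value upward)
objective = np.mean(labels * np.log(predicted) + (1 - labels) * np.log(1 - predicted))
print(objective)   # closer to 0 is better; very negative means confident mistakes

# inference on an unseen image is just a forward pass through the trained network,
# then thresholding: "cat" if the predicted probability exceeds 0.5, "no cat" otherwise
```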


pages: 161 words: 39,526

Applied Artificial Intelligence: A Handbook for Business Leaders by Mariya Yao, Adelyn Zhou, Marlene Jia

Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, artificial general intelligence, autonomous vehicles, backpropagation, business intelligence, business process, call centre, chief data officer, cognitive load, computer vision, conceptual framework, data science, deep learning, DeepMind, en.wikipedia.org, fake news, future of work, Geoffrey Hinton, industrial robot, information security, Internet of things, iterative process, Jeff Bezos, job automation, machine translation, Marc Andreessen, natural language processing, new economy, OpenAI, pattern recognition, performance metric, price discrimination, randomized controlled trial, recommendation engine, robotic process automation, Salesforce, self-driving car, sentiment analysis, Silicon Valley, single source of truth, skunkworks, software is eating the world, source of truth, sparse data, speech recognition, statistical model, strong AI, subscription business, technological singularity, The future is already here

You can also find updated technical resources on our book website, appliedaibook.com.

Deep Learning

Deep learning is a subfield of machine learning that builds algorithms by using multi-layered artificial neural networks, which are mathematical structures loosely inspired by how biological neurons fire. Neural networks were invented in the 1950s, but recent advances in computational power and algorithm design—as well as the growth of big data—have enabled deep learning algorithms to approach human-level performance in tasks such as speech recognition and image classification. Deep learning, in combination with reinforcement learning, enabled Google DeepMind’s AlphaGo to defeat human world champions of Go in 2016, a feat that many experts had considered to be computationally impossible.

Much media attention has been focused on deep learning, and an increasing number of sophisticated technology companies have successfully implemented deep learning for enterprise-scale products. Google replaced previous statistical methods for machine translation with neural networks to achieve superior performance.(4) Microsoft announced in 2017 that they had achieved human parity in conversational speech recognition.(5) Promising computer vision startups like Clarifai employ deep learning to achieve state-of-the-art results in recognizing objects in images and video for Fortune 500 brands.(6) While deep learning models outperform older machine learning approaches to many problems, they are more difficult to develop because they require robust training of data sets and specialized expertise in optimization techniques.

Operationalizing and productizing models for enterprise-scale usage also requires different but equally difficult-to-acquire technical expertise. In practice, using simpler AI approaches like older, non-deep-learning machine learning techniques can produce faster and better results than fancy neural nets can. Rather than building custom deep learning solutions, many enterprises opt for Machine Learning as a Service (MLaaS) solutions from Google, Amazon, IBM, Microsoft, or leading AI startups. Deep learning also suffers from technical drawbacks. Successful models typically require a large volume of reliably-labeled data, which enterprises often lack.
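A hedged sketch of that advice (my own illustration, not the authors' code): on small or tabular problems, a plain scikit-learn baseline is often quick to build and surprisingly hard to beat, and it provides a reference point before investing in a custom deep learning model.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# a classic "simpler than deep learning" baseline on a small tabular dataset
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
baseline.fit(X_train, y_train)
print("held-out accuracy:", baseline.score(X_test, y_test))
```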


Four Battlegrounds by Paul Scharre

2021 United States Capitol attack, 3D printing, active measures, activist lawyer, AI winter, AlphaGo, amateurs talk tactics, professionals talk logistics, artificial general intelligence, ASML, augmented reality, Automated Insights, autonomous vehicles, barriers to entry, Berlin Wall, Big Tech, bitcoin, Black Lives Matter, Boeing 737 MAX, Boris Johnson, Brexit referendum, business continuity plan, business process, carbon footprint, chief data officer, Citizen Lab, clean water, cloud computing, commoditize, computer vision, coronavirus, COVID-19, crisis actor, crowdsourcing, DALL-E, data is not the new oil, data is the new oil, data science, deep learning, deepfake, DeepMind, Demis Hassabis, Deng Xiaoping, digital map, digital rights, disinformation, Donald Trump, drone strike, dual-use technology, Elon Musk, en.wikipedia.org, endowment effect, fake news, Francis Fukuyama: the end of history, future of journalism, future of work, game design, general purpose technology, Geoffrey Hinton, geopolitical risk, George Floyd, global supply chain, GPT-3, Great Leap Forward, hive mind, hustle culture, ImageNet competition, immigration reform, income per capita, interchangeable parts, Internet Archive, Internet of things, iterative process, Jeff Bezos, job automation, Kevin Kelly, Kevin Roose, large language model, lockdown, Mark Zuckerberg, military-industrial complex, move fast and break things, Nate Silver, natural language processing, new economy, Nick Bostrom, one-China policy, Open Library, OpenAI, PalmPilot, Parler "social media", pattern recognition, phenotype, post-truth, purchasing power parity, QAnon, QR code, race to the bottom, RAND corporation, recommendation engine, reshoring, ride hailing / ride sharing, robotic process automation, Rodney Brooks, Rubik’s Cube, self-driving car, Shoshana Zuboff, side project, Silicon Valley, slashdot, smart cities, smart meter, Snapchat, social software, sorting algorithm, South China Sea, sparse data, speech recognition, Steve Bannon, Steven Levy, Stuxnet, supply-chain attack, surveillance capitalism, systems thinking, tech worker, techlash, telemarketer, The Brussels Effect, The Signal and the Noise by Nate Silver, TikTok, trade route, TSMC

They had huge resources for managing the nontechnical aspects of government contracting. With just two total employees, Deep Learning Analytics had none. However, Deep Learning Analytics did have better technology. “We didn’t know anything about radar,” John said, but “it turned out not to be as important as some of the other things” such as “knowing how to do deep learning well” and “a really disciplined software engineering approach.” He said, “That’s one of the recurring stories of . . . deep learning since 2012, is that domain expertise isn’t always the thing that’s going to matter so much. . . . The labeled data was sufficient.” Deep Learning Analytics won a $6 million contract from DARPA for the TRACE program, beating out competitors that had better human expert knowledge on radar imaging.

Many of the datasets used to train deep neural networks are massive. ImageNet, the image database that kicked off the deep learning revolution in 2012, includes 14 million images. In order for a neural network to learn what an object looks like, such as a “cat,” “car,” or “chair,” it needs many examples to develop an internal representation of that object. For any given object, ImageNet contains roughly 500 to 1,000 images of that object to allow for a rich set of examples. Deep learning is a more data-intensive process than writing a set of rules for behavior, but deep learning can also be vastly more effective at building intelligent systems for some tasks.

While AI models are often trained at large data centers, the lower compute requirements mean that inference can increasingly be done on edge devices, such as smartphones, IoT devices, intelligent video cameras, or autonomous cars. Both training and inference are done on computer chips, and advances in computing hardware have been fundamental to the deep learning revolution. Graphics processing units (GPUs) have emerged as a key enabler for deep learning because of their ability to do parallel computation (which is valuable for neural networks) better than traditional central processing units (CPUs). A McKinsey study estimated that 97 percent of deep learning training in data centers in 2017 used GPUs. As machine learning researchers have turned to training bigger models on ever-larger datasets, they have also needed increasingly massive amounts of compute.


pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alignment Problem, AlphaGo, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Bletchley Park, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Computing Machinery and Intelligence, CRISPR, Daniel Kahneman / Amos Tversky, Danny Hillis, data science, David Graeber, deep learning, DeepMind, Demis Hassabis, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, fake news, finite state, friendly AI, future of work, Geoffrey Hinton, Geoffrey West, Santa Fe Institute, gig economy, Hans Moravec, heat death of the universe, hype cycle, income inequality, industrial robot, information retrieval, invention of writing, it is difficult to get a man to understand something, when his salary depends on his not understanding it, James Watt: steam engine, Jeff Hawkins, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Large Hadron Collider, Loebner Prize, machine translation, market fundamentalism, Marshall McLuhan, Menlo Park, military-industrial complex, mirror neurons, Nick Bostrom, Norbert Wiener, OpenAI, optical character recognition, paperclip maximiser, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, public intellectual, quantum cryptography, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, synthetic biology, systems thinking, technological determinism, technological singularity, technoutopianism, TED Talk, telemarketer, telerobotics, The future is already here, the long tail, the scientific method, theory of mind, trolley problem, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, you are the product, zero-sum game

Within a few years, Judea’s Bayesian networks had completely overshadowed the previous rule-based approaches to artificial intelligence. The advent of deep learning—in which computers, in effect, teach themselves to be smarter by observing tons of data—has given him pause, because this method lacks transparency. While recognizing the impressive achievements in deep learning by colleagues such as Michael I. Jordan and Geoffrey Hinton, he feels uncomfortable with this kind of opacity. He set out to understand the theoretical limitations of deep-learning systems and points out that basic barriers exist that will prevent them from achieving a human kind of intelligence, no matter what we do.

The new excitement about AI comes because AI researchers have recently produced powerful and effective versions of both of these learning methods. But there is nothing profoundly new about the methods themselves. BOTTOM-UP DEEP LEARNING In the 1980s, computer scientists devised an ingenious way to get computers to detect patterns in data: connectionist, or neural-network, architecture (the “neural” part was, and still is, metaphorical). The approach fell into the doldrums in the 1990s but has recently been revived with powerful “deep-learning” methods like those used at Google’s DeepMind. For example, you can give a deep-learning program a bunch of Internet images labeled “cat,” others labeled “house,” and so on. The program can detect the patterns differentiating the different sets of images and use that information to label new images correctly.

For example, researchers at Google’s DeepMind used a combination of deep learning and reinforcement learning to teach a computer to play Atari video games. The computer knew nothing about how the games worked. It began by acting randomly and got information only about what the screen looked like at each moment and how well it had scored. Deep learning helped interpret the features on the screen, and reinforcement learning rewarded the system for higher scores. The computer got very good at playing several of the games, but it also completely bombed on others that were just as easy for humans to master. A similar combination of deep learning and reinforcement learning has enabled the success of DeepMind’s AlphaZero, a program that managed to beat human players at both chess and Go, equipped with only a basic knowledge of the rules of the game and some planning capacities.
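As a rough illustration of the reinforcement-learning half of that combination, here is a toy sketch (an assumption for illustration, not DeepMind's code) of the tabular Q-learning update: the agent nudges its estimate of an action's value toward the reward it received plus the discounted value of the best action available next. The Atari agent replaced the table with a deep neural network reading screen pixels, which is not shown here.

import numpy as np

n_states, n_actions = 10, 4          # toy sizes, purely illustrative
Q = np.zeros((n_states, n_actions))  # estimated value of each action in each state
alpha, gamma = 0.1, 0.99             # assumed learning rate and discount factor

def q_update(state, action, reward, next_state):
    # Move Q[state, action] toward: reward now + discounted best value later.
    target = reward + gamma * Q[next_state].max()
    Q[state, action] += alpha * (target - Q[state, action])

# Example of a single update: in state 3, action 1 earned a reward of 1.0
# and led to state 4.
q_update(3, 1, 1.0, 4)
print(Q[3])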


The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do by Erik J. Larson

AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Alignment Problem, AlphaGo, Amazon Mechanical Turk, artificial general intelligence, autonomous vehicles, Big Tech, Black Swan, Bletchley Park, Boeing 737 MAX, business intelligence, Charles Babbage, Claude Shannon: information theory, Computing Machinery and Intelligence, conceptual framework, correlation does not imply causation, data science, deep learning, DeepMind, driverless car, Elon Musk, Ernest Rutherford, Filter Bubble, Geoffrey Hinton, Georg Cantor, Higgs boson, hive mind, ImageNet competition, information retrieval, invention of the printing press, invention of the wheel, Isaac Newton, Jaron Lanier, Jeff Hawkins, John von Neumann, Kevin Kelly, Large Hadron Collider, Law of Accelerating Returns, Lewis Mumford, Loebner Prize, machine readable, machine translation, Nate Silver, natural language processing, Nick Bostrom, Norbert Wiener, PageRank, PalmPilot, paperclip maximiser, pattern recognition, Peter Thiel, public intellectual, Ray Kurzweil, retrograde motion, self-driving car, semantic web, Silicon Valley, social intelligence, speech recognition, statistical model, Stephen Hawking, superintelligent machines, tacit knowledge, technological singularity, TED Talk, The Coming Technological Singularity, the long tail, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, theory of mind, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, Yochai Benkler

So-called adversarial attacks are not unique to AlexNet, either. Deep learning systems showing impressive performance on image recognition in fact do not understand what they are perceiving. It is therefore easy to expose the brittleness of the approach. Other experiments have drastically degraded performance by simply including background objects, easily ignored by humans, but problematic for deep learning systems. In other experiments, images that look like salt-and-pepper static on TVs—random assemblages of black and white pixels—fool deep learning systems, which might classify them as pictures of armadillos, cheetahs, or centipedes.

The system produced question-answer pairs, accumulating evidence in the processing pipeline, and scored the list of pairs using statistical techniques—all possible because of data about past games, where the outcome is known. It’s notable that Watson didn’t use deep learning—at least not the version of Watson that outplayed human champions in the televised event in 2011. Deep learning would not have helped—and here again, this is a nod to the ingenuity of the IBM team. A relatively simple machine-learning technique known as regularized logistic regression was used, even though more powerful learning algorithms were available. (Deep learning in 2011 was still relatively unknown.) More powerful learning systems would simply incur more computational training and testing expense—AI is a toolkit, in the end.

Classical AI scientists dismissed these as “shallow” or “empirical,” because statistical approaches using data didn’t use knowledge and couldn’t handle reasoning or planning very well (if at all). But with the web providing the much-needed data, the approaches started showing promise. The deep learning “revolution” began around 2006, with early work by Geoff Hinton, Yann LeCun, and Yoshua Bengio. By 2010, Google, Microsoft, and other Big Tech companies were using neural networks for major consumer applications such as voice recognition, and by 2012, Android smartphones featured neural network technology. From about this time up through 2020 (as I write this), deep learning has been the hammer causing all the problems of AI to look like a nail—problems that can be approached “from the ground up,” like playing games and recognizing voice and image data, now account for most of the research and commercial dollars in AI.


pages: 346 words: 97,890

The Road to Conscious Machines by Michael Wooldridge

Ada Lovelace, AI winter, algorithmic bias, AlphaGo, Andrew Wiles, Anthropocene, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, Bletchley Park, Boeing 747, British Empire, call centre, Charles Babbage, combinatorial explosion, computer vision, Computing Machinery and Intelligence, DARPA: Urban Challenge, deep learning, deepfake, DeepMind, Demis Hassabis, don't be evil, Donald Trump, driverless car, Elaine Herzberg, Elon Musk, Eratosthenes, factory automation, fake news, future of work, gamification, general purpose technology, Geoffrey Hinton, gig economy, Google Glasses, intangible asset, James Watt: steam engine, job automation, John von Neumann, Loebner Prize, Minecraft, Mustafa Suleyman, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, P = NP, P vs NP, paperclip maximiser, pattern recognition, Philippa Foot, RAND corporation, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stephen Hawking, Steven Pinker, strong AI, technological singularity, telemarketer, Tesla Model S, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, trolley problem, Turing machine, Turing test, universal basic income, Von Neumann architecture, warehouse robotics

But it is not obvious that just continuing to refine deep learning techniques will address this problem. Deep learning will be part of the solution, but a proper solution will, I think, require something much more than just a larger neural net, or more processing power, or more training data in the form of boring French novels. It will require breakthroughs at least as dramatic as deep learning itself. I suspect those breakthroughs will require explicitly represented knowledge as well as deep learning: somehow, we will have to bridge the gap between the world of explicitly represented knowledge, and the world of deep learning and neural nets.

Like the story of AI itself, the story of neural networks is a troubled one: there have been two ‘neural net winters’, and as recently as the turn of the century, many in AI regarded neural networks as a dead or dying field. But neural nets ultimately triumphed, and the new idea driving their resurgence is a technique called deep learning. Deep learning is the core technology of DeepMind. I will tell you the DeepMind story, and how the systems that DeepMind built attracted global adulation. But while deep learning is a powerful and important technique, it isn’t the end of the story for AI, so, just as we did with other AI technologies, we’ll discuss its limitations in detail too. Machine Learning, Briefly The goal of machine learning is to have programs that can compute a desired output from a given input, without being given an explicit recipe for how to do this.

But a resurgence did indeed begin, around the year 2006, and it led to the biggest and most highly publicized expansion ever in the history of AI. The big idea that drove the third wave of neural net research went by the name of deep learning.6 I would love to tell you that there was a single key idea which characterizes deep learning, but in truth the term refers to a collection of related ideas. Deep learning means deep in at least three different senses. Of these, perhaps the most important, as the name suggests, is simply the idea of having more layers. Each layer can process a problem at a different level of abstraction – layers close to the input layer handle low-level concepts in the data (such as the edges in a picture), and as we move deeper into the network, we find more abstract concepts being handled.
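A minimal sketch, assuming the Keras API that ships with TensorFlow, of what "deep in the sense of more layers" looks like in practice; the 784-dimensional input (a flattened 28x28 image) and the layer sizes are illustrative assumptions rather than anything specified in the passage.

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(784,)),             # a flattened 28x28 image
    layers.Dense(256, activation="relu"),   # layers near the input: low-level features
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),    # deeper layers: more abstract features
    layers.Dense(10, activation="softmax"), # output: probability of each class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # prints the stack of layers and their parameter counts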


pages: 252 words: 74,167

Thinking Machines: The Inside Story of Artificial Intelligence and Our Race to Build the Future by Luke Dormehl

"World Economic Forum" Davos, Ada Lovelace, agricultural Revolution, AI winter, Albert Einstein, Alexey Pajitnov wrote Tetris, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Apple II, artificial general intelligence, Automated Insights, autonomous vehicles, backpropagation, Bletchley Park, book scanning, borderless world, call centre, cellular automata, Charles Babbage, Claude Shannon: information theory, cloud computing, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, deep learning, DeepMind, driverless car, drone strike, Elon Musk, Flash crash, Ford Model T, friendly AI, game design, Geoffrey Hinton, global village, Google X / Alphabet X, Hans Moravec, hive mind, industrial robot, information retrieval, Internet of things, iterative process, Jaron Lanier, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kickstarter, Kodak vs Instagram, Law of Accelerating Returns, life extension, Loebner Prize, machine translation, Marc Andreessen, Mark Zuckerberg, Menlo Park, Mustafa Suleyman, natural language processing, Nick Bostrom, Norbert Wiener, out of africa, PageRank, paperclip maximiser, pattern recognition, radical life extension, Ray Kurzweil, recommendation engine, remote working, RFID, scientific management, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, social intelligence, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, tech billionaire, technological singularity, The Coming Technological Singularity, The Future of Employment, Tim Cook: Apple, Tony Fadell, too big to fail, traumatic brain injury, Turing machine, Turing test, Vernor Vinge, warehouse robotics, Watson beat the top human players on Jeopardy!

Having a computer that knows what a cat is may not sound like a particularly useful achievement, but the ability to use deep learning for computer vision has a host of real-world uses. One startup called Dextro is using deep learning to create better tools for online video searches. Instead of relying on keyword tags, Dextro’s neural nets scan through live videos, analysing both audio and images. Ask it about David Cameron, for example, and it will bring up not just Conservative Party videos, but also videos in which the UK Prime Minister is only mentioned in passing. Facebook, meanwhile, uses deep learning to automatically tag images. In June 2014, the social network published a paper describing what it refers to as its ‘DeepFace’ facial recognition technology.

Thanks to deep learning, Facebook’s algorithms have proven almost as accurate as the human brain when it comes to looking at two photos and saying whether they show the same person, regardless of whether different lighting or camera angles are used. Facebook is also using deep learning to create technology able to describe images to blind users – such as verbalising the fact that an image shows a particular friend riding a bicycle through the English countryside on a summer’s day. Other projects combine deep learning with robotics. One group of researchers from the University of Maryland has taught a robot how to cook a simple meal by showing it ‘how-to’ cooking videos available on YouTube.

Nobody, not even the smartest neural network, can be expected to learn what something is if they are never explicitly told. In fact, what Hinton discovered was that unsupervised learning could be used to train up layers of features, one layer at a time. This was the catalyst in the field of ‘deep learning’, currently the hottest area in AI. You can think of a deep learning network a bit like a factory line. After the raw materials are input, they are passed down the conveyor belt, with each subsequent stop or layer extracting a different set of high-level features. To continue the example of an image recognition network, the first layer may be used to analyse pixel brightness.
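The factory-line picture maps naturally onto a small convolutional network. The sketch below is a hypothetical illustration in Keras, not code from the book: raw pixels enter at one end, early layers respond to simple patterns such as edges and brightness changes, and later layers combine them into higher-level features before a final classification; the image size and filter counts are assumptions.

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),           # raw pixels enter the "factory"
    layers.Conv2D(16, 3, activation="relu"),   # first stop: edges, brightness gradients
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),   # later stop: textures and parts
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(2, activation="softmax"),     # end of the line: e.g. "cat" vs. "not cat"
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])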


pages: 1,082 words: 87,792

Python for Algorithmic Trading: From Idea to Cloud Deployment by Yves Hilpisch

algorithmic trading, Amazon Web Services, automated trading system, backtesting, barriers to entry, bitcoin, Brownian motion, cloud computing, coronavirus, cryptocurrency, data science, deep learning, Edward Thorp, fiat currency, global macro, Gordon Gekko, Guido van Rossum, implied volatility, information retrieval, margin call, market microstructure, Myron Scholes, natural language processing, paper trading, passive investing, popular electronics, prediction markets, quantitative trading / quantitative finance, random walk, risk free rate, risk/return, Rubik’s Cube, seminal paper, Sharpe ratio, short selling, sorting algorithm, systematic trading, transaction costs, value at risk

For example, assume that a stock trades 10 USD under its 200-day SMA level of 100. It is then expected that the stock price will return to its SMA level sometime soon. Machine and Deep Learning With machine and deep learning algorithms, one generally takes a more black-box approach to predicting market movements. For simplicity and reproducibility, the examples in this book mainly rely on historical return observations as features to train machine and deep learning algorithms to predict stock market movements. This book does not introduce algorithmic trading in a systematic fashion. Since the focus lies on applying Python in this fascinating field, readers not familiar with algorithmic trading should consult dedicated resources on the topic, some of which are cited in this chapter and the chapters that follow.

With this reasoning, the prediction problem basically boils down to a classification problem of deciding whether there will be an upwards or downwards movement. Different machine learning algorithms have been developed to attack such classification problems. This chapter introduces logistic regression, a typical baseline algorithm, for classification. Deep learning-based strategies Deep learning has been popularized by such technological giants as Facebook. Similar to machine learning algorithms, deep learning algorithms based on neural networks allow one to attack classification problems faced in financial market prediction. The chapter is organized as follows. “Using Linear Regression for Market Movement Prediction” introduces linear regression as a technique to predict index levels and the direction of price movements.
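A minimal sketch of the kind of baseline described here, not the book's exact code: logistic regression that classifies the next movement as up or down from a handful of lagged log returns. The column names, the number of lags, and the regularization setting are assumptions for illustration.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def fit_direction_model(prices: pd.Series, lags: int = 5) -> LogisticRegression:
    # Log returns of the price series.
    returns = np.log(prices / prices.shift(1))
    # Features: the previous `lags` returns; label: sign of today's return.
    data = pd.DataFrame({f"lag_{i}": returns.shift(i) for i in range(1, lags + 1)})
    data["direction"] = np.sign(returns)
    data = data.dropna()
    features = [f"lag_{i}" for i in range(1, lags + 1)]
    model = LogisticRegression(C=1e6, max_iter=1000)  # weak regularization, assumed
    model.fit(data[features], data["direction"])
    return model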

For example, testing the very same strategy instead of in-sample on an out-of-sample data set and adding transaction costs—as two ways of getting to a more realistic picture—often shows that the performance of the considered strategy “suddenly” trails the base instrument performance-wise or turns to a net loss. Using Deep Learning for Market Movement Prediction Ever since its open sourcing and publication by Google, the deep learning library TensorFlow has attracted much interest and widespread application. This section applies TensorFlow in the same way that the previous section applied scikit-learn to the prediction of stock market movements modeled as a classification problem. However, TensorFlow is not used directly; rather, it is used via the equally popular Keras deep learning package. Keras can be thought of as providing a higher-level abstraction of the TensorFlow package, with an API that is easier to understand and use.
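To make the Keras-on-TensorFlow idea concrete, here is a hedged sketch of a small dense classifier for market direction; the layer sizes and training settings are assumptions, and X and y stand for lagged-return features and 0/1 direction labels prepared elsewhere.

from tensorflow import keras
from tensorflow.keras import layers

def build_direction_classifier(n_lags: int = 5) -> keras.Model:
    model = keras.Sequential([
        layers.Input(shape=(n_lags,)),           # lagged returns as features
        layers.Dense(32, activation="relu"),
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # probability of an upward move
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage, with X (lagged returns) and y (0/1 direction) built beforehand:
# model = build_direction_classifier()
# model.fit(X, y, epochs=25, verbose=False)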


Succeeding With AI: How to Make AI Work for Your Business by Veljko Krunic

AI winter, Albert Einstein, algorithmic trading, AlphaGo, Amazon Web Services, anti-fragile, anti-pattern, artificial general intelligence, autonomous vehicles, Bayesian statistics, bioinformatics, Black Swan, Boeing 737 MAX, business process, cloud computing, commoditize, computer vision, correlation coefficient, data is the new oil, data science, deep learning, DeepMind, en.wikipedia.org, fail fast, Gini coefficient, high net worth, information retrieval, Internet of things, iterative process, job automation, Lean Startup, license plate recognition, minimum viable product, natural language processing, recommendation engine, self-driving car, sentiment analysis, Silicon Valley, six sigma, smart cities, speech recognition, statistical model, strong AI, tail risk, The Design of Experiments, the scientific method, web application, zero-sum game

During discussions, the team has mentioned that this problem could be solved using either an SVM, a decision tree, logistic regression, or a deep learning-based classification. Should you use deep learning? After all, it’s an exceedingly popular technology, has a substantial mindshare, and could solve the problem. Or should you use one of the other suggested options? Answer to question 5: You can use a deep learning-based classifier, but I typically wouldn’t try it as my first (or even second) choice. Unless your survey is a monster with a thousand questions, it’s unclear that you’ll be able to train a large deep learning network at all. I’m not persuaded that for the typical survey of only a few questions, more complicated methods are going to produce better results.
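The practical advice here translates directly into a few lines of scikit-learn. The sketch below is an assumption-laden illustration, not the book's code: given a modest feature matrix X and labels y from the survey, cross-validate the simple candidates before reaching for anything deeper.

from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# The three "simple first" candidates mentioned in the passage.
candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
    "decision tree": DecisionTreeClassifier(max_depth=5),  # depth limit assumed
}

def compare_simple_models(X, y):
    # Five-fold cross-validation gives a quick, honest estimate of each model.
    for name, clf in candidates.items():
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.3f}")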

During discussions, the team has mentioned that this problem could be solved using either an SVM, a decision tree, logistic regression, or a deep learning-based classification. Should you use deep learning? After all, it’s an exceedingly popular technology, has a substantial mindshare, and could solve the problem. Or should you use one of the other suggested options? Question 6: You answered question 5 using an algorithm of your choice. Suppose the algorithm you chose didn’t provide a good enough prediction of a customer returning the product. Should you use a better ML algorithm? Is it now time to use the latest and greatest from the field of deep learning? Summary: Every AI project uses some form of the ML pipeline.

Harris, Murphy, and Vaisman’s book [66] provides a good summary of the state of data science before the advancement of deep learning. Data scientist—A practitioner of the field of data science. Many sources (including this book) classify AI practitioners as data scientists. Database administrator (DBA)—A professional responsible for the maintenance of a database. Most commonly, a DBA would be responsible for maintaining a RDBMS-based database. Deep learning—A subfield of AI that uses artificial neural networks arranged in a significant number of layers. In the last few years, deep learning algorithms have been successful in a large number of highly visible applications, including image processing and speech and audio recognition.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, AlphaGo, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, behavioural economics, Bletchley Park, blockchain, Boston Dynamics, brain emulation, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, complexity theory, computer vision, Computing Machinery and Intelligence, connected car, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, fake news, Flash crash, full employment, future of work, Garrett Hardin, Geoffrey Hinton, Gerolamo Cardano, Goodhart's law, Hans Moravec, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, luminiferous ether, machine readable, machine translation, Mark Zuckerberg, multi-armed bandit, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, OpenAI, openstreetmap, P = NP, paperclip maximiser, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, trolley problem, Turing machine, Turing test, universal basic income, uranium enrichment, vertical integration, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, web application, zero-sum game

The current champion learning algorithm for machine translation is a form of so-called deep learning, and it produces a rule in the form of an artificial neural network with hundreds of layers and millions of parameters.D Other deep learning algorithms have turned out to be very good at classifying the objects in images and recognizing the words in a speech signal. Machine translation, speech recognition, and visual object recognition are three of the most important subfields in AI, which is why there has been so much excitement about the prospects for deep learning. One can argue almost endlessly about whether deep learning will lead directly to human-level AI.

News article on Geoff Hinton having second thoughts about deep networks: Steve LeVine, “Artificial intelligence pioneer says we need to start over,” Axios, September 15, 2017. 9. A catalog of shortcomings of deep learning: Gary Marcus, “Deep learning: A critical appraisal,” arXiv:1801.00631 (2018). 10. A popular textbook on deep learning, with a frank assessment of its weaknesses: François Chollet, Deep Learning with Python (Manning Publications, 2017). 11. An explanation of explanation-based learning: Thomas Dietterich, “Learning at the knowledge level,” Machine Learning 1 (1986): 287–315. 12. A superficially quite different explanation of explanation-based learning: John Laird, Paul Rosenbloom, and Allen Newell, “Chunking in Soar: The anatomy of a general learning mechanism,” Machine Learning 1 (1986): 11–46.

Computers are also made of circuits, both in their memories and in their processing units; but those circuits have to be arranged in certain ways, and layers of software have to be added, before the computer can support the operation of high-level programming languages and logical reasoning systems. At present, however, there is no sign that deep learning systems can develop such capabilities by themselves—nor does it make scientific sense to require them to do so. There are further reasons to think that deep learning may reach a plateau well short of general intelligence, but it’s not my purpose here to diagnose all the problems: others, both inside8 and outside9 the deep learning community, have noted many of them. The point is that simply creating larger and deeper networks and larger data sets and bigger machines is not enough to create human-level AI.


pages: 499 words: 144,278

Coders: The Making of a New Tribe and the Remaking of the World by Clive Thompson

"Margaret Hamilton" Apollo, "Susan Fowler" uber, 2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 4chan, 8-hour work day, Aaron Swartz, Ada Lovelace, AI winter, air gap, Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, Andy Rubin, Asperger Syndrome, augmented reality, Ayatollah Khomeini, backpropagation, barriers to entry, basic income, behavioural economics, Bernie Sanders, Big Tech, bitcoin, Bletchley Park, blockchain, blue-collar work, Brewster Kahle, Brian Krebs, Broken windows theory, call centre, Cambridge Analytica, cellular automata, Charles Babbage, Chelsea Manning, Citizen Lab, clean water, cloud computing, cognitive dissonance, computer vision, Conway's Game of Life, crisis actor, crowdsourcing, cryptocurrency, Danny Hillis, data science, David Heinemeier Hansson, deep learning, DeepMind, Demis Hassabis, disinformation, don't be evil, don't repeat yourself, Donald Trump, driverless car, dumpster diving, Edward Snowden, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Ethereum, ethereum blockchain, fake news, false flag, Firefox, Frederick Winslow Taylor, Free Software Foundation, Gabriella Coleman, game design, Geoffrey Hinton, glass ceiling, Golden Gate Park, Google Hangouts, Google X / Alphabet X, Grace Hopper, growth hacking, Guido van Rossum, Hacker Ethic, hockey-stick growth, HyperCard, Ian Bogost, illegal immigration, ImageNet competition, information security, Internet Archive, Internet of things, Jane Jacobs, John Markoff, Jony Ive, Julian Assange, Ken Thompson, Kickstarter, Larry Wall, lone genius, Lyft, Marc Andreessen, Mark Shuttleworth, Mark Zuckerberg, Max Levchin, Menlo Park, meritocracy, microdosing, microservices, Minecraft, move 37, move fast and break things, Nate Silver, Network effects, neurotypical, Nicholas Carr, Nick Bostrom, no silver bullet, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, Oculus Rift, off-the-grid, OpenAI, operational security, opioid epidemic / opioid crisis, PageRank, PalmPilot, paperclip maximiser, pattern recognition, Paul Graham, paypal mafia, Peter Thiel, pink-collar, planetary scale, profit motive, ransomware, recommendation engine, Richard Stallman, ride hailing / ride sharing, Rubik’s Cube, Ruby on Rails, Sam Altman, Satoshi Nakamoto, Saturday Night Live, scientific management, self-driving car, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, single-payer health, Skype, smart contracts, Snapchat, social software, software is eating the world, sorting algorithm, South of Market, San Francisco, speech recognition, Steve Wozniak, Steven Levy, systems thinking, TaskRabbit, tech worker, techlash, TED Talk, the High Line, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, universal basic income, urban planning, Wall-E, Watson beat the top human players on Jeopardy!, WeWork, WikiLeaks, women in the workforce, Y Combinator, Zimmermann PGP, éminence grise

You could now create neural nets with many layers, or even dozens: “deep learning,” as it’s called, because of how many layers are stacked up. By 2012, the field had a seismic breakthrough. Up at the University of Toronto, the British computer scientist Geoff Hinton had been beavering away for two decades on improving neural networks. That year he and a team of students showed off the most impressive neural net yet—by soundly beating competitors at an annual AI shootout. The ImageNet challenge, as it’s known, is an annual competition among AI researchers to see whose system is best at recognizing images. That year, Hinton’s deep-learning neural net got only 15.3 percent of the images wrong.

At first, the result surprised them, though on reflection it made sense: There are a lot of cats on YouTube, so any self-learning algorithm told to pick out salient features that recur over and over again might, in essence, discover humanity’s online obsession with felines. Still, it was a spookily humanlike bit of reasoning. The Terminator was coming to life, and it could grasp the concept of cats! Google soon began throwing enormous resources at deep learning, developing its abilities and integrating it into as many products as possible. They trained deep-learning nets on language pairings—showing them, say, all the Canadian parliamentary proceedings that were translated into both English and French, or their own collections of crowdsourced translations. When they were done, Google Translate became, in a single night, remarkably better—so much improved that Japanese scholars were marveling at the machine’s deft ability to translate complex literary passages between their language and English.

A few short years later, deep learning had swept the world of software. Companies everywhere were rushing to incorporate it into their services. Ng was snapped up by Baidu, the Chinese search giant, as it frantically sought to catch up to Google’s AI wave. Facebook engineers had long been using many different styles of machine learning to help recognize faces in photos, filter stories in the News Feed, and predict whether users would click on an ad; it set up an experimental AI research lab, and soon Facebook was producing a deep-learning model that could recognize faces with 97.35 percent accuracy, 27 percent better than the state of the art (“closely approaching human-level performance,” as they noted).


pages: 339 words: 92,785

I, Warbot: The Dawn of Artificially Intelligent Conflict by Kenneth Payne

Abraham Maslow, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, anti-communist, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, Asperger Syndrome, augmented reality, Automated Insights, autonomous vehicles, backpropagation, Black Lives Matter, Bletchley Park, Boston Dynamics, classic study, combinatorial explosion, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, COVID-19, CRISPR, cuban missile crisis, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, disinformation, driverless car, drone strike, dual-use technology, Elon Musk, functional programming, Geoffrey Hinton, Google X / Alphabet X, Internet of things, job automation, John Nash: game theory, John von Neumann, Kickstarter, language acquisition, loss aversion, machine translation, military-industrial complex, move 37, mutually assured destruction, Nash equilibrium, natural language processing, Nick Bostrom, Norbert Wiener, nuclear taboo, nuclear winter, OpenAI, paperclip maximiser, pattern recognition, RAND corporation, ransomware, risk tolerance, Ronald Reagan, self-driving car, semantic web, side project, Silicon Valley, South China Sea, speech recognition, Stanislav Petrov, stem cell, Stephen Hawking, Steve Jobs, strong AI, Stuxnet, technological determinism, TED Talk, theory of mind, TikTok, Turing machine, Turing test, uranium enrichment, urban sprawl, V2 rocket, Von Neumann architecture, Wall-E, zero-sum game

They can even create wholly convincing photographs of people who’ve never actually existed. Artificial neuroscience The third factor in the deep learning revival was more arcane, but critical. This was the increasing technical sophistication of the neural networks that underpinned connectionism. One important development was the discovery of ‘backprop’, or backward propagation. This was a key bit of maths that allowed the artificial neurons in the connectionist AI to learn effectively. With multiple layers in the modern ‘deep learning network’, and with many more neurons and connections between them, working out the optimum connection strengths had been fiendishly difficult.
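A toy numerical sketch of the idea, assuming nothing from the book beyond the concept itself: a two-layer network in which the chain rule pushes the output error backwards through the layers, yielding a gradient for every connection strength, which gradient descent then adjusts.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))    # 4 toy examples, 3 input features (assumed sizes)
y = rng.normal(size=(4, 1))    # toy targets
W1 = rng.normal(size=(3, 5))   # input-to-hidden connection strengths
W2 = rng.normal(size=(5, 1))   # hidden-to-output connection strengths

for step in range(200):
    # Forward pass: compute the network's prediction.
    h = np.tanh(x @ W1)
    y_hat = h @ W2
    loss = ((y_hat - y) ** 2).mean()

    # Backward pass: chain rule, layer by layer, from output back to input.
    d_yhat = 2 * (y_hat - y) / len(x)
    dW2 = h.T @ d_yhat
    d_h = (d_yhat @ W2.T) * (1 - h ** 2)   # derivative of tanh
    dW1 = x.T @ d_h

    # Gradient-descent update of the connection strengths.
    W2 -= 0.1 * dW2
    W1 -= 0.1 * dW1

print(f"final loss: {loss:.4f}")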

In 2017, the Pentagon stood up an ‘algorithmic warfare cross functional team’, known as Project Maven. The team would consolidate ‘all initiatives that develop, employ, or field artificial intelligence, automation, machine learning, deep learning, and computer vision algorithms’.12 It was a small team, initially, but would grow rapidly. And one of its main contractors on image recognition? Google, of course. The stage was set for the arrival of deep warbots. Hype or hope? The deep learning revolution has produced a new wave of AI hype, only some of which is justified. In rapid succession, connectionist AI delivered results that would have amazed earlier students of symbolic logic.

While autonomous weapons and AI systems have been around for more than half a century, the recent surge in deep learning AI creates the possibility of swarming behaviours, and this, more than any other feature, will likely have an impact on warbot tactics. The swarm attacks! Swarms don’t require a particularly sophisticated intelligence. At least not sophisticated in the sense we often use the term—carefully weighing goals and ways before deciding. In nature, swarming is often an instinctive behaviour. In robotics it’s the sort of skilful control challenge that deep learning AI excels at. Above the deserts of the US Navy’s vast China Lake research facility, the Defense Department recently demonstrated the state of the art with its Perdix drones.


pages: 144 words: 43,356

Surviving AI: The Promise and Peril of Artificial Intelligence by Calum Chace

3D printing, Ada Lovelace, AI winter, Airbnb, Alvin Toffler, artificial general intelligence, augmented reality, barriers to entry, basic income, bitcoin, Bletchley Park, blockchain, brain emulation, Buckminster Fuller, Charles Babbage, cloud computing, computer age, computer vision, correlation does not imply causation, credit crunch, cryptocurrency, cuban missile crisis, deep learning, DeepMind, dematerialisation, Demis Hassabis, discovery of the americas, disintermediation, don't be evil, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, everywhere but in the productivity statistics, Flash crash, friendly AI, Geoffrey Hinton, Google Glasses, hedonic treadmill, hype cycle, industrial robot, Internet of things, invention of agriculture, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, life extension, low skilled workers, machine translation, Mahatma Gandhi, means of production, mutually assured destruction, Neil Armstrong, Nicholas Carr, Nick Bostrom, paperclip maximiser, pattern recognition, peer-to-peer, peer-to-peer model, Peter Thiel, radical life extension, Ray Kurzweil, Robert Solow, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley ideology, Skype, South Sea Bubble, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Jobs, strong AI, technological singularity, TED Talk, The future is already here, The Future of Employment, theory of mind, Turing machine, Turing test, universal basic income, Vernor Vinge, wage slave, Wall-E, zero-sum game

They are particularly useful in speech recognition and handwriting recognition systems. Deep learning Deep learning is a subset of machine learning. Its algorithms use several layers of processing, each taking data from previous layers and passing an output up to the next layer. The output may vary according to the nature of the input, and it is not necessarily binary, simply on or off, but can be weighted. The number of layers can vary too, with anything above ten layers seen as very deep learning. Artificial neural nets (ANN) are an important type of deep learning system – indeed, some people argue that deep learning is simply a re-branding of neural networks.
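A minimal illustration of the "weighted" behaviour described above (an assumption for clarity, not the book's code): a single artificial neuron computes a weighted sum of its inputs and squashes it through an activation function, producing a graded output rather than a simple on/off signal; a layer is many such neurons, and deep learning stacks many layers.

import numpy as np

def neuron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    # Weighted sum of the inputs, plus a bias, squashed into (0, 1) by a sigmoid.
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))

# Example with made-up numbers: two inputs, two weights, one bias.
print(neuron(np.array([0.2, 0.7]), np.array([1.5, -0.8]), bias=0.1))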

Early hopes for the quick development of thinking machines were dashed, however, and neural nets fell into disuse until the late 1980s, when they experienced a renaissance along with what came to be known as deep learning thanks to pioneers Yann LeCun (now at Facebook), Geoff Hinton (now at Google) and Yoshua Bengio, a professor at the University of Montreal. Yann LeCun describes deep learning as follows. “A pattern recognition system is like a black box with a camera at one end, a green light and a red light on top, and a whole bunch of knobs on the front. The learning algorithm tries to adjust the knobs so that when, say, a dog is in front of the camera, the red light turns on, and when a car is put in front of the camera, the green light turns on.

The speculation that a system containing enough of the types of operations involved in machine learning might generate a conscious mind intrigues some neuroscientists, and strikes others as wildly implausible, or as something that is many years away. Gary Marcus, a psychology professor at New York University, says “deep learning is only part of the larger challenge of building intelligent machines. Such techniques [are] still a long way from integrating abstract knowledge, such as information about what objects are, what they are for, and how they are typically used. The most powerful A.I. systems, like Watson, . . . use techniques like deep learning as just one element in a very complicated ensemble of techniques . . .” (33) Andrew Ng, formerly head of the Google Brain project and now in charge of Baidu’s AI activities, says that current machine learning techniques are like a “cartoon version” of the human brain.


pages: 477 words: 75,408

The Economic Singularity: Artificial Intelligence and the Death of Capitalism by Calum Chace

"World Economic Forum" Davos, 3D printing, additive manufacturing, agricultural Revolution, AI winter, Airbnb, AlphaGo, Alvin Toffler, Amazon Robotics, Andy Rubin, artificial general intelligence, augmented reality, autonomous vehicles, banking crisis, basic income, Baxter: Rethink Robotics, Berlin Wall, Bernie Sanders, bitcoin, blockchain, Boston Dynamics, bread and circuses, call centre, Chris Urmson, congestion charging, credit crunch, David Ricardo: comparative advantage, deep learning, DeepMind, Demis Hassabis, digital divide, Douglas Engelbart, Dr. Strangelove, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Fairchild Semiconductor, Flynn Effect, full employment, future of work, Future Shock, gender pay gap, Geoffrey Hinton, gig economy, Google Glasses, Google X / Alphabet X, Hans Moravec, Herman Kahn, hype cycle, ImageNet competition, income inequality, industrial robot, Internet of things, invention of the telephone, invisible hand, James Watt: steam engine, Jaron Lanier, Jeff Bezos, job automation, John Markoff, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kiva Systems, knowledge worker, lifelogging, lump of labour, Lyft, machine translation, Marc Andreessen, Mark Zuckerberg, Martin Wolf, McJob, means of production, Milgram experiment, Narrative Science, natural language processing, Neil Armstrong, new economy, Nick Bostrom, Occupy movement, Oculus Rift, OpenAI, PageRank, pattern recognition, post scarcity, post-industrial society, post-work, precariat, prediction markets, QWERTY keyboard, railway mania, RAND corporation, Ray Kurzweil, RFID, Rodney Brooks, Sam Altman, Satoshi Nakamoto, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Skype, SoftBank, software is eating the world, speech recognition, Stephen Hawking, Steve Jobs, TaskRabbit, technological singularity, TED Talk, The future is already here, The Future of Employment, Thomas Malthus, transaction costs, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, Uber for X, uber lyft, universal basic income, Vernor Vinge, warehouse automation, warehouse robotics, working-age population, Y Combinator, young professional

The way that some game-playing AIs become superhuman in their field is by playing millions of games against versions of themselves and learning from the outcomes.) In deep learning, the algorithms operate in several layers, each layer processing data from previous ones and passing the output up to the next layer. The output is not necessarily binary, simply on or off: it can be weighted. The number of layers can vary too, with anything above ten layers seen as very deep learning – although in December 2015 a Microsoft team won the ImageNet competition with a system which employed a massive 152 layers.[lxvi] Deep learning, and especially artificial neural nets (ANNs), is in many ways a return to an older approach to AI which was explored in the 1960s but abandoned because it proved ineffective.

“Then you teach it Mandarin: it learns Mandarin, but it also becomes better at English, and quite frankly none of us know exactly why.”[xciii] In December 2015, Baidu announced that its speech recognition system Deep Speech 2 performed better than humans with short phrases out of context.[xciv] It uses deep learning techniques to recognise Mandarin. Learning and innovating It can no longer be said that machines do not learn, or that they cannot invent. In December 2013, DeepMind demonstrated an AI system which used deep reinforcement learning to teach itself to play old-style Atari video games like Breakout and Pong.[xcv] These are games which previous AI systems found hard to play because they involve hand-to-eye co-ordination.

doi=10.1257/jep.29.3.3 [lvi] https://reason.com/archives/2015/03/03/how-to-survive-a-robot-uprisin [lvii] http://www.politico.com/magazine/story/2013/11/the-robots-are-here-098995 [lviii] http://www.forbes.com/sites/danschawbel/2015/08/04/geoff-colvin-why-humans-will-triumph-over-machines/2/ [lix] http://www.eastoftheweb.com/short-stories/UBooks/BoyCri.shtml [lx] German academic Marcus Hutter, and Shane Legg, co-founder of DeepMind [lxi] http://www.savethechimps.org/about-us/chimp-facts/ [lxii] The Shape of Automation for Men and Management by Herbert Simon, 1965 [lxiii] Computation: Finite and Infinite Machines by Marvin Minsky, 1967 [lxiv] http://www.wired.com/2016/01/microsoft-neural-net-shows-deep-learning-can-get-way-deeper/ [lxv] http://www.etymonline.com/index.php?term=algorithm [lxvi] http://www.wired.com/2016/01/microsoft-neural-net-shows-deep-learning-can-get-way-deeper/ [lxvii] Moravec wrote about this phenomenon in his 1988 book “Mind Children”. A possible explanation is that the sensory motor skills and spatial awareness that we develop as children are the product of millions of years of evolution.


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

"Friedman doctrine" OR "shareholder theory", Ada Lovelace, AI winter, air gap, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Andy Rubin, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Bayesian statistics, behavioural economics, Bernie Sanders, Big Tech, bioinformatics, Black Lives Matter, blockchain, Bretton Woods, business intelligence, Cambridge Analytica, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, CRISPR, cross-border payments, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, disinformation, distributed ledger, don't be evil, Donald Trump, Elon Musk, fail fast, fake news, Filter Bubble, Flynn Effect, Geoffrey Hinton, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Herman Kahn, high-speed rail, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, machine translation, Mark Zuckerberg, Menlo Park, move fast and break things, Mustafa Suleyman, natural language processing, New Urbanism, Nick Bostrom, one-China policy, optical character recognition, packet switching, paperclip maximiser, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, Recombinant DNA, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Salesforce, Sand Hill Road, Second Machine Age, self-driving car, seminal paper, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, surveillance capitalism, technological singularity, The Coming Technological Singularity, the long tail, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

By 2009, Hinton’s lab had applied deep neural nets for speech recognition, and a chance meeting with a Microsoft researcher named Li Deng meant that the technology could be piloted in a meaningful way. Deng, a Chinese deep-learning specialist, was a pioneer in speech recognition using large-scale deep learning. By 2010, the technique was being tested at Google. Just two years later, deep neural nets were being used in commercial products. If you used Google Voice and its transcription services, that was deep learning, and the technique became the basis for all the digital assistants we use today. Siri, Google, and Amazon’s Alexa are all powered by deep learning. The AI community of interdisciplinary researchers had grown significantly since the Dartmouth summer.

We can’t evolve significantly on our own, and the existing evolutionary timeframe doesn’t suit our current technological aspirations. The promise of deep learning was an acceleration of the evolution of intelligence itself, which would only temporarily involve humans. A deep neural net would be given a basic set of parameters about the data by a person, and then the system would go out and learn on its own by recognizing patterns using many layers of processing. For researchers, the attraction of deep learning is that by design, machines make decisions unpredictably. Thinking in ways we humans have never imagined—or been able to do ourselves—is vitally important when trying to solve big problems for which there haven’t ever been clear solutions.

With all those simulated neurons and layers, exactly what happened and in which order can’t be easily reverse-engineered. One team of Google researchers did try to develop a new technique to make AI more transparent. In essence, the researchers ran a deep-learning image recognition algorithm in reverse to observe how the system recognized certain things such as trees, snails, and pigs. The project, called DeepDream, used a network created by MIT’s Computer Science and AI Lab and ran Google’s deep-learning algorithm in reverse. Instead of training it to recognize objects using the layer-by-layer approach—to learn that a rose is a rose, and a daffodil is a daffodil—it was trained to warp the images and generate objects that weren’t there.


pages: 301 words: 85,126

AIQ: How People and Machines Are Smarter Together by Nick Polson, James Scott

Abraham Wald, Air France Flight 447, Albert Einstein, algorithmic bias, Amazon Web Services, Atul Gawande, autonomous vehicles, availability heuristic, basic income, Bayesian statistics, Big Tech, Black Lives Matter, Bletchley Park, business cycle, Cepheid variable, Checklist Manifesto, cloud computing, combinatorial explosion, computer age, computer vision, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Donald Trump, Douglas Hofstadter, Edward Charles Pickering, Elon Musk, epigenetics, fake news, Flash crash, Grace Hopper, Gödel, Escher, Bach, Hans Moravec, Harvard Computers: women astronomers, Higgs boson, index fund, information security, Isaac Newton, John von Neumann, late fees, low earth orbit, Lyft, machine translation, Magellanic Cloud, mass incarceration, Moneyball by Michael Lewis explains big data, Moravec's paradox, more computing power than Apollo, natural language processing, Netflix Prize, North Sea oil, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, pattern recognition, Pierre-Simon Laplace, ransomware, recommendation engine, Ronald Reagan, Salesforce, self-driving car, sentiment analysis, side project, Silicon Valley, Skype, smart cities, speech recognition, statistical model, survivorship bias, systems thinking, the scientific method, Thomas Bayes, Uber for X, uber lyft, universal basic income, Watson beat the top human players on Jeopardy!, young professional

See also Lee Bell, “Nvidia to Train 100,000 Developers in ‘Deep Learning’ AI to Bolster Healthcare Research,” Forbes.com, May 11, 2017, https://www.forbes.com/sites/leebelltech/2017/05/11/nvidia-to-train-100000-developers-in-deep-learning-ai-to-bolster-health-care-research/. 50.  See, e.g., Tom Simonite, “The Recipe for the Perfect Robot Surgeon,” MIT Technology Review, October 14, 2016, https://www.technologyreview.com/s/602595/the-recipe-for-the-perfect-robot-surgeon/. 51.

But then Koike realized that he could use a piece of open-source AI software from Google, called TensorFlow, to accomplish the same task, by coding up a “deep-learning” algorithm that could classify a cucumber based on a photograph. Koike had never used AI or TensorFlow before, but with all the free resources out there, he didn’t find it hard to teach himself how. When a video of his AI-powered sorting machine hit YouTube, Koike became an international deep-learning/cucumber celebrity. It wasn’t merely that he had given people a feel-good story, saving his mother from hours of drudgery. He’d also sent an inspiring message to students and coders across the world: that if AI can solve problems in cucumber farming, it can solve problems just about anywhere.
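In the same spirit as Koike's project, here is a hedged sketch of such a pipeline in TensorFlow/Keras; the folder name, image size, and layer sizes are illustrative assumptions (one subfolder of photos per cucumber grade), not details of his actual system.

import tensorflow as tf
from tensorflow.keras import layers

# Load labeled photos from a hypothetical directory with one subfolder per grade.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "cucumber_photos/",
    image_size=(128, 128),
    batch_size=32,
)
n_grades = len(train_ds.class_names)

model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 255),               # scale pixel values to [0, 1]
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(n_grades, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, epochs=5)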

We’re omitting a lot of details here that are essential for making this strategy succeed, but they’re all just minutiae, the kind of thing you learn if you study AI in graduate school. If you just think “trial and error,” you’re 90% of the way there. Factor 4: Deep Learning In addition to the richness of our models, the size of our data sets, and the speed of our computers, there’s a fourth major way in which prediction rules have improved dramatically: people have learned how to extract useful information from vastly more complicated inputs. If you’ve heard the term “deep learning” and wondered what it means, we’re about to explain. We said at the beginning of the chapter that computers are agnostic about the type of input you give them.


pages: 345 words: 75,660

Prediction Machines: The Simple Economics of Artificial Intelligence by Ajay Agrawal, Joshua Gans, Avi Goldfarb

Abraham Wald, Ada Lovelace, AI winter, Air France Flight 447, Airbus A320, algorithmic bias, AlphaGo, Amazon Picking Challenge, artificial general intelligence, autonomous vehicles, backpropagation, basic income, Bayesian statistics, Black Swan, blockchain, call centre, Capital in the Twenty-First Century by Thomas Piketty, Captain Sullenberger Hudson, carbon tax, Charles Babbage, classic study, collateralized debt obligation, computer age, creative destruction, Daniel Kahneman / Amos Tversky, data acquisition, data is the new oil, data science, deep learning, DeepMind, deskilling, disruptive innovation, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, financial engineering, fulfillment center, general purpose technology, Geoffrey Hinton, Google Glasses, high net worth, ImageNet competition, income inequality, information retrieval, inventory management, invisible hand, Jeff Hawkins, job automation, John Markoff, Joseph Schumpeter, Kevin Kelly, Lyft, Minecraft, Mitch Kapor, Moneyball by Michael Lewis explains big data, Nate Silver, new economy, Nick Bostrom, On the Economy of Machinery and Manufactures, OpenAI, paperclip maximiser, pattern recognition, performance metric, profit maximization, QWERTY keyboard, race to the bottom, randomized controlled trial, Ray Kurzweil, ride hailing / ride sharing, Robert Solow, Salesforce, Second Machine Age, self-driving car, shareholder value, Silicon Valley, statistical model, Stephen Hawking, Steve Jobs, Steve Jurvetson, Steven Levy, strong AI, The Future of Employment, the long tail, The Signal and the Noise by Nate Silver, Tim Cook: Apple, trolley problem, Turing test, Uber and Lyft, uber lyft, US Airways Flight 1549, Vernor Vinge, vertical integration, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, William Langewiesche, Y Combinator, zero-sum game

The vertical axis measures the error rate, so lower is better. In 2010, the best machine predictions made mistakes in 28 percent of images. In 2012, the contestants used deep learning for the first time and the error rate plunged to 16 percent. As Princeton professor and computer scientist Olga Russakovsky notes, “2012 was really the year when there was a massive breakthrough in accuracy, but it was also a proof of concept for deep learning models, which had been around for decades.”8 Rapid improvements in the algorithms continued, and a team beat the human benchmark in the competition for the first time in 2015.

“Mastercard Rolls Out Artificial Intelligence across Its Global Network,” Mastercard press release, November 30, 2016, https://newsroom.mastercard.com/press-releases/mastercard-rolls-out-artificial-intelligence-across-its-global-network/. 3. Adam Geitgey, “Machine Learning Is Fun, Part 5: Language Translation with Deep Learning and the Magic of Sequences,” Medium, August 21, 2016, https://medium.com/@ageitgey/machine-learning-is-fun-part-5-language-translation-with-deep-learning-and-the-magic-of-sequences-2ace0acca0aa. 4. Yiting Sun, “Why 500 Million People in China Are Talking to This AI,” MIT Technology Review, September 14, 2017, https://www.technologyreview.com/s/608841/why-500-million-people-in-china-are-talking-to-this-ai/. 5.

Tests That Were Found to Discriminate,” New York Times, July 23, 2009, https://cityroom.blogs.nytimes.com/2009/07/23/the-fire-dept-tests-that-were-found-to-discriminate/?mcubz=0&_r=0; US v. City of New York (FDNY), https://www.justice.gov/archives/crt-fdny/overview. 6. Paul Voosen, “How AI Detectives Are Cracking Open the Black Box of Deep Learning,” Science, July 6, 2017, http://www.sciencemag.org/news/2017/07/how-ai-detectives-are-cracking-open-black-box-deep-learning. 7. T. Blake, C. Nosko, and S. Tadelis, “Consumer Heterogeneity and Paid Search Effectiveness: A Large-Scale Field Experiment,” Econometrica 83 (2015): 155–174. 8. Hossein Hosseini, Baicen Xiao, and Radha Poovendran, “Deceiving Google’s Cloud Video Intelligence API Built for Summarizing Videos” (paper presented at CVPR Workshops, March 31, 2017), https://arxiv.org/pdf/1703.09793.pdf; see also “Artificial Intelligence Used by Google to Scan Videos Could Easily Be Tricked by a Picture of Noodles,” Quartz, April 4, 2017, https://qz.com/948870/the-ai-used-by-google-to-scan-videos-could-easily-be-tricked-by-a-picture-of-noodles/. 9.


pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values by Brian Christian

Albert Einstein, algorithmic bias, Alignment Problem, AlphaGo, Amazon Mechanical Turk, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, butterfly effect, Cambridge Analytica, Cass Sunstein, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, data science, deep learning, DeepMind, Donald Knuth, Douglas Hofstadter, effective altruism, Elaine Herzberg, Elon Musk, Frances Oldham Kelsey, game design, gamification, Geoffrey Hinton, Goodhart's law, Google Chrome, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, hedonic treadmill, ImageNet competition, industrial robot, Internet Archive, John von Neumann, Joi Ito, Kenneth Arrow, language acquisition, longitudinal study, machine translation, mandatory minimum, mass incarceration, multi-armed bandit, natural language processing, Nick Bostrom, Norbert Wiener, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, OpenAI, Panopticon Jeremy Bentham, pattern recognition, Peter Singer: altruism, Peter Thiel, precautionary principle, premature optimization, RAND corporation, recommendation engine, Richard Feynman, Rodney Brooks, Saturday Night Live, selection bias, self-driving car, seminal paper, side project, Silicon Valley, Skinner box, sparse data, speech recognition, Stanislav Petrov, statistical model, Steve Jobs, strong AI, the map is not the territory, theory of mind, Tim Cook: Apple, W. E. B. Du Bois, Wayback Machine, zero-sum game

., “Weight Uncertainty in Neural Networks”; and Hernández-Lobato and Adams, “Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks.” For a more detailed history of these ideas, see Gal, “Uncertainty in Deep Learning.” For an overview of probabilistic methods in machine learning more generally, see Ghahramani, “Probabilistic Machine Learning and Artificial Intelligence.” 17. Yarin Gal, personal interview, July 11, 2019. 18. Yarin Gal, “Modern Deep Learning Through Bayesian Eyes” (lecture), Microsoft Research, December 11, 2015, https://www.microsoft.com/en-us/research/video/modern-deep-learning-through-bayesian-eyes/. 19. For a look at using dropout-ensemble uncertainty to detect adversarial examples, see Smith and Gal, “Understanding Measures of Uncertainty for Adversarial Example Detection.” 20.

But on this matter, too, Bellemare would soon come around. Simply plugging deep learning into a classic RL algorithm and running it on seven of the Atari games, Mnih was able to beat every previous RL benchmark in six of them. Not only that: in three of the games, their program appeared to be as good as a human player. They submitted a workshop paper in late 2013, marking their progress.6 “It was just sort of a proof-of-concept paper,” says Bellemare, “that convolutional nets could do this.” “Really,” he says, “it was bringing the deep-learning part to solve what reinforcement-learning researchers hadn’t been able to do in ages, which is to generate these features on the fly.
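The workshop paper itself is not reproduced here, but its core idea was a convolutional network that maps raw game frames to an estimated value for each possible action. A loose sketch of that idea follows; layer sizes and the action count are chosen only for illustration, and the full method also needs experience replay and a training loop.

    # Sketch of a convolutional Q-network in the spirit of the Atari work:
    # input is a stack of 4 preprocessed grayscale frames, output is one value per action.
    import numpy as np
    import tensorflow as tf

    NUM_ACTIONS = 6   # illustrative; depends on the game

    q_network = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(84, 84, 4)),
        tf.keras.layers.Conv2D(16, 8, strides=4, activation="relu"),
        tf.keras.layers.Conv2D(32, 4, strides=2, activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(NUM_ACTIONS),   # estimated value of each action
    ])

    def choose_action(frames, epsilon=0.1):
        # Epsilon-greedy choice from the network's value estimates.
        if np.random.rand() < epsilon:
            return np.random.randint(NUM_ACTIONS)
        q_values = q_network(frames[np.newaxis, ...])   # add a batch dimension
        return int(tf.argmax(q_values[0]).numpy())

    frames = np.zeros((84, 84, 4), dtype=np.float32)    # placeholder preprocessed frames
    action = choose_action(frames)

The point of the sketch is the one Bellemare makes: the convolutional layers generate the features on the fly, instead of a researcher hand-designing them.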

In better coming to understand our own motivations and drives, we then, in turn, have a chance for complementary and reciprocal insights about how to build an artificial intelligence as flexible, resilient, and intellectually omnivorous as our own. Deepak Pathak looks at the success of deep learning and sees one glaring weakness: each system—be it for machine translation, or object recognition, or even game playing—is purpose-built. Training a huge neural network on a heap of manually labeled images was, as we have seen, the paradigm in which deep learning first truly showed its promise. Explicitly drilling a system to categorize images made a system that could categorize images. Fair enough, he says. “But the problem is, these artificial intelligence systems are not actually intelligent.


pages: 362 words: 97,288

Ghost Road: Beyond the Driverless Car by Anthony M. Townsend

A Pattern Language, active measures, AI winter, algorithmic trading, Alvin Toffler, Amazon Robotics, asset-backed security, augmented reality, autonomous vehicles, backpropagation, big-box store, bike sharing, Blitzscaling, Boston Dynamics, business process, Captain Sullenberger Hudson, car-free, carbon footprint, carbon tax, circular economy, company town, computer vision, conceptual framework, congestion charging, congestion pricing, connected car, creative destruction, crew resource management, crowdsourcing, DARPA: Urban Challenge, data is the new oil, Dean Kamen, deep learning, deepfake, deindustrialization, delayed gratification, deliberate practice, dematerialisation, deskilling, Didi Chuxing, drive until you qualify, driverless car, drop ship, Edward Glaeser, Elaine Herzberg, Elon Musk, en.wikipedia.org, extreme commuting, financial engineering, financial innovation, Flash crash, food desert, Ford Model T, fulfillment center, Future Shock, General Motors Futurama, gig economy, Google bus, Greyball, haute couture, helicopter parent, independent contractor, inventory management, invisible hand, Jane Jacobs, Jeff Bezos, Jevons paradox, jitney, job automation, John Markoff, John von Neumann, Joseph Schumpeter, Kickstarter, Kiva Systems, Lewis Mumford, loss aversion, Lyft, Masayoshi Son, megacity, microapartment, minimum viable product, mortgage debt, New Urbanism, Nick Bostrom, North Sea oil, Ocado, openstreetmap, pattern recognition, Peter Calthorpe, random walk, Ray Kurzweil, Ray Oldenburg, rent-seeking, ride hailing / ride sharing, Rodney Brooks, self-driving car, sharing economy, Shoshana Zuboff, Sidewalk Labs, Silicon Valley, Silicon Valley startup, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, SoftBank, software as a service, sovereign wealth fund, Stephen Hawking, Steve Jobs, surveillance capitalism, technological singularity, TED Talk, Tesla Model S, The Coming Technological Singularity, The Death and Life of Great American Cities, The future is already here, The Future of Employment, The Great Good Place, too big to fail, traffic fines, transit-oriented development, Travis Kalanick, Uber and Lyft, uber lyft, urban planning, urban sprawl, US Airways Flight 1549, Vernor Vinge, vertical integration, Vision Fund, warehouse automation, warehouse robotics

“Contemporary neural networks do well on challenges that remain close to their core training data,” writes New York University computer scientist Gary Marcus in an exhaustive 2018 critique of deep learning, “but start to break down on cases further out in the periphery.” Nowhere are the limits of deep learning becoming clearer than in the development of self-driving vehicles. The most catastrophic failures involving automated driving so far have all occurred around so-called edge cases, those unexpected events where data to train deep-learning models was insufficient or simply nonexistent—a pedestrian walking a bicycle across a darkened street midblock (Tempe, Arizona, March 18, 2018); a white truck trailer occluded against a brightly lit sky (Williston, Florida, May 7, 2016); an unusual set of road-surface markings at a highway off-ramp (Mountain View, California, March 23, 2018).

Chips like Pegasus are shifting the center of effort inside your car, from powertrain to CPU, and changing the kinds of “fuel” needed. The motor in your old car harnessed the power of internal combustion. It sucked in gasoline and transformed it into mechanical power to move you down the road. This new motor in your AV is powered by deep learning. It ingests gigabytes of data and spits out a stream of insights to guide you on your way. Deep learning sounds more mysterious than it is. The artificial neural networks that make it work were first invented more than 70 years ago. These algorithms, loosely based on mammalian brains, were the basis of a promising early branch of AI research. But after several high-profile failures, the mainstream research community largely abandoned the approach.

Even more remarkable was their seemingly intuitive power (learning). You didn’t have to program a deep learning model with descriptions of exactly what to look for to, say, identify photographs of cats. All you had to do was wind the mechanism up with a million pictures of cats and it could deduce the fundamental indicators of cat-ness all by itself. This process, called “training,” works by slowly calibrating the nodes within and between the stack’s various layers, strengthening the connections that contribute to accurate results and pruning those that don’t. Deep learning does have at least one enormous drawback, however. It is a ravenous consumer of computer power.
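In code, the training process described above amounts to nothing more than repeatedly nudging numeric connection weights toward better answers. A bare-bones NumPy sketch, using a tiny two-layer network and made-up data, shows the loop:

    # A bare-bones picture of training: run the data through, measure the error,
    # and nudge every connection weight in the direction that reduces it.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # toy inputs
    y = np.array([[0], [1], [1], [0]], dtype=float)               # toy targets (XOR)

    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input -> hidden connections
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output connections

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    learning_rate = 1.0
    for step in range(10000):
        # Forward pass: the network's current guesses.
        hidden = sigmoid(X @ W1 + b1)
        output = sigmoid(hidden @ W2 + b2)

        # Backward pass: how much each connection contributed to the error.
        grad_out = (output - y) * output * (1 - output)
        grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)

        # "Calibration": strengthen helpful connections, weaken unhelpful ones.
        W2 -= learning_rate * (hidden.T @ grad_out)
        b2 -= learning_rate * grad_out.sum(axis=0)
        W1 -= learning_rate * (X.T @ grad_hidden)
        b1 -= learning_rate * grad_hidden.sum(axis=0)

    print(output.round(2))   # usually ends up close to [[0], [1], [1], [0]]

Scale this loop up to millions of images and millions of weights, and the appetite for computer power the author describes becomes obvious.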


pages: 392 words: 108,745

Talk to Me: How Voice Computing Will Transform the Way We Live, Work, and Think by James Vlahos

Albert Einstein, AltaVista, Amazon Mechanical Turk, Amazon Web Services, augmented reality, Automated Insights, autonomous vehicles, backpropagation, Big Tech, Cambridge Analytica, Chuck Templeton: OpenTable:, cloud computing, Colossal Cave Adventure, computer age, deep learning, DeepMind, Donald Trump, Elon Musk, fake news, Geoffrey Hinton, information retrieval, Internet of things, Jacques de Vaucanson, Jeff Bezos, lateral thinking, Loebner Prize, machine readable, machine translation, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, Mark Zuckerberg, Menlo Park, natural language processing, Neal Stephenson, Neil Armstrong, OpenAI, PageRank, pattern recognition, Ponzi scheme, randomized controlled trial, Ray Kurzweil, Ronald Reagan, Rubik’s Cube, self-driving car, sentiment analysis, Silicon Valley, Skype, Snapchat, speech recognition, statistical model, Steve Jobs, Steve Wozniak, Steven Levy, TechCrunch disrupt, Turing test, Watson beat the top human players on Jeopardy!

Facebook snapped up LeCun to lead its AI efforts. Bengio remained an independent academic and founded the world’s largest academic research institute for deep learning—the Montreal Institute for Learning Algorithms (MILA). LeCun was gratified to see that the experts who had once shunned his methods had come around. “They said, ‘Okay, now we buy it,’” LeCun later told a reporter. “‘That’s it, now—you won.’” Image recognition was the first problem to succumb to deep learning’s powers. But with the efficacy of the technique no longer in doubt, many of its practitioners turned to an even more enticing task than identifying pictures: understanding words.

They might not correspond to meanings on human-understandable terms; they were the dimensions that proved most useful to the neural network when it sifted through the data. The beauty of deep learning is that—whether with images, the sounds of speech, or the meanings of words—humans don’t have to pick out the key identifying features. That task inevitably eludes our grasp, says Steve Young, a senior member of the technical staff for Siri and a professor of information engineering at Cambridge University. “Deep learning,” he says, “just avoids the problem by essentially throwing the entire signal into the classifier and letting the classifier work out what features are significant.”

Today Siri pulls from an archive of more than a million sound samples, many of which are as small as half a phoneme. The system uses deep learning to select the optimal sound units—anywhere from a dozen to a hundred or more per sentence—so the puzzle pieces fit cleanly. Because it is trained on examples of real people speaking, the neural network also expresses prosody. Alex Acero, who leads the Siri speech team, says that his ultimate aspiration is to make the assistant sound as natural as the Scarlett Johansson–voiced one in the movie Her. The emergence of powerful deep learning techniques for synthesizing voices means, among other things, that these voices are proliferating.


pages: 444 words: 117,770

The Coming Wave: Technology, Power, and the Twenty-First Century's Greatest Dilemma by Mustafa Suleyman

"World Economic Forum" Davos, 23andMe, 3D printing, active measures, Ada Lovelace, additive manufacturing, agricultural Revolution, AI winter, air gap, Airbnb, Alan Greenspan, algorithmic bias, Alignment Problem, AlphaGo, Alvin Toffler, Amazon Web Services, Anthropocene, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, ASML, autonomous vehicles, backpropagation, barriers to entry, basic income, benefit corporation, Big Tech, biodiversity loss, bioinformatics, Bletchley Park, Blitzscaling, Boston Dynamics, business process, business process outsourcing, call centre, Capital in the Twenty-First Century by Thomas Piketty, ChatGPT, choice architecture, circular economy, classic study, clean tech, cloud computing, commoditize, computer vision, coronavirus, corporate governance, correlation does not imply causation, COVID-19, creative destruction, CRISPR, critical race theory, crowdsourcing, cryptocurrency, cuban missile crisis, data science, decarbonisation, deep learning, deepfake, DeepMind, deindustrialization, dematerialisation, Demis Hassabis, disinformation, drone strike, drop ship, dual-use technology, Easter island, Edward Snowden, effective altruism, energy transition, epigenetics, Erik Brynjolfsson, Ernest Rutherford, Extinction Rebellion, facts on the ground, failed state, Fairchild Semiconductor, fear of failure, flying shuttle, Ford Model T, future of work, general purpose technology, Geoffrey Hinton, global pandemic, GPT-3, GPT-4, hallucination problem, hive mind, hype cycle, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, Internet of things, invention of the wheel, job automation, John Maynard Keynes: technological unemployment, John von Neumann, Joi Ito, Joseph Schumpeter, Kickstarter, lab leak, large language model, Law of Accelerating Returns, Lewis Mumford, license plate recognition, lockdown, machine readable, Marc Andreessen, meta-analysis, microcredit, move 37, Mustafa Suleyman, mutually assured destruction, new economy, Nick Bostrom, Nikolai Kondratiev, off grid, OpenAI, paperclip maximiser, personalized medicine, Peter Thiel, planetary scale, plutocrats, precautionary principle, profit motive, prompt engineering, QAnon, quantum entanglement, ransomware, Ray Kurzweil, Recombinant DNA, Richard Feynman, Robert Gordon, Ronald Reagan, Sam Altman, Sand Hill Road, satellite internet, Silicon Valley, smart cities, South China Sea, space junk, SpaceX Starlink, stealth mode startup, stem cell, Stephen Fry, Steven Levy, strong AI, synthetic biology, tacit knowledge, tail risk, techlash, techno-determinism, technoutopianism, Ted Kaczynski, the long tail, The Rise and Fall of American Growth, Thomas Malthus, TikTok, TSMC, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, uranium enrichment, warehouse robotics, William MacAskill, working-age population, world market for maybe five computers, zero day

The breakthrough moment took nearly half a century, finally arriving in 2012 in the form of a system called AlexNet. AlexNet was powered by the resurgence of an old technique that has now become fundamental to AI, one that has supercharged the field and was integral to us at DeepMind: deep learning. Deep learning uses neural networks loosely modeled on those of the human brain. In simple terms, these systems “learn” when their networks are “trained” on large amounts of data. In the case of AlexNet, the training data consisted of images. Each red, green, or blue pixel is given a value, and the resulting array of numbers is fed into the network as an input.
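That image-to-numbers step is easy to picture in code; the following minimal sketch (with an arbitrary image size, not AlexNet's actual pipeline) shows what "feeding the array into the network" means:

    # The image-to-numbers step: each pixel contributes a red, green, and blue
    # value, and the whole grid becomes one big array of inputs for the network.
    import numpy as np

    height, width = 224, 224                                   # arbitrary image size
    image = np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)

    inputs = image.astype(np.float32) / 255.0                  # scale 0-255 values to 0.0-1.0
    print(inputs.shape)   # (224, 224, 3): three numbers per pixel
    print(inputs.size)    # 150,528 input values in total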

In 1987 just ninety academic papers were published at Neural Information Processing Systems, the conference that became the field’s leading venue. By the 2020s there were almost two thousand. In the last six years there has been a six-fold increase in the number of papers published on deep learning alone, tenfold if you widen the view to machine learning as a whole. With the blossoming of deep learning, billions of dollars poured into AI research at academic institutions and at private and public companies. Starting in the 2010s, the buzz, indeed the hype, around AI was back, stronger than ever, making headlines and pushing the frontiers of what’s possible.

In 2012, AlexNet beat: Jerry Wei, “AlexNet: The Architecture That Challenged CNNs,” Towards Data Science, July 2, 2019, towardsdatascience.com/alexnet-the-architecture-that-challenged-cnns-e406d5297951.
Thanks to deep learning: Chanan Bos, “Tesla’s New HW3 Self-Driving Computer—It’s a Beast,” CleanTechnica, June 15, 2019, cleantechnica.com/2019/06/15/teslas-new-hw3-self-driving-computer-its-a-beast-cleantechnica-deep-dive.
It helps fly drones: Jeffrey De Fauw et al., “Clinically Applicable Deep Learning for Diagnosis and Referral in Retinal Disease,” Nature Medicine, Aug. 13, 2018, www.nature.com/articles/s41591-018-0107-6.
By the 2020s there were almost two thousand: “Advances in Neural Information Processing Systems,” NeurIPS, papers.nips.cc.


pages: 533 words: 125,495

Rationality: What It Is, Why It Seems Scarce, Why It Matters by Steven Pinker

affirmative action, Albert Einstein, autonomous vehicles, availability heuristic, Ayatollah Khomeini, backpropagation, basic income, behavioural economics, belling the cat, Black Lives Matter, butterfly effect, carbon tax, Cass Sunstein, choice architecture, classic study, clean water, Comet Ping Pong, coronavirus, correlation coefficient, correlation does not imply causation, COVID-19, critical race theory, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, David Attenborough, deep learning, defund the police, delayed gratification, disinformation, Donald Trump, Dr. Strangelove, Easter island, effective altruism, en.wikipedia.org, Erdős number, Estimating the Reproducibility of Psychological Science, fake news, feminist movement, framing effect, George Akerlof, George Floyd, germ theory of disease, high batting average, if you see hoof prints, think horses—not zebras, index card, Jeff Bezos, job automation, John Nash: game theory, John von Neumann, libertarian paternalism, Linda problem, longitudinal study, loss aversion, Mahatma Gandhi, meta-analysis, microaggression, Monty Hall problem, Nash equilibrium, New Journalism, Paul Erdős, Paul Samuelson, Peter Singer: altruism, Pierre-Simon Laplace, placebo effect, post-truth, power law, QAnon, QWERTY keyboard, Ralph Waldo Emerson, randomized controlled trial, replication crisis, Richard Thaler, scientific worldview, selection bias, social discount rate, social distancing, Social Justice Warrior, Stanford marshmallow experiment, Steve Bannon, Steven Pinker, sunk-cost fallacy, TED Talk, the scientific method, Thomas Bayes, Tragedy of the Commons, trolley problem, twin studies, universal basic income, Upton Sinclair, urban planning, Walter Mischel, yellow journalism, zero-sum game

Computer scientists could put multilayer networks on megavitamins, giving them two, fifteen, even a thousand hidden layers, and training them on billions or even trillions of examples. The networks are called deep learning systems because of the number of layers between the input and the output (they’re not deep in the sense of understanding anything). These networks are powering “the great AI awakening” we are living through, which is giving us the first serviceable products for speech and image recognition, question-answering, translation, and other humanlike feats.33 Deep learning networks often outperform GOFAI (good old-fashioned artificial intelligence), which executes logic-like deductions on hand-coded propositions and rules.34 The contrast in the way they work is stark: unlike logical inference, the inner workings of a neural network are inscrutable.

That is why many technology critics fear that as AI systems are entrusted with decisions about the fates of people, they could perpetuate biases that no one can identify and uproot.35 In 2018 Henry Kissinger warned that since deep learning systems don’t work on propositions we can examine and justify, they portend the end of the Enlightenment.36 That is a stretch, but the contrast between logic and neural computation is clear. Is the human brain a big deep learning network? Certainly not, for many reasons, but the similarities are illuminating. The brain has around a hundred billion neurons connected by a hundred trillion synapses, and by the time we are eighteen we have been absorbing examples from our environments for more than three hundred million waking seconds.

New York: Basic Books. Marcus, G. F. 2000. Two kinds of representation. In E. Dietrich & A. B. Markman, eds., Cognitive dynamics: Conceptual and representational change in humans and machines. Mahwah, NJ: Erlbaum. Marcus, G. F. 2018. The deepest problem with deep learning. Medium, Dec. 1. https://medium.com/@GaryMarcus/the-deepest-problem-with-deep-learning-91c5991f5695. Marcus, G. F., & Davis, E. 2019. Rebooting AI: Building artificial intelligence we can trust. New York: Pantheon. Marlowe, F. 2010. The Hadza: Hunter-gatherers of Tanzania. Berkeley: University of California Press. Martin, G.


Machine Learning Design Patterns: Solutions to Common Challenges in Data Preparation, Model Building, and MLOps by Valliappa Lakshmanan, Sara Robinson, Michael Munn

A Pattern Language, Airbnb, algorithmic trading, automated trading system, business intelligence, business logic, business process, combinatorial explosion, computer vision, continuous integration, COVID-19, data science, deep learning, DevOps, discrete time, en.wikipedia.org, Hacker News, industrial research laboratory, iterative process, Kubernetes, machine translation, microservices, mobile money, natural language processing, Netflix Prize, optical character recognition, pattern recognition, performance metric, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, sentiment analysis, speech recognition, statistical model, the payments system, web application

Neural networks with more than one hidden layer (layers other than the input and output layer) are classified as deep learning (see Figure 1-1). Machine learning models, regardless of how they are depicted visually, are mathematical functions and can therefore be implemented from scratch using a numerical software package. However, ML engineers in industry tend to employ one of several open source frameworks designed to provide intuitive APIs for building models. The majority of our examples will use TensorFlow, an open source machine learning framework created by Google with a focus on deep learning models. Within the TensorFlow library, we’ll be using the Keras API in our examples, which can be imported through tensorflow.keras.
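As a concrete example of that convention, a model with two hidden layers, which counts as deep learning by the definition above, could be defined as follows; the input width and layer sizes here are placeholders, not values taken from the book.

    # A network with two hidden layers -- "deep" by the definition above --
    # written with the Keras API that ships inside TensorFlow.
    from tensorflow import keras

    NUM_FEATURES = 10   # placeholder input width

    model = keras.Sequential([
        keras.layers.Input(shape=(NUM_FEATURES,)),
        keras.layers.Dense(32, activation="relu"),   # hidden layer 1
        keras.layers.Dense(16, activation="relu"),   # hidden layer 2
        keras.layers.Dense(1),                       # output layer
    ])
    model.compile(optimizer="adam", loss="mse")
    model.summary()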

One bit of intuition as to why this works comes from the Universal Approximation Theorem of deep learning, which, loosely put, states that any function (and its derivatives) can be approximated by a neural network with at least one hidden layer and any “squashing” activation function, like sigmoid. This means that no matter what function we are given, so long as it’s relatively well behaved, there exists a neural network with just one hidden layer that approximates that function as closely as we want.1 Deep learning approaches to solving differential equations or complex dynamical systems aim to represent a function defined implicitly by a differential equation, or system of equations, using a neural network.

While increasing the resolution requires substantially more compute power using finite-difference methods, the neural network is able to maintain high performance with only marginal additional cost. Techniques like the Deep Galerkin Method can then use deep learning to provide a mesh-free approximation of the solution to the given PDE. In this way, solving the PDE is reduced to a chained optimization problem (see “Design Pattern 8: Cascade”).

Deep Galerkin Method

The Deep Galerkin Method is a deep learning algorithm for solving partial differential equations. The algorithm is similar in spirit to Galerkin methods used in the field of numeric analysis, where the solution is approximated using a neural network instead of a linear combination of basis functions.
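As a toy illustration of the approximation idea (not of the Deep Galerkin Method itself), a network with a single "squashing" hidden layer can be fit to samples of sin(x); everything below is a sketch with arbitrary sizes.

    # Toy demonstration of the approximation idea: one hidden layer with a
    # sigmoid ("squashing") activation, fit to samples of sin(x).
    import numpy as np
    import tensorflow as tf

    x = np.linspace(-np.pi, np.pi, 500).reshape(-1, 1).astype(np.float32)
    y = np.sin(x)

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(1,)),
        tf.keras.layers.Dense(64, activation="sigmoid"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")
    model.fit(x, y, epochs=500, verbose=0)

    print(model.predict(np.array([[1.0]], dtype=np.float32)))   # should land near sin(1) ≈ 0.84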


pages: 292 words: 85,151

Exponential Organizations: Why New Organizations Are Ten Times Better, Faster, and Cheaper Than Yours (And What to Do About It) by Salim Ismail, Yuri van Geest

23andMe, 3D printing, Airbnb, Amazon Mechanical Turk, Amazon Web Services, anti-fragile, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, behavioural economics, Ben Horowitz, bike sharing, bioinformatics, bitcoin, Black Swan, blockchain, Blue Ocean Strategy, book value, Burning Man, business intelligence, business process, call centre, chief data officer, Chris Wanstrath, circular economy, Clayton Christensen, clean water, cloud computing, cognitive bias, collaborative consumption, collaborative economy, commoditize, corporate social responsibility, cross-subsidies, crowdsourcing, cryptocurrency, dark matter, data science, Dean Kamen, deep learning, DeepMind, dematerialisation, discounted cash flows, disruptive innovation, distributed ledger, driverless car, Edward Snowden, Elon Musk, en.wikipedia.org, Ethereum, ethereum blockchain, fail fast, game design, gamification, Google Glasses, Google Hangouts, Google X / Alphabet X, gravity well, hiring and firing, holacracy, Hyperloop, industrial robot, Innovator's Dilemma, intangible asset, Internet of things, Iridium satellite, Isaac Newton, Jeff Bezos, Joi Ito, Kevin Kelly, Kickstarter, knowledge worker, Kodak vs Instagram, Law of Accelerating Returns, Lean Startup, life extension, lifelogging, loose coupling, loss aversion, low earth orbit, Lyft, Marc Andreessen, Mark Zuckerberg, market design, Max Levchin, means of production, Michael Milken, minimum viable product, natural language processing, Netflix Prize, NetJets, Network effects, new economy, Oculus Rift, offshore financial centre, PageRank, pattern recognition, Paul Graham, paypal mafia, peer-to-peer, peer-to-peer model, Peter H. Diamandis: Planetary Resources, Peter Thiel, Planet Labs, prediction markets, profit motive, publish or perish, radical decentralization, Ray Kurzweil, recommendation engine, RFID, ride hailing / ride sharing, risk tolerance, Ronald Coase, Rutger Bregman, Salesforce, Second Machine Age, self-driving car, sharing economy, Silicon Valley, skunkworks, Skype, smart contracts, Snapchat, social software, software is eating the world, SpaceShipOne, speech recognition, stealth mode startup, Stephen Hawking, Steve Jobs, Steve Jurvetson, subscription business, supply-chain management, synthetic biology, TaskRabbit, TED Talk, telepresence, telepresence robot, the long tail, Tony Hsieh, transaction costs, Travis Kalanick, Tyler Cowen, Tyler Cowen: Great Stagnation, uber lyft, urban planning, Virgin Galactic, WikiLeaks, winner-take-all economy, X Prize, Y Combinator, zero-sum game

After allowing it to browse ten million randomly selected YouTube video thumbnails for three days, the network began to recognize cats, without actually knowing the concept of “cats.” Importantly, this was without any human intervention or input. In the two years since, the capabilities of Deep Learning have improved considerably. Today, in addition to improving speech recognition, creating a more effective search engine (Ray Kurzweil is working on this within Google) and identifying individual objects, Deep Learning algorithms can also detect particular episodes in videos and even describe them in text, all without human input. Deep Learning algorithms can even play video games by figuring out the rules of the game and then optimizing performance. Think about the implications of this revolutionary breakthrough.

The contest ended early, in September 2009, when one of the 44,014 valid submissions achieved the goal and was awarded the prize. Deep Learning is a new and exciting subset of Machine Learning based on neural net technology. It allows a machine to discover patterns in raw data on its own, without being explicitly programmed with rules or hand-engineered features. Leading startups in this space are DeepMind, bought by Google in early 2014 for $500 million, back when DeepMind had just thirteen employees, and Vicarious, funded with investment from Elon Musk, Jeff Bezos and Mark Zuckerberg. Twitter, Baidu, Microsoft and Facebook are also heavily invested in this area. Deep Learning algorithms rely on discovery and self-indexing, and operate in much the same way that a baby learns first sounds, then words, then sentences and even languages.

To implement platforms, ExOs follow four steps in terms of data and APIs:
Gather: The algorithmic process starts with harnessing data, which is gathered via sensors, people, or imported from public datasets.
Organize: The next step is to organize the data. This is known as ETL (extract, transform and load).
Apply: Once the data is accessible, algorithms such as machine learning or deep learning extract insights, identify trends and tune new algorithms. These are realized via tools such as Hadoop and Pivotal, or even open-source deep learning tools of the kind built by companies like DeepMind and Skymind.
Expose: The final step is exposing the data in the form of an open platform. Open data and APIs can be used such that an ExO’s community develops valuable services, new functionalities and innovations layered on top of the platform by remixing published data with their own.
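As an illustrative sketch of those four steps in miniature, with every file name, column, and model choice invented for the example, the flow might look like this in Python:

    # Illustrative walk-through of the four steps; the data source, columns,
    # and model are all hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # 1. Gather: pull raw data from sensors, people, or a public dataset.
    raw = pd.read_csv("sensor_readings.csv")

    # 2. Organize (ETL): extract the fields we need, transform them, load them.
    data = raw.dropna()
    data["temp_c"] = (data["temp_f"] - 32) * 5 / 9

    # 3. Apply: let an algorithm extract insight -- here, predicting failures.
    model = LogisticRegression()
    model.fit(data[["temp_c", "vibration"]], data["failed"])

    # 4. Expose: publish the insight behind a simple function an API could wrap.
    def predict_failure(temp_c, vibration):
        return model.predict_proba([[temp_c, vibration]])[0, 1]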


System Error by Rob Reich

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, 2021 United States Capitol attack, A Declaration of the Independence of Cyberspace, Aaron Swartz, AI winter, Airbnb, airport security, Alan Greenspan, Albert Einstein, algorithmic bias, AlphaGo, AltaVista, artificial general intelligence, Automated Insights, autonomous vehicles, basic income, Ben Horowitz, Berlin Wall, Bernie Madoff, Big Tech, bitcoin, Blitzscaling, Cambridge Analytica, Cass Sunstein, clean water, cloud computing, computer vision, contact tracing, contact tracing app, coronavirus, corporate governance, COVID-19, creative destruction, CRISPR, crowdsourcing, data is the new oil, data science, decentralized internet, deep learning, deepfake, DeepMind, deplatforming, digital rights, disinformation, disruptive innovation, Donald Knuth, Donald Trump, driverless car, dual-use technology, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Fairchild Semiconductor, fake news, Fall of the Berlin Wall, Filter Bubble, financial engineering, financial innovation, fulfillment center, future of work, gentrification, Geoffrey Hinton, George Floyd, gig economy, Goodhart's law, GPT-3, Hacker News, hockey-stick growth, income inequality, independent contractor, informal economy, information security, Jaron Lanier, Jeff Bezos, Jim Simons, jimmy wales, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, Lean Startup, linear programming, Lyft, Marc Andreessen, Mark Zuckerberg, meta-analysis, minimum wage unemployment, Monkeys Reject Unequal Pay, move fast and break things, Myron Scholes, Network effects, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, NP-complete, Oculus Rift, OpenAI, Panopticon Jeremy Bentham, Parler "social media", pattern recognition, personalized medicine, Peter Thiel, Philippa Foot, premature optimization, profit motive, quantitative hedge fund, race to the bottom, randomized controlled trial, recommendation engine, Renaissance Technologies, Richard Thaler, ride hailing / ride sharing, Ronald Reagan, Sam Altman, Sand Hill Road, scientific management, self-driving car, shareholder value, Sheryl Sandberg, Shoshana Zuboff, side project, Silicon Valley, Snapchat, social distancing, Social Responsibility of Business Is to Increase Its Profits, software is eating the world, spectrum auction, speech recognition, stem cell, Steve Jobs, Steven Levy, strong AI, superintelligent machines, surveillance capitalism, Susan Wojcicki, tech billionaire, tech worker, techlash, technoutopianism, Telecommunications Act of 1996, telemarketer, The Future of Employment, TikTok, Tim Cook: Apple, traveling salesman, Triangle Shirtwaist Factory, trolley problem, Turing test, two-sided market, Uber and Lyft, uber lyft, ultimatum game, union organizing, universal basic income, washing machines reduced drudgery, Watson beat the top human players on Jeopardy!, When a measure becomes a target, winner-take-all economy, Y Combinator, you are the product

The extent to which AI technology will threaten to displace highly skilled workers such as doctors ultimately remains to be seen. A 2019 study in the United Kingdom summarized the state of affairs, stating “Our review found the diagnostic performance of deep learning models to be equivalent to that of health-care professionals,” but then went on to conclude that “poor reporting is prevalent in deep learning studies, which limits reliable interpretation of the reported diagnostic accuracy.” In other words, the deep learning models may be able to match human performance at the narrow task of making a diagnosis from an X-ray, but the fact that such studies don’t then use the results of the algorithm in an actual medical setting means it’s not possible to determine whether the model’s prediction would have actually led to a better outcome for the patient.

., “Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning,” CheXNet, December 25, 2017, http://arxiv.org/abs/1711.05225. “people should stop training radiologists”: Geoff Hinton, “Geoff Hinton: On Radiology,” Creative Destruction Lab, uploaded to YouTube November 24, 2016, https://www.youtube.com/watch?v=2HMPRXstSvQ. the work radiologists and other medical professionals do: Hugh Harvey, “Why AI Will Not Replace Radiologists,” Medium, April 7, 2018, https://towardsdatascience.com/why-ai-will-not-replace-radiologists-c7736f2c7d80. “deep learning models to be equivalent”: Xiaoxuan Liu et al., “A Comparison of Deep Learning Performance Against Health-Care Professionals in Detecting Diseases from Medical Imaging: A Systematic Review and Meta-Analysis,” Lancet Digital Health 1, no. 6 (October 1, 2019): e271–97, https://doi.org/10.1016/S2589-7500(19)30123-2.

AI systems can identify patterns in huge pools of data that humans can’t discern and can therefore frequently make more accurate predictions. But such systems are often black boxes, unable to explain why they generate particular outputs. And the scientists who build the systems can’t always explain the outputs either, making the decisions inscrutable. Deep learning refers not to the capacity to generate insights but to a spatial metaphor for the architecture of the AI system. The idea behind deep learning is that inputs to a system form sets of simple patterns that are then combined into increasingly more complex patterns using patterns derived from the previous layer. The moniker deep comes from the fact that such systems now contain many more layers than they did just a decade ago—a result of greater computational power to model the patterns in each layer and vast increases in data that enable these more complex patterns to be discovered.


pages: 253 words: 84,238

A Thousand Brains: A New Theory of Intelligence by Jeff Hawkins

AI winter, Albert Einstein, artificial general intelligence, carbon-based life, clean water, cloud computing, deep learning, different worldview, discovery of DNA, Doomsday Clock, double helix, en.wikipedia.org, estate planning, Geoffrey Hinton, Jeff Hawkins, PalmPilot, Search for Extraterrestrial Intelligence, self-driving car, sensor fusion, Silicon Valley, superintelligent machines, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Turing machine, Turing test

The current wave of AI has attracted thousands of researchers and billions of dollars of investment. Almost all these people and dollars are being applied to improving deep learning technologies. Will this investment lead to human-level machine intelligence, or are deep learning technologies fundamentally limited, leading us once again to reinvent the field of AI? When you are in the middle of a bubble, it is easy to get swept up in the enthusiasm and believe it will go on forever. History suggests we should be cautious. I don’t know how long the current wave of AI will continue to grow. But I do know that deep learning does not put us on the path to creating truly intelligent machines. We can’t get to artificial general intelligence by doing more of what we are currently doing.

They claimed that we could not make truly intelligent machines until we solved how to represent everyday knowledge in a computer. Today’s deep learning networks don’t possess knowledge. A Go-playing computer does not know that Go is a game. It doesn’t know the history of the game. It doesn’t know if it is playing against a computer or a human, or what “computer” and “human” mean. Similarly, a deep learning network that labels images may look at an image and say it is a cat. But the computer has limited knowledge of cats. It doesn’t know that cats are animals, or that they have tails, legs, and lungs. It doesn’t know about cat people versus dog people, or that cats purr and shed fur. All the deep learning network does is determine that a new image is similar to previously seen images that were labeled “cat.”

AI scientists disagree as to whether these language networks possess true knowledge or are just mimicking humans by remembering the statistics of millions of words. I don’t believe any kind of deep learning network will achieve the goal of AGI if the network doesn’t model the world the way a brain does. Deep learning networks work well, but not because they solved the knowledge representation problem. They work well because they avoided it completely, relying on statistics and lots of data instead. How deep learning networks work is clever, their performance is impressive, and they are commercially valuable. I am only pointing out that they don’t possess knowledge and, therefore, are not on the path to having the ability of a five-year-old child.


pages: 1,331 words: 163,200

Hands-On Machine Learning With Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurélien Géron

AlphaGo, Amazon Mechanical Turk, Anton Chekhov, backpropagation, combinatorial explosion, computer vision, constrained optimization, correlation coefficient, crowdsourcing, data science, deep learning, DeepMind, don't repeat yourself, duck typing, Elon Musk, en.wikipedia.org, friendly AI, Geoffrey Hinton, ImageNet competition, information retrieval, iterative process, John von Neumann, Kickstarter, machine translation, natural language processing, Netflix Prize, NP-complete, OpenAI, optical character recognition, P = NP, p-value, pattern recognition, pull request, recommendation engine, self-driving car, sentiment analysis, SpamAssassin, speech recognition, stochastic process

Preface: The Machine Learning Tsunami

In 2006, Geoffrey Hinton et al. published a paper1 showing how to train a deep neural network capable of recognizing handwritten digits with state-of-the-art precision (>98%). They branded this technique “Deep Learning.” Training a deep neural net was widely considered impossible at the time,2 and most researchers had abandoned the idea since the 1990s. This paper revived the interest of the scientific community and before long many new papers demonstrated that Deep Learning was not only possible, but capable of mind-blowing achievements that no other Machine Learning (ML) technique could hope to match (with the help of tremendous computing power and great amounts of data).

Part II, Neural Networks and Deep Learning, covers the following topics: What are neural nets? What are they good for? Building and training neural nets using TensorFlow. The most important neural net architectures: feedforward neural nets, convolutional nets, recurrent nets, long short-term memory (LSTM) nets, and autoencoders. Techniques for training deep neural nets. Scaling neural networks for huge datasets. Reinforcement learning. The first part is based mostly on Scikit-Learn while the second part uses TensorFlow. Caution: Don’t jump into deep waters too hastily. While Deep Learning is no doubt one of the most exciting areas in Machine Learning, you should master the fundamentals first.

Moreover, most problems can be solved quite well using simpler techniques such as Random Forests and Ensemble methods (discussed in Part I). Deep Learning is best suited for complex problems such as image recognition, speech recognition, or natural language processing, provided you have enough data, computing power, and patience. Other Resources Many resources are available to learn about Machine Learning. Andrew Ng’s ML course on Coursera and Geoffrey Hinton’s course on neural networks and Deep Learning are amazing, although they both require a significant time investment (think months). There are also many interesting websites about Machine Learning, including of course Scikit-Learn’s exceptional User Guide.


pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence by John Brockman

Adam Curtis, agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, Anthropocene, artificial general intelligence, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, basic income, behavioural economics, bitcoin, blockchain, bread and circuses, Charles Babbage, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, data science, deep learning, DeepMind, Demis Hassabis, digital capitalism, digital divide, digital rights, discrete time, Douglas Engelbart, driverless car, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, financial engineering, Flash crash, friendly AI, functional fixedness, global pandemic, Google Glasses, Great Leap Forward, Hans Moravec, hive mind, Ian Bogost, income inequality, information trail, Internet of things, invention of writing, iterative process, James Webb Space Telescope, Jaron Lanier, job automation, Johannes Kepler, John Markoff, John von Neumann, Kevin Kelly, knowledge worker, Large Hadron Collider, lolcat, loose coupling, machine translation, microbiome, mirror neurons, Moneyball by Michael Lewis explains big data, Mustafa Suleyman, natural language processing, Network effects, Nick Bostrom, Norbert Wiener, paperclip maximiser, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, Recombinant DNA, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Satyajit Das, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, social intelligence, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, synthetic biology, systems thinking, tacit knowledge, TED Talk, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, We are as Gods, Y2K

Some of those patterns are complex, but most are fairly simple. Great effort goes into parsing our speech and deciphering our handwriting. The current fad in thinking machines goes by the name of deep learning. When I first heard of deep learning, I was excited by the idea that machines were finally going to reveal to us deep aspects of existence—truth, beauty, and love. I was rapidly disabused. The deep in deep learning refers to the architecture of the machines doing the learning: They consist of many layers of interlocking logical elements, analogous to the “deep” layers of interlocking neurons in the brain.

Today’s algorithm has nothing like human-level competence in understanding images. Work is under way to add focus of attention and handling of consistent spatial structure to deep learning. That’s the hard work of science and research, and we have no idea how hard it will be, nor how long it will take, nor whether the whole approach will reach a dead end. It took some thirty years to go from backpropagation to deep learning, but along the way many researchers were sure there was no future in backpropagation. They were wrong, but it wouldn’t have been surprising if they were right, as we knew all along that the backpropagation algorithm is not what happens inside people’s heads.

After thirty years of research, a million-times improvement in computer power, and vast data sets from the Internet, we now know the answer to this question: Neural networks scaled up to twelve layers deep, with billions of connections, are outperforming the best algorithms in computer vision for object recognition and have revolutionized speech recognition. It’s rare for any algorithm to scale this well, which suggests that they may soon be able to solve even more difficult problems. Recent breakthroughs have been made that allow the application of deep learning to natural-language processing. Deep recurrent networks with short-term memory were trained to translate English sentences into French sentences at high levels of performance. Other deep-learning networks could create English captions for the content of images with surprising and sometimes amusing acumen. Supervised learning using deep networks is a step forward, but still far from achieving general intelligence.


pages: 241 words: 70,307

Leadership by Algorithm: Who Leads and Who Follows in the AI Era? by David de Cremer

"Friedman doctrine" OR "shareholder theory", algorithmic bias, algorithmic management, AlphaGo, bitcoin, blockchain, business climate, business process, Computing Machinery and Intelligence, corporate governance, data is not the new oil, data science, deep learning, DeepMind, Donald Trump, Elon Musk, fake news, future of work, job automation, Kevin Kelly, Mark Zuckerberg, meta-analysis, Norbert Wiener, pattern recognition, Peter Thiel, race to the bottom, robotic process automation, Salesforce, scientific management, shareholder value, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, Stephen Hawking, The Future of Employment, Turing test, work culture , workplace surveillance , zero-sum game

AI witnessed a comeback in the last decade, primarily because the world woke up to the realization that deep learning has reached the level where machines can actually perform many tasks better than humans. Where did this wake-up call come from? From a simple game called Go. In 2016, AlphaGo, a program developed by Google DeepMind, beat the human world champion in the Chinese board game, Go. This was a surprise to many, as Go – because of its complexity – was considered the territory of human, not AI, victors. In a decade when the human desire to connect globally, execute tasks faster, and accumulate massive amounts of data was omnipresent, such deep learning capabilities were, of course, quickly embraced.

‘Most of AI’s Business Uses Will Be in Two Areas.’ Harvard Business Review. July 20. Retrieved from: https://hbr.org/2018/07/most-of-ais-business-uses-will-be-in-two-areas 8 McKinsey (2018). ‘Notes from the AI frontier: Applications and value of deep learning.’ Retrieved from: https://www.mckinsey.com/featured-insights/artificial-intelligence/notes-from-the-ai-frontier-applications-and-value-of-deep-learning 9 Bloomberg (2018, January 15th). ‘Alibaba's AI Outguns Humans in Reading Test.’ Retrieved from https://www.bloomberg.com/news/articles/2018-01-15/alibaba-s-ai-outgunned-humans-in-key-stanford-reading-test 10 Gee, K. (2017).

In that respect, an interesting study from the US National Bureau of Economic Research demonstrated that low-skill service-sector workers (where retention rates are low) stayed in the job 15% longer when an algorithm was used to judge their employability.¹⁴ Automation and innovation Automation and the corresponding use of algorithms with deep learning abilities are also penetrating other industries. The legal sector is another area where many discussions are taking place about how and whether to automate services. Legal counsellors have started to use automated advisors to contest relatively small fines such as parking tickets. The legal sector is also considering the use of AI to help judges go through evidence collected to reach a verdict in court cases.


The Book of Why: The New Science of Cause and Effect by Judea Pearl, Dana Mackenzie

affirmative action, Albert Einstein, AlphaGo, Asilomar, Bayesian statistics, computer age, computer vision, Computing Machinery and Intelligence, confounding variable, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, driverless car, Edmond Halley, Elon Musk, en.wikipedia.org, experimental subject, Great Leap Forward, Gregor Mendel, Isaac Newton, iterative process, John Snow's cholera map, Loebner Prize, loose coupling, Louis Pasteur, Menlo Park, Monty Hall problem, pattern recognition, Paul Erdős, personalized medicine, Pierre-Simon Laplace, placebo effect, Plato's cave, prisoner's dilemma, probability theory / Blaise Pascal / Pierre de Fermat, randomized controlled trial, Recombinant DNA, selection bias, self-driving car, seminal paper, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steve Jobs, strong AI, The Design of Experiments, the scientific method, Thomas Bayes, Turing test

Some readers may be surprised to see that I have placed present-day learning machines squarely on rung one of the Ladder of Causation, sharing the wisdom of an owl. We hear almost every day, it seems, about rapid advances in machine learning systems—self-driving cars, speech-recognition systems, and, especially in recent years, deep-learning algorithms (or deep neural networks). How could they still be only at level one? The successes of deep learning have been truly remarkable and have caught many of us by surprise. Nevertheless, deep learning has succeeded primarily by showing that certain questions or tasks we thought were difficult are in fact not. It has not addressed the truly difficult questions that continue to prevent us from achieving humanlike AI.

Another advantage causal models have that data mining and deep learning lack is adaptability. Note that in Figure I.1, the estimand is computed on the basis of the causal model alone, prior to an examination of the specifics of the data. This makes the causal inference engine supremely adaptable, because the estimand computed is good for any data that are compatible with the qualitative model, regardless of the numerical relationships among the variables. To see why this adaptability is important, compare this engine with a learning agent—in this instance a human, but in other cases perhaps a deep-learning algorithm or maybe a human using a deep-learning algorithm—trying to learn solely from the data.

A few months later it played sixty online games against top human players without losing a single one, and in 2017 it was officially retired after beating the current world champion, Ke Jie. The one game it lost to Sedol is the only one it will ever lose to a human. All of this is exciting, and the results leave no doubt: deep learning works for certain tasks. But it is the antithesis of transparency. Even AlphaGo’s programmers cannot tell you why the program plays so well. They knew from experience that deep networks have been successful at tasks in computer vision and speech recognition. Nevertheless, our understanding of deep learning is completely empirical and comes with no guarantees. The AlphaGo team could not have predicted at the outset that the program would beat the best human in a year, or two, or five.


pages: 472 words: 117,093

Machine, Platform, Crowd: Harnessing Our Digital Future by Andrew McAfee, Erik Brynjolfsson

"World Economic Forum" Davos, 3D printing, additive manufacturing, AI winter, Airbnb, airline deregulation, airport security, Albert Einstein, algorithmic bias, AlphaGo, Amazon Mechanical Turk, Amazon Web Services, Andy Rubin, AOL-Time Warner, artificial general intelligence, asset light, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, backtesting, barriers to entry, behavioural economics, bitcoin, blockchain, blood diamond, British Empire, business cycle, business process, carbon footprint, Cass Sunstein, centralized clearinghouse, Chris Urmson, cloud computing, cognitive bias, commoditize, complexity theory, computer age, creative destruction, CRISPR, crony capitalism, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, Dean Kamen, deep learning, DeepMind, Demis Hassabis, discovery of DNA, disintermediation, disruptive innovation, distributed ledger, double helix, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ethereum, ethereum blockchain, everywhere but in the productivity statistics, Evgeny Morozov, fake news, family office, fiat currency, financial innovation, general purpose technology, Geoffrey Hinton, George Akerlof, global supply chain, Great Leap Forward, Gregor Mendel, Hernando de Soto, hive mind, independent contractor, information asymmetry, Internet of things, inventory management, iterative process, Jean Tirole, Jeff Bezos, Jim Simons, jimmy wales, John Markoff, joint-stock company, Joseph Schumpeter, Kickstarter, Kiva Systems, law of one price, longitudinal study, low interest rates, Lyft, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, Marc Andreessen, Marc Benioff, Mark Zuckerberg, meta-analysis, Mitch Kapor, moral hazard, multi-sided market, Mustafa Suleyman, Myron Scholes, natural language processing, Network effects, new economy, Norbert Wiener, Oculus Rift, PageRank, pattern recognition, peer-to-peer lending, performance metric, plutocrats, precision agriculture, prediction markets, pre–internet, price stability, principal–agent problem, Project Xanadu, radical decentralization, Ray Kurzweil, Renaissance Technologies, Richard Stallman, ride hailing / ride sharing, risk tolerance, Robert Solow, Ronald Coase, Salesforce, Satoshi Nakamoto, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Skype, slashdot, smart contracts, Snapchat, speech recognition, statistical model, Steve Ballmer, Steve Jobs, Steven Pinker, supply-chain management, synthetic biology, tacit knowledge, TaskRabbit, Ted Nelson, TED Talk, the Cathedral and the Bazaar, The Market for Lemons, The Nature of the Firm, the strength of weak ties, Thomas Davenport, Thomas L Friedman, too big to fail, transaction costs, transportation-network company, traveling salesman, Travis Kalanick, Two Sigma, two-sided market, Tyler Cowen, Uber and Lyft, Uber for X, uber lyft, ubercab, Vitalik Buterin, warehouse robotics, Watson beat the top human players on Jeopardy!, winner-take-all economy, yield management, zero day

November 11, 2016, http://metricviews.org.uk/2007/11/how-big-hectare. 80 Makoto was impressed: Kaz Sato, “How a Japanese Cucumber Farmer Is Using Deep Learning and TensorFlow,” Google, August 31, 2016, https://cloud.google.com/blog/big-data/2016/08/how-a-japanese-cucumber-farmer-is-using-deep-learning-and-tensorflow. 80 “I can’t wait to try it”: Ibid. 80 “It’s not hyperbole”: Ibid. 80 “If intelligence was a cake”: Carlos E. Perez, “ ‘Predictive Learning’ Is the New Buzzword in Deep Learning,” Intuition Machine, December 6, 2016, https://medium.com/intuitionmachine/predictive-learning-is-the-key-to-deep-learning-acceleration-93e063195fd0#.13qh1nti1. 81 Joshua Brown’s Tesla crashed: Anjali Singhvi and Karl Russell, “Inside the Self-Driving Tesla Fatal Accident,” New York Times, July 12, 2016, https://www.nytimes.com/interactive/2016/07/01/business/inside-tesla-accident.html. 82 it appears that neither Brown: Tesla, “A Tragic Loss,” June 30, 2016, https://www.tesla.com/blog/tragic-loss. 82 “Conventional wisdom would say”: Chris Urmson, “How a Driverless Car Sees the Road,” TED Talk, June 2015, 15:29, https://www.ted.com/talks/chris_urmson_how_a_driverless_car_sees_the_road/transcript?

Because both supervised and unsupervised machine learning approaches use the algorithms described by Hinton and his colleagues in their 2006 paper, they’re now commonly called “deep learning” systems.

Demonstrations and Deployments

Except for a very small number of cases, such as the system LeCun built for recognizing handwritten numbers on checks, the business application of deep learning is only a few years old. But the technique is spreading with extraordinary speed. The software engineer Jeff Dean, who heads Google’s efforts to use the technology, notes that as recently as 2012 the company was not using it at all to improve products like Search, Gmail, YouTube, or Maps. By the third quarter of 2015, however, deep learning was being used in approximately 1,200 projects across the company, having surpassed the performance of other methods.

By 2013, these challenges had been broadly addressed (Erik Bernhardsson, “When Machine Learning Matters,” Erik Bernhardsson [blog], August 5, 2016, https://erikbern.com/2016/08/05/when-machine-learning-matters.html), and the company shifted focus toward using machine learning to deliver highly personalized music recommendations (Jordan Novet, “Spotify Intern Dreams Up Better Music Recommendations through Deep Learning,” VentureBeat, August 6, 2014, http://venturebeat.com/2014/08/06/spotify-intern-dreams-up-better-music-recommendations-through-deep-learning). Spotify launched its algorithm-powered Daily Mix option in September 2016 (Spotify, “Rediscover Your Favorite Music with Daily Mix,” September 27, 2016, https://news.spotify.com/us/2016/09/27/rediscover-your-favorite-music-with-daily-mix).


pages: 579 words: 76,657

Data Science from Scratch: First Principles with Python by Joel Grus

backpropagation, confounding variable, correlation does not imply causation, data science, deep learning, Hacker News, higher-order functions, natural language processing, Netflix Prize, p-value, Paul Graham, recommendation engine, SpamAssassin, statistical model

Our documents are our users’ interests, which look like:

documents = [
    ["Hadoop", "Big Data", "HBase", "Java", "Spark", "Storm", "Cassandra"],
    ["NoSQL", "MongoDB", "Cassandra", "HBase", "Postgres"],
    ["Python", "scikit-learn", "scipy", "numpy", "statsmodels", "pandas"],
    ["R", "Python", "statistics", "regression", "probability"],
    ["machine learning", "regression", "decision trees", "libsvm"],
    ["Python", "R", "Java", "C++", "Haskell", "programming languages"],
    ["statistics", "probability", "mathematics", "theory"],
    ["machine learning", "scikit-learn", "Mahout", "neural networks"],
    ["neural networks", "deep learning", "Big Data", "artificial intelligence"],
    ["Hadoop", "Java", "MapReduce", "Big Data"],
    ["statistics", "R", "statsmodels"],
    ["C++", "deep learning", "artificial intelligence", "probability"],
    ["pandas", "R", "Python"],
    ["databases", "HBase", "Postgres", "MySQL", "MongoDB"],
    ["libsvm", "regression", "support vector machines"]
]

And we’ll try to find K = 4 topics. In order to calculate the sampling weights, we’ll need to keep track of several counts.
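As an illustrative sketch of what “several counts” means here (the variable names are my own choices, not necessarily the book’s), a collapsed Gibbs sampler for K = 4 topics typically tracks per-document topic counts, per-topic word counts, per-topic totals, and document lengths:

from collections import Counter
import random

K = 4  # number of topics to find

# documents is the list of interest lists shown above; a short stub keeps this runnable
documents = [
    ["Hadoop", "Big Data", "HBase", "Java", "Spark", "Storm", "Cassandra"],
    ["NoSQL", "MongoDB", "Cassandra", "HBase", "Postgres"],
    ["Python", "scikit-learn", "scipy", "numpy", "statsmodels", "pandas"],
]

document_topic_counts = [Counter() for _ in documents]   # topic counts per document
topic_word_counts = [Counter() for _ in range(K)]        # word counts per topic
topic_counts = [0 for _ in range(K)]                     # total words assigned to each topic
document_lengths = [len(d) for d in documents]           # number of words in each document

# start with a random topic assignment for every word, then tally the counts
document_topics = [[random.randrange(K) for _ in d] for d in documents]
for d, doc in enumerate(documents):
    for word, topic in zip(doc, document_topics[d]):
        document_topic_counts[d][topic] += 1
        topic_word_counts[topic][word] += 1
        topic_counts[topic] += 1

The sampling step itself then repeatedly removes one word from these counts, reweights its topic assignment using them, and adds it back.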

In particular, we’ll look at the data set of users_interests that we’ve used before:

users_interests = [
    ["Hadoop", "Big Data", "HBase", "Java", "Spark", "Storm", "Cassandra"],
    ["NoSQL", "MongoDB", "Cassandra", "HBase", "Postgres"],
    ["Python", "scikit-learn", "scipy", "numpy", "statsmodels", "pandas"],
    ["R", "Python", "statistics", "regression", "probability"],
    ["machine learning", "regression", "decision trees", "libsvm"],
    ["Python", "R", "Java", "C++", "Haskell", "programming languages"],
    ["statistics", "probability", "mathematics", "theory"],
    ["machine learning", "scikit-learn", "Mahout", "neural networks"],
    ["neural networks", "deep learning", "Big Data", "artificial intelligence"],
    ["Hadoop", "Java", "MapReduce", "Big Data"],
    ["statistics", "R", "statsmodels"],
    ["C++", "deep learning", "artificial intelligence", "probability"],
    ["pandas", "R", "Python"],
    ["databases", "HBase", "Postgres", "MySQL", "MongoDB"],
    ["libsvm", "regression", "support vector machines"]
]

And we’ll think about the problem of recommending new interests to a user based on her currently specified interests.
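A minimal baseline, sketched here purely as an illustration (the helper name most_popular_new_interests is chosen for this sketch), is to suggest whichever interests are most popular overall that the user doesn’t already have:

from collections import Counter

# users_interests is the list shown above; a short stub keeps this runnable
users_interests = [
    ["Hadoop", "Big Data", "HBase", "Java", "Spark", "Storm", "Cassandra"],
    ["NoSQL", "MongoDB", "Cassandra", "HBase", "Postgres"],
    ["Python", "scikit-learn", "scipy", "numpy", "statsmodels", "pandas"],
]

# how many users list each interest
popular_interests = Counter(interest
                            for user_interests in users_interests
                            for interest in user_interests)

def most_popular_new_interests(user_interests, max_results=5):
    # rank by overall popularity, skipping interests the user already has
    return [(interest, count)
            for interest, count in popular_interests.most_common()
            if interest not in user_interests][:max_results]

print(most_popular_new_interests(["NoSQL", "MongoDB"]))

More personalized approaches would weight suggestions by how similar other users are to this one, but the popularity baseline is a useful point of comparison.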

After asking around, you manage to get your hands on this data, as a list of pairs (user_id, interest):

interests = [
    (0, "Hadoop"), (0, "Big Data"), (0, "HBase"), (0, "Java"), (0, "Spark"),
    (0, "Storm"), (0, "Cassandra"),
    (1, "NoSQL"), (1, "MongoDB"), (1, "Cassandra"), (1, "HBase"), (1, "Postgres"),
    (2, "Python"), (2, "scikit-learn"), (2, "scipy"), (2, "numpy"),
    (2, "statsmodels"), (2, "pandas"),
    (3, "R"), (3, "Python"), (3, "statistics"), (3, "regression"), (3, "probability"),
    (4, "machine learning"), (4, "regression"), (4, "decision trees"), (4, "libsvm"),
    (5, "Python"), (5, "R"), (5, "Java"), (5, "C++"), (5, "Haskell"),
    (5, "programming languages"),
    (6, "statistics"), (6, "probability"), (6, "mathematics"), (6, "theory"),
    (7, "machine learning"), (7, "scikit-learn"), (7, "Mahout"), (7, "neural networks"),
    (8, "neural networks"), (8, "deep learning"), (8, "Big Data"),
    (8, "artificial intelligence"),
    (9, "Hadoop"), (9, "Java"), (9, "MapReduce"), (9, "Big Data")
]

For example, Thor (id 4) has no friends in common with Devin (id 7), but they share an interest in machine learning. It’s easy to build a function that finds users with a certain interest:

def data_scientists_who_like(target_interest):
    return [user_id
            for user_id, user_interest in interests
            if user_interest == target_interest]

This works, but it has to examine the whole list of interests for every search.
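One way to avoid that repeated scan, shown here as a generic sketch (the index name user_ids_by_interest is mine), is to build a lookup from each interest to the users who have it:

from collections import defaultdict

# interests is the list of (user_id, interest) pairs shown above; a short stub keeps this runnable
interests = [
    (0, "Hadoop"), (0, "Big Data"),
    (4, "machine learning"), (4, "libsvm"),
    (7, "machine learning"), (7, "Mahout"),
]

# build the index once
user_ids_by_interest = defaultdict(list)
for user_id, interest in interests:
    user_ids_by_interest[interest].append(user_id)

def data_scientists_who_like(target_interest):
    # a single dictionary lookup instead of a scan over every pair
    return user_ids_by_interest[target_interest]

print(data_scientists_who_like("machine learning"))  # [4, 7]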


pages: 484 words: 104,873

Rise of the Robots: Technology and the Threat of a Jobless Future by Martin Ford

3D printing, additive manufacturing, Affordable Care Act / Obamacare, AI winter, algorithmic management, algorithmic trading, Amazon Mechanical Turk, artificial general intelligence, assortative mating, autonomous vehicles, banking crisis, basic income, Baxter: Rethink Robotics, Bernie Madoff, Bill Joy: nanobots, bond market vigilante , business cycle, call centre, Capital in the Twenty-First Century by Thomas Piketty, carbon tax, Charles Babbage, Chris Urmson, Clayton Christensen, clean water, cloud computing, collateralized debt obligation, commoditize, computer age, creative destruction, data science, debt deflation, deep learning, deskilling, digital divide, disruptive innovation, diversified portfolio, driverless car, Erik Brynjolfsson, factory automation, financial innovation, Flash crash, Ford Model T, Fractional reserve banking, Freestyle chess, full employment, general purpose technology, Geoffrey Hinton, Goldman Sachs: Vampire Squid, Gunnar Myrdal, High speed trading, income inequality, indoor plumbing, industrial robot, informal economy, iterative process, Jaron Lanier, job automation, John Markoff, John Maynard Keynes: technological unemployment, John von Neumann, Kenneth Arrow, Khan Academy, Kiva Systems, knowledge worker, labor-force participation, large language model, liquidity trap, low interest rates, low skilled workers, low-wage service sector, Lyft, machine readable, machine translation, manufacturing employment, Marc Andreessen, McJob, moral hazard, Narrative Science, Network effects, new economy, Nicholas Carr, Norbert Wiener, obamacare, optical character recognition, passive income, Paul Samuelson, performance metric, Peter Thiel, plutocrats, post scarcity, precision agriculture, price mechanism, public intellectual, Ray Kurzweil, rent control, rent-seeking, reshoring, RFID, Richard Feynman, Robert Solow, Rodney Brooks, Salesforce, Sam Peltzman, secular stagnation, self-driving car, Silicon Valley, Silicon Valley billionaire, Silicon Valley startup, single-payer health, software is eating the world, sovereign wealth fund, speech recognition, Spread Networks laid a new fibre optics cable between New York and Chicago, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steven Levy, Steven Pinker, strong AI, Stuxnet, technological singularity, telepresence, telepresence robot, The Bell Curve by Richard Herrnstein and Charles Murray, The Coming Technological Singularity, The Future of Employment, the long tail, Thomas L Friedman, too big to fail, Tragedy of the Commons, Tyler Cowen, Tyler Cowen: Great Stagnation, uber lyft, union organizing, Vernor Vinge, very high income, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, women in the workforce

However, the last few years have seen a number of dramatic breakthroughs that have resulted in significant advances in performance, especially when multiple layers of neurons are employed—a technology that has come to be called “deep learning.” Deep learning systems already power the speech recognition capability in Apple’s Siri and are poised to accelerate progress in a broad range of applications that rely on pattern analysis and recognition. A deep learning neural network designed in 2011 by scientists at the University of Lugano in Switzerland, for example, was able to correctly identify more than 99 percent of the images in a large database of traffic signs—a level of accuracy that exceeded that of human experts who competed against the system.

That compares with 97.53 percent accuracy for human observers.9 Geoffrey Hinton of the University of Toronto, one of the leading researchers in the field, notes that deep learning technology “scales beautifully. Basically you just need to keep making it bigger and faster, and it will get better.”10 In other words, even without accounting for likely future improvements in their design, machine learning systems powered by deep learning networks are virtually certain to see continued dramatic progress simply as a result of Moore’s Law. Big data and the smart algorithms that accompany it are having an immediate impact on workplaces and careers as employers, particularly large corporations, increasingly track a myriad of metrics and statistics regarding the work and social interactions of their employees.

Tom Simonite, “Facebook Creates Software That Matches Faces Almost as Well as You Do,” MIT Technology Review, March 17, 2014, http://www.technologyreview.com/news/525586/facebook-creates-software-that-matches-faces-almost-as-well-as-you-do/. 10. As quoted in John Markoff, “Scientists See Promise in Deep-Learning Programs,” New York Times, November 23, 2012, http://www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html. 11. Don Peck, “They’re Watching You at Work,” The Atlantic, December 2013, http://www.theatlantic.com/magazine/archive/2013/12/theyre-watching-you-at-work/354681/. 12. United States Patent No. 8,589,407, “Automated Generation of Suggestions for Personalized Reactions in a Social Network,” November 19, 2013, http://patft.uspto.gov/netacgi/nph-Parser?


pages: 573 words: 157,767

From Bacteria to Bach and Back: The Evolution of Minds by Daniel C. Dennett

Ada Lovelace, adjacent possible, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, Andrew Wiles, Bayesian statistics, bioinformatics, bitcoin, Bletchley Park, Build a better mousetrap, Claude Shannon: information theory, computer age, computer vision, Computing Machinery and Intelligence, CRISPR, deep learning, disinformation, double entry bookkeeping, double helix, Douglas Hofstadter, Elon Musk, epigenetics, experimental subject, Fermat's Last Theorem, Gödel, Escher, Bach, Higgs boson, information asymmetry, information retrieval, invention of writing, Isaac Newton, iterative process, John von Neumann, language acquisition, megaproject, Menlo Park, Murray Gell-Mann, Necker cube, Norbert Wiener, pattern recognition, phenotype, Richard Feynman, Rodney Brooks, self-driving car, social intelligence, sorting algorithm, speech recognition, Stephen Hawking, Steven Pinker, strong AI, Stuart Kauffman, TED Talk, The Wealth of Nations by Adam Smith, theory of mind, Thomas Bayes, trickle-down economics, Turing machine, Turing test, Watson beat the top human players on Jeopardy!, Y2K

This suggests—but certainly does not prove—that without us machine users to interpret the results, critically and insightfully, deep-learning machines may grow in competence, surpassing animal brains (including ours) by orders of magnitude in the bottom-up task of finding statistical regularities, but never achieve (our kind of) comprehension. “So what?” some might respond. “The computer kind of bottom-up comprehension will eventually submerge the human kind, overpowering it with the sheer size and speed of its learning.” The latest breakthrough in AI, AlphaGo, the deep-learning program that has recently beaten Lee Sedol, regarded by many as the best human player of Go in the world, supports this expectation in one regard if not in others.

Dehaene and Naccache (2001) note “the impossibility for subjects [i.e., executives] to strategically use the unconscious information.” My claim, then, is that deep learning (so far) discriminates but doesn’t notice. That is, the flood of data that a system takes in does not have relevance for the system except as more “food” to “digest.” Being bedridden, not having to fend for itself, it has no goals beyond increasing its store of well-indexed information. Beyond the capacity we share with Watson and other deep learning machines to acquire know-how that depends on statistical regularities that we extract from experience, there is the capacity to decide what to search for and why, given one’s current aims.

A conscious human mind is not a miracle, not a violation of the principles of natural selection, but a novel extension of them, a new crane that adjusts evolutionary biologist Stuart Kauffman’s concept of the adjacent possible: many more places in Design Space are adjacent to us because we have evolved the ability to think about them and either seek them or shun them. The unanswered question for Domingos and other exponents of deep learning is whether learning a sufficiently detailed and dynamic theory of agents with imagination and reason-giving capabilities would enable a system (a computer program, a Master Algorithm) to generate and exploit the abilities of such agents, that is to say, to generate all the morally relevant powers of a person.103 My view is (still) that deep learning will not give us—in the next fifty years—anything like the “superhuman intelligence” that has attracted so much alarmed attention recently (Bostrom 2014; earlier invocations are Moravec 1988; Kurzweil 2005; and Chalmers 2010; see also the annual Edge world question 2015; and Katchadourian 2015).


pages: 590 words: 152,595

Army of None: Autonomous Weapons and the Future of War by Paul Scharre

"World Economic Forum" Davos, active measures, Air France Flight 447, air gap, algorithmic trading, AlphaGo, Apollo 13, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, Black Monday: stock market crash in 1987, brain emulation, Brian Krebs, cognitive bias, computer vision, cuban missile crisis, dark matter, DARPA: Urban Challenge, data science, deep learning, DeepMind, DevOps, Dr. Strangelove, drone strike, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, facts on the ground, fail fast, fault tolerance, Flash crash, Freestyle chess, friendly fire, Herman Kahn, IFF: identification friend or foe, ImageNet competition, information security, Internet of things, Jeff Hawkins, Johann Wolfgang von Goethe, John Markoff, Kevin Kelly, Korean Air Lines Flight 007, Loebner Prize, loose coupling, Mark Zuckerberg, military-industrial complex, moral hazard, move 37, mutually assured destruction, Nate Silver, Nick Bostrom, PalmPilot, paperclip maximiser, pattern recognition, Rodney Brooks, Rubik’s Cube, self-driving car, sensor fusion, South China Sea, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Ballmer, Steve Wozniak, Strategic Defense Initiative, Stuxnet, superintelligent machines, Tesla Model S, The Signal and the Noise by Nate Silver, theory of mind, Turing test, Tyler Cowen, universal basic income, Valery Gerasimov, Wall-E, warehouse robotics, William Langewiesche, Y2K, zero day

Grainy SAR images of tanks, artillery, or airplanes parked on a runway often push the limits of human abilities to recognize objects, and historically ATR algorithms have fallen far short of human abilities. The poor performance of military ATR stands in stark contrast to recent advances in computer vision. Artificial intelligence has historically struggled with object recognition and perception, but the field has seen rapid gains recently due to deep learning. Deep learning uses neural networks, a type of AI approach that is analogous to biological neurons in animal brains. Artificial neural networks don’t directly mimic biology, but are inspired by it. Rather than follow a script of if-then steps for how to perform a task, neural networks work based on the strength of connections within a network.
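To make the contrast with scripted if-then rules concrete, here is a deliberately tiny, generic sketch (not any particular ATR or vision system) of how an artificial neuron’s output is determined entirely by the strengths of its input connections:

import numpy as np

def neuron(inputs, weights, bias):
    # weighted sum of the inputs, passed through a nonlinearity
    return np.tanh(np.dot(weights, inputs) + bias)

x = np.array([0.2, 0.9, 0.4])      # three input signals (e.g. crude pixel features)
w = np.array([1.5, -2.0, 0.7])     # connection strengths, set by training rather than by hand
b = 0.1

print(neuron(x, w, b))

# a layer is just many such neurons reading the same inputs;
# stacking layers is what gives deep learning its name
W = np.array([[1.5, -2.0, 0.7],
              [0.3, 0.8, -1.1]])
print(np.tanh(W @ x + np.array([0.1, -0.2])))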

Thus the response of the network to all possible inputs is unknowable. Part of this is due to the early stage of research in neural nets, but part of it is due to the sheer complexity of the deep learning. The JASON group argued that “the very nature of [deep neural networks] may make it intrinsically difficult for them to transition into what is typically recognized as a professionally engineered product.” AI researchers are working on ways to build more transparent AI, but Jeff Clune isn’t hopeful. “As deep learning gets even more powerful and more impressive and more complicated and as the networks grow in size, there will be more and more and more things we don’t understand. . . .

Goodfellow, Jonathan Shlens, and Christian Szegedy, “Explaining and Harnessing Adversarial Examples,” March 20, 2015, https://arxiv.org/abs/1412.6572; Ian Goodfellow, Presentation at Re-Work Deep Learning Summit, 2015, https://www.youtube.com/watch?v=Pq4A2mPCB0Y. 184 “infinitely far to the left”: Jeff Clune, interview, September 28, 2016. 184 “real-world images are a very, very small”: Ibid. 184 present in essentially every deep neural network: “Deep neural networks are easily fooled.” Goodfellow et al., “Explaining and Harnessing Adversarial Examples.” 184 specially evolved noise: Corey Kereliuk, Bob L. Sturm, and Jan Larsen, “Deep Learning and Music Adversaries,” http://www2.imm.dtu.dk/pubdb/views/edoc_download.php/6904/pdf/imm6904.pdf. 184 News-reading trading bots: John Carney, “The Trading Robots Really Are Reading Twitter,” April 23, 2013, http://www.cnbc.com/id/100666302.


pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy by George Gilder

23andMe, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AlphaGo, AltaVista, Amazon Web Services, AOL-Time Warner, Asilomar, augmented reality, Ben Horowitz, bitcoin, Bitcoin Ponzi scheme, Bletchley Park, blockchain, Bob Noyce, British Empire, Brownian motion, Burning Man, business process, butterfly effect, carbon footprint, cellular automata, Claude Shannon: information theory, Clayton Christensen, cloud computing, computer age, computer vision, crony capitalism, cross-subsidies, cryptocurrency, Danny Hillis, decentralized internet, deep learning, DeepMind, Demis Hassabis, disintermediation, distributed ledger, don't be evil, Donald Knuth, Donald Trump, double entry bookkeeping, driverless car, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, fake news, fault tolerance, fiat currency, Firefox, first square of the chessboard, first square of the chessboard / second half of the chessboard, floating exchange rates, Fractional reserve banking, game design, Geoffrey Hinton, George Gilder, Google Earth, Google Glasses, Google Hangouts, index fund, inflation targeting, informal economy, initial coin offering, Internet of things, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, Jim Simons, Joan Didion, John Markoff, John von Neumann, Julian Assange, Kevin Kelly, Law of Accelerating Returns, machine translation, Marc Andreessen, Mark Zuckerberg, Mary Meeker, means of production, Menlo Park, Metcalfe’s law, Money creation, money: store of value / unit of account / medium of exchange, move fast and break things, Neal Stephenson, Network effects, new economy, Nick Bostrom, Norbert Wiener, Oculus Rift, OSI model, PageRank, pattern recognition, Paul Graham, peer-to-peer, Peter Thiel, Ponzi scheme, prediction markets, quantitative easing, random walk, ransomware, Ray Kurzweil, reality distortion field, Recombinant DNA, Renaissance Technologies, Robert Mercer, Robert Metcalfe, Ronald Coase, Ross Ulbricht, Ruby on Rails, Sand Hill Road, Satoshi Nakamoto, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Singularitarianism, Skype, smart contracts, Snapchat, Snow Crash, software is eating the world, sorting algorithm, South Sea Bubble, speech recognition, Stephen Hawking, Steve Jobs, Steven Levy, Stewart Brand, stochastic process, Susan Wojcicki, TED Talk, telepresence, Tesla Model S, The Soul of a New Machine, theory of mind, Tim Cook: Apple, transaction costs, tulip mania, Turing complete, Turing machine, Vernor Vinge, Vitalik Buterin, Von Neumann architecture, Watson beat the top human players on Jeopardy!, WikiLeaks, Y Combinator, zero-sum game

CHAPTER 3

Google’s Roots and Religions

Under the leadership of Larry Page and Sergey Brin, Google developed the integrated philosophy that currently shapes our lives and fortunes, combining a theory of knowledge (nicknamed “Big Data”), a technological vision (centralized cloud computing), a cult of the commons (rooted in “open source” software), a concept of money and value (based on free goods and automated advertising), a theory of morality as “gifts” rather than profits, and a view of progress as evolutionary inevitability and an ever diminishing “carbon footprint.” This philosophy rules our economic lives in America and, increasingly, around the globe. With its development of “deep learning” by machines and its hiring of the inventor-prophet Raymond Kurzweil in 2012, Google enlisted in a chiliastic campaign to blend human and machine cognition. Kurzweil calls it a “singularity,” marked by the triumph of computation over human intelligence. Google networks, clouds, and server farms could be said to have already accomplished much of it.

Google, meanwhile, under its new CEO, Sundar Pichai, pivoted away from its highly publicized “mobile first” mantra, which had led to its acquisitions of Android and Ad Mob, and toward “AI first.” Google was the recognized intellectual leader of the industry, and its AI ostentation was widely acclaimed. Indeed it signed up most of the world’s AI celebrities, including its spearheads of “deep learning” prowess, from Geoffrey Hinton and Andrew Ng to Jeff Dean, the beleaguered Anthony Levandowski, and Demis Hassabis of DeepMind. If Google had been a university, it would have utterly outshone all others in AI talent. It must have been discouraging, then, to find that Amazon had shrewdly captured much of the market for AI services with its 2014 Alexa and Echo projects.

Or do you diffuse the memory and processing all through the machine? In a massively parallel spread like Dally’s J-machine, the memory is always close to the processor. Twenty-six years later, Dally and Jouppi are still at it. At the August 2017 Hot Chips in Cupertino, all the big guys were touting their own chips for what they call “deep learning,” the fashionable Silicon Valley term for the massive acceleration of multi-layered pattern recognition, correlation, and correction tied to feedback that results in a cumulative gain in performance. What they call “learning” originated in earlier ventures in AI. Guess, measure the error, adjust the answer, feed it back are the canonical steps followed in Google’s data centers, enabling such applications as Google Translate, Google Soundwriter, Google Maps, Google Assistant, Waymo cars, search, Google Now, and so on, in real time.4 As recently as 2012, Google was still struggling with the difference between dogs and cats.
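The “guess, measure the error, adjust the answer, feed it back” loop described above is essentially gradient-descent training; here is a toy, generic illustration with a single parameter (nothing Google-specific):

data = [(x, 3.0 * x) for x in range(1, 6)]   # samples generated with a true weight of 3.0

w = 0.0              # the initial guess
learning_rate = 0.01

for step in range(200):
    # measure the error of the current guess (mean squared error)
    error = sum((w * x - y) ** 2 for x, y in data) / len(data)
    # compute how the error changes as w changes
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # adjust the answer and feed it back into the next pass
    w -= learning_rate * grad

print(round(w, 3), round(error, 6))   # w converges toward 3.0 as the error shrinks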


pages: 523 words: 61,179

Human + Machine: Reimagining Work in the Age of AI by Paul R. Daugherty, H. James Wilson

3D printing, AI winter, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Amazon Robotics, augmented reality, autonomous vehicles, blockchain, business process, call centre, carbon footprint, circular economy, cloud computing, computer vision, correlation does not imply causation, crowdsourcing, data science, deep learning, DeepMind, digital twin, disintermediation, Douglas Hofstadter, driverless car, en.wikipedia.org, Erik Brynjolfsson, fail fast, friendly AI, fulfillment center, future of work, Geoffrey Hinton, Hans Moravec, industrial robot, Internet of things, inventory management, iterative process, Jeff Bezos, job automation, job satisfaction, knowledge worker, Lyft, machine translation, Marc Benioff, natural language processing, Neal Stephenson, personalized medicine, precision agriculture, Ray Kurzweil, recommendation engine, RFID, ride hailing / ride sharing, risk tolerance, robotic process automation, Rodney Brooks, Salesforce, Second Machine Age, self-driving car, sensor fusion, sentiment analysis, Shoshana Zuboff, Silicon Valley, Snow Crash, software as a service, speech recognition, tacit knowledge, telepresence, telepresence robot, text mining, the scientific method, uber lyft, warehouse automation, warehouse robotics

These individuals are responsible for making important judgment calls about which AI technologies might best be deployed for specific applications. A huge consideration here is accuracy versus “explainability.” A deep-learning system, for example, provides a high level of prediction accuracy, but companies may have difficulty explaining how those results were derived. In contrast, a decision tree may not lead to results with high prediction accuracy but will enable a significantly greater explainability. So, for instance, an internal system that optimizes a supply chain with small tolerances for scheduling deliveries might best deploy deep-learning technology, whereas a health-care or consumer-facing application that will have to stand up to considerable regulatory scrutiny may be better off utilizing falling rule list algorithms.10 In addition, the explainability strategist might also decide that, for a particular application, the company might be better off avoiding the use of AI altogether.
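As a generic illustration of that trade-off (a scikit-learn sketch on a public dataset, not any company’s actual tooling), compare a small neural network with a shallow decision tree whose learned rules can be printed and audited:

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(data.data, data.target, random_state=0)

# a small neural network: its weights are hard to interpret directly
nn = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
nn.fit(X_train, y_train)

# a shallow decision tree: typically simpler, but every decision can be read off
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print("neural net accuracy:", nn.score(X_test, y_test))
print("decision tree accuracy:", tree.score(X_test, y_test))
print(export_text(tree, feature_names=list(data.feature_names)))

Which accuracy comes out ahead depends on the data and tuning; the point is that only the tree yields a rule list a regulator could inspect.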

See personalization cybersecurity, 56–58, 59 Darktrace, 58 DARPA Cyber Grand Challenges, 57, 190 Dartmouth College conference, 40–41 dashboards, 169 data, 10 in AI training, 121–122 barriers to flow of, 176–177 customization and, 78–80 discovery with, 178 dynamic, real-time, 175–176 in enterprise processes, 59 exhaust, 15 in factories, 26–27, 29–30 leadership and, 180 in manufacturing, 38–39 in marketing and sales, 92, 98–99, 100 in R&D, 69–72 in reimagining processes, 154 on supply chains, 33–34 supply chains for, 12, 15 velocity of, 177–178 data hygienists, 121–122 data supply-chain officers, 179 data supply chains, 12, 15, 174–179 decision making, 109–110 about brands, 93–94 black box, 106, 125, 169 employee power to modify AI, 172–174 empowerment for, 15 explainers and, 123–126 transparency in, 213 Deep Armor, 58 deep learning, 63, 161–165 deep-learning algorithms, 125 DeepMind, 121 deep neural networks (DNN), 63 deep reinforcement learning, 21–22 demand planning, 33–34 Dennis, Jamie, 158 design at Airbus, 144 AI system, 128–129 Elbo Chair, 135–137 generative, 135–137, 139, 141 product/service, 74–77 Dickey, Roger, 52–54 digital twins, 10 at GE, 27, 29–30, 183–184, 194 disintermediation, brand, 94–95 distributed learning, 22 distribution, 19–39 Ditto Labs, 98 diversity, 52 Doctors Without Borders, 151 DoubleClick Search, 99 Dreamcatcher, 136–137, 141, 144 drones, 28, 150–151 drug interactions, 72–74 Ducati, 175 Echo, 92, 164–165 Echo Voyager, 28 Einstein, 85–86, 196 Elbo Chair, 136–137, 139 “Elephants Don’t Play Chess” (Brooks), 24 Elish, Madeleine Clare, 170–171 Ella, 198–199 embodied intelligence, 206 embodiment, 107, 139–140 in factories, 21–23 of intelligence, 206 interaction agents, 146–151 jobs with, 147–151 See also augmentation; missing middle empathy engines for health care, 97 training, 117–118, 132 employees agency of, 15, 172–174 amplification of, 138–139, 141–143 development of, 14 hiring, 51–52 job satisfaction in, 46–47 marketing and sales, 90, 92, 100–101 on-demand work and, 111 rehumanizing time and, 186–189 routine/repetitive work and, 26–27, 29–30, 46–47 training/retraining, 15 warehouse, 31–33 empowerment, 137 bot-based, 12, 195–196 in decision making, 15 of salespeople, 90, 92 workforce implications of, 137–138 enabling, 7 enterprise processes, 45–66 compliance, 47–48 determining which to change, 52–54 hiring and recruitment, 51–52 how much to change, 54–56 redefining industries with, 56–58 reimagining around people, 58–59 robotic process automation (RPA) in, 50–52 routine/repetitive, 46–47 ergonomics, 149–150 EstherBot, 199 ethical, moral, legal issues, 14–15, 108 Amazon Echo and, 164–165 explainers and, 123–126 in marketing and sales, 90, 100 moral crumple zones and, 169–172 privacy, 90 in R&D, 83 in research, 78–79 ethics compliance managers, 79, 129–130, 132–133 European Union, 124 Ewing, Robyn, 119 exhaust data, 15 definition of, 122 experimentation, 12, 14 cultures of, 161–165 in enterprise processes, 59 leadership and, 180 learning from, 71 in manufacturing, 39 in marketing and sales, 100 in process reimagining, 160–165 in R&D, 83 in reimagining processes, 154 testing and, 74–77 expert systems, 25, 41 definition of, 64 explainability strategists, 126 explaining outcomes, 107, 114–115, 179 black-box concerns and, 106, 125, 169 jobs in, 122–126 sustaining and, 130 See also missing middle extended intelligence, 206 extended reality, 66 Facebook, 78, 79, 95, 177–178 facial recognition, 65, 90 factories, 10 data flow in, 26–27, 29–30 embodiment in, 140 job losses 
and gains in, 19, 20 robotic arms in, 21–26 self-aware, 19–39 supply chains and, 33–34 third wave in, 38–39 traditional assembly lines and, 1–2, 4 warehouse management and, 30–33 failure, learning from, 71 fairness, 129–130 falling rule list algorithms, 124–125 Fanuc, 21–22, 128 feedback, 171–172 feedforward neural networks (FNN), 63 Feigenbaum, Ed, 41 financial trading, 167 first wave of business transformation, 5 Fletcher, Seth, 49 food production, 34–37 ForAllSecure, 57 forecasts, 33–34 Fortescue Metals Group, 28 Fraunhofer Institute of Material Flow and Logistics (IML), 26 fusion skills, 12, 181, 183–206, 210 bot-based empowerment, 12, 195–196 developing, 15–16 holistic melding, 12, 197, 200–201 intelligent interrogation, 12, 185, 193–195 judgment integration, 12, 191–193 potential of, 209 reciprocal apprenticing, 12, 201–202 rehumanizing time, 12, 186–189 relentless reimagining, 12, 203–205 responsible normalizing, 12, 189–191 training/retraining for, 211–213 Future of Work survey, 184–185 Garage, Capital One, 205 Gaudin, Sharon, 99 GE.

From the Mechanistic to the Organic

The potential power of AI to transform businesses is unprecedented, and yet there is an urgent and growing challenge. Companies are now reaching a crossroads in their use of AI, which we define as systems that extend human capability by sensing, comprehending, acting, and learning. As businesses deploy such systems—spanning from machine learning to computer vision to deep learning—some firms will continue to see modest productivity gains over the short run, but those results will eventually stall out. Other companies will be able to attain breakthrough improvements in performance, often by developing game-changing innovations. What accounts for the difference? It has to do with understanding the true nature of AI’s impact.


pages: 321

Finding Alphas: A Quantitative Approach to Building Trading Strategies by Igor Tulchinsky

algorithmic trading, asset allocation, automated trading system, backpropagation, backtesting, barriers to entry, behavioural economics, book value, business cycle, buy and hold, capital asset pricing model, constrained optimization, corporate governance, correlation coefficient, credit crunch, Credit Default Swap, currency risk, data science, deep learning, discounted cash flows, discrete time, diversification, diversified portfolio, Eugene Fama: efficient market hypothesis, financial engineering, financial intermediation, Flash crash, Geoffrey Hinton, implied volatility, index arbitrage, index fund, intangible asset, iterative process, Long Term Capital Management, loss aversion, low interest rates, machine readable, market design, market microstructure, merger arbitrage, natural language processing, passive investing, pattern recognition, performance metric, Performance of Mutual Funds in the Period, popular capitalism, prediction markets, price discovery process, profit motive, proprietary trading, quantitative trading / quantitative finance, random walk, Reminiscences of a Stock Operator, Renaissance Technologies, risk free rate, risk tolerance, risk-adjusted returns, risk/return, selection bias, sentiment analysis, shareholder value, Sharpe ratio, short selling, Silicon Valley, speech recognition, statistical arbitrage, statistical model, stochastic process, survivorship bias, systematic bias, systematic trading, text mining, transaction costs, Vanguard fund, yield curve

Another alternative is FloatBoost, which incorporates the backtracking mechanism of floating search and repeatedly performs a backtracking to remove unfavorable weak classifiers after a new weak classifier is added by AdaBoost; this ensures a lower error rate and reduced feature set at the cost of about five times longer training time.

Deep Learning

Deep learning (DL) is a popular topic today – and a term that is used to discuss a number of rather distinct things. Some data scientists think DL is just a buzz word or a rebranding of neural networks. The name comes from Canadian scientist Geoffrey Hinton, who created an unsupervised method known as the restricted Boltzmann machine (RBM) for pretraining NNs with a large number of neuron layers. That was meant to improve on the backpropagation training method, but there is no strong evidence that it really was an improvement. Another direction in deep learning is recurrent neural networks (RNNs) and natural language processing.

This is called the vanishing gradient problem. These days, the words “deep learning” more often refer to convolutional neural networks (CNNs). The architecture of CNNs was introduced by computer scientists Kunihiko Fukushima, who developed the neocognitron model (feed-forward NN), and Yann LeCun, who modified the backpropagation algorithm for neocognitron training. CNNs require a lot of resources for training, but they can be easily parallelized and therefore are a good candidate for parallel computations. When applying deep learning, we seek to stack several independent neural network layers that by working together produce better results than the shallow individual structures.
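A minimal PyTorch sketch of that stacking idea (layer sizes here are arbitrary and purely illustrative, not a recommended architecture):

import torch
import torch.nn as nn

# several simple layers whose composition represents a much richer transformation
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),    # first layer: low-level features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),   # second layer: combinations of the first
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                    # final layer: scores for 10 classes
)

x = torch.randn(32, 1, 28, 28)    # a batch of 32 single-channel 28x28 inputs
print(model(x).shape)             # torch.Size([32, 10])

Training such a stack is what demands the parallel hardware mentioned above, since the same convolution is applied at every position of every input.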

For example, in alpha research the task of predicting stock prices can be a good application of supervised learning, and the task of selecting stocks for inclusion in a portfolio is an application of unsupervised learning.

[Figure 16.1 The most developed directions of machine learning; the most popular are in black. The figure groups machine learning into unsupervised methods (clusterization algorithms) and supervised methods (statistical models, support vector machines, neural networks, deep learning algorithms, fuzzy logic, and ensemble methods such as random forest and AdaBoost).]

Statistical Models

Models like naive Bayes, linear discriminant analysis, the hidden Markov model, and logistic regression are good for solving relatively simple problems that do not need high precision of classification or prediction.
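To ground the supervised/unsupervised distinction drawn above, here is a generic, illustrative sketch on synthetic data (not a real alpha): a regression that predicts next-period returns, and a clustering step that groups stocks for portfolio construction.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# synthetic features for 100 stocks: momentum, value, volatility
X = rng.normal(size=(100, 3))
# synthetic next-period returns containing a weak linear signal plus noise
y = X @ np.array([0.02, -0.01, 0.005]) + rng.normal(scale=0.05, size=100)

# supervised: learn a mapping from features to future returns
reg = LinearRegression().fit(X, y)
predicted_returns = reg.predict(X)

# unsupervised: group similar stocks without looking at the returns at all
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

print(predicted_returns[:5])
print(clusters[:10])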


pages: 360 words: 100,991

Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence by Richard Yonck

3D printing, AI winter, AlphaGo, Apollo 11, artificial general intelligence, Asperger Syndrome, augmented reality, autism spectrum disorder, backpropagation, Berlin Wall, Bletchley Park, brain emulation, Buckminster Fuller, call centre, cognitive bias, cognitive dissonance, computer age, computer vision, Computing Machinery and Intelligence, crowdsourcing, deep learning, DeepMind, Dunning–Kruger effect, Elon Musk, en.wikipedia.org, epigenetics, Fairchild Semiconductor, friendly AI, Geoffrey Hinton, ghettoisation, industrial robot, Internet of things, invention of writing, Jacques de Vaucanson, job automation, John von Neumann, Kevin Kelly, Law of Accelerating Returns, Loebner Prize, Menlo Park, meta-analysis, Metcalfe’s law, mirror neurons, Neil Armstrong, neurotypical, Nick Bostrom, Oculus Rift, old age dependency ratio, pattern recognition, planned obsolescence, pneumatic tube, RAND corporation, Ray Kurzweil, Rodney Brooks, self-driving car, Skype, social intelligence, SoftBank, software as a service, SQL injection, Stephen Hawking, Steven Pinker, superintelligent machines, technological singularity, TED Talk, telepresence, telepresence robot, The future is already here, The Future of Employment, the scientific method, theory of mind, Turing test, twin studies, Two Sigma, undersea cable, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Review, working-age population, zero day

These processors made it possible to speed up the network training by orders of magnitude, performing the major number crunching required to reduce what once took weeks to a matter of days or hours. Different approaches led to further refining of these deep learning techniques, using methods with names such as restricted Boltzmann machines and recurrent neural networks. All of these factors vastly improved the deep learning algorithms being used for many kinds of pattern recognition work. Continuing advances contributed to the significant gains seen by artificial intelligence during this past decade, including Facebook’s development of DeepFace, which identifies human faces in images with 97 percent accuracy.

In 2012, a University of Toronto artificial intelligence team made up of Hinton and two of his students won the annual ImageNet Large Scale Visual Recognition Competition with a deep learning neural network that blew the competition away.5 More recently, Google DeepMind used deep learning to develop the Go-playing AI, AlphaGo, training it by using a database of thirty million recorded moves from expert-level games. In March 2016, AlphaGo beat the world Go grandmaster, Lee Sedol, in four out of five games. Playing Go is considered a much bigger AI challenge than playing chess.

The camera technology wasn’t strong enough for us to actually measure the microexpressions that are on people’s faces, those subconscious reactions that show up on the muscles on our face before we can shut it down with our consciousness, because they’re impulse.” Continuing, Denman notes that the processing power is also now available to us to be able to run the deep learning neural networks needed to make this happen. Because of this, dozens of companies are entering the space, focusing not only on facial information but on the other ways we engage the world emotionally. Tel Aviv–based Beyond Verbal is an emotions analytics company that extracts and identifies feelings conveyed by intonations in the human voice.


pages: 180 words: 55,805

The Price of Tomorrow: Why Deflation Is the Key to an Abundant Future by Jeff Booth

3D printing, Abraham Maslow, activist fund / activist shareholder / activist investor, additive manufacturing, AI winter, Airbnb, Albert Einstein, AlphaGo, Amazon Web Services, artificial general intelligence, augmented reality, autonomous vehicles, basic income, bitcoin, blockchain, Bretton Woods, business intelligence, butterfly effect, Charles Babbage, Claude Shannon: information theory, clean water, cloud computing, cognitive bias, collapse of Lehman Brothers, Computing Machinery and Intelligence, corporate raider, creative destruction, crony capitalism, crowdsourcing, cryptocurrency, currency manipulation / currency intervention, dark matter, deep learning, DeepMind, deliberate practice, digital twin, distributed ledger, Donald Trump, Elon Musk, fiat currency, Filter Bubble, financial engineering, full employment, future of work, game design, gamification, general purpose technology, Geoffrey Hinton, Gordon Gekko, Great Leap Forward, Hyman Minsky, hype cycle, income inequality, inflation targeting, information asymmetry, invention of movable type, Isaac Newton, Jeff Bezos, John Maynard Keynes: Economic Possibilities for our Grandchildren, John von Neumann, Joseph Schumpeter, late fees, low interest rates, Lyft, Maslow's hierarchy, Milgram experiment, Minsky moment, Modern Monetary Theory, moral hazard, Nelson Mandela, Network effects, Nick Bostrom, oil shock, OpenAI, pattern recognition, Ponzi scheme, quantitative easing, race to the bottom, ride hailing / ride sharing, self-driving car, software as a service, technoutopianism, TED Talk, the long tail, the scientific method, Thomas Bayes, Turing test, Uber and Lyft, uber lyft, universal basic income, winner-take-all economy, X Prize, zero-sum game

When artificial intelligence reduces that waste and increases the benefits to society, as a by-product of removing the waste in the system, it reduces the number of jobs in healthcare. With more than $3.5 trillion of annual spending and 19 percent of the US GDP in healthcare, that could mean a lot of jobs. For instance, as reported in a May 2019 Nature Medicine article, researchers created a 3D volumetric deep learning model to screen for lung cancer.55 When comparing a single image, the deep learning model outperformed six radiology experts, with an 11 percent reduction in false positives and a 5 percent reduction in false negatives. According to Dr. Mozziyar Etemadi, one of the study’s coauthors, “AI in 3D can be much more sensitive in its ability to detect early lung cancer than the human eye looking at 2D images.
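The false-positive and false-negative improvements quoted there are just rates computed by comparing predictions against ground truth; as a generic illustration with made-up labels (not the study’s data):

import numpy as np

# 1 = cancer present, 0 = cancer absent (invented ground truth and model predictions)
y_true = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 1, 1, 0, 0, 0, 0, 1, 0])

false_positives = np.sum((y_pred == 1) & (y_true == 0))   # healthy cases flagged as cancer
false_negatives = np.sum((y_pred == 0) & (y_true == 1))   # cancers the model missed

false_positive_rate = false_positives / np.sum(y_true == 0)
false_negative_rate = false_negatives / np.sum(y_true == 1)

print(false_positive_rate, false_negative_rate)

Reducing either rate without worsening the other is what makes such a result notable, since screening programs pay for false positives in unnecessary follow-ups and for false negatives in missed cancers.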

All of that digitization is also creating some impressive data capture, much more than we are even aware of, and the data collection from connected computers, people, cameras, and sensors has only just started. Connecting those devices to learn from data is arguably a far easier job than that of building the original network. The rate of growth in today’s deep learning in artificial intelligence is largely driven by data collection and large data sets. In fact, every platform company today is really a data company with AI at its core. Other data, too, is moving out of its previous silos, giving rise to an intelligence that can be combined with other data sets to learn at a rate far faster than humans.

Until 2014, even top AI researchers believed top human competitors would beat computers for years to come because of the complexity of the game and the fact that algorithms had to compare every move, which required enormous compute power. But in 2016, Google’s DeepMind program AlphaGo beat one of the top players in the world, Lee Sedol, in a match that made history. AlphaGo’s program was based on deep learning, which was “trained” using thousands of human amateur and professional games. It made history not only because it was the first time a computer beat a top Go master, but also because of the way it did so. In game 2, on the thirty-seventh move, the computer made a move that defied logic, placing a black stone in the middle of an open area—away from the other stones.


pages: 256 words: 67,563

Explaining Humans: What Science Can Teach Us About Life, Love and Relationships by Camilla Pang

autism spectrum disorder, backpropagation, bioinformatics, Brownian motion, correlation does not imply causation, data science, deep learning, driverless car, frictionless, job automation, John Nash: game theory, John von Neumann, Kickstarter, Nash equilibrium, neurotypical, phenotype, random walk, self-driving car, stem cell, Stephen Hawking

Compared to traditional machine learning, a neural network is more independent and requires less input from the programmer to define what it should be searching for, since, through internal layers of logic, it is able to create its own connections. All of the more radical examples of artificial intelligence you may have read about – from fully driverless cars to mass automation of people’s jobs – ultimately rely on deep learning, the closest we have so far got to developing a computer program that can think (within considerable limitations) like a human. Deep learning is also responsible for applications, including criminal checks, drug design, and the computer programs that rival the most competent chess players, all of which depend on an ability to simulate the connective capability of the human mind.

How not to follow the crowd: Molecular dynamics, conformity and individuality
7. How to achieve your goals: Quantum physics, network theory and goal setting
8. How to have empathy with others: Evolution, probability and relationships
9. How to connect with others: Chemical bonds, fundamental forces and human connection
10. How to learn from your mistakes: Deep learning, feedback loops and human memory
11. How to be polite: Game theory, complex systems and etiquette
Afterword
Acknowledgements
Index
About the Author

Dr Camilla Pang holds a PhD in Biochemistry from University College London and is a Postdoctoral Scientist specialising in Translational Bioinformatics.

No one can ever be entirely dispassionate, objective, or dare I say scientific, about how they form new relationships. But chemistry can give us a new outlook and a fresh perspective: one that provides the confidence to make, break and sometimes re-make the connections that define us. 10. How to learn from your mistakes Deep learning, feedback loops and human memory With ADHD, you’re always forgetting what you’re meant to be doing. My working memory – the part where we hold information for short-term, immediate use – is constantly undermined by new thoughts, impulses or emotional responses. It feels as though everywhere you go, even just to the room next door, the working memory is always being refreshed, losing immediate context.


pages: 296 words: 66,815

The AI-First Company by Ash Fontana

23andMe, Amazon Mechanical Turk, Amazon Web Services, autonomous vehicles, barriers to entry, blockchain, business intelligence, business process, business process outsourcing, call centre, Charles Babbage, chief data officer, Clayton Christensen, cloud computing, combinatorial explosion, computer vision, crowdsourcing, data acquisition, data science, deep learning, DevOps, en.wikipedia.org, Geoffrey Hinton, independent contractor, industrial robot, inventory management, John Conway, knowledge economy, Kubernetes, Lean Startup, machine readable, minimum viable product, natural language processing, Network effects, optical character recognition, Pareto efficiency, performance metric, price discrimination, recommendation engine, Ronald Coase, Salesforce, single source of truth, software as a service, source of truth, speech recognition, the scientific method, transaction costs, vertical integration, yield management

Reinforcement learning is a functionally different approach to supervised and unsupervised ML. Transfer and deep learning overlap with the other types. The table below shows what might be applicable to a particular situation depending on the data at hand, required interpretability, and existing knowledge of the prediction problem.

Supervised: learns from inputs given outputs; needs training and feedback data; good when data is available but the algorithm is missing; selected methods: random forest trees, decision trees (including random forest and gradient boosted types), regression, support vector machines (SVMs), and neural networks.

Unsupervised: learns from inputs without outputs; needs lots of data; good when it’s unclear what is being looked at and/or there are no labels; selected methods: clustering (k-means, hierarchical, and others) and Gaussian mixture models.*

Reinforcement: learns from objectives; needs objectives; good when it’s possible to articulate the state, action, reward, and how to modify the state based on the rewards; selected methods: various, but all forms of reinforcement learning.

Transfer: learns from inputs; needs existing models; good when problems are similar, training time and computational resources are limited, and results are needed fast; selected methods: Bayesian networks and Markov logic networks.

Deep: learns from other layers in the network; needs lots of data and computational resources; good when there is lots of unstructured time series data (for convolutional neural networks) or data that’s not independent (for recurrent neural networks);* selected methods: convolutional neural networks and recurrent neural networks.

COMPOUNDING

There are many different methods for making predictions, each one generating and accumulating data in various ways.

Data is uploaded from the system of record, predictive models are applied, and actionable insights made available to the user.

Data type: Unstructured. Deep learning over unstructured data to turn it into structured data and extract predictive features. There’s just not enough data about most jobs available for the training of deep learning models.

Data categorization: Exhaust. Higher accuracy by removing reporting bias. A machine-generated summary of the last email from a sales lead is perhaps more indicative of a likelihood to close a deal with that lead than a salesperson’s subjective opinion expressed in a single, categorical input.

The Canadian computer scientist Yoshua Bengio devised a language model based on a neural network that figured out the next best word to use among all the available words in a language based on where that word usually appeared with respect to other words. Geoffrey Hinton, a British-born computer scientist and psychologist, developed a neural network that linked many layers of neurons together, the precursor to deep learning. Importantly, researchers worked to get these neural networks running efficiently on the available computer chips, settling on the chips used for computer graphics because they are particularly good at running many numerical computations in parallel. The result was a trainable neural network: programmable neurons, connected in a weblike network, passing the computations onto another web sitting below it—all computed on a chip that could perform the necessary operations on a reasonable timescale: mere days instead of months.


pages: 688 words: 147,571

Robot Rules: Regulating Artificial Intelligence by Jacob Turner

"World Economic Forum" Davos, Ada Lovelace, Affordable Care Act / Obamacare, AI winter, algorithmic bias, algorithmic trading, AlphaGo, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, autonomous vehicles, backpropagation, Basel III, bitcoin, Black Monday: stock market crash in 1987, blockchain, brain emulation, Brexit referendum, Cambridge Analytica, Charles Babbage, Clapham omnibus, cognitive dissonance, Computing Machinery and Intelligence, corporate governance, corporate social responsibility, correlation does not imply causation, crowdsourcing, data science, deep learning, DeepMind, Demis Hassabis, distributed ledger, don't be evil, Donald Trump, driverless car, easy for humans, difficult for computers, effective altruism, Elon Musk, financial exclusion, financial innovation, friendly fire, future of work, hallucination problem, hive mind, Internet of things, iterative process, job automation, John Markoff, John von Neumann, Loebner Prize, machine readable, machine translation, medical malpractice, Nate Silver, natural language processing, Nick Bostrom, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, nudge unit, obamacare, off grid, OpenAI, paperclip maximiser, pattern recognition, Peace of Westphalia, Philippa Foot, race to the bottom, Ray Kurzweil, Recombinant DNA, Rodney Brooks, self-driving car, Silicon Valley, Stanislav Petrov, Stephen Hawking, Steve Wozniak, strong AI, technological singularity, Tesla Model S, The Coming Technological Singularity, The Future of Employment, The Signal and the Noise by Nate Silver, trolley problem, Turing test, Vernor Vinge

Perez summed up these developments: So not only are researcher[s] who hand optimize gradient descent solutions out of business, so are folks who make a living designing neural architectures! This is actually just the beginning of Deep Learning systems just bootstrapping themselves… This is absolutely shocking and there’s really no end in sight as to how quickly Deep Learning algorithms are going to improve. This meta capability allows you to apply it on itself, recursively creating better and better systems.137 As noted in Chapter 1, various companies and researchers announced in 2017 that they had created AI software which could itself develop further AI software.138 In May 2017, Google demonstrated a meta-learning technology called AutoML.

See Cade Metz, “Google’s Dueling Neural Networks Spar to Get Smarter, No Humans Required”, Wired, 4 November 2017, https://www.wired.com/2017/04/googlesdueling-neural-networks-spar-get-smarter-no-humans-required/, accessed 16 August 2018. 127Yann LeCun, “Answer to Question: What are Some Recent and Potentially Upcoming Breakthroughs in Deep Learning?”, Quora, 28 July 2016, https://www.quora.com/What-are-some-recent-and-potentially-upcoming-breakthroughs-in-deep-learning, accessed 16 August 2018. 128Andrea Bertolini, “Robots as Products: The Case for a Realistic Analysis of Robotic Applications and Liability Rules”, Law Innovation and Technology, Vol. 5, No. 2 (2013), 214–247, 234–235. 129See Chapter 1 at s. 5 and FN 111.

id=rJY0-Kcll, accessed 1 June 2018. 136Andrew Ng, Jiquan Ngiam, Chuan Yu Foo, Yifan Mai, Caroline Suen, Adam Coates, Andrew Maas, Awni Hannun, Brody Huval, Tao Wang, and Sameep Tando, “Optimization: Stochastic Gradient Descent”, Stanford UFLDL Tutorial, http://ufldl.stanford.edu/tutorial/supervised/OptimizationStochasticGradientDescent/, accessed 1 June 2018. 137Carlos E. Perez, “Deep Learning: The Unreasonable Effectiveness of Randomness”, Medium, 6 November 2016, https://medium.com/intuitionmachine/deep-learning-the-unreasonable-effectiveness-of-randomness-14d5aef13f87, accessed 1 June 2018. 138See Chapter 1 at s. 5. 139See also Sundar Pichai, “Making AI Work for Everyone”, Google Blog, 17 May 2017, https://blog.google/topics/machine-learning/making-ai-work-for-everyone/, accessed 1 June 2018. 140At present, many AI systems require a significant amount of human fine-tuning, especially when they are produced by companies interested in achieving striking results even at a high cost in terms of resources.


pages: 245 words: 83,272

Artificial Unintelligence: How Computers Misunderstand the World by Meredith Broussard

"Susan Fowler" uber, 1960s counterculture, A Declaration of the Independence of Cyberspace, Ada Lovelace, AI winter, Airbnb, algorithmic bias, AlphaGo, Amazon Web Services, autonomous vehicles, availability heuristic, barriers to entry, Bernie Sanders, Big Tech, bitcoin, Buckminster Fuller, Charles Babbage, Chris Urmson, Clayton Christensen, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data science, deep learning, Dennis Ritchie, digital map, disruptive innovation, Donald Trump, Douglas Engelbart, driverless car, easy for humans, difficult for computers, Electric Kool-Aid Acid Test, Elon Musk, fake news, Firefox, gamification, gig economy, global supply chain, Google Glasses, Google X / Alphabet X, Greyball, Hacker Ethic, independent contractor, Jaron Lanier, Jeff Bezos, Jeremy Corbyn, John Perry Barlow, John von Neumann, Joi Ito, Joseph-Marie Jacquard, life extension, Lyft, machine translation, Mark Zuckerberg, mass incarceration, Minecraft, minimum viable product, Mother of all demos, move fast and break things, Nate Silver, natural language processing, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, One Laptop per Child (OLPC), opioid epidemic / opioid crisis, PageRank, Paradox of Choice, payday loans, paypal mafia, performance metric, Peter Thiel, price discrimination, Ray Kurzweil, ride hailing / ride sharing, Ross Ulbricht, Saturday Night Live, school choice, self-driving car, Silicon Valley, Silicon Valley billionaire, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, TechCrunch disrupt, Tesla Model S, the High Line, The Signal and the Noise by Nate Silver, theory of mind, traumatic brain injury, Travis Kalanick, trolley problem, Turing test, Uber for X, uber lyft, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, women in the workforce, work culture , yottabyte

We could make all the books available electronically and have the students access the books on their phones, because all students have phones. Wrong. Phones are great for reading short works, but long works are difficult and uncomfortable to read on a phone. Studies show that reading on a screen is worse than reading on paper in an educational context. Speed, accuracy, and deep learning all suffer when research subjects read on screens. Paper is simply a superior technology for the kind of deep learning that we want students to engage in as part of their education. Reading on a screen is fun and convenient, yes—but reading for comprehension is not about fun or convenience. It’s about learning. When it comes to learning, students generally prefer paper to screens.3 Another technochauvinist might suggest giving all the students iPads or Chromebooks or some kind of e-reader and making all the books available electronically: another good idea, but the obstacles are obvious.

“Reflecting on One Very, Very Strange Year at Uber.” Susan Fowler (blog), February 19, 2017. https://www.susanjfowler.com/blog/2017/2/19/reflecting-on-one-very-strange-year-at-uber. Gomes, Lee. “Facebook AI Director Yann LeCun on His Quest to Unleash Deep Learning and Make Machines Smarter.” IEEE Spectrum (blog), February 18, 2015. http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/facebook-ai-director-yann-lecun-on-deep-learning. Gray, Jonathan, Liliana Bounegru, and Lucy Chambers, eds. The Data Journalism Handbook: How Journalists Can Use Data to Improve the News. Sebastopol, CA: O’Reilly Media, 2012. Grazian, David. Mix It Up: Popular Culture, Mass Media, and Society. 2nd ed.

We can take the model and run new data through it to get a numerical answer that predicts something: how likely it is that a squiggle on a page is the letter A; how likely it is that a given customer will pay back the mortgage money a bank loans to him; which is the best next move to make in a game of tic-tac-toe, checkers, or chess. Machine learning, deep learning, neural networks, and predictive analytics are some of the narrow AI concepts that are currently popular. For every AI system that exists today, there is a logical explanation for how it works. Understanding the computational logic can demystify AI, just like dismantling a computer helps to demystify hardware.
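
A minimal sketch (mine, not Broussard’s) of that “run new data through the model” step, written in Python with scikit-learn; the loan-applicant features and numbers are invented placeholders:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data: [income_in_thousands, debt_ratio] -> repaid (1) or defaulted (0).
    X_train = np.array([[45, 0.40], [80, 0.15], [30, 0.65], [95, 0.10], [52, 0.55], [70, 0.25]])
    y_train = np.array([1, 1, 0, 1, 0, 1])

    model = LogisticRegression().fit(X_train, y_train)

    # A new applicant: the model returns a numerical answer, here a probability of repayment.
    new_applicant = np.array([[60, 0.30]])
    print(model.predict_proba(new_applicant)[0, 1])

The same fit-then-score pattern sits behind the letter-recognition and game-playing examples in the excerpt, only with different features and models.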


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

"World Economic Forum" Davos, Ada Lovelace, adjacent possible, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, algorithmic management, AlphaGo, Amazon Mechanical Turk, animal electricity, autonomous vehicles, behavioural economics, Black Swan, Brexit referendum, British Empire, Cambridge Analytica, cellular automata, Charles Babbage, citizen journalism, Claude Shannon: information theory, combinatorial explosion, Computing Machinery and Intelligence, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, fake news, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Geoffrey Hinton, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, Linda problem, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, post-truth, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, Stuart Kauffman, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, warehouse robotics, women in the workforce, Yochai Benkler

Avoiding the stretched metaphors of neural networks, the mathematical functions represented now look like this: The typical word used to describe these networks is ‘deep’ and the weight-tuning algorithms employed are now called ‘deep learning’. That term is certainly justified in terms of the layers of mathematics being utilized in these deeply nested functions. In deep learning networks, layers upon layers of numerical functions (often Bell Curves, or functions that combine into other, similar, simple shapes) are being moved around fields of numbers, to produce intricately linked mathematical ‘atoms’ of emergent representations.
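
As a rough illustration of those “deeply nested functions” (my sketch, not Smith’s), here is a tiny network written as a plain composition of simple numerical functions in numpy; the layer sizes and random weights are arbitrary:

    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, W, b):
        # One layer: a weighted sum followed by a simple squashing curve (tanh).
        return np.tanh(W @ x + b)

    # Four stacked layers turn a 3-number input into a single output number.
    sizes = [3, 8, 8, 8, 1]
    params = [(rng.normal(size=(m, n)), rng.normal(size=m)) for n, m in zip(sizes[:-1], sizes[1:])]

    x = np.array([0.2, -1.0, 0.5])
    for W, b in params:   # f4(f3(f2(f1(x)))): the "deeply nested" part
        x = layer(x, W, b)
    print(x)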

The main thing that brains and connectionist algorithms share in common is opacity, but it would be a huge mistake to think that the algorithms have reached the complexity of brains. The largest known deep learning networks have something like 10^10 parameters (the numerical weights in those matrices).19 Most have far fewer, but to get a feel for their maximum size, consider that the length of a year is around 10^10 milliseconds. To imagine how much larger and more complex the human brain is than the largest deep learning neural network, imagine stretching each of the milliseconds in that year to be the length of a year themselves. That’s the comparable size of the brain (10^20), if we just compare the number of real neural synapses to the number of numbers in the biggest ever deep learning connectionist algorithm.
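
Worked out explicitly with the book’s order-of-magnitude figures (a short sketch, not from the text):

    network_params = 10**10            # the book's figure for the largest deep networks
    brain_synapses = 10**20            # the book's rough figure for the brain
    ms_per_year = 365 * 24 * 60 * 60 * 1000
    print(f"{ms_per_year:.2e}")        # about 3.15e10, i.e. roughly 10^10 milliseconds in a year
    print(brain_synapses // network_params)  # 10^10: the "stretch each millisecond into a year" factor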

New York: Wiley. 16 Ironically, while Babbage’s key innovation of introducing Jacquard loom cards were included in most tabulating machines from the time IBM standardized them in 1928, they only began to be widely used to program general-purpose computers just before Rosenblatt developed the Mark I Perceptron. 17 Unfortunately, Turing’s boss, NPL Director Sir Charles Galton Darwin (named for his grandfather Charles Darwin and great-uncle Francis Galton), dismissed the paper as a ‘schoolboy essay’, and it was only published posthumously, in 1968, 14 years after Turing was driven to suicide by the British authorities’ persecution of his crime of homosexuality. 18 M. Minsky and S. Papert, 1969, Perceptrons: An Introduction to Computational Geometry. Cambridge, MA: MIT Press. 19 Jeremy Hsu, 2015, Biggest Neural Network Ever Pushes AI Deep Learning. IEEE Spectrum, https://spectrum.ieee.org/tech-talk/computing/software/biggest-neural-network-ever-pushes-ai-deep-learning 20 Esther M. Sternberg, 2001, The Balance Within: The Science Connecting Health and Emotions. Times Books. 21 Antonio Damasio, 2018, The Strange Order of Things: Life, Feeling, and the Making of Cultures. New York: Pantheon. 22 Antonio Damasio and Gil Carvalho, 2013, The Nature of Feelings: Evolutionary and Neurobiological Origins.


pages: 208 words: 57,602

Futureproof: 9 Rules for Humans in the Age of Automation by Kevin Roose

"World Economic Forum" Davos, adjacent possible, Airbnb, Albert Einstein, algorithmic bias, algorithmic management, Alvin Toffler, Amazon Web Services, Atul Gawande, augmented reality, automated trading system, basic income, Bayesian statistics, Big Tech, big-box store, Black Lives Matter, business process, call centre, choice architecture, coronavirus, COVID-19, data science, deep learning, deepfake, DeepMind, disinformation, Elon Musk, Erik Brynjolfsson, factory automation, fake news, fault tolerance, Frederick Winslow Taylor, Freestyle chess, future of work, Future Shock, Geoffrey Hinton, George Floyd, gig economy, Google Hangouts, GPT-3, hiring and firing, hustle culture, hype cycle, income inequality, industrial robot, Jeff Bezos, job automation, John Markoff, Kevin Roose, knowledge worker, Kodak vs Instagram, labor-force participation, lockdown, Lyft, mandatory minimum, Marc Andreessen, Mark Zuckerberg, meta-analysis, Narrative Science, new economy, Norbert Wiener, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, off-the-grid, OpenAI, pattern recognition, planetary scale, plutocrats, Productivity paradox, QAnon, recommendation engine, remote working, risk tolerance, robotic process automation, scientific management, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, social distancing, Steve Jobs, Stuart Kauffman, surveillance capitalism, tech worker, The Future of Employment, The Wealth of Nations by Adam Smith, TikTok, Travis Kalanick, Uber and Lyft, uber lyft, universal basic income, warehouse robotics, Watson beat the top human players on Jeopardy!, work culture

Didn’t their fears always end up being overblown? Several years ago, when I started as a tech columnist for the Times, most of what I heard about AI mirrored my own optimistic views. I met with start-up founders and engineers in Silicon Valley who showed me how new advances in fields like deep learning were helping them build all kinds of world-improving tools: algorithms that could increase farmers’ crop yields, software that would help hospitals run more efficiently, self-driving cars that could shuttle us around while we took naps and watched Netflix. This was the euphoric peak of the AI hype cycle, a time when all of the American tech giants—Google, Facebook, Apple, Amazon, Microsoft—were pouring billions of dollars into developing new AI products and shoving machine learning algorithms into as many of their apps as possible.

A 2019 report by Wells Fargo estimated that as many as two hundred thousand finance employees will lose their jobs over the next decade, thanks to tools like these. Medicine is undergoing a machine makeover, as AI learns to do much of the work that used to require trained human specialists. In 2018, a Chinese tech company built a deep learning algorithm that diagnosed brain cancer and other diseases faster and more accurately than a team of fifteen top doctors. The same year, American researchers developed an algorithm capable of identifying malignant tumors on a CT scan with an error rate twenty times lower than a human radiologist.

In fact, several companies are already using AI to generate fashion designs. In 2017, an Amazon research team developed a machine learning algorithm that analyzes images of garments in a particular style and learns to generate new garments in that style. Glitch, an AI fashion company started by two MIT graduates, sells pieces that are entirely designed by deep learning algorithms. Will AI spare all TSA agents, or replace all fashion designers? Of course not. But the fallout from automation probably won’t be as tidy as watching some occupations go extinct while others survive without a scratch. * * * — In short, what I should have told the executives at the fancy dinner was that they were asking the wrong question.


pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World by Pedro Domingos

Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, backpropagation, basic income, Bayesian statistics, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, creative destruction, crowdsourcing, Danny Hillis, data is not the new oil, data is the new oil, data science, deep learning, DeepMind, double helix, Douglas Hofstadter, driverless car, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, Geoffrey Hinton, global village, Google Glasses, Gödel, Escher, Bach, Hans Moravec, incognito mode, information retrieval, Jeff Hawkins, job automation, John Markoff, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, large language model, lone genius, machine translation, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, Nick Bostrom, NP-complete, off grid, P = NP, PageRank, pattern recognition, phenotype, planetary scale, power law, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, scientific worldview, Second Machine Age, self-driving car, Silicon Valley, social intelligence, speech recognition, Stanford marshmallow experiment, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the long tail, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, Thomas Bayes, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight, yottabyte, zero-sum game

Learning Deep Architectures for AI,* by Yoshua Bengio (Now, 2009), is a brief introduction to deep learning. The problem of error signal diffusion in backprop is described in “Learning long-term dependencies with gradient descent is difficult,”* by Yoshua Bengio, Patrice Simard, and Paolo Frasconi (IEEE Transactions on Neural Networks, 1994). “How many computers to identify a cat? 16,000,” by John Markoff (New York Times, 2012), reports on the Google Brain project and its results. Convolutional neural networks, the current deep learning champion, are described in “Gradient-based learning applied to document recognition,”* by Yann LeCun, Léon Bottou, Yoshua Bengio, and Patrick Haffner (Proceedings of the IEEE, 1998).

Hinton, a psychologist turned computer scientist and great-great-grandson of George Boole, the inventor of the logical calculus used in all digital computers, is the world’s leading connectionist. He has tried longer and harder to understand how the brain works than anyone else. He tells of coming home from work one day in a state of great excitement, exclaiming “I did it! I’ve figured out how the brain works!” His daughter replied, “Oh, Dad, not again!” Hinton’s latest passion is deep learning, which we’ll meet later in this chapter. He was also involved in the development of backpropagation, an even better algorithm than Boltzmann machines for solving the credit-assignment problem that we’ll look at next. Boltzmann machines could solve the credit-assignment problem in principle, but in practice learning was very slow and painful, making this approach impractical for most applications.

A hard core of connectionists soldiered on, but by and large the attention of the machine-learning field moved elsewhere. (We’ll survey those lands in Chapters 6 and 7.) Today, however, connectionism is resurgent. We’re learning deeper networks than ever before, and they’re setting new standards in vision, speech recognition, drug discovery, and other areas. The new field of deep learning is on the front page of the New York Times. Look under the hood, and . . . surprise: it’s the trusty old backprop engine, still humming. What changed? Nothing much, say the critics: just faster computers and bigger data. To which Hinton and others reply: exactly, we were right all along! In truth, connectionists have made genuine progress.


pages: 561 words: 157,589

WTF?: What's the Future and Why It's Up to Us by Tim O'Reilly

"Friedman doctrine" OR "shareholder theory", 4chan, Affordable Care Act / Obamacare, Airbnb, AlphaGo, Alvin Roth, Amazon Mechanical Turk, Amazon Robotics, Amazon Web Services, AOL-Time Warner, artificial general intelligence, augmented reality, autonomous vehicles, barriers to entry, basic income, behavioural economics, benefit corporation, Bernie Madoff, Bernie Sanders, Bill Joy: nanobots, bitcoin, Blitzscaling, blockchain, book value, Bretton Woods, Brewster Kahle, British Empire, business process, call centre, Capital in the Twenty-First Century by Thomas Piketty, Captain Sullenberger Hudson, carbon tax, Carl Icahn, Chuck Templeton: OpenTable:, Clayton Christensen, clean water, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, computer vision, congestion pricing, corporate governance, corporate raider, creative destruction, CRISPR, crowdsourcing, Danny Hillis, data acquisition, data science, deep learning, DeepMind, Demis Hassabis, Dennis Ritchie, deskilling, DevOps, Didi Chuxing, digital capitalism, disinformation, do well by doing good, Donald Davies, Donald Trump, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, fake news, Filter Bubble, Firefox, Flash crash, Free Software Foundation, fulfillment center, full employment, future of work, George Akerlof, gig economy, glass ceiling, Glass-Steagall Act, Goodhart's law, Google Glasses, Gordon Gekko, gravity well, greed is good, Greyball, Guido van Rossum, High speed trading, hiring and firing, Home mortgage interest deduction, Hyperloop, income inequality, independent contractor, index fund, informal economy, information asymmetry, Internet Archive, Internet of things, invention of movable type, invisible hand, iterative process, Jaron Lanier, Jeff Bezos, jitney, job automation, job satisfaction, John Bogle, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Zimmer (Lyft cofounder), Kaizen: continuous improvement, Ken Thompson, Kevin Kelly, Khan Academy, Kickstarter, Kim Stanley Robinson, knowledge worker, Kodak vs Instagram, Lao Tzu, Larry Ellison, Larry Wall, Lean Startup, Leonard Kleinrock, Lyft, machine readable, machine translation, Marc Andreessen, Mark Zuckerberg, market fundamentalism, Marshall McLuhan, McMansion, microbiome, microservices, minimum viable product, mortgage tax deduction, move fast and break things, Network effects, new economy, Nicholas Carr, Nick Bostrom, obamacare, Oculus Rift, OpenAI, OSI model, Overton Window, packet switching, PageRank, pattern recognition, Paul Buchheit, peer-to-peer, peer-to-peer model, Ponzi scheme, post-truth, race to the bottom, Ralph Nader, randomized controlled trial, RFC: Request For Comment, Richard Feynman, Richard Stallman, ride hailing / ride sharing, Robert Gordon, Robert Metcalfe, Ronald Coase, Rutger Bregman, Salesforce, Sam Altman, school choice, Second Machine Age, secular stagnation, self-driving car, SETI@home, shareholder value, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart contracts, Snapchat, Social Responsibility of Business Is to Increase Its Profits, social web, software as a service, software patent, spectrum auction, speech recognition, Stephen Hawking, Steve Ballmer, Steve Jobs, Steven Levy, Stewart Brand, stock buybacks, strong AI, synthetic biology, TaskRabbit, telepresence, the built environment, the Cathedral and the Bazaar, The future is already here, The Future of Employment, the map is not the territory, The Nature of the Firm, The Rise 
and Fall of American Growth, The Wealth of Nations by Adam Smith, Thomas Davenport, Tony Fadell, Tragedy of the Commons, transaction costs, transcontinental railway, transportation-network company, Travis Kalanick, trickle-down economics, two-pizza team, Uber and Lyft, Uber for X, uber lyft, ubercab, universal basic income, US Airways Flight 1549, VA Linux, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, We are the 99%, web application, Whole Earth Catalog, winner-take-all economy, women in the workforce, Y Combinator, yellow journalism, zero-sum game, Zipcar

According to Google, RankBrain’s opinion has become the third most important among the more than two hundred factors that it uses to rank pages. Google has also applied deep learning to language translation. The results were so startlingly better that after a few months of testing, the team stopped all work on the old Google Translate system discussed earlier in this chapter and replaced it entirely with the new one based on deep learning. It isn’t yet quite as good as human translators, but it’s close, at least for everyday functional use, though perhaps not for literary purposes. Deep learning is also used in Google Photos. If you have tried Google Photos, you’ve seen how it can recognize objects in your photos.

See “We shape our tools and thereafter our tools shape us,” McLuhan Galaxy, April 1, 2013, https://mcluhangalaxy.wordpress.com/2013/04/01/we-shape-our-tools-and-thereafter-our-tools-shape-us/. 165 “what a typical Deep Learning system is”: Lee Gomes, “Facebook AI Director Yann LeCun on His Quest to Unleash Deep Learning and Make Machines Smarter,” IEEE Spectrum, February 28, 2015, http://spectrum.ieee.org/automaton/robotics/artificial-intelligence/facebook-ai-director-yann-lecun-on-deep-learning. 165 “can’t run it faster than real time”: Yann LeCun, Facebook post, December 5, 2016, retrieved March 31, 2017, https://m.facebook.com/story.php?story_fbid=10154017359117143&id=722677142. 165 third most important: Sullivan, “FAQ: All About the Google RankBrain Algorithm.” 165 stopped all work on the old Google Translate system: Gideon Lewis-Kraus, “The Great A.I.

The trick is to figure out in which direction to tweak each knob and by how much without actually fiddling with them. This involves computing a “gradient,” which for each knob indicates how the light changes when the knob is tweaked. Now, imagine a box with 500 million knobs, 1,000 light bulbs, and 10 million images to train it with. That’s what a typical Deep Learning system is. Deep learning uses layers of recognizers. Before you can recognize a dog, you have to be able to recognize shapes. Before you can recognize shapes, you have to be able to recognize edges, so that you can distinguish a shape from its background. These successive stages of recognition each produce a compressed mathematical representation that is passed up to the next layer.
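
A toy version of that knobs-and-gradient picture (my sketch, with a made-up error function and only five knobs); real systems compute the gradient with backpropagation rather than by nudging each knob in turn:

    import numpy as np

    rng = np.random.default_rng(1)
    knobs = rng.normal(size=5)             # 5 knobs instead of 500 million
    target = np.array([0.5, -1.0, 2.0, 0.0, 1.5])

    def error(k):                          # made-up stand-in for "how wrong the lights are"
        return float(np.sum((k - target) ** 2))

    eps, step = 1e-6, 0.1
    for _ in range(100):
        grad = np.zeros_like(knobs)
        for i in range(len(knobs)):        # nudge one knob at a time to see how the error changes
            nudged = knobs.copy()
            nudged[i] += eps
            grad[i] = (error(nudged) - error(knobs)) / eps
        knobs -= step * grad               # tweak every knob slightly in the downhill direction
    print(error(knobs))                    # close to zero after the tweaking loop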


The Ethical Algorithm: The Science of Socially Aware Algorithm Design by Michael Kearns, Aaron Roth

23andMe, affirmative action, algorithmic bias, algorithmic trading, Alignment Problem, Alvin Roth, backpropagation, Bayesian statistics, bitcoin, cloud computing, computer vision, crowdsourcing, data science, deep learning, DeepMind, Dr. Strangelove, Edward Snowden, Elon Musk, fake news, Filter Bubble, general-purpose programming language, Geoffrey Hinton, Google Chrome, ImageNet competition, Lyft, medical residency, Nash equilibrium, Netflix Prize, p-value, Pareto efficiency, performance metric, personalized medicine, pre–internet, profit motive, quantitative trading / quantitative finance, RAND corporation, recommendation engine, replication crisis, ride hailing / ride sharing, Robert Bork, Ronald Coase, self-driving car, short selling, sorting algorithm, sparse data, speech recognition, statistical model, Stephen Hawking, superintelligent machines, TED Talk, telemarketer, Turing machine, two-sided market, Vilfredo Pareto

The technical name for the algorithmic framework we have been describing is a generative adversarial network (GAN), and the approach we’ve outlined above indeed seems to be highly effective: GANs are an important component of the collection of techniques known as deep learning, which has resulted in qualitative improvements in machine learning for image classification, speech recognition, automatic natural language translation, and many other fundamental problems. (The Turing Award, widely considered the Nobel Prize of computer science, was recently awarded to Yoshua Bengio, Geoffrey Hinton, and Yann LeCun for their pioneering contributions to deep learning.) Fig. 21. Synthetic cat images created by a generative adversarial network (GAN), from https://ajolicoeur.wordpress.com/cats.
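
A compressed sketch of that generator-versus-discriminator game on one-dimensional toy data, assuming PyTorch; it illustrates the adversarial setup the authors describe and is not their code:

    import torch
    import torch.nn as nn

    real_data = lambda n: torch.randn(n, 1) * 0.5 + 3.0   # "real" samples drawn around 3.0

    G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # generator
    D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # discriminator
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    for step in range(2000):
        # 1) Train D to tell real samples (label 1) from the generator's fakes (label 0).
        real, fake = real_data(64), G(torch.randn(64, 8)).detach()
        loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()

        # 2) Train G to fool D into labelling its fakes as real.
        fake = G(torch.randn(64, 8))
        loss_g = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    print(G(torch.randn(5, 8)).detach().squeeze())   # generated samples should drift toward 3.0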

To see a particularly egregious example, let’s go back to 2015, when the market for machine learning talent was heating up. The techniques of deep learning had recently reemerged from relative obscurity (its previous incarnation was called backpropagation in neural networks, which we discussed in the introduction), delivering impressive results in computer vision and image recognition. But there weren’t yet very many experts who were good at training these algorithms—which was still more of a black art, or perhaps an artisanal craft, than a science. The result was that deep learning experts were commanding salaries and signing bonuses once reserved for Wall Street.

But money alone wasn’t enough to recruit talent—top researchers want to work where other top researchers are—so it was important for AI labs that wanted to recruit premium talent to be viewed as places that were already on the cutting edge. In the United States, this included research labs at companies such as Google and Facebook. One way to do this was to beat the big players in a high-profile competition. The ImageNet competition was perfect—focused on exactly the kind of vision task for which deep learning was making headlines. The contest required each team’s computer program to classify the objects in images into a thousand different and highly specific categories, including “frilled lizard,” “banded gecko,” “oscilloscope,” and “reflex camera.” Each team could train their algorithm on a set of 1.5 million images that the competition organizers made available to all participants.
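
For a sense of what the contest task looks like in code today, here is a hedged sketch that classifies one image into the 1,000 ImageNet categories with an off-the-shelf pretrained network (torchvision assumed; the image file name is hypothetical):

    import torch
    from torchvision import models, transforms
    from PIL import Image

    model = models.resnet50(pretrained=True).eval()    # downloads ImageNet-trained weights
    prep = transforms.Compose([
        transforms.Resize(256), transforms.CenterCrop(224), transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])

    img = prep(Image.open("lizard.jpg").convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(img), dim=1)
    print(probs.argmax().item())                       # index of the predicted class (0-999)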


pages: 347 words: 97,721

Only Humans Need Apply: Winners and Losers in the Age of Smart Machines by Thomas H. Davenport, Julia Kirby

"World Economic Forum" Davos, AI winter, Amazon Robotics, Andy Kessler, Apollo Guidance Computer, artificial general intelligence, asset allocation, Automated Insights, autonomous vehicles, basic income, Baxter: Rethink Robotics, behavioural economics, business intelligence, business process, call centre, carbon-based life, Clayton Christensen, clockwork universe, commoditize, conceptual framework, content marketing, dark matter, data science, David Brooks, deep learning, deliberate practice, deskilling, digital map, disruptive innovation, Douglas Engelbart, driverless car, Edward Lloyd's coffeehouse, Elon Musk, Erik Brynjolfsson, estate planning, financial engineering, fixed income, flying shuttle, follow your passion, Frank Levy and Richard Murnane: The New Division of Labor, Freestyle chess, game design, general-purpose programming language, global pandemic, Google Glasses, Hans Lippershey, haute cuisine, income inequality, independent contractor, index fund, industrial robot, information retrieval, intermodal, Internet of things, inventory management, Isaac Newton, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joi Ito, Khan Academy, Kiva Systems, knowledge worker, labor-force participation, lifelogging, longitudinal study, loss aversion, machine translation, Mark Zuckerberg, Narrative Science, natural language processing, Nick Bostrom, Norbert Wiener, nuclear winter, off-the-grid, pattern recognition, performance metric, Peter Thiel, precariat, quantitative trading / quantitative finance, Ray Kurzweil, Richard Feynman, risk tolerance, Robert Shiller, robo advisor, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, six sigma, Skype, social intelligence, speech recognition, spinning jenny, statistical model, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, superintelligent machines, supply-chain management, tacit knowledge, tech worker, TED Talk, the long tail, transaction costs, Tyler Cowen, Tyler Cowen: Great Stagnation, Watson beat the top human players on Jeopardy!, Works Progress Administration, Zipcar

Determining the meaning and significance of these has always been the province of human beings—and a key aspect of human cognition. But now a wide variety of tools are capable of it, too. Words are increasingly “understood”—counted, classified, interpreted, predicted, etc.—through technologies such as machine learning, natural language processing, neural networks, deep learning, and so forth. Some of the same technologies are being used to analyze and identify images. Humans are still better able to make subjective judgments on unstructured data, such as interpreting the meaning of a poem, or distinguishing between images of good neighborhoods and bad ones. But computers are making headway even on these fronts.

Machine learning and neural network analysis is the most promising technology for this application. One branch of machine learning, for example, is particularly well suited to analyzing data in multiple dimensions. Images and video are an example of this type of data—any individual pixel has x and y coordinates, color, intensity, and in videos, time. “Deep learning” neural network approaches have been developed to deal with data in multiple dimensions; the “deep” refers not to “profound,” but rather to a hierarchy of dimensions in the data. It’s this technology that is letting Google engineers identify photos of cats on the Internet. Although it’s difficult to imagine more important tasks than that, perhaps in the near future it will let smart machines watch video taken by drones and security cameras and determine whether something bad is happening.
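
A small sketch of what “data in multiple dimensions” means in practice (my example, not the authors’): an image is an array indexed by position and colour channel, and video adds a time axis:

    import numpy as np

    image = np.zeros((480, 640, 3))        # height x width x colour channel (R, G, B)
    video = np.zeros((300, 480, 640, 3))   # time (frames) x height x width x colour

    pixel = image[100, 200]                # pick one pixel by its row and column...
    print(pixel.shape)                     # ...and it still carries 3 colour intensities: (3,)

    frame = video[0]                       # one moment in time is itself a full image
    print(frame.shape)                     # (480, 640, 3)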

There are already, as we have discussed, systems to understand text and speech, systems to engage in intelligent Q&A with humans, and systems to recognize a variety of images. It’s just that they’re not yet embedded in the brain of a robot. Jim Lawton, the head of products at Rethink Robotics, commented to us in an interview: “An important area of experimentation today is around the intersection of collaborative robots, big data, and deep learning. The goal is to combine automation of physical tasks and cognitive tasks. For example, a robot could start combining all the information about how much torque is applied in a screw. Robots are, after all, a big bucket of sensors. A truly intelligent robot could begin to see what works in terms of how much torque in a screw leads to field failures.


pages: 447 words: 111,991

Exponential: How Accelerating Technology Is Leaving Us Behind and What to Do About It by Azeem Azhar

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, 23andMe, 3D printing, A Declaration of the Independence of Cyberspace, Ada Lovelace, additive manufacturing, air traffic controllers' union, Airbnb, algorithmic management, algorithmic trading, Amazon Mechanical Turk, autonomous vehicles, basic income, Berlin Wall, Bernie Sanders, Big Tech, Bletchley Park, Blitzscaling, Boeing 737 MAX, book value, Boris Johnson, Bretton Woods, carbon footprint, Chris Urmson, Citizen Lab, Clayton Christensen, cloud computing, collective bargaining, computer age, computer vision, contact tracing, contact tracing app, coronavirus, COVID-19, creative destruction, crowdsourcing, cryptocurrency, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, David Graeber, David Ricardo: comparative advantage, decarbonisation, deep learning, deglobalization, deindustrialization, dematerialisation, Demis Hassabis, Diane Coyle, digital map, digital rights, disinformation, Dissolution of the Soviet Union, Donald Trump, Double Irish / Dutch Sandwich, drone strike, Elon Musk, emotional labour, energy security, Fairchild Semiconductor, fake news, Fall of the Berlin Wall, Firefox, Frederick Winslow Taylor, fulfillment center, future of work, Garrett Hardin, gender pay gap, general purpose technology, Geoffrey Hinton, gig economy, global macro, global pandemic, global supply chain, global value chain, global village, GPT-3, Hans Moravec, happiness index / gross national happiness, hiring and firing, hockey-stick growth, ImageNet competition, income inequality, independent contractor, industrial robot, intangible asset, Jane Jacobs, Jeff Bezos, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, Just-in-time delivery, Kickstarter, Kiva Systems, knowledge worker, Kodak vs Instagram, Law of Accelerating Returns, lockdown, low skilled workers, lump of labour, Lyft, manufacturing employment, Marc Benioff, Mark Zuckerberg, megacity, Mitch Kapor, Mustafa Suleyman, Network effects, new economy, NSO Group, Ocado, offshore financial centre, OpenAI, PalmPilot, Panopticon Jeremy Bentham, Peter Thiel, Planet Labs, price anchoring, RAND corporation, ransomware, Ray Kurzweil, remote working, RFC: Request For Comment, Richard Florida, ride hailing / ride sharing, Robert Bork, Ronald Coase, Ronald Reagan, Salesforce, Sam Altman, scientific management, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, software as a service, Steve Ballmer, Steve Jobs, Stuxnet, subscription business, synthetic biology, tacit knowledge, TaskRabbit, tech worker, The Death and Life of Great American Cities, The Future of Employment, The Nature of the Firm, Thomas Malthus, TikTok, Tragedy of the Commons, Turing machine, Uber and Lyft, Uber for X, uber lyft, universal basic income, uranium enrichment, vertical integration, warehouse automation, winner-take-all economy, workplace surveillance , Yom Kippur War

AlexNet had a success rate as high as 87 per cent. Deep learning worked. The triumph of deep learning sparked an AI feeding frenzy. Scientists rushed to build artificial intelligence systems, applying deep neural networks and their derivatives to a vast array of problems: from spotting manufacturing defects to translating between languages; from voice recognition to detecting credit card fraud; from discovering new medicines to recommending the next video we should watch. Investors opened their pocketbooks eagerly to back these inventors. In short order, deep learning was everywhere. As a result, neural networks demanded increasing amounts of data and processing power.

By 2010, Moore’s Law had resulted in enough power to facilitate a new kind of machine learning, ‘deep learning’, which involved creating layers of artificial neurons modelled on the cells that underpin human brains. These ‘neural networks’ had long been heralded as the next big thing in AI. Yet they had been stymied by a lack of computational power. Not any more, however. In 2012, a group of leading AI researchers – Alex Krizhevsky, Ilya Sutskever and Geoffrey Hinton – developed a ‘deep convolutional neural network’ which applied deep learning to the kinds of image-sorting tasks that AIs had long struggled with. It was rooted in extraordinary computing clout.

This giant crane-block battery depends on a particular mix of four, very well-understood technologies – the cranes, the building aggregate, the generator which converts the dropping of blocks to energy, and the shipping systems that let us move these things around. And then there’s a fifth, more unexpected technology: an automated ‘machine vision system’ using deep learning. Each crane has a set of cameras whose input is automatically processed by a computer. This computer controls the cranes and the lifting and placing of the blocks. It obviates any human operators – and it is the absence of a human operator that allows Energy Vault to hit a competitive price. Of course, technologies have always combined.


pages: 588 words: 131,025

The Patient Will See You Now: The Future of Medicine Is in Your Hands by Eric Topol

23andMe, 3D printing, Affordable Care Act / Obamacare, Anne Wojcicki, Atul Gawande, augmented reality, Big Tech, bioinformatics, call centre, Clayton Christensen, clean water, cloud computing, commoditize, computer vision, conceptual framework, connected car, correlation does not imply causation, creative destruction, crowdsourcing, dark matter, data acquisition, data science, deep learning, digital divide, disintermediation, disruptive innovation, don't be evil, driverless car, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Firefox, gamification, global village, Google Glasses, Google X / Alphabet X, Ignaz Semmelweis: hand washing, information asymmetry, interchangeable parts, Internet of things, Isaac Newton, it's over 9,000, job automation, Julian Assange, Kevin Kelly, license plate recognition, lifelogging, Lyft, Mark Zuckerberg, Marshall McLuhan, meta-analysis, microbiome, Nate Silver, natural language processing, Network effects, Nicholas Carr, obamacare, pattern recognition, personalized medicine, phenotype, placebo effect, quantum cryptography, RAND corporation, randomized controlled trial, Salesforce, Second Machine Age, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, Snapchat, social graph, speech recognition, stealth mode startup, Steve Jobs, synthetic biology, the scientific method, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, traumatic brain injury, Turing test, Uber for X, uber lyft, Watson beat the top human players on Jeopardy!, WikiLeaks, X Prize

Underlying its predictive capabilities was quite a portfolio of machine learning systems, including Bayesian nets, Markov chains, support vector machine algorithms, and genetic algorithms.[33] I won’t go into any more depth; my brain is not smart enough to understand it all, and fortunately it’s not particularly relevant to where we are going here. Another subtype of AI and machine learning,[2,20,34–48] known as deep learning, has deep importance to medicine. Deep learning is behind Siri’s ability to decode speech as well as Google Brain experiments to recognize images. Researchers at Google X extracted ten million still images from YouTube videos and fed them to the network of one thousand computers to see what the Brain, with its one million simulated neurons and one billion simulated synapses, would come up with on its own.[35,36] The answer—cats.

M. van Rijmenam, “How Machine Learning Could Result in Great Applications for Your Business,” Big Data-Startups Blog, January 10, 2014, http://www.bigdata-startups.com/machine-learning-result-great-applications-business/. 35. N. Jones, “The Learning Machines,” Nature 505 (2014): 146–148. 36. J. Markoff, “Scientists See Promise in Deep-Learning Programs,” New York Times, November 24, 2012, http://www.nytimes.com/2012/11/24/science/scientists-see-advances-in-deep-learning-a-part-of-artificial-intelligence.html. 37. “Don’t Be Evil, Genius,” The Economist, February 1, 2014, http://www.economist.com/node/21595462/print. 38. J. Pearson, “Superintelligent AI Could Wipe Out Humanity, If We’re Not Ready for It,” Motherboard, April 23, 2014, http://motherboard.vice.com/read/super-intelligent-ai-could-wipe-out-humanity-if-were-not-ready-for-it. 39.

McAfee, “The Dawn of the Age of Artificial Intelligence,” The Atlantic, February 2014, http://www.theatlantic.com/business/print/2014/02/the-dawn-of-the-age-of-artificial-intelligence/283730/. 40. S. Schneider, “The Philosophy of ‘Her,’” New York Times, March 2, 2014, http://opinionator.blogs.nytimes.com/2014/03/02/the-philosophy-of-her/?ref=opinion. 41. A. Vance, “The Race to Buy the Human Brains Behind Deep Learning Machines,” Bloomberg Businessweek, January 27, 2014, http://www.businessweek.com/printer/articles/180155-the-race-to-buy-the-human-brains-behind-deep-learning-machines. 42. G. Satell, “Why the Future of Technology Is All Too Human,” Forbes, February 23, 2014, http://www.forbes.com/sites/gregsatell/2014/02/23/why-the-future-of-technology-is-all-too-human/. 43. D. Auerbach, “A.I.


Virtual Competition by Ariel Ezrachi, Maurice E. Stucke

"World Economic Forum" Davos, Airbnb, Alan Greenspan, Albert Einstein, algorithmic management, algorithmic trading, Arthur D. Levinson, barriers to entry, behavioural economics, cloud computing, collaborative economy, commoditize, confounding variable, corporate governance, crony capitalism, crowdsourcing, Daniel Kahneman / Amos Tversky, David Graeber, deep learning, demand response, Didi Chuxing, digital capitalism, disintermediation, disruptive innovation, double helix, Downton Abbey, driverless car, electricity market, Erik Brynjolfsson, Evgeny Morozov, experimental economics, Firefox, framing effect, Google Chrome, independent contractor, index arbitrage, information asymmetry, interest rate derivative, Internet of things, invisible hand, Jean Tirole, John Markoff, Joseph Schumpeter, Kenneth Arrow, light touch regulation, linked data, loss aversion, Lyft, Mark Zuckerberg, market clearing, market friction, Milgram experiment, multi-sided market, natural language processing, Network effects, new economy, nowcasting, offshore financial centre, pattern recognition, power law, prediction markets, price discrimination, price elasticity of demand, price stability, profit maximization, profit motive, race to the bottom, rent-seeking, Richard Thaler, ride hailing / ride sharing, road to serfdom, Robert Bork, Ronald Reagan, search costs, self-driving car, sharing economy, Silicon Valley, Skype, smart cities, smart meter, Snapchat, social graph, Steve Jobs, sunk-cost fallacy, supply-chain management, telemarketer, The Chicago School, The Myth of the Rational Market, The Wealth of Nations by Adam Smith, too big to fail, transaction costs, Travis Kalanick, turn-by-turn navigation, two-sided market, Uber and Lyft, Uber for X, uber lyft, vertical integration, Watson beat the top human players on Jeopardy!, women in the workforce, yield management

Antonio Regalado, “Is Google Cornering the Market on Deep Learning?” MIT Technology Review, January 29, 2014, http://www.technologyreview.com/news/524026/is-google-cornering-the-market-on-deep-learning/; Nicola Jones, “Computer Science: The Learning Machines,” Nature, January 8, 2014, http://www.nature.com/news/computer-science-the-learning-machines-1.14481. European Data Protection Supervisor, Towards a New Digital Ethics: Data, Dignity and Technology, Opinion 4/2015 (September 11, 2015), 9. Robert D. Hof, “Deep Learning,” MIT Technology Review, April 23, 2013, http://www.technologyreview.com/featuredstory/513696/deep-learning/. Tereza Pultarova, “Jaguar Land Rover to Lead Driverless Car Research,” E&T (October 9, 2015), http://eandt.theiet.org/news/2015/oct/jaguar-land-rover-driverless-cars.cfm; David Talbot, “CES 2015: Nvidia Demos a Car Computer Trained with ‘Deep Learning,’” MIT Technology Review, January 6, 2015, http://www.technologyreview.com/news/533936/ces-2015-nvidia-demos-a-car-computer-trained-with-deep-learning/; David Levitin, 2015.

Recent years have witnessed groundbreaking research and progress in the design and development of smart, self-learning algorithms to assist in pricing decisions, planning, trade, and logistics. The field has attracted significant investment in deep learning by leading market players.38 In 2011, International Business Machines Corp.’s Jeopardy!-winning Watson computer showcased the power of its deep-learning techniques, which enabled the computer to optimize its strategy following trials and feedback.39 Since then, IBM has invested in widening the capacity and functionality of the technology, with the aim of making it “the equivalent of a computing operating system for an emerging class of data-fueled artificialintelligence applications.”40 Recently, the launch of the Deep Q network by Google showcased enhanced self-learning capacity.
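
A toy illustration of “optimizing its strategy following trials and feedback” (my sketch, not IBM’s or Google’s systems, and tabular rather than deep): a Q-learning-style value estimate for each candidate price, updated from simulated sales; the demand model is invented:

    import random

    prices = [8.0, 9.0, 10.0, 11.0, 12.0]
    value = {p: 0.0 for p in prices}      # estimated profit per candidate price
    alpha, epsilon = 0.1, 0.2

    def simulated_profit(price):          # hypothetical market: demand falls as price rises
        demand = max(0.0, 20.0 - 1.5 * price) + random.gauss(0, 0.5)
        return (price - 6.0) * demand     # assumes a unit cost of 6.0

    random.seed(0)
    for trial in range(5000):
        if random.random() < epsilon:     # explore: occasionally try a random price
            p = random.choice(prices)
        else:                             # exploit: charge the best-looking price so far
            p = max(value, key=value.get)
        value[p] += alpha * (simulated_profit(p) - value[p])   # learn from the feedback

    print(max(value, key=value.get))      # the price the algorithm has learned to favour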

Tereza Pultarova, “Jaguar Land Rover to Lead Driverless Car Research,” E&T (October 9, 2015), http://eandt.theiet.org/news/2015/oct/jaguar-land-rover-driverless-cars.cfm; David Talbot, “CES 2015: Nvidia Demos a Car Computer Trained with ‘Deep Learning,’” MIT Technology Review, January 6, 2015, http://www.technologyreview.com/news/533936/ces-2015-nvidia-demos-a-car-computer-trained-with-deep-learning/; David Levitin, 2015. “The Sum of Human Knowledge,” Wall Street Journal, September 18, 2015, http://www.wsj.com/articles/the-sum-of-human-knowledge-1442610803. Lohr, “IBM’s AI System Watson to Get Second Home.” European Data Protection Supervisor, Towards a New Digital Ethics.


pages: 371 words: 108,317

The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future by Kevin Kelly

A Declaration of the Independence of Cyberspace, Aaron Swartz, AI winter, Airbnb, Albert Einstein, Alvin Toffler, Amazon Web Services, augmented reality, bank run, barriers to entry, Baxter: Rethink Robotics, bitcoin, blockchain, book scanning, Brewster Kahle, Burning Man, cloud computing, commoditize, computer age, Computer Lib, connected car, crowdsourcing, dark matter, data science, deep learning, DeepMind, dematerialisation, Downton Abbey, driverless car, Edward Snowden, Elon Musk, Filter Bubble, Freestyle chess, Gabriella Coleman, game design, Geoffrey Hinton, Google Glasses, hive mind, Howard Rheingold, index card, indoor plumbing, industrial robot, Internet Archive, Internet of things, invention of movable type, invisible hand, Jaron Lanier, Jeff Bezos, job automation, John Markoff, John Perry Barlow, Kevin Kelly, Kickstarter, lifelogging, linked data, Lyft, M-Pesa, machine readable, machine translation, Marc Andreessen, Marshall McLuhan, Mary Meeker, means of production, megacity, Minecraft, Mitch Kapor, multi-sided market, natural language processing, Netflix Prize, Network effects, new economy, Nicholas Carr, off-the-grid, old-boy network, peer-to-peer, peer-to-peer lending, personalized medicine, placebo effect, planetary scale, postindustrial economy, Project Xanadu, recommendation engine, RFID, ride hailing / ride sharing, robo advisor, Rodney Brooks, self-driving car, sharing economy, Silicon Valley, slashdot, Snapchat, social graph, social web, software is eating the world, speech recognition, Stephen Hawking, Steven Levy, Ted Nelson, TED Talk, The future is already here, the long tail, the scientific method, transport as a service, two-sided market, Uber for X, uber lyft, value engineering, Watson beat the top human players on Jeopardy!, WeWork, Whole Earth Review, Yochai Benkler, yottabyte, zero-sum game

It can take many millions of these nodes (each one producing a calculation feeding others around it), stacked up to 15 levels high, to recognize a human face. In 2006, Geoff Hinton, then at the University of Toronto, made a key tweak to this method, which he dubbed “deep learning.” He was able to mathematically optimize results from each layer so that the learning accumulated faster as it proceeded up the stack of layers. Deep-learning algorithms accelerated enormously a few years later when they were ported to GPUs. The code of deep learning alone is insufficient to generate complex logical thinking, but it is an essential component of all current AIs, including IBM’s Watson; DeepMind, Google’s search engine; and Facebook’s algorithms.

thousand games of chess: Personal correspondence with Daylen Yang (author of the Stockfish chess app), Stefan Meyer-Kahlen (developed the multiple award-winning computer chess program Shredder), and Danny Kopec (American chess International Master and cocreator of one of the standard computer chess testing systems), September 2014. “akin to building a rocket ship”: Caleb Garling, “Andrew Ng: Why ‘Deep Learning’ Is a Mandate for Humans, Not Just Machines,” Wired, May 5, 2015. In 2006, Geoff Hinton: Kate Allen, “How a Toronto Professor’s Research Revolutionized Artificial Intelligence,” Toronto Star, April 17, 2015. he dubbed “deep learning”: Yann LeCun, Yoshua Bengio, and Geoffrey Hinton, “Deep Learning,” Nature 521, no. 7553 (2015): 436–44. the network effect: Carl Shapiro and Hal R. Varian, Information Rules: A Strategic Guide to the Network Economy (Boston: Harvard Business Review Press, 1998).

Every time you type a query, click on a search-generated link, or create a link on the web, you are training the Google AI. When you type “Easter Bunny” into the image search bar and then click on the most Easter Bunny–looking image, you are teaching the AI what an Easter Bunny looks like. Each of the 3 billion queries that Google conducts each day tutors the deep-learning AI over and over again. With another 10 years of steady improvements to its AI algorithms, plus a thousandfold more data and a hundred times more computing resources, Google will have an unrivaled AI. In a quarterly earnings conference call in the fall of 2015, Google CEO Sundar Pichai stated that AI was going to be “a core transformative way by which we are rethinking everything we are doing. . . .


pages: 413 words: 119,587

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots by John Markoff

A Declaration of the Independence of Cyberspace, AI winter, airport security, Andy Rubin, Apollo 11, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, Baxter: Rethink Robotics, Bill Atkinson, Bill Duvall, bioinformatics, Boston Dynamics, Brewster Kahle, Burning Man, call centre, cellular automata, Charles Babbage, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, cognitive load, collective bargaining, computer age, Computer Lib, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deep learning, DeepMind, deskilling, Do you want to sell sugared water for the rest of your life?, don't be evil, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, Dr. Strangelove, driverless car, dual-use technology, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, Evgeny Morozov, factory automation, Fairchild Semiconductor, Fillmore Auditorium, San Francisco, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, General Magic , Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Grace Hopper, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, Hans Moravec, haute couture, Herbert Marcuse, hive mind, hype cycle, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Ivan Sutherland, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, Jeff Hawkins, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, John von Neumann, Kaizen: continuous improvement, Kevin Kelly, Kiva Systems, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, military-industrial complex, Mitch Kapor, Mother of all demos, natural language processing, Neil Armstrong, new economy, Norbert Wiener, PageRank, PalmPilot, pattern recognition, Philippa Foot, pre–internet, RAND corporation, Ray Kurzweil, reality distortion field, Recombinant DNA, Richard Stallman, Robert Gordon, Robert Solow, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, Seymour Hersh, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Strategic Defense Initiative, strong AI, superintelligent machines, tech worker, technological singularity, Ted Nelson, TED Talk, telemarketer, telepresence, telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Tony Fadell, trolley problem, Turing test, Vannevar Bush, Vernor Vinge, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, William Shockley: the traitorous eight, zero-sum game

As a consultant Hinton had introduced the deep learning neural net approach early on at Microsoft, and he was vindicated in 2012, when Microsoft’s head of research Richard Rashid gave a lecture in a vast auditorium in Tianjin, China. As the research executive spoke in English he paused after each sentence, which was then immediately translated by software into spoken Chinese in a simulation of his own voice. At the end of the talk, there was silence and then stunned applause from the audience. The demonstration hadn’t been perfect, but by adding deep learning algorithm techniques the company had adopted from Hinton’s research, it had been able to reduce recognition errors by more than 30 percent.

Thinking of themselves as the “three musketeers,” Hinton, LeCun, and Bengio set out to change that. Beginning in 2004 they embarked on a “conspiracy”—in LeCun’s words—to boost the popularity of the networks, complete with a rebranding campaign offering more alluring concepts of the technology such as “deep learning” and “deep belief nets.” LeCun had by this time moved to New York University, partly for closer ties with neuroscientists and with researchers applying machine-learning algorithms to the problem of vision. Hinton approached a Canadian foundation, the Canadian Institute for Advanced Research, for support to organize a research effort in the field and to hold several workshops each year.

It had a broad portfolio of research projects, stretching from Thrun’s work in autonomous cars to efforts to scale up neural networks, loosely identified as “brain” projects, evoking a new wave of AI. The Human Brain Project was initially led by Andrew Ng, who had been a colleague with Thrun at the resurrected Stanford Artificial Intelligence Laboratory. Ng was an expert in machine learning and adept in some of the deep learning neural network techniques that Hinton and LeCun had pioneered. In 2011, he began spending time at Google building a machine vision system and the following year it had matured to the point where Google researchers presented a paper on how the network performed in an unsupervised learning experiment using YouTube videos.


Text Analytics With Python: A Practical Real-World Approach to Gaining Actionable Insights From Your Data by Dipanjan Sarkar

bioinformatics, business intelligence, business logic, computer vision, continuous integration, data science, deep learning, Dr. Strangelove, en.wikipedia.org, functional programming, general-purpose programming language, Guido van Rossum, information retrieval, Internet of things, invention of the printing press, iterative process, language acquisition, machine readable, machine translation, natural language processing, out of africa, performance metric, premature optimization, recommendation engine, self-driving car, semantic web, sentiment analysis, speech recognition, statistical model, text mining, Turing test, web application

Dipanjan’s interests include learning about new technology, financial markets, disruptive startups, data science, and more recently, artificial intelligence and deep learning. In his spare time he loves reading, gaming, and watching popular sitcoms and football. About the Technical Reviewer Shanky Sharma Currently leading the AI team at Nextremer India, Shanky Sharma’s work entails implementing various AI and machine learning–related projects and working on deep learning for speech recognition in Indic languages. He hopes to grow and scale new horizons in AI and machine learning technologies. Statistics intrigue him and he loves playing with numbers, designing algorithms, and giving solutions to people.

Besides these, there are several other frameworks and libraries that are not dedicated towards text analytics but that are useful when you want to use machine learning techniques on textual data. These include the scikit-learn, numpy, and scipy stack. Besides these, deep learning and tensor-based libraries like theano, tensorflow, and keras also come in handy if you want to build advanced deep learning models based on deep neural nets, convnets, and LSTM-based models . You can install most of these libraries using the pip install <library> command from the command prompt or terminal. We will talk about any caveats if present in the upcoming chapters when we use these libraries.
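
As a concrete example of the LSTM-based models the passage mentions, here is a minimal keras sketch (tensorflow backend assumed, installed with pip install tensorflow); the vocabulary size, sequence length, and training data are placeholders:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Embedding, LSTM, Dense

    vocab_size, max_len = 5000, 100

    model = Sequential([
        Embedding(vocab_size, 64),           # integer word ids -> dense vectors
        LSTM(32),                            # read the word sequence left to right
        Dense(1, activation="sigmoid"),      # e.g. positive vs. negative sentiment
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Placeholder data: rows of integer word ids padded or truncated to max_len.
    X = np.random.randint(0, vocab_size, size=(200, max_len))
    y = np.random.randint(0, 2, size=(200,))
    model.fit(X, y, epochs=2, batch_size=32)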

Besides being a multipurpose language, the wide variety of frameworks, libraries, and platforms that have been developed by using Python and to be used for Python form a complete robust ecosystem around Python. These libraries make life easier by giving us a wide variety of capabilities and functionality to perform various tasks with minimal code. Some examples would be libraries for handling databases, text data, machine learning, signal processing, image processing, deep learning, artificial intelligence—and the list goes on. Open source: As open source, Python is actively developed and updated constantly with improvements, optimizations, and new features. Now the Python Software Foundation (PSF) owns all Python-related intellectual property (IP) and administers all license-related issues.


pages: 287 words: 95,152

The Dawn of Eurasia: On the Trail of the New World Order by Bruno Macaes

active measures, Berlin Wall, Brexit referendum, British Empire, computer vision, deep learning, Deng Xiaoping, different worldview, digital map, Donald Trump, energy security, European colonialism, eurozone crisis, failed state, Francis Fukuyama: the end of history, gentrification, geopolitical risk, global value chain, illegal immigration, intermodal, iterative process, land reform, liberal world order, Malacca Straits, mass immigration, megacity, middle-income trap, open borders, Parag Khanna, savings glut, scientific worldview, Silicon Valley, South China Sea, speech recognition, Suez canal 1869, The Brussels Effect, trade liberalization, trade route, Transnistria, young professional, zero-sum game, éminence grise

The comparison forces itself upon the visitor to Baidu, who is greeted in the garden and then the lobby by massive slides connecting to the upper levels, the symbol of fast-paced internet companies worldwide. The Institute of Deep Learning is one of the ways that Baidu, the Chinese search giant, is trying to keep ahead of the innovation pack, notably by making sure that the latest technological advances can quickly be used across its different businesses, including the core search algorithm. Deep learning is an old idea in artificial intelligence and many think it is our best hope for building software that will come very near to – and in some cases surpass – human abilities.

In this, machine intelligence comes to resemble the way a large array of neurons works in the human brain. Speech and image recognition are among the most immediate applications of deep learning. Yuanqing told me how Baidu had been able to develop practically infallible speech recognition applications, even if the user chooses to whisper to his device rather than speak. They were now concentrating their efforts on how to apply deep learning to automated driving. Applying it to prediction systems still lies considerably in the future, but the future is getting closer each day. ‘How would you describe what is different about the way the Chinese approach technology?’

‘It learns to recognize speech and the learning process works equally for every language. Feed it the data and it will learn Latin or Sanskrit. Some algorithms have even invented their own languages.’ I had come to the Baidu Technology Park in the Haidian District of Beijing to meet Yuanqing Lin, Director of the Baidu Institute of Deep Learning. The complex consists of five individual buildings connected by gallery bridges and overlooking a central botanical garden where small streams of water and recently planted trees gradually reveal a space capsule in the centre. It is easy to get lost here. Haidian hosts a number of technology parks, each with tens or hundreds of both established companies and start-ups, a scale perhaps still half of Silicon Valley but fast approaching it.


pages: 326 words: 88,968

The Science and Technology of Growing Young: An Insider's Guide to the Breakthroughs That Will Dramatically Extend Our Lifespan . . . And What You Can Do Right Now by Sergey Young

23andMe, 3D printing, Albert Einstein, artificial general intelligence, augmented reality, basic income, Big Tech, bioinformatics, Biosphere 2, brain emulation, caloric restriction, caloric restriction, Charles Lindbergh, classic study, clean water, cloud computing, cognitive bias, computer vision, coronavirus, COVID-19, CRISPR, deep learning, digital twin, diversified portfolio, Doomsday Clock, double helix, Easter island, Elon Musk, en.wikipedia.org, epigenetics, European colonialism, game design, Gavin Belson, George Floyd, global pandemic, hockey-stick growth, impulse control, Internet of things, late capitalism, Law of Accelerating Returns, life extension, lockdown, Lyft, Mark Zuckerberg, meta-analysis, microbiome, microdosing, moral hazard, mouse model, natural language processing, personalized medicine, plant based meat, precision agriculture, radical life extension, Ralph Waldo Emerson, Ray Kurzweil, Richard Feynman, ride hailing / ride sharing, Ronald Reagan, self-driving car, seminal paper, Silicon Valley, stem cell, Steve Jobs, tech billionaire, TED Talk, uber lyft, ultra-processed food, universal basic income, Virgin Galactic, Vision Fund, X Prize

Now this algorithm is hard at work for the British National Health Service, proactively analyzing data livestreamed from outpatients. When a potential flare-up is indicated by KenSci’s AI, doctors are alerted so that they can intervene before the situation becomes an emergency.

2. AI Case Study #2: Deep Learning and Computer Vision for Diagnosis

Diabetic retinopathy (DR) is a complication of diabetes. Over time, excess sugar damages tiny blood vessels connected to the retina. The body grows new blood vessels, but they rupture easily. Untreated, this eventually leads to total blindness. If caught early, it is highly treatable, but it is fairly asymptomatic at first, and ophthalmologists who can identify the disease are rare.

With thirteen million people in his home country of India suffering from DR, Google scientist Varun Gulshan knew there must be a better way to diagnose and treat the disease using AI. His team first obtained one million retinal scans that had already been analyzed and graded by ophthalmologists. They then used the AI techniques of deep learning and computer vision to teach their algorithm to recognize DR, just like a qualified ophthalmologist would. Today, the shortage of doctors to monitor the retinal condition of diabetic patients is less of a problem, thanks to AI.

3. AI Case Study #3: Natural Language Processing and Taking AI Health Care to the Next Level

Using AI to analyze raw data and even images is one thing.

Medical records can be monitored to identify patterns associated with negative health events like hospital-acquired infections, heart attacks, and so on. On a pretty narrow basis, NLP can already perform the detailed analysis I described above to enhance physician decision making. In time, AI will be able to combine computer vision, deep learning, natural language processing, and other techniques to provide extremely reliable diagnostic outcomes. It will take all of the guesswork and inconsistency out of medical care and make our old one-size-fits-all approach seem barbaric in retrospect. We have a long way to go, but within the Near Horizon of Longevity, precision medicine will become, without a great deal of hyperbole, perfect medicine.


pages: 296 words: 78,631

Hello World: Being Human in the Age of Algorithms by Hannah Fry

23andMe, 3D printing, Air France Flight 447, Airbnb, airport security, algorithmic bias, algorithmic management, augmented reality, autonomous vehicles, backpropagation, Brixton riot, Cambridge Analytica, chief data officer, computer vision, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Douglas Hofstadter, driverless car, Elon Musk, fake news, Firefox, Geoffrey Hinton, Google Chrome, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, John Markoff, Mark Zuckerberg, meta-analysis, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, pattern recognition, Peter Thiel, RAND corporation, ransomware, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, Shai Danziger, Silicon Valley, Silicon Valley startup, Snapchat, sparse data, speech recognition, Stanislav Petrov, statistical model, Stephen Hawking, Steven Levy, systematic bias, TED Talk, Tesla Model S, The Wisdom of Crowds, Thomas Bayes, trolley problem, Watson beat the top human players on Jeopardy!, web of trust, William Langewiesche, you are the product

In that perfect storm of misplaced trust and power and influence, the consequences have the potential to fundamentally alter our society. * This is paraphrased from a comment made by the computer scientist and machine-learning pioneer Andrew Ng in a talk he gave in 2015. See Tech Events, ‘GPU Technology Conference 2015 day 3: What’s Next in Deep Learning’, YouTube, 20 Nov. 2015, https://www.youtube.com/watch?v=qP9TOX8T-kI. † Simulating the brain of a worm is precisely the goal of the international science project OpenWorm. They’re hoping to artificially reproduce the network of 302 neurons found within the brain of the C. elegans worm. To put that into perspective, we humans have around 100,000,000,000 neurons.

This is another ‘machine-learning algorithm’, like the random forests we met in the ‘Justice’ chapter. It goes beyond what the operators program it to do and learns itself from the images it’s given. It’s this ability to learn that endows the algorithm with ‘artificial intelligence’. And the many layers of knobs and dials also give the network a deep structure, hence the term ‘deep learning’. Neural networks have been around since the middle of the twentieth century, but until quite recently we’ve lacked the widespread access to really powerful computers necessary to get the best out of them. The world was finally forced to sit up and take them seriously in 2012 when computer scientist Geoffrey Hinton and two of his students entered a new kind of neural network into an image recognition competition.12 The challenge was to recognize – among other things – dogs.

Their artificially intelligent algorithm blew the best of its competitors out of the water and kicked off a massive renaissance in deep learning. An algorithm that works without our knowing how it makes its decisions might sound like witchcraft, but it might not be all that dissimilar from how we learn ourselves. Consider this comparison. One team recently trained an algorithm to distinguish between photos of wolves and pet huskies. They then showed how, thanks to the way it had tuned its own dials, the algorithm wasn’t using anything to do with the dogs as clues at all.
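One simple way to expose that kind of shortcut is to ask which pixels most influenced a prediction. The wolf/husky study used its own explanation method; the sketch below is a generic gradient-saliency stand-in, assuming a Keras image classifier called `model`:

```python
# Hedged sketch: a gradient saliency map for an image classifier. Pixels with
# large gradients had the biggest influence on the top predicted class; if the
# background lights up rather than the animal, the model is using a shortcut.
import tensorflow as tf

def saliency_map(model, image):
    # image: one preprocessed image tensor of shape (1, height, width, 3)
    image = tf.convert_to_tensor(image, dtype=tf.float32)
    with tf.GradientTape() as tape:
        tape.watch(image)
        predictions = model(image)
        top_class_score = tf.reduce_max(predictions, axis=-1)
    grads = tape.gradient(top_class_score, image)
    # Collapse the colour channels to one importance value per pixel.
    return tf.reduce_max(tf.abs(grads), axis=-1)[0]
```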


pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt by Sinan Aral

Airbnb, Albert Einstein, algorithmic bias, AlphaGo, Any sufficiently advanced technology is indistinguishable from magic, AOL-Time Warner, augmented reality, behavioural economics, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, Cambridge Analytica, carbon footprint, Cass Sunstein, computer vision, contact tracing, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, cryptocurrency, data science, death of newspapers, deep learning, deepfake, digital divide, digital nomad, disinformation, disintermediation, Donald Trump, Drosophila, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Erik Brynjolfsson, experimental subject, facts on the ground, fake news, Filter Bubble, George Floyd, global pandemic, hive mind, illegal immigration, income inequality, Kickstarter, knowledge worker, lockdown, longitudinal study, low skilled workers, Lyft, Mahatma Gandhi, Mark Zuckerberg, Menlo Park, meta-analysis, Metcalfe’s law, mobile money, move fast and break things, multi-sided market, Nate Silver, natural language processing, Neal Stephenson, Network effects, performance metric, phenotype, recommendation engine, Robert Bork, Robert Shiller, Russian election interference, Second Machine Age, seminal paper, sentiment analysis, shareholder value, Sheryl Sandberg, skunkworks, Snapchat, social contagion, social distancing, social graph, social intelligence, social software, social web, statistical model, stem cell, Stephen Hawking, Steve Bannon, Steve Jobs, Steve Jurvetson, surveillance capitalism, Susan Wojcicki, Telecommunications Act of 1996, The Chicago School, the strength of weak ties, The Wisdom of Crowds, theory of mind, TikTok, Tim Cook: Apple, Uber and Lyft, uber lyft, WikiLeaks, work culture , Yogi Berra

This characterization may seem dramatic, but there is no doubt that technological innovation in the fabrication of falsity is advancing at a breakneck pace. The development of “deepfakes” is generating exceedingly convincing synthetic audio and video that is even more likely to fool us than textual fake news. Deepfake technology uses deep learning, a form of machine learning based on multilayered neural networks, to create hyperrealistic fake video and audio. If seeing is believing, then the next generation of falsity threatens to convince us more than any fake media we have seen so far. In 2018 movie director (and expert impersonator) Jordan Peele teamed up with BuzzFeed to create a deepfake video of Barack Obama calling Donald Trump a “complete and total dipshit.”

Peele added a tongue-in-cheek nod to the obvious falsity of his deepfake when he made Obama say, “Now, I would never say these things…at least not in a public address.” But what happens when the videos are not made to be obviously fake, but instead made to convincingly deceive? Deepfake technology is based on a specific type of deep learning called generative adversarial networks, or GANs, which was first developed by Ian Goodfellow while he was a graduate student at the University of Montreal. One night while drinking beer with fellow graduate students at a local watering hole, Goodfellow was confronted with a machine-learning problem that had confounded his friends: training a computer to create photos by itself.

Conventional methods were failing miserably. But that night while enjoying a few pints, Goodfellow had an epiphany. He wondered if they could solve the problem by pitting two neural networks against each other. It was the origin of GANs—a technology that Yann LeCun, former head of Facebook AI Research, dubbed “the coolest idea in deep learning in the last 20 years.” It’s also what manipulated Barack Obama to call Donald Trump a “dipshit.” GANs pit two networks against each other: a “generator,” whose job is to generate synthetic media, and a “discriminator,” whose job is to determine if the content is real or fake. The generator learns from the discriminator’s decisions and optimizes its media to create more and more convincing video and audio.
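A minimal sketch of that generator/discriminator loop, assuming Keras and a toy 28×28 image task (layer sizes, noise dimension, and training details are illustrative, not Goodfellow's original setup):

```python
# Toy GAN skeleton: a generator that turns noise into fake images and a
# discriminator that tries to tell real from fake. Sizes are assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 64

# Generator: random noise -> fake 28x28 image.
generator = keras.Sequential([
    layers.Input(shape=(latent_dim,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(28 * 28, activation="sigmoid"),
    layers.Reshape((28, 28)),
])

# Discriminator: image -> probability that it is real.
discriminator = keras.Sequential([
    layers.Input(shape=(28, 28)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Combined model for training the generator: the discriminator is frozen here,
# so only the generator updates when we reward "real" verdicts on fake images.
discriminator.trainable = False
gan = keras.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

def train_step(real_images, batch_size=32):
    # 1) Train the discriminator on a mix of real and generated images.
    noise = np.random.normal(size=(batch_size, latent_dim))
    fake_images = generator.predict(noise, verbose=0)
    x = np.concatenate([real_images[:batch_size], fake_images])
    y = np.concatenate([np.ones((batch_size, 1)), np.zeros((batch_size, 1))])
    discriminator.train_on_batch(x, y)
    # 2) Train the generator to fool the discriminator (labels flipped to "real").
    noise = np.random.normal(size=(batch_size, latent_dim))
    gan.train_on_batch(noise, np.ones((batch_size, 1)))
```

The ordering is the standard trick: the discriminator is compiled before being frozen, so it still learns during its own step while staying fixed inside the combined model.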


Human Frontiers: The Future of Big Ideas in an Age of Small Thinking by Michael Bhaskar

"Margaret Hamilton" Apollo, 3D printing, additive manufacturing, AI winter, Albert Einstein, algorithmic trading, AlphaGo, Anthropocene, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, behavioural economics, Benoit Mandelbrot, Berlin Wall, Big bang: deregulation of the City of London, Big Tech, Bletchley Park, blockchain, Boeing 747, brain emulation, Brexit referendum, call centre, carbon tax, charter city, citizen journalism, Claude Shannon: information theory, Clayton Christensen, clean tech, clean water, cognitive load, Columbian Exchange, coronavirus, cosmic microwave background, COVID-19, creative destruction, CRISPR, crony capitalism, cyber-physical system, dark matter, David Graeber, deep learning, DeepMind, deindustrialization, dematerialisation, Demis Hassabis, demographic dividend, Deng Xiaoping, deplatforming, discovery of penicillin, disruptive innovation, Donald Trump, double entry bookkeeping, Easter island, Edward Jenner, Edward Lorenz: Chaos theory, Elon Musk, en.wikipedia.org, endogenous growth, energy security, energy transition, epigenetics, Eratosthenes, Ernest Rutherford, Eroom's law, fail fast, false flag, Fellow of the Royal Society, flying shuttle, Ford Model T, Francis Fukuyama: the end of history, general purpose technology, germ theory of disease, glass ceiling, global pandemic, Goodhart's law, Google Glasses, Google X / Alphabet X, GPT-3, Haber-Bosch Process, hedonic treadmill, Herman Kahn, Higgs boson, hive mind, hype cycle, Hyperloop, Ignaz Semmelweis: hand washing, Innovator's Dilemma, intangible asset, interchangeable parts, Internet of things, invention of agriculture, invention of the printing press, invention of the steam engine, invention of the telegraph, invisible hand, Isaac Newton, ITER tokamak, James Watt: steam engine, James Webb Space Telescope, Jeff Bezos, jimmy wales, job automation, Johannes Kepler, John von Neumann, Joseph Schumpeter, Kenneth Arrow, Kevin Kelly, Kickstarter, knowledge economy, knowledge worker, Large Hadron Collider, liberation theology, lockdown, lone genius, loss aversion, Louis Pasteur, Mark Zuckerberg, Martin Wolf, megacity, megastructure, Menlo Park, Minecraft, minimum viable product, mittelstand, Modern Monetary Theory, Mont Pelerin Society, Murray Gell-Mann, Mustafa Suleyman, natural language processing, Neal Stephenson, nuclear winter, nudge unit, oil shale / tar sands, open economy, OpenAI, opioid epidemic / opioid crisis, PageRank, patent troll, Peter Thiel, plutocrats, post scarcity, post-truth, precautionary principle, public intellectual, publish or perish, purchasing power parity, quantum entanglement, Ray Kurzweil, remote working, rent-seeking, Republic of Letters, Richard Feynman, Robert Gordon, Robert Solow, secular stagnation, shareholder value, Silicon Valley, Silicon Valley ideology, Simon Kuznets, skunkworks, Slavoj Žižek, sovereign wealth fund, spinning jenny, statistical model, stem cell, Steve Jobs, Stuart Kauffman, synthetic biology, techlash, TED Talk, The Rise and Fall of American Growth, the scientific method, The Wealth of Nations by Adam Smith, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, TikTok, total factor productivity, transcontinental railway, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, uranium enrichment, We wanted flying cars, instead we got 140 characters, When a measure becomes a target, X Prize, Y Combinator

We have private companies like SpaceX but precious little crewed space flight; we have the Standard Model of physics but haven't gone beyond it; we have libraries of humanities research that no one reads; we have a decades-long war on cancer, but still have cancer. Our time comes with a litany of big ideas: blockchain, mobile social networks, supermaterials like graphene, deep learning neural networks, quantum biology, massive multiplayer online games, molecular machines, behavioural economics, algorithmic trading, gravitational wave and exoplanet astronomy, parametric architecture, e-sports, the ending of taboos around gender and sexuality, to name a few. But execution and purchase are more problematic than in the past.

5 There was a sense of ‘melancholy’, ‘existential angst’, at how it was possible for outsiders to make a jump that, in AlQuraishi's words, worked at twice the pace of regular advance, and possibly more. It was ‘an anomalous leap’ in one of the core scientific problems of our time. What did just happen? The artificial intelligence company DeepMind, part of the Alphabet group, had been quietly working on software called AlphaFold. DeepMind uses deep learning neural networks, a newly potent technique of machine learning (ML), to predict how proteins fold. These networks aim to mimic the functioning of the human brain, using layers of mathematical functions that can, by changing their weightings, appear to learn. This makes predictions which are scored against what is already known.

This is true of everything since – tools and ideas work in tandem.19 In the twenty-first century our capacity to develop big ideas will rest on the development of our tools more than any other factor. Hence the significance of AI. It is the calculus, the telescope, the compass of our time. Demis Hassabis himself makes the link explicit, calling AI a sort of general-purpose Hubble space telescope for science.20 Big ideas like AlphaFold and AlphaGo, instances of the big idea of deep learning neural networks, are steadily making a difference at the coalface. To see how AI reshapes ideas, consider the volume of data produced by contemporary experiments. At CERN, the Large Hadron Collider produces 25 gigabytes of data every second.21 NASA missions churn out more than 125 gigabytes of data every second.22 Climate scientists, particle physicists, population ecologists, derivatives traders and economic forecasters all generate and must process vast amounts of data.


pages: 368 words: 96,825

Bold: How to Go Big, Create Wealth and Impact the World by Peter H. Diamandis, Steven Kotler

3D printing, additive manufacturing, adjacent possible, Airbnb, Amazon Mechanical Turk, Amazon Web Services, Apollo 11, augmented reality, autonomous vehicles, Boston Dynamics, Charles Lindbergh, cloud computing, company town, creative destruction, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deal flow, deep learning, dematerialisation, deskilling, disruptive innovation, driverless car, Elon Musk, en.wikipedia.org, Exxon Valdez, fail fast, Fairchild Semiconductor, fear of failure, Firefox, Galaxy Zoo, Geoffrey Hinton, Google Glasses, Google Hangouts, gravity well, hype cycle, ImageNet competition, industrial robot, information security, Internet of things, Jeff Bezos, John Harrison: Longitude, John Markoff, Jono Bacon, Just-in-time delivery, Kickstarter, Kodak vs Instagram, Law of Accelerating Returns, Lean Startup, life extension, loss aversion, Louis Pasteur, low earth orbit, Mahatma Gandhi, Marc Andreessen, Mark Zuckerberg, Mars Rover, meta-analysis, microbiome, minimum viable product, move fast and break things, Narrative Science, Netflix Prize, Network effects, Oculus Rift, OpenAI, optical character recognition, packet switching, PageRank, pattern recognition, performance metric, Peter H. Diamandis: Planetary Resources, Peter Thiel, pre–internet, Ray Kurzweil, recommendation engine, Richard Feynman, ride hailing / ride sharing, risk tolerance, rolodex, Scaled Composites, self-driving car, sentiment analysis, shareholder value, Sheryl Sandberg, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart grid, SpaceShipOne, stem cell, Stephen Hawking, Steve Jobs, Steven Levy, Stewart Brand, Stuart Kauffman, superconnector, Susan Wojcicki, synthetic biology, technoutopianism, TED Talk, telepresence, telepresence robot, Turing test, urban renewal, Virgin Galactic, Wayback Machine, web application, X Prize, Y Combinator, zero-sum game

Now imagine that this same AI also has contextual understanding—meaning the system recognizes that your conversation with your friend is heading in the direction of family life—so the AI reminds you of the names of each of your friend’s family members, as well as any upcoming birthdays they might have. Behind many of the AI successes mentioned in this section is an algorithm called Deep Learning. Developed by University of Toronto’s Geoffrey Hinton for image recognition, Deep Learning has become the dominant approach in the field. And it should come as no surprise that in spring of 2013, Hinton was recruited, like Kurzweil, to join Google41—a development that will most likely lead to even faster progress. More recently, Google and NASA Ames Research Center—one of NASA’s field centers—jointly acquired a 512 qubit (quantum bit) computer manufactured by D-Wave Systems to study machine learning.

v=9pmPa_KxsAM. 40 Joann Muller, “No Hands, No Feet: My Unnerving Ride in Google’s Driverless Car,” Forbes, March 21, 2013, http://www.forbes.com/sites/joannmuller/2013/03/21/no-hands-no-feet-my-unnerving-ride-in-googles-driverless-car/. 41 Robert Hof, “10 Breakthrough Technologies 2013: Deep Learning,” MIT Technology Review, April 23, 2013, http://www.technologyreview.com/featuredstory/513696/deep-learning/. 42 Steven Levy, “Google’s Larry Page on Why Moon Shots Matter,” Wired, January 17, 2013, http://www.wired.com/2013/01/ff-qa-larry-page/all/. 43 Larry Page, “Beyond Today—Larry Page—Zeitgeist 2012.” 44 Larry Page, “Google+: Calico Announcement,” Google+, September 2013, https://plus.google.com/+LarryPage/posts/Lh8SKC6sED1. 45 Harry McCracken and Lev Grossman, “Google vs.

Instead, the point is that AI has been in a deceptive phase for the past fifty years, ever since 1956, when a bunch of top brains came together for the first time at the Dartmouth Summer Research Project44 and made a “spectacularly wrong prediction” about their ability to crack AI over a single hot New England summer. But today, couple the successes of Deep Learning and IBM’s Watson to the near-term predictions of technology oracles like Ray Kurzweil, and we find a field reaching the knee of the exponential growth curve—that is, a field ready to run wild in disruption. So what does this mean to you, the exponential entrepreneur? This is a multibillion-dollar question.


Artificial Whiteness by Yarden Katz

affirmative action, AI winter, algorithmic bias, AlphaGo, Amazon Mechanical Turk, autonomous vehicles, benefit corporation, Black Lives Matter, blue-collar work, Californian Ideology, Cambridge Analytica, cellular automata, Charles Babbage, cloud computing, colonial rule, computer vision, conceptual framework, Danny Hillis, data science, David Graeber, deep learning, DeepMind, desegregation, Donald Trump, Dr. Strangelove, driverless car, Edward Snowden, Elon Musk, Erik Brynjolfsson, European colonialism, fake news, Ferguson, Missouri, general purpose technology, gentrification, Hans Moravec, housing crisis, income inequality, information retrieval, invisible hand, Jeff Bezos, Kevin Kelly, knowledge worker, machine readable, Mark Zuckerberg, mass incarceration, Menlo Park, military-industrial complex, Nate Silver, natural language processing, Nick Bostrom, Norbert Wiener, pattern recognition, phenotype, Philip Mirowski, RAND corporation, recommendation engine, rent control, Rodney Brooks, Ronald Reagan, Salesforce, Seymour Hersh, Shoshana Zuboff, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Skype, speech recognition, statistical model, Stephen Hawking, Stewart Brand, Strategic Defense Initiative, surveillance capitalism, talking drums, telemarketer, The Signal and the Noise by Nate Silver, W. E. B. Du Bois, Whole Earth Catalog, WikiLeaks

Even from a traditional cognitivist perspective, it is possible to critique these systems for having an inductive bias that diverges wildly from people’s behavior in the same contexts. Indeed, cognitive scientists have challenged the claims made about deep learning–based systems. One study evaluated DeepMind’s systems and offered several important objections.28 For one, the Atari-playing system received the equivalent of roughly thirty-eight days’ worth of play time. This extensive training allowed the system to obtain high scores, especially in games that do not require longer-term planning. However, a person who gets only two hours of play time can beat the deep learning system in games that do require longer-term planning. More important, such systems do not acquire the same knowledge about games that people do.

People, by contrast, can flexibly adopt different goals and styles of play: if asked to play with a different goal, such as losing as quickly as possible, or reaching the next level in the game but just barely, many people have little difficulty doing so. The AlphaGo system suffers from similar limitations. It is highly tuned to the configuration of the Go game on which it was trained. If the board size were to change, for example, there would be little reason to expect AlphaGo to work without retraining. AlphaGo also reveals that these deep learning systems are not as radically empiricist as advertised. The rules of Go are built into AlphaGo, a fact that is typically glossed over. This is hard-coded, symbolic knowledge, not the blank slate that was trumpeted. Nonetheless, the idea of a radically empiricist and general system (which in actuality is confined to narrow domains) is taken to mean DeepMind’s approach is ready for grand quests.

Political and social contexts, which are generally of no interest to AI practitioners, shape how people see their world. The historical power dynamics among people can be read in photographs, although AI systems are blind to such dynamics. The blind spots can be exposed by probing vision systems in a different way from that intended by their developers. To illustrate this, I have used Google’s deep learning–based image captioning system called “Show and Tell”—representative of the systems that have been claimed to outperform people in the visual arena—to analyze a series of images.34 Show and Tell was trained on thousands of photographs and can produce a label for an image it has not processed before.


pages: 284 words: 84,169

Talk on the Wild Side by Lane Greene

Affordable Care Act / Obamacare, Albert Einstein, Boris Johnson, deep learning, Donald Trump, ending welfare as we know it, experimental subject, facts on the ground, fake news, framing effect, Google Chrome, Higgs boson, illegal immigration, invisible hand, language acquisition, Large Hadron Collider, machine translation, meta-analysis, Money creation, moral panic, natural language processing, obamacare, public intellectual, Ronald Reagan, Sapir-Whorf hypothesis, Snapchat, sparse data, speech recognition, Steven Pinker, TED Talk, Turing test, Wall-E

So the language model, seeing both “a nice man” and “an ice man” as possibilities, will rank the more common phrase, “a nice man”, as more plausible. * These big, statistics-driven approaches made language technologies a lot better, but only gradually. The next big leap came with a new kind of machine learning, called “deep learning”. Deep learning relies on digital neural networks, meant to mimic the human brain at a simple level. Virtual “neurons” are connected in several layers. As the network is fed training data, the connections between the many neurons in the various layers are strengthened or weakened, in a way scientists believe is analogous to how humans learn.
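For the statistical ranking mentioned just before the neural-network discussion, a toy bigram model makes the idea concrete (the counts below are invented for illustration):

```python
# A toy illustration of statistical language modelling: a bigram count model
# scores "a nice man" higher than "an ice man" simply because its word pairs
# are more common in the training text. The counts here are made up.
from collections import Counter

bigram_counts = Counter({
    ("a", "nice"): 900, ("nice", "man"): 400,   # assumed corpus counts
    ("an", "ice"): 30,  ("ice", "man"): 25,
})

def score(phrase):
    words = phrase.split()
    total = 1
    for pair in zip(words, words[1:]):
        # Multiply bigram counts (a crude stand-in for probabilities).
        total *= bigram_counts.get(pair, 1)
    return total

print(score("a nice man"), score("an ice man"))  # the first phrase wins
```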

That same year, Microsoft announced a speech-recognition system that made as few errors as a human transcriber. The system was powered by six neural networks, each of which tackled some parts of the problem better than others. None of these systems are perfect at the time of writing, and they almost certainly won’t be any time soon. “Deep learning” brought a sudden jump in quality in many language technologies, but it still cannot flexibly handle language like humans can. Translation and speech-recognition systems perform much better when their tasks are limited to a single domain, like medicine or law. This allows the software to make much better guesses about the appropriate output, by focusing on vocabulary and turns of phrase that are common to those domains.

Why are even the world’s best computers, programmed by the world’s best artificial-intelligence scientists, still struggling to do what children seem to do by magic? The rules-based approaches in artificial intelligence didn’t “scale up”, as the geeks say, and now it’s the turn of neural networks and deep learning to make the next round of progress. What humans may do when they learn language is combine these two approaches in ways that still elude the artificial-intelligence engineers. The 1980s saw a debate between those who believe the mind manipulates abstract symbols computationally – a bit like rules-based, “good old-fashioned” AI – and the early pioneers of digital neural networks, who thought that learning language was merely a matter of strengthening some neural connections here, and weakening some there, as the mind was presented with new data, a bit like today’s digital neural networks.


pages: 259 words: 84,261

Scary Smart: The Future of Artificial Intelligence and How You Can Save Our World by Mo Gawdat

3D printing, accounting loophole / creative accounting, AI winter, AlphaGo, anthropic principle, artificial general intelligence, autonomous vehicles, basic income, Big Tech, Black Lives Matter, Black Monday: stock market crash in 1987, butterfly effect, call centre, carbon footprint, cloud computing, computer vision, coronavirus, COVID-19, CRISPR, cryptocurrency, deep learning, deepfake, DeepMind, Demis Hassabis, digital divide, digital map, Donald Trump, Elon Musk, fake news, fulfillment center, game design, George Floyd, global pandemic, Google Glasses, Google X / Alphabet X, Law of Accelerating Returns, lockdown, microplastics / micro fibres, Nick Bostrom, off-the-grid, OpenAI, optical character recognition, out of africa, pattern recognition, Ponzi scheme, Ray Kurzweil, recommendation engine, self-driving car, Silicon Valley, smart contracts, Stanislav Petrov, Stephen Hawking, subprime mortgage crisis, superintelligent machines, TED Talk, TikTok, Turing machine, Turing test, universal basic income, Watson beat the top human players on Jeopardy!, Y2K

This was also the decade when, for the first time, statistical learning techniques were used to recognize faces in images. All of the above, however, were based on traditional computer programming, and while they delivered impressive results, they failed to offer the accuracy and scale today’s computer vision can offer, due to the advancement of Deep Learning artificial intelligence techniques, which have completely surpassed and replaced all prior methods. This intelligence did not learn to see by following a programmer’s list of instructions, but rather through the very act of seeing itself. With AI helping computers see, they can now do it much better than we do, specifically when it comes to individual tasks.

In the 1980s, efforts to revive AI, mostly led by Japan, channelled investment into research, once more leading to the development of very little real intelligence (as compared to the hype and excitement surrounding it) until it was all halted again in 1987 – again, due to an economic crisis. This is known as the second AI winter. Sporadic attempts at AI followed the economic recovery, but it wasn’t until the turn of the millennium, when we stumbled upon the biggest breakthrough in the history of AI, that we started to make real progress. This breakthrough has become known as Deep Learning. My first eye-opening exposure to the topic was through a white paper that was published by Google in 2009. The paper discussed how Google deployed a bit of its abundant computer power to run an experiment in which the machine was asked to ‘watch’ YouTube videos frame by frame and try to observe recurring patterns.

Once that pattern was labelled as a cat, the machine could easily find every single one of those felines across the hundreds of millions of videos on YouTube. It wasn’t too long afterwards that the machine could find letters, words, humans, nudity, cars and most of the other recurring entities that exist online. These neural networks, as we call them, built through Deep Learning, truly were the beginning of AI as we know it today. Everything before that can be considered almost negligible, though as I will show you in the next chapter, it was actually typical of the type of build-up needed to finally find the breakthrough. Since then, funding has flooded into the field of AI.


pages: 569 words: 156,139

Amazon Unbound: Jeff Bezos and the Invention of a Global Empire by Brad Stone

activist fund / activist shareholder / activist investor, air freight, Airbnb, Amazon Picking Challenge, Amazon Robotics, Amazon Web Services, autonomous vehicles, Bernie Sanders, big data - Walmart - Pop Tarts, Big Tech, Black Lives Matter, business climate, call centre, carbon footprint, Clayton Christensen, cloud computing, Colonization of Mars, commoditize, company town, computer vision, contact tracing, coronavirus, corporate governance, COVID-19, crowdsourcing, data science, deep learning, disinformation, disintermediation, Donald Trump, Downton Abbey, Elon Musk, fake news, fulfillment center, future of work, gentrification, George Floyd, gigafactory, global pandemic, Greta Thunberg, income inequality, independent contractor, invisible hand, Jeff Bezos, John Markoff, Kiva Systems, Larry Ellison, lockdown, Mahatma Gandhi, Mark Zuckerberg, Masayoshi Son, mass immigration, minimum viable product, move fast and break things, Neal Stephenson, NSO Group, Paris climate accords, Peter Thiel, Ponzi scheme, Potemkin village, private spaceflight, quantitative hedge fund, remote working, rent stabilization, RFID, Robert Bork, Ronald Reagan, search inside the book, Sheryl Sandberg, Silicon Valley, Silicon Valley startup, Snapchat, social distancing, SoftBank, SpaceX Starlink, speech recognition, Steve Ballmer, Steve Jobs, Steven Levy, tech billionaire, tech bro, techlash, TED Talk, Tim Cook: Apple, Tony Hsieh, too big to fail, Tragedy of the Commons, two-pizza team, Uber for X, union organizing, warehouse robotics, WeWork

Persuaded, Prasad joined to work on the problems of far-field speech recognition, but he ended up as an advocate for the deep learning model. Evi’s knowledge graphs were too regimented to be Alexa’s foundational response model; if a user says, “Play music by Sting,” such a system may think he is trying to say “bye” to the artist and get confused, Prasad later explained. By using the statistical training methods of deep learning, the system could quickly ascertain that when the sentence is uttered, the intent is almost certainly to blast “Every Breath You Take.” But Evi’s Tunstall-Pedoe argued that knowledge graphs were the more practical solution and mistrusted the deep learning approach. He felt it was error-prone and would require an endless diet of training data to properly mold Alexa’s learning models.

Tunstall-Pedoe said he had to fight with colleagues in the U.S. over the unusual idea of having Alexa respond to such social cues, recalling that “People were uncomfortable with the idea of programming a machine to respond to ‘hello.’ ” Integrating Evi’s technology helped Alexa respond to factual queries, such as requests to name the planets in the solar system, and it gave the impression that Alexa was smart. But was it? Proponents of another method of natural language understanding, called deep learning, believed that Evi’s knowledge graphs wouldn’t give Alexa the kind of authentic intelligence that would satisfy Bezos’s dream of a versatile assistant that could talk to users and answer any question. In the deep learning method, machines were fed large amounts of data about how people converse and what responses proved satisfying, and then were programmed to train themselves to predict the best answers.
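In miniature, that statistical approach amounts to training a classifier on example utterances and their intents. The sketch below uses scikit-learn with invented utterances and intent labels; it is an illustration of the idea, not Amazon's system:

```python
# Hedged sketch of intent classification: a model trained on labelled example
# utterances learns that "play music by sting" maps to a PlayMusic intent.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "play music by sting", "play every breath you take",
    "what's the weather today", "will it rain tomorrow",
    "set a timer for ten minutes", "remind me in an hour",
]
intents = ["PlayMusic", "PlayMusic", "GetWeather", "GetWeather", "SetTimer", "SetTimer"]

intent_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
intent_model.fit(utterances, intents)

print(intent_model.predict(["play something by sting"]))  # expected: ['PlayMusic']
```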

“giant treelike structure”: James Vlahos, “Amazon Alexa and the Search for the One Perfect Answer,” Wired, February 18, 2018, https://www.wired.com/story/amazon-alexa-search-for-the-one-perfect-answer/ (January 19, 2021). harness a large number of high-powered computer processors to train its speech models: Nikko Ström, “Nikko Ström at AI Frontiers: Deep Learning in Alexa,” Slideshare, January 14, 2017, https://www.slideshare.net/AIFrontiers/nikko-strm-deep-learning-in-alexa (January 19, 2021). a patent on the idea was filed: Amazon. Techniques for mobile device charging using robotic devices. U.S. Patent 9711985, filed March 30, 2015. https://www.freepatentsonline.com/9711985.html (January 19, 2021).


pages: 406 words: 109,794

Range: Why Generalists Triumph in a Specialized World by David Epstein

Airbnb, Albert Einstein, Apollo 11, Apple's 1984 Super Bowl advert, Atul Gawande, Checklist Manifesto, Claude Shannon: information theory, Clayton Christensen, clockwork universe, cognitive bias, correlation does not imply causation, Daniel Kahneman / Amos Tversky, deep learning, deliberate practice, Exxon Valdez, fail fast, Flynn Effect, Freestyle chess, functional fixedness, game design, Gene Kranz, Isaac Newton, Johannes Kepler, knowledge economy, language acquisition, lateral thinking, longitudinal study, Louis Pasteur, Mark Zuckerberg, medical residency, messenger bag, meta-analysis, Mikhail Gorbachev, multi-armed bandit, Nelson Mandela, Netflix Prize, pattern recognition, Paul Graham, precision agriculture, prediction markets, premature optimization, pre–internet, random walk, randomized controlled trial, retrograde motion, Richard Feynman, Richard Feynman: Challenger O-ring, Silicon Valley, Silicon Valley billionaire, Stanford marshmallow experiment, Steve Jobs, Steve Wozniak, Steven Pinker, sunk-cost fallacy, systems thinking, Walter Mischel, Watson beat the top human players on Jeopardy!, Y Combinator, young professional

The economists suggested that the professors who caused short-term struggle but long-term gains were facilitating “deep learning” by making connections. They “broaden the curriculum and produce students with a deeper understanding of the material.” It also made their courses more difficult and frustrating, as evidenced by both the students’ lower Calculus I exam scores and their harsher evaluations of their instructors. And vice versa. The calculus professor who ranked dead last in deep learning out of the hundred studied—that is, his students underperformed in subsequent classes—was sixth in student evaluations, and seventh in student performance during his own class.

., “Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm,” arXiv (2017): 1712.01815. “In narrow enough worlds”: In addition to an interview with Gary Marcus, I used video of his June 7, 2017, lecture at the AI for Good Global Summit in Geneva, as well as several of his papers and essays: “Deep Learning: A Critical Appraisal,” arXiv: 1801.00631; “In Defense of Skepticism About Deep Learning,” Medium, January 14, 2018; “Innateness, AlphaZero, and Artificial Intelligence,” arXiv: 1801.05667. IBM’s Watson: For a balanced take on Watson’s challenges in healthcare—from one critic calling it “a joke,” to others suggesting it falls far short of the original hype but does indeed have value—see: D.

In one of Kornell and Bjork’s interleaving studies, 80 percent of students were sure they had learned better with blocked than mixed practice, whereas 80 percent performed in a manner that proved the opposite. The feeling of learning, it turns out, is based on before-your-eyes progress, while deep learning is not. “When your intuition says block,” Kornell told me, “you should probably interleave.” Interleaving is a desirable difficulty that frequently holds for both physical and mental skills. A simple motor-skill example is an experiment in which piano students were asked to learn to execute, in one-fifth of a second, a particular left-hand jump across fifteen keys.


Bulletproof Problem Solving by Charles Conn, Robert McLean

active transport: walking or cycling, Airbnb, Amazon Mechanical Turk, asset allocation, availability heuristic, Bayesian statistics, behavioural economics, Big Tech, Black Swan, blockchain, book value, business logic, business process, call centre, carbon footprint, cloud computing, correlation does not imply causation, Credit Default Swap, crowdsourcing, David Brooks, deep learning, Donald Trump, driverless car, drop ship, Elon Musk, endowment effect, fail fast, fake news, future of work, Garrett Hardin, Hyperloop, Innovator's Dilemma, inventory management, iterative process, loss aversion, megaproject, meta-analysis, Nate Silver, nudge unit, Occam's razor, pattern recognition, pets.com, prediction markets, principal–agent problem, RAND corporation, randomized controlled trial, risk tolerance, Silicon Valley, SimCity, smart contracts, stem cell, sunk-cost fallacy, the rule of 72, the scientific method, The Signal and the Noise by Nate Silver, time value of money, Tragedy of the Commons, transfer pricing, Vilfredo Pareto, walkable city, WikiLeaks

Experiments are a powerful and often overlooked tool in the big gun arsenal; if you can't make one, sometimes you can find a natural experiment. Machine learning is emerging as a powerful tool in many problem spheres; we argue for understanding problem structure and developing hypotheses before employing deep learning algorithms (huge mistakes can come from bad data and bad structuring, and these models offer little transparency). You can outsource problem solving, including deep learning, through crowdsourcing on platforms such as Kaggle. Where there is an adversary whose behavior may change in reaction to your choices, you can look to game theory approaches with logic trees to work out the best course of action.

The most important defining question at the outset is to understand the nature of your problem: Are you primarily trying to understand the drivers of causation of your problem (how much each element contributes and in what direction), or are you primarily trying to predict a state of the world in order to make a decision? The first question leads you mostly down the left‐hand branch into various statistical analyses, including creating or discovering experiments. The second question leads you mostly down the right‐hand side of the tree into forecasting models, the family of machine or deep learning algorithms, and game theory. Some problems have elements of both sides, and require combining tools from both branches of the decision tree. And simulations and forecasting models can be found on both sides of the tree (see Exhibit 6.1). EXHIBIT 6.1 When you are focused on understanding the complex causes of your problem, so that you can develop strategies for intervention, you are usually in the world of statistics.

Precision here relates to how many of the images identified as sharks really are sharks (Exhibit 6.9). This is an impressive result, but it leaves open the question of false negatives—how many sharks aren't spotted (which seems important!). The ability to detect other species like dolphins with high precision is, however, thought to minimize the number of false negatives. As with other deep‐learning algorithms, the expectation is that results will improve further with additional data.12 In addition, trialing technology, such as multispectral cameras, is expected to provide better ocean penetration, particularly for cloudy days.

EXHIBIT 6.9 The beach of the future, as The Ripper team pictures it, will incorporate three technologies: A drone tethering system that allows the drone to be powered and sit atop a cable with full view of the beach, able to operate 24/7, but most likely for the hours the beach has lifeguards on duty.
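For readers who want the distinction pinned down: precision measures how many flagged detections were really sharks, while recall measures how many actual sharks were spotted (the complement of the false-negative problem). A tiny worked example with invented counts:

```python
# Precision vs. recall for the shark-detection example, with made-up counts.
true_positives = 90   # sharks correctly flagged (illustrative numbers)
false_positives = 10  # dolphins or debris wrongly flagged as sharks
false_negatives = 5   # sharks the system failed to spot

precision = true_positives / (true_positives + false_positives)  # 0.90
recall = true_positives / (true_positives + false_negatives)     # ~0.95

print(f"precision={precision:.2f}, recall={recall:.2f}")
```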


pages: 370 words: 112,809

The Equality Machine: Harnessing Digital Technology for a Brighter, More Inclusive Future by Orly Lobel

2021 United States Capitol attack, 23andMe, Ada Lovelace, affirmative action, Airbnb, airport security, Albert Einstein, algorithmic bias, Amazon Mechanical Turk, augmented reality, barriers to entry, basic income, Big Tech, bioinformatics, Black Lives Matter, Boston Dynamics, Charles Babbage, choice architecture, computer vision, Computing Machinery and Intelligence, contact tracing, coronavirus, corporate social responsibility, correlation does not imply causation, COVID-19, crowdsourcing, data science, David Attenborough, David Heinemeier Hansson, deep learning, deepfake, digital divide, digital map, Elon Musk, emotional labour, equal pay for equal work, feminist movement, Filter Bubble, game design, gender pay gap, George Floyd, gig economy, glass ceiling, global pandemic, Google Chrome, Grace Hopper, income inequality, index fund, information asymmetry, Internet of things, invisible hand, it's over 9,000, iterative process, job automation, Lao Tzu, large language model, lockdown, machine readable, machine translation, Mark Zuckerberg, market bubble, microaggression, Moneyball by Michael Lewis explains big data, natural language processing, Netflix Prize, Network effects, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, occupational segregation, old-boy network, OpenAI, openstreetmap, paperclip maximiser, pattern recognition, performance metric, personalized medicine, price discrimination, publish or perish, QR code, randomized controlled trial, remote working, risk tolerance, robot derives from the Czech word robota Czech, meaning slave, Ronald Coase, Salesforce, self-driving car, sharing economy, Sheryl Sandberg, Silicon Valley, social distancing, social intelligence, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, surveillance capitalism, tech worker, TechCrunch disrupt, The Future of Employment, TikTok, Turing test, universal basic income, Wall-E, warehouse automation, women in the workforce, work culture , you are the product

Malice or Competence: What We Fear For all the talk about the possibilities of AI and robotics, we’re really only at the embryonic stage of our grand machine-human integration. And AI means different things in different conversations. The most common use refers to machine learning—using statistical models to analyze large quantities of data. The next step from basic machine learning, referred to as deep learning, uses a multilayered architecture of networks, making connections and modeling patterns across data sets. AI can be understood as any machine—defined for our purposes as hardware running digital software—that mimics human behavior (i.e., human reactions). It makes decisions and judgments based on learning derived from data inputs and can mimic our senses, such as vision.
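In code, the contrast the author draws looks roughly like the following sketch (scikit-learn, with placeholder hyperparameters): a single statistical model on one side, a stack of layers on the other.

```python
# Hedged sketch of "basic machine learning" versus a multilayered network.
# Data, layer sizes, and hyperparameters are placeholders.
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Basic machine learning: one statistical model fit directly to the features.
shallow_model = LogisticRegression()

# Deep learning (in miniature): several stacked layers, each transforming the
# output of the previous one before the final prediction.
deep_model = MLPClassifier(hidden_layer_sizes=(128, 64, 32), max_iter=500)

# Both expose the same interface once training data (X, y) is available:
# shallow_model.fit(X, y); deep_model.fit(X, y)
```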

Thanks to algorithmic data mining, we can gather quite a bit of information simply from browsing the web. Recently, a group of researchers from Europe and the United States collaborated to develop automated assessments of organizational diversity and detection of discrimination by race, sex, age, and other parameters. The researchers applied the deep-learned predictors of gender and race—algorithms that use facial and name recognition to predict identity—to the executive management and board member profiles of the 500 largest companies on the 2016 Forbes Global 2000 list. They then ranked the companies by a sex and race diversity index.34 Overall, in the photos found online, women represented only 21.2 percent of all corporate executives.

The year after she was diagnosed, she created a system that uses computer vision technology to independently learn about the patterns of diagnosing breast cancer. She partnered with Dr. Constance Lehman, chief of breast imaging at Boston’s Massachusetts General Hospital. Lehman herself serves on several key national committees and was eager to apply deep learning to all aspects of breast cancer care, from prevention to detection to treatment. Barzilay and Lehman fed the algorithm both the image and the outcome over time so that it could teach itself what can be detected—what a human eye might miss. They fed the algorithm 70,000 images of lesions with known outcomes, both malignant and benign.
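The training setup described here, images paired with known outcomes, is standard supervised learning. A minimal, hypothetical sketch in Keras (image size, channels, and architecture are assumptions for illustration, not the model described in the excerpt):

```python
# Hedged sketch: a small convolutional classifier learning malignant vs. benign
# from labelled images. All sizes and training details are assumptions.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(224, 224, 1)),        # assumed single-channel image patch
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # predicted probability of malignancy
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Given arrays of images and known outcomes (0 = benign, 1 = malignant):
# model.fit(images, outcomes, validation_split=0.1, epochs=10)
```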


pages: 170 words: 49,193

The People vs Tech: How the Internet Is Killing Democracy (And How We Save It) by Jamie Bartlett

Ada Lovelace, Airbnb, AlphaGo, Amazon Mechanical Turk, Andrew Keen, autonomous vehicles, barriers to entry, basic income, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, blockchain, Boris Johnson, Californian Ideology, Cambridge Analytica, central bank independence, Chelsea Manning, cloud computing, computer vision, creative destruction, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, disinformation, Dominic Cummings, Donald Trump, driverless car, Edward Snowden, Elon Musk, Evgeny Morozov, fake news, Filter Bubble, future of work, general purpose technology, gig economy, global village, Google bus, Hans Moravec, hive mind, Howard Rheingold, information retrieval, initial coin offering, Internet of things, Jeff Bezos, Jeremy Corbyn, job automation, John Gilmore, John Maynard Keynes: technological unemployment, John Perry Barlow, Julian Assange, manufacturing employment, Mark Zuckerberg, Marshall McLuhan, Menlo Park, meta-analysis, mittelstand, move fast and break things, Network effects, Nicholas Carr, Nick Bostrom, off grid, Panopticon Jeremy Bentham, payday loans, Peter Thiel, post-truth, prediction markets, QR code, ransomware, Ray Kurzweil, recommendation engine, Renaissance Technologies, ride hailing / ride sharing, Robert Mercer, Ross Ulbricht, Sam Altman, Satoshi Nakamoto, Second Machine Age, sharing economy, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Silicon Valley startup, smart cities, smart contracts, smart meter, Snapchat, Stanford prison experiment, Steve Bannon, Steve Jobs, Steven Levy, strong AI, surveillance capitalism, TaskRabbit, tech worker, technological singularity, technoutopianism, Ted Kaczynski, TED Talk, the long tail, the medium is the message, the scientific method, The Spirit Level, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, too big to fail, ultimatum game, universal basic income, WikiLeaks, World Values Survey, Y Combinator, you are the product

More data fed in makes ML better, which allows it to make more sense of new data, which makes it better still, and so on. More sophisticated ML is being developed all the time. The latest involves teaching machines to solve problems for themselves rather than just feeding them examples, by setting out rules and leaving them to get on with it. This is sometimes called ‘deep learning’, which attempts to mimic the activity that occurs in layers of brain neurons through a neural network to spot patterns or images very effectively.2 To understand how this is different and potentially more powerful than classic ML, consider the ancient Chinese game Go. Machines have been beating humans at chess for years, but Go is more difficult for machines because of the sheer number of possible moves: in the course of a match, there are more possible combinations than there are atoms in the universe.

In 2016, to the surprise of many experts, AlphaGo decisively beat the world’s best Go player, Lee Sedol. This stunning result was quickly surpassed when, in late 2017, DeepMind released AlphaGo Zero, software that was given only the rules and no human examples at all, and learned how to win using a deep learning technique and self-play. It started off dreadfully bad but improved slightly with each game, and within 40 days of constant self-play it had become so strong that it thrashed the original AlphaGo 100–0. Go is now firmly in the category of ‘games that humans will never win against machines again’.

Although the specific application is very different, driverless vehicles like Stefan’s Starsky trucks use similar techniques of data extraction and analysis as AI-powered crime-prediction technology or CV analysis. Google’s DeepMind, for example, doesn’t just win at Go – it is currently pioneering exciting new medical research and has already dramatically cut the energy bills at Google’s huge data centres by using deep learning to optimise the air conditioning systems.6 There are countervailing tendencies, of course – some experts have got together to develop ‘open source’ AI which is more transparent and, hopefully, carefully designed, but the direction of progress is clear – just follow the money. Over the past few years, big tech firms have bought promising AI start-ups by the truckload.


pages: 288 words: 73,297

The Biology of Desire: Why Addiction Is Not a Disease by Marc Lewis PhD

behavioural economics, deep learning, delayed gratification, helicopter parent, impulse control, language acquisition, meta-analysis, no-fly zone, Rat Park, Ronald Reagan, Skype, Stanford marshmallow experiment, Walter Mischel

The learning spiral would have first quickened, showing a snowball effect in behaviour and a cascade of neural changes, when Donna and Brian began to pursue drugs for the feelings they provided, not as a means to an end—when desire kicked in and their quest for drugs overrode their other goals. That’s when their lives started to unravel.

Figure 2. Deep learning: a profile of addiction onset, including phases of accelerated learning, stability, and reduction.

Because the onset of addiction must include one or more phases of accelerated learning, but can also simmer for long periods, I’ve settled on the phrase deep learning. This is meant to describe the overall profile of addictive learning, including periods of rapid change, periods of coasting, and temporary remissions (in medical parlance).

That’s how brains change over the course of development, and that’s how habits are formed. So, what exactly is addiction? It’s a habit that grows and self-perpetuates relatively quickly, when we repeatedly pursue the same highly attractive goal. Or, in a phrase, motivated repetition that gives rise to deep learning. Addictive patterns grow more quickly and become more deeply entrenched than other, less compelling habits because of the intensity of the attraction that motivates us to repeat them, especially when they leave us gasping for more and other goals have lost their appeal. The neurobiological mechanics of this process involve multiple brain regions, interlaced to form a web that holds the addiction in place.

Fletcher, Inside Rehab: The Surprising Truth About Addiction Treatment—and How to Get Help That Works (New York: Viking, 2013). 5. David Sack, "Addiction Is a Disease and Needs to Be Treated as Such," Room for Debate blog, New York Times, February 11, 2014.

INDEX
AA. See Alcoholics Anonymous
abstinence
  AA and, 15
  grey matter volume and, 137
  as habit, 68
  self-regulation and, 138
accelerated learning
  deep learning and, 172–174, 172 (fig.)
  desire and, 170–174
accumbens
  as addiction front-runner, 126
  amygdala and OFC and, 79, 81
  Brian and, 126
  cognition linked with emotion and, 81–82
  desire ignited in, 57–59
  dopamine and, 58–59, 81
  emotion and, 79
  entrenched habits, 163, 170
  impulsive to compulsive behaviour and, 127–128
  location, 44 (fig.), 45, 56–57
  love and, 166–167
  mice and, 82
  motivation and, 79
  motivational core (motivational engine) and, 45
  Natalie and, 126
  now appeal and, 83–84
  reward and, 126
ACEs.


pages: 410 words: 119,823

Radical Technologies: The Design of Everyday Life by Adam Greenfield

3D printing, Airbnb, algorithmic bias, algorithmic management, AlphaGo, augmented reality, autonomous vehicles, bank run, barriers to entry, basic income, bitcoin, Black Lives Matter, blockchain, Boston Dynamics, business intelligence, business process, Californian Ideology, call centre, cellular automata, centralized clearinghouse, centre right, Chuck Templeton: OpenTable:, circular economy, cloud computing, Cody Wilson, collective bargaining, combinatorial explosion, Computer Numeric Control, computer vision, Conway's Game of Life, CRISPR, cryptocurrency, David Graeber, deep learning, DeepMind, dematerialisation, digital map, disruptive innovation, distributed ledger, driverless car, drone strike, Elon Musk, Ethereum, ethereum blockchain, facts on the ground, fiat currency, fulfillment center, gentrification, global supply chain, global village, Goodhart's law, Google Glasses, Herman Kahn, Ian Bogost, IBM and the Holocaust, industrial robot, informal economy, information retrieval, Internet of things, Jacob Silverman, James Watt: steam engine, Jane Jacobs, Jeff Bezos, Jeff Hawkins, job automation, jobs below the API, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Perry Barlow, John von Neumann, joint-stock company, Kevin Kelly, Kickstarter, Kiva Systems, late capitalism, Leo Hollis, license plate recognition, lifelogging, M-Pesa, Mark Zuckerberg, means of production, megacity, megastructure, minimum viable product, money: store of value / unit of account / medium of exchange, natural language processing, Network effects, New Urbanism, Nick Bostrom, Occupy movement, Oculus Rift, off-the-grid, PalmPilot, Pareto efficiency, pattern recognition, Pearl River Delta, performance metric, Peter Eisenman, Peter Thiel, planetary scale, Ponzi scheme, post scarcity, post-work, printed gun, proprietary trading, RAND corporation, recommendation engine, RFID, rolodex, Rutger Bregman, Satoshi Nakamoto, self-driving car, sentiment analysis, shareholder value, sharing economy, Shenzhen special economic zone , Sidewalk Labs, Silicon Valley, smart cities, smart contracts, social intelligence, sorting algorithm, special economic zone, speech recognition, stakhanovite, statistical model, stem cell, technoutopianism, Tesla Model S, the built environment, The Death and Life of Great American Cities, The Future of Employment, Tony Fadell, transaction costs, Uber for X, undersea cable, universal basic income, urban planning, urban sprawl, vertical integration, Vitalik Buterin, warehouse robotics, When a measure becomes a target, Whole Earth Review, WikiLeaks, women in the workforce

Over time, it will learn to recognize what distinguishes a good performance from an unacceptable one, and how to improve the odds of success next time out. It will refine its ability to detect what is salient in any given situation, and act on that insight. This process is called “machine learning.” What distinguishes this from “deep” learning, as some would have us call the process through which a machine develops insight? And why does it seem to have become so prominent in recent years? In the beginning was the program. Classically, using computers to solve problems in the real world meant writing programs, and that meant expressing those problems in terms that could be parsed and executed by machine.

Stripped of its mystification, then, machine learning is the process by way of which algorithms are taught to recognize patterns in the world, through the automated analysis of very large data sets. When neural networks are stacked in multiple layers, each stocked with neurons responsible for discerning a particular kind of pattern, they are capable of modeling high-level abstractions. (This stacking accounts for the "deep" in deep learning, in at least one of the circulating definitions.) It is this that ultimately gives the systems running these algorithms the ability to perform complicated tasks without being explicitly instructed in how to do so, and it is how they now stand to acquire the capabilities we have previously thought of as the exclusive province of the human.
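
As a purely illustrative sketch of that stacking (not from the book; the layer sizes, random weights and ReLU nonlinearity here are arbitrary choices), a forward pass through a small multi-layer network might look like this:

import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def layer(inputs, weights, biases):
    # One "layer": each neuron takes a weighted sum of its inputs plus a bias,
    # then passes the result through a nonlinearity.
    return relu(inputs @ weights + biases)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 8))              # a single input with 8 features

# Three stacked layers: this stacking is the "deep" in deep learning.
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 16)), np.zeros(16)
w3, b3 = rng.normal(size=(16, 4)), np.zeros(4)

h1 = layer(x, w1, b1)     # low-level patterns
h2 = layer(h1, w2, b2)    # combinations of those patterns
output = h2 @ w3 + b3     # higher-level abstraction, e.g. scores for 4 classes
print(output.shape)       # (1, 4)

In a trained network the weights are learned from data rather than drawn at random; the point here is only the shape of the computation.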

The software controlling a moving vehicle must integrate in real time a highly unstable environment, engine conditions, changes in weather, and the inherently unpredictable behavior of animals, pedestrians, bicyclists, other drivers and random objects it may encounter.8 (Now the significance of those reports you may have encountered of Google pre-driving nominally autonomous vehicles through the backstreets of its Peninsular domain becomes clearer: its engineers are training their guidance algorithm in what to expect from its first environment.) For autonomous vehicles, drones, robots and other systems intended to reckon with the real world in this way, then, the grail is unsupervised deep learning. As the name implies, the algorithms involved are neither prompted nor guided, but are simply set loose on vast fields of data. Order simply emerges. The equivalent of classification for unsupervised learning is clustering, in which an algorithm starts to develop a sense for what is significant in its environment via a process of accretion.
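
As a minimal sketch of order emerging from unlabeled data (not from the book, and using plain k-means clustering rather than a deep network, purely to show the idea; the two-regime data are synthetic):

import numpy as np

def kmeans(points, k, steps=20, seed=0):
    # Unsupervised clustering: no labels are provided; points are grouped
    # purely by how close they are to one another.
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(steps):
        # Assign each point to its nearest center.
        distances = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = distances.argmin(axis=1)
        # Move each center to the mean of the points assigned to it.
        centers = np.array([points[labels == i].mean(axis=0) for i in range(k)])
    return labels, centers

# Unlabeled readings drawn from two hidden regimes.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 1, size=(50, 2)), rng.normal(5, 1, size=(50, 2))])
labels, centers = kmeans(data, k=2)
print(np.round(centers, 1))   # two cluster centers emerge, near (0, 0) and (5, 5)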


Autonomous Driving: How the Driverless Revolution Will Change the World by Andreas Herrmann, Walter Brenner, Rupert Stadler

Airbnb, Airbus A320, algorithmic bias, augmented reality, autonomous vehicles, blockchain, call centre, carbon footprint, clean tech, computer vision, conceptual framework, congestion pricing, connected car, crowdsourcing, cyber-physical system, DARPA: Urban Challenge, data acquisition, deep learning, demand response, digital map, disruptive innovation, driverless car, Elon Musk, fault tolerance, fear of failure, global supply chain, industrial cluster, intermodal, Internet of things, Jeff Bezos, John Zimmer (Lyft cofounder), Lyft, manufacturing employment, market fundamentalism, Mars Rover, Masdar, megacity, Pearl River Delta, peer-to-peer rental, precision agriculture, QWERTY keyboard, RAND corporation, ride hailing / ride sharing, self-driving car, sensor fusion, sharing economy, Silicon Valley, smart cities, smart grid, smart meter, Steve Jobs, Tesla Model S, Tim Cook: Apple, trolley problem, uber lyft, upwardly mobile, urban planning, Zipcar

GPUs) and enormous data volumes. Complex, multilayer artificial neural networks are also often described as deep learning. Deep learning is applied nowadays primarily in the areas of speech recognition, image recognition and autonomous driving.

Box 11.3. Artificial Intelligence, Deep Learning and Neural Networks
Lutz Junge, Principal Engineer, Electronics Research Lab, Volkswagen Group of America

End-to-end deep neural networks could learn to drive a car just by monitoring multiple human drivers and adopting the rules of driving. Deep learning has produced promising results when applied to tasks that require bridging the gap between two different domains.

LTE-V and 5G will play an important role in this future ecosystem, and the telecommunications operators need to improve performance to fulfil the needs of the car industry, specifically the latency, coverage, end-to-end security and reliability of their networks. Ultimately, we need a well-standardised system based on future technologies like 5G, artificial intelligence, deep learning, big data, high-resolution maps and high-resolution GPS. The key question is: who owns these data, and who could earn money in which part of this ecosystem? Founding the 5G Automotive Association and developing this ecosystem in collaboration is a big step in the right direction. The vision for future 5G networks is: end-to-end delay of 1 millisecond, speeds of 10 gigabits per second, 99.9999% reliability, end-to-end security, a 10-year lifespan for embedded M2M devices on a single battery, capacity for about 500 billion devices, and lower costs.

However, it now seems clear that Google is primarily interested in collecting data on drivers and their vehicles, that it sees the software as a new business model, and that it does not want to produce cars. Early in 2016, Nvidia surprisingly announced its own computing platform for controlling autonomous vehicles. This platform has sufficient processing power to support deep learning, sensor fusion and surround vision, all of which are key elements for a self-driving car. It also announced that its PX2 would be used as the standard computer in the Roborace series for self-driving race cars. Nvidia has also already built autonomous test vehicles, which have only been driven on test routes to date.


pages: 513 words: 152,381

The Precipice: Existential Risk and the Future of Humanity by Toby Ord

3D printing, agricultural Revolution, Albert Einstein, Alignment Problem, AlphaGo, Anthropocene, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, availability heuristic, biodiversity loss, Columbian Exchange, computer vision, cosmological constant, CRISPR, cuban missile crisis, decarbonisation, deep learning, DeepMind, defense in depth, delayed gratification, Demis Hassabis, demographic transition, Doomsday Clock, Dr. Strangelove, Drosophila, effective altruism, Elon Musk, Ernest Rutherford, global pandemic, Goodhart's law, Hans Moravec, Herman Kahn, Higgs boson, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, James Watt: steam engine, Large Hadron Collider, launch on warning, Mark Zuckerberg, Mars Society, mass immigration, meta-analysis, Mikhail Gorbachev, mutually assured destruction, Nash equilibrium, Nick Bostrom, Norbert Wiener, nuclear winter, ocean acidification, OpenAI, p-value, Peter Singer: altruism, planetary scale, power law, public intellectual, race to the bottom, RAND corporation, Recombinant DNA, Ronald Reagan, self-driving car, seminal paper, social discount rate, Stanislav Petrov, Stephen Hawking, Steven Pinker, Stewart Brand, supervolcano, survivorship bias, synthetic biology, tacit knowledge, the scientific method, Tragedy of the Commons, uranium enrichment, William MacAskill

Technical improvements in their design and training, combined with richer datasets and more computing power, have allowed us to train much larger and deeper networks than ever before.73 This deep learning gives the networks the ability to learn subtle concepts and distinctions. Not only can they now recognize a cat, they have outperformed humans in distinguishing different breeds of cats.74 They recognize human faces better than we can ourselves, and distinguish identical twins.75 And we have been able to use these abilities for more than just perception and classification. Deep learning systems can translate between languages with a proficiency approaching that of a human translator.

Games have been a central part of AI since the days of the Dartmouth conference. Steady incremental progress took chess from amateur play in 1957 all the way to superhuman level in 1997, and substantially beyond.77 Getting there required a vast amount of specialist human knowledge of chess strategy. In 2017, deep learning was applied to chess with impressive results. A team of researchers at the AI company DeepMind created AlphaZero: a neural network–based system that learned to play chess from scratch. It went from novice to grand master in just four hours.78 In less than the time it takes a professional to play two games, it discovered strategic knowledge that had taken humans centuries to unearth, playing beyond the level of the best humans or traditional programs.

The world’s best Go players had long thought that their play was close to perfection, so were shocked to find themselves beaten so decisively.80 As the reigning world champion, Ke Jie, put it: “After humanity spent thousands of years improving our tactics, computers tell us that humans are completely wrong… I would go as far as to say not a single human has touched the edge of the truth of Go.”81 It is this generality that is the most impressive feature of cutting edge AI, and which has rekindled the ambitions of matching and exceeding every aspect of human intelligence. This goal is sometimes known as artificial general intelligence (AGI), to distinguish it from the narrow approaches that had come to dominate. While the timeless games of chess and Go best exhibit the brilliance that deep learning can attain, its breadth was revealed through the Atari video games of the 1970s. In 2015, researchers designed an algorithm that could learn to play dozens of extremely different Atari games at levels far exceeding human ability.82 Unlike systems for chess or Go, which start with a symbolic representation of the board, the Atari-playing systems learned and mastered these games directly from the score and the raw pixels on the screen.


Seeking SRE: Conversations About Running Production Systems at Scale by David N. Blank-Edelman

Affordable Care Act / Obamacare, algorithmic trading, AlphaGo, Amazon Web Services, backpropagation, Black Lives Matter, Bletchley Park, bounce rate, business continuity plan, business logic, business process, cloud computing, cognitive bias, cognitive dissonance, cognitive load, commoditize, continuous integration, Conway's law, crowdsourcing, dark matter, data science, database schema, Debian, deep learning, DeepMind, defense in depth, DevOps, digital rights, domain-specific language, emotional labour, en.wikipedia.org, exponential backoff, fail fast, fallacies of distributed computing, fault tolerance, fear of failure, friendly fire, game design, Grace Hopper, imposter syndrome, information retrieval, Infrastructure as a Service, Internet of things, invisible hand, iterative process, Kaizen: continuous improvement, Kanban, Kubernetes, loose coupling, Lyft, machine readable, Marc Andreessen, Maslow's hierarchy, microaggression, microservices, minimum viable product, MVC pattern, performance metric, platform as a service, pull request, RAND corporation, remote working, Richard Feynman, risk tolerance, Ruby on Rails, Salesforce, scientific management, search engine result page, self-driving car, sentiment analysis, Silicon Valley, single page application, Snapchat, software as a service, software is eating the world, source of truth, systems thinking, the long tail, the scientific method, Toyota Production System, traumatic brain injury, value engineering, vertical integration, web application, WebSocket, zero day

If you want to go deeper into your investigation, I suggest exploring some open source code, some of which you can find in my repository, and two reference books. Deep Learning can be read online for free at deeplearningbook.org, or you can get the print version. It's time to apply machine learning in your organization.

My GitHub Repository
https://github.com/ricardoamaro/MachineLearning4SRE

Recommended Books
Russell, Stuart J., Peter Norvig, and John F. Canny. Artificial Intelligence: A Modern Approach. Upper Saddle River, NJ: Pearson International (2003).
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. Deep Learning. Cambridge, MA: MIT Press (2016). www.mitpress.mit.edu/books/deep-learning.

Contributor Bio
Ricardo Amaro is currently performing senior site reliability engineering functions at Acquia, one of the largest companies in the world of Free Software, with around 20,000 servers in production.

The rise of AI is happening now because we have enough data5 from all of the big data initiatives out there and cheap GPUs6 for machine learning algorithms to churn on. In addition, what's surprising about deep learning is how simple it is. Last decade, no one suspected that we would achieve these incredible results with respect to machine perception problems. Now, it turns out that all you need is sufficiently large parametric models trained with gradient descent on sufficiently many examples. It's not complicated, it's just a lot of it.

Richard Feynman, in the 1972 interview Take the World from a Different Point of View

So that gives us a fairly good idea of why we can now do deep learning in such a reachable way and who to blame for the whole thing — gamers and games.
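
A minimal sketch of that "parametric model plus gradient descent" recipe (not from the chapter or its repository; a one-parameter linear model stands in for "sufficiently large", and the data are synthetic):

import numpy as np

# Toy parametric model: y = w * x. Real deep learning models have millions of
# parameters, but the training loop has the same shape: predict, measure the
# error, nudge the parameters downhill along the gradient.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x + rng.normal(scale=0.1, size=200)   # data generated with true w = 3.0

w = 0.0      # uninformed starting guess
lr = 0.1     # learning rate
for step in range(100):
    pred = w * x
    grad = 2 * np.mean((pred - y) * x)   # gradient of mean squared error w.r.t. w
    w -= lr * grad                       # gradient descent update

print(round(w, 2))   # close to 3.0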

Neural networks
A bit complicated to begin with, so let's take a moment to dive into them separately.

What Are Neural Networks?
In this section, we try to answer this question by briefly walking you through the basics of neural networks, deep learning techniques, and how to apply them to real data problems.

Neurons and Neural Networks
Unfortunately, it's not possible for us to jump right into some deep learning magic Python for SRE without first talking a bit about neural networks to get a basis for our code. A neural network is composed of artificial neurons. An artificial neuron is just a mathematical function conceived as a model of the biological neuron.
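
To make "just a mathematical function" concrete (a sketch in the same spirit, not code taken from the chapter or its repository):

import math

def artificial_neuron(inputs, weights, bias):
    # A weighted sum of the inputs plus a bias, squashed by an activation
    # function. That is the entire "neuron": a simple function loosely modeled
    # on how a biological neuron fires more strongly for some inputs than others.
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid activation, output between 0 and 1

print(artificial_neuron([0.5, 0.8], weights=[1.2, -0.7], bias=0.1))

A neural network is nothing more than many of these functions wired together, with the weights and biases set by training rather than by hand.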


pages: 282 words: 63,385

Attention Factory: The Story of TikTok and China's ByteDance by Matthew Brennan

Airbnb, AltaVista, augmented reality, Benchmark Capital, Big Tech, business logic, Cambridge Analytica, computer vision, coronavirus, COVID-19, deep learning, Didi Chuxing, Donald Trump, en.wikipedia.org, fail fast, Google X / Alphabet X, growth hacking, ImageNet competition, income inequality, invisible hand, Kickstarter, Mark Zuckerberg, Menlo Park, natural language processing, Netflix Prize, Network effects, paypal mafia, Pearl River Delta, pre–internet, recommendation engine, ride hailing / ride sharing, Sheryl Sandberg, Silicon Valley, Snapchat, social graph, Steve Jobs, TikTok, Travis Kalanick, WeWork, Y Combinator

66 https://techcrunch.com/2012/12/05/prismatic/
67 http://yingdudasha.cn/
68 Image source: https://m.weibo.cn/2745813247/3656157740605616
* "Real stuff" is my imperfect translation of 干货 gānhuò, which could also be translated as "the real McCoy" or "something of substance"

Chapter 3 Recommendation, From YouTube to TikTok

Chapter Timeline
2009 – Netflix awards a $1 million prize for an algorithm that increased the accuracy of their video recommendation by 10%
2011 – YouTube introduces machine learning algorithmic recommendation engine, Sibyl, with immediate impact
2012 Aug – ByteDance launches news aggregation app Toutiao
2012 Sept – AlexNet breakthrough at the ImageNet challenge triggers a global explosion of interest in AI
2013 Mar – Facebook changes its newsfeed to a "personalized newspaper"
2014 April – Instagram begins using an "explore" tab of personalized content
2015 – Google Brain's deep learning algorithms begin supercharging a wide variety of Google products, including YouTube recommendations

It was 2010, and YouTube had a big problem. Despite being the third most visited website on the internet, "YouTube.com as a homepage was not driving a ton of engagement," 69 admitted John McFadden, technical lead for YouTube recommendations.

The machine learning worked so well that soon, more people were choosing what to watch based on the "recommended videos" list than any other way of picking videos, such as web searches or email referrals. Google continued to iterate and further optimize the recommendation system, later switching from Sibyl to Google Brain, developed by the company's now-famous moonshot laboratory group Google X, led by Stanford professor Andrew Ng. Google Brain leveraged groundbreaking new advances in deep learning. Whereas Sibyl's impact had already been impressive, the results of Google Brain were nothing short of astounding. Over the three years from 2014 to 2017, the aggregate time spent watching videos on YouTube's homepage grew twenty times. Recommendations drove over 70% of all time on YouTube. 76 There was increasing acknowledgment across social media that YouTube's suggested videos had become eerily accurate at guessing what would interest you.

Above: The immediate impact made by YouTube's use of machine learning algorithmic recommendation engine, Sibyl, in 2011. 77

Rapid progress in the field of AI, including the breakthrough known as "deep learning," meant this method of content distribution would rapidly come of age, with profound implications. For a company like ByteDance, the timing of these new advances could not have been better. They were in the early innings of a new era that would see the effectiveness and accuracy of algorithmic recommendation jump forward in leaps and bounds.


pages: 385 words: 112,842

Arriving Today: From Factory to Front Door -- Why Everything Has Changed About How and What We Buy by Christopher Mims

air freight, Airbnb, Amazon Robotics, Amazon Web Services, Apollo 11, augmented reality, autonomous vehicles, big-box store, blue-collar work, Boeing 747, book scanning, business logic, business process, call centre, cloud computing, company town, coronavirus, cotton gin, COVID-19, creative destruction, data science, Dava Sobel, deep learning, dematerialisation, deskilling, digital twin, Donald Trump, easy for humans, difficult for computers, electronic logging device, Elon Musk, Frederick Winslow Taylor, fulfillment center, gentrification, gig economy, global pandemic, global supply chain, guest worker program, Hans Moravec, heat death of the universe, hive mind, Hyperloop, immigration reform, income inequality, independent contractor, industrial robot, interchangeable parts, intermodal, inventory management, Jacquard loom, Jeff Bezos, Jessica Bruder, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, Joseph Schumpeter, Kaizen: continuous improvement, Kanban, Kiva Systems, level 1 cache, Lewis Mumford, lockdown, lone genius, Lyft, machine readable, Malacca Straits, Mark Zuckerberg, market bubble, minimum wage unemployment, Nomadland, Ocado, operation paperclip, Panamax, Pearl River Delta, planetary scale, pneumatic tube, polynesian navigation, post-Panamax, random stow, ride hailing / ride sharing, robot derives from the Czech word robota Czech, meaning slave, Rodney Brooks, rubber-tired gantry crane, scientific management, self-driving car, sensor fusion, Shenzhen special economic zone , Shoshana Zuboff, Silicon Valley, six sigma, skunkworks, social distancing, South China Sea, special economic zone, spinning jenny, standardized shipping container, Steve Jobs, supply-chain management, surveillance capitalism, TED Talk, the scientific method, Tim Cook: Apple, Toyota Production System, traveling salesman, Turing test, two-sided market, Uber and Lyft, Uber for X, uber lyft, Upton Sinclair, vertical integration, warehouse automation, warehouse robotics, workplace surveillance

In the truck’s perceptual system are the classic object-recognition algorithms, made possible by deep learning, that allow it to identify people, obstacles, and other vehicles. It’s the same sort of technology that allows Facebook to recognize your friends’ faces in photos, or Google to cough up pictures of felines when you type in “cats.” Then there are the many niche algorithms the truck is running, each purpose-built. One discerns what traffic lights at intersections are conveying; another interprets turn signals on cars. To interpret the truck’s raw sensory input, its algorithms rely mostly on deep learning. One level above that, in the middle of the truck’s complicated algorithm of interpretation and decision-making, it uses Bayesian analysis.
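
As a toy illustration of that kind of Bayesian reasoning layered on top of uncertain detections (not Starsky's actual software; the pedestrian scenario and the detector probabilities are invented):

# Combine a prior belief that a pedestrian is present with repeated, uncertain
# detections from the perception system, using Bayes' rule. Numbers are made up.
prior = 0.02                     # prior probability a pedestrian is in the lane
p_detect_given_present = 0.95    # detector sensitivity
p_detect_given_absent = 0.10     # false-positive rate

def posterior(belief, likelihood_present, likelihood_absent):
    # P(present | detection) = P(detection | present) * P(present) / P(detection)
    evidence = likelihood_present * belief + likelihood_absent * (1 - belief)
    return likelihood_present * belief / evidence

belief = prior
for _ in range(3):   # three consecutive frames each report a detection
    belief = posterior(belief, p_detect_given_present, p_detect_given_absent)
    print(round(belief, 3))
# Belief rises frame by frame: roughly 0.162, then 0.648, then 0.946.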

$3.6 billion into trucking: Cyndia Zwahlen, "Freight Tech VC on Track to Top 2018's Record $3.6 billion," Trucks.com, April 29, 2019, https://www.trucks.com/2019/04/29/freight-tech-vc-top-record-3-6-billion.

Chapter 13: The Future of Trucking

brittle and shallow: Jason Pontin, "Greedy, Brittle, Opaque, and Shallow: The Downsides to Deep Learning," Wired, February 2, 2018, https://www.wired.com/story/greedy-brittle-opaque-and-shallow-the-downsides-to-deep-learning.
a map unlike any created before it: Christopher Mims, "The Key to Autonomous Driving? An Impossibly Perfect Map," Wall Street Journal, October 11, 2018, https://www.wsj.com/articles/the-key-to-autonomous-driving-an-impossibly-perfect-map-1539259260.

Controlling this truck is a computer that’s dreaming, in a way—a dream not so different from the one you’re experiencing at this very moment, and in every moment of your waking life. Our waking dream, and the truck’s, is a fable about the state of our bodies, the world outside, and possible futures both might soon inhabit. The truck is using deep learning to construct a reference view of reality, including the state of the road ahead, the vehicles around us, and the tarmac beneath our eighteen rumbling wheels, each spinning sixteen times a second. Within 100 milliseconds, the truck’s AI needs to digest all its sensory inputs by converting pixels captured by its cameras, and a point cloud generated by its lidar, into cars, trucks, and motorcycles.


pages: 480 words: 123,979

Dawn of the New Everything: Encounters With Reality and Virtual Reality by Jaron Lanier

4chan, air gap, augmented reality, back-to-the-land, Big Tech, Bill Atkinson, Buckminster Fuller, Burning Man, carbon footprint, cloud computing, collaborative editing, commoditize, Computer Lib, cosmological constant, creative destruction, crowdsourcing, deep learning, Donald Trump, Douglas Engelbart, Douglas Hofstadter, El Camino Real, Elon Musk, fake news, Firefox, game design, general-purpose programming language, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Hans Moravec, Howard Rheingold, hype cycle, impulse control, information asymmetry, intentional community, invisible hand, Ivan Sutherland, Jaron Lanier, John Gilmore, John Perry Barlow, John von Neumann, Kevin Kelly, Kickstarter, Kuiper Belt, lifelogging, mandelbrot fractal, Mark Zuckerberg, Marshall McLuhan, Menlo Park, military-industrial complex, Minecraft, Mitch Kapor, Mondo 2000, Mother of all demos, Murray Gell-Mann, Neal Stephenson, Netflix Prize, Network effects, new economy, Nick Bostrom, Norbert Wiener, Oculus Rift, pattern recognition, Paul Erdős, peak TV, Plato's cave, profit motive, Project Xanadu, quantum cryptography, Ray Kurzweil, reality distortion field, recommendation engine, Richard Feynman, Richard Stallman, Ronald Reagan, self-driving car, Silicon Valley, Silicon Valley startup, Skinner box, Skype, Snapchat, stem cell, Stephen Hawking, Steve Bannon, Steve Jobs, Steven Levy, Stewart Brand, systems thinking, technoutopianism, Ted Nelson, telemarketer, telepresence, telepresence robot, Thorstein Veblen, Turing test, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, WikiLeaks, wikimedia commons

Here is how we build systems today: A bit-precise structure of communication abstractions surrounds the “pay dirt” modules like “deep learning”9 ones that accomplish the most valuable functions. These critical “AI-like” algorithms are not bit-perfect, but even though they’re approximate, they’re still robust. They provide the capabilities at the core of the programs that run our lives these days. They analyze the results of medical trials and operate self-driving vehicles. In a phenotropic architecture, the roles of the bit-perfect and approximate/robust components of a program are often reversed. Modules are connected in a phenotropic system by approximate but robust methods like deep learning and other ideas usually associated with “artificial intelligence.”

The Wisdom of Imperfection Since the modules of an ideal future phenotropic system would be connected through approximate means, using machine vision and other techniques usually associated with artificial intelligence, a lot of the manic, tricky hacking games that go on today wouldn’t even get off the ground. It would be hard, for instance, to inject malware into a computer through a deep learning network, say, by pointing a camera at an image that is supposed to cause the infection. Hard is not the same thing as impossible, but the quest for perfection in security is a fool’s game. To be clear, you can inject malware using an image (it’s done all the time), but it’s only easy to do that when software ingests the image bit by bit and processes it using a precise protocol.

But if an image is ingested only as an analog-style approximation and only analyzed statistically, as if a camera had been pointed at it, then there is much less vulnerability.12 The image is not the problem; the rigidity of the protocol is the problem. Sometimes it’s best when engineers don’t know exactly how software works. The approximate nature of modern algorithms associated with “deep learning” and related terms is inherently resistant to the tricks of the hacker trade, but we apply those capabilities only to performing specialized tasks, not for building architectures. Another way of framing the phenotropic idea is that we should use them in architecture. Just like in biology, security is enhanced when a system becomes robust, which is not the same thing as perfect.


pages: 482 words: 121,173

Tools and Weapons: The Promise and the Peril of the Digital Age by Brad Smith, Carol Ann Browne

"World Economic Forum" Davos, Affordable Care Act / Obamacare, AI winter, air gap, airport security, Alan Greenspan, Albert Einstein, algorithmic bias, augmented reality, autonomous vehicles, barriers to entry, Berlin Wall, Big Tech, Bletchley Park, Blitzscaling, Boeing 737 MAX, business process, call centre, Cambridge Analytica, Celtic Tiger, Charlie Hebdo massacre, chief data officer, cloud computing, computer vision, corporate social responsibility, data science, deep learning, digital divide, disinformation, Donald Trump, Eben Moglen, Edward Snowden, en.wikipedia.org, Hacker News, immigration reform, income inequality, Internet of things, invention of movable type, invention of the telephone, Jeff Bezos, Kevin Roose, Laura Poitras, machine readable, Mark Zuckerberg, minimum viable product, national security letter, natural language processing, Network effects, new economy, Nick Bostrom, off-the-grid, operational security, opioid epidemic / opioid crisis, pattern recognition, precision agriculture, race to the bottom, ransomware, Ronald Reagan, Rubik’s Cube, Salesforce, school vouchers, self-driving car, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, Skype, speech recognition, Steve Ballmer, Steve Jobs, surveillance capitalism, tech worker, The Rise and Fall of American Growth, Tim Cook: Apple, Wargames Reagan, WikiLeaks, women in the workforce

This approach uses statistical methods for pattern recognition, prediction, and reasoning, in effect building systems through algorithms that learn from data. During the last decade, leaps in computer and data science have led to the expanded use of so-called deep learning with neural networks. Our human brains contain neurons with synaptic connections that make possible our ability to discern patterns in the world around us.7 Computer-based neural networks contain computational units referred to as neurons, and they're connected artificially so that AI systems can reason.8 In essence, the deep learning approach feeds huge amounts of relevant data to train a computer to recognize a pattern, using many layers of these artificial neurons.

5. Terrence J. Sejnowski, The Deep Learning Revolution (Cambridge, MA: MIT Press, 2018), 31; in 1986 Eric Horvitz coauthored one of the leading papers that made the case that expert systems would not be scalable. D.E. Heckerman and E.J. Horvitz, "The Myth of Modularity in Rule-Based Systems for Reasoning with Uncertainty," Conference on Uncertainty in Artificial Intelligence, Philadelphia, July 1986; https://dl.acm.org/citation.cfm?id=3023728.
6. Ibid.
7. Charu C. Aggarwal, Neural Networks and Deep Learning: A Textbook (Cham, Switzerland: Springer, 2018), 1.

7. Aggarwal, Neural Networks and Deep Learning: A Textbook (Cham, Switzerland: Springer, 2018), 1. The convergence of intellectual disciplines involved and affected by these developments in recent decades is described in S.J. Gershman, E.J. Horvitz, and J.B. Tenenbaum, Science 349, 273–78 (2015).
8. Aggarwal, Neural Networks and Deep Learning, 1.
9. Ibid., 17–30.
10. See Sejnowski for a thorough history of the developments that have led to advances in neural networks over the past two decades.
11. Dom Galeon, "Microsoft's Speech Recognition Tech Is Officially as Accurate as Humans," Futurism, October 20, 2016, https://futurism.com/microsofts-speech-recognition-tech-is-officially-as-accurate-as-humans/; Xuedong Huang, "Microsoft Researchers Achieve New Conversational Speech Recognition Milestone," Microsoft Research Blog, Microsoft, August 20, 2017, https://www.microsoft.com/en-us/research/blog/microsoft-researchers-achieve-new-conversational-speech-recognition-milestone/.


pages: 337 words: 103,522

The Creativity Code: How AI Is Learning to Write, Paint and Think by Marcus Du Sautoy

3D printing, Ada Lovelace, Albert Einstein, algorithmic bias, AlphaGo, Alvin Roth, Andrew Wiles, Automated Insights, Benoit Mandelbrot, Bletchley Park, Cambridge Analytica, Charles Babbage, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, crowdsourcing, data is the new oil, data science, deep learning, DeepMind, Demis Hassabis, Donald Trump, double helix, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, Flash crash, Gödel, Escher, Bach, Henri Poincaré, Jacquard loom, John Conway, Kickstarter, Loebner Prize, machine translation, mandelbrot fractal, Minecraft, move 37, music of the spheres, Mustafa Suleyman, Narrative Science, natural language processing, Netflix Prize, PageRank, pattern recognition, Paul Erdős, Peter Thiel, random walk, Ray Kurzweil, recommendation engine, Rubik’s Cube, Second Machine Age, Silicon Valley, speech recognition, stable marriage problem, Turing test, Watson beat the top human players on Jeopardy!, wikimedia commons

., ‘Unifying Count-Based Exploration and Intrinsic Motivation’, in Advances in Neural Information Processing Systems, pp. 1471–9, NIPS Proceedings, 2016 Bokde, Dheeraj, Sheetal Girase and Debajyoti Mukhopadhyay, ‘Matrix Factorization Model in Collaborative Filtering Algorithms: A Survey’, Procedia Computer Science, vol. 49, 136–46 (2015) Briot, Jean-Pierre and François Pachet, ‘Music Generation by Deep Learning: Challenges and Directions’, arXiv:1712. 04371 (2017) Briot, Jean-Pierre, Gaëtan Hadjeres and François Pachet, ‘Deep Learning Techniques for Music Generation: A Survey’, arXiv:1709.01620 (2017) Brown, Tom B., et al., ‘Adversarial Patch’, arXiv:1712.09665 (2017) Cavallo, Flaminia, Alison Pease, Jeremy Gow and Simon Colton, ‘Using Theory Formation Techniques for the Invention of Fictional Concepts’, in Proceedings of the Fourth International Conference on Computational Creativity (2013) Clarke, Eric F., ‘Imitating and Evaluating Real and Transformed Musical Performances’, Music Perception: An Interdisciplinary Journal, vol. 10, 317–41 (1993) Colton, Simon, ‘Refactorable Numbers: A Machine Invention’, Journal of Integer Sequences, vol. 2, article 99.1.2 (1999) , ‘The Painting Fool: Stories from Building an Automated Painter’, in Jon McCormack and Mark d’Inverno (eds.), Computers and Creativity, Springer, 2012 and Stephen Muggleton, ‘Mathematical Applications of Inductive Logic Programming’, Machine Learning, vol. 64(1), 25–64 (2006) and Dan Ventura, ‘You Can’t Know My Mind: A Festival of Computational Creativity’, in Proceedings of the Fifth International Conference on Computational Creativity (2014) , et al., ‘The “Beyond the Fence” Musical and “Computer Says Show” Documentary’, in Proceedings of the Seventh International Conference on Computational Creativity (2016) d’Inverno, Mark and Arthur Still, ‘A History of Creativity for Future AI Research’, in Proceedings of the Seventh International Conference on Computational Creativity (2016) du Sautoy, Marcus, ‘Finitely Generated Groups, p-Adic Analytic Groups and Poincaré Series’, Annals of Mathematics, vol. 137, 639–70 (1993) du Sautoy, Marcus, ‘Counting Subgroups in Nilpotent Groups and Points on Elliptic Curves’, J. reine angew.

Interestingly, some of the traits that the model picked out could be clearly identified: for example, action films or drama films. But others were much subtler and had no obvious label, and yet the computer had picked up a trend in the data. For me this is what is so exciting about these new algorithms. They have the potential to tell us something new about ourselves. In a way the deep-learning algorithm is picking up traits in our human code that we still haven’t been able to articulate in words. It’s as if we didn’t know what colour was and had no words to distinguish red from blue, but through the expression of our likes and dislikes the algorithm divided objects in front of us into two groups that correspond to blue and red.
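
As a purely illustrative sketch of how unnamed 'traits' can fall out of nothing but expressed likes and dislikes (not the model described here, which is far larger; this uses classical matrix factorisation via SVD on a tiny invented ratings table):

import numpy as np

# Invented ratings: 4 viewers (rows) x 4 films (columns).
# The first two films are action-heavy, the last two are dramas.
ratings = np.array([
    [5, 4, 1, 1],
    [4, 5, 2, 1],
    [1, 1, 5, 4],
    [2, 1, 4, 5],
], dtype=float)

# Factorise into latent traits: each film gets a coordinate along directions
# the data itself defines, with no labels or words attached to them.
U, S, Vt = np.linalg.svd(ratings - ratings.mean())
film_traits = Vt[:2]            # keep the two strongest traits
print(np.round(film_traits, 2))
# The first trait typically separates the action films from the dramas, even
# though nothing in the data ever names those categories.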

Allen, 2016 Eagleton, Terry, The Ideology of the Aesthetic, Blackwell, 1990 Ford, Martin, The Rise of the Robots: Technology and the Threat of Mass Unemployment, Oneworld, 2015 Fuentes, Agustín, The Creative Spark: How Imagination Made Humans Exceptional, Dutton, 2017 Gaines, James, Evening in the Palace of Reason: Bach Meets Frederick the Great in the Age of Enlightenment, Fourth Estate, 2005 Ganesalingam, Mohan, The Language of Mathematics: A Linguistic and Philosophical Investigation, Springer, 2013 Gaut, Berys and Matthew Kieran (eds.), Creativity and Philosophy, Routledge, 2018 Goodfellow, Ian, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press, 2016 Harari, Yuval Noah, Homo Deus: A Brief History of Tomorrow, Harvill Secker, 2016 Hardy, G. H., A Mathematician’s Apology, CUP, 1940 Harel, David, Computers Ltd: What They Really Can’t Do, OUP, 2000 Hayles, N. Katherine, Unthought: The Power of the Cognitive Nonconscious, University of Chicago Press, 2017 Hofstadter, Douglas, Gödel, Escher, Bach: An Eternal Golden Braid, Penguin Books, 1979 , Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, Basic Books 1995 , I am a Strange Loop, Basic Books, 2007 Kasparov, Garry, Deep Thinking: Where Artificial Intelligence Ends and Human Creativity Begins, John Murray, 2017 McAfee, Andrew and Erik Brynjolfsson, Machine Platform Crowd: Harnessing Our Digital Future, Norton, 2017 McCormack, Jon and Mark d’Inverno (eds.), Computers and Creativity, Springer, 2012 Monbiot, George, Out of the Wreckage: A New Politics for an Age of Crisis, Verso, 2017 Montfort, Nick, World Clock, Bad Quarto, 2013 Moretti, Franco, Graphs, Maps, Trees: Abstract Models for Literary History, Verso, 2005 Paul, Elliot Samuel and Scott Barry Kaufman (eds.), The Philosophy of Creativity: New Essays, OUP, 2014 Shalev-Shwartz, Shai and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, CUP, 2014 Steels, Luc, The Talking Heads Experiment: Origins of Words and Meanings, Language Science Press, 2015 Steiner, Christopher, Automate This: How Algorithms Took Over the Markets, Our Jobs, and the World, Penguin Books, 2012 Tatlow, Ruth, Bach and the Riddle of the Number Alphabet, CUP, 1991 , Bach’s Numbers: Compositional Proportions and Significance, CUP, 2015 Tegmark, Max, Life 3.0: Being Human in the Age of Artificial Intelligence, Allen Lane, 2017 Wilson, Edward O., The Origins of Creativity, Allen Lane, 2017 Yorke, John, Into the Woods: A Five Act Journey into Story, Penguin Books, 2013 Papers For papers with references to arXiv visit the open access archive of papers at https://arxiv.org/.


Calling Bullshit: The Art of Scepticism in a Data-Driven World by Jevin D. West, Carl T. Bergstrom

airport security, algorithmic bias, AlphaGo, Amazon Mechanical Turk, Andrew Wiles, Anthropocene, autism spectrum disorder, bitcoin, Charles Babbage, cloud computing, computer vision, content marketing, correlation coefficient, correlation does not imply causation, crowdsourcing, cryptocurrency, data science, deep learning, deepfake, delayed gratification, disinformation, Dmitri Mendeleev, Donald Trump, Elon Musk, epigenetics, Estimating the Reproducibility of Psychological Science, experimental economics, fake news, Ford Model T, Goodhart's law, Helicobacter pylori, Higgs boson, invention of the printing press, John Markoff, Large Hadron Collider, longitudinal study, Lyft, machine translation, meta-analysis, new economy, nowcasting, opioid epidemic / opioid crisis, p-value, Pluto: dwarf planet, publication bias, RAND corporation, randomized controlled trial, replication crisis, ride hailing / ride sharing, Ronald Reagan, selection bias, self-driving car, Silicon Valley, Silicon Valley startup, social graph, Socratic dialogue, Stanford marshmallow experiment, statistical model, stem cell, superintelligent machines, systematic bias, tech bro, TED Talk, the long tail, the scientific method, theory of mind, Tim Cook: Apple, twin studies, Uber and Lyft, Uber for X, uber lyft, When a measure becomes a target

Connect enough of these perceptrons together in the right ways, and you can build a chess-playing computer, a self-driving car, or an algorithm that translates speech in real time like Douglas Adams’s Babel Fish. You don’t hear the term “perceptron” often these days, but these circuits are the building blocks for the convolutional neural networks and deep learning technologies that appear in headlines daily. The same old magic is still selling tickets. The inventor of the perceptron, Frank Rosenblatt, was a psychologist by training, with broad interests in astronomy and neurobiology. He also had a knack for selling big ideas. While working at the Cornell Aeronautical Laboratory, he used a two-million-dollar IBM 704 computer to simulate his first perceptron.
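
As a minimal sketch of a single perceptron and its error-driven learning rule (not Rosenblatt's IBM 704 simulation, of course; the toy task of learning a logical AND is chosen purely for illustration):

# A perceptron: weighted sum, threshold, and the classic update rule that nudges
# the weights whenever the prediction is wrong.
def predict(weights, bias, x):
    return 1 if (weights[0] * x[0] + weights[1] * x[1] + bias) > 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # logical AND
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                       # a few passes over the data suffice
    for x, target in data:
        error = target - predict(weights, bias, x)
        weights[0] += lr * error * x[0]   # move the decision boundary toward
        weights[1] += lr * error * x[1]   # fewer mistakes on this example
        bias += lr * error

print([predict(weights, bias, x) for x, _ in data])   # [0, 0, 0, 1]

Stack enough of these units into layers and train them on enough data, and you have the convolutional and deep networks described above.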

The newspapers are full of breathless articles gushing about the latest breakthrough that someone promises is just around the corner. AI jobs are paying superstar salaries. Tech firms are wooing professors with AI expertise away from campus. Venture capital firms are throwing money at anyone who can say "deep learning" with a straight face. Here Rosenblatt deserves credit because many of his ambitious predictions have come true. The algorithms and basic architecture behind modern AI—machines that mimic human intelligence—are pretty much the same as he envisioned. Facial recognition technology, virtual assistants, machine translation systems, and stock-trading bots are all built upon perceptron-like algorithms.

Because the data are central to these systems, one rarely needs professional training in computer science to spot unconvincing claims or problematic applications. Most of the time, we don’t need to understand the learning algorithm in detail. Nor do we need to understand the workings of the program that the learning algorithm generates. (In so-called deep learning models, no one—including the creators of the algorithm—really understands the workings of the program that algorithm generates.) All you have to do to spot problems is to think about the training data and the labels that are fed into the algorithm. Begin with bad data and labels, and you’ll get a bad program that makes bad predictions in return.


pages: 280 words: 74,559

Fully Automated Luxury Communism by Aaron Bastani

"Peter Beck" AND "Rocket Lab", Alan Greenspan, Anthropocene, autonomous vehicles, banking crisis, basic income, Berlin Wall, Bernie Sanders, Boston Dynamics, Bretton Woods, Brexit referendum, capital controls, capitalist realism, cashless society, central bank independence, collapse of Lehman Brothers, computer age, computer vision, CRISPR, David Ricardo: comparative advantage, decarbonisation, deep learning, dematerialisation, DIY culture, Donald Trump, double helix, driverless car, electricity market, Elon Musk, energy transition, Erik Brynjolfsson, fake news, financial independence, Francis Fukuyama: the end of history, future of work, Future Shock, G4S, general purpose technology, Geoffrey Hinton, Gregor Mendel, housing crisis, income inequality, industrial robot, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Isaac Newton, James Watt: steam engine, Jeff Bezos, Jeremy Corbyn, Jevons paradox, job automation, John Markoff, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Kuiper Belt, land reform, Leo Hollis, liberal capitalism, low earth orbit, low interest rates, low skilled workers, M-Pesa, market fundamentalism, means of production, mobile money, more computing power than Apollo, new economy, off grid, pattern recognition, Peter H. Diamandis: Planetary Resources, post scarcity, post-work, price mechanism, price stability, private spaceflight, Productivity paradox, profit motive, race to the bottom, rewilding, RFID, rising living standards, Robert Solow, scientific management, Second Machine Age, self-driving car, sensor fusion, shareholder value, Silicon Valley, Simon Kuznets, Slavoj Žižek, SoftBank, stem cell, Stewart Brand, synthetic biology, technological determinism, technoutopianism, the built environment, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, transatlantic slave trade, Travis Kalanick, universal basic income, V2 rocket, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, working-age population

While the world economy may be much bigger now than it was in 1900, employing more people and enjoying far higher output per person, the lines of work nearly everyone performs – drivers, nurses, teachers and cashiers – aren't particularly new.

Actually Existing Automation

In March 2017 Amazon launched its Amazon Go store in downtown Seattle. Using computer vision, deep learning algorithms and sensor fusion to identify selected items, the company looked to build a near fully automated store without cashiers. Here Amazon customers would be able to buy items simply by swiping in with a phone, choosing the things they wanted and swiping out to leave, their purchases automatically debited to their Amazon account.

Before Amazon Go was even announced, the British Retail Consortium predicted almost a third of the country’s 3 million retail jobs would disappear by 2025, resulting in 900,000 lost jobs as companies turn to technology to replace workers. As with self-driving cars and Atlas, all of this is possible because of extreme supply in information – from things like image and range sensors, to stereo cameras, deep learning algorithms, and the ubiquity of smartphones and online accounts. The same holds true elsewhere in the supply chain, from the warehousing robots using sensors and barcodes controlled by a central server, to the autonomous vehicles set to oversee distribution and delivery – whether by vehicle or drone.

Incredibly, it has a self-teaching neural network which constantly adds to its knowledge of how the heart works with each new case it examines. It is in areas such as this where automation will make initial incursions into medicine, boosting productivity by accompanying, rather than replacing, existing workers. Yet such systems will improve with each passing year and some, like ‘godfather of deep learning’ Geoffrey Hinton, believe that medical schools will soon stop training radiologists altogether. Perhaps that is presumptuous – after all, we’d want a level of quality control and maybe even the final diagnosis to involve a human – but even then, this massively upgraded, faster process might need one trained professional where at present there are dozens, resulting in a quicker, superior service that costs less in both time and money.


pages: 269 words: 70,543

Tech Titans of China: How China's Tech Sector Is Challenging the World by Innovating Faster, Working Harder, and Going Global by Rebecca Fannin

"World Economic Forum" Davos, Adam Neumann (WeWork), Airbnb, augmented reality, autonomous vehicles, Benchmark Capital, Big Tech, bike sharing, blockchain, call centre, cashless society, Chuck Templeton: OpenTable:, clean tech, cloud computing, computer vision, connected car, corporate governance, cryptocurrency, data is the new oil, data science, deep learning, Deng Xiaoping, Didi Chuxing, digital map, disruptive innovation, Donald Trump, El Camino Real, electricity market, Elon Musk, fake news, family office, fear of failure, fulfillment center, glass ceiling, global supply chain, Great Leap Forward, income inequality, industrial robot, information security, Internet of things, invention of movable type, Jeff Bezos, Kickstarter, knowledge worker, Lyft, Mark Zuckerberg, Mary Meeker, megacity, Menlo Park, money market fund, Network effects, new economy, peer-to-peer lending, personalized medicine, Peter Thiel, QR code, RFID, ride hailing / ride sharing, Sand Hill Road, self-driving car, sharing economy, Shenzhen was a fishing village, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart transportation, Snapchat, social graph, SoftBank, software as a service, South China Sea, sovereign wealth fund, speech recognition, stealth mode startup, Steve Jobs, stock buybacks, supply-chain management, tech billionaire, TechCrunch disrupt, TikTok, Tim Cook: Apple, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, urban planning, Vision Fund, warehouse automation, WeWork, winner-take-all economy, Y Combinator, young professional

Determined to keep a lead in cutting-edge AI technology, Baidu budgeted $300 million for a second Silicon Valley research lab in 2017, supplementing its first in 2014, and the Beijing-based titan has set up an engineering office in Seattle to focus on autonomous driving and internet security. Baidu has pumped loads of capital into AI startups in the United States with technologies for deep learning, data analytics, and computer vision. See table 2-3. “Having missed out on the social mobile and e-commerce waves of the past few years, Baidu is trying not to repeat the same mistake by going all in on AI, on all fronts,” observes Evdemon of Sinovation Ventures, the Beijing-based venture capital firm headed by AI expert and investor Kai-Fu Lee.

Users open the app and access news through Toutiao's 4,000 media partnerships without following other accounts, unlike Facebook or Twitter. Anu Hariharan, a partner with Y Combinator's Continuity Fund in San Francisco, likens Toutiao to YouTube and technology news aggregator Techmeme in one. She finds the most interesting thing about Toutiao to be how it uses machine- and deep-learning algorithms to serve up personalized, high-quality content without any user inputs, social graphs, or product purchase history to rely on.19

From Sea to Shining Sea

ByteDance has been moving up in recent years with content deals and smart acquisitions, fulfilling founder Zhang's mission of making his startup borderless.

The New York City Police Department is reportedly monitoring citizens using cameras and facial recognition software developed in China, from SenseTime partner Hikvision.1 In the United States, tech giants Google, Microsoft, Amazon, Facebook, and IBM dominate AI for many futuristic and practical uses. Google self-driving cars are being tested on California’s Highway 101; Facebook spins out posts based on deep learning of content preferences; Amazon’s Alexa powers lights, TVs, and speakers by voice activation; and Microsoft’s Azure relies on cognitive computing for speech and language applications, while IBM Watson’s AI-based computer system increases productivity and improves customer service for call centers, production lines, and warehouses.


pages: 626 words: 167,836

The Technology Trap: Capital, Labor, and Power in the Age of Automation by Carl Benedikt Frey

3D printing, AlphaGo, Alvin Toffler, autonomous vehicles, basic income, Bernie Sanders, Branko Milanovic, British Empire, business cycle, business process, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Charles Babbage, Clayton Christensen, collective bargaining, computer age, computer vision, Corn Laws, Cornelius Vanderbilt, creative destruction, data science, David Graeber, David Ricardo: comparative advantage, deep learning, DeepMind, deindustrialization, demographic transition, desegregation, deskilling, Donald Trump, driverless car, easy for humans, difficult for computers, Edward Glaeser, Elon Musk, Erik Brynjolfsson, everywhere but in the productivity statistics, factory automation, Fairchild Semiconductor, falling living standards, first square of the chessboard / second half of the chessboard, Ford Model T, Ford paid five dollars a day, Frank Levy and Richard Murnane: The New Division of Labor, full employment, future of work, game design, general purpose technology, Gini coefficient, Great Leap Forward, Hans Moravec, high-speed rail, Hyperloop, income inequality, income per capita, independent contractor, industrial cluster, industrial robot, intangible asset, interchangeable parts, Internet of things, invention of agriculture, invention of movable type, invention of the steam engine, invention of the wheel, Isaac Newton, James Hargreaves, James Watt: steam engine, Jeremy Corbyn, job automation, job satisfaction, job-hopping, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kickstarter, Kiva Systems, knowledge economy, knowledge worker, labor-force participation, labour mobility, Lewis Mumford, Loebner Prize, low skilled workers, machine translation, Malcom McLean invented shipping containers, manufacturing employment, mass immigration, means of production, Menlo Park, minimum wage unemployment, natural language processing, new economy, New Urbanism, Nick Bostrom, Norbert Wiener, nowcasting, oil shock, On the Economy of Machinery and Manufactures, OpenAI, opioid epidemic / opioid crisis, Pareto efficiency, pattern recognition, pink-collar, Productivity paradox, profit maximization, Renaissance Technologies, rent-seeking, rising living standards, Robert Gordon, Robert Solow, robot derives from the Czech word robota Czech, meaning slave, safety bicycle, Second Machine Age, secular stagnation, self-driving car, seminal paper, Silicon Valley, Simon Kuznets, social intelligence, sparse data, speech recognition, spinning jenny, Stephen Hawking, tacit knowledge, The Future of Employment, The Rise and Fall of American Growth, The Wealth of Nations by Adam Smith, Thomas Malthus, total factor productivity, trade route, Triangle Shirtwaist Factory, Turing test, union organizing, universal basic income, warehouse automation, washing machines reduced drudgery, wealth creators, women in the workforce, working poor, zero-sum game

In particular, the narrow focus meant that the algorithm often lost the broader context. A solution to this problem has been found in so-called deep learning, which uses artificial neural networks with more layers. These advances allow machine translators to better capture the structure of complex sentences. Neural Machine Translation (NMT), as it is called, used to be computationally expensive both in training and in translation inference. But due to the progression of Moore’s Law and the availability of larger data sets, NMT has now become viable. In machine translation, deep learning is not without its own drawbacks. One major challenge relates to the translation of rare words.

Today, some 3.5 million Americans work as cashiers across the country. But if you go to an Amazon Go store, you will not see a single cashier or even a self-service checkout stand. Customers walk in, scan their phones, and walk out with what they need. To achieve this, Amazon is leveraging recent advances in computer vision, deep learning, and sensors that track customers, the items they reach for, and take with them. Amazon then bills the credit card passed through the turnstile when the customer leaves the store and sends the receipt to the Go app. While the rollout of the first Seattle, Washington, prototype store was delayed because of issues with tracking multiple users and objects, Amazon now runs three Go stores in Seattle and another in Chicago, Illinois, and plans to launch another three thousand by 2021.

Agrawal, Joshua Gans, and Avi Goldfarb (Chicago: University of Chicago Press), figure 1. 14. “Germany Starts Facial Recognition Tests at Rail Station,” 2017, New York Post, December 17. 15. N. Coudray et al., 2018, “Classification and Mutation Prediction from Non–Small Cell Lung Cancer Histopathology Images Using Deep Learning,” Nature Medicine 24 (10): 1559–1567. 16. A. Esteva et al., 2017, “Dermatologist-Level Classification of Skin Cancer with Deep Neural Networks,” Nature 542 (7639): 115. 17. W. Xiong et al., 2017, “The Microsoft 2017 Conversational Speech Recognition System,” Microsoft AI and Research Technical Report MSR-TR-2017-39, August, https://www.microsoft.com/en-us/research/wp-content/uploads/2017/08/ms_swbd17-2.pdf. 18.


pages: 328 words: 84,682

The Business of Platforms: Strategy in the Age of Digital Competition, Innovation, and Power by Michael A. Cusumano, Annabelle Gawer, David B. Yoffie

activist fund / activist shareholder / activist investor, Airbnb, AltaVista, Amazon Web Services, AOL-Time Warner, asset light, augmented reality, autonomous vehicles, barriers to entry, bitcoin, blockchain, business logic, Cambridge Analytica, Chuck Templeton: OpenTable:, cloud computing, collective bargaining, commoditize, CRISPR, crowdsourcing, cryptocurrency, deep learning, Didi Chuxing, distributed ledger, Donald Trump, driverless car, en.wikipedia.org, fake news, Firefox, general purpose technology, gig economy, Google Chrome, GPS: selective availability, Greyball, independent contractor, Internet of things, Jeff Bezos, Jeff Hawkins, John Zimmer (Lyft cofounder), Kevin Roose, Lean Startup, Lyft, machine translation, Mark Zuckerberg, market fundamentalism, Metcalfe’s law, move fast and break things, multi-sided market, Network effects, pattern recognition, platform as a service, Ponzi scheme, recommendation engine, Richard Feynman, ride hailing / ride sharing, Robert Metcalfe, Salesforce, self-driving car, sharing economy, Silicon Valley, Skype, Snapchat, SoftBank, software as a service, sovereign wealth fund, speech recognition, stealth mode startup, Steve Ballmer, Steve Jobs, Steven Levy, subscription business, Susan Wojcicki, TaskRabbit, too big to fail, transaction costs, transport as a service, Travis Kalanick, two-sided market, Uber and Lyft, Uber for X, uber lyft, vertical integration, Vision Fund, web application, zero-sum game

Both involve a dramatic change in platform ecosystems.

VOICE WARS: RAPID GROWTH BUT CHAOTIC PLATFORM COMPETITION

Although artificial intelligence has been around for decades, one branch has made exceptional progress: machine learning (using special software algorithms to analyze and learn from data) and the subfield of deep learning (using hardware and software to build massively parallel processors called neural networks to mimic how the brain works). Applications of these technologies have led to dramatic improvements in certain forms of pattern recognition, especially for images and voice. Apple got the world excited about a voice interface when it introduced Siri in 2011.
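As a rough illustration of the pattern recognition described here (and not of any vendor's production system), a minimal Python sketch of a small multi-layer neural network classifying images of handwritten digits might look like this, assuming scikit-learn is available:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# 8x8 grayscale digit images, flattened to 64 numeric features per image.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small multi-layer network: two hidden layers of 64 units each.
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```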

No company in 2018 seemed to have a clear path to making a profit directly from this technology. As we finished this book, it was too early to tell how the voice wars would play out. The market was still like the Wild West—more chaos than order. Between 2017 and 2018, improvements in machine learning and deep learning were creating better voice experiences across all competitors. Google appeared to be the technical leader in AI, with many applications in search, advertisements, and machine translation, among others. Apple, which lagged behind in early benchmarks, was improving quickly, as were the benchmarks for Microsoft’s Cortana and Amazon’s Alexa.4 In 2018, Google had the advantage of hundreds of millions of devices (Android smartphones) that had Google’s voice capabilities embedded.

See self-regulation and curation customer tying, 201 Cusumano, Michael, vii–viii, ix, 10–11, 125, 182 data analyzing customer behavior, 53 captured by Google, 10 increasing transparency of performance data on platform members, 91–92 Predix platform governance and use of data, 165 value of, 51 Zuckerberg on data portability vs. security, 190–91 Davis, Carl, 201 deep learning and machine learning, 220–21 Deliveroo (United Kingdom), 83–84, 195–96 demand-side economies of scale, 249n18. See also network effects differentiation and niche competition overview, 25, 41–42, 44–47 AT&T’s lack of, 36–37 digital technology impact, 54–56 niche companies, 46, 55–56 Digital Foundries for Predix, 164–65 digital platforms diffusion and refinement of, 49 gaining market share, 31 global power of, 9 impact on business, 10–11 instant messaging, 41–42 Yellow Pages compared to, 40–41 digital revolution, 11–12, 107 digital technologies overview, 20, 27–28 impact on platform market drivers, 49–58 and network effects, 50–52 organizational commitment to one technology, 54 direct or same-side network effects, 16, 42–44 Dorsey, Jack, 89 Dubinsky, Donna, 69 D-Wave, 227, 228, 229 EachNet (China), 120–23, 120f eBay competition, 56 EachNet in China, 120–23, 120f fraud prevention by, 93 mistakes in China, 119–20, 121–23, 137, 186 success of, 119 economies of scale demand-side, 249n18 digital technologies and, 57 on innovation platforms, 79–80 network effects based on, 16–17 transitioning from losing money to making money, 137 See also network effects ecosystem companies overview, vii, 4 for GE’s Predix platform, 164–65 global power of, 8–9 market value of, 8 platform ecosystems as barriers to entry, 48 ecosystem rules, establishing and enforcing overview, 85–86 innovation platforms, 86–90 transaction platforms, 90–93 Editas Medicine, 232 emerging platforms.


pages: 291 words: 80,068

Framers: Human Advantage in an Age of Technology and Turmoil by Kenneth Cukier, Viktor Mayer-Schönberger, Francis de Véricourt

Albert Einstein, Andrew Wiles, Apollo 11, autonomous vehicles, Ben Bernanke: helicopter money, Berlin Wall, bitcoin, Black Lives Matter, blockchain, Blue Ocean Strategy, circular economy, Claude Shannon: information theory, cognitive dissonance, cognitive load, contact tracing, coronavirus, correlation does not imply causation, COVID-19, credit crunch, CRISPR, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, deep learning, DeepMind, defund the police, Demis Hassabis, discovery of DNA, Donald Trump, double helix, Douglas Hofstadter, Elon Musk, en.wikipedia.org, fake news, fiat currency, framing effect, Francis Fukuyama: the end of history, Frank Gehry, game design, George Floyd, George Gilder, global pandemic, global village, Gödel, Escher, Bach, Higgs boson, Ignaz Semmelweis: hand washing, informal economy, Isaac Newton, Jaron Lanier, Jeff Bezos, job-hopping, knowledge economy, Large Hadron Collider, lockdown, Louis Pasteur, Mark Zuckerberg, Mercator projection, meta-analysis, microaggression, Mustafa Suleyman, Neil Armstrong, nudge unit, OpenAI, packet switching, pattern recognition, Peter Thiel, public intellectual, quantitative easing, Ray Kurzweil, Richard Florida, Schrödinger's Cat, scientific management, self-driving car, Silicon Valley, Steve Jobs, Steven Pinker, TED Talk, The Structural Transformation of the Public Sphere, Thomas Kuhn: the structure of scientific revolutions, TikTok, Tim Cook: Apple, too big to fail, transaction costs, Tyler Cowen

In Germany: Alexander Frölich, “Rechtsextremisten steuern die Corona-Proteste zum Teil schon,” Der Tagesspiegel, November 16, 2020, https://www.tagesspiegel.de/berlin/berliner-sicherheitsbehoerden-alarmiert-rechtsextremisten-steuern-die-corona-proteste-zum-teil-schon/26627460.html; Tilma Steffen and Ferdinand Otto, “Aktivisten kamen als Gäste der AfD in den Bundestag,” Die Zeit, November 19, 2020, https://www.zeit.de/politik/deutschland/2020-11/bundestag-afd-stoerer-corona-protest-einschleusung. On François Chollet: François Chollet, Deep Learning with Python (Shelter Island, NY: Manning, 2017). See https://blog.keras.io/the-limitations-of-deep-learning.html. In an interview with Kenneth Cukier in February 2021, he elaborated on how to improve “extreme generalization,” or framing: “The way you learn and adapt is by constantly making analogies with past situations and concepts. If you have a very rich and diverse bank of past situations and concepts to leverage, you will be able to make more powerful analogies.”

The machine, as Dennett suggests, can do a lot of calculating with an immense amount of formal logic and processing reams of data, but it cannot frame. Much has changed in AI since Dennett wrote his three scenarios. AI no longer relies on humans feeding abstract rules into machines. Instead, the most popular methods today, such as machine learning and deep learning, involve systems partially self-optimizing from massive amounts of data. But although the process is different, the difficulty hasn’t gone away. Even with lots of training data, when a robot encounters a novel situation like a ticking bomb, it can be at an utter loss. Framing—capturing some essence of reality through a mental model in order to devise an effective course of action—is something humans do and machines cannot.

On a death every three seconds: Joe Myers, “This Is How Many People Antibiotic Resistance Could Kill Every Year by 2050 If Nothing Is Done,” World Economic Forum, September 23, 2016, https://www.weforum.org/agenda/2016/09/this-is-how-many-people-will-die-from-antimicrobial-resistance-every-year-by-2050-if-nothing-is-done/. Coolidge’s son’s infection: Chelsea Follett, “U.S. President’s Son Dies of an Infected Blister?,” HumanProgress, March 1, 2016, https://www.humanprogress.org/u-s-presidents-son-dies-of-an-infected-blister/. AI to identify antibiotics: Jonathan M. Stokes et al., “A Deep Learning Approach to Antibiotic Discovery,” Cell 180, no. 4 (February 20, 2020): 688–702. Barzilay quotes: Regina Barzilay, in an interview with Kenneth Cukier, February and November 2020. On Colin Kaepernick: Eric Reid, “Why Colin Kaepernick and I Decided to Take a Knee,” New York Times, September 25, 2017, https://www.nytimes.com/2017/09/25/opinion/colin-kaepernick-football-protests.html.


pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI by Frank Pasquale

affirmative action, Affordable Care Act / Obamacare, Airbnb, algorithmic bias, Amazon Mechanical Turk, Anthropocene, augmented reality, Automated Insights, autonomous vehicles, basic income, battle of ideas, Bernie Sanders, Big Tech, Bill Joy: nanobots, bitcoin, blockchain, Brexit referendum, call centre, Cambridge Analytica, carbon tax, citizen journalism, Clayton Christensen, collective bargaining, commoditize, computer vision, conceptual framework, contact tracing, coronavirus, corporate social responsibility, correlation does not imply causation, COVID-19, critical race theory, cryptocurrency, data is the new oil, data science, decarbonisation, deep learning, deepfake, deskilling, digital divide, digital twin, disinformation, disruptive innovation, don't be evil, Donald Trump, Douglas Engelbart, driverless car, effective altruism, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fake news, Filter Bubble, finite state, Flash crash, future of work, gamification, general purpose technology, Google Chrome, Google Glasses, Great Leap Forward, green new deal, guns versus butter model, Hans Moravec, high net worth, hiring and firing, holacracy, Ian Bogost, independent contractor, informal economy, information asymmetry, information retrieval, interchangeable parts, invisible hand, James Bridle, Jaron Lanier, job automation, John Markoff, Joi Ito, Khan Academy, knowledge economy, late capitalism, lockdown, machine readable, Marc Andreessen, Mark Zuckerberg, means of production, medical malpractice, megaproject, meta-analysis, military-industrial complex, Modern Monetary Theory, Money creation, move fast and break things, mutually assured destruction, natural language processing, new economy, Nicholas Carr, Nick Bostrom, Norbert Wiener, nuclear winter, obamacare, One Laptop per Child (OLPC), open immigration, OpenAI, opioid epidemic / opioid crisis, paperclip maximiser, paradox of thrift, pattern recognition, payday loans, personalized medicine, Peter Singer: altruism, Philip Mirowski, pink-collar, plutocrats, post-truth, pre–internet, profit motive, public intellectual, QR code, quantitative easing, race to the bottom, RAND corporation, Ray Kurzweil, recommendation engine, regulatory arbitrage, Robert Shiller, Rodney Brooks, Ronald Reagan, self-driving car, sentiment analysis, Shoshana Zuboff, Silicon Valley, Singularitarianism, smart cities, smart contracts, software is eating the world, South China Sea, Steve Bannon, Strategic Defense Initiative, surveillance capitalism, Susan Wojcicki, tacit knowledge, TaskRabbit, technological solutionism, technoutopianism, TED Talk, telepresence, telerobotics, The Future of Employment, The Turner Diaries, Therac-25, Thorstein Veblen, too big to fail, Turing test, universal basic income, unorthodox policies, wage slave, Watson beat the top human players on Jeopardy!, working poor, workplace surveillance , Works Progress Administration, zero day

Rashida Richardson, Jason M. Schultz, and Kate Crawford, “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice,” New York University Law Review 94 (2019): 192–233. 37. Fabio Ciucci, “AI (Deep Learning) Explained Simply,” Data Science Central (blog), November 20, 2018, https://www.datasciencecentral.com/profiles/blogs/ai-deep-learning-explained-simply (“MLs fail when the older data gets less relevant or wrong very soon and often. The task or rules learned must keep the same, or at most rarely updated, so you can re-train.”). 38. Kiel Brennan-Marquez, “Plausible Cause: Explanatory Standards in the Age of Powerful Machines,” Vanderbilt Law Review 70 (2017): 1249–1302. 39.

Certainly narrow AI, designed to make specific predictions, is based on quantifying probability.23 It is but one of many steps taken over the past two decades to modernize medicine with a more extensive evidence base.24 Medical researchers have seized on predictive analytics, big data, artificial intelligence, machine learning, and deep learning as master metaphors for optimizing system performance. Literature in each of these areas can help regulators identify problematic data in AI. Moreover, critiques of the limits of AI itself (including lack of reproducibility, narrow validity, overblown claims, and opaque data) should also inform legal standards.25 The key idea here is that AI’s core competence—helping humans avoid errors—must now be turned on the humans who create AI.

Ajay Agrawal, Joshua Gans, and Avi Goldfarb, Prediction Machines: The Simple Economics of Artificial Intelligence (Cambridge, MA: Harvard Business Review Press, 2018). 10. For example, a self-driving car’s “vision” system may interpret a stop sign as a “45 miles per hour” sign if some pieces of tape are placed on the sign. Kevin Eykholt, Ivan Evtimov, Earlence Fernandes, Bo Li, Amir Rahmati, Chaowei Xiao, Atul Prakash, et al., “Robust Physical-World Attacks on Deep Learning Visual Classification,” arXiv:1707.08945v5 [cs.CR] (2018). 11. Eric Topol, Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again (New York: Basic Books, 2019). 12. Kim Saverno, “Ability of Pharmacy Clinical Decision-Support Software to Alert Users about Clinically Important Drug-Drug Interactions,” Journal of American Medical Informatics Association 18, no. 1 (2011): 32–37. 13.


pages: 442 words: 94,734

The Art of Statistics: Learning From Data by David Spiegelhalter

Abraham Wald, algorithmic bias, Anthropocene, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Brexit referendum, Carmen Reinhart, Charles Babbage, complexity theory, computer vision, confounding variable, correlation coefficient, correlation does not imply causation, dark matter, data science, deep learning, DeepMind, Edmond Halley, Estimating the Reproducibility of Psychological Science, government statistician, Gregor Mendel, Hans Rosling, Higgs boson, Kenneth Rogoff, meta-analysis, Nate Silver, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, seminal paper, sparse data, speech recognition, statistical model, sugar pill, systematic bias, TED Talk, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Two Sigma

Measures that lack value for prediction or classification may be identified by data visualization or regression methods and then discarded, or the number of features may be reduced by forming composite measures that encapsulate most of the information. Recent developments in extremely complex models, such as those labelled as deep learning, suggest that this initial stage of data reduction may not be necessary and the total raw data can be processed in a single algorithm.

Classification and Prediction

A bewildering range of alternative methods are now readily available for building classification and prediction algorithms.
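One standard way to form the composite measures mentioned above is principal component analysis; the excerpt does not name a specific method, so the following is only an illustrative sketch on synthetic data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))                    # 200 cases, 30 raw measures
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=200)    # make two measures nearly redundant

# Keep just enough components to retain 90% of the variance in the raw data.
pca = PCA(n_components=0.9)
Z = pca.fit_transform(X)
print(Z.shape, pca.explained_variance_ratio_.sum())
```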

Neural networks comprise layers of nodes, each node depending on the previous layer by weights, rather like a series of logistic regressions piled on top of each other. Weights are learned by an optimization procedure, and, rather like random forests, multiple neural networks can be constructed and averaged. Neural networks with many layers have become known as deep-learning models: Google’s Inception image-recognition system is said to have over twenty layers and over 300,000 parameters to estimate. K-nearest-neighbour classifies according to the majority outcome among close cases in the training set. The results of applying some of these methods to the Titanic data, with tuning parameters chosen using tenfold cross-validation and ROC as an optimization criterion, are shown in Table 6.4.
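As a hedged sketch of the kind of analysis described (the real Titanic data is not bundled here, so a synthetic stand-in is used), tuning a K-nearest-neighbour classifier by tenfold cross-validation with ROC AUC as the criterion might look like this:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in for the Titanic features: a synthetic binary-outcome dataset.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)

# K-nearest-neighbour with k tuned by tenfold cross-validation, ROC AUC as the criterion.
pipe = make_pipeline(StandardScaler(), KNeighborsClassifier())
search = GridSearchCV(pipe,
                      {"kneighborsclassifier__n_neighbors": [3, 5, 11, 21]},
                      cv=10, scoring="roc_auc")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

The same cross-validation scaffolding can wrap any of the other methods mentioned, from logistic regression to random forests, which is what makes such head-to-head comparisons straightforward.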

data literacy: the ability to understand the principles behind learning from data, carry out basic data analyses, and critique the quality of claims made on the basis of data.
data science: the study and application of techniques for deriving insights from data, including constructing algorithms for prediction. Traditional statistical science forms part of data science, which also includes a strong element of coding and data management.
deep learning: a machine-learning technique that extends standard artificial neural network models to many layers representing different levels of abstraction, say going from individual pixels of an image through to recognition of objects.
dependent events: when the probability of one event depends on the outcome of another event.


Gods and Robots: Myths, Machines, and Ancient Dreams of Technology by Adrienne Mayor

AlphaGo, Any sufficiently advanced technology is indistinguishable from magic, Asilomar, autonomous vehicles, caloric restriction, caloric restriction, classic study, deep learning, driverless car, Elon Musk, industrial robot, Islamic Golden Age, Jacquard loom, life extension, Menlo Park, Nick Bostrom, Panopticon Jeremy Bentham, popular electronics, self-driving car, Silicon Valley, Stephen Hawking, Thales and the olive presses, Thales of Miletus, theory of mind, TikTok, Turing test

Made, Not Born
1. The Robot and the Witch: Talos and Medea
2. Medea’s Cauldron of Rejuvenation
3. The Quest for Immortality and Eternal Youth
4. Beyond Nature: Enhanced Powers Borrowed from Gods and Animals
5. Daedalus and the Living Statues
6. Pygmalion’s Living Doll and Prometheus’s First Humans
7. Hephaestus: Divine Devices and Automata
8. Pandora: Beautiful, Artificial, Evil
9. Between Myth and History: Real Automata and Lifelike Artifices in the Ancient World
EPILOGUE. Awe, Dread, Hope: Deep Learning and Ancient Stories
Glossary
Notes
Bibliography
Index
ILLUSTRATIONS
COLOR PLATES
1. Death of Talos
2. Jason uses a tool to destroy Talos
3. Foundry workers making a statue of an athlete
4. Blacksmith at work with tools
5. Medea rejuvenates a ram in her cauldron
6.

Bpk Bildagentur / Photo by Johannes Laurentius / Antikensammlung, Staatliche Museen, Berlin / Art Resource, NY. PLATE 14 (FIG. 8.7). Detail, Pandora admired by gods and goddesses, on the red-figure calyx krater by the Niobid Painter, about 460 BC, inv. 1856,1213.1. © The Trustees of the British Museum.

EPILOGUE. Awe, Dread, Hope: Deep Learning and Ancient Stories

Ancient myths articulated timeless hopes and fears about artificial life, human limits, and immortality. What could we—and Artificial Intelligence—learn from the classical tales? The mix of exuberance and anxiety aroused by a blurring of the lines between nature and machines might seem a uniquely modern response to the juggernaut of scientific progress in the age of technology.

As we saw, Prometheus warned humankind that Pandora’s jar should never be opened. Are Stephen Hawking, Elon Musk, Bill Gates, and other prescient thinkers the Promethean Titans of our era? They have warned scientists to halt or at least slow the reckless pursuit of AI, because they foresee that once it is set in motion, humans will be unable to control it. “Deep learning” algorithms allow AI computers to extract patterns from vast data, extrapolate to novel situations, and decide on actions with no human guidance. Inevitably AI entities will ask—and answer—questions of their own devising. Computers have already developed altruism and deceit on their own. Will AI become curious to discover hidden knowledge and make decisions by its own logic?


The Internet Trap: How the Digital Economy Builds Monopolies and Undermines Democracy by Matthew Hindman

A Declaration of the Independence of Cyberspace, accounting loophole / creative accounting, activist fund / activist shareholder / activist investor, AltaVista, Amazon Web Services, barriers to entry, Benjamin Mako Hill, bounce rate, business logic, Cambridge Analytica, cloud computing, computer vision, creative destruction, crowdsourcing, David Ricardo: comparative advantage, death of newspapers, deep learning, DeepMind, digital divide, discovery of DNA, disinformation, Donald Trump, fake news, fault tolerance, Filter Bubble, Firefox, future of journalism, Ida Tarbell, incognito mode, informal economy, information retrieval, invention of the telescope, Jeff Bezos, John Perry Barlow, John von Neumann, Joseph Schumpeter, lake wobegon effect, large denomination, longitudinal study, loose coupling, machine translation, Marc Andreessen, Mark Zuckerberg, Metcalfe’s law, natural language processing, Netflix Prize, Network effects, New Economic Geography, New Journalism, pattern recognition, peer-to-peer, Pepsi Challenge, performance metric, power law, price discrimination, recommendation engine, Robert Metcalfe, search costs, selection bias, Silicon Valley, Skype, sparse data, speech recognition, Stewart Brand, surveillance capitalism, technoutopianism, Ted Nelson, The Chicago School, the long tail, The Soul of a New Machine, Thomas Malthus, web application, Whole Earth Catalog, Yochai Benkler

Google has even built new globally distributed database systems called Spanner and F1, in which operations across different data centers are synced using atomic clocks.22 The latest iteration of Borg, Google’s cluster management system, coordinates “hundreds of thousands of jobs, from many thousands of different applications, across a number of clusters each with up to tens of thousands of machines.”23 In recent years Google’s data centers have expanded their capabilities in other ways, too. As Google has increasingly focused on problems like computer vision, speech recognition, and natural language processing, it has worked to deploy deep learning, a variant of neural network methods. Google’s investments in deep learning have been massive and multifaceted, including (among other things) major corporate acquisitions and the development of the TensorFlow high-level programming toolkit.24 But one critical component has been the development of a custom computer chip built specially for machine learning.
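For readers unfamiliar with TensorFlow, a minimal example of its high-level Keras API is shown below; it is purely illustrative and has nothing to do with Google's internal models, data centers, or TPU deployments:

```python
import tensorflow as tf

# A small feed-forward image classifier built with TensorFlow's Keras API.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# MNIST ships with Keras; flatten the 28x28 images and scale pixels to [0, 1].
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train, epochs=1, batch_size=128)
```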

INDEX Italic pages refer to figures and tables ABC News, 75 A/B testing: attention economy and, 13; personalization and, 43, 53, 57; nature of the internet and, 170, 174–76; news and, 150–54, 158; recommendation systems and, 43; tilted playing field and, 27–28, 31 Accelerated Mobile Platform (AMP), 80, 144, 179 activism, 48, 103, 169 Adams, William, 65 Adaptive Semantics, 19 advertising: attention economy and, 3–4, 7, 11, 13; branding and, 28–32, 36, 86, 166; click-through rates and, 56–57; cost per thousand impressions (CPM) and, 69; economic geography and, 3–4, 63, 67–69, 80; Google and, 68–69; inversion of, 135, 178; local vs. national per-person rates and, 68; models and, 181–84; nature of internet and, 163–64, 168, 176, 178; new economics of, 67–69; news and, 102, 109, 129, 134–35, 138, 140–46, 163–64, 168, 176, 178; newspapers and, 11, 28, 57, 68–69, 102, 129, 135, 138, 140, 142–44, 164, 178–79; political economy and, 38, 40–41, 48, 56–58, 60; property targeting and, 56–57; remnant, 69; Super Bowl and, 68; television and, 68; tilted playing field and, 15, 17, 25, 28–36, 30 African Americans, 123–25, 126, 191 agglomeration, 9, 63, 82–83 aggregation, 65–67, 76–77 AIG, 86 Ajax, 34 algorithms: black box problem and, 52; deep learning and, 21; filters and, 39, 43, 48, 54, 60; K-nearest neighbor, 44–45, 54; lessons from, 48–53, 59–61; nature of internet and, 178; need for more data and, 50–51, 61; news and, 147, 151–52; personalization and, 39–44, 48–54, 60–61; principle component analysis (PCA) and, 46; Restricted Boltzmann Machine, 46; ridge regression, 46; root mean-squared error (RMSE) and, 43–44, 47–48, 50; Russian hackers and, 178; search costs and, 41–43; singular value decomposition (SVD), 45, 58–59.

and, 1, 3 auctions, 3, 42, 86, 101 audience reach: attention economy and, 11–14; categories of content and, 165; churn and, 9, 84–85, 88–95, 100–101, 163–64, 167; compounded, 133; Darwinism and, 13, 136, 165–67, 169, 203n9; distribution of, 6, 12–13, 84–85, 88–100, 112, 114, 155, 167–69, 171, 179, 185; evolutionary model and, 164–67; false solutions and, 137–46; growth patterns in, 84, 88, 91, 95–96, 100; headlines and, 13, 32, 36, 38, 107, 147, 149–50, 154–57, 160–61; logarithms and, 88–96, 97, 100, 184–86; methodology and, 184–92; mobile devices and, 2–4, 13, 39, 69, 109, 137, 142–44, 147, 152, 160, 165, 167, 170, 179; nature of internet and, 162–80; news and, 104–18, 121–22, 126–30, 133–39, 142–49, 152, 154, 157–61, 169; overlap and, 67, 76–77, 108, 110; paywalls and, 132, 137–40, 147, 160; public sphere and, 10–11, 13–14, 99, 169; rankings and, 7, 25, 31, 54, 84–96, 100, 110–12, 136, 157, 186; recommendation systems and, 60 (see also recommendation systems); stability and, 165; traffic and, 83–96, 99–101, 104–11, 114–18, 121, 129, 134, 169, 186–89; unique visitors and, 87–88, 106–11, 128, 134 AWS, 153, 168, 203n28 Ayres, Ian, 3 Bagdikian, Benjamin, 171 Bai, Matt, 7 Bankler, Yochai, 12, 170 Banko, Michele, 51 Bank of America, 9 Barlow, John Perry, 162–63, 176 BBC, 32 Beam, Christopher, 35 behavioral targeting, 55–58 Being Digital (Negroponte), 38 Bell, Robert, 44–49 bell curve, 92–93, 95 BellKor, 44–49 Bellman, Steven, 34 Bell Telephone, 16–17 Berners-Lee, Tim, 3 Bezos, Jeff, 72, 139, 147, 150 Bieschke, Eric, 38 BigChaos, 47 BigTable, 21, 23 Bing: attention economy and, 3; economic geography and, 79; Experimentation System and, 28; experiments and, 24–25, 28, 31; market share of, 3, 30–31, 195n63; nature of internet and, 174; news and, 32, 61, 134; page views and, 24; revenue and, 24, 31, 70; tilted playing field and, 24, 28, 30–32 “Bing It On” (Microsoft campaign), 31 black box problem, 52 blogs: attention economy and, 7–9, 13; Darwinism and, 136; economic geography and, 77; methodology and, 189; nature of internet and, 169; news and, 121, 133, 136–37, 155–56, 159; personalization and, Index 45, 50; political, 136; tilted playing field and, 17, 25, 35; Webb and, 45 Boczkowski, Pablo, 70–71 Borg, 21, 23 Boston, 113–14 Bosworth, Andrew, 162 Bowman, Douglas, 25 Box, George, 64 Bradshaw, Samantha, 177 Branch, John, 152 Brand, Stewart, 164 branding, 28–32, 36, 86, 166 Brexit, 58 Brill, Eric, 51 Broadband penetration, 124, 126, 190 browsers, 2, 24–25, 34, 107, 143, 175, 195n63 budget constraints, 72, 181 Buffet, Warren, 102 bundling, 65–67, 76–77, 197n8, 198n10, 198n14 Buzzfeed, 137, 145, 149–51, 159–60, 168 Caffeine, 21, 23 Cambridge Analytica, 40, 58, 59 Campus Network, 35–36 capitalism, 20, 85, 162, 176 Census Bureau, 190 Center for Public Integrity, 140 Chancellor, Joseph, 58 Chandler, Alfred, 20 Chartbeat, 107, 149, 153 Chicago, 113, 141 China, 80, 177–78, 193n5 Christie’s, 42 Chrome, 24–25, 145 churn, 9, 84–85, 88–95, 100–1, 163–64, 167 Chyri, Iris, 130 CineMatch, 43–44, 46, 50 Clauset, Aaron, 184 clickbait, 150 click-through rates, 56–57 cloud computing, 34, 153–54, 168, 203n28 CNN, 9, 32, 39, 72, 107, 159 Colossus, 21, 23 Columbia Journalism Review, 119, 148 Comcast, 172 comment systems, 19 comparative advantage, 62–63, 80, 82 • 227 competition: attention economy and, 1, 3–4, 6, 9, 11, 13, 165–66; bundling and, 67; economic geography and, 64, 67, 74, 78; nature of internet and, 164–70, 173–75; news and, 104, 114–15, 135, 137, 146, 149, 159, 164–70, 173–75; personalization and, 40; 
political economy and, 40–53, 60; search engines and, 1; social networks and, 35–36; tilted playing field and, 16–17, 21–22, 26–37; traffic and, 83, 86–87, 101 comScore: attention economy and, 10; nature of internet and, 187–88; news and, 104–10, 113, 116, 119–22, 127, 128–29; traffic and, 87, 199n19 concentration: attention economy and, 2–9, 13; dispersion and, 6, 42, 100, 200n23; economic geography and, 63–64, 68, 78, 80–81; forces of, 5–8, 19, 30, 61, 64, 80, 184; lopsided understanding of internet and, 5–8; markets and, 9, 30, 68, 78, 85–88, 99–100, 104, 114–15, 122, 127–30, 171, 184, 199n15; methodology and, 184; nature of internet and, 164, 171–72, 179; news and, 104, 114–15, 122, 126–28, 130, 134; personalization and, 39, 61; revenue and, 2–4, 8, 68, 171, 179; tilted playing field and, 19, 30, 32; traffic and, 7–8, 30, 32, 63, 83–88, 96, 99–101, 104, 122, 171, 199n15 Congress, 104, 141–42 conservatives, 32, 48, 69, 72, 75, 131, 175 consumption: attention economy and, 6, 10; branding and, 28–32, 36, 86, 166; bundling and, 65–67, 76–77, 197n8, 198n10, 198n14; digital content production and, 71–80; economic geography and, 63–68, 70–76, 79; experience goods and, 29–30; methodology and, 181–84; models and, 181–84; nature of internet and, 164–65, 168, 172–75; news and, 110, 113, 122, 125–29, 143, 149; personalization and, 42–43, 50, 57–58; preferences and, 5, 8, 32, 43, 54, 63–65, 69–81, 181–84, 198n41; price discrimination and, 66, 139; switching costs and, 8, 34, 63, 72, 78–79, 164; tilted playing field and, 17, 22–26, 29–34, 37; traffic and, 86; unique visitors and, 87–88, 106–11, 128, 134 cookies, 107 cost per thousand impressions (CPM), 69 court system, 6, 129, 148 creative destruction, 84, 167 228 • Index CSS, 143 CU Community, 35–36 DailyKos, 75 Daily Me, The, 38–39, 61 Daily You, The (Turow), 39 Dallas Morning News, 118 dark fiber, 21 Darwinism, 13, 136, 165–67, 169, 203n9 data centers, 2, 12, 15–16, 20–23, 54 data mining, 59, 135 data packets, 22, 171 “Declaration of the Independence of Cyberspace, A” (Barlow), 162–63 deep learning, 21 DeepMind, 23, 194n24 democracy, 7, 70, 103–4, 163, 175–78, 180 DeNardis, Laura, 171 Department of Justice, 115, 130 Des Moines Register, 136 Detroit, 83 digital content production model, 71–80 Dinosaur Planet, 46–47, 49 disinformation, 177–78 dispersion, 6, 42, 100, 200n23 distribution costs, 12–13, 155, 167–69, 179 diversity: economic geography and, 8, 63–64, 79; methodology and, 189, 191; nature of internet and, 180; news and, 104–5, 115, 118, 129–30, 149; personalization and, 53, 60–61 DNAinfo, 141 Dremel, 27 Duarte, Matias, 27 duopolies, 4, 30, 42, 68, 180 Earl, Jennifer, 169 Easterbrook, Frank, 175 eBay, 3, 26, 39, 42, 86–87, 175 economic geography: advertising and, 3–4, 63, 67–69, 80; agglomeration and, 9, 63, 82–83; aggregation and, 65–67, 76–77; Apple and, 79–80; apps and, 80; auctions and, 3, 42, 86, 101; audience reach and, 87, 104, 106–11, 114–18, 121, 129, 134, 169, 186–89; Bing and, 79; blogs and, 77; bundling and, 65–67, 76–77, 197n8, 198n10, 198n14; comparative advantage and, 62–63, 80; competition and, 64, 67, 74, 78; concentration and, 63–64, 68, 78, 80–81; consumption and, 63–67, 70–76, 79; customers and, 17, 24, 30, 33, 50, 57, 68, 149, 164, 172, 174; digital content production and, 71–80; dispersion and, 6, 42, 100, 200n23; distribution costs and, 12–13, 155, 167–69, 179; diversity and, 8, 63–64, 79; economies of scale and, 63–67, 74, 76, 79, 81; efficiency and, 63, 65, 67–68; Facebook and, 68, 79–80; Google and, 68–69, 79–80; 
government subsidies and, 133, 137, 140–42; hyperlocal sites and, 68, 77–78, 81, 101–4, 119, 121, 130–37, 164, 180; increasing returns and, 36–37, 63–64, 80–81, 181, 184; international trade and, 5, 62, 80; investment and, 73; journalism and, 78, 81; Krugman and, 6, 62–63, 80; lock-in and, 34–37, 61, 101, 173; long tails and, 8, 70, 84, 184, 199n15; markets and, 63–68, 74–75, 77–78; media preferences and, 69–71; Microsoft and, 65–66; models and, 63–65, 69–81, 198n27, 198n41; Netflix and, 70; networks and, 68–69; news and, 65–80; newspapers and, 65, 68–69, 78; page views and, 24, 87, 106, 108–18, 121, 125–29, 151, 157, 160–61, 188–89, 200n18; paywalls and, 107, 109, 132, 137–40, 147, 160; preferences and, 5, 8, 32, 43, 54, 63–65, 69–81, 181–84, 198n41; profit and, 73–77, 81; protectionism and, 174; quality and, 72–78, 81, 83–84; revenue and, 63, 65–68, 73–75, 79; search costs and, 8, 30, 34, 37, 41–43, 63, 72–74, 168, 181–82; search engines and, 64, 79–81; software and, 65–66; stickiness and, 74; subscriptions and, 65, 67; television and, 66, 70; traffic and, 63, 77–81; video and, 69, 76; Yahoo!


pages: 1,409 words: 205,237

Architecting Modern Data Platforms: A Guide to Enterprise Hadoop at Scale by Jan Kunigk, Ian Buss, Paul Wilkinson, Lars George

Amazon Web Services, barriers to entry, bitcoin, business intelligence, business logic, business process, cloud computing, commoditize, computer vision, continuous integration, create, read, update, delete, data science, database schema, Debian, deep learning, DevOps, domain-specific language, fault tolerance, Firefox, FOSDEM, functional programming, Google Chrome, Induced demand, information security, Infrastructure as a Service, Internet of things, job automation, Kickstarter, Kubernetes, level 1 cache, loose coupling, microservices, natural language processing, Network effects, platform as a service, single source of truth, source of truth, statistical model, vertical integration, web application

Data scientists typically also need extensive experience with SQL as a tool to drill down into the datasets that they require to build statistical models, via SparkSQL, Hive, or Impala.

Machine learning and deep learning

Simply speaking, machine learning is where the rubber of big data analytics hits the road. While certainly a hyped term, machine learning goes beyond classic statistics, with more advanced algorithms that predict an outcome by learning from the data—often without explicitly being programmed. The most advanced methods in machine learning, referred to as deep learning, are able to automatically discover the relevant data features for learning, which essentially enables use cases like computer vision, natural language processing, or fraud detection for any corporation.
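The kind of SQL drill-down described above might look like the following SparkSQL sketch; the table path and column names are hypothetical, invented only for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("feature-exploration").getOrCreate()

# Hypothetical dataset location and schema, purely for illustration.
events = spark.read.parquet("/data/warehouse/events")
events.createOrReplaceTempView("events")

# The sort of SQL drill-down a data scientist might run before building a model.
per_customer = spark.sql("""
    SELECT customer_id,
           COUNT(*)          AS n_events,
           AVG(basket_value) AS avg_basket
    FROM events
    GROUP BY customer_id
""")
per_customer.show(5)
```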

Many, if not most, enterprises have already embarked on their data-driven journeys and are making serious investments in hardware, software, and services. The big data market is projected to continue growing apace, reaching somewhere in the region of $90 billion of annual revenue by 2025. Related markets, such as deep learning and artificial intelligence, that are enabled by data platforms are also set to see exponential growth over the next decade. The move to Hadoop, and to modern data platforms in general, has coincided with a number of secular trends in enterprise IT, a selection of which are discussed here. Some of these trends are directly caused by the focus on big data, but others are a result of a multitude of other factors, such as the desire to reduce software costs, consolidate and simplify IT operations, and dramatically reduce the time to procure new hardware and resources for new use cases.

It is now generally accepted that, for storage and data processing, the right way to scale a platform is to do so horizontally using distributed clusters of commodity (which does not necessarily mean the cheapest) servers rather than vertically with ever more powerful machines. Although some workloads, such as deep learning, are more difficult to distribute and parallelize, they can still benefit from plenty of machines with lots of cores, RAM, and GPUs, and the data to drive such workloads will be ingested, cleaned, and prepared in horizontally scalable environments.

Adoption of Open Source

Although proprietary software will always have its place, enterprises have come to appreciate the benefits of placing open source software at the center of their data strategies, with its attendant advantages of transparency and data freedom.


pages: 533

Future Politics: Living Together in a World Transformed by Tech by Jamie Susskind

3D printing, additive manufacturing, affirmative action, agricultural Revolution, Airbnb, airport security, algorithmic bias, AlphaGo, Amazon Robotics, Andrew Keen, Apollo Guidance Computer, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, Bertrand Russell: In Praise of Idleness, Big Tech, bitcoin, Bletchley Park, blockchain, Boeing 747, brain emulation, Brexit referendum, British Empire, business process, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, cashless society, Cass Sunstein, cellular automata, Citizen Lab, cloud computing, commons-based peer production, computer age, computer vision, continuation of politics by other means, correlation does not imply causation, CRISPR, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, digital divide, digital map, disinformation, distributed ledger, Donald Trump, driverless car, easy for humans, difficult for computers, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Erik Brynjolfsson, Ethereum, ethereum blockchain, Evgeny Morozov, fake news, Filter Bubble, future of work, Future Shock, Gabriella Coleman, Google bus, Google X / Alphabet X, Googley, industrial robot, informal economy, intangible asset, Internet of things, invention of the printing press, invention of writing, Isaac Newton, Jaron Lanier, John Markoff, Joseph Schumpeter, Kevin Kelly, knowledge economy, Large Hadron Collider, Lewis Mumford, lifelogging, machine translation, Metcalfe’s law, mittelstand, more computing power than Apollo, move fast and break things, natural language processing, Neil Armstrong, Network effects, new economy, Nick Bostrom, night-watchman state, Oculus Rift, Panopticon Jeremy Bentham, pattern recognition, payday loans, Philippa Foot, post-truth, power law, price discrimination, price mechanism, RAND corporation, ransomware, Ray Kurzweil, Richard Stallman, ride hailing / ride sharing, road to serfdom, Robert Mercer, Satoshi Nakamoto, Second Machine Age, selection bias, self-driving car, sexual politics, sharing economy, Silicon Valley, Silicon Valley startup, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart contracts, Snapchat, speech recognition, Steve Bannon, Steve Jobs, Steve Wozniak, Steven Levy, tech bro, technological determinism, technological singularity, technological solutionism, the built environment, the Cathedral and the Bazaar, The Structural Transformation of the Public Sphere, The Wisdom of Crowds, Thomas L Friedman, Tragedy of the Commons, trolley problem, universal basic income, urban planning, Watson beat the top human players on Jeopardy!, work culture , working-age population, Yochai Benkler

An unsupervised machine can therefore be used to ‘discover knowledge’, that is, to make connections of which its human programmers were totally unaware.36 In reinforcement learning, the machine is given ‘rewards’ and ‘punishments’ telling it whether what it did was right or wrong. The machine self-improves. Many of the advances described in this chapter, particularly those involving images, speech, and text, are the result of so-called ‘deep learning’ techniques that use ‘neural networks’ inspired by the structure of animal brains. Google launched one in 2012, integrating 1,000 large computers with more than a billion connections. This computer was presented with 10 million ‘random’ images from YouTube videos. It was not told what to look for, and the images were not labelled.
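The reward-and-punishment loop described here can be made concrete with a toy example; the sketch below is not from the book, and the action labels and reward probabilities are invented. It shows an epsilon-greedy agent that self-improves purely from reward feedback:

```python
import random

# Toy reinforcement learning: the agent learns which of three actions pays off
# most often, using only reward (1.0) and punishment (0.0) signals.
true_reward = {"a": 0.2, "b": 0.5, "c": 0.8}   # hidden from the agent
estimates = {action: 0.0 for action in true_reward}
counts = {action: 0 for action in true_reward}

for step in range(5000):
    if random.random() < 0.1:                          # occasionally explore
        action = random.choice(list(true_reward))
    else:                                              # otherwise exploit the best estimate
        action = max(estimates, key=estimates.get)
    reward = 1.0 if random.random() < true_reward[action] else 0.0
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]   # running average

print(max(estimates, key=estimates.get))               # almost always "c"
```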

After three days, one unit had learned to identify human faces and another had learned to respond to images of a cat’s face (this is YouTube after all).37 Engineers at Google now use ‘duelling’ neural networks to train each other: one AI system creates realistic images while a second AI system plays the role of critic, trying to work out whether they’re fake or real.38 The rapid increase in the use of deep learning can be seen from the AI systems used in games. The version of Deep Blue that beat Garry Kasparov at chess in 1997 was programmed with many general principles of good play. What’s most remarkable about AlphaGo Zero—the latest and most powerful incarnation of the Go-playing AI systems—however, is that it ‘learned’ not by playing against the very best humans or even learning from human play, but by playing against itself over and over again, starting from completely random moves, and rapidly improving over time.39 Machine learning has been around for a while.
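The ‘duelling’ networks described here are generally known as generative adversarial networks; the following is a minimal, purely illustrative Python (PyTorch) sketch in which a generator tries to mimic a one-dimensional data distribution while a critic learns to tell real samples from fake ones:

```python
import torch
import torch.nn as nn

# Duelling networks on toy data: "real" samples come from N(3, 1).
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))               # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid()) # critic
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) + 3.0          # real data
    fake = G(torch.randn(64, 8))             # generated data

    # Train the critic to score real data as 1 and generated data as 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator to fool the critic.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# The generated samples' mean should drift toward 3 as training progresses.
print(float(fake.mean()), float(fake.std()))
```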

Cade Metz, ‘Google’s Dueling Neural Networks Spar to Get Smarter, No Humans Required’, Wired, 11 April 2017 <https://www.wired.com/2017/04/googles-dueling-neural-networks-spar-get-smarter-no-humans-required/> (accessed 28 November 2017). 39. Silver et al., ‘Mastering’. 40. Domingos, Master Algorithm, 7. 41. Neil Lawrence, quoted in Alex Hern, ‘Why Data is the New Coal’, The Guardian, 27 September 2016 <https://www.theguardian.com/technology/2016/sep/27/data-efficiency-deep-learning> (accessed 28 November 2017). 42. Ray Kurzweil, The Singularity is Near (New York: Viking, 2005), 127, cited in Susskind and Susskind, Future of the Professions, 157; Peter H. Diamandis and Steven Kotler, Abundance: The Future is Better Than You Think (New York: Free Press, 2014), 55. 43. Paul Mason, Postcapitalism: A Guide to Our Future (London: Allen Lane, 2015), 121. 44.


pages: 439 words: 131,081

The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World by Max Fisher

2021 United States Capitol attack, 4chan, A Declaration of the Independence of Cyberspace, Airbnb, Bellingcat, Ben Horowitz, Bernie Sanders, Big Tech, Bill Gates: Altair 8800, bitcoin, Black Lives Matter, call centre, centre right, cloud computing, Comet Ping Pong, Computer Lib, coronavirus, COVID-19, crisis actor, crowdsourcing, dark pattern, data science, deep learning, deliberate practice, desegregation, disinformation, domesticated silver fox, Donald Trump, Douglas Engelbart, Douglas Engelbart, end-to-end encryption, fake news, Filter Bubble, Future Shock, game design, gamification, George Floyd, growth hacking, Hacker Conference 1984, Hacker News, hive mind, illegal immigration, Jeff Bezos, John Perry Barlow, Jon Ronson, Joseph Schumpeter, Julian Assange, Kevin Roose, lockdown, Lyft, Marc Andreessen, Mark Zuckerberg, Max Levchin, military-industrial complex, Oklahoma City bombing, Parler "social media", pattern recognition, Paul Graham, Peter Thiel, profit maximization, public intellectual, QAnon, recommendation engine, ride hailing / ride sharing, Rutger Bregman, Saturday Night Live, Sheryl Sandberg, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Snapchat, social distancing, Social Justice Warrior, social web, Startup school, Stephen Hawking, Steve Bannon, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Susan Wojcicki, tech billionaire, tech worker, Ted Nelson, TED Talk, TikTok, Uber and Lyft, uber lyft, Whole Earth Catalog, WikiLeaks, Y Combinator

In a 2016 paper, Google’s engineers announced a “fundamental paradigm shift” to a new kind of machine learning they called “deep learning.” In the earlier A.I., an automated system had built the programs that picked videos. But, as with the spam-catching A.I.s, humans oversaw that system, intervening as it evolved to guide it and make changes. Now, deep learning was sophisticated enough to assume that oversight job, too. As a result, in most cases, “there’s going to be no humans actually making algorithmic tweaks, measuring those tweaks, and then implementing those tweaks,” the head of an agency that developed talent for YouTube wrote in an article deciphering the deep-learning paper. “So, when YouTube claims they can’t really say why the algorithm does what it does, they probably mean that very literally.”

It was as if Coca-Cola stocked a billion soda machines with some A.I.-designed beverage without a single human checking the bottles’ contents—and if the drink-filling A.I. was programmed only to boost sales, without regard for health or safety. As one of YouTube’s deep-learning engineers told an industry conference, “Product tells us that we want to increase this metric, and then we go and increase it.” The average user’s time on the platform skyrocketed. The company estimated that 70 percent of its time on site, an astronomical share of its business, was the result of videos pushed by its algorithm-run recommendation system.

Most were young, save one: the suit-wearing Bolsonaro, whom Dominguez first saw as a guest on one of Moura’s videos. It was 2016. At the time, Bolsonaro, a longtime lawmaker in Brazil’s version of the U.S. House of Representatives, was shunned even in his own party. But YouTube, chasing its billion-hour watch-time goal, had just installed its new, deep-learning A.I. In Brazil, far-right YouTubers—Bolsonaro’s real party—saw their exposure skyrocket. “It all started from there,” said Dominguez, now a lanky eighteen-year-old with glasses and a ponytail, calling YouTube the new home of the Brazilian right. The recommendation algorithm had “woken up Brazilians,” he said.


pages: 404 words: 95,163

Amazon: How the World’s Most Relentless Retailer Will Continue to Revolutionize Commerce by Natalie Berg, Miya Knights

3D printing, Adam Neumann (WeWork), Airbnb, Amazon Robotics, Amazon Web Services, asset light, augmented reality, Bernie Sanders, big-box store, business intelligence, cloud computing, Colonization of Mars, commoditize, computer vision, connected car, deep learning, DeepMind, digital divide, Donald Trump, Doomsday Clock, driverless car, electronic shelf labels (ESLs), Elon Musk, fulfillment center, gig economy, independent contractor, Internet of things, inventory management, invisible hand, Jeff Bezos, Kiva Systems, market fragmentation, new economy, Ocado, pattern recognition, Ponzi scheme, pre–internet, QR code, race to the bottom, random stow, recommendation engine, remote working, Salesforce, sensor fusion, sharing economy, Skype, SoftBank, Steve Bannon, sunk-cost fallacy, supply-chain management, TaskRabbit, TechCrunch disrupt, TED Talk, trade route, underbanked, urban planning, vertical integration, warehouse automation, warehouse robotics, WeWork, white picket fence, work culture

Addresses the perennial headache that is online returns, while driving footfall to Kohl’s. We expect this to be rolled out internationally. No

2018 | Amazon Go | Retail | First checkout-free store. Shoppers scan their Amazon app to enter. The high-tech convenience store uses a combination of computer vision, sensor fusion and deep learning to create a frictionless customer experience. No

2019 and beyond | Fashion or furniture stores would be a logical next step

NOTE: Amazon Go officially opened its doors to the public in 2018
SOURCE: Amazon; author research as of June 2018

However, it was Amazon’s rather ironic launch of physical bookstores in 2015 that marked a genuine shift in strategy, as this was the first time Amazon mimicked digital merchandising and pricing in a physical setting.

Autonomous computing

Where connectivity and interfaces have, to date, been hardware-based, the third global technology driver is predicated on the development of increasingly ‘intelligent’ software that can almost think for itself and come up with answers to questions without necessarily being programmed with the necessary information. Instead, autonomous computing systems can cross-reference and correlate disparate data sources, augment their own algorithms, and answer complex ‘what if?’ sorts of questions. As such, AI, including machine learning and deep learning techniques, could not exist without autonomous computing development as the last global technology driver. AI development is, in fact, responsible for many of the functional computing advances of the last 15 years, from search algorithms, spam filters and fraud prevention systems to self-driving vehicles and smart personal assistants.

In all of these applications, it can use the massive computing power of its AWS division to crunch billions of data points in support of testing a variety of options and outcomes to rapidly work out what will and won’t cost-effectively work with customers. McKinsey estimates put the proportion of Amazon purchases driven by product recommendations at 35 per cent.3 In 2016, it made its AI framework, DSSTNE (pronounced as ‘destiny’) free, to help expand the ways deep learning can extend beyond speech and language understanding and object recognition to areas such as search and recommendations. The decision to open source DSSTNE also demonstrates when Amazon recognizes the need to collaborate over making gains with the vast potential of AI. On the Amazon site, these recommendations can be personalized, based on categories and ranges previously searched or browsed, to increase conversion.
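As a hedged illustration of item-to-item recommendation in general (not Amazon's actual system or DSSTNE), the following sketch scores unseen products for a customer by cosine similarity over a tiny, made-up interaction matrix:

```python
import numpy as np

# Tiny illustrative interaction matrix: rows = customers, columns = products,
# value 1 if the customer browsed or bought the product. Entirely invented data.
interactions = np.array([
    [1, 1, 0, 0, 1],
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 1, 1, 1, 0],
], dtype=float)

# Item-item cosine similarity.
norms = np.linalg.norm(interactions, axis=0, keepdims=True)
sim = (interactions.T @ interactions) / (norms.T @ norms + 1e-9)

# Score unseen items for customer 0 by similarity to items they already have.
customer = interactions[0]
scores = sim @ customer
scores[customer > 0] = -np.inf            # don't re-recommend items already seen
print("recommend product index:", int(np.argmax(scores)))
```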


pages: 234 words: 67,589

Internet for the People: The Fight for Our Digital Future by Ben Tarnoff

4chan, A Declaration of the Independence of Cyberspace, accounting loophole / creative accounting, Alan Greenspan, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic management, AltaVista, Amazon Web Services, barriers to entry, Bernie Sanders, Big Tech, Black Lives Matter, blue-collar work, business logic, call centre, Charles Babbage, cloud computing, computer vision, coronavirus, COVID-19, decentralized internet, deep learning, defund the police, deindustrialization, desegregation, digital divide, disinformation, Edward Snowden, electricity market, fake news, Filter Bubble, financial intermediation, future of work, gamification, General Magic , gig economy, God and Mammon, green new deal, independent contractor, information asymmetry, Internet of things, Jeff Bezos, Jessica Bruder, John Markoff, John Perry Barlow, Kevin Roose, Kickstarter, Leo Hollis, lockdown, lone genius, low interest rates, Lyft, Mark Zuckerberg, means of production, Menlo Park, natural language processing, Network effects, Nicholas Carr, packet switching, PageRank, pattern recognition, pets.com, profit maximization, profit motive, QAnon, recommendation engine, rent-seeking, ride hailing / ride sharing, Sheryl Sandberg, Shoshana Zuboff, side project, Silicon Valley, single-payer health, smart grid, social distancing, Steven Levy, stock buybacks, supply-chain management, surveillance capitalism, techlash, Telecommunications Act of 1996, TikTok, transportation-network company, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, undersea cable, UUNET, vertical integration, Victor Gruen, web application, working poor, Yochai Benkler

Market for cloud infrastructure services: As of Q4 2020, AWS has 32 percent of the global market for cloud infrastructure services, followed by Microsoft Azure at 20 percent and Google Cloud at 9 percent, according to Felix Richter, “Amazon Leads $130-Billion Cloud Market,” Statista, February 4, 2021. 108, These dynamics accelerated … The 2010s saw the revival of neural networks under the banner of “deep learning,” which involves the use of many-layered networks. This revival was made possible by a number of factors, foremost among them advances in computing power and the abundance of training data that could be sourced from the internet. Deep learning is the paradigm that underlies much of what is currently known as “artificial intelligence,” and has centrally contributed to significant breakthroughs in computer vision and natural language processing. See Andrey Kurenkov, “A Brief History of Neural Nets and Deep Learning,” Skynet Today, September 27, 2020, and Alex Hanna et al., “Lines of Sight,” Logic, December 20, 2020. 109, The sophistication of these systems … “Data imperative”: Marion Fourcade and Kieran Healy, “Seeing Like a Market,” Socio-Economic Review 15, no. 1 (2017): 9–29. 110, The same individual … Smartphone usage: “Mobile Fact Sheet,” April 7, 2021, Pew Research Center.


pages: 419 words: 109,241

A World Without Work: Technology, Automation, and How We Should Respond by Daniel Susskind

"World Economic Forum" Davos, 3D printing, agricultural Revolution, AI winter, Airbnb, Albert Einstein, algorithmic trading, AlphaGo, artificial general intelligence, autonomous vehicles, basic income, Bertrand Russell: In Praise of Idleness, Big Tech, blue-collar work, Boston Dynamics, British Empire, Capital in the Twenty-First Century by Thomas Piketty, cloud computing, computer age, computer vision, computerized trading, creative destruction, David Graeber, David Ricardo: comparative advantage, deep learning, DeepMind, Demis Hassabis, demographic transition, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, driverless car, drone strike, Edward Glaeser, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, fake news, financial innovation, flying shuttle, Ford Model T, fulfillment center, future of work, gig economy, Gini coefficient, Google Glasses, Gödel, Escher, Bach, Hans Moravec, income inequality, income per capita, industrial robot, interchangeable parts, invisible hand, Isaac Newton, Jacques de Vaucanson, James Hargreaves, job automation, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Joi Ito, Joseph Schumpeter, Kenneth Arrow, Kevin Roose, Khan Academy, Kickstarter, Larry Ellison, low skilled workers, lump of labour, machine translation, Marc Andreessen, Mark Zuckerberg, means of production, Metcalfe’s law, natural language processing, Neil Armstrong, Network effects, Nick Bostrom, Occupy movement, offshore financial centre, Paul Samuelson, Peter Thiel, pink-collar, precariat, purchasing power parity, Ray Kurzweil, ride hailing / ride sharing, road to serfdom, Robert Gordon, Sam Altman, Second Machine Age, self-driving car, shareholder value, sharing economy, Silicon Valley, Snapchat, social intelligence, software is eating the world, sovereign wealth fund, spinning jenny, Stephen Hawking, Steve Jobs, strong AI, tacit knowledge, technological solutionism, TED Talk, telemarketer, The Future of Employment, The Rise and Fall of American Growth, the scientific method, The Theory of the Leisure Class by Thorstein Veblen, The Wealth of Nations by Adam Smith, Thorstein Veblen, Travis Kalanick, Turing test, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, upwardly mobile, warehouse robotics, Watson beat the top human players on Jeopardy!, We are the 99%, wealth creators, working poor, working-age population, Y Combinator

Int-$ (the “international dollar”) is a hypothetical currency that tries to take account of different price levels across different countries. 29.  For instance, Daron Acemoglu and Pascual Restrepo, “Artificial Intelligence, Automation and Work” in Ajay Agrawal, Joshua Gans, and Avi Goldfarb, eds., Economics of Artificial Intelligence (Chicago: Chicago University Press, 2018). 30.  Dayong Wang, Aditya Khosla, Rishab Gargeya, et al., “Deep Learning for Identifying Metastatic Breast Cancer,” https://arxiv.org, arXiv:1606.05718 (2016). 31.  Maura Grossman and Gordon Cormack, “Technology-Assisted Review in e-Discovery Can Be More Effective and More Efficient than Exhaustive Manual Review,” Richmond Journal of Law and Technology 17, no. 3 (2011). 32.  

Though by no means limited to diagnosis. See Eric Topol, “High-Performance Medicine: The Convergence of Human and Artificial Intelligence,” Nature 25 (2019), 44–56, for a broader overview of the uses of AI in medicine. 41.  Jeffrey De Fauw, Joseph Ledsam, Bernardino Romera-Paredes, et al., “Clinically Applicable Deep Learning for Diagnosis and Referral in Retinal Disease,” Nature Medicine 24 (2018), 1342–50. 42.  Pallab Ghosh, “AI Early Diagnosis Could Save Heart and Cancer Patients,” BBC News, 2 January 2018. 43.  Echo Huang, “A Chinese Hospital Is Betting Big on Artificial Intelligence to Treat Patients,” Quartz, 4 April 2018. 44.  

Davis, Abe, Michael Rubinstein, Neal Wadhwa, et al. “The Visual Microphone: Passive Recovery of Sound from Video.” ACM Transactions on Graphics (TOG) 33, no. 4 (2014). Dawkins, Richard. The Blind Watchmaker. London: Penguin Books, 2016. De Fauw, Jeffrey, Joseph Ledsam, Bernardino Romera-Paredes, et al. “Clinically Applicable Deep Learning for Diagnosis and Referral in Retinal Disease.” Nature Medicine 24 (2018): 1342–50. Deloitte. “From Brawn to Brains: The Impact of Technology on Jobs in the UK” (2015). Deming, David. “The Growing Importance of Social Skills in the Labor Market.” Quarterly Journal of Economics 132, no. 4 (2017): 1593–1640.


pages: 501 words: 114,888

The Future Is Faster Than You Think: How Converging Technologies Are Transforming Business, Industries, and Our Lives by Peter H. Diamandis, Steven Kotler

Ada Lovelace, additive manufacturing, Airbnb, Albert Einstein, AlphaGo, Amazon Mechanical Turk, Amazon Robotics, augmented reality, autonomous vehicles, barriers to entry, Big Tech, biodiversity loss, bitcoin, blockchain, blood diamond, Boston Dynamics, Burning Man, call centre, cashless society, Charles Babbage, Charles Lindbergh, Clayton Christensen, clean water, cloud computing, Colonization of Mars, computer vision, creative destruction, CRISPR, crowdsourcing, cryptocurrency, data science, Dean Kamen, deep learning, deepfake, DeepMind, delayed gratification, dematerialisation, digital twin, disruptive innovation, Donald Shoup, driverless car, Easter island, Edward Glaeser, Edward Lloyd's coffeehouse, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, Ethereum, ethereum blockchain, experimental economics, fake news, food miles, Ford Model T, fulfillment center, game design, Geoffrey West, Santa Fe Institute, gig economy, gigafactory, Google X / Alphabet X, gravity well, hive mind, housing crisis, Hyperloop, impact investing, indoor plumbing, industrial robot, informal economy, initial coin offering, intentional community, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the telegraph, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, Joseph Schumpeter, Kevin Kelly, Kickstarter, Kiva Systems, late fees, Law of Accelerating Returns, life extension, lifelogging, loss aversion, Lyft, M-Pesa, Mary Lou Jepsen, Masayoshi Son, mass immigration, megacity, meta-analysis, microbiome, microdosing, mobile money, multiplanetary species, Narrative Science, natural language processing, Neal Stephenson, Neil Armstrong, Network effects, new economy, New Urbanism, Nick Bostrom, Oculus Rift, One Laptop per Child (OLPC), out of africa, packet switching, peer-to-peer lending, Peter H. Diamandis: Planetary Resources, Peter Thiel, planned obsolescence, QR code, RAND corporation, Ray Kurzweil, RFID, Richard Feynman, Richard Florida, ride hailing / ride sharing, risk tolerance, robo advisor, Satoshi Nakamoto, Second Machine Age, self-driving car, Sidewalk Labs, Silicon Valley, Skype, smart cities, smart contracts, smart grid, Snapchat, SoftBank, sovereign wealth fund, special economic zone, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Jurvetson, Steven Pinker, Stewart Brand, supercomputer in your pocket, supply-chain management, tech billionaire, technoutopianism, TED Talk, Tesla Model S, Tim Cook: Apple, transaction costs, Uber and Lyft, uber lyft, unbanked and underbanked, underbanked, urban planning, Vision Fund, VTOL, warehouse robotics, Watson beat the top human players on Jeopardy!, We wanted flying cars, instead we got 140 characters, X Prize

Once we develop this capability, the term “turbo-boost” doesn’t even come close. Yet, we’re already close. AI in the cloud provides the necessary power for JARVIS-like performance. Blending Xiaoice’s conversational friendliness with AlphaGo Zero’s decision-making precision takes this even further. Add in the latest deep learning developments and you get a system that is starting to be able to think for itself. Is it JARVIS? Not yet. But it’s JARVIS-lite—and yet another reason why technological acceleration is itself accelerating. Networks: Networks are means of transportation. They’re how goods, services, and, more critically, information and innovation, move from Point A to Point B.

Online retailers are also in the mix, with Amazon acquiring the 3-D body-scanning startup Body Labs in 2017 as a way to make bespoke clothing just another feature available through Prime Wardrobe. As far as an AI fashion advisor goes, those too are here, courtesy of both Alibaba and Amazon. During their annual Singles’ Day shopping festival, Alibaba’s FashionAI concept store uses deep learning to make suggestions based on advice from human fashion experts and store inventory, driving a significant portion of the day’s $25 billion in sales. Similarly, Amazon’s shopping algorithm makes personalized clothing recommendations based on user preferences and social media behavior. And the VR system itself?

Already, affective computing is seeping into e-learning, where AIs adjust the presentation style if the learner gets bored; robotic caregiving, where it improves the quality of robo-nursing; and social monitoring, like a car that engages additional safety measures should the driver become angry. But its biggest impact is on entertainment, where things are getting personal. Facial expressions, hand gestures, eye gaze, vocal tone, head movement, speech frequency and duration are all signals thick with emotional information. By coupling next generation sensors with deep learning techniques, we can read these signals and employ them to analyze a user’s mood. And the basic technology is here. Affectiva, a startup created by Rosalind Picard, the head of MIT’s Affective Computing Group, is an emotional recognition platform used by both the gaming and the marketing industry.
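
To make the affective-computing idea concrete, here is a minimal sketch, assuming PyTorch, of the kind of classifier such a system might rest on: a small network mapping a vector of emotion-related signals to a mood label. The signal names, sizes, and data are invented for illustration; Affectiva's actual platform is far more elaborate and is not described here.

```python
# Minimal sketch (not Affectiva's pipeline): a small feed-forward network that
# maps a vector of affect-related signals (for example, facial-action-unit
# activations and prosody features) to a handful of mood labels.
# Feature counts, label counts, and data below are illustrative placeholders.
import torch
import torch.nn as nn

NUM_SIGNALS = 12   # hypothetical: 8 facial action units + 4 vocal features
NUM_MOODS = 4      # hypothetical: neutral, happy, frustrated, bored

model = nn.Sequential(
    nn.Linear(NUM_SIGNALS, 64),
    nn.ReLU(),
    nn.Linear(64, NUM_MOODS),
)

# Synthetic stand-in data: 256 observations of sensor signals with mood labels.
x = torch.randn(256, NUM_SIGNALS)
y = torch.randint(0, NUM_MOODS, (256,))

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):          # tiny training loop on the synthetic data
    logits = model(x)
    loss = loss_fn(logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At inference time, a new signal vector is mapped to a predicted mood index.
new_reading = torch.randn(1, NUM_SIGNALS)
predicted_mood = model(new_reading).argmax(dim=1)
print(predicted_mood.item())
```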


pages: 301 words: 89,076

The Globotics Upheaval: Globalisation, Robotics and the Future of Work by Richard Baldwin

agricultural Revolution, Airbnb, AlphaGo, AltaVista, Amazon Web Services, Apollo 11, augmented reality, autonomous vehicles, basic income, Big Tech, bread and circuses, business process, business process outsourcing, call centre, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, commoditize, computer vision, Corn Laws, correlation does not imply causation, Credit Default Swap, data science, David Ricardo: comparative advantage, declining real wages, deep learning, DeepMind, deindustrialization, deskilling, Donald Trump, Douglas Hofstadter, Downton Abbey, Elon Musk, Erik Brynjolfsson, facts on the ground, Fairchild Semiconductor, future of journalism, future of work, George Gilder, Google Glasses, Google Hangouts, Hans Moravec, hiring and firing, hype cycle, impulse control, income inequality, industrial robot, intangible asset, Internet of things, invisible hand, James Watt: steam engine, Jeff Bezos, job automation, Kevin Roose, knowledge worker, laissez-faire capitalism, Les Trente Glorieuses, low skilled workers, machine translation, Machine translation of "The spirit is willing, but the flesh is weak." to Russian and back, manufacturing employment, Mark Zuckerberg, mass immigration, mass incarceration, Metcalfe’s law, mirror neurons, new economy, optical character recognition, pattern recognition, Ponzi scheme, post-industrial society, post-work, profit motive, remote working, reshoring, ride hailing / ride sharing, Robert Gordon, Robert Metcalfe, robotic process automation, Ronald Reagan, Salesforce, San Francisco homelessness, Second Machine Age, self-driving car, side project, Silicon Valley, Skype, Snapchat, social intelligence, sovereign wealth fund, standardized shipping container, statistical model, Stephen Hawking, Steve Jobs, supply-chain management, systems thinking, TaskRabbit, telepresence, telepresence robot, telerobotics, Thomas Malthus, trade liberalization, universal basic income, warehouse automation

The deep answer is Moore’s law and Gilder’s law have shifted into their eruptive growth phases when it comes to machine translation. WHY NOW? THE DEEP LEARNING TAKEOVER For a decade, hundreds of Google engineers made incremental progress on translation using the traditional, hands-on approach. In February 2016, Google’s AI maharishi, Jeff Dean, turned the Google Translate team on to Google’s homegrown machine-learning technique called Deep Learning. The job required huge amounts of computer muscle, but Google had that thanks to Moore’s law. The missing link was the data. That changed in 2016 when the United Nations (UN) posted online a data set with nearly 800,000 documents that had been manually translated into the six official UN languages: Arabic, English, Spanish, French, Russian, and Chinese.
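
The reason a release like the UN's parallel corpus matters is that each aligned sentence pair becomes one supervised training example for a neural translation system. The sketch below, in plain Python with invented sentences, shows only that data-preparation step; it is not Google's pipeline, which also handles sentence alignment, subword vocabularies, and a full encoder-decoder network.

```python
# Illustrative sketch only: how an aligned ("parallel") corpus becomes supervised
# training examples for a neural translation model. The sentences are invented.
from collections import Counter

english = [
    "the committee adopted the resolution",
    "the meeting was adjourned",
]
french = [
    "le comite a adopte la resolution",
    "la seance est levee",
]

# 1. Pair line i of the source text with line i of the target text.
pairs = list(zip(english, french))

# 2. Build word-level vocabularies (real systems use subword units such as BPE).
def build_vocab(sentences):
    counts = Counter(word for s in sentences for word in s.split())
    # indices 0 and 1 are reserved for padding and unknown tokens
    return {word: idx for idx, (word, _) in enumerate(counts.most_common(), start=2)}

src_vocab = build_vocab(english)
tgt_vocab = build_vocab(french)

# 3. Convert each pair into integer sequences, the form in which a
#    sequence-to-sequence (encoder-decoder) network is actually trained.
def encode(sentence, vocab):
    return [vocab.get(word, 1) for word in sentence.split()]

training_examples = [(encode(en, src_vocab), encode(fr, tgt_vocab)) for en, fr in pairs]
print(training_examples[0])
```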

Job Destruction Is the Business Model We should listen to Andrew Ng. He is one of the intellectual high priests of digital technology. He was the chief scientist at the Chinese online search giant Baidu, leading over a thousand researchers. Before that, he worked at Google developing the company’s breakthrough machine-learning approach, called Deep Learning. This is the thing behind many of Google’s wonders including its self-driving cars. As if all that wasn’t enough for one person’s career, when he was a professor at Stanford University, he co-founded the online education platform Coursera. His YouTube lecture on AI has been watched over 1.5 million times.


Know Thyself by Stephen M Fleming

Abraham Wald, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, autism spectrum disorder, autonomous vehicles, availability heuristic, backpropagation, citation needed, computer vision, confounding variable, data science, deep learning, DeepMind, Demis Hassabis, Douglas Hofstadter, Dunning–Kruger effect, Elon Musk, Estimating the Reproducibility of Psychological Science, fake news, global pandemic, higher-order functions, index card, Jeff Bezos, l'esprit de l'escalier, Lao Tzu, lifelogging, longitudinal study, meta-analysis, mutually assured destruction, Network effects, patient HM, Pierre-Simon Laplace, power law, prediction markets, QWERTY keyboard, recommendation engine, replication crisis, self-driving car, side project, Skype, Stanislav Petrov, statistical model, theory of mind, Thomas Bayes, traumatic brain injury

Fleming, Marta Saez Garcia, Gabriel Weindel, and Karen Davranche. “Revealing Subthreshold Motor Contributions to Perceptual Confidence.” Neuroscience of Consciousness 2019, no. 1 (2019): niz001. Gal, Yarin, and Zoubin Ghahramani. “Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning,” arXiv.org, October 4, 2016. Galvin, Susan J., John V. Podd, Vit Drga, and John Whitmore. “Type 2 Tasks in the Theory of Signal Detectability: Discrimination Between Correct and Incorrect Decisions.” Psychonomic Bulletin & Review 10, no. 4 (2003): 843–876. Garrison, Jane R., Emilio Fernandez-Egea, Rashid Zaman, Mark Agius, and Jon S.

International Journal of Law and Psychiatry 62 (2019): 56–76. Kelley, W. M., C. N. Macrae, C. L. Wyland, S. Caglar, S. Inati, and T. F. Heatherton. “Finding the Self? An Event-Related fMRI Study.” Journal of Cognitive Neuroscience 14, no. 5 (2002): 785–794. Kendall, Alex, and Yarin Gal. “What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?” arXiv.org, October 5, 2017. Kentridge, R. W., and C. A. Heywood. “Metacognition and Awareness.” Consciousness and Cognition 9, no. 2 (2000): 308–312. Kepecs, Adam, Naoshige Uchida, Hatim A. Zariwala, and Zachary F. Mainen. “Neural Correlates, Computation and Behavioural Impact of Decision Confidence.”

Samaha, Jason, Missy Switzky, and Bradley R. Postle. “Confidence Boosts Serial Dependence in Orientation Estimation.” Journal of Vision 19, no. 4 (2019): 25. Samek, Wojciech, Grégoire Montavon, Andrea Vedaldi, Lars Kai Hansen, and Klaus-Robert Müller. Explainable AI: Interpreting, Explaining and Visualizing Deep Learning. Cham, Switzerland: Springer, 2019. Schäfer, Anton Maximilian, and Hans-Georg Zimmermann. “Recurrent Neural Networks Are Universal Approximators.” International Journal of Neural Systems 17, no. 4 (2007): 253–263. Schechtman, Marya. The Constitution of Selves. Ithaca, NY: Cornell University Press, 1996.


pages: 428 words: 121,717

Warnings by Richard A. Clarke

"Hurricane Katrina" Superdome, active measures, Albert Einstein, algorithmic trading, anti-communist, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, Bear Stearns, behavioural economics, Bernie Madoff, Black Monday: stock market crash in 1987, carbon tax, cognitive bias, collateralized debt obligation, complexity theory, corporate governance, CRISPR, cuban missile crisis, data acquisition, deep learning, DeepMind, discovery of penicillin, double helix, Elon Musk, failed state, financial thriller, fixed income, Flash crash, forensic accounting, friendly AI, Hacker News, Intergovernmental Panel on Climate Change (IPCC), Internet of things, James Watt: steam engine, Jeff Bezos, John Maynard Keynes: Economic Possibilities for our Grandchildren, knowledge worker, Maui Hawaii, megacity, Mikhail Gorbachev, money market fund, mouse model, Nate Silver, new economy, Nicholas Carr, Nick Bostrom, nuclear winter, OpenAI, pattern recognition, personalized medicine, phenotype, Ponzi scheme, Ray Kurzweil, Recombinant DNA, Richard Feynman, Richard Feynman: Challenger O-ring, risk tolerance, Ronald Reagan, Sam Altman, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, smart grid, statistical model, Stephen Hawking, Stuxnet, subprime mortgage crisis, tacit knowledge, technological singularity, The Future of Employment, the scientific method, The Signal and the Noise by Nate Silver, Tunguska event, uranium enrichment, Vernor Vinge, WarGames: Global Thermonuclear War, Watson beat the top human players on Jeopardy!, women in the workforce, Y2K

DARPA (the Defense Advanced Research Projects Agency), whose mission is to ensure that the U.S. military is “the initiator and not the victim of strategic technological surprises,”19 launched a program for “explainable AI.” “Machine learning and deep learning algorithms . . . we don’t fully understand today how they work.” The new explainable-AI initiative “will give the human operator more details about how the machine used deep learning to come up with the answer.”20 In 2015, business tycoons Elon Musk and Sam Altman created the OpenAI Institute, a nonprofit company that focuses on researching AI. Musk and Altman believe that by making all of OpenAI’s findings open-source and funding it by private donations, eliminating the need for financial return, they can ensure that AI will be developed for the benefit of all people, not for self-interested or destructive aims.

Then, about as quickly as they slipped, the markets recovered. Later investigations suggested that errors in the autonomous algorithms of high-frequency traders were at least partly to blame. While AI has fundamentally shifted life on Wall Street, so too is it changing Main Street in new and potentially profound ways. Andrew Ng, the father of deep learning (a branch of machine learning that attempts to mirror human cognition), believes that focusing on the threat from superintelligence is misplaced. Ng is behind Google Brain, one of the most aspirational AI systems yet, and he feels that “worrying about AI evil Superintelligence today is like worrying about overpopulation on the planet Mars.

See also Upper Big Branch Mine disaster Cassandra system, 122, 125, 133, 137–38, 140–41 fatality rate, 123–24, 125–28 federal regulations, 124–30, 137–39 federal research program, 124–25 history of, 122–23 institutional refusal and, 137–42 Coastal wetlands, 41, 42–44 Coastal Wetlands Planning, Protection, and Restoration Act (CWPPRA), 43–44 Coding errors, 366–67 Cognitive biases, 34–35, 171–72 heuristics and, 189–91 Cognitive style, 14–15 Cold and the Dark, The: The World after Nuclear War (Sagan, Ehrlich, and Kennedy), 273–74 Cold Start doctrine, 264–65, 267, 270 Cold War, 25–26, 267–68, 271–74, 277–78 Collateralized debt obligations (CDOs), 147–48 Columbia University, 237, 238 Coming Plague, The (Garrett), 232 Complexity, and vulnerabilities, 366–67 Complexity Mismatch, 116, 178–79, 215, 299 Comprehensive Nuclear-Test-Ban Treaty (CTBT), 266 Computers in Crisis (Murray), 193–94 Conference on the Long-Term Worldwide Biological Consequences of Nuclear War (1983), 273 Congressional oversight committees, 355 Consensus science, 172–73 Continuous miners, 131–32 Conventional wisdom, 28, 355 Coplan, Jeremy, 186 Corvette hacks, 297–98 Cosmic Catastrophes (Morrison and Chapman), 302, 303, 304–5, 308–9, 312, 314–15, 319 Cost-benefit analysis, 361–62 Countervalue strike, 275, 278–79 Cowardice, 180 Cox, Jeff, 150 Cretaceous-Paleogene extinction event, 307–9 Crichton, Michael, 172–73 Crick, Francis, 328 Crimea, 285 CRISPR, 231–32, 326, 327, 329–49 CRISPR/Cas9, 326, 330–49, 360, 366–67 CRISPR Therapeutics, 333 Critical infrastructure protection (CIP), 287 Critics, 168, 170, 186–88 Crittenden, Gary, 143–44, 156 Crocker, Ryan, 73 Crocker’s Rules, 208–9 Cuban Missile Crisis, 26, 274 Cybersecurity, 283–300 Cynomolgus monkeys, 334–35 Daniel, 2 DARPA (Defense Advanced Research Projects Agency), 210, 382n Darwin, Charles, 325 Data, 36–37, 184 “Decay heat,” 85 Decision makers (the audience), 168, 170, 176–82, 380n false Cassandras and, 191–98 making the same mistakes, 189–91 responses, 358–64 scanning for problems, 354–56 Deep Impact (movie), 313–14 Deep learning, 210, 212 Demon core, 83 Deutsche Bank, 157 Devil’s advocates, 359, 379n DiBartolomeo, Dan, 105–6 Diffusion of responsibility, 176–77, 215, 235, 321, 348 Dinosaurs, 307–9 DiPascali, Frank, 107 Disembodied AI, 207 DNA, 326, 327–28, 336–37 Dole, Bob, 28–29 Dot-com bubble, 147 Doudna, Jennifer, 326–30, 335–36, 338–41, 343, 345, 346–49, 360 Drijfhout, Sybren, 253 Duchenne muscular dystrophy, 332 Duelfer, Charles, 30–31 Eagles, the (band), 305 Earth Institute, 238 Earthquake preparedness, 352–53 Ebola virus, 3, 219–20 Edwards, Edwin, 43 Eemian interglacial, 249, 250 Eggers, Dave, 39 Egypt, 59, 63, 66–67 Ehrlich, Paul, 192–93 Ein-Dor, Tsachi, 13, 186, 380n Einstein, Albert, 185 Eisman, Steve, 149, 152 Electricity Information Sharing and Analysis Center (E-ISAC), 287 Electric Power Research Institute, 286 Electromagnetic pulse (EMP), 274, 352 Embodied AI, 206 EMCON (emissions control), 29–30 Empire State Building, 260 Empirical method, 36, 184, 185 Energy policy, 243–44 Enron, 152 Enthoven, Alain, 361 Epidemic Intelligence Service, 354–55 Epidemic That Never Was, The (Neustadt), 196–97 EQ (emotional quotient), 183 Erasmus Medical Center, 222 Ermarth, Fritz, 27 Erroneous Consensus, 172–73 Ethics of AI growth, 205–6 of gene editing, 334, 339–40, 343 Eugenics, 342, 344 Evolution, 329–33 Expert Political Judgment (Tetlock), 13–15 Explainable AI, 210 Fairfield Greenwich Group, 108, 113 Fallujah, 68, 69 False Cassandras, 191–98 Famines, 192 Farmington Mine 
disaster, 127–28 Farson, Richard, 175 “Fast-failure” review, 357 Fatalism, 2 Fate of the States (Whitney), 153 Federal Bureau of Investigation (FBI), 8, 100, 112, 115 Federal Deposit Insurance Corporation (FDIC), 160 Federal Emergency Management Agency (FEMA), 40, 46–48, 51, 53–54, 323–24 Hurricane Pam exercise, 40, 47–49 Federal Reserve Bank, 159 Feedback loops, 16, 192–93 Fermi, Enrico, 373n Feynman, Richard, 240 Figueres, Christiana, 247 Financial Crisis Inquiry Commission, 162 Financial crisis of 2008, 143–65 Madoff fraud and SEC, 118–19 primary cause of, 147–48 Whitney and, 143–46, 148–50, 156–60 Flash Crash of 2010, 211 Fletcher, Charles, 256–57 Flood Control Act of 1928, 42 Flood Control Act of 1965, 46 Flu pandemic of 1918, 195, 198, 217, 221–24 Flu pandemic of 2009, 217–18, 221–22 Forbes, 154 Ford, Gerald, 196–97 Ford, Robert, 57–74 aid to Syrian opposition, 62–63, 64–65 ambassadorship in Egypt, 67 ambassadorship in Syria, 57–58 departure from Syria, 60–62 warning and prediction of, 64–74 Foreign Service, U.S., 57, 58, 67 Fortune, 146, 148–49, 161 Fossil fuels, 16, 42, 257–58.


pages: 521 words: 118,183

The Wires of War: Technology and the Global Struggle for Power by Jacob Helberg

"World Economic Forum" Davos, 2021 United States Capitol attack, A Declaration of the Independence of Cyberspace, active measures, Affordable Care Act / Obamacare, air gap, Airbnb, algorithmic management, augmented reality, autonomous vehicles, Berlin Wall, Bernie Sanders, Big Tech, bike sharing, Black Lives Matter, blockchain, Boris Johnson, Brexit referendum, cable laying ship, call centre, Cambridge Analytica, Cass Sunstein, cloud computing, coronavirus, COVID-19, creative destruction, crisis actor, data is the new oil, data science, decentralized internet, deep learning, deepfake, deglobalization, deindustrialization, Deng Xiaoping, deplatforming, digital nomad, disinformation, don't be evil, Donald Trump, dual-use technology, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, fail fast, fake news, Filter Bubble, Francis Fukuyama: the end of history, geopolitical risk, glass ceiling, global pandemic, global supply chain, Google bus, Google Chrome, GPT-3, green new deal, information security, Internet of things, Jeff Bezos, Jeffrey Epstein, John Markoff, John Perry Barlow, knowledge economy, Larry Ellison, lockdown, Loma Prieta earthquake, low earth orbit, low skilled workers, Lyft, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Mary Meeker, Mikhail Gorbachev, military-industrial complex, Mohammed Bouazizi, move fast and break things, Nate Silver, natural language processing, Network effects, new economy, one-China policy, open economy, OpenAI, Parler "social media", Peter Thiel, QAnon, QR code, race to the bottom, Ralph Nader, RAND corporation, reshoring, ride hailing / ride sharing, Ronald Reagan, Russian election interference, Salesforce, Sam Altman, satellite internet, self-driving car, Sheryl Sandberg, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, smart grid, SoftBank, Solyndra, South China Sea, SpaceX Starlink, Steve Jobs, Steven Levy, Stuxnet, supply-chain attack, Susan Wojcicki, tech worker, techlash, technoutopianism, TikTok, Tim Cook: Apple, trade route, TSMC, Twitter Arab Spring, uber lyft, undersea cable, Unsafe at Any Speed, Valery Gerasimov, vertical integration, Wargames Reagan, Westphalian system, white picket fence, WikiLeaks, Y Combinator, zero-sum game

But the recent explosion in AI applications has been driven by major advances in what’s known as machine learning, which, as the AI expert Pedro Domingos puts it, “automates automation itself.”10 Key to these machine learning advances is “deep learning,” powered by “neural networks.” In essence, these neural networks mimic how our brains function. Take the process of identifying the image of a cat. In the past, an engineer might have meticulously spelled out certain rules: two triangles on top of a circle likely means “cat.” With deep learning, however, you’d set a neural network loose on an immense dataset of millions of images labeled “cat” or “no cat” and allow the algorithm to puzzle out patterns for itself.11 (Neural networks have yet to learn to generate good names for cats, however.
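
A minimal sketch of that "cat or no cat" setup, assuming PyTorch and using random tensors in place of a real labeled image collection: a small convolutional network learns its own features from labeled examples rather than relying on hand-written rules. This illustrates the general recipe, not any production vision system.

```python
# Sketch of the labeled-images approach described above: instead of hand-coded
# rules ("two triangles on top of a circle"), a small convolutional network
# learns its own features. Random tensors stand in for a real labeled dataset.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low-level edge/texture detectors
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level shape detectors
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 1),                   # single logit: cat vs. no cat
)

images = torch.randn(64, 3, 64, 64)               # stand-in for 64x64 RGB photos
labels = torch.randint(0, 2, (64, 1)).float()     # 1 = "cat", 0 = "no cat"

loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    logits = model(images)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The trained network assigns each new image a probability of containing a cat.
probability = torch.sigmoid(model(torch.randn(1, 3, 64, 64)))
print(probability.item())
```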

Because of the COVID-19 pandemic, our conversation took place by Zoom, and Daniel offered a timely illustration. “Today, I took it for granted that the voice I hear over Zoom is your voice, and that the face I see over Zoom is your face,” he said. “Now there’s nifty prototypes that people are using to do deepfakes live.”31 A society disrupted by deepfakes, he suggested, was not far off. Using deep learning, deepfakes mimic visual and speech patterns to create eerily realistic images, audio, and video. The believability of synthetic content has progressed along with advances in neural networks. As recently as 2015, algorithms trying to generate the original face of a man produced results that looked only somewhat more realistic than a painting produced by a talented ten-year-old.

And as disturbing as a world awash in deepfakes would be, that’s just the beginning of what the front-end future holds in store. The Language of Deception For millennia, language has been what sets us apart and makes us human. But that’s changing. We now face security risks stemming from unprecedented advances in “natural language processing”—basically, applying those deep learning neural networks to process or generate human-sounding speech. When you ask, “Hey Siri, what’s the weather today?” or when your wife says “Alexa, play Hamilton” for the 500th time during lockdown, your device’s natural language processing abilities are what enable it to interpret your voice and act on those commands.
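
As a rough illustration of the "interpret the command and act on it" step, here is a toy intent classifier, assuming PyTorch, with invented commands, labels, and sizes. Real assistants such as Siri or Alexa first run speech recognition and then far larger language models; nothing below reflects their actual stacks.

```python
# Toy sketch: a tiny neural intent classifier that maps a transcribed command
# to an action label. Commands, labels, and dimensions are invented.
import torch
import torch.nn as nn

commands = [
    ("what's the weather today", "get_weather"),
    ("play hamilton", "play_music"),
    ("set a timer for ten minutes", "set_timer"),
    ("play some jazz", "play_music"),
]
intents = sorted({label for _, label in commands})
vocab = {w: i + 1 for i, w in enumerate(sorted({w for text, _ in commands for w in text.split()}))}

def encode(text):
    return [vocab.get(w, 0) for w in text.split()]   # 0 = unknown word

embedding = nn.EmbeddingBag(len(vocab) + 1, 32, mode="mean")  # averages word vectors
classifier = nn.Linear(32, len(intents))

# Flatten all commands into one tensor plus offsets marking where each begins.
flat = torch.tensor([tok for text, _ in commands for tok in encode(text)])
offsets = torch.tensor([0] + [len(encode(text)) for text, _ in commands[:-1]]).cumsum(0)
labels = torch.tensor([intents.index(label) for _, label in commands])

params = list(embedding.parameters()) + list(classifier.parameters())
optimizer = torch.optim.Adam(params, lr=0.05)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    logits = classifier(embedding(flat, offsets))
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# A new utterance is mapped to the most probable intent.
query = torch.tensor(encode("what's the weather today"))
pred = classifier(embedding(query, torch.tensor([0]))).argmax()
print(intents[pred.item()])
```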


pages: 665 words: 159,350

Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else by Jordan Ellenberg

Albert Einstein, AlphaGo, Andrew Wiles, autonomous vehicles, British Empire, Brownian motion, Charles Babbage, Claude Shannon: information theory, computer age, coronavirus, COVID-19, deep learning, DeepMind, Donald Knuth, Donald Trump, double entry bookkeeping, East Village, Edmond Halley, Edward Jenner, Elliott wave, Erdős number, facts on the ground, Fellow of the Royal Society, Geoffrey Hinton, germ theory of disease, global pandemic, government statistician, GPT-3, greed is good, Henri Poincaré, index card, index fund, Isaac Newton, Johannes Kepler, John Conway, John Nash: game theory, John Snow's cholera map, Louis Bachelier, machine translation, Mercator projection, Mercator projection distort size, especially Greenland and Africa, Milgram experiment, multi-armed bandit, Nate Silver, OpenAI, Paul Erdős, pets.com, pez dispenser, probability theory / Blaise Pascal / Pierre de Fermat, Ralph Nelson Elliott, random walk, Rubik’s Cube, self-driving car, side hustle, Snapchat, social distancing, social graph, transcontinental railway, urban renewal

We could try to solve the problem by taking in more variables as input (you have to figure the size of the team’s stadium would be relevant, for instance), but in the end, linear strategies only get you so far. That class of strategies is just not big enough to, for instance, tell you which images are cats. For that, you have to venture into the wild world of the nonlinear. DX21 The biggest thing going on right now in machine learning is the technique called deep learning. It powers AlphaGo, the computer that beat Lee Se-dol, it powers Tesla’s fleet of sort-of-self-driving cars, and it powers Google Translate. It is sometimes presented as a kind of oracle, offering superhuman insight automatically and at scale. Another name for the technique, neural networks, makes it sound as if the method is somehow capturing the workings of the human brain itself.

Maybe—hopefully—there is some strategy obtainable by setting the fourteen knobs just right that will assign large values to all the points with an X and small values to all the points with an O, and thereby allow me to make educated guesses about other points in the plane I haven’t yet labeled. And if there is such a strategy, hopefully I can learn it by gradient descent, twiddling each knob a bit and seeing how much that diminishes my strategy’s wrongness about the examples it’s already been given. Find the best small twiddle you can make, make it, repeat. The “deep” in deep learning just means the network has a lot of columns. The number of boxes in each column is called the width, and that number can get pretty big, too, in practice, but “wide learning” just doesn’t have the same terminological zing. Today’s deep networks are more complicated than the ones in these pictures, to be sure.
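
The knob-twiddling procedure can be written out almost literally. The sketch below, in plain Python with made-up points and labels, scores 2-D points with two knobs, measures "wrongness" against the X and O targets, and repeatedly keeps the single small twiddle that reduces it most, a crude stand-in for gradient descent.

```python
# Bare-bones rendering of the "twiddle the knobs" procedure described above.
# Two knobs score a 2-D point; "wrongness" is how far the scores sit from the
# targets (+1 for X-labeled points, -1 for O-labeled points). Each round we try
# nudging every knob up or down a little and keep the best single nudge.
# The points and labels are made up for illustration.

points = [(1.0, 2.0), (2.0, 1.5), (-1.0, -2.0), (-2.0, -0.5)]
targets = [1.0, 1.0, -1.0, -1.0]          # +1 = "X", -1 = "O"
knobs = [0.0, 0.0]                        # the strategy's adjustable settings

def wrongness(k):
    total = 0.0
    for (x, y), t in zip(points, targets):
        score = k[0] * x + k[1] * y       # the strategy: a weighted sum
        total += (score - t) ** 2         # squared distance from the target
    return total

step = 0.05
for _ in range(200):
    best = list(knobs)
    for i in range(len(knobs)):           # try twiddling each knob both ways
        for delta in (step, -step):
            trial = list(knobs)
            trial[i] += delta
            if wrongness(trial) < wrongness(best):
                best = trial
    knobs = best                          # make the best small twiddle, repeat

print(knobs, wrongness(knobs))
```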

See also redistricting approximation, 193, 271–72, 272n, 274–75 archeology, 258–59 architecture, 411 Arctic distortion, 307 Aristotle, 405–6 Arithmetic (Diophantus), 151 arithmetic mean, 215 arithmetic progressions, 234, 238, 249. See also geometric progressions Arkansas Voters First, 409 Arnell Group, 278 Aronofsky, Darren, 276–77 Articles of Confederation, 353n artificial intelligence (AI) car key search analogy, 186–87 and chess-playing computers, 145 and deep learning, 177–86 and Go-playing programs, 141 and gradient descent, 166–68, 169–73, 174–76 and image analysis, 168–73 and parity problem, 204 sexual anxiety touched off by, 204 and strategy assessment, 173–77 See also machine learning; neural networks Ash, Robert, 290 Ashbery, John, 217 associative learning, 222 astronomy, 250 asymmetry, 89, 135, 257, 306, 342–43, 386, 398 athoni, 190–91 atoms (numerical sequences), 265 Aubrey, John, 11–12 audioactive decay, 265 Australasian Journal of Philosophy, 33–34 authalic projection, 308 autocompletion, 263 automata, 252–53, 253n autonomous vehicles, 177–78, 204–5 Awesome Theorem, 308–10 axioms and appeal of geometry, 3 and commutativity, 297 and geometry pedagogy, 12–13, 15, 18, 22, 24–26 and gradient descent, 172 and metaphorical value of geometry, 411 and transitivity of equality, 20 ayahuasca, 2, 3–4 Babbage, Charles, 133, 155, 252–53 babies and geometry, 2 Babson, Roger, 280–81 Babson College, 280 baby examples, 158–59 Bachelier, Louis, 80–82, 88, 90, 279, 324 Bacon, Kevin, 314–15, 334.


pages: 331 words: 47,993

Artificial You: AI and the Future of Your Mind by Susan Schneider

artificial general intelligence, brain emulation, deep learning, Elon Musk, Extropian, heat death of the universe, hive mind, life extension, megastructure, Nick Bostrom, pattern recognition, precautionary principle, radical life extension, Ray Kurzweil, Search for Extraterrestrial Intelligence, silicon-based life, Stephen Hawking, superintelligent machines, technological singularity, TED Talk, The Coming Technological Singularity, theory of mind, traumatic brain injury, Turing machine, Turing test, Whole Earth Review, wikimedia commons

Indeed, androids are already being built to tug at our heartstrings. Can we look beneath the surface and tell whether an AI is truly conscious? You might think that we should just examine the architecture of the Samantha program. But even today, programmers are having difficulties understanding why today’s deep-learning systems do what they do (this has been called the “Black Box Problem”). Imagine trying to make sense of the cognitive architecture of a superintelligence that can rewrite its own code. And even if a map of the cognitive architecture of a superintelligence was laid out in front of us, how would we recognize certain architectural features as being those central to consciousness?

Although many superintelligences would be beyond our grasp, perhaps we can be more confident when speculating on the nature of “early” superintelligences—that is, those that emerge from a civilization that was previously right on the cusp of developing superintelligence. Some of the first superintelligent AIs could have cognitive systems that are modeled after biological brains—the way, for instance, that deep-learning systems are roughly modeled on the brain’s neural networks. So their computational structure might be comprehensible to us, at least in rough outlines. They may even retain goals that biological beings have, such as reproduction and survival. I will turn to this issue of early superintelligence in more detail shortly.9 Dyson sphere But superintelligent AIs, being self-improving, could quickly transition to an unrecognizable form.


pages: 344 words: 94,332

The 100-Year Life: Living and Working in an Age of Longevity by Lynda Gratton, Andrew Scott

"World Economic Forum" Davos, 3D printing, Airbnb, asset light, assortative mating, behavioural economics, carbon footprint, carbon tax, classic study, Clayton Christensen, collapse of Lehman Brothers, creative destruction, crowdsourcing, deep learning, delayed gratification, disruptive innovation, diversification, Downton Abbey, driverless car, Erik Brynjolfsson, falling living standards, financial engineering, financial independence, first square of the chessboard, first square of the chessboard / second half of the chessboard, future of work, gender pay gap, gig economy, Google Glasses, indoor plumbing, information retrieval, intangible asset, Isaac Newton, job satisfaction, longitudinal study, low skilled workers, Lyft, Nelson Mandela, Network effects, New Economic Geography, old age dependency ratio, pattern recognition, pension reform, Peter Thiel, Ray Kurzweil, Richard Florida, Richard Thaler, risk free rate, Second Machine Age, sharing economy, Sheryl Sandberg, side project, Silicon Valley, smart cities, Stanford marshmallow experiment, Stephen Hawking, Steve Jobs, tacit knowledge, The Future of Employment, uber lyft, warehouse robotics, women in the workforce, young professional

Even here, however, some technology experts argue that the advantage of humans over machines will be short-lived. Fast developments in Cloud Robotics and Deep Learning could close the gap between human and machine performance. Developments in Cloud Robotics, where networks of robots have access to each other's learning through the cloud network, could result in learning at an exponential rate – certainly far faster than human learning. In Deep Learning the technology attempts to mimic the way humans reason inductively through association and experience, again potentially by leveraging the experience of every other robot via the cloud.

Index The letter f following an entry indicates a figure 3.0 scenarios here–here, here, here, here 3.5 scenarios here–here, here, here 4.0 scenarios here–here, here–here, here, here 5.0 scenarios here–here, here, here, here, here, here–here Acorns here activities of daily living (ADL) here adolescence here–here, here adult equivalence scales here age cognition and here–here corporations and here explorers and here–here government policy and here independent producers and here life stages and here–here, here–here portfolios and here predictability of here segregation and here–here, here–here, here, here–here age process algorithms here, here ageing process here, here ageism here, here agency here, here, here finance and here–here agriculture here–here Amazon here anxiety here appearance here Apple iPhone here reputation here Archer, Margaret here Artificial Intelligence (AI) here, here, here, here education and here human skills and here medical diagnoses and here–here, here skills and knowledge and here–here Asia here assets here, here see also intangible assets; tangible assets; transformational assets assortative mating here–here, here Astor, Brooke here Autor, David here–here, here Baby Boomers here–here beauty here Becker, Gary: ‘Treatise on the Family’ here, here–here, here behavioural nudges here Benartzi, Shlomo here benefits here–here see also welfare Bennis, Warren here birth rates, decline in here–here, here brain, the here–here, here–here cognition here Braithwaite, Valerie here Brontë, Charlotte: Jane Eyre here Buffett, Warren here–here Calico (California Life Company) here Calment, Jeanne here careers breaks and here changes and here–here dual careers here, here, here cell aging here centenarians here, here–here change here–here catalysts for here–here corporations and here–here, here education and here–here government policy and here–here, here identity and here–here inequalities and here–here mastery and here–here planning and experimentation and here–here rate of here–here Cherlin, Andrew here chess here children here, here–here, here Christensen, Clayton here Cloud Robotics here cohort estimate of life expectancy here, here, here companies here, here–here, here–here Amazon here Apple here–here change and here–here, here creative clusters here–here economies of scale and here–here Facebook here flexibility here–here, here–here reputation and here–here research and here small business ecosystems here–here technology and here–here Twitter here value creation here–here WhatsApp here compression of morbidity here–here computing power here–here, here–here see also Moore’s Law connectivity here–here consumerism here, here consumption complementarities here–here consumption levels here, here continuums here corporations here–here, here–here see also companies creative clusters here–here independent producers and here–here creativity here cross-age friendships here crucible experiences here–here Deep Learning here dementia here depreciation here developing countries life expectancy and here–here, here state pensions and here Dickens, Charles: Old Curiosity Shop, The here diet here Dimson, Elroy here disabilities here discounting here discretionary time here diverse networks here, here–here Doctorow, Corey: Makers, The here Downton Abbey effect, the here–here Doyle, Arthur Conan here driverless cars here, here dual career households here, here, here Dweck, Carol here–here dynamic/diverse networks here, here–here Easterlin’s Paradox here economy, the here–here agriculture and 
here–here gig economy here job creation and here–here leisure industry and here service sector and here sharing economy here, here stability and here education here, here–here, here–here see also mastery experiential learning here–here, here, here human skills and judgement and here ideas and creativity and here institutions here–here learning methods here mental flexibility and agility and here–here multi-stage life and here specialization here–here, here, here technology and here, here, here training here efficacy here, here, here–here elasticity here–here emerging markets life expectancy and here state pensions and here emotional spillover here employers here–here, here employment see also companies; employment changes age and here, here–here, here–here changes and here, here, here–here, here–here city migration and here–here creation here–here demographics and here, here–here diverse networks and here–here elasticity and here–here environmental concerns and here–here, here family structures and here–here, here, here–here, here, here, here–here flexibility and here–here, here, here–here, here–here, here–here, here hollowing out of work here–here, here, here home and here job classification here–here knowledge and skills and here levels here, here matches here–here mobility here multi-stage life and here office-based here paid leave here participation rates here–here, here pay here–here, here psychological contract here satisfaction here–here self-employment here–here specialization and here–here statistics here status and here supply and here–here technology and here, here–here, here–here, here, here, here–here, here unique human skills here–here, here vacancies here–here women and here–here working hours here–here, here working week here–here employment changes here, here, here–here companies and here–here industry sectors and here–here, here entrepreneurship here–here see also independent producers equity release schemes here experiential learning here–here, here, here experimentation here, here–here, here–here explorers here–here, here–here adventurers here age and here–here assets and here crucible experiences and here–here options and here–here searchers here, here exponential discounting here exponential growth here–here Facebook here families here, here, here–here, here children here, here–here, here dual career households here, here, here marriage here–here work and here, here finance here, here–here see also pensions age process algorithms here, here agency and here–here automation and here–here costs here–here efficacy and here–here equity release schemes here flexibility here governments and here–here, here, here–here health and here housing and here–here hyperbolic discounting here–here inheritances here–here investment here, here–here, here–here, here, here old age and here–here pay here–here, here pension replacement rates here–here, here, here–here portfolios here–here psychology and here–here retirement and here–here fitness and health here–here see also health Fleming, Ian here flexibility here, here–here, here, here–here, here–here, here corporations and here–here government policy and here–here working patterns and here flexibility stigma here, here Ford, Henry here Foxconn here Frey, Carl here Friedman, Stewart here–here, here Fries, James here, here Future of Work Consortium here future selves here–here future selves case studies Jane here–here, here–here Jimmy here–here, here galumphing here–here gender here, here see also women inequality here–here, here–here, 
here, here, here specialization of labour here, here–here, here, here, here–here Generation Y here generational attitudes here gerontology here Giddens, Anthony here, here gig economy here–here globalization here Goldin, Claudia here, here Google here governments here, here–here, here inequalities and here–here pensions and here–here rate of change and here–here Gratton, Lynda here Shift, The here growth mindset here–here Groysberg, Boris here Haffenden, Margaret here Hagestad, Gunhild here–here, here Harvard Grant Study here health here, here–here brain, the here–here chronic diseases here–here, here compression of morbidity here–here dementia here diseases of old age here–here finance and here improvements in here–here inequality here, here–here infectious diseases here public health here stress here–here healthy life expectancy here heterogeneity here hollowing out of work here–here, here, here home, work and here household here–here see also home economies of scale and here–here relationships here, here–here, here, here housing here–here imputed rent here, here ownership here HR policies here–here human skills here–here, here, here, here hyperbolic discounting here–here Ibarra, Herminia here identity here–here, here, here–here, here–here see also self-control; self-knowledge improvisation here–here imputed rent here, here income see also welfare distribution here growth and here inequalities here–here, here–here skills and knowledge and here–here income effect here–here independent producers here–here, here–here assets and here case study here–here creative clusters and here–here learning and here–here prototyping here–here reputation and curating and here–here India here–here Individual, the here Industrial Revolution, the here–here, here, here, here inequalities here–here gender and here–here, here–here, here, here, here government policy and here–here health here, here–here income here–here, here–here life expectancy and here–here, here–here, here infant mortality here intangible assets here–here, here–here, here case studies here–here, here–here, here corporations and here–here endowed individual characteristics here, here independent producers and here marriage and here productive assets see productive assets time and here transformational assets see transformational assets transitions and here–here vitality assets see vitality assets International Labour Organization (ILO) here ‘Women and the Future of Work’ here investment here, here–here, here–here, here Japan centenarians here–here life expectancy here, here–here,here–here, here pensions and here population decline and here job classification here–here job creation here–here job satisfaction here–here juvenescence here, here–here, here Kahneman, Daniel here Kegan, Robert here Keynes, John Maynard: Economic Possibilities of Our Grandchildren here knowledge see skills and knowledge Kurzweil, Ray here labour market see employment Lampedusa, Giuseppe : Leopard, The here law (occupation) here–here leadership here learning methods here leisure class here leisure industry here, here, here–here leisure time here, here, here–here, here–here, here–here Keynes, John Maynard and here life expectancy here–here, here see also long life best practice here, here calculating here–here, here chronic diseases and here–here cohort estimate of here, here, here developing countries and here–here diseases of old age and here–here government plans and here healthy life expectancy here historical here, here, here increase in here–here, here India and 
here–here inequalities in here–here, here–here, here infant mortality and here Japan and here, here–here, here–here, here limit to here–here period life expectancy measure here, here–here public health innovations and here South Korea here US and here–here Western Europe here life stages here–here, here–here age and here–here experiential learning and here explorers and here–here, here–here independent producers and here–here, here–here juvenescence and here, here–here multi-stage model here, here–here, here, here, here–here, here–here, here new stages here, here see also life stages case studies portfolios and here–here, here–here three-stage model here, here–here, here–here, here–here, here–here, here–here, here–here transitions and here life stages case studies diversity and here Jane here–here Jimmy here–here, here lifetime allowances here–here, here, here liminality here Linde, Charlotte here lockstep of action here–here, here London here–here London Business School here long life see also life expectancy as a curse here, here as a gift here, here Luddites, the here machine learning here marriage here–here Marsh, Paul here Marshall, Anthony here mastery here–here matching here–here Millenials here Mirvas, Philip here Modigliani, Franco here MOOCs (Massive Open Online Courses) here, here Moore’s Law here–here, here Moravec’s Paradox here, here morbidity here–here compression of here–here Morrissey, Francis here mortality here mortality risk here multiple selves here–here National Commission on Technology, Automation, and Economic Progress here neighbourhoods here neoplasticity here neoteny here, here new experiences here occupations here–here old age dependency ration here–here, here Ondine, curse of here options here, here–here Osborne, Michael here paid leave here Parfit, Derek here participation rates here–here, here peers here–here pension case studies Jack here, here–here, here, here Jane here, here, here–here, here, here, here, here, here–here Jimmy here–here, here, here–here, here, here, here–here, here three-stage life model here–here, here–here, here–here pension replacement rate here–here, here, here–here pensions here, here–here, here see also pension case studies amount required here–here funded schemes here goals and here government policy and here–here investment and here, here occupational pensions here–here Pay As You Go schemes here–here, here, here pension replacement rate here–here, here, here–here reform and here state pensions here–here, here period life expectancy measure here, here–here personal brands here pharmacy (occupation) here planning here plasticity here–here play here–here politics, engagement with here Polyani’s Paradox here–here, here population here–here, here–here portfolios (financial) here–here portfolios (life stage) here–here, here–here switching costs here transitions and here–here posse here–here, here possible selves here, here–here possible selves case studies Jane here–here Jimmy here–here, here Preston, Samuel here production complementarities here, here–here, here productive assets here–here, here case studies here, here, here–here, here, here, here, here–here, here, here marriage and here transitions and here professional social capital here–here prototyping here–here psychology here, here–here see also self-control age process algorithms here, here automation and here–here behavioural nudges here saving and here–here pure relationships here, here pyramid schemes here re-creation and recreation here–here, here–here recruitment here 
reflexive project, the here regenerative community here, here, here Relation P here relationships here–here, here, here children and here–here divorce and here–here, here dual career households here families and here–here, here–here friendships here, here–here household here, here–here, here, here marriage and here–here, here–here matches and here–here multi-generational living here–here, here options and here–here pure relationship here switching roles here, here, here, here–here reputation here–here, here–here, here–here retirees here–here retirement see also pensions age of here, here, here, here, here–here, here consumption levels and here corporations and here, here government policy and here–here stimulation in here, here risk here risk pooling here robotics here, here, here, here see also Artificial Intelligence role models here routine here routine activities here routine-busting here routine tasks here–here Rule of here here Sabbath, the here sabbaticals here–here Save More Tomorrow (SMarT plan) here–here Scharmer, Otto here second half of the chessboard here–here segregation of the ages here–here, here–here, here, here–here self-control here–here, here–here age process algorithms here, here automation and here behavioural nudges here self-employment here–here self-knowledge here–here, here finance and here–here service sector here sexuality here–here Shakespeare, William King Lear here sharing economy here–here, here, here short-termism here–here skills and knowledge here, here–here, here see also human skills earning potential and here professional social capital and here–here technology and here–here valuable here–here Slim, Carlos here smart cities here–here independent producers and here–here social media here, here–here society here spare time here see also leisure time standardized practices here–here Staunton, Mike here strategic bequest motive, the here–here substitution effect here switching here, here, here, here–here tangible assets here–here, here, here, here, here see also housing; pensions case studies here, here, here, here, here, here transitions and here taxation here, here–here Teachers Insurance and Annuity Assurance scheme here technology here, here see also Artificial Intelligence computing power here–here, here–here see also Moore’s Law driverless cars here–here, here education and here, here, here employment and here, here–here, here–here, here, here, here–here, here human skills and here, here innovation and here matching and here relationships and here teenagers here–here, here–here, here, here Thaler, Richard here thick market effects here–here Thomas, R. 
here time here, here–here see also sabbaticals discretionary time here flexibility and here–here, here Industrial Revolution, the here–here, here–here, here intangible assets and here leisure and here, here, here–here, here–here, here–here restructuring here, here spare time here working hours here–here, here, here–here working hours paradox here–here, here working week, the here–here, here time poor here–here trade unions here transformational assets here, here–here, here–here, here, here, here case studies here–here, here–here, here–here, here–here, here, here, here, here crucible experiences and here corporations and here transitions here, here–here, here–here, here corporations and here financing here–here government policy and here, here nature of here–here portfolios and here–here re-creating here recharging here–here tribal rituals here Twitter here Uhlenberg, Peter here–here, here UK, occupational pension schemes and here–here Unilever here universities here US here–here compression of morbidity and here occupational pension schemes and here Valliant, George here value creation here vitality assets here, here–here, here case studies here, here–here, here, here, here–here, here, here, here, here transitions and here–here website here week, the here–here weekend, the here, here weight loss here welfare here–here see also benefits Wharton School of the University of Pennsylvania here–here, here WhatsApp here Wolfran, Hans-Joachim here women see also gender children and here–here relationships and here, here, here work and here–here Women and Love here work see employment working hours here–here, here, here–here working week, the here–here, here Yahoos here–here youthfulness here–here Bloomsbury Information An imprint of Bloomsbury Publishing Plc 50 Bedford Square 1385 Broadway London New York WC1B 3DP NY 10018 UK USA www.bloomsbury.com BLOOMSBURY and the Diana logo are trademarks of Bloomsbury Publishing Plc First published 2016 © Lynda Gratton and Andrew Scott, 2016 Lynda Gratton and Andrew Scott have asserted their right under the Copyright, Designs and Patents Act, 1988, to be identified as Author of this work.


pages: 336 words: 93,672

The Future of the Brain: Essays by the World's Leading Neuroscientists by Gary Marcus, Jeremy Freeman

23andMe, Albert Einstein, backpropagation, bioinformatics, bitcoin, brain emulation, cloud computing, complexity theory, computer age, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, dark matter, data acquisition, data science, deep learning, Drosophila, epigenetics, Geoffrey Hinton, global pandemic, Google Glasses, ITER tokamak, iterative process, language acquisition, linked data, mouse model, optical character recognition, pattern recognition, personalized medicine, phenotype, race to the bottom, Richard Feynman, Ronald Reagan, semantic web, speech recognition, stem cell, Steven Pinker, supply-chain management, synthetic biology, tacit knowledge, traumatic brain injury, Turing machine, twin studies, web application

But in retrospect, PDP networks were too far away from such things; they were right to emphasize the brain’s parallelism but wrong to throw away the computational baby along with the serial bathwater. Today, “neural network” models have become more sophisticated, but only in narrow ways; they have more layers and better learning algorithms, but they still remain too unstructured. The technique known as deep learning offers innovative ideas about how unsupervised systems can form categories for themselves, but it still yields little insight into higher level cognition, like language, planning, and abstract reasoning. If there is no reason to believe that the essence of human cognition is step-by-step sequential computation in a serial computer with a stored program (à la von Neumann’s ubiquitous computer architecture), there is also no reason to dismiss computation itself.

The “Aniston” cells even seem to respond cross-modally, responding to written words as well as to photographs. Hierarchies of feature detectors have now also found practical application, in the modern-day neural networks that I mentioned earlier, in speech recognition and image classification. So-called deep learning, for example, is a successful machine-learning variation on the theme of hierarchical feature detection, using many layers of feature detectors. But just because some of the brain is composed of feature detectors doesn’t mean that all of it is. Some of what the brain does can’t be captured well by feature detection; for example, human beings are glorious generalizers.
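
To illustrate the "many layers of feature detectors" point, here is a schematic sketch assuming PyTorch: a random image passes through three stacked convolutional stages, and the printed shapes show channels multiplying while spatial resolution shrinks. The sizes are arbitrary and the network is untrained; it is an illustration of the architecture, not a model of the brain.

```python
# Schematic illustration: stacking convolutional "feature detector" layers.
# Each stage takes the previous stage's feature maps as input, so later layers
# respond to combinations of earlier features over larger regions of the image.
import torch
import torch.nn as nn

stages = nn.ModuleList([
    nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
    nn.Sequential(nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
    nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
])

activation = torch.randn(1, 3, 64, 64)    # stand-in for one RGB image
for depth, stage in enumerate(stages, start=1):
    activation = stage(activation)
    # Channel count grows while spatial resolution shrinks: more, and more
    # abstract, feature detectors per location at each successive layer.
    print(f"layer {depth}: {tuple(activation.shape)}")
```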

See also computational brain concept: percepts and, 171–72 conceptual clarity: consciousness, 170–71, 175 connections: mapping synaptic, of brain, 94; neuronal, 92 connectivity, 210: algorithms, 119; core theme of, 90–91; human right, 90; in situ sequencing to determine, 59–60; mapping out cell types and, 28–31 connectivity map, 29–31, 182 connectome, 11, 40, 45, 182; circuits of brain, 182–84; defining progress, 200–201; as DNA sequencing problem, 40–49; worms, 265 Connectome (Seung), 12 ConnectomeDB, 13 connectomics, 11, 86; electron microscopy (EM), 45–46 consciousness, 159, 255; behavioral technology, 175; binocular rivalry, 174; brain imaging, 174–75; Burge’s model of perception, 171f; cognitive vs. noncognitive theories of, 165–68; conceptual clarity, 170–71, 175; eventrelated potentials (ERPs), 174–75; global broadcasting, 165, 168, 174; global neuronal workspace, 165f, 166; hard problem of, 162, 269; measurement problem, 161–64, 170–72; nonconceptual representations, 170–72; percepts and concepts, 171–72; precursors to conscious state, 163; transgenic mice and optogenetic switch, 168–70 contrastive method: conscious and unconscious perception, 163; consciousness, 163 copy number variants (CNVs), 236 cortex: grid cell generation, 74; grid cells and grid maps, 71–73; mammalian space circuit, 69–71; of men and mice, 26–27; modular structure of, 43; spatial cell types in entorhinal network, 74–76; teachings from grid cells, 76; understanding the, 67–69 cortical microcircuit: morphoelectric types, 119 cortical oscillations: speech perception and, 144–46 counting, 53, 54, 55, 56 Cre driver lines, 29 Crick, Francis, 172, 269 Crohn’s disease, 234 Dang, Chinh, 3, 25 DARPA (Defense Advanced Research Projects Agency), 195, 199 Darwin, Charles, 191 deCODE, 198 deep brain stimulation (DBS), 195, 227 deep learning, 211 default network, 208 Defense Advanced Research Projects Agency (DARPA), 125. 
See also SyNAPSE project (IBM) deficit-lesion-correlation approach, 139 degenerative diseases, 229 Dehaene, Stanislas, 165, 167, 172, 173, 174 De humani corporis fabrica (On the Fabric of the Human Body) (Vesalius), 3, 4f deletion syndromes, 238 dementia, 256 Dennett, Daniel, 166 Department of Energy, 196 depression, 122, 219, 227, 256 Descartes, René, 258, 269 de Sitter, Willem, 95 developmental language disorder, 154 Diagnostic and Statistical Manual of Mental Disorders, 261 diffusion magnetic resonance imaging (MRI), 5 digital atlases, 15–16 digital atlasing, 5 dimensionality reduction, 82, 102 diversity: human brain, 214 divisive normalization: canonical neural computation, 180 DNA: bar-coding of, 46–48; encoding neural electrical activities into, 61–62; methylation, 189; neuroimaging genomics, 156; polymerase, 57–58, 61, 62f; sequencing, 55–57, 149, 150 Donald, Merlin, 133 Donoghue, John, 217, 219–232 Dostrovsky, John, 69 Dravet syndrome, 242 duck-billed platypus: brain construction, 187; neocortex, 188 Duke University, 115 dynamical systems approach, 82, 83 dyslexia, 11, 139 Eccles, John, 257 echolocating bat, 187 École Polytechnique Fédérale de Lausanne (EPFL), 116, 122 Edinburgh Mouse Atlas Project (EMAP), 9 Eichele, Gregor, 7 Einhäuser, Wolfgang, 174 Einstein, Albert, 95 electrical recorders, 249 electrical recording, 38, 224, 245, 250, 257 electrical stimulation: neurotechnology, 225–26, 228, 260 electrodes: computational steps, 80f, 81; measuring electrical activity, 79, 80f electroencephalograph (EEG), 5, 244, 260 electromagnetic (EM) fields or waves, 248–249 electron microscopy (EM), 45, 60, 257 electrophoresis, 7, 15, 272 electrophysiology, 32, 35, 154 Eliasmith, Chris, 109, 125–36 EM Connectomics, 60 emergence: concept, 92–93 emergent phenomena, 93 encoding schemes, 214 Engert, Florian, 18 entorhinal cortex: grid cells in, of rat brain, 71, 72f; spatial cell types in, 74–76 epigenetic mechanisms: brain and behavior, 189, 190 epigenetics, 189 epilepsy, 194, 219, 230, 236, 240, 242, 266 ethics: human brain simulations, 268–69; whole brain simulation, 123 EurExpress, 9 European Commission, 111, 195 European Community, 94 Evans, Alan, 5, 10, 14 event related potentials (ERPs): consciousness, 172–73 evolution: brain organization, 190–91; epigenetic mechanisms, 189–90; neocortex during course of, 188–89; quest for species differences, 191–92; science dictating process, 191; studying various species, 186–87; understanding history of brain, 192–93; unusual mammals, 187–88 exome: humans, 152, 153–54 EyeWire, 15 Facebook, 103 feature detectors: neocortex, 211–12, 214 feedback pathways: thalamus, 37 Felleman, Daniel, 12 Feynman, Richard, 111 filtering: canonical neural computation, 180 Fisher, Simon E., 137, 149–57 flexible coordination: Spaun model, 132–33 Fluorescent In Situ Sequencing (FISSEQ), 58, 58f FMRP (fragile X mental retardation protein), 240–41 force fields, 180 format: percepts and concepts, 171 Forschungszentrum Jülich, 116 FORTRAN, 44 foundation grants: funding for brain map, 199–200 FOXP2 gene: human and chimpanzee differences, 156; mutations of, 151–52, 155; songbirds, 155–56 fragile X mental retardation protein (FMRP), 240–41 Fragile X syndrome, 240–42 Freeman, Jeremy, 23, 65, 100–107 Freud, Sigmund, 259 Freud’s psychodynamic theory, 206 Fried, Itzhak, 211 functional brain map, 161 functional dissociations, 140 functional localization: concept, 139 functional magnetic resonance imaging (fMRI), 4–5, 244, 260 functional modeling: neural responses, 102 fusiform face 
area, 163; identification of, 163 Fyhn, Marianne, 71 Galen, Claudius, 3 GE, 200 Genbank (public database), 196 genealyzers, 203 gene expression, 6–9, 8f, 54 GenePaint, 9 GENESIS neural simulator, 183 genetic brain, 6–14 genetics: psychiatric patients, 235–37 genome: humans, 149, 152; neuroimaging genomics, 156–57.


pages: 340 words: 90,674

The Perfect Police State: An Undercover Odyssey Into China's Terrifying Surveillance Dystopia of the Future by Geoffrey Cain

airport security, Alan Greenspan, AlphaGo, anti-communist, Bellingcat, Berlin Wall, Black Lives Matter, Citizen Lab, cloud computing, commoditize, computer vision, coronavirus, COVID-19, deep learning, DeepMind, Deng Xiaoping, Edward Snowden, European colonialism, fake news, Geoffrey Hinton, George Floyd, ghettoisation, global supply chain, Kickstarter, land reform, lockdown, mass immigration, military-industrial complex, Nelson Mandela, Panopticon Jeremy Bentham, pattern recognition, phenotype, pirate software, post-truth, purchasing power parity, QR code, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Right to Buy, self-driving car, sharing economy, Silicon Valley, Skype, smart cities, South China Sea, speech recognition, TikTok, Tim Cook: Apple, trade liberalization, trade route, undersea cable, WikiLeaks

I am grateful to a former Google AI developer for assisting with the phrasing of this explanation of deep neural nets. 5. Michael Chui, James Manyika, Mehdi Miremadi, Nicolaus Henke, Rita Chung, Pieter Nel, and Sankalp Malhotra. “Notes from the AI Frontier: Applications and Value of Deep Learning,” McKinsey Global Institute, April 17, 2018, https://www.mckinsey.com/featured-insights/artificial-intelligence/notes-from-the-ai-frontier-applications-and-value-of-deep-learning#. 6. Andrew Ross Sorkin and Steve Lohr, “Microsoft to Buy Skype for $8.5 Billion,” New York Times, May 10, 2011, https://dealbook.nytimes.com/2011/05/10/microsoft-to-buy-skype-for-8-5-billion/. 7. The World Bank, “Individuals Using the Internet (% of Population)—China,” https://data.worldbank.org/indicator/IT.NET.USER.ZS?

Nvidia, “AI Powered Facial Recognition for Computers with SenseTime,” posted by YouTube user NVIDIA on June 6, 2016, https://www.youtube.com/watch?v=wMUmPumXtpw. The video was recorded at Nvidia’s Emerging Companies Summit in 2016. 17. Aaron Tilley, “The New Intel: How Nvidia Went from Powering Video Games to Revolutionizing Artificial Intelligence,” Forbes, November 30, 2016, https://www.forbes.com/sites/aarontilley/2016/11/30/nvidia-deep-learning-ai-intel/?sh=ba1b3777ff1e. 18. Paul Mozur and Don Clark, “China’s Surveillance State Sucks Up Data,” New York Times, November 24, 2020, https://www.nytimes.com/2020/11/22/technology/china-intel-nvidia-xinjiang.html. 19. Nvidia, “AI Powered Facial Recognition.” 20. Nvidia, “AI Powered Facial Recognition.” 21.


pages: 404 words: 92,713

The Art of Statistics: How to Learn From Data by David Spiegelhalter

Abraham Wald, algorithmic bias, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Brexit referendum, Carmen Reinhart, Charles Babbage, complexity theory, computer vision, confounding variable, correlation coefficient, correlation does not imply causation, dark matter, data science, deep learning, DeepMind, Edmond Halley, Estimating the Reproducibility of Psychological Science, government statistician, Gregor Mendel, Hans Rosling, Higgs boson, Kenneth Rogoff, meta-analysis, Nate Silver, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, seminal paper, sparse data, speech recognition, statistical model, sugar pill, systematic bias, TED Talk, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Two Sigma

Measures that lack value for prediction or classification may be identified by data visualization or regression methods and then discarded, or the numbers of features may be reduced by forming composite measures that encapsulate most of the information. Recent developments in extremely complex models, such as those labelled as deep learning, suggest that this initial stage of data reduction may not be necessary and the total raw data can be processed in a single algorithm. Classification and Prediction A bewildering range of alternative methods are now readily available for building classification and prediction algorithms.
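
As an illustration of the data-reduction step described above (not the author's own analysis; the data below are synthetic), principal components can serve as composite measures that encapsulate most of the information carried by many correlated features:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)

    # 200 cases, 10 correlated measurements driven by 2 underlying factors.
    factors = rng.normal(size=(200, 2))
    X = factors @ rng.normal(size=(2, 10)) + 0.1 * rng.normal(size=(200, 10))

    # Replace the 10 raw measures with 2 composite measures (principal
    # components) that retain most of the variation.
    pca = PCA(n_components=2).fit(X)
    composites = pca.transform(X)
    print(pca.explained_variance_ratio_.sum())  # share of information retained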

• Neural networks comprise layers of nodes, each node depending on the previous layer by weights, rather like a series of logistic regressions piled on top of each other. Weights are learned by an optimization procedure, and, rather like random forests, multiple neural networks can be constructed and averaged. Neural networks with many layers have become known as deep-learning models: Google’s Inception image-recognition system is said to have over twenty layers and over 300,000 parameters to estimate. • K-nearest-neighbour classifies according to the majority outcome among close cases in the training set. The results of applying some of these methods to the Titanic data, with tuning parameters chosen using tenfold cross-validation and ROC as an optimization criterion, are shown in Table 6.4.
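
A hedged sketch of the workflow just described, using synthetic data in place of the Titanic passengers: a neural network (a stack of logistic-regression-like layers) and a k-nearest-neighbour classifier are each scored by tenfold cross-validation with ROC area under the curve as the criterion. The model settings are illustrative, not those behind Table 6.4.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for a survival-type classification problem.
    X, y = make_classification(n_samples=500, n_features=8, random_state=0)

    models = {
        "neural network": make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)),
        "k-nearest-neighbour": make_pipeline(
            StandardScaler(),
            KNeighborsClassifier(n_neighbors=10)),
    }

    for name, model in models.items():
        # Tenfold cross-validation, ROC area under the curve as the criterion.
        scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
        print(f"{name}: mean ROC AUC = {scores.mean():.3f}")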

data literacy: the ability to understand the principles behind learning from data, carry out basic data analyses, and critique the quality of claims made on the basis of data. data science: the study and application of techniques for deriving insights from data, including constructing algorithms for prediction. Traditional statistical science forms part of data science, which also includes a strong element of coding and data management. deep learning: a machine-learning technique that extends standard artificial neural network models to many layers representing different levels of abstraction, say going from individual pixels of an image through to recognition of objects. dependent events: when the probability of one event depends on the outcome of another event.


pages: 282 words: 93,783

The Future Is Analog: How to Create a More Human World by David Sax

Alvin Toffler, augmented reality, autonomous vehicles, Bernie Sanders, big-box store, bike sharing, Black Lives Matter, blockchain, bread and circuses, Buckminster Fuller, Cal Newport, call centre, clean water, cognitive load, commoditize, contact tracing, contact tracing app, COVID-19, crowdsourcing, cryptocurrency, data science, David Brooks, deep learning, digital capitalism, Donald Trump, driverless car, Elon Musk, fiat currency, Francis Fukuyama: the end of history, future of work, gentrification, George Floyd, indoor plumbing, informal economy, Jane Jacobs, Jaron Lanier, Jeff Bezos, Kickstarter, knowledge worker, lockdown, Lyft, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Minecraft, New Urbanism, nuclear winter, opioid epidemic / opioid crisis, Peter Thiel, RAND corporation, Ray Kurzweil, remote working, retail therapy, RFID, Richard Florida, ride hailing / ride sharing, Saturday Night Live, Shoshana Zuboff, side hustle, Sidewalk Labs, Silicon Valley, Silicon Valley startup, Skype, smart cities, social distancing, sovereign wealth fund, Steve Jobs, Superbowl ad, supply-chain management, surveillance capitalism, tech worker, technological singularity, technoutopianism, TED Talk, The Death and Life of Great American Cities, TikTok, Uber and Lyft, uber lyft, unemployed young men, urban planning, walkable city, Y2K, zero-sum game

That trust extends to a belief that letting students evolve in a way that encourages them to love learning and to see education as a part of their humanity will invariably lead to what Heinonen calls “deep learning.” Deep learning is what happens when education transcends information retention. It is more about knowing what to do with the knowledge you gain than just memorizing it. This is the same emotional learning that Mary Helen Immordino-Yang wrote about, where students genuinely care about the process of learning. Heinonen told me that deep learning actually makes the skills-based learning of core subjects, like math, science, reading, and writing, more effective. Compared to students in other countries, Finns spend far less time learning these subjects and use far less technology when doing so.


pages: 619 words: 177,548

Power and Progress: Our Thousand-Year Struggle Over Technology and Prosperity by Daron Acemoglu, Simon Johnson

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, 4chan, agricultural Revolution, AI winter, Airbnb, airline deregulation, algorithmic bias, algorithmic management, Alignment Problem, AlphaGo, An Inconvenient Truth, artificial general intelligence, augmented reality, basic income, Bellingcat, Bernie Sanders, Big Tech, Bletchley Park, blue-collar work, British Empire, carbon footprint, carbon tax, carried interest, centre right, Charles Babbage, ChatGPT, Clayton Christensen, clean water, cloud computing, collapse of Lehman Brothers, collective bargaining, computer age, Computer Lib, Computing Machinery and Intelligence, conceptual framework, contact tracing, Corn Laws, Cornelius Vanderbilt, coronavirus, corporate social responsibility, correlation does not imply causation, cotton gin, COVID-19, creative destruction, declining real wages, deep learning, DeepMind, deindustrialization, Demis Hassabis, Deng Xiaoping, deskilling, discovery of the americas, disinformation, Donald Trump, Douglas Engelbart, Douglas Engelbart, Edward Snowden, Elon Musk, en.wikipedia.org, energy transition, Erik Brynjolfsson, European colonialism, everywhere but in the productivity statistics, factory automation, facts on the ground, fake news, Filter Bubble, financial innovation, Ford Model T, Ford paid five dollars a day, fulfillment center, full employment, future of work, gender pay gap, general purpose technology, Geoffrey Hinton, global supply chain, Gordon Gekko, GPT-3, Grace Hopper, Hacker Ethic, Ida Tarbell, illegal immigration, income inequality, indoor plumbing, industrial robot, interchangeable parts, invisible hand, Isaac Newton, Jacques de Vaucanson, James Watt: steam engine, Jaron Lanier, Jeff Bezos, job automation, Johannes Kepler, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph-Marie Jacquard, Kenneth Arrow, Kevin Roose, Kickstarter, knowledge economy, labor-force participation, land reform, land tenure, Les Trente Glorieuses, low skilled workers, low-wage service sector, M-Pesa, manufacturing employment, Marc Andreessen, Mark Zuckerberg, megacity, mobile money, Mother of all demos, move fast and break things, natural language processing, Neolithic agricultural revolution, Norbert Wiener, NSO Group, offshore financial centre, OpenAI, PageRank, Panopticon Jeremy Bentham, paperclip maximiser, pattern recognition, Paul Graham, Peter Thiel, Productivity paradox, profit maximization, profit motive, QAnon, Ralph Nader, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Robert Bork, Robert Gordon, Robert Solow, robotic process automation, Ronald Reagan, scientific management, Second Machine Age, self-driving car, seminal paper, shareholder value, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, social intelligence, Social Responsibility of Business Is to Increase Its Profits, social web, South Sea Bubble, speech recognition, spice trade, statistical model, stem cell, Steve Jobs, Steve Wozniak, strikebreaker, subscription business, Suez canal 1869, Suez crisis 1956, supply-chain management, surveillance capitalism, tacit knowledge, tech billionaire, technoutopianism, Ted Nelson, TED Talk, The Future of Employment, The Rise and Fall of American Growth, The Structural Transformation of the Public Sphere, The Wealth of Nations by Adam Smith, theory of mind, Thomas Malthus, too big to fail, total factor productivity, trade route, transatlantic slave trade, trickle-down economics, 
Turing machine, Turing test, Twitter Arab Spring, Two Sigma, Tyler Cowen, Tyler Cowen: Great Stagnation, union organizing, universal basic income, Unsafe at Any Speed, Upton Sinclair, upwardly mobile, W. E. B. Du Bois, War on Poverty, WikiLeaks, wikimedia commons, working poor, working-age population

These lessons about human intelligence and adaptability are often ignored in the AI community, which rushes to automate a range of tasks, regardless of the role of human skill. The triumph of AI in radiology is much trumpeted. In 2016 Geoffrey Hinton, cocreator of modern deep-learning methods, Turing Award winner, and Google scientist, suggested that “people should stop training radiologists now. It’s just completely obvious that within five years deep learning is going to do better than radiologists.” Nothing of the sort has yet happened, and demand for radiologists has increased since 2016, for a very simple reason. Full radiological diagnosis requires even more situational and social intelligence than, for example, customer service, and it is currently beyond the capabilities of machines.

Although talk of intelligent machines has been around for two decades, these technologies started spreading only after 2015. The takeoff is visible in the amount that firms spend on AI-related activities and in the number of job postings for workers with specialized AI skills (including machine learning, machine vision, deep learning, image recognition, natural-language processing, neural networks, support vector machines, and latent semantic analysis). Tracking this indelible footprint, we can see that AI investments and the hiring of AI specialists concentrate in organizations that rely on tasks that can be performed by these technologies, such as actuarial and accounting functions, procurement and purchasing analysis, and various other clerical jobs that involve pattern recognition, computation, and basic speech recognition.

There have also been major advances in data storage, reducing the cost of storing and accessing massive data sets, and improvements in the ability to perform large amounts of computation distributed across many devices, aided by rapid advances in microprocessors and cloud computing. Equally important has been progress in machine learning, especially “deep learning,” by using multilayer statistical models, such as neural networks. In traditional statistical analysis a researcher typically starts with a theory specifying a causal relationship. A hypothesis linking the valuation of the US stock market to interest rates is a simple example of such a causal relationship, and it naturally lends itself to statistical analysis for investigating whether it fits the data and for forecasting future movements.
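
For contrast with the multilayer models just described, here is a minimal sketch of the traditional approach: start from a hypothesized relationship (stock valuation depending on interest rates), fit a simple prespecified model, and use it to forecast. The data are simulated purely for illustration.

    import numpy as np

    rng = np.random.default_rng(2)

    # Simulated data in which valuations tend to fall as interest rates rise.
    rates = rng.uniform(0.0, 8.0, size=120)                   # interest rate, %
    valuation = 30.0 - 2.0 * rates + rng.normal(0, 3, 120)    # a P/E-style measure

    # Fit the prespecified model: valuation = a + b * rate.
    b, a = np.polyfit(rates, valuation, 1)
    print(f"estimated slope b = {b:.2f}, intercept a = {a:.2f}")

    # Forecast under the hypothesis, for a new interest-rate scenario.
    print("forecast at 5% rates:", a + b * 5.0)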


pages: 417 words: 103,458

The Intelligence Trap: Revolutionise Your Thinking and Make Wiser Decisions by David Robson

active measures, Affordable Care Act / Obamacare, Albert Einstein, Alfred Russel Wallace, Atul Gawande, autism spectrum disorder, availability heuristic, behavioural economics, classic study, cognitive bias, corporate governance, correlation coefficient, cuban missile crisis, Daniel Kahneman / Amos Tversky, dark matter, deep learning, deliberate practice, dematerialisation, Donald Trump, Dunning–Kruger effect, fake news, Flynn Effect, framing effect, fundamental attribution error, illegal immigration, Isaac Newton, job satisfaction, knowledge economy, Large Hadron Collider, lone genius, meta-analysis, Nelson Mandela, obamacare, Parler "social media", pattern recognition, post-truth, price anchoring, reality distortion field, Richard Feynman, risk tolerance, Silicon Valley, social intelligence, Steve Jobs, sunk-cost fallacy, tacit knowledge, TED Talk, the scientific method, theory of mind, traveling salesman, ultimatum game, Y2K, Yom Kippur War

Part 3 turns to the science of learning and memory. Despite their brainpower, intelligent people sometimes struggle to learn well, reaching a kind of plateau in their abilities that fails to reflect their potential. Evidence-based wisdom can help to break that vicious cycle, offering three rules for deep learning. Besides helping us to meet our own personal goals, this cutting-edge research also explains why East Asian education systems are already so successful at applying these principles, and the lessons that Western schooling can learn from them to produce better learners and wiser thinkers. Finally, Part 4 expands our focus beyond the individual, to explore the reasons that talented groups act stupidly – from the failings of the England football team to the crises of huge organisations like BP, Nokia and NASA.

But then you say, “here’s a new fact”, and they’ll go, “Oh, well, that changes things; you’re right.” ’ Bock’s comments show us that there is now a movement away from considering SAT scores and the like as the sum total of our intellectual potential. But the old and new ways of appraising the mind do not need to be in opposition, and in Chapter 8 we will explore how some of the world’s best schools already cultivate these qualities and the lessons they can teach us all about the art of deep learning. If you have been inspired by this research, one of the simplest ways to boost anyone’s curiosity is to become more autonomous during learning. This can be as simple as writing out what you already know about the material to be studied and then setting down the questions you really want to answer.

When Leighton finally reached Kyzyl himself, he left a small plaque in memory of Feynman, and his daughter Michelle would make her own visit in the late 2000s. ‘Like [Ferdinand] Magellan, Richard Feynman completed his last journey in our minds and hearts’, Leighton wrote in his memoir. ‘Through his inspiration to others, his dream took on a life of its own.’ 8 The benefits of eating bitter: East Asian education and the three principles of deep learning James Stigler’s heart was racing and his palms were sweaty – and he wasn’t even the one undergoing the ordeal. A graduate student at the University of Michigan, Stigler was on his first research trip to Japan, and he was now observing a fourth-grade lesson in Sendai. The class were learning how to draw three-dimensional cubes, a task that is not as easy as it might sound for many children, and as the teacher surveyed the students’ work, she quickly singled out a boy whose drawings were particularly sloppy and ordered him to transfer his efforts to the blackboard – in front of everyone.


pages: 418 words: 102,597

Being You: A New Science of Consciousness by Anil Seth

AlphaGo, artificial general intelligence, augmented reality, backpropagation, carbon-based life, Claude Shannon: information theory, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, correlation does not imply causation, CRISPR, cryptocurrency, deep learning, deepfake, DeepMind, Drosophila, en.wikipedia.org, Filter Bubble, GPT-3, GPT-4, John Markoff, longitudinal study, Louis Pasteur, mirror neurons, Neil Armstrong, Nick Bostrom, Norbert Wiener, OpenAI, paperclip maximiser, pattern recognition, Paul Graham, Pierre-Simon Laplace, planetary scale, Plato's cave, precautionary principle, Ray Kurzweil, self-driving car, speech recognition, stem cell, systems thinking, technological singularity, TED Talk, telepresence, the scientific method, theory of mind, Thomas Bayes, TikTok, Turing test

The computer scientist David Marr’s classic 1982 computational theory of vision is both a standard reference for the bottom-up view of perception and a practical cookbook for the design and construction of artificial vision systems. More recent machine vision systems implementing artificial neural networks – such as ‘deep learning’ networks – are nowadays achieving impressive performance levels, in some situations comparable to what humans can do. These systems, too, are frequently based on bottom-up theories. With all these points in its favour, the bottom-up ‘how things seem’ view of perception seems to be on pretty solid ground

See www.nytimes.com/2015/09/21/technology/personaltech/software-is-smart-enough-for-sat-but-still-far-from-intelligent.html. vast artificial neural network: GPT stands for ‘Generative Pre-trained Transformer’ – a type of neural network specialised for language prediction and generation. These networks are trained using an unsupervised deep learning approach essentially to ‘predict the next word’ given a previous word or text snippet. GPT-3 has an astonishing 175 billion parameters and was trained on some 45 terabytes of text data. See https://openai.com/blog/openai-api/ and for technical details: https://arxiv.org/abs/2005.14165. it does not understand: Of course this depends on what is meant by ‘understanding’.
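
As a toy illustration of the "predict the next word" objective mentioned in this note (vastly smaller than GPT-3, and using simple counts rather than a neural network), the following sketch turns word-to-next-word transitions in a snippet of text into predictive probabilities.

    from collections import Counter, defaultdict

    text = "the cat sat on the mat and the cat slept on the mat"
    words = text.split()

    # Count how often each word is followed by each possible next word.
    nxt = defaultdict(Counter)
    for w, w_next in zip(words, words[1:]):
        nxt[w][w_next] += 1

    def predict_next(word):
        # Return the most likely next word and its estimated probability.
        counts = nxt[word]
        total = sum(counts.values())
        best, n = counts.most_common(1)[0]
        return best, n / total

    print(predict_next("the"))   # e.g. ('cat', 0.5)
    print(predict_next("on"))    # ('the', 1.0)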

‘Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects’. Nature Neuroscience, 2(1), 79–87. Reep, R. L., Finlay, B. L., & Darlington, R. B. (2007). ‘The limbic system in mammalian brain evolution’. Brain, Behavior and Evolution, 70(1), 57–70. Richards, B. A., Lillicrap, T. P., Beaudoin, P., et al. (2019). ‘A deep learning framework for neuroscience’. Nature Neuroscience, 22(11), 1761–70. Riemer, M., Trojan, J., Beauchamp, M., et al. (2019). ‘The rubber hand universe: On the impact of methodological differences in the rubber hand illusion’. Neuroscience and Biobehavioral Reviews, 104, 268–80. Rosas, F., Mediano, P.


pages: 223 words: 60,909

Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher

"Susan Fowler" uber, Abraham Maslow, Airbnb, airport security, algorithmic bias, AltaVista, big data - Walmart - Pop Tarts, Big Tech, Black Lives Matter, data science, deep learning, Donald Trump, fake news, false flag, Ferguson, Missouri, Firefox, Grace Hopper, Greyball, Hacker News, hockey-stick growth, independent contractor, job automation, Kickstarter, lifelogging, lolcat, Marc Benioff, Mark Zuckerberg, Max Levchin, Menlo Park, meritocracy, microaggression, move fast and break things, natural language processing, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, off-the-grid, pattern recognition, Peter Thiel, real-name policy, recommendation engine, ride hailing / ride sharing, Salesforce, self-driving car, Sheryl Sandberg, Silicon Valley, Silicon Valley startup, Snapchat, Steve Jobs, Tactical Technology Collective, TED Talk, Tim Cook: Apple, Travis Kalanick, upwardly mobile, Wayback Machine, women in the workforce, work culture , zero-sum game

Sorelle Friedler, phone interview with the author, January 30, 2017. 9. “Why Google ‘Thought’ This Black Woman Was a Gorilla,” Note to Self, WNYC, September 28, 2015, http://www.wnyc.org/story/deep-problem-deep-learning. 10. Jacky Alciné, email to the author, January 27, 2017. 11. Google Photos, [product tour screens], accessed January 28, 2017, https://photos.google.com. 12. For a walk through the basics, see the free online book by Michael Nielsen: Neural Networks and Deep Learning (Determination Press, 2015), http://neuralnetworksanddeeplearning.com. 13. Daniela Hernandez, “The New Google Photos App Is Disturbingly Good at Data-Mining Your Photos,” Fusion, June 4, 2015, http://fusion.net/story/142326/the-new-google-photos-app-is-disturbingly-good-at-data-mining-your-photos. 14.


Demystifying Smart Cities by Anders Lisdorf

3D printing, artificial general intelligence, autonomous vehicles, backpropagation, behavioural economics, Big Tech, bike sharing, bitcoin, business intelligence, business logic, business process, chief data officer, circular economy, clean tech, clean water, cloud computing, computer vision, Computing Machinery and Intelligence, congestion pricing, continuous integration, crowdsourcing, data is the new oil, data science, deep learning, digital rights, digital twin, distributed ledger, don't be evil, Elon Musk, en.wikipedia.org, facts on the ground, Google Glasses, hydroponic farming, income inequality, information security, Infrastructure as a Service, Internet of things, Large Hadron Collider, Masdar, microservices, Minecraft, OSI model, platform as a service, pneumatic tube, ransomware, RFID, ride hailing / ride sharing, risk tolerance, Salesforce, self-driving car, smart cities, smart meter, software as a service, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, Stuxnet, Thomas Bayes, Turing test, urban sprawl, zero-sum game

The basic premise is that intelligence is something humans can recreate and that we can imbue other entities with this intelligence. There are a number of more or less well-defined subfields of artificial intelligence that are sometimes used interchangeably with the term, such as machine learning, deep learning, data mining, and neural networks. In practical terms, they all build applications with computer code that implements particular algorithms. An algorithm is a set of instructions or rules that will provide the solution to a problem. Clearly, not all algorithms are AI. A recipe qualifies as an algorithm but hardly as an intelligent one.
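
To underline the point that an algorithm need not be intelligent, here is a classic non-AI algorithm, Euclid's method for the greatest common divisor: a fixed set of rules that solves a problem with no learning involved.

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
        while b:
            a, b = b, a % b
        return a

    print(gcd(48, 36))  # 12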

If the output does not match the expected output, the weights are adjusted; the more successful they are, the more fixed they become. In the end the system, consisting of multiple layers of “neurons,” adapts so that the input is transformed to elicit the correct output. This is also what is behind the term deep learning, where the number of layers is increased. Contrast this with a decision tree, where each layer of the tree gives you clear and comprehensible information about how classification decisions are made. In a neural net, all you have are layers of weights and connections. This is why it is considered a black box.
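
A small sketch of the contrast drawn here, using made-up data: a decision tree can print the rules it has learned, while a trained neural network exposes only arrays of weights and connections.

    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier, export_text
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=300, n_features=4, random_state=0)

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)

    # The tree's layers are readable decision rules...
    print(export_text(tree, feature_names=[f"x{i}" for i in range(4)]))

    # ...while the network's layers are just weight matrices (the "black box").
    print([w.shape for w in net.coefs_])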


pages: 196 words: 61,981

Blockchain Chicken Farm: And Other Stories of Tech in China's Countryside by Xiaowei Wang

4chan, AI winter, Amazon Web Services, artificial general intelligence, autonomous vehicles, back-to-the-land, basic income, Big Tech, bitcoin, blockchain, business cycle, cloud computing, Community Supported Agriculture, computer vision, COVID-19, cryptocurrency, data science, deep learning, Deng Xiaoping, Didi Chuxing, disruptive innovation, Donald Trump, drop ship, emotional labour, Ethereum, ethereum blockchain, Francis Fukuyama: the end of history, Garrett Hardin, gig economy, global pandemic, Great Leap Forward, high-speed rail, Huaqiangbei: the electronics market of Shenzhen, China, hype cycle, income inequality, informal economy, information asymmetry, Internet Archive, Internet of things, job automation, Kaizen: continuous improvement, Kickstarter, knowledge worker, land reform, Marc Andreessen, Mark Zuckerberg, Menlo Park, multilevel marketing, One Laptop per Child (OLPC), Pearl River Delta, peer-to-peer lending, precision agriculture, QR code, ride hailing / ride sharing, risk tolerance, Salesforce, Satoshi Nakamoto, scientific management, self-driving car, Silicon Valley, Snapchat, SoftBank, software is eating the world, surveillance capitalism, TaskRabbit, tech worker, technological solutionism, the long tail, TikTok, Tragedy of the Commons, universal basic income, vertical integration, Vision Fund, WeWork, Y Combinator, zoonotic diseases

Artificial intelligence is a broad category, and that broadness makes it susceptible to slippery usages, to being malleable to any kind of political or economic end. Machine learning is technically a subset of AI. And within artificial intelligence, one of the most exciting areas over the past ten years has been work done on neural networks, which are used in deep learning. These artificial neural networks rely on models of the brain that have been formalized into mathematical operations. Research into such “artificial neurons” began as early as 1943, with a paper by Warren McCulloch and Walter Pitts on a formal model of the neuron; Frank Rosenblatt’s perceptron, an algorithm for binary (yes/no) classification, built on that work and would serve as the foundation of contemporary neural networks.
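
The perceptron itself fits in a few lines. The sketch below (toy data, illustrative only) applies the classic learning rule to a linearly separable yes/no problem.

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy yes/no problem: classify points by which side of a line they fall on.
    X = rng.normal(size=(100, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    w = np.zeros(2)
    b = 0.0
    for _ in range(20):                      # a few passes over the data
        for xi, yi in zip(X, y):
            pred = int(xi @ w + b > 0)       # threshold unit: fire or not
            w += (yi - pred) * xi            # classic perceptron update
            b += (yi - pred)

    accuracy = np.mean((X @ w + b > 0).astype(int) == y)
    print(f"training accuracy: {accuracy:.2f}")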

The seduction of AI is already palpable in China and the United States, across the political spectrum, as people advocate for a fully automated world. The attraction is not simply about rationality and the level of control provided by making systems automated. It’s also about scale: once implemented, certain applications of deep learning, like image recognition, have been shown to be faster and more accurate than humans. It’s no surprise that these qualities make AI the ideal worker. Many of us live in a world where machine learning and forms of artificial intelligence already pervade our everyday lives—recommendation algorithms, fun cosmetic and face filters on Snapchat and Meitu, automated checkouts using image-recognition cameras.


pages: 458 words: 116,832

The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism by Nick Couldry, Ulises A. Mejias

"World Economic Forum" Davos, 23andMe, Airbnb, Amazon Mechanical Turk, Amazon Web Services, behavioural economics, Big Tech, British Empire, call centre, Cambridge Analytica, Cass Sunstein, choice architecture, cloud computing, colonial rule, computer vision, corporate governance, dark matter, data acquisition, data is the new oil, data science, deep learning, different worldview, digital capitalism, digital divide, discovery of the americas, disinformation, diversification, driverless car, Edward Snowden, emotional labour, en.wikipedia.org, European colonialism, Evgeny Morozov, extractivism, fake news, Gabriella Coleman, gamification, gig economy, global supply chain, Google Chrome, Google Earth, hiring and firing, income inequality, independent contractor, information asymmetry, Infrastructure as a Service, intangible asset, Internet of things, Jaron Lanier, job automation, Kevin Kelly, late capitalism, lifelogging, linked data, machine readable, Marc Andreessen, Mark Zuckerberg, means of production, military-industrial complex, move fast and break things, multi-sided market, Naomi Klein, Network effects, new economy, New Urbanism, PageRank, pattern recognition, payday loans, Philip Mirowski, profit maximization, Ray Kurzweil, RFID, Richard Stallman, Richard Thaler, Salesforce, scientific management, Scientific racism, Second Machine Age, sharing economy, Shoshana Zuboff, side hustle, Sidewalk Labs, Silicon Valley, Slavoj Žižek, smart cities, Snapchat, social graph, social intelligence, software studies, sovereign wealth fund, surveillance capitalism, techlash, The Future of Employment, the scientific method, Thomas Davenport, Tim Cook: Apple, trade liberalization, trade route, undersea cable, urban planning, W. E. B. Du Bois, wages for housework, work culture , workplace surveillance

The Intercept, April 30, 2017. https://theintercept.com/2017/04/30/taser-will-use-police-body-camera-videos-to-anticipate-criminal-activity/. Kollewe, Julia. “Marmite Maker Unilever Threatens to Pull Ads from Facebook and Google.” Guardian, February 12, 2018. Krazit, Tom. “Amazon Web Services Backs Deep-Learning Format Introduced by Microsoft and Facebook.” Geekwire, November 16, 2017. https://www.geekwire.com/2017/amazon-web-services-backs-deep-learning-format-introduced-microsoft-facebook/. Krishna, Sankaran. Globalization and Postcolonialism. Lanham, MD: Rowman & Littlefield Publishers, 2008. Kuchler, Hannah. “Facebook Investors Wake Up to Era of Slower Growth.” Financial Times, July 26, 2018.

In terms of human users, these processes and ecologies of data collection are vast, and the resulting concentration of advertising power correspondingly huge: 72 percent of global advertising spending is in the hands of Google and Facebook.47 From this base, huge investments in data analysis techniques become possible, with so-called deep learning increasingly central to the largest IT businesses. From early on, the vision of Google’s founders was “AI complete.” Much later, IBM announced in 2017 that it “is now a cognitive solutions and cloud platform company,” while Microsoft reorganized itself in March 2018 to prioritize its cloud services and AI businesses.48 The retailers Amazon and Walmart are giant data processing operations.


pages: 281 words: 71,242

World Without Mind: The Existential Threat of Big Tech by Franklin Foer

artificial general intelligence, back-to-the-land, Berlin Wall, big data - Walmart - Pop Tarts, Big Tech, big-box store, Buckminster Fuller, citizen journalism, Colonization of Mars, computer age, creative destruction, crowdsourcing, data is the new oil, data science, deep learning, DeepMind, don't be evil, Donald Trump, Double Irish / Dutch Sandwich, Douglas Engelbart, driverless car, Edward Snowden, Electric Kool-Aid Acid Test, Elon Musk, Evgeny Morozov, Fall of the Berlin Wall, Filter Bubble, Geoffrey Hinton, global village, Google Glasses, Haight Ashbury, hive mind, income inequality, intangible asset, Jeff Bezos, job automation, John Markoff, Kevin Kelly, knowledge economy, Law of Accelerating Returns, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, means of production, move fast and break things, new economy, New Journalism, Norbert Wiener, off-the-grid, offshore financial centre, PageRank, Peace of Westphalia, Peter Thiel, planetary scale, Ray Kurzweil, scientific management, self-driving car, Silicon Valley, Singularitarianism, software is eating the world, Steve Jobs, Steven Levy, Stewart Brand, strong AI, supply-chain management, TED Talk, the medium is the message, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas L Friedman, Thorstein Veblen, Upton Sinclair, Vernor Vinge, vertical integration, We are as Gods, Whole Earth Catalog, yellow journalism

The aphorism became widely known only: Josh McHugh, “Google vs. Evil,” Wired, January 2003. “We’re at maybe 1%”: Greg Kumparak, “Larry Page Wants Earth to Have a Mad Scientist Island,” TechCrunch, May 15, 2013. “This is the culmination of literally 50 years”: Robert D. Hof, “Deep Learning,” Technology Review, www.technologyreview.com/s/513696/deep-learning. “The Google policy on a lot of things is to get right up to the creepy line”: Sara Jerome, “Schmidt: Google gets ‘right up to the creepy line’,” The Hill, October 1, 2010. Singularity University: David Rowan, “On the Exponential Curve: Inside Singularity University,” Wired, May 2013.


pages: 592 words: 125,186

The Science of Hate: How Prejudice Becomes Hate and What We Can Do to Stop It by Matthew Williams

3D printing, 4chan, affirmative action, agricultural Revolution, algorithmic bias, Black Lives Matter, Brexit referendum, Cambridge Analytica, citizen journalism, cognitive dissonance, coronavirus, COVID-19, dark matter, data science, deep learning, deindustrialization, desegregation, disinformation, Donald Trump, European colonialism, fake news, Ferguson, Missouri, Filter Bubble, gamification, George Floyd, global pandemic, illegal immigration, immigration reform, impulse control, income inequality, longitudinal study, low skilled workers, Mark Zuckerberg, meta-analysis, microaggression, Milgram experiment, Oklahoma City bombing, OpenAI, Overton Window, power law, selection bias, Snapchat, statistical model, The Turner Diaries, theory of mind, TikTok, twin studies, white flight

While some mainstream news sources were recommended as the top video, sources from the alt-right often dominated the top twenty, especially following events such as terror attacks.8 These videos racked up hundreds of thousands of views by ‘issue hijacking’.‡ Since 2016, Google and YouTube have been altering their algorithms to focus on recommending more authoritative news sources. But the use of new ‘deep learning’ technology that is informed by billions of user behaviours a day means extreme videos will continue to be recommended if they are popular with site visitors.
Filter bubbles and our bias
Research on internet ‘filter bubbles’, often used interchangeably with the term ‘echo chambers’,§ has established that partisan information sources are amplified in online networks of like-minded social media users, where they go largely unchallenged due to ranking algorithms filtering out any challenging posts.9 Data science shows these filter bubbles are resilient accelerators of prejudice, reinforcing and amplifying extreme viewpoints on both sides of the spectrum.

These can be members of the general public, or experts in particular forms of hate (e.g. race, transgender, disability). Posts that get at least three out of four votes for hate are then put into a training dataset. This is our gold standard that trains the machine to mimic the human judgement in the annotation task. Various algorithms are then run across the dataset, including deep learning varieties popular with Google, Facebook, Twitter and Microsoft. But unlike their use of these algorithms, ours are developed in a closed workshop, meaning they can’t be gamed by new data being sent to them from mischievous internet users. Once we determine the algorithm that produces the most accurate results, we deploy it on live social media data streams.
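
A highly simplified sketch of this kind of supervised pipeline, with invented posts and votes and a generic text classifier standing in for the deep-learning models the lab actually compares: majority-voted annotations become the gold-standard labels on which the classifier is trained.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Invented posts with annotator votes (1 = hateful); a post is labelled
    # hateful in the gold standard only if at least 3 of 4 annotators agree.
    posts = ["example hateful post", "ordinary friendly post",
             "another hateful example", "a post about football"]
    votes = [[1, 1, 1, 0], [0, 0, 0, 0], [1, 1, 1, 1], [0, 1, 0, 0]]
    labels = [1 if sum(v) >= 3 else 0 for v in votes]

    # Train a simple text classifier on the gold standard, then score new posts.
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(posts, labels)
    print(model.predict(["yet another friendly post"]))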

., 1 Byrd, James, Jr, 1 California, 1, 2n, 3 Caliskan, Aylin, 1 Cambridge Analytica, 1, 2 cancer, 1, 2 Cardiff University Brain Research Imaging Centre (CUBRIC), 1, 2, 3, 4 caregiving motivational system, 1 care homes, 1, 2 Casablanca, 1 cascade effect, 1, 2 categorisation, 1, 2, 3, 4 Catholics, 1 Caucasian Crew, 1 causality, 1, 2 celebrities, 1, 2, 3, 4 censorship, 1, 2 Centennial Olympic Park, Atlanta, 1 Centers for Disease Control (CDC), 1 change blindness, 1 charity, 1, 2, 3 Charlottesville rally, 1, 2, 3n, 4 chatbots, 1, 2, 3 Chauvin, Derek, 1 Chelmsford, 1 Chicago, 1 childhood: attachment issues, 1; child abuse, 1, 2, 3; child grooming, 1; child play, 1; failures of containment, 1, 2, 3, 4; group threat, 1, 2; intergroup contact, 1, 2; learned stereotypes, 1; online hate speech, 1, 2; predicting hate crime, 1; trauma and containment, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10; trigger events, 1, 2; understanding the ‘average’ hate criminal, 1; understanding the ‘exceptional’ hate offender, 1, 2, 3 China, 1, 2, 3, 4 Chinese people, 1, 2, 3 ‘Chinese virus,’ 1, 2 Cho, John, 1 Christchurch mosque attack, 1 Christianity, 1, 2, 3 cinema, 1 citizen journalism, 1 civilising process, 1 civil rights, 1, 2, 3, 4 class, 1, 2 cleaning, 1 climate change, 1, 2 Clinton, Hillary, 1, 2 cognitive behavioural therapy, 1 cognitive dissonance, 1 Cohen, Florette, 1, 2 Cold War, 1 collective humiliation, 1 collective quests for significance, 1, 2 collective trauma, 1, 2 colonialism, 1n, 2 Combat 1, 2 comedies, 1, 2, 3 Communications Acts, 1, 2 compassion, 1, 2, 3 competition, 1, 2, 3, 4, 5, 6, 7, 8 confirmation bias, 1 conflict, 1, 2, 3, 4 conflict resolution, 1, 2, 3, 4, 5 Connectome, 1 Conroy, Jeffrey, 1 Conservative Party, 1, 2, 3 conspiracy theories, 1, 2, 3 contact with others, 1, 2 containment: failures of, 1; hate as container of unresolved trauma, 1; understanding the ‘exceptional’ hate offender, 1, 2, 3 content moderation, 1, 2, 3 context, 1, 2, 3 Convention of Cybercrime, 1 cooperation, 1, 2, 3, 4, 5, 6 Copeland, David, 1, 2, 3, 4, 5, 6, 7 coping mechanisms, 1, 2, 3, 4, 5, 6, 7 Cordoba House (‘Ground Zero mosque’), 1 correction for multiple comparisons, 1, 2n ‘corrective rape’, 1, 2 cortisol, 1 Council of Conservative Citizens, 1n counter-hate speech, 1, 2, 3, 4 courts, 1, 2, 3, 4, 5, 6 COVID-19 pandemic, 1, 2, 3 Cox, Jo, 1, 2, 3 Criado Perez, Caroline, 1 crime, 1, 2, 3, 4, 5, 6, 7 Crime and Disorder Act 1998, 1n crime recording, 1, 2, 3, 4 crime reporting, 1, 2, 3, 4, 5, 6, 7 Crime Survey for England and Wales (CSEW), 1 criminal justice, 1, 2, 3 Criminal Justice Act, 1, 2n criminal prosecution, 1, 2 criminology, 1, 2, 3, 4, 5, 6 cross-categorisation, 1 cross-race or same-race effect, 1 Crusius, Patrick, 1, 2 CUBRIC (Cardiff University Brain Research Imaging Centre), 1, 2, 3, 4 cultural ‘feeding’, 1, 2, 3, 4, 5 cultural worldviews, 1, 2, 3, 4, 5, 6, 7 culture: definitions, 1; group threat, 1, 2, 3; steps to stop hate, 1, 2, 3; tipping point, 1, 2, 3, 4, 5; unlearning prejudiced threat detection, 1 culture machine, 1, 2, 3, 4, 5 culture wars, 1 Curry and Chips, 1 cybercrime, 1 dACC, see dorsal anterior cingulate cortex Daily Mail, 1, 2 Dailymotion, 1 Daily Stormer, 1, 2n Daley, Tom, 1, 2 Darfur, 1 dark matter, 1 death: events that remind us of our mortality, 1; newspapers, 1; predicting hate crime, 1; religion and hate, 1, 2; subcultures of hate, 1, 2; trigger events, 1, 2 death penalty, 1, 2 death threats, 1 decategorisation, 1 De Dreu, Carsten, 1, 2, 3, 4 deep learning, 1, 2 defence mechanisms, 1 defensive 
haters, 1, 2 dehumanisation, 1, 2, 3, 4, 5, 6 deindividuation, 1, 2 deindustrialisation, 1, 2, 3, 4 Democrats, 1, 2, 3 Denny, Reginald, 1 DeSalvo, Albert (the Boston Strangler), 1 desegregation, 1, 2, 3 Desmond, Matthew, 1 Dewsbury, 1, 2, 3 Diffusion Magnetic Resonance Imaging (Diffusion MRI), 1, 2 diminished responsibility, 1, 2 Director of Public Prosecutions (DPP), 1 disability: brain and hate, 1, 2; group threat, 1, 2, 3, 4, 5, 6; intergroup contact, 1; Japan care home, 1, 2; online hate speech, 1; profiling the hater, 1; suppressing prejudice, 1; victim perception, 1n Discord, 1, 2, 3, 4 discrimination: brain and hate, 1, 2; comedy programmes, 1; Google searches, 1; Japan laws, 1; preference for ingroup, 1; pyramid of hate, 1, 2, 3; questioning prejudgements, 1; trigger events, 1, 2, 3 disgust: brain and hate, 1, 2, 3, 4, 5, 6; group threat detection, 1, 2, 3; ‘gut-deep’ hate, 1, 2; Japan care home, 1; what it means to hate, 1, 2 disinformation, 1, 2, 3 displacement, 1, 2 diversity, 1, 2, 3 dlPFC, see dorsolateral prefrontal cortex domestic violence, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 Doran, John, 1, 2, 3 dorsal anterior cingulate cortex (dACC), 1, 2, 3n, 4, 5, 6, 7, 8, 9 dorsolateral prefrontal cortex (dlPFC), 1n, 2, 3 Douglas, Mary, Purity and Danger, 1 drag queens, 1 drugs, 1, 2, 3, 4, 5, 6, 7, 8, 9 Duggan, Mark, 1 Duke, David, 1 Dumit, Joe, Picturing Personhood, 1 Durkheim, Emile, 1 Dykes, Andrea, 1 Earnest, John T., 1 Eastern Europeans, 1, 2, 3 Ebrahimi, Bijan, 1, 2, 3, 4, 5, 6 echo chambers, 1, 2n economy, 1, 2, 3, 4, 5, 6 EDL, see English Defence League education, 1, 2, 3, 4 Edwards, G., 1 8chan, 1, 2 elections, 1, 2, 3, 4, 5, 6 electroencephalography, 1n elites, 1 ELIZA (computer program), 1 The Ellen Show, 1 El Paso shooting, 1 Elrod, Terry, 1 Emancipation Park, Charlottesville, 1 Emanuel African Methodist Church, Charleston, 1 emotions: brain and hate, 1, 2, 3, 4n, 5, 6, 7, 8, 9; group threat, 1; subcultures of hate, 1; trigger events and mortality, 1; what it means to hate, 1, 2, 3, 4 empathy: brain and hate, 1, 2, 3, 4, 5, 6; feeling hate together, 1; group threat, 1, 2; steps to stop hate, 1, 2, 3; subcultures of hate, 1; trauma and containment, 1 employment, 1, 2, 3, 4, 5, 6, 7 English Defence League (EDL), 1, 2n, 3 epilepsy, 1, 2, 3, 4, 5 Epstein, Robert, 1 equality, 1, 2 Essex, 1 ethnicity, 1, 2n, 3, 4 ethnic minorities, 1, 2, 3, 4, 5, 6 ethnocentrism, 1 EU, see European Union European Commission, 1, 2 European Digital Services Act, 1 European Parliament, 1, 2 European Social Survey, 1 European Union (EU): Brexit referendum, 1, 2, 3, 4n, 5; Facebook misinformation, 1; group threat, 1, 2; online hate speech, 1, 2, 3; trigger events, 1 Eurovision, 1 evidence-based hate crime, 1 evolution, 1, 2, 3, 4, 5, 6, 7, 8 executive control area: brain and hate, 1, 2, 3, 4, 5, 6, 7, 8; disengaging the amygdala autopilot, 1, 2; extremism, 1; recognising false alarms, 1; trauma and containment, 1; trigger events, 1 exogenous shocks, 1 expert opinion, 1 extreme right, 1, 2, 3, 4, 5 extremism: Charlottesville and redpilling, 1, 2; feeling hate together, 1; online hate speech, 1; perceiving versus proving hate, 1; quest for significance, 1, 2, 3; subcultures of hate, 1, 2, 3, 4, 5, 6, 7; trauma and containment, 1; trigger events, 1, 2, 3 Facebook: algorithms, 1, 2; Charlottesville rally, 1, 2; Christchurch mosque attack, 1; far-right hate, 1, 2, 3, 4, 5; filter bubbles, 1, 2; how much online hate speech, 1, 2; Myanmar genocide, 1; online hate and offline harm, 1, 2, 3; redpilling, 1; stopping 
online hate speech, 1, 2, 3, 4 facial expression, 1, 2, 3, 4 faith, 1, 2 fake accounts, 1, 2; see also bots fake news, 1, 2, 3, 4 false alarms, 1, 2, 3 Farage, Nigel, 1, 2 far left, 1n, 2, 3, 4 Farook, Syed Rizwan, 1 far right: algorithms, 1, 2, 3, 4; brain injury, 1; Charlottesville rally, 1, 2, 3n, 4; COVID-19 pandemic, 1, 2; Facebook, 1, 2, 3, 4, 5; filter bubbles, 1, 2; gateway sites, 1; group threat, 1, 2; red-pilling, 1; rise of, 1; stopping online hate speech, 1; subcultures of hate, 1, 2, 3, 4, 5; terror attacks, 1, 2, 3; tipping point, 1, 2; trauma and containment, 1, 2, 3, 4n; trigger events, 1, 2; YouTube, 1 fathers, 1, 2, 3 FBI, see Federal Bureau of Investigation fear: brain and hate, 1, 2, 3, 4, 5, 6, 7; feeling hate together, 1; group threat, 1, 2, 3, 4, 5; mortality, 1; online hate speech, 1, 2, 3; steps to stop hate, 1, 2; trauma and containment, 1, 2; trigger events, 1, 2, 3 Federal Bureau of Investigation (FBI), 1, 2, 3, 4, 5, 6, 7 Federation of American Immigration Reform, 1 Ferguson, Missouri, 1 Festinger, Leon, 1 fiction, 1 Fields, Ted, 1 50 Cent Army, 1 ‘fight or flight’ response, 1, 2, 3 films, 1, 2 filter bubbles, 1, 2, 3, 4 Finland, 1, 2, 3, 4, 5, 6 Finsbury Park mosque attack, 1, 2, 3 first responders, 1 Fiske, Susan, 1 Five Star Movement, 1 flashbacks, 1 Florida, 1, 2 Floyd, George, 1, 2, 3 Flynt, Larry, 1 fMRI (functional Magnetic Resonance Imaging), 1, 2, 3, 4, 5, 6, 7 football, 1, 2, 3, 4, 5 football hooligans, 1, 2 Forever Welcome, 1 4chan, 1, 2 Fox News, 1, 2 Franklin, Benjamin, 1 Franklin, Joseph Paul, 1, 2, 3, 4, 5, 6, 7, 8 Fransen, Jayda, 1 freedom fighters, 1, 2 freedom of speech, 1, 2, 3, 4, 5, 6 frustration, 1, 2, 3, 4 functional Magnetic Resonance Imaging (fMRI), 1, 2, 3, 4, 5, 6, 7 fundamentalism, 1, 2 fusiform face area, 1 fusion, see identity fusion Gab, 1 Gadd, David, 1, 2n, 3, 4 Gaddafi, Muammar, 1, 2 Gage, Phineas, 1, 2 galvanic skin responses, 1 Gamergate, 1 gateway sites, 1 gay people: author’s experience, 1, 2, 3; brain and hate, 1, 2; Copeland attacks, 1, 2; COVID-19 pandemic, 1; filter bubbles, 1; gay laws, 1; gay marriage, 1, 2, 3; group associations, 1; group threat, 1, 2, 3, 4, 5; hate counts, 1, 2, 3, 4; physical attacks, 1, 2; profiling the hater, 1; Russia, 1, 2, 3, 4, 5; Section 1, 2, 3, 4; steps to stop hate, 1, 2, 3; trigger events, 1, 2; why online hate speech hurts, 1; see also LGBTQ+ people gay rights, 1, 2, 3, 4 gender, 1, 2, 3, 4, 5, 6, 7 Generation Identity, 1 Generation Z, 1, 2 genetics, 1n, 2, 3 genocide, 1, 2, 3, 4, 5, 6 Georgia (country), 1 Georgia, US, 1, 2, 3, 4 Germany, 1, 2, 3, 4, 5, 6, 7 Gilead, Michael, 1 ginger people, 1 girls, and online hate speech, 1 Gladwell, Malcolm, 1 Global Project Against Hate and Extremism, 1 glucocorticoids, 1, 2 God, 1, 2 God’s Will, 1, 2 Goebbels, Joseph, 1 Google, 1, 2, 3, 4, 5, 6, 7, 8 Google+, 1 Google Translate, 1 goth identity, 1, 2, 3, 4 governments, 1, 2, 3, 4, 5, 6 Grant, Oscar, 1 gravitational waves, 1 Great Recession (2007–9), 1 Great Replacement conspiracy theory, 1 Greece, 1, 2 Greenberg, Jeff, 1, 2, 3 Greene, Robert, 1 grey matter, 1 Grillot, Ian, 1, 2 Grodzins, Morton, 1 grooming, 1, 2, 3 ‘Ground Zero mosque’ (Cordoba House), 1 GroupMe, 1 groups: ancient brains in modern world, 1; brain and hate, 1, 2, 3, 4; childhood, 1; feeling hate together, 1; foundations of prejudice, 1; group threat and hate, 1; identity fusion, 1, 2, 3; intergroup hate, 1; pyramid of hate, 1; reasons for hate offending, 1; steps to stop hate, 1, 2; tipping point, 1, 2, 3, 4; warrior psychology, 1, 
2, 3; what it means to hate, 1, 2 group threat, 1; beyond threat, 1; Bijan as the threatening racial other, 1; context and threat, 1; cultural machine, group threat and stereotypes, 1; evolution of group threat detection, 1; human biology and threat, 1; neutralising the perception of threat, 1; overview, 1; society, competition and threat, 1; threat in their own words, 1 guilt, 1, 2, 3, 4 guns, 1, 2 ‘gut-deep’ hate, 1, 2, 3, 4 Haines, Matt, 1 Haka, 1 Halle Berry neuron, 1, 2 harassment, 1, 2, 3, 4, 5 harm of hate, 1, 2, 3, 4, 5, 6, 7 Harris, Brendan, 1 Harris, Lasana, 1 Harris, Lovell, 1, 2, 3, 4 hate: author’s brain and hate, 1; the brain and hate, 1; definitions, 1, 2; feeling hate together, 1; foundations of prejudice and hate, 1, 2, 3; group threat and hate, 1; ‘gut-deep’ hate, 1, 2; hate counts, 1; hate in word and deed, 1; profiling the hater, 1; pyramid of hate, 1; rise of the bots and trolls, 1; seven steps to stop hate, 1; subcultures of hate, 1; tipping point from prejudice to hate, 1; trauma, containment and hate, 1; trigger events and ebb and flow of hate, 1; what it means to hate, 1 hate counts, 1; criminalising hate, 1; how they count, 1; overview, 1; perceiving versus proving hate, 1; police and hate, 1; rising hate count, 1; ‘signal’ hate acts and criminalisation, 1; Sophie Lancaster, 1; warped world of hate, 1 hate crime: author’s experience, 1, 2, 3; brain and hate, 1, 2, 3, 4, 5; definitions, 1; events and hate online, 1; events and hate on the streets, 1, 2; the ‘exceptional’ hate criminal, 1; far-right hate, 1, 2, 3; foundations of prejudice and hate, 1, 2, 3, 4; group threat, 1, 2, 3, 4, 5, 6, 7, 8; hate counts, 1, 2, 3, 4, 5; laws, 1n, 2, 3, 4, 5; number of crimes, 1, 2; online hate speech, 1, 2, 3, 4; predicting hate crime, 1; profiling the hater, 1; steps to stop hate, 1, 2, 3; trauma and containment, 1, 2, 3, 4; trigger events, 1, 2, 3, 4, 5, 6; understanding the ‘average’ hate criminal, 1; understanding the ‘exceptional’ hate offender, 1; what it means to hate, 1, 2, 3 hate groups, 1, 2, 3, 4, 5 hate in word and deed, 1; algorithmic far right, 1; Charlottesville rally, 1, 2, 3n, 4; extreme filter bubbles, 1; game changer for the far right, 1; gateway sites, 1; overview, 1; ‘real life effort post’ and Christchurch, 1; red-pilling, 1 HateLab, 1, 2, 3, 4, 5 hate speech: far-right hate, 1, 2, 3; filter bubbles and bias, 1; harm of, 1; how much online hate speech, 1; Japan laws, 1; pyramid of hate, 1; stopping online hate speech, 1; Tay chatbot, 1; trigger events, 1, 2, 3; why online hate speech hurts, 1 hate studies, 1, 2 ‘hazing’ practices, 1 health, 1, 2, 3, 4 Henderson, Russell, 1 Herbert, Ryan, 1 Hewstone, Miles, 1 Heyer, Heather, 1 Hinduism, 1, 2 hippocampus, 1, 2, 3, 4 history of offender, 1 Hitler, Adolf, 1, 2, 3, 4, 5, 6, 7 HIV/AIDS, 1, 2, 3, 4, 5, 6, 7 hollow mask illusion, 1, 2 Hollywood, 1, 2 Holocaust, 1, 2, 3, 4 Homicide Act, 1n homophobia: author’s experience, 1, 2, 3, 4; brain and hate, 1, 2, 3; evidence-based hate crime, 1; federal law, 1; jokes, 1; online hate speech, 1, 2; Russia, 1, 2; Shepard murder, 1; South Africa, 1; trauma and containment, 1; victim perception of motivation, 1n Homo sapiens, 1 homosexuality: author’s experience, 1; online hate speech, 1; policing, 1; questioning prejudgements, 1; Russia, 1, 2; trauma and containment, 1, 2; see also gay people hooligans, 1, 2 Horace, 1 hormones, 1, 2, 3 hot emotions, 1 hot-sauce study, 1, 2 housing, 1, 2, 3, 4, 5, 6 Huddersfield child grooming, 1 human rights, 1, 2, 3 humiliation, 1, 2, 3, 4, 
5, 6 humour, 1, 2 Hungary, 1 hunter-gatherers, 1n, 2 Hustler, 1 IAT, see Implicit Association Test identity: author’s experience of attack, 1; British identity, 1, 2; Charlottesville rally, 1, 2; children’s ingroups, 1; group threat, 1, 2; online hate speech, 1, 2, 3, 4; steps to stop hate, 1, 2 identity fusion: fusion and hateful murder, 1; fusion and hateful violence, 1; fusion and self-sacrifice in the name of hate, 1; generosity towards the group, 1; tipping point, 1, 2; warrior psychology, 1, 2, 3 ideology, 1, 2, 3, 4 illegal hate speech, 1, 2, 3, 4 illocutionary speech, 1 imaging, see brain imaging immigration: Forever Welcome, 1; group threat, 1, 2, 3, 4, 5, 6, 7; hate counts, 1n, 2; HateLab Brexit study, 1; identity fusion, 1; intergroup contact, 1; negative stereotypes, 1; online hate speech, 1; Purinton, 1, 2; trauma and containment, 1, 2, 3; trigger events, 1, 2n, 3, 4, 5, 6, 7; YouTube algorithms, 1 immortality, 1, 2 Implicit Association Test (IAT), 1, 2, 3, 4, 5, 6, 7, 8, 9 implicit prejudice: author’s brain and hate, 1, 2, 3, 4; brain and hate, 1, 2, 3, 4, 5, 6; online hate speech, 1, 2 India, 1 Indonesia, 1 Infowars, 1, 2 Ingersoll, Karma, 1 ingroup: brain and hate, 1, 2, 3, 4; child play, 1; group threat, 1, 2, 3, 4, 5, 6, 7; HateLab Brexit study, 1; identity fusion, 1, 2; pyramid of hate, 1; reasons for hate offending, 1; trigger events, 1, 2, 3; what it means to hate, 1, 2, 3, 4, 5 Instagram, 1, 2, 3 Institute for Strategic Dialogue, 1 institutional racism, 1 instrumental crimes, 1 insula: brain and signs of prejudice, 1, 2, 3; facial expressions, 1, 2; fusiform face area, 1; hacking the brain to hate, 1; hate and feeling pain, 1; neuroscience of hate, 1n, 2, 3, 4, 5; parts that edge us towards hate, 1; parts that process prejudice, 1; processing of ‘gut-deep’ hate, 1, 2 Integrated Threat Theory (ITT), 1, 2, 3 integration, 1, 2, 3, 4 intergroup contact, 1, 2, 3 Intergroup Contact Theory, 1, 2, 3 intergroup hate, 1, 2, 3, 4 internet: algorithms, 1, 2; chatbots, 1; counterhate speech, 1; COVID-19 pandemic, 1; far-right hate, 1, 2, 3, 4, 5, 6, 7; filter bubbles, 1, 2, 3; Google searches, 1; hate speech harm, 1; how much online hate speech, 1; online news, 1; reasons for hate offending, 1; rise of the bots and trolls, 1; stopping online hate speech, 1; tipping point, 1, 2, 3; training the machine to count hate, 1; why online hate speech hurts, 1 interracial relations, 1, 2, 3, 4 intolerance, 1, 2 Iranian bots, 1 Iraq, 1 Irish Republican Army (IRA), 1 ISIS, 1, 2, 3, 4, 5, 6, 7, 8, 9 Islam: group threat, 1; online hate speech, 1, 2, 3, 4, 5; steps to stop hate, 1, 2, 3; subcultures of hate, 1, 2, 3, 4; trigger events, 1, 2, 3 Islamism: group threat, 1; online hate speech, 1, 2, 3, 4; profiling the hater, 1; subcultures of hate, 1, 2, 3; trigger events, 1, 2, 3 Islamophobia, 1, 2, 3, 4 Israel, 1, 2, 3 Italy, 1, 2 ITT, see Integrated Threat Theory James, Lee, 1, 2, 3, 4, 5, 6 Japan, 1, 2, 3 Jasko, Katarzyna, 1 Jefferson, Thomas, 1 Jenny Lives with Eric and Martin, 1 Jewish people: COVID-19 pandemic, 1, 2; far-right hate, 1, 2, 3, 4, 5; filter bubbles, 1; Google searches, 1, 2; group threat, 1; Nazism, 1, 2; negative stereotypes, 1 2 online hate speech, 1; pyramid of hate, 1; questioning prejudgements, 1; ritual washing, 1; subcultures of hate, 1, 2; trauma and Franklin, 1, 2, 3 jihad, 1, 2, 3, 4, 5 jokes, 1, 2, 3, 4, 5, 6, 7 Jones, Alex, 1 Jones, Terry, 1 Josephson junction, 1 Judaism, 1; see also Jewish people Jude, Frank, Jr, 1, 2, 3, 4, 5 Kansas, 1 Kerry, John, 1 Kik, 1 King, 
Gary, 1 King, Martin Luther, Jr, 1, 2 King, Rodney, 1, 2, 3 King, Ryan, 1 Kirklees, 1, 2 KKK, see Ku Klux Klan Kuchibhotla, Srinivas, 1, 2, 3, 4 Kuchibhotla, Sunayana, 1, 2 Ku Klux Klan (KKK), 1, 2, 3n, 4, 5, 6, 7 Labour Party, 1, 2, 3 Lancaster, Sophie, 1, 2 language, 1, 2, 3, 4, 5, 6, 7 LAPD (Los Angeles Police Department), 1 Lapshyn, Pavlo, 1 Lashkar-e-Taiba, 1 Las Vegas shooting, 1, 2 Latinx people, 1, 2, 3, 4, 5, 6, 7 law: brain and hate, 1, 2, 3; criminalising hate, 1; hate counts, 1, 2, 3; Kansas shooting, 1; limited laws, 1; online hate speech, 1; pyramid of hate, 1 Law Commission, 1 Lawrence, Stephen, 1 learned fears, 1, 2, 3 Leave.EU campaign, 1, 2 Leave voters, 1, 2, 3n Lee, Robert E., 1, 2, 3 left orbitofrontal cortex, 1n, 2n Legewie, Joscha, 1, 2, 3, 4 lesbians, 1, 2 Levin, Jack, 1 LGBTQ+ people, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17; see also gay people LIB, see Linguistic Intergroup Bias test Liberman, Nira, 1 Liberty Park, Salt Lake City, 1, 2 Libya, 1, 2, 3, 4 Light, John, 1 Linguistic Intergroup Bias (LIB) test, 1 Liverpool, 1, 2 Livingstone, Ken, 1, 2 Loja, Angel, 1 London: author’s experience of attack, 1; Copeland nail bombing, 1, 2; Duggan shooting, 1; far-right hate, 1; group threat, 1, 2, 3; online hate speech, 1, 2; Rigby attack, 1; terror attacks, 1, 2, 3, 4, 5, 6 London Bridge attack, 1, 2, 3 London School of Economics, 1 ‘lone wolf’ terrorists, 1, 2, 3, 4 long-term memory, 1, 2, 3, 4 Loomer, Laura, 1 Los Angeles, 1 loss: group threat, 1; subcultures of hate, 1, 2, 3, 4; tipping point, 1; trauma and containment, 1, 2, 3, 4, 5 love, 1, 2 Love Thy Neighbour, 1 Lucero, Marcelo, 1, 2 Luqman, Shehzad, 1 ‘Macbeth effect’, 1 machine learning, 1 Madasani, Alok, 1, 2, 3 Madrid attack, 1, 2 Magnetic Resonance Imaging (MRI): Diffusion MRI, 1, 2; functional MRI, 1, 2, 3, 4, 5, 6, 7 magnetoencephalography (MEG), 1, 2, 3 Maldon, 1 Malik, Tashfeen, 1 Maltby, Robert, 1, 2 Manchester, 1, 2 Manchester Arena attack, 1, 2, 3, 4, 5, 6 marginalisation, 1, 2 Martin, David, 1 Martin, Trayvon, 1, 2 MartinLutherKing.org, 1, 2 martyrdom, 1, 2, 3, 4n masculinity, 1, 2, 3, 4, 5 The Matrix, 1 Matthew Shepard and James Byrd Jr Hate Crimes Prevention Act, 1n, 2n Matz, Sandra, 1 Mauritius, 1 McCain, John, 1 McDade, Tony, 1 McDevitt, Jack, Levin McKinney, Aaron, 1 McMichael, Gregory, 1 McMichael, Travis, 1 media: far-right hate, 1, 2; group threat, 1, 2, 3; steps to stop hate, 1, 2, 3, 4, 5, 6; stereotypes in, 1, 2; subcultures of hate, 1; trigger events, 1 Meechan, Mark, 1 MEG (magnetoencephalography), 1, 2, 3 memory, 1, 2, 3, 4, 5, 6, 7 men, and online hate speech, 1 men’s rights, 1 mental illness, 1, 2, 3, 4, 5, 6 mentalising, 1, 2, 3 meta-analysis, 1 Metropolitan Police, 1 Mexican people, 1, 2, 3, 4 micro-aggressions, 1, 2n, 3, 4, 5, 6 micro-events, 1 Microsemi, 1n Microsoft, 1, 2, 3, 4, 5, 6 micro-targeting, 1, 2 Middle East, 1, 2 migration, 1, 2, 3, 4, 5, 6, 7; see also immigration Milgram, Stanley, 1 military, 1 millennials, 1 Milligan, Spike, 1 Milwaukee, 1, 2, 3 minimal groups, 1 Minneapolis, 1, 2, 3 minority groups: far-right hate, 1, 2; group threat, 1, 2, 3, 4, 5; police reporting, 1; questioning prejudgements, 1; trauma and containment, 1; trigger events, 1, 2 misinformation, 1, 2, 3, 4, 5, 6 mission haters, 1, 2, 3 mobile phones, 1, 2, 3 moderation of content, 1, 2, 3 Moore, Nik, 1 Moore, Thomas, 1 Moores, Manizhah, 1 Moore’s Ford lynching, 1 Moradi, Dr Zargol, 1, 2, 3, 4, 5, 6 Moral Choice Dilemma tasks, 1, 2, 3 moral cleansing, 1, 2, 3 moral dimension, 1, 2, 3, 
4 moral outrage, 1, 2, 3, 4, 5 Moroccan people, 1, 2 mortality, 1, 2, 3 mortality salience, 1, 2, 3, 4, 5 Moscow, 1 mosques, 1, 2, 3, 4, 5, 6, 7 Moss Side Blood, 1 mothers, 1, 2, 3, 4, 5, 6 motivation, 1n, 2, 3, 4, 5, 6 Mphiti, Thato, 1 MRI, see Magnetic Resonance Imaging Muamba, Fabrice, 1 multiculturalism, 1, 2, 3, 4 murder: brain injury, 1, 2; group threat, 1, 2, 3; hate counts, 1; identity fusion and hateful murder, 1; police and hate, 1, 2; profiling the hater, 1; trauma and containment, 1, 2, 3, 4, 5 Murdered for Being Different, 1 music, 1, 2, 3 Muslims: COVID-19 pandemic, 1; far-right hate, 1, 2, 3, 4; Google searches, 1; group threat, 1, 2, 3, 4, 5, 6; negative stereotypes, 1; online hate speech, 1, 2; profiling the hater, 1, 2; Salah effect, 1; subcultures of hate, 1, 2, 3; trigger events, 1, 2, 3, 4, 5; and Trump, 1, 2, 3, 4n, 5, 6n Mvubu, Themba, 1 Myanmar, 1, 2 Myatt, David, 1 Nandi, Dr Alita, 1 National Action, 1 National Consortium for the Study of Terrorism and Responses to Terrorism, 1 national crime victimisation surveys, 1, 2 National Front, 1, 2, 3 nationalism, 1, 2 National Socialist Movement, 1, 2, 3, 4 natural experiments, 1, 2 Nature: Neuroscience, 1 nature vs nurture debate, 1 Nazism, 1, 2, 3, 4, 5, 6, 7, 8 NCVS (National Crime Victimisation Survey), 1, 2 negative stereotypes: brain and hate, 1, 2; feeling hate together, 1, 2; group threat, 1, 2, 3, 4, 5, 6; steps to stop hate, 1, 2, 3, 4, 5; tipping point, 1 Nehlen, Paul, 1 neo-Nazis, 1n, 2, 3, 4, 5, 6 Netherlands, 1, 2 Netzwerkdurchsetzungsgesetz (NetzDG) law, 1 neuroimaging, see brain imaging neurons, 1, 2, 3, 4, 5, 6, 7 neuroscience, 1, 2, 3, 4, 5, 6, 7, 8, 9 Newark, 1, 2 news, 1, 2, 3, 4, 5, 6, 7 newspapers, 1, 2, 3, 4 New York City, 1, 2, 3, 4, 5, 6 New York Police Department (NYPD), 1 New York Times, 1, 2 New Zealand, 1 n-grams, 1 Nimmo, John, 1 9/11 attacks, 1, 2, 3, 4, 5, 6, 7 911 emergency calls, 1 Nogwaza, Noxolo, 1 non-independence error, 1, 2n Al Noor Mosque, Christchurch, 1 Northern Ireland, 1 NWA, 1 NYPD (New York Police Department), 1 Obama, Barack, 1n, 2, 3, 4, 5, 6 Occupy Paedophilia, 1 ODIHR, see Office for Democratic Institutions and Human Rights Ofcom, 1 offence, 1, 2, 3, 4 Office for Democratic Institutions and Human Rights (ODIHR), 1, 2 Office for Security and Counter Terrorism, 1 office workers, 1 offline harm, 1, 2 Oklahoma City, 1 O’Mahoney, Bernard, 1 online hate speech: author’s experience, 1; COVID-19 pandemic, 1; far-right hate, 1, 2, 3, 4, 5; hate speech harm, 1; how much online hate speech, 1; individual’s role, 1; law’s role, 1; social media companies’ role, 1; steps to stop hate, 1; tipping point, 1, 2; training the machine to count hate, 1; trigger events, 1 Ono, Kazuya, 1 optical illusions, 1 Organization for Human Brain Mapping conference, 1 Orlando attack, 1 Orwell, George, Nineteen Eighty-Four, 1 Osborne, Darren, 1 ‘other’, 1, 2, 3, 4, 5, 6 Ottoman Empire, 1 outgroup: author’s brain and hate, 1, 2, 3; brain and hate, 1, 2, 3, 4, 5, 6, 7; child interaction and play, 1, 2; evolution of group threat detection, 1; feeling hate together, 1; group threat, 1, 2, 3, 4, 5, 6; ‘gut-deep’ hate, 1; HateLab Brexit study, 1; human biology and threat, 1; identity fusion, 1; prejudice formation, 1; profiling the hater, 1; push/pull factor, 1; pyramid of hate, 1; society, competition and threat, 1; steps to stop hate, 1, 2; tipping point, 1; trauma and containment, 1, 2, 3, 4, 5; trigger events, 1, 2, 3, 4, 5, 6, 7, 8 outliers, 1 Overton window, 1, 2, 3, 4 oxytocin, 1, 2, 3, 4 Paddock, 
Stephen, 1 Paddy’s Pub, Bali, 1 paedophilia, 1, 2, 3, 4, 5 page rank, 1 pain, 1, 2, 3, 4, 5, 6, 7 Pakistani people, 1, 2, 3, 4, 5 Palestine, 1 pandemics, 1, 2, 3, 4 Papua New Guinea, 1, 2, 3 paranoid schizophrenia, 1, 2 parents: caregiving, 1; subcultures of hate, 1; trauma and containment, 1, 2, 3, 4, 5; trigger events, 1, 2, 3 Paris attack, 1 Parsons Green attack, 1, 2 past experience: the ‘average’ hate criminal, 1; the ‘exceptional’ hate criminal, 1; trauma and containment, 1 perception-based hate crime, 1, 2 perception of threat, 1, 2, 3, 4, 5 perpetrators, 1, 2 personal contact, 1, 2 personality, 1, 2, 3 personality disorder, 1, 2 personal safety, 1, 2 personal significance, 1 perspective taking, 1, 2 PFC, see prefrontal cortex Philadelphia Police Department, 1 Philippines, 1 physical attacks, 1, 2, 3, 4, 5, 6, 7, 8 play, 1 Poland, 1, 2, 3 polarisation, 1, 2, 3, 4, 5 police: brain and hate, 1, 2; Duggan shooting, 1; group threat, 1, 2, 3; and hate, 1; NYPD racial bias, 1; online hate speech, 1, 2, 3, 4; perceiving versus proving hate, 1; police brutality, 1, 2, 3, 4; predicting hate crime, 1; recording crime, 1, 2, 3, 4; reporting crime, 1, 2, 3; rising hate count, 1, 2, 3; ‘signal’ hate acts and criminalisation, 1; steps to stop hate, 1, 2, 3; use of force, 1 Polish migrants, 1 politics: early adulthood, 1; far-right hate, 1, 2; filter bubbles and bias, 1; group threat, 1, 2, 3; online hate speech, 1, 2; seven steps to stop hate, 1, 2, 3, 4; trauma and containment, 1; trigger events, 1, 2, 3, 4, 5; Trump election, 1, 2 populism, 1, 2, 3, 4, 5 pornography, 1 Portugal, 1, 2 positive stereotypes, 1, 2 post-traumatic stress disorder (PTSD), 1, 2, 3, 4, 5 poverty, 1, 2, 3 Poway synagogue shooting, 1 power, 1, 2, 3, 4, 5 power law, 1 predicting the next hate crime, 1 prefrontal cortex (PFC): brain and signs of prejudice, 1; brain injury, 1; disengaging the amygdala autopilot, 1; feeling pain, 1; ‘gut-deep’ hate, 1; prejudice network, 1; psychological brainwashing, 1; recognising false alarms, 1; salience network, 1; trauma and containment, 1; trigger events, 1; unlearning prejudiced threat detection, 1, 2 prehistoric brain, 1, 2 prehistory, 1, 2 prejudgements, 1 prejudice: algorithms, 1; author’s brain and hate, 1, 2, 3, 4, 5, 6, 7; brain and hate, 1, 2, 3, 4, 5, 6, 7; brain and signs of prejudice, 1; cultural machine, 1; far-right hate, 1, 2; filter bubbles and bias, 1; foundations of, 1; Google, 2; group threat, 1, 2, 3, 4, 5, 6, 7, 8, 9; human biology and threat, 1; neuroscience of hate, 1, 2; online hate speech, 1, 2, 3; parts that process prejudice, 1; prejudice network, 1, 2, 3, 4; prepared versus learned amygdala responses, 1; pyramid of hate, 1; releasers, 1, 2; steps to stop hate, 1, 2, 3, 4; tipping point from prejudice to hate, 1; trauma and containment, 1, 2, 3, 4, 5; trigger events, 1, 2, 3, 4, 5, 6, 7, 8; Trump, 1, 2; unconscious bias, 1; unlearning prejudiced threat detection, 1; what it means to hate, 1, 2, 3, 4, 5 prepared fears, 1, 2 Prisoner’s Dilemma, 1 profiling the hater, 1 Proposition 1, 2 ProPublica, 1n, 2 prosecution, 1, 2, 3 Protestants, 1 protons, 1 psychoanalysis, 1 psychological development, 1, 2, 3, 4 psychological profiles, 1 psychological training, 1 psychology, 1, 2, 3, 4 psychosocial criminology, 1, 2 psy-ops (psychological operations), 1 PTSD, see post-traumatic stress disorder Public Order Act, 1 pull factor, 1, 2, 3, 4, 5 Pullin, Rhys, 1n Purinton, Adam, 1, 2, 3, 4, 5, 6, 7 push/pull factor, 1, 2, 3, 4, 5, 6 pyramid of hate, 1, 2 Q …, 1 al-Qaeda, 1, 2 
quality of life, 1 queer people, 1, 2 quest for significance, 1, 2, 3 Quran burning, 1 race: author’s brain and hate, 1, 2, 3, 4; brain and hate, 1, 2, 3, 4, 5, 6, 7; brain and signs of prejudice, 1; far-right hate, 1, 2, 3; Google searches, 1; group threat, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10; hate counts, 1, 2, 3; online hate speech, 1; predicting hate crime, 1; pyramid of hate, 1; race relations, 1, 2, 3; race riots, 1, 2; race war, 1, 2, 3, 4, 5; steps to stop hate, 1, 2, 3; trauma and containment, 1, 2, 3, 4n, 5, 6; trigger events, 1, 2; unconscious bias, 1; unlearning prejudiced threat detection, 1 racism: author’s experience, 1; brain and hate, 1, 2, 3, 4, 5, 6; far-right hate, 1, 2; group threat, 1, 2, 3, 4, 5, 6, 7, 8; Kansas shooting, 1; NYPD racial bias, 1; online hate speech, 1, 2, 3, 4; steps to stop hate, 1n, 2, 3; Tay chatbot, 1; trauma and containment, 1, 2, 3, 4, 5, 6, 7; Trump election, 1; victim perception of motivation, 1n; white flight, 1 radicalisation: far-right hate, 1, 2, 3; group threat, 1; subcultures of hate, 1, 2, 3, 4, 5; trigger events, 1 rallies, 1, 2, 3; see also Charlottesville rally Ramadan, 1, 2 rape, 1, 2, 3, 4, 5 rap music, 1 realistic threats, 1, 2, 3, 4, 5 Rebel Media, 1 rebels, 1 recategorisation, 1 recession, 1, 2, 3, 4, 5 recommendation algorithms, 1, 2 recording crime, 1, 2, 3, 4 red alert, 1 Reddit, 1, 2, 3, 4 red-pilling, 1, 2, 3, 4 refugees, 1, 2, 3, 4, 5 rejection, 1, 2, 3, 4, 5, 6 releasers of prejudice, 1, 2 religion: group threat, 1, 2, 3; homosexuality, 1; online hate speech, 1, 2, 3; predicting hate crime, 1; pyramid of hate, 1; religion versus hate, 1; steps to stop hate, 1, 2; subcultures of hate, 1, 2; trauma and containment, 1n, 2; trigger events, 1, 2, 3, 4, 5; victim perception of motivation, 1n reporting crimes, 1, 2, 3, 4, 5, 6, 7 repression, 1 Republicans, 1, 2, 3, 4, 5 research studies, 1 responsibility, 1, 2, 3 restorative justice, 1 retaliatory haters, 1, 2, 3 Reuters, 1 Rieder, Bernhard, 1 Rigby, Lee, 1 rights: civil rights, 1, 2, 3, 4; gay rights, 1, 2, 3, 4; human rights, 1, 2, 3; men’s rights, 1; tipping point, 1; women’s rights, 1, 2 right wing, 1, 2, 3, 4, 5, 6; see also far right Right-Wing Authoritarianism (RWA) scale, 1 riots, 1, 2, 3, 4 risk, 1, 2, 3 rites of passage, 1, 2 rituals, 1, 2, 3 Robb, Thomas, 1 Robbers Cave Experiment, 1, 2, 3, 4, 5, 6 Robinson, Tommy (Stephen Yaxley-Lennon), 1, 2, 3, 4 Rohingya Muslims, 1, 2 Roof, Dylann, 1, 2 Roussos, Saffi, 1 Rudolph, Eric, 1 Rushin, S,, 1n Russia, 1, 2, 3, 4, 5, 6, 7, 8 Russian Internet Research Agency, 1 RWA (Right-Wing Authoritarianism) scale, 1 Rwanda, 1 sacred value protection, 1, 2, 3, 4, 5, 6, 7, 8 Saddam Hussein, 1 safety, 1, 2 Sagamihara care home, Japan, 1, 2 Salah, Mohamed, 1, 2, 3 salience network, 1, 2 salmon, brain imaging of, 1 Salt Lake City, 1 same-sex marriage, 1, 2 same-sex relations, 1, 2, 3 San Bernardino attack, 1n, 2, 3 Scanlon, Patsy, 1 scans, see brain imaging Scavino, Dan, 1n schizophrenia, 1, 2, 3, 4 school shootings, 1, 2 science, 1, 2, 3 scripture, 1, 2 SDO, see Social Dominance Orientation (SDO) scale Search Engine Manipulation Effect (SEME), 1 search queries, 1, 2, 3, 4 Second World War, 1, 2, 3 Section 1, Local Government Act, 1, 2, 3 seed thoughts, 1 segregation, 1, 2, 3 seizures, 1, 2, 3 selection bias problem, 1n self-defence, 1, 2 self-esteem, 1, 2, 3, 4 self-sacrifice, 1, 2, 3 Senior, Eve, 1 serial killers, 1, 2, 3 7/7 attack, London, 1 seven steps to stop hate, 1; becoming hate incident first responders, 1; bursting our filter bubbles, 
1; contact with others, 1; not allowing divisive events to get the better of us, 1; overview, 1; putting ourselves in the shoes of ‘others’, 1; questioning prejudgements, 1; recognising false alarms, 1 sexism, 1, 2 sexual orientation, 1, 2, 3, 4, 5, 6, 7 sexual violence, 1, 2, 3, 4, 5 sex workers, 1, 2, 3, 4 Shakespeare, William, Macbeth, 1 shame, 1, 2, 3, 4, 5, 6, 7, 8, 9 shared trauma, 1, 2, 3 sharia, 1, 2 Shepard, Matthew, 1, 2 Sherif, Muzafer, 1, 2, 3, 4, 5, 6, 7 shitposting, 1, 2, 3n shootings, 1, 2, 3, 4, 5, 6, 7, 8 ‘signal’ hate acts, 1 significance, 1, 2, 3 Simelane, Eudy, 1 skin colour, 1, 2, 3n, 4, 5, 6, 7 Skitka, Linda, 1, 2 slavery, 1 Slipknot, 1 slurs, 1, 2, 3, 4, 5, 6 Snapchat, 1 social class, 1, 2 social desirability bias, 1, 2 Social Dominance Orientation (SDO) scale, 1 social engineering, 1 socialisation, 1, 2, 3, 4, 5 socialism, 1, 2 social media: chatbots, 1; COVID-19 pandemic, 1; far-right hate, 1, 2, 3, 4; filter bubbles and bias, 1; HateLab Brexit study, 1; online hate speech, 1, 2, 3, 4, 5; online news, 1; pyramid of hate, 1; steps to stop hate, 1, 2, 3; subcultures of hate, 1; trigger events, 1, 2; see also Facebook; Twitter; YouTube Social Perception and Evaluation Lab, 1 Soho, 1 soldiers, 1n, 2, 3 Sorley, Isabella, 1 South Africa, 1 South Carolina, 1 Southern Poverty Law Center, 1n, 2 South Ossetians, 1 Soviet Union, 1, 2 Spain, 1, 2, 3 Spencer, Richard B., 1 Spengler, Andrew, 1, 2, 3, 4 SQUIDs, see superconducting quantum interference devices Stacey, Liam, 1, 2 Stanford University, 1 Star Trek, 1, 2, 3 statistics, 1, 2, 3, 4, 5, 6, 7, 8 statues, 1 Stephan, Cookie, 1, 2 Stephan, Walter, 1, 2 Stephens-Davidowitz, Seth, Everybody Lies, 1 Stereotype Content Model, 1 stereotypes: brain and hate, 1, 2, 3, 4, 5, 6, 7; cultural machine, group threat and stereotypes, 1; definitions, 1; feeling hate together, 1, 2; group threat, 1, 2, 3, 4; homosexuality, 1; NYPD racial bias, 1; steps to stop hate, 1, 2, 3, 4, 5; study of prejudice, 1; tipping point, 1; trigger events, 1 Stoke-on-Trent, 1, 2 Stormfront website, 1, 2, 3 storytelling, 1 stress, 1, 2, 3, 4, 5, 6, 7, 8 striatum, 1, 2, 3n, 4 subcultures, 1, 2, 3, 4, 5 subcultures of hate, 1; collective quests for significance and extreme hate, 1; extremist ideology and compassion, 1; fusion and generosity towards the group, 1; fusion and hateful murder, 1; fusion and hateful violence, 1; fusion and self-sacrifice in the name of hate, 1; quest for significance and extreme hatred, 1; religion/belief, 1; warrior psychology, 1 subhuman, 1, 2 Sue, D.


pages: 254 words: 76,064

Whiplash: How to Survive Our Faster Future by Joi Ito, Jeff Howe

3D printing, air gap, Albert Michelson, AlphaGo, Amazon Web Services, artificial general intelligence, basic income, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, Black Swan, Bletchley Park, blockchain, Burning Man, business logic, buy low sell high, Claude Shannon: information theory, cloud computing, commons-based peer production, Computer Numeric Control, conceptual framework, CRISPR, crowdsourcing, cryptocurrency, data acquisition, deep learning, DeepMind, Demis Hassabis, digital rights, disruptive innovation, Donald Trump, double helix, Edward Snowden, Elon Musk, Ferguson, Missouri, fiat currency, financial innovation, Flash crash, Ford Model T, frictionless, game design, Gerolamo Cardano, informal economy, information security, interchangeable parts, Internet Archive, Internet of things, Isaac Newton, Jeff Bezos, John Harrison: Longitude, Joi Ito, Khan Academy, Kickstarter, Mark Zuckerberg, microbiome, move 37, Nate Silver, Network effects, neurotypical, Oculus Rift, off-the-grid, One Laptop per Child (OLPC), PalmPilot, pattern recognition, peer-to-peer, pirate software, power law, pre–internet, prisoner's dilemma, Productivity paradox, quantum cryptography, race to the bottom, RAND corporation, random walk, Ray Kurzweil, Ronald Coase, Ross Ulbricht, Satoshi Nakamoto, self-driving car, SETI@home, side project, Silicon Valley, Silicon Valley startup, Simon Singh, Singularitarianism, Skype, slashdot, smart contracts, Steve Ballmer, Steve Jobs, Steven Levy, Stewart Brand, Stuxnet, supply-chain management, synthetic biology, technological singularity, technoutopianism, TED Talk, The Nature of the Firm, the scientific method, The Signal and the Noise by Nate Silver, the strength of weak ties, There's no reason for any individual to have a computer in his home - Ken Olsen, Thomas Kuhn: the structure of scientific revolutions, Two Sigma, universal basic income, unpaid internship, uranium enrichment, urban planning, warehouse automation, warehouse robotics, Wayback Machine, WikiLeaks, Yochai Benkler

Held in one of the university’s largest lecture halls, the DeepMind event drew a standing-room-only crowd—students were all but hanging off the walls to hear Hassabis describe how their approach to machine learning had allowed their team to prove wrong the experts who had predicted it would take ten years for a computer to beat a virtuoso like Sedol. The key was a clever combination of deep learning—a kind of pattern recognition, similar to how a human brain (or Google) can recognize a cat or a fire truck after seeing many images—and “learning” so that it could guess statistically what something was likely to be, or in the case of Go, what a human player, considering all of the games of the past, was likely to play in a particular situation.

There are dozens of advances in machine learning and other fields still standing between us and an AGI, but AlphaGo has already realized several of them. It appears to be creative; it appears to be capable of deriving some sort of symbolic logic through a statistical system. It’s hard to overstate the significance of this accomplishment—many people didn’t believe you could get to symbolic reasoning from deep learning. However, while AlphaGo is very smart and very creative, it can only beat you at Go—not at checkers. Its entire universe of expression and vision is a grid of nineteen lines and black and white stones. It will take many more technological breakthroughs before AlphaGo will be interested in going to nightclubs or running for office.
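
The “guess what a human player is likely to play” idea described above is, at its core, a classifier over board positions. The sketch below is a minimal illustration of that idea in PyTorch, not DeepMind’s actual architecture: board positions are flattened into vectors, and a small network is trained on (position, expert move) pairs to output a probability for each of the 19×19 points. The network size and the random training data are placeholders.

# Minimal sketch of a move-prediction ("policy") network, in the spirit of the
# approach described above. This is an illustration, not DeepMind's AlphaGo:
# the data here is random and the network is far smaller than the real thing.
import torch
import torch.nn as nn

BOARD_POINTS = 19 * 19  # a Go board is a 19x19 grid

# A small fully connected network: board position in, move scores out.
policy_net = nn.Sequential(
    nn.Linear(BOARD_POINTS, 256),
    nn.ReLU(),
    nn.Linear(256, BOARD_POINTS),
)
optimizer = torch.optim.Adam(policy_net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder "expert games": random positions paired with random moves.
# In reality these would be millions of positions from human game records.
positions = torch.randn(512, BOARD_POINTS)
expert_moves = torch.randint(0, BOARD_POINTS, (512,))

for step in range(100):
    logits = policy_net(positions)          # one score per board point
    loss = loss_fn(logits, expert_moves)    # how badly we predicted the human move
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, the network gives a statistical guess at the likely next move.
probs = torch.softmax(policy_net(positions[:1]), dim=-1)
print("most likely move index:", probs.argmax(dim=-1).item())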


pages: 240 words: 78,436

Open for Business: Harnessing the Power of Platform Ecosystems by Lauren Turner Claire, Laure Claire Reillier, Benoit Reillier

Airbnb, Amazon Mechanical Turk, Amazon Web Services, augmented reality, autonomous vehicles, barriers to entry, basic income, benefit corporation, Blitzscaling, blockchain, carbon footprint, Chuck Templeton: OpenTable:, cloud computing, collaborative consumption, commoditize, crowdsourcing, data science, deep learning, Diane Coyle, Didi Chuxing, disintermediation, distributed ledger, driverless car, fake news, fulfillment center, future of work, George Akerlof, independent contractor, intangible asset, Internet of things, Jean Tirole, Jeff Bezos, Kickstarter, knowledge worker, Lean Startup, Lyft, Mark Zuckerberg, market design, Metcalfe’s law, minimum viable product, multi-sided market, Network effects, Paradox of Choice, Paul Graham, peer-to-peer lending, performance metric, Peter Thiel, platform as a service, price discrimination, price elasticity of demand, profit motive, ride hailing / ride sharing, Sam Altman, search costs, self-driving car, seminal paper, shareholder value, sharing economy, Silicon Valley, Skype, smart contracts, Snapchat, software as a service, Steve Jobs, Steve Wozniak, TaskRabbit, the long tail, The Market for Lemons, Tim Cook: Apple, transaction costs, two-sided market, Uber and Lyft, uber lyft, universal basic income, Y Combinator

Search neural networks can learn from analysing large amounts of data and build their own rules to match users and producers better and faster. RankBrain, Google’s deep neural network, which helps generate responses to search queries, now handles about 15% of Google’s daily search queries.8 All major platforms, from Facebook to Microsoft, have invested in deep learning, with Amazon even having released deep learning open-source software for search and product recommendations.9 Search technology, increasingly powered by deep neural networks, is evolving fast beyond text and geolocalized data to include voice and images. No doubt the integration of new technology such as messaging bots (automated search) and augmented reality (visual search) will redefine existing platforms’ user search experience.
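
As a rough illustration of how a neural network can “build its own rules” to match users and producers, the sketch below implements a simple two-tower embedding model in PyTorch. This is a generic pattern, not RankBrain or the software Amazon released: users and items are mapped into the same vector space and matches are scored by similarity. The interaction data and dimensions are placeholders.

# Rough sketch of embedding-based matching for search/recommendation.
# Generic two-tower pattern; placeholder data, not any platform's real system.
import torch
import torch.nn as nn

NUM_USERS, NUM_ITEMS, DIM = 1000, 5000, 32

user_emb = nn.Embedding(NUM_USERS, DIM)   # one learned vector per user
item_emb = nn.Embedding(NUM_ITEMS, DIM)   # one learned vector per item/producer
optimizer = torch.optim.Adam(
    list(user_emb.parameters()) + list(item_emb.parameters()), lr=0.01)

# Placeholder interaction data: which user engaged with which item.
users = torch.randint(0, NUM_USERS, (256,))
items = torch.randint(0, NUM_ITEMS, (256,))

for step in range(200):
    # Score observed (user, item) pairs higher than random "negative" items.
    negatives = torch.randint(0, NUM_ITEMS, (256,))
    pos_scores = (user_emb(users) * item_emb(items)).sum(dim=-1)
    neg_scores = (user_emb(users) * item_emb(negatives)).sum(dim=-1)
    loss = -torch.log(torch.sigmoid(pos_scores - neg_scores)).mean()  # ranking loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# To serve a user, rank all items by similarity to that user's vector.
scores = user_emb(torch.tensor([0])) @ item_emb.weight.T
print("top items for user 0:", scores.topk(5).indices.tolist())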


pages: 469 words: 132,438

Taming the Sun: Innovations to Harness Solar Energy and Power the Planet by Varun Sivaram

"World Economic Forum" Davos, accelerated depreciation, addicted to oil, Albert Einstein, An Inconvenient Truth, asset light, asset-backed security, autonomous vehicles, bitcoin, blockchain, carbon footprint, carbon tax, clean tech, collateralized debt obligation, Colonization of Mars, currency risk, decarbonisation, deep learning, demand response, disruptive innovation, distributed generation, diversified portfolio, Donald Trump, electricity market, Elon Musk, energy security, energy transition, financial engineering, financial innovation, fixed income, gigafactory, global supply chain, global village, Google Earth, hive mind, hydrogen economy, index fund, Indoor air pollution, Intergovernmental Panel on Climate Change (IPCC), Internet of things, low interest rates, M-Pesa, market clearing, market design, Masayoshi Son, mass immigration, megacity, Michael Shellenberger, mobile money, Negawatt, ocean acidification, off grid, off-the-grid, oil shock, peer-to-peer lending, performance metric, renewable energy transition, Richard Feynman, ride hailing / ride sharing, rolling blackouts, Ronald Reagan, Silicon Valley, Silicon Valley startup, smart grid, smart meter, SoftBank, Solyndra, sovereign wealth fund, Ted Nordhaus, Tesla Model S, time value of money, undersea cable, vertical integration, wikimedia commons

The first priority is minimizing the amount of expensive reserves needed to accommodate renewable energy unpredictability. Better weather forecasting technology is emerging that can help by enabling grid operators to predict more accurately what solar and wind output will be like hours or days in advance and call in compensating resources accordingly.42 The advent of “deep learning”—the artificial intelligence algorithms that run your Alexa device at home and learn from past experience—could make forecasts of solar production even more accurate and precise.43 In addition, markets should operate on a quicker cadence, reducing from an hour to five minutes or less the time interval between successive decisions on which generators to dispatch and how much to pay them.
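
To make the forecasting idea concrete, here is a minimal sketch of an LSTM-based solar output forecaster in PyTorch. The research cited in the notes combines autoencoders with LSTMs; this stripped-down version only shows the basic shape of such a model, with a synthetic daily cycle standing in for real weather and production measurements.

# Minimal sketch of a deep-learning solar output forecaster (LSTM only).
# Simplified relative to the AutoEncoder+LSTM approach cited in the notes;
# the sine-wave "production data" below is a stand-in for real measurements.
import torch
import torch.nn as nn

class SolarForecaster(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, history):                 # history: (batch, hours, 1)
        out, _ = self.lstm(history)
        return self.head(out[:, -1])            # predict the next hour's output

model = SolarForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic daily cycle: 48 hours of "production" used to predict hour 49.
t = torch.arange(0, 49).float()
series = torch.clamp(torch.sin(2 * 3.14159 * t / 24), min=0)   # zero at night
x = series[:48].reshape(1, 48, 1)
y = series[48:].reshape(1, 1)

for step in range(300):
    loss = nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print("forecast for next hour:", model(x).item(), "actual:", y.item())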

Oxford Institute for Energy Studies, December 2016, https://www.oxfordenergy.org/wpcms/wp-content/uploads/2016/12/EU-energy-policy-4th-time-lucky.pdf. 42.  C. K. Woo et al., “Merit-Order Effects of Renewable Energy and Price Divergence in California’s Day-Ahead and Real-Time Electricity Markets,” Energy Policy 92 (2016): 299–312, doi:10.1016/j.enpol.2016.02.023. 43.  Andre Gensler et al., “Deep Learning for Solar Power Forecasting—An Approach Using AutoEncoder and LSTM Neural Networks,” IEEE International Conference on Systems, Man, and Cybernetics, 2016, http://ieeexplore.ieee.org/document/7844673/. 44.  E. Ela et al., “Wholesale Electricity Market Design with Increasing Levels of Renewable Generation: Incentivizing Flexibility in System Operations,” The Electricity Journal 29, no. 4 (2016): 51–60, doi:10.1016/j.tej.2016.05.001. 45.  

See also Deep decarbonization Decentralization, of power, 119, 219 Decentralized control algorithms, 214–215 Decentralized grids in California, 211 expanding central grid vs. building, 193 in Reforming the Energy Vision program, 208–210 solar power in, 215 systemic innovation to accommodate, 199–200 Deep decarbonization global electricity mix for, 60–63, 62f power sources for, 232–239 U.S. roadmap for, 245 Deep learning, 241 Defense Advanced Research Projects Agency (DARPA), 249, 259, 265 Delos, 29 Demand for electricity in cross-national grids, 202 “duck curve” of, 74–78 forecasted growth of, 60–61 in India, 14 Demand response, 212–216, 284g Demonstration projects, 264–265, 289g Denmark, 200–201, 244 Department of Water and Power (Los Angeles, California), xiii Deregulation, of energy industry, 106–107 Derisking insurance, 105 Desalination, 82, 246, 284g DESERTEC, 205 Developed countries, 64, 122 Developing countries applications of elastic solar materials in, 162 funding/capital for solar projects in, 64, 65, 105, 113, 114, 126 future of solar power in, 4–5 microgrids in, 129 SunEdison YieldCo for, 88 Development banks, 45, 65, 102, 111, 113, 290g Devi, Shaiyra, 119 Diesel-powered microgrids, 122 Direct current (DC), 217, 279g.


pages: 345 words: 84,847

The Runaway Species: How Human Creativity Remakes the World by David Eagleman, Anthony Brandt

active measures, Ada Lovelace, agricultural Revolution, Albert Einstein, Andrew Wiles, Apollo 13, Burning Man, cloud computing, computer age, creative destruction, crowdsourcing, Dava Sobel, deep learning, delayed gratification, Donald Trump, Douglas Hofstadter, en.wikipedia.org, Frank Gehry, Gene Kranz, Google Glasses, Great Leap Forward, haute couture, informal economy, interchangeable parts, Isaac Newton, James Dyson, John Harrison: Longitude, John Markoff, Large Hadron Collider, lone genius, longitudinal study, Menlo Park, microbiome, Netflix Prize, new economy, New Journalism, pets.com, pneumatic tube, QWERTY keyboard, Ray Kurzweil, reversible computing, Richard Feynman, risk tolerance, Scaled Composites, self-driving car, Simon Singh, skeuomorphism, Solyndra, SpaceShipOne, stem cell, Stephen Hawking, Steve Jobs, Stewart Brand, synthetic biology, TED Talk, the scientific method, Watson beat the top human players on Jeopardy!, wikimedia commons, X Prize

When dormant, the bacteria B. pasteurii can survive for decades even in extreme conditions such as the hearts of volcanoes; when active, they secrete calcite, one of concrete’s key ingredients. 5 The hybrid approach between humans and computers is quickly changing, as companies take on superhuman recognition engines (e.g. deep learning algorithms). But note that these new approaches are entirely trained up by previously human-tagged pictures. 6 Julian Franklyn, A Dictionary of Rhyming Slang, 2nd ed. (London: Routledge, 1991). 7 Reprinted by arrangement with the Heirs to the Estate of Martin Luther King Jr. c/o The Writers House as agent for the proprietor New York, NY © 1963 Dr Martin Luther King Jr. © Renewed 1991 Coretta Scott King. 8 Carmel O’Shannessy, “The role of multiple sources in the formation of an innovative auxiliary category in Light Warlpiri, a new Australian mixed language,” Language 89 (2) pp. 328–353. 9 <http://www.whosampled.com/Dr.
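
The footnote’s point, that these recognition engines only work because people have already tagged the training pictures, is the definition of supervised learning. The sketch below is a generic illustration of that loop in PyTorch, with random tensors standing in for photographs and for human-supplied labels; it is not any particular company’s system.

# Generic supervised-learning loop: the "superhuman recognition engine" only
# learns because humans have already tagged the training pictures.
# Random tensors stand in for real images and labels.
import torch
import torch.nn as nn

NUM_CLASSES = 10                      # e.g. "cat", "fire truck", ...
images = torch.randn(256, 3, 32, 32)  # placeholder photos
human_tags = torch.randint(0, NUM_CLASSES, (256,))  # labels supplied by people

classifier = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 128),
    nn.ReLU(),
    nn.Linear(128, NUM_CLASSES),
)
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    logits = classifier(images)
    loss = loss_fn(logits, human_tags)   # error is measured against the human tags
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print("training accuracy:",
      (classifier(images).argmax(dim=1) == human_tags).float().mean().item())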


pages: 297 words: 84,447

The Star Builders: Nuclear Fusion and the Race to Power the Planet by Arthur Turrell

Albert Einstein, Arthur Eddington, autonomous vehicles, Boeing 747, Boris Johnson, carbon tax, coronavirus, COVID-19, data science, decarbonisation, deep learning, Donald Trump, Eddington experiment, energy security, energy transition, Ernest Rutherford, Extinction Rebellion, green new deal, Greta Thunberg, Higgs boson, Intergovernmental Panel on Climate Change (IPCC), ITER tokamak, Jeff Bezos, Kickstarter, Large Hadron Collider, lockdown, New Journalism, nuclear winter, Peter Thiel, planetary scale, precautionary principle, Project Plowshare, Silicon Valley, social distancing, sovereign wealth fund, statistical model, Stephen Hawking, Steve Bannon, TED Talk, The Rise and Fall of American Growth, Tunguska event

Arnoux, “How Fritz Wagner ‘Discovered’ the H-Mode,” Iter Newsline 86 (2009), https://www.iter.org/newsline/86/659; “Thirty Years of H-Mode,” EUROfusion.org (2012), https://www.euro-fusion.org/news/detail/thirty-years-of-h-mode/?. 4. J. Kates-Harbeck, A. Svyatkovskiy, and W. Tang, “Predicting Disruptive Instabilities in Controlled Fusion Plasmas Through Deep Learning,” Nature 568 (2019): 526; G. Kluth et al., “Deep Learning for NLTE Spectral Opacities,” Physics of Plasmas 27 (2020): 052707. 5. T. Boisson, “British Nuclear Fusion Reactor Relaunched for the First Time in 23 Years,” Trust My Science (2020), https://trustmyscience.com/reacteur-fusion-anglais-relance-premiere-fois-depuis-23-ans/. 6.


pages: 282 words: 85,658

Ask Your Developer: How to Harness the Power of Software Developers and Win in the 21st Century by Jeff Lawson

Airbnb, AltaVista, Amazon Web Services, barriers to entry, big data - Walmart - Pop Tarts, Big Tech, big-box store, bitcoin, business process, call centre, Chuck Templeton: OpenTable:, cloud computing, coronavirus, COVID-19, create, read, update, delete, cryptocurrency, data science, David Heinemeier Hansson, deep learning, DevOps, Elon Musk, financial independence, global pandemic, global supply chain, Hacker News, Internet of things, Jeff Bezos, Kanban, Lean Startup, loose coupling, Lyft, Marc Andreessen, Marc Benioff, Mark Zuckerberg, microservices, minimum viable product, Mitch Kapor, move fast and break things, Paul Graham, peer-to-peer, ride hailing / ride sharing, risk tolerance, Ruby on Rails, Salesforce, side project, Silicon Valley, Silicon Valley startup, Skype, social distancing, software as a service, software is eating the world, sorting algorithm, Startup school, Steve Ballmer, Steve Jobs, Telecommunications Act of 1996, Toyota Production System, transaction costs, transfer pricing, two-pizza team, Uber and Lyft, uber lyft, ubercab, web application, Y Combinator

Investors talk about rewarding a founder who fails by funding their next company. There’s a zealotry toward failure that’s baked so deep into the DNA of Silicon Valley that you’d almost imagine highly successful entrepreneurs walking around sulking, with dreams of eventual failure dancing in their heads. But it’s not the failure that’s celebrated, it’s the deep learnings that advance the mission. Failure is merely accepted as a natural consequence of the learning. When people talk about accepting failure, they’re talking about accepting the journey of discovery. Notice above, when I talked about running experiments, it’s not about success or failure, it’s about accelerated learning.

How we as leaders, and the company as a whole, handle these situations makes a big difference to how employees treat mistakes, and whether the company actually gets better and better at these things. Or, as Chee would say, “suck less.” When things go wrong, it’s either a time to blame, or a time to learn. I believe each failure is an opportunity to uncover deep learnings about how the organization operates, and what could strengthen it systematically, and then take action. We, and many other software companies, do this via a ritual called the “blameless postmortem.” The purpose of the blameless postmortem is to dig below the surface of some kind of bad outcome to the true root cause, and address that as an organization.


pages: 357 words: 95,986

Inventing the Future: Postcapitalism and a World Without Work by Nick Srnicek, Alex Williams

3D printing, additive manufacturing, air freight, algorithmic trading, anti-work, antiwork, back-to-the-land, banking crisis, basic income, battle of ideas, blockchain, Boris Johnson, Bretton Woods, business cycle, call centre, capital controls, capitalist realism, carbon footprint, carbon tax, Cass Sunstein, centre right, collective bargaining, crowdsourcing, cryptocurrency, David Graeber, decarbonisation, deep learning, deindustrialization, deskilling, Doha Development Round, Elon Musk, Erik Brynjolfsson, Evgeny Morozov, Ferguson, Missouri, financial independence, food miles, Francis Fukuyama: the end of history, full employment, future of work, gender pay gap, general purpose technology, housing crisis, housing justice, income inequality, industrial robot, informal economy, intermodal, Internet Archive, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Kickstarter, Kiva Systems, late capitalism, liberation theology, Live Aid, low skilled workers, manufacturing employment, market design, Martin Wolf, mass immigration, mass incarceration, means of production, megaproject, minimum wage unemployment, Modern Monetary Theory, Mont Pelerin Society, Murray Bookchin, neoliberal agenda, New Urbanism, Occupy movement, oil shale / tar sands, oil shock, Overton Window, patent troll, pattern recognition, Paul Samuelson, Philip Mirowski, post scarcity, post-Fordism, post-work, postnationalism / post nation state, precariat, precautionary principle, price stability, profit motive, public intellectual, quantitative easing, reshoring, Richard Florida, rising living standards, road to serfdom, Robert Gordon, Ronald Reagan, Second Machine Age, secular stagnation, self-driving car, Slavoj Žižek, social web, stakhanovite, Steve Jobs, surplus humans, synthetic biology, tacit knowledge, technological determinism, the built environment, The Chicago School, The Future of Employment, the long tail, Tyler Cowen, Tyler Cowen: Great Stagnation, universal basic income, wages for housework, warehouse automation, We are all Keynesians now, We are the 99%, women in the workforce, working poor, working-age population

These are tasks that computers are perfectly suited to accomplish once a programmer has created the appropriate software, leading to a drastic reduction in the numbers of routine manual and cognitive jobs over the past four decades.22 The result has been a polarisation of the labour market, since many middle-wage, mid-skilled jobs are routine, and therefore subject to automation.23 Across both North America and Western Europe, the labour market is now characterised by a predominance of workers in low-skilled, low-wage manual and service jobs (for example, fast-food, retail, transport, hospitality and warehouse workers), along with a smaller number of workers in high-skilled, high-wage, non-routine cognitive jobs.24 The most recent wave of automation is poised to change this distribution of the labour market drastically, as it comes to encompass every aspect of the economy: data collection (radio-frequency identification, big data); new kinds of production (the flexible production of robots,25 additive manufacturing,26 automated fast food); services (AI customer assistance, care for the elderly); decision-making (computational models, software agents); financial allocation (algorithmic trading); and especially distribution (the logistics revolution, self-driving cars,27 drone container ships and automated warehouses).28 In every single function of the economy – from production to distribution to management to retail – we see large-scale tendencies towards automation.29 This latest wave of automation is predicated upon algorithmic enhancements (particularly in machine learning and deep learning), rapid developments in robotics and exponential growth in computing power (the source of big data) that are coalescing into a ‘second machine age’ that is transforming the range of tasks that machines can fulfil.30 It is creating an era that is historically unique in a number of ways. New pattern-recognition technologies are rendering both routine and non-routine tasks subject to automation: complex communication technologies are making computers better than humans at certain skilled-knowledge tasks, and advances in robotics are rapidly making technology better at a wide variety of manual-labour tasks.31 For instance, self-driving cars involve the automation of non-routine manual tasks, and non-routine cognitive tasks such as writing news stories or researching legal precedents are now being accomplished by robots.32 The scope of these developments means that everyone from stock analysts to construction workers to chefs to journalists is vulnerable to being replaced by machines.33 Workers who move symbols on a screen are as at risk as those moving goods around a warehouse.

A Critique of Rifkin and Negri’, in In Letters of Blood and Fire (Oakland, CA: PM Press, 2012), p. 78. 46.It should be mentioned that, increasingly, tacit knowledge tasks are being automated through environmental control and machine learning, with more recent innovations eliminating even the need for a controlled environment. Frey and Osborne, Future of Employment, p. 27; Autor, Polanyi’s Paradox; Sarah Yang, ‘New “Deep Learning” Technique Enables Robot Mastery of Skills via Trial and Error’, Phys.org, 21 May 2015, at phys.org. 47.As Marx notes, because of this ‘the field of application for machinery would therefore be entirely different in a communist society from what it is in bourgeois society.’ Marx, Capital, Volume I, p. 515 n. 33. 48.Silvia Federici, ‘Permanent Reproductive Crisis: An Interview’, Mute, 7 March 2013, at metamute.org. 49.For an excellent overview of historical experiences of alternative domestic arrangements, see Dolores Hayden, Grand Domestic Revolution: A History of Feminist Designs for American Homes, Neighbourhoods and Cities (Cambridge: MIT Press, 1996). 50.However, it is important to recognise that, historically, domestic labour-saving devices have tended to place greater demands on household maintenance, rather than allowing more free time.


pages: 351 words: 93,982

Leading From the Emerging Future: From Ego-System to Eco-System Economies by Otto Scharmer, Katrin Kaufer

Affordable Care Act / Obamacare, agricultural Revolution, Albert Einstein, Asian financial crisis, Basel III, behavioural economics, Berlin Wall, Branko Milanovic, cloud computing, collaborative consumption, collapse of Lehman Brothers, colonial rule, Community Supported Agriculture, creative destruction, crowdsourcing, deep learning, dematerialisation, Deng Xiaoping, do what you love, en.wikipedia.org, European colonialism, Fractional reserve banking, Garrett Hardin, Glass-Steagall Act, global supply chain, happiness index / gross national happiness, high net worth, housing crisis, income inequality, income per capita, intentional community, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Johann Wolfgang von Goethe, Joseph Schumpeter, Kickstarter, market bubble, mass immigration, Mikhail Gorbachev, Mohammed Bouazizi, mutually assured destruction, Naomi Klein, new economy, offshore financial centre, Paradox of Choice, peak oil, ride hailing / ride sharing, Ronald Reagan, Silicon Valley, smart grid, Steve Jobs, systems thinking, technology bubble, The Spirit Level, The Wealth of Nations by Adam Smith, Thomas L Friedman, too big to fail, Tragedy of the Commons, vertical integration, Washington Consensus, working poor, Zipcar

What strategies can help us to function as vehicles for shifting the whole? In exploring these questions, we laid out three big ideas. The first is that there are two fundamentally different modes of learning: learning from the past and learning from the emerging future. In order to learn from the emerging future, we have to activate a deep learning cycle that involves not only opening the mind (transcending the cognitive boundaries), but also opening the heart (transcending our relational boundaries) and opening the will (transcending the boundaries of our small will). The U process of learning from the emerging future follows three movements: “Observe, observe,” “Retreat and reflect: allow the inner knowing to emerge,” and “Act in an instant.”

These methodologies combine state-of-the-art organizational learning tools with participatory innovation techniques and blend them with awareness-based leadership practices. Mastery of these blended new leadership technologies, such as presencing, to sense and actualize emerging future possibilities is the methodological backbone of the school. 4. Presencing coaching circles. One of the most important mechanisms for holding the space for deep learning is peer circles that use deep listening–based coaching practices. A coaching circle usually consists of five to seven members and applies a version of the case clinic process that we described at the end of chapter 7. We have found that the power of these peer group circles is simply amazing. They hold the space for individual and shared renewal.


pages: 324 words: 91,653

The Quantum Thief by Hannu Rajaniemi

augmented reality, cognitive dissonance, deep learning, gravity well, haute couture, MITM: man-in-the-middle, music of the spheres, quantum entanglement

They gather bits of your gevulot so they can decrypt your mind.’ ‘Why would they want him? He was nothing special. He could make chocolate. I don’t even like chocolate.’ ‘I think your husband was exactly the kind of person the gogol pirates would be interested in, a specialised mind,’ Isidore says. ‘The Sobornost have an endless appetite for deep learning models, and they are obsessed with human sensory modalities, especially taste and smell.’ He takes care to include Élodie in the conversation’s gevulot. ‘And his chocolate certainly is special. His assistant was kind enough to let me try some when I visited the shop: freshly made, a sliver of that dress that arrived from the factory this morning.

His floor and desk are covered in three-dimensional building sketches, both imaginary and real, dominated by a scale model of the Ares Cathedral. The green creature hides behind it. Smart move, little fellow. It’s a big, bad world out there. Many of his fellow students find studying frustrating. As perfect as exomemory is, it only gives you short-term memories. Deep learning still comes from approximately ten thousand hours of work on any given subject. Isidore does not mind: on a good day, he can get lost in the purity of form for hours, exploring tempmatter models of buildings, feeling each detail under his fingertips. He summons up a text on the Tendai sect and the Daidairi Palace and starts reading, waiting for the contemporary world to fade.


Industry 4.0: The Industrial Internet of Things by Alasdair Gilchrist

3D printing, additive manufacturing, air gap, AlphaGo, Amazon Web Services, augmented reality, autonomous vehicles, barriers to entry, business intelligence, business logic, business process, chief data officer, cloud computing, connected car, cyber-physical system, data science, deep learning, DeepMind, deindustrialization, DevOps, digital twin, fault tolerance, fulfillment center, global value chain, Google Glasses, hiring and firing, industrial robot, inflight wifi, Infrastructure as a Service, Internet of things, inventory management, job automation, low cost airline, low skilled workers, microservices, millennium bug, OSI model, pattern recognition, peer-to-peer, platform as a service, pre–internet, race to the bottom, RFID, Salesforce, Skype, smart cities, smart grid, smart meter, smart transportation, software as a service, stealth mode startup, supply-chain management, The future is already here, trade route, undersea cable, vertical integration, warehouse robotics, web application, WebRTC, Y2K

The problem was that this required the machine to process vast quantities of data and look for patterns, and that data was not always readily available at the time. We have since discovered that the term “neural network” is something of a misnomer for these simple models, as they bear little resemblance to real networks of neurons. This approach is now termed deep learning, and it is well suited to the analysis of large, static data sets. The alternative is the biological neural network, which expands on the neural theme and takes it several steps further. With this AI model, the biological neural network does actually try to mimic the brain’s way of learning, using what is termed sparse distributed representation.

Currently in robotics, we are a long way from the objective; in software, however, machine learning is coming along very well. Presently the state of machine learning and artificial intelligence is defined by the latest innovations. In November 2015, Google launched its machine learning system called TensorFlow. Interest in deep learning continues to gain momentum, especially following Google’s purchase of DeepMind Technologies, which has since been renamed Google DeepMind. In February 2015, DeepMind scientists revealed how a computer had taught itself to play almost 50 video games, by figuring out what to do through deep neural networks and reinforcement learning.
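
A hedged sketch of the idea behind those game-playing results, a network that maps raw game observations to action values and improves from reward alone, is shown below in PyTorch. It is a bare-bones Q-learning loop on a made-up one-step “game”, not DeepMind’s DQN: there is no replay buffer, no target network, and no Atari frames; the hidden reward rule exists purely for illustration.

# Bare-bones sketch of reinforcement learning with a deep Q-network.
# Not DeepMind's DQN: the "game" here is a toy one-step task with a hidden rule.
import torch
import torch.nn as nn

NUM_ACTIONS, OBS_DIM = 4, 16
hidden_rule = torch.randn(OBS_DIM, NUM_ACTIONS)   # the game's secret reward logic

def play(obs, action):
    """Return reward 1.0 if the chosen action is the best one for this observation."""
    best = (obs @ hidden_rule).argmax(dim=-1)
    return (action == best).float()

q_net = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(), nn.Linear(64, NUM_ACTIONS))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

for step in range(2000):
    obs = torch.randn(32, OBS_DIM)                       # a batch of "game screens" (toy)
    q_values = q_net(obs)
    # Epsilon-greedy: mostly take the action the network currently rates highest.
    if torch.rand(1).item() < 0.1:
        actions = torch.randint(0, NUM_ACTIONS, (32,))
    else:
        actions = q_values.argmax(dim=-1)
    rewards = play(obs, actions)                         # the only feedback signal
    # One-step episodes, so the Q-learning target is just the observed reward.
    chosen_q = q_values.gather(1, actions.unsqueeze(1)).squeeze(1)
    loss = nn.functional.mse_loss(chosen_q, rewards)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print("average reward at the end:", rewards.mean().item())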


Likewar: The Weaponization of Social Media by Peter Warren Singer, Emerson T. Brooking

4chan, active measures, Airbnb, augmented reality, barriers to entry, battle of ideas, Bellingcat, Bernie Sanders, Black Lives Matter, British Empire, Cambridge Analytica, Cass Sunstein, citizen journalism, Citizen Lab, Comet Ping Pong, content marketing, crony capitalism, crowdsourcing, data science, deep learning, digital rights, disinformation, disintermediation, Donald Trump, drone strike, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, fake news, false flag, Filter Bubble, global reserve currency, Google Glasses, Hacker Conference 1984, Hacker News, illegal immigration, information security, Internet Archive, Internet of things, invention of movable type, it is difficult to get a man to understand something, when his salary depends on his not understanding it, Jacob Silverman, John Gilmore, John Markoff, Kevin Roose, Kickstarter, lateral thinking, lolcat, Mark Zuckerberg, megacity, Menlo Park, meta-analysis, MITM: man-in-the-middle, Mohammed Bouazizi, Moneyball by Michael Lewis explains big data, moral panic, new economy, offshore financial centre, packet switching, Panopticon Jeremy Bentham, Parag Khanna, pattern recognition, Plato's cave, post-materialism, Potemkin village, power law, pre–internet, profit motive, RAND corporation, reserve currency, sentiment analysis, side project, Silicon Valley, Silicon Valley startup, Snapchat, social web, South China Sea, Steve Bannon, Steve Jobs, Steven Levy, Stewart Brand, systems thinking, too big to fail, trade route, Twitter Arab Spring, UNCLOS, UNCLOS, Upton Sinclair, Valery Gerasimov, We are Anonymous. We are Legion, We are as Gods, Whole Earth Catalog, WikiLeaks, Y Combinator, yellow journalism, Yochai Benkler

In turn, the next layer might discover “circles”; the layer after that, “faces”; the layer after that, “noses.” Each layer allows the network to approach a problem with more and more granularity. But each layer also demands exponentially more neurons and computing power. Neural networks are trained via a process known as “deep learning.” Originally, this process was supervised. A flesh-and-blood human engineer fed the network a mountain of data (10 million images or a library of English literature) and slowly guided the network to find what the engineer was looking for (a “car” or a “compliment”). As the network went to work on its pattern-sorting and the engineer judged its performance and tweaked the synapses, it got a little better each time.
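
As a concrete companion to this description, the sketch below builds a small convolutional network in PyTorch whose successive layers play the role of the “edges, circles, faces” hierarchy, and trains it in the supervised way described: a human-provided label tells the network how wrong it was, and the weights (the “synapses”) are nudged accordingly. The layer comments and random data are illustrative only.

# Illustration of the layered pattern-finding and supervised training process
# described above. The layer comments are a cartoon of what such layers tend to
# pick up; the images and labels here are random placeholders.
import torch
import torch.nn as nn

network = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # early layer: edges and simple shapes
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), # middle layer: curves, circles, parts
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 2),                            # final layer: "face" vs "not a face"
)
optimizer = torch.optim.SGD(network.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(64, 3, 28, 28)              # stand-in for a mountain of photos
labels = torch.randint(0, 2, (64,))              # judgments supplied by a human

for step in range(50):
    predictions = network(images)
    loss = loss_fn(predictions, labels)          # how far off the human's answer we are
    optimizer.zero_grad()
    loss.backward()                              # propagate the error back through layers
    optimizer.step()                             # tweak the weights ("synapses")

print("final loss:", loss.item())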

And yet, just as in the Terminator movies, if humans are to be spared from this encroaching, invisible robot invasion, their likely savior will be found in other machines. Recent breakthroughs in neural network training hint at what will drive machine evolution to the next level, but also save us from algorithms that seek to manipulate us: an AI survival of the fittest. Newer, more advanced forms of deep learning involve the use of “generative adversarial networks.” In this type of system, two neural networks are paired off against each other in a potentially endless sparring match. The first network strains to create something that seems real—an image, a video, a human conversation—while the second network struggles to determine if it’s fake.
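
A minimal sketch of the adversarial setup, two networks sparring as described, follows. It is a toy PyTorch example over one-dimensional “data” rather than images or video, meant only to show the structure of the contest; the distribution the generator imitates is arbitrary.

# Toy generative adversarial network: a generator tries to fake samples from a
# simple "real" distribution, a discriminator tries to catch the fakes.
# Illustrative only; real image/video GANs are far larger.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
discriminator = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_samples(n):
    # The "real thing" the generator must imitate: numbers clustered around 4.0.
    return 4.0 + 0.5 * torch.randn(n, 1)

for step in range(2000):
    # 1) Train the discriminator to tell real from fake.
    real = real_samples(64)
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator.
    fake = generator(torch.randn(64, 8))
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

print("mean of generated samples:", generator(torch.randn(1000, 8)).mean().item())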

See information data localization, 89 Dawkins, Richard, 189, 190 de Tocqueville, Alexis, 121 deaths Chicago gangs, 13–14 Duterte’s “drug war,” 15 Gaza City, 194, 196 ISIS and, 153 Nairobi’s Westgate mall, 235 Russia and Ukraine, 201–2, 204 Turkey coup, 91–92 deep fakes, 253 deep learning, 249, 250, 256 deep web, 52 democracy (vs. trolls), 211, 241, 262–63, 265–66 Denver Guardian (fake newspaper), 132 Dewey, Caitlin, 137 Digital Forensic Research Lab, 138 Digital Millennium Copyright Act (DMCA), 225–26 digital trail, 60–61, 66 digital wildfires, 137 digital world war, 172 DigitaShadow, 213 diplomacy, 15, 202 discrediting anti-vaxxers, 124–25 campaigners, 116 of information, 123 journalists and activists, 15, 114 of truth, 140 See also trolls and trolling disinformation, 103–14, 137 botnets, 141–47 cybersecurity, 241 dangers of, 261 Russian botnets, 144–45 See also fake news; propaganda disintermediation, 54–55 diversity, 101 Dixson, Angee, 138–39, 140 Dogood, Silence, 29 Domino’s Pizza, hack, 195 Donbass News International (DNI), 108 Dorsey, Jack, 48 dot-com bubble, 44 Drew, Lori, 227–28 Ducca, Lauren, 116 Dumbledore’s Army, 172 Dupin, C.


pages: 411 words: 98,128

Bezonomics: How Amazon Is Changing Our Lives and What the World's Best Companies Are Learning From It by Brian Dumaine

activist fund / activist shareholder / activist investor, AI winter, Airbnb, Amazon Robotics, Amazon Web Services, Atul Gawande, autonomous vehicles, basic income, Bernie Sanders, Big Tech, Black Swan, call centre, Cambridge Analytica, carbon tax, Carl Icahn, Chris Urmson, cloud computing, corporate raider, creative destruction, Danny Hillis, data science, deep learning, Donald Trump, Elon Musk, Erik Brynjolfsson, Fairchild Semiconductor, fake news, fulfillment center, future of work, gig economy, Glass-Steagall Act, Google Glasses, Google X / Alphabet X, income inequality, independent contractor, industrial robot, Internet of things, Jeff Bezos, job automation, Joseph Schumpeter, Kevin Kelly, Kevin Roose, Lyft, Marc Andreessen, Mark Zuckerberg, military-industrial complex, money market fund, natural language processing, no-fly zone, Ocado, pets.com, plutocrats, race to the bottom, ride hailing / ride sharing, Salesforce, Sand Hill Road, self-driving car, shareholder value, Sheryl Sandberg, Silicon Valley, Silicon Valley startup, Snapchat, speech recognition, Steve Jobs, Stewart Brand, supply-chain management, TED Talk, Tim Cook: Apple, too big to fail, Travis Kalanick, two-pizza team, Uber and Lyft, uber lyft, universal basic income, warehouse automation, warehouse robotics, wealth creators, web application, Whole Earth Catalog, work culture

We place buy orders for millions of items automatically.” Under the old system, Wilke and his managers only had the bandwidth to focus on Amazon’s top-selling items, but at the scale it operates today, those conversations wouldn’t be possible. Now the original retail buying model that used to be stored in human brains is stored in deep learning algorithms—the thinking process is the same, but Amazon’s managers don’t have to repeat the same analyses over and over again. The other advantage is that the machines produce more consistent results. In the past, Amazon managers had their own spreadsheets and their unique models for making guesses about supply and demand.
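
To give a flavour of what a buying model “stored in deep learning algorithms” might look like in miniature, here is a hedged sketch: a small PyTorch network forecasts next-week demand for an item from recent sales history, and a simple rule turns the forecast into a buy order. Everything here, the synthetic data, the model size, and the reorder rule, is a made-up illustration, not Amazon’s system.

# Made-up miniature of a "buying model": forecast demand with a small network,
# then turn the forecast into an automatic buy order. Not Amazon's system.
import torch
import torch.nn as nn

HISTORY_WEEKS = 8
model = nn.Sequential(nn.Linear(HISTORY_WEEKS, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic sales history: each row is 8 weeks of sales; the target is week 9.
history = torch.rand(500, HISTORY_WEEKS) * 100
target = history.mean(dim=1, keepdim=True) + 5 * torch.randn(500, 1)

for step in range(500):
    loss = nn.functional.mse_loss(model(history), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

def place_buy_order(recent_sales, units_on_hand, safety_stock=10):
    """Illustrative reorder rule: cover the forecast plus a safety margin."""
    forecast = model(recent_sales.reshape(1, -1)).item()
    return max(0, round(forecast + safety_stock - units_on_hand))

item_sales = torch.tensor([40.0, 42, 38, 45, 50, 47, 44, 46])
print("units to order:", place_buy_order(item_sales, units_on_hand=20))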

Amazon spent more than two decades accumulating data on its customers and honing its AI programs to get to the point where the software is the business model. It comes as little surprise, then, that a 2019 survey by the research firm IDC found that only 25 percent of global corporations have an enterprise-wide AI strategy. Even at Amazon, machines are still far from perfect. If there is an aberration, the deep learning algorithms still aren’t smart enough to adjust on the fly. Say a hurricane hits New Orleans: the machines won’t know to stock more food and water there because it’s a random event. And the programs sometimes become outmoded. Wilke and his AI team are constantly evaluating the algorithms to make sure they’re maximizing business.


pages: 350 words: 109,379

How to Run a Government: So That Citizens Benefit and Taxpayers Don't Go Crazy by Michael Barber

Affordable Care Act / Obamacare, anti-fragile, Atul Gawande, battle of ideas, Berlin Wall, Black Swan, Checklist Manifesto, collapse of Lehman Brothers, collective bargaining, deep learning, deliberate practice, facts on the ground, failed state, fear of failure, full employment, G4S, illegal immigration, invisible hand, libertarian paternalism, Mark Zuckerberg, Nate Silver, North Sea oil, obamacare, performance metric, Potemkin village, Ronald Reagan, school choice, The Signal and the Noise by Nate Silver, transaction costs, WikiLeaks

There is widespread and significant progress which is becoming irreversible.21 If the routines are in place, the political leader can move from crucial detail to big picture, from nuts and bolts to overall design, from individual to nation because he or she, or at least the head of delivery, really knows what’s happening now. RULE 36 A FULL-SCALE REVIEW OF THE PROGRAMME AT LEAST ONCE A YEAR PROVIDES DEEP LEARNING (which can be acted on immediately) There is another walk in the Lake District that I love, this one less rugged, less dramatic and less hard work, not least because Rossett Gill is not involved. Even so, it is achingly beautiful and it has a personal connection because it involves crossing a wooded hillside once owned by my great-grandfather.

ROUTINES DON’T BE SPOOKED BY THE DEAFENING SILENCE (but keep listening) ANTICIPATE THE IMPLEMENTATION DIP (and demonstrate the leadership required to get through it) DEAL WITH CRISES (but don’t use them as an excuse) GOVERNMENT BY ROUTINE BEATS GOVERNMENT BY SPASM (it’s not even close) PREPARE MONTHLY NOTES FOR THE LEADER (and make them ‘deeply interesting’) ROUTINE MEETINGS OR STOCKTAKES CREATE FALSE DEADLINES (and solve problems before they become crises) A FULL-SCALE REVIEW OF THE PROGRAMME AT LEAST ONCE A YEAR PROVIDES DEEP LEARNING (which can be acted on immediately) UNDERSTAND THE WOOD AND THE TREES (and the view beyond) 6. PROBLEM-SOLVING CATEGORIZE PROBLEMS BY THEIR INTENSITY (and act accordingly) DIAGNOSE PROBLEMS PRECISELY (and act accordingly) TAKE ALL THE EXCUSES OFF THE TABLE LEARN ACTIVELY FROM EXPERIENCE (failure is a great teacher) NEGOTIATE ON THE BASIS OF PRINCIPLE (but don’t depend on it) GUARD AGAINST FOLLY (it has been common throughout history) 7.


pages: 374 words: 111,284

The AI Economy: Work, Wealth and Welfare in the Robot Age by Roger Bootle

"World Economic Forum" Davos, 3D printing, agricultural Revolution, AI winter, Albert Einstein, AlphaGo, Alvin Toffler, anti-work, antiwork, autonomous vehicles, basic income, Ben Bernanke: helicopter money, Bernie Sanders, Bletchley Park, blockchain, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Carl Icahn, Chris Urmson, computer age, Computing Machinery and Intelligence, conceptual framework, corporate governance, correlation does not imply causation, creative destruction, David Ricardo: comparative advantage, deep learning, DeepMind, deindustrialization, Demis Hassabis, deskilling, Dr. Strangelove, driverless car, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, facts on the ground, fake news, financial intermediation, full employment, future of work, Future Shock, general purpose technology, Great Leap Forward, Hans Moravec, income inequality, income per capita, industrial robot, Internet of things, invention of the wheel, Isaac Newton, James Watt: steam engine, Jeff Bezos, Jeremy Corbyn, job automation, job satisfaction, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Joseph Schumpeter, Kevin Kelly, license plate recognition, low interest rates, machine translation, Marc Andreessen, Mark Zuckerberg, market bubble, mega-rich, natural language processing, Network effects, new economy, Nicholas Carr, Ocado, Paul Samuelson, Peter Thiel, Phillips curve, positional goods, quantitative easing, RAND corporation, Ray Kurzweil, Richard Florida, ride hailing / ride sharing, rising living standards, road to serfdom, Robert Gordon, Robert Shiller, Robert Solow, Rutger Bregman, Second Machine Age, secular stagnation, self-driving car, seminal paper, Silicon Valley, Silicon Valley billionaire, Simon Kuznets, Skype, social intelligence, spinning jenny, Stanislav Petrov, Stephen Hawking, Steven Pinker, synthetic biology, technological singularity, The Future of Employment, The Wealth of Nations by Adam Smith, Thomas Malthus, trade route, universal basic income, US Airways Flight 1549, Vernor Vinge, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, We wanted flying cars, instead we got 140 characters, wealth creators, winner-take-all economy, world market for maybe five computers, Y2K, Yogi Berra

Over the last decade, however, a number of key developments have come together to power AI forward:
• Enormous growth in computer processing power.
• Rapid growth in available data.
• The development of improved technologies, including advances in text and image recognition (including facial recognition) as well as voice recognition.
• The development of “deep learning”.
• The advent of algorithm-based decision-making.
So now AI seems close to its “James Watt moment.” Just as the steam engine was in existence for some time before Watt developed it and it came to transform production, so AI, which has been on the scene for some time, is about to stage a leap forward.

But, of course, computers are used everywhere in the world. Moreover, if a country decided to eschew the use of computers because it did not produce them it would consign itself to the economic scrap heap. The same is true of AI. Just because your country does not produce AI – none of the algorithms, deep learning apps that are driving AI, nor the physical entities, such as robots – this does not mean that you cannot benefit by employing them. Indeed, if you don’t, you risk falling into economic irrelevance. That said, there is a marked difference of opinion among the technological cognoscenti about how innovation, including with regard to AI – and the gains from it – will be distributed globally.


pages: 390 words: 109,519

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media by Tarleton Gillespie

4chan, A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, algorithmic management, AltaVista, Amazon Mechanical Turk, borderless world, Burning Man, complexity theory, conceptual framework, crowdsourcing, deep learning, do what you love, Donald Trump, drone strike, easy for humans, difficult for computers, Edward Snowden, eternal september, fake news, Filter Bubble, Gabriella Coleman, game design, gig economy, Google Glasses, Google Hangouts, hiring and firing, Ian Bogost, independent contractor, Internet Archive, Jean Tirole, John Gruber, Kickstarter, Mark Zuckerberg, mass immigration, Menlo Park, Minecraft, moral panic, multi-sided market, Netflix Prize, Network effects, pattern recognition, peer-to-peer, power law, real-name policy, recommendation engine, Rubik’s Cube, Salesforce, sharing economy, Silicon Valley, Skype, slashdot, Snapchat, social graph, social web, Steve Jobs, Stewart Brand, TED Talk, Telecommunications Act of 1996, two-sided market, WikiLeaks, Yochai Benkler

., “Naked Image Detection Based on Adaptive and Extensible Skin Color Model.” 92James Sutton, “Improving Nudity Detection and NSFW Image Recognition,” KD Nuggets, June 25, 2016, http://www.kdnuggets.com/2016/06/algorithmia-improving-nudity-detection-nsfw-image-recognition.html. 93Sengamedu, Sanyal, and Satish, “Detection of Pornographic Content in Internet Images.” 94Jay Mahadeokar and Gerry Pesavento, “Open Sourcing a Deep Learning Solution for Detecting NSFW Images,” Yahoo Engineering blog, September 30, 2016, https://yahooeng.tumblr.com/post/151148689421/open-sourcing-a-deep-learning-solution-for. 95Agarwal and Sureka, “A Focused Crawler”; Djuric et al., “Hate Speech Detection with Comment Embeddings”; Sood, Antin, and Churchill, “Profanity Use in Online Communities”; Warner and Hirschberg, “Detecting Hate Speech on the World Wide Web.” 96Brendan Maher, “Can a Video Game Company Tame Toxic Behaviour?”


pages: 334 words: 109,882

Quit Like a Woman: The Radical Choice to Not Drink in a Culture Obsessed With Alcohol by Holly Glenn Whitaker

BIPOC, Black Lives Matter, cognitive dissonance, deep learning, Donald Trump, Elon Musk, en.wikipedia.org, epigenetics, fake news, fixed income, impulse control, incognito mode, Mahatma Gandhi, mass incarceration, medical residency, microaggression, microbiome, Naomi Klein, Nelson Mandela, opioid epidemic / opioid crisis, Peter Thiel, Rat Park, rent control, Rosa Parks, Silicon Valley, Snapchat, Steve Jobs, TED Talk, Torches of Freedom, twin studies, WeWork, white picket fence, young professional, zero-sum game

This practice gives you the chance to stay with the parts of yourself that are hurting, the parts that need you not to go somewhere else, like down the neck of a bottle or the rabbit hole of social media. RASINS also gives you something else, access to a practice known as deep learning. People learn better when they are challenging themselves just outside their ability (where things are not too hard and not too easy). In this state of deep learning, when we are trying to accomplish something difficult, we lay thicker neural networks, meaning that—with practice—our skill at not drinking becomes more potent than our skill of drinking. Lastly, it breaks the cycle of cause and effect: not reacting means not feeding the habit.


Reset by Ronald J. Deibert

23andMe, active measures, air gap, Airbnb, Amazon Web Services, Anthropocene, augmented reality, availability heuristic, behavioural economics, Bellingcat, Big Tech, bitcoin, blockchain, blood diamond, Brexit referendum, Buckminster Fuller, business intelligence, Cal Newport, call centre, Cambridge Analytica, carbon footprint, cashless society, Citizen Lab, clean water, cloud computing, computer vision, confounding variable, contact tracing, contact tracing app, content marketing, coronavirus, corporate social responsibility, COVID-19, crowdsourcing, data acquisition, data is the new oil, decarbonisation, deep learning, deepfake, Deng Xiaoping, disinformation, Donald Trump, Doomsday Clock, dual-use technology, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, Evgeny Morozov, failed state, fake news, Future Shock, game design, gig economy, global pandemic, global supply chain, global village, Google Hangouts, Great Leap Forward, high-speed rail, income inequality, information retrieval, information security, Internet of things, Jaron Lanier, Jeff Bezos, John Markoff, Lewis Mumford, liberal capitalism, license plate recognition, lockdown, longitudinal study, Mark Zuckerberg, Marshall McLuhan, mass immigration, megastructure, meta-analysis, military-industrial complex, move fast and break things, Naomi Klein, natural language processing, New Journalism, NSO Group, off-the-grid, Peter Thiel, planetary scale, planned obsolescence, post-truth, proprietary trading, QAnon, ransomware, Robert Mercer, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, single source of truth, Skype, Snapchat, social distancing, sorting algorithm, source of truth, sovereign wealth fund, sparse data, speech recognition, Steve Bannon, Steve Jobs, Stuxnet, surveillance capitalism, techlash, technological solutionism, the long tail, the medium is the message, The Structural Transformation of the Public Sphere, TikTok, TSMC, undersea cable, unit 8200, Vannevar Bush, WikiLeaks, zero day, zero-sum game

Central Asian countries like Uzbekistan and Kazakhstan have even gone so far as to advertise for Bitcoin mining operations to be hosted in their jurisdictions because of cheap and plentiful coal and other fossil-fuelled energy sources.349 Some estimates put electric energy consumption associated with Bitcoin mining at around 83.67 terawatt-hours per year, more than that of the entire country of Finland, with carbon emissions estimated at 33.82 megatons, roughly equivalent to those of Denmark.350 To put it another way, the Cambridge Centre for Alternative Finance says that the electricity consumed by the Bitcoin network in one year could power all the teakettles used to boil water in the entire United Kingdom for nineteen years.351 A similar energy-sucking dynamic underlies other cutting-edge technologies, like “deep learning.” The latter refers to the complex artificial intelligence systems used to undertake the fine-grained, real-time calculations associated with the range of social media experiences, such as computer vision, speech recognition, natural language processing, audio recognition, social network filtering, and so on.

Training a single AI model can emit as much carbon as five cars in their lifetimes. Retrieved from https://www.technologyreview.com/2019/06/06/239031/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/; Strubell, E., Ganesh, A., & McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Retrieved from https://arxiv.org/abs/1906.02243 Data centres are “hidden monuments” to our excessive data consumption: Hogan, M. (2015). Facebook data storage centers as the archive’s underbelly. Television & New Media, 16(1), 3–18. https://doi.org/10.1177/1527476413509415; See also Hogan, M. (2015).


pages: 489 words: 106,008

Risk: A User's Guide by Stanley McChrystal, Anna Butrico

"Hurricane Katrina" Superdome, Abraham Maslow, activist fund / activist shareholder / activist investor, airport security, Albert Einstein, Apollo 13, banking crisis, Bernie Madoff, Boeing 737 MAX, business process, cognitive dissonance, collapse of Lehman Brothers, collateralized debt obligation, computer vision, coronavirus, corporate governance, cotton gin, COVID-19, cuban missile crisis, deep learning, disinformation, don't be evil, Dr. Strangelove, fake news, fear of failure, George Floyd, Glass-Steagall Act, global pandemic, Googley, Greta Thunberg, hindsight bias, inflight wifi, invisible hand, iterative process, late fees, lockdown, Paul Buchheit, Ponzi scheme, QWERTY keyboard, ride hailing / ride sharing, Ronald Reagan, San Francisco homelessness, School Strike for Climate, Scientific racism, Silicon Valley, Silicon Valley startup, Skype, social distancing, source of truth, Stanislav Petrov, Steve Jobs, Thomas L Friedman, too big to fail, Travis Kalanick, wikimedia commons, work culture

Google eagerly signed the contract. Now identifying as an AI company (not a data company, as it had been formerly known), Google would create a “customized AI surveillance engine” to scour the DoD’s massive amount of footage. Google’s computer vision, which incorporated both machine learning and deep learning, would analyze the data to track the movements of vehicles and other objects. As they quietly engaged with Project Maven, Google’s AI services showed initial progress—Google’s software had greater success than humans in detecting important footage.

aunt jemima ■ Aunt Jemima is a brand of syrup and pancake mix and other foods, whose packaging features the image of the eponymous character originally appropriated from nineteenth-century minstrel shows.

“AI arms race”: Cheryl Pellerin, “Project Maven to Deploy Computer Algorithms to War Zone by Year’s End,” US Department of Defense, July 21, 2017, https://defense.gov/Explore/News/Article/Article/1254719/project-maven-to-deploy-computer-algorithms-to-war-zone-by-years-end/. identifying as an AI company: Pellerin, “Project Maven to Deploy Computer Algorithms to War Zone by Year’s End.” “customized AI surveillance engine”: Letter to Sundar Pichai, https://static01.nyt.com/files/2018/technology/googleletter.pdf. both machine learning and deep learning: Pellerin, “Project Maven to Deploy Computer Algorithms to War Zone by Year’s End.” track the movements: Letter to Sundar Pichai. software had greater success: Scheiber and Conger, “Great Google Revolt.” nineteenth-century minstrel shows: Beatrice Dupuy, “No Evidence Former Slave Who Helped Launch Aunt Jemima Products Became a Millionaire,” AP, June 19, 2020, https://apnews.com/afs:Content:9030960288.


pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory by Kariappa Bheemaiah

"World Economic Forum" Davos, accounting loophole / creative accounting, Ada Lovelace, Adam Curtis, Airbnb, Alan Greenspan, algorithmic trading, asset allocation, autonomous vehicles, balance sheet recession, bank run, banks create money, Basel III, basic income, behavioural economics, Ben Bernanke: helicopter money, bitcoin, Bletchley Park, blockchain, Bretton Woods, Brexit referendum, business cycle, business process, call centre, capital controls, Capital in the Twenty-First Century by Thomas Piketty, cashless society, cellular automata, central bank independence, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, complexity theory, constrained optimization, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-border payments, crowdsourcing, cryptocurrency, data science, David Graeber, deep learning, deskilling, Diane Coyle, discrete time, disruptive innovation, distributed ledger, diversification, double entry bookkeeping, Ethereum, ethereum blockchain, fiat currency, financial engineering, financial innovation, financial intermediation, Flash crash, floating exchange rates, Fractional reserve banking, full employment, George Akerlof, Glass-Steagall Act, Higgs boson, illegal immigration, income inequality, income per capita, inflation targeting, information asymmetry, interest rate derivative, inventory management, invisible hand, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Joseph Schumpeter, junk bonds, Kenneth Arrow, Kenneth Rogoff, Kevin Kelly, knowledge economy, large denomination, Large Hadron Collider, Lewis Mumford, liquidity trap, London Whale, low interest rates, low skilled workers, M-Pesa, machine readable, Marc Andreessen, market bubble, market fundamentalism, Mexican peso crisis / tequila crisis, Michael Milken, MITM: man-in-the-middle, Money creation, money market fund, money: store of value / unit of account / medium of exchange, mortgage debt, natural language processing, Network effects, new economy, Nikolai Kondratiev, offshore financial centre, packet switching, Pareto efficiency, pattern recognition, peer-to-peer lending, Ponzi scheme, power law, precariat, pre–internet, price mechanism, price stability, private sector deleveraging, profit maximization, QR code, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, Real Time Gross Settlement, rent control, rent-seeking, robo advisor, Satoshi Nakamoto, Satyajit Das, Savings and loan crisis, savings glut, seigniorage, seminal paper, Silicon Valley, Skype, smart contracts, software as a service, software is eating the world, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, supply-chain management, technology bubble, The Chicago School, The Future of Employment, The Great Moderation, the market place, The Nature of the Firm, the payments system, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, trade liberalization, transaction costs, Turing machine, Turing test, universal basic income, Vitalik Buterin, Von Neumann architecture, Washington Consensus

Those jobs are going to get decimated, literally.”
Advantages: greater inclusion, increased competition, data standardization
Risks: compliance costs, regulation blocks risk monitoring, and technological unemployment
4. Capital Markets
Stance: Business-facing
Main technologies: Trading Algorithms, Big Data, Neural Nets, Machine/Deep Learning, AI
If we were to increase the scale, speed, and volume of the transactions and services stated in the private wealth management industry, we would find ourselves in the high-frequency trading (HFT) world of capital markets, which encompasses the trade and management of private equity, commodities, and derivatives.

As technologies such as the Blockchain begin to remove central points of control, the evolving digital and decentralized structure of markets today are challenging the predefined theories on productivity, risk allocation, and labor requirements. Increased automation, propelled by rapid advancements in machine/deep learning, mobile payments, robotics, and the exponential increase in the computerization of tasks, is leading to the development of networked, on- demand businesses which are transforming and reorganizing firms and establishing new skill requirements across the entire economy. As tasks are digitized and operations are networked, processes can be codified and then replicated.


pages: 706 words: 202,591

Facebook: The Inside Story by Steven Levy

active measures, Airbnb, Airbus A320, Amazon Mechanical Turk, AOL-Time Warner, Apple's 1984 Super Bowl advert, augmented reality, Ben Horowitz, Benchmark Capital, Big Tech, Black Lives Matter, Blitzscaling, blockchain, Burning Man, business intelligence, Cambridge Analytica, cloud computing, company town, computer vision, crowdsourcing, cryptocurrency, data science, deep learning, disinformation, don't be evil, Donald Trump, Dunbar number, East Village, Edward Snowden, El Camino Real, Elon Musk, end-to-end encryption, fake news, Firefox, Frank Gehry, Geoffrey Hinton, glass ceiling, GPS: selective availability, growth hacking, imposter syndrome, indoor plumbing, information security, Jeff Bezos, John Markoff, Jony Ive, Kevin Kelly, Kickstarter, lock screen, Lyft, machine translation, Mahatma Gandhi, Marc Andreessen, Marc Benioff, Mark Zuckerberg, Max Levchin, Menlo Park, Metcalfe’s law, MITM: man-in-the-middle, move fast and break things, natural language processing, Network effects, Oculus Rift, operational security, PageRank, Paul Buchheit, paypal mafia, Peter Thiel, pets.com, post-work, Ray Kurzweil, recommendation engine, Robert Mercer, Robert Metcalfe, rolodex, Russian election interference, Salesforce, Sam Altman, Sand Hill Road, self-driving car, sexual politics, Sheryl Sandberg, Shoshana Zuboff, side project, Silicon Valley, Silicon Valley startup, skeuomorphism, slashdot, Snapchat, social contagion, social graph, social software, South of Market, San Francisco, Startup school, Steve Ballmer, Steve Bannon, Steve Jobs, Steven Levy, Steven Pinker, surveillance capitalism, tech billionaire, techlash, Tim Cook: Apple, Tragedy of the Commons, web application, WeWork, WikiLeaks, women in the workforce, Y Combinator, Y2K, you are the product

He wasn’t thinking about content moderation then, but rather improvement in things like News Feed ranking, better targeting in ad auctions, and facial recognition to better identify your friends in photographs, so you’d engage more with those posts. But the competition to hire AI wizards was fierce. The godfather of deep learning was a British computer scientist working in Toronto named Geoffrey Hinton. He was like the Batman of this new and irreverent form of AI, and his acolytes were a trio of brilliant Robins who individually were making their own huge contributions. One of the Robins, a Parisian named Yann LeCun, jokingly dubbed Hinton’s movement “the Conspiracy.” But the potential of deep learning was no joke to the big tech companies who saw it as a way to perform amazing tasks at scale, everything from facial recognition to instant translation from one language to another.

In the earliest days Facebook did hire some people adept in AI, and both the News Feed and the ad auction were fueled by learning algorithms. But beginning in the mid-2010s one particular approach known as machine learning began to accumulate amazing results, suddenly putting AI to use in a number of practical cases. This supercharged iteration on machine learning was called deep learning. It worked by training networks of artificial neurons—working somewhat like the actual neurons in the human brain—to rapidly identify things like objects in images, or spoken words. Zuckerberg felt that this was another moment like mobile, where the winners would be those who had the best machine-learning engineers.
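As a generic illustration of the idea described in this excerpt—a network of artificial neurons trained to recognize things such as objects in images—here is a minimal Keras sketch on handwritten digits. It is a textbook toy example, not anything resembling Facebook's production systems.

# Train a small network of artificial "neurons" to recognize handwritten digits.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixel values to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # image pixels in
    tf.keras.layers.Dense(128, activation="relu"),    # hidden layer of neurons
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))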


pages: 159 words: 42,401

Snowden's Box: Trust in the Age of Surveillance by Jessica Bruder, Dale Maharidge

air gap, anti-communist, Bay Area Rapid Transit, Berlin Wall, Black Lives Matter, blockchain, Broken windows theory, Burning Man, Cambridge Analytica, cashless society, Chelsea Manning, citizen journalism, computer vision, crowdsourcing, deep learning, digital rights, disinformation, Donald Trump, Edward Snowden, Elon Musk, end-to-end encryption, Evgeny Morozov, Ferguson, Missouri, Filter Bubble, Firefox, information security, Internet of things, Jeff Bezos, Jessica Bruder, John Perry Barlow, Julian Assange, Laura Poitras, license plate recognition, Mark Zuckerberg, mass incarceration, medical malpractice, messenger bag, Neil Armstrong, Nomadland, Occupy movement, off grid, off-the-grid, pattern recognition, Peter Thiel, Robert Bork, Seymour Hersh, Shoshana Zuboff, Silicon Valley, Skype, social graph, Steven Levy, surveillance capitalism, tech bro, Tim Cook: Apple, web of trust, WikiLeaks

— Leonard Cohen, “Everybody Knows” On a September night in 2017, New York Times tech columnist Farhad Manjoo and his wife were getting ready to sleep when a blood-curdling shriek arose from the bedside. It was Alexa. “The voice assistant began to wail, like a child screaming in a horror-movie dream,” Manjoo later recalled. His Twitter followers greeted the news with sarcasm and satirical advice: “You have an always-on, deep-learning supercomputer node in your house always listening and you are surprised it screams?” wrote one. “Why voluntarily have CIA spy tech in your home?” asked another. “If I were you, I’d keep all the network access wires in one place and keep an axe nearby,” advised a third. Suspicions about Alexa were already running high; a hacker had recently demonstrated how an Echo could be transformed into a wiretap.


pages: 451 words: 125,201

What We Owe the Future: A Million-Year View by William MacAskill

Ada Lovelace, agricultural Revolution, Albert Einstein, Alignment Problem, AlphaGo, artificial general intelligence, Bartolomé de las Casas, Bletchley Park, British Empire, Brownian motion, carbon footprint, carbon tax, charter city, clean tech, coronavirus, COVID-19, cuban missile crisis, decarbonisation, deep learning, DeepMind, Deng Xiaoping, different worldview, effective altruism, endogenous growth, European colonialism, experimental subject, feminist movement, framing effect, friendly AI, global pandemic, GPT-3, hedonic treadmill, Higgs boson, income inequality, income per capita, Indoor air pollution, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Islamic Golden Age, iterative process, Jeff Bezos, job satisfaction, lab leak, Lao Tzu, Large Hadron Collider, life extension, lockdown, long peace, low skilled workers, machine translation, Mars Rover, negative emissions, Nick Bostrom, nuclear winter, OpenAI, Peter Singer: altruism, Peter Thiel, QWERTY keyboard, Robert Gordon, Rutger Bregman, Sam Altman, seminal paper, Shenzhen special economic zone , Shenzhen was a fishing village, Silicon Valley, special economic zone, speech recognition, Stanislav Petrov, stem cell, Steven Pinker, strong AI, synthetic biology, total factor productivity, transatlantic slave trade, Tyler Cowen, William MacAskill, women in the workforce, working-age population, World Values Survey, Y Combinator

After correcting for the unprecedented amount of hardware DeepMind was willing to employ, it is not clear whether AlphaGo deviates from the trend of algorithmic improvements at all (Brundage 2016). 37. More specifically, most AI breakthroughs have been due to a particular approach to machine learning that uses multilayered neural networks, known as “deep learning” (Goodfellow et al. 2016; LeCun et al. 2015). At the time of writing, the state-of-the-art AI for text-based applications are so-called transformers, which include Google’s BERT and OpenAI’s GPT-3 (T. Brown et al. 2020; Devlin et al. 2019; Vaswani et al. 2017). Transformers have also been successfully used for tasks involving audio (Child et al. 2019), images (M.

The highest-profile AI achievements in real-time strategy games were DeepMind’s AlphaStar defeat of human grandmasters in the game StarCraft II and the OpenAI Five’s defeat of human world champions in Dota 2 (OpenAI et al. 2019; Vinyals et al. 2019). Early successes in image classification (see, e.g., Krizhevsky et al. 2012) are widely seen as having been key for demonstrating the potential of deep learning. See also the following: speech recognition, Abdel-Hamid et al. (2014); Ravanelli et al. (2019); music, Briot et al. (2020); Choi et al. (2018); Magenta (n.d.); visual art, Gatys et al. (2016); Lecoutre et al. (2017). Building on astonishing progress demonstrated by Ramesh et al. (2021), the ability to create images from text descriptions by combining two AI systems known as VQGAN (Esser et al. 2021) and CLIP (OpenAI 2021b; Radford et al. 2021) caused a Twitter sensation (Miranda 2021). 38.


pages: 492 words: 141,544

Red Moon by Kim Stanley Robinson

artificial general intelligence, basic income, blockchain, Brownian motion, correlation does not imply causation, cryptocurrency, deep learning, Deng Xiaoping, gig economy, Great Leap Forward, Hyperloop, illegal immigration, income inequality, invisible hand, Ken Thompson, Kim Stanley Robinson, low earth orbit, machine translation, Magellanic Cloud, megacity, Neil Armstrong, precariat, quantum entanglement, Schrödinger's Cat, seigniorage, strong AI, Turing machine, universal basic income, zero-sum game

He sat down and began to ponder again the problem of programming self-improvement into an AI. New work from Chengdu on rather simple Monte Carlo tree searches and combinatorial optimization had given him some ideas. Deep learning was alas very shallow whenever it left closed sets of rules and data; the name was a remnant of early AI hype. If you wanted to win a game like chess or go, fine, but when immersed in the larger multivariant world, AI needed more than deep learning. It needed to incorporate the symbolic logic of earlier AI attempts, and the various programs that instructed an AI to pursue “child’s play,” meaning randomly created activities and improvements.


Doppelganger: A Trip Into the Mirror World by Naomi Klein

"World Economic Forum" Davos, 2021 United States Capitol attack, 3D printing, anti-communist, anti-globalists, autism spectrum disorder, benefit corporation, Bernie Sanders, Big Tech, bitcoin, Black Lives Matter, blood diamond, Boris Johnson, Boycotts of Israel, Cambridge Analytica, capitalist realism, ChatGPT, citizen journalism, Climategate, contact tracing, coronavirus, COVID-19, crisis actor, critical race theory, dark matter, deep learning, deepfake, deplatforming, disinformation, Donald Trump, Edward Snowden, Elon Musk, fake news, false flag, feminist movement, George Floyd, glass ceiling, global pandemic, global supply chain, green new deal, Greta Thunberg, hive mind, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, Jeffrey Epstein, Johann Wolfgang von Goethe, lab leak, Lewis Mumford, lockdown, Mark Zuckerberg, mass immigration, mass incarceration, medical residency, military-industrial complex, moral panic, Naomi Klein, Nelson Mandela, neurotypical, new economy, opioid epidemic / opioid crisis, Parler "social media", pattern recognition, Peter Thiel, phenotype, profit motive, QAnon, QR code, Ralph Waldo Emerson, randomized controlled trial, Rosa Parks, Scientific racism, Scramble for Africa, shared worldview, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, Slavoj Žižek, smart cities, social distancing, Steve Bannon, surveillance capitalism, tech billionaire, tech bro, The Wealth of Nations by Adam Smith, TikTok, trade route, transatlantic slave trade, trickle-down economics, union organizing, W. E. B. Du Bois, Wayback Machine, women in the workforce

Yet now we find ourselves neck-deep in a system where, as with my own real-life doppelganger, the stakes are distinctly higher. Personal data, extracted without full knowledge or understanding, is sold to third parties and can influence everything from what loans we are eligible for to what job postings we see—to whether our jobs are replaced by deep learning bots that have gotten shockingly good at impersonating us. And those helpful recommendations and eerie impersonations come from the same algorithms that have led countless people down perilous information tunnels that end in comparing a vaccine app to the Holocaust and may yet end up somewhere far more dangerous.

“An invented past can never be used; it cracks and crumbles under the pressures of life like clay in a season of drought,” James Baldwin wrote. However, “to accept one’s past—one’s history—is not the same thing as drowning in it; it is learning how to use it.” Many Indigenous friends and neighbors I spoke with, though raw with grief and rage, expressed cautious hope that this kind of deep learning might actually be afoot. In an interview with The Globe and Mail, Norman Retasket, a survivor of the Kamloops school, observed, “If I told the same story three years ago,” about what happened at the school, it would have been seen as “fiction.” Now his stories are believed. “The story hasn’t changed,” he said.


pages: 181 words: 52,147

The Driver in the Driverless Car: How Our Technology Choices Will Create the Future by Vivek Wadhwa, Alex Salkever

23andMe, 3D printing, Airbnb, AlphaGo, artificial general intelligence, augmented reality, autonomous vehicles, barriers to entry, benefit corporation, Bernie Sanders, bitcoin, blockchain, clean water, correlation does not imply causation, CRISPR, deep learning, DeepMind, distributed ledger, Donald Trump, double helix, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, gigafactory, Google bus, Hyperloop, income inequality, information security, Internet of things, job automation, Kevin Kelly, Khan Academy, Kickstarter, Law of Accelerating Returns, license plate recognition, life extension, longitudinal study, Lyft, M-Pesa, Mary Meeker, Menlo Park, microbiome, military-industrial complex, mobile money, new economy, off-the-grid, One Laptop per Child (OLPC), personalized medicine, phenotype, precision agriculture, radical life extension, RAND corporation, Ray Kurzweil, recommendation engine, Ronald Reagan, Second Machine Age, self-driving car, seminal paper, Silicon Valley, Skype, smart grid, stem cell, Stephen Hawking, Steve Wozniak, Stuxnet, supercomputer in your pocket, synthetic biology, Tesla Model S, The future is already here, The Future of Employment, Thomas Davenport, Travis Kalanick, Turing test, Uber and Lyft, Uber for X, uber lyft, uranium enrichment, Watson beat the top human players on Jeopardy!, zero day

To me, the crux of this matter will be maintaining the ability of humans to understand robots and stop them from going too far. Google is looking at building in a kill switch on its A.I. systems.12 Other researchers are developing tools to visualize the otherwise impenetrable code in machine-generated algorithms built using Deep Learning systems. So the question that we must always be able to answer in the affirmative is whether we can stop it. With both A.I. and robotics, we must design all systems with this key consideration in mind, even if that reduces the capabilities and emergent properties of those systems and robots. Will All Benefit Equally?


Mastering Machine Learning With Scikit-Learn by Gavin Hackeling

backpropagation, computer vision, constrained optimization, correlation coefficient, data science, Debian, deep learning, distributed generation, iterative process, natural language processing, Occam's razor, optical character recognition, performance metric, recommendation engine

For these reasons, this representation is ineffective for tasks that involve photographs or other natural images. Modern computer vision applications frequently use either hand-engineered feature extraction methods that are applicable to many different problems, or automatically learn features without supervision using techniques such as deep learning. We will focus on the former in the next section.

Extracting points of interest as features

The feature vector we created previously represents every pixel in the image; all of the informative attributes of the image are represented and all of the noisy attributes are represented too.
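To illustrate the contrast between raw pixel features and extracted points of interest, here is a small sketch using scikit-image; the sample image, the Harris corner detector, and the parameters are illustrative choices, not necessarily the ones the book itself goes on to use.

# Extract corner "points of interest" instead of using every raw pixel as a feature.
import numpy as np
from skimage import data
from skimage.color import rgb2gray
from skimage.feature import corner_harris, corner_peaks

image = rgb2gray(data.astronaut())                  # built-in 512x512 sample image
response = corner_harris(image)                     # Harris corner response map
corners = corner_peaks(response, min_distance=5)    # (row, col) coordinates of corners

# A compact feature vector built from the detected interest points,
# rather than from all of the image's raw pixel values.
features = corners.flatten()
print(image.size, "raw pixel values vs", features.size, "interest-point values")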


pages: 543 words: 153,550

Model Thinker: What You Need to Know to Make Data Work for You by Scott E. Page

Airbnb, Albert Einstein, Alfred Russel Wallace, algorithmic trading, Alvin Roth, assortative mating, behavioural economics, Bernie Madoff, bitcoin, Black Swan, blockchain, business cycle, Capital in the Twenty-First Century by Thomas Piketty, Checklist Manifesto, computer age, corporate governance, correlation does not imply causation, cuban missile crisis, data science, deep learning, deliberate practice, discrete time, distributed ledger, Easter island, en.wikipedia.org, Estimating the Reproducibility of Psychological Science, Everything should be made as simple as possible, experimental economics, first-price auction, Flash crash, Ford Model T, Geoffrey West, Santa Fe Institute, germ theory of disease, Gini coefficient, Higgs boson, High speed trading, impulse control, income inequality, Isaac Newton, John von Neumann, Kenneth Rogoff, knowledge economy, knowledge worker, Long Term Capital Management, loss aversion, low skilled workers, Mark Zuckerberg, market design, meta-analysis, money market fund, multi-armed bandit, Nash equilibrium, natural language processing, Network effects, opioid epidemic / opioid crisis, p-value, Pareto efficiency, pattern recognition, Paul Erdős, Paul Samuelson, phenotype, Phillips curve, power law, pre–internet, prisoner's dilemma, race to the bottom, random walk, randomized controlled trial, Richard Feynman, Richard Thaler, Robert Solow, school choice, scientific management, sealed-bid auction, second-price auction, selection bias, six sigma, social graph, spectrum auction, statistical model, Stephen Hawking, Supply of New York City Cabdrivers, systems thinking, tacit knowledge, The Bell Curve by Richard Herrnstein and Charles Murray, The Great Moderation, the long tail, The Rise and Fall of American Growth, the rule of 72, the scientific method, The Spirit Level, the strength of weak ties, The Wisdom of Crowds, Thomas Malthus, Thorstein Veblen, Tragedy of the Commons, urban sprawl, value at risk, web application, winner-take-all economy, zero-sum game

He discovered the orbits to be consistent with the presence of a large planet in the outer region of the solar system. On September 18, 1846, he sent his prediction to the Berlin Observatory. Five days later, astronomers located the planet Neptune exactly where Le Verrier had predicted it would be. That said, prediction differs from explanation. A model can predict without explaining. Deep-learning algorithms can predict product sales, tomorrow’s weather, price trends, and some health outcomes, but they offer little in the way of explanation. Such models resemble bomb-sniffing dogs. Even though a dog’s olfactory system can determine whether a package contains explosives, we should not look to the dog for an explanation of why the bomb is there, how it works, or how to disarm it.

In this example, a straight line classifies nearly perfectly.7
[Figure M1: Using a Linear Model to Classify Voting Behavior]
Nonlinear classifications: In figure M2, positives (+) represent frequent flyers, consumers who fly more than 10,000 miles per year, and negatives (-) represent all other customers of an airline. People of middle age and higher income are more likely to fly. To classify these data requires a nonlinear model, which could be estimated using deep-learning algorithms, such as neural networks. Neural networks include more variables so that they can fit almost any curve.
[Figure M2: Using a Nonlinear Model to Classify Frequent Flyers]
Forests of decision trees: In figure M3, positives (+) represent people who attended a science fiction convention based on their age and the hours per week they spend on the internet.
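A minimal sketch of the nonlinear-classification point above, using a small scikit-learn neural network on synthetic age/income data; the data-generating rule and hyperparameters are invented for illustration and are not taken from the book.

# Fit a small neural network to 2D data where no straight line separates the classes.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
age = rng.uniform(18, 80, 500)
income = rng.uniform(20_000, 200_000, 500)
# Hypothetical rule: middle-aged, higher-income people are the frequent flyers (+).
y = ((age > 35) & (age < 60) & (income > 90_000)).astype(int)

X = np.column_stack([age, income])
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))

With only two inputs, the hidden layers are what give the model the curved decision boundary that a single straight line cannot provide.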


pages: 215 words: 59,188

Seriously Curious: The Facts and Figures That Turn Our World Upside Down by Tom Standage

"World Economic Forum" Davos, agricultural Revolution, augmented reality, autonomous vehicles, Big Tech, blood diamond, business logic, corporate governance, CRISPR, deep learning, Deng Xiaoping, Donald Trump, Dr. Strangelove, driverless car, Elon Musk, failed state, financial independence, gender pay gap, gig economy, Gini coefficient, high net worth, high-speed rail, income inequality, index fund, industrial robot, Internet of things, invisible hand, it's over 9,000, job-hopping, Julian Assange, life extension, Lyft, M-Pesa, Mahatma Gandhi, manufacturing employment, mega-rich, megacity, Minecraft, mobile money, natural language processing, Nelson Mandela, plutocrats, post-truth, price mechanism, private spaceflight, prosperity theology / prosperity gospel / gospel of success, purchasing power parity, ransomware, reshoring, ride hailing / ride sharing, Ronald Coase, self-driving car, Silicon Valley, Snapchat, South China Sea, speech recognition, stem cell, supply-chain management, transaction costs, Uber and Lyft, uber lyft, undersea cable, US Airways Flight 1549, WikiLeaks, zoonotic diseases

This narrows the system’s guesswork considerably. In recent years, machine-learning approaches have made rapid progress, for three reasons. First, computers are far more powerful. Second, they can learn from huge and growing stores of data, whether publicly available on the internet or privately gathered by firms. Third, so-called “deep learning” methods have combined faster computers and more abundant data with new training algorithms and more complex architectures that can learn from example even more efficiently. All this means that computers are now impressively competent at handling spoken requests that require a narrowly defined reply.


pages: 918 words: 257,605

The Age of Surveillance Capitalism by Shoshana Zuboff

"World Economic Forum" Davos, algorithmic bias, Amazon Web Services, Andrew Keen, augmented reality, autonomous vehicles, barriers to entry, Bartolomé de las Casas, behavioural economics, Berlin Wall, Big Tech, bitcoin, blockchain, blue-collar work, book scanning, Broken windows theory, California gold rush, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, choice architecture, citizen journalism, Citizen Lab, classic study, cloud computing, collective bargaining, Computer Numeric Control, computer vision, connected car, context collapse, corporate governance, corporate personhood, creative destruction, cryptocurrency, data science, deep learning, digital capitalism, disinformation, dogs of the Dow, don't be evil, Donald Trump, Dr. Strangelove, driverless car, Easter island, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, Evgeny Morozov, facts on the ground, fake news, Ford Model T, Ford paid five dollars a day, future of work, game design, gamification, Google Earth, Google Glasses, Google X / Alphabet X, Herman Kahn, hive mind, Ian Bogost, impulse control, income inequality, information security, Internet of things, invention of the printing press, invisible hand, Jean Tirole, job automation, Johann Wolfgang von Goethe, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Kevin Roose, knowledge economy, Lewis Mumford, linked data, longitudinal study, low skilled workers, Mark Zuckerberg, market bubble, means of production, multi-sided market, Naomi Klein, natural language processing, Network effects, new economy, Occupy movement, off grid, off-the-grid, PageRank, Panopticon Jeremy Bentham, pattern recognition, Paul Buchheit, performance metric, Philip Mirowski, precision agriculture, price mechanism, profit maximization, profit motive, public intellectual, recommendation engine, refrigerator car, RFID, Richard Thaler, ride hailing / ride sharing, Robert Bork, Robert Mercer, Salesforce, Second Machine Age, self-driving car, sentiment analysis, shareholder value, Sheryl Sandberg, Shoshana Zuboff, Sidewalk Labs, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, slashdot, smart cities, Snapchat, social contagion, social distancing, social graph, social web, software as a service, speech recognition, statistical model, Steve Bannon, Steve Jobs, Steven Levy, structural adjustment programs, surveillance capitalism, technological determinism, TED Talk, The Future of Employment, The Wealth of Nations by Adam Smith, Tim Cook: Apple, two-sided market, union organizing, vertical integration, Watson beat the top human players on Jeopardy!, winner-take-all economy, Wolfgang Streeck, work culture , Yochai Benkler, you are the product

The company then issued nonvoting class “C” shares in 2016, solidifying Zuckerberg’s personal control over every decision.14 While financial scholars and investors debated the consequences of these share structures, absolute corporate control enabled the Google and Facebook founders to aggressively pursue acquisitions, establishing an arms race in two critical arenas.15 State-of-the-art manufacturing depended on machine intelligence, compelling Google and later Facebook to acquire companies and talent representing its disciplines: facial recognition, “deep learning,” augmented reality, and more.16 But machines are only as smart as the volume of their diet allows. Thus, Google and Facebook vied to become the ubiquitous net positioned to capture the swarming schools of behavioral surplus flowing from every computer-mediated direction. To this end the founders paid outsized premiums for the chance to corner behavioral surplus through acquisitions of an ever-expanding roster of key supply routes.

If the company had tried to process the growing computational workload with traditional CPUs, he explained, “We would have had to double the entire footprint of Google—data centers and servers—just to do three minutes or two minutes of speech recognition per Android user per day.”27 With data center construction as the company’s largest line item and power as its highest operating cost, Google invented its way through the infrastructure crisis. In 2016 it announced the development of a new chip for “deep learning inference” called the tensor processing unit (TPU). The TPU would dramatically expand Google’s machine intelligence capabilities, consume only a fraction of the power required by existing processors, and reduce both capital expenditure and the operational budget, all while learning more and faster.28 Global revenue for AI products and services is expected to increase 56-fold, from $644 million in 2016 to $36 billion in 2025.29 The science required to exploit this vast opportunity and the material infrastructure that makes it possible have ignited an arms race among tech companies for the 10,000 or so professionals on the planet who know how to wield the technologies of machine intelligence to coax knowledge from an otherwise cacophonous data continent.

In 2017 Facebook boasted two billion monthly users uploading 350 million photos every day, a supply operation that the corporation’s own researchers refer to as “practically infinite.”56 In 2018 a Facebook research team announced that it had “closed the gap” and was now able to recognize faces “in the wild” with 97.35 percent accuracy, “closely approaching human-level performance.” The report highlights the corporation’s supply and manufacturing advantages, especially the use of “deep learning” based on “large training sets.”57 Facebook announced its eagerness to use facial recognition as a means to more powerful ad targeting, but even more of the uplift would come from the immense machine training opportunities represented by so many photos. By 2018, its machines were learning to discern activities, interests, mood, gaze, clothing, gait, hair, body type, and posture.58 The marketing possibilities are infinite.


pages: 200 words: 71,482

The Meaning of Everything: The Story of the Oxford English Dictionary by Simon Winchester

Buckminster Fuller, deep learning, Ernest Rutherford, Khartoum Gordon, Murray Gell-Mann, stakhanovite, wage slave

The English establishment of the day may be rightly derided at this remove as having been class-ridden and imperialist, bombastic and blimpish, racist and insouciant—but it was marked undeniably also by a sweeping erudition and confidence, and it was peopled by men and women who felt they were able to know all, to understand much, and in consequence to radiate the wisdom of deep learning. It is worth pointing this out simply because it was such people—such remarkable, polymathic, cultured, fascinated, wise, and leisured people—who were primarily involved in the creation of the mighty endeavour that the following account celebrates. On that idyllic and blissfully warm Derby Day evening, two magnificent social events were due to be staged, at the beating heart of the nation's social life.


Work in the Future: The Automation Revolution (Palgrave Macmillan, 2019) by Robert Skidelsky and Nan Craig

3D printing, Airbnb, algorithmic trading, AlphaGo, Alvin Toffler, Amazon Web Services, anti-work, antiwork, artificial general intelligence, asset light, autonomous vehicles, basic income, behavioural economics, business cycle, cloud computing, collective bargaining, Computing Machinery and Intelligence, correlation does not imply causation, creative destruction, data is the new oil, data science, David Graeber, David Ricardo: comparative advantage, deep learning, DeepMind, deindustrialization, Demis Hassabis, deskilling, disintermediation, do what you love, Donald Trump, driverless car, Erik Brynjolfsson, fake news, feminist movement, Ford Model T, Frederick Winslow Taylor, future of work, Future Shock, general purpose technology, gig economy, global supply chain, income inequality, independent contractor, informal economy, Internet of things, Jarndyce and Jarndyce, Jarndyce and Jarndyce, job automation, job polarisation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Joseph Schumpeter, knowledge economy, Loebner Prize, low skilled workers, Lyft, Mark Zuckerberg, means of production, moral panic, Network effects, new economy, Nick Bostrom, off grid, pattern recognition, post-work, Ronald Coase, scientific management, Second Machine Age, self-driving car, sharing economy, SoftBank, Steve Jobs, strong AI, tacit knowledge, technological determinism, technoutopianism, TED Talk, The Chicago School, The Future of Employment, the market place, The Nature of the Firm, The Wealth of Nations by Adam Smith, Thorstein Veblen, Turing test, Uber for X, uber lyft, universal basic income, wealth creators, working poor

Given the current crazy levels of hype over the power of AI systems, my three-year-old daughter may well grow up believing that her best friend in university will be an android. Sadly, she, like her uncle and me, will be disappointed. While there has been a step change in the power of AI systems, brought about in the last decade by advances in deep learning techniques, AI systems are not nearly as intelligent as the press, politicians and philosophers would like us to believe. The hype is understandable: technology leaders have to hugely overstate the life-changing power of their AI systems to have any chance of gaining venture capital these days; journalists have to overstate the strength of results from AI projects, to compete in a clickbait environment; and in order to make a name for themselves, politicians and philosophers need to take an extreme and short-term view of AI in order for it to appear relevant and timely.


pages: 287 words: 69,655

Don't Trust Your Gut: Using Data to Get What You Really Want in LIfe by Seth Stephens-Davidowitz

affirmative action, Airbnb, cognitive bias, commoditize, correlation does not imply causation, COVID-19, Daniel Kahneman / Amos Tversky, data science, deep learning, digital map, Donald Trump, en.wikipedia.org, Erik Brynjolfsson, General Magic , global pandemic, Mark Zuckerberg, meta-analysis, Moneyball by Michael Lewis explains big data, Paul Graham, peak-end rule, randomized controlled trial, Renaissance Technologies, Sam Altman, science of happiness, selection bias, side hustle, Silicon Valley, Steve Jobs, Steve Wozniak, systematic bias, Tony Fadell, twin studies, Tyler Cowen, urban planning, Y Combinator

(These days, people are increasingly buying products on services such as Amazon Live, which allows people to pitch their products by video to potential customers.) Researchers were given videos of each sales pitch along with data on how much product was sold afterward. (They also had data on the product being sold, the price of the product, and whether they offered free shipping.) The methods: artificial intelligence and deep learning. The researchers converted their 62.32 million frames of video into data. In particular, the AI was able to code the emotional expression of the salesperson during the video. Did the salesperson appear angry? Disgusted? Scared? Surprised? Sad? Or happy? The result: the researchers found that the emotional expression of a salesperson was a major predictor of how much product they sold.
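A hypothetical sketch of the kind of pipeline the excerpt describes: per-video emotion shares (assumed to come from some frame-level classifier, which is not shown) used to predict units sold. All column names and numbers here are invented.

# Regress units sold on the share of frames coded with each emotion.
import pandas as pd
from sklearn.linear_model import LinearRegression

videos = pd.DataFrame({
    # share of frames in which the presenter was coded as showing each emotion
    "happy":      [0.62, 0.15, 0.48, 0.05, 0.33],
    "angry":      [0.03, 0.40, 0.10, 0.55, 0.20],
    "surprised":  [0.10, 0.05, 0.12, 0.08, 0.15],
    "price":      [19.9, 24.9, 9.9, 14.9, 29.9],
    "units_sold": [480, 120, 350, 60, 210],
})

X = videos[["happy", "angry", "surprised", "price"]]
y = videos["units_sold"]
model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_.round(1))))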


pages: 205 words: 71,872

Whistleblower: My Journey to Silicon Valley and Fight for Justice at Uber by Susan Fowler

"Susan Fowler" uber, Airbnb, Albert Einstein, Big Tech, Burning Man, cloud computing, data science, deep learning, DevOps, Donald Trump, Elon Musk, end-to-end encryption, fault tolerance, Grace Hopper, Higgs boson, Large Hadron Collider, Lyft, Maui Hawaii, messenger bag, microservices, Mitch Kapor, Richard Feynman, ride hailing / ride sharing, self-driving car, Silicon Valley, TechCrunch disrupt, Travis Kalanick, Uber for X, uber lyft, work culture

I called up editors of small and large print magazines and learned from their mistakes and their triumphs; I took long walks through the city with tech journalists and editors who shared their hard-earned lessons. I had a long list of themes I wanted to cover in my new magazine—everything from biotech to deep learning, from fundraising and hiring software engineers to serverless architecture—and narrowing it down was difficult. Eventually, I landed on the topic of on-call best practices, because I knew that even in the worst-case scenario where nobody else could write for the magazine, the engineering team at Stripe and I could put something together pretty quickly.


pages: 602 words: 177,874

Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations by Thomas L. Friedman

3D printing, additive manufacturing, affirmative action, Airbnb, AltaVista, Amazon Web Services, Anthropocene, Apple Newton, autonomous vehicles, Ayatollah Khomeini, barriers to entry, Berlin Wall, Bernie Sanders, Big Tech, biodiversity loss, bitcoin, blockchain, Bob Noyce, business cycle, business process, call centre, carbon tax, centre right, Chris Wanstrath, Clayton Christensen, clean tech, clean water, cloud computing, cognitive load, corporate social responsibility, creative destruction, CRISPR, crowdsourcing, data science, David Brooks, deep learning, demand response, demographic dividend, demographic transition, Deng Xiaoping, digital divide, disinformation, Donald Trump, dual-use technology, end-to-end encryption, Erik Brynjolfsson, fail fast, failed state, Fairchild Semiconductor, Fall of the Berlin Wall, Ferguson, Missouri, first square of the chessboard / second half of the chessboard, Flash crash, fulfillment center, game design, gig economy, global pandemic, global supply chain, Great Leap Forward, illegal immigration, immigration reform, income inequality, indoor plumbing, intangible asset, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the steam engine, inventory management, Irwin Jacobs: Qualcomm, Jeff Bezos, job automation, John Markoff, John von Neumann, Khan Academy, Kickstarter, knowledge economy, knowledge worker, land tenure, linear programming, Live Aid, low interest rates, low skilled workers, Lyft, Marc Andreessen, Mark Zuckerberg, mass immigration, Maui Hawaii, Menlo Park, Mikhail Gorbachev, mutually assured destruction, Neil Armstrong, Nelson Mandela, ocean acidification, PalmPilot, pattern recognition, planetary scale, power law, pull request, Ralph Waldo Emerson, ransomware, Ray Kurzweil, Richard Florida, ride hailing / ride sharing, Robert Gordon, Ronald Reagan, Salesforce, Second Machine Age, self-driving car, shareholder value, sharing economy, Silicon Valley, Skype, smart cities, Solyndra, South China Sea, Steve Jobs, subscription business, supercomputer in your pocket, synthetic biology, systems thinking, TaskRabbit, tech worker, TED Talk, The Rise and Fall of American Growth, Thomas L Friedman, Tony Fadell, transaction costs, Transnistria, uber lyft, undersea cable, urban decay, urban planning, Watson beat the top human players on Jeopardy!, WikiLeaks, women in the workforce, Y2K, Yogi Berra, zero-sum game

So, for instance, in October 2015, Google released the basic algorithms for a program called TensorFlow for public consumption by the open-source community. TensorFlow is a set of algorithms that enable fast computers to do “deep learning” with big data sets to perform tasks better than a human brain. “By January 2016 we had a course online on how to use the TensorFlow open-source platform to write deep learning algorithms to teach a machine to do anything—copyediting, flying a plane, or legal discovery from documents,” explained Thrun. This is a huge new field of computer science. TensorFlow was released into the wild in October, and by January, Udacity, working directly with Google engineers, was teaching the skill on its platform.
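For a taste of the kind of building block TensorFlow exposes, here is a tiny sketch of its automatic differentiation and gradient-descent loop on a toy regression problem; the example is generic and is not taken from the course described above.

# Learn the parameters of y = 3x + 2 from noisy data with gradient descent.
import tensorflow as tf

x = tf.random.normal([200, 1])
y = 3.0 * x + 2.0 + 0.1 * tf.random.normal([200, 1])

w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(200):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean((w * x + b - y) ** 2)   # mean squared error
    grads = tape.gradient(loss, [w, b])               # automatic differentiation
    optimizer.apply_gradients(zip(grads, [w, b]))     # one gradient-descent step

print("learned w:", float(w), "learned b:", float(b))

The same tape-and-update loop, scaled up to millions of parameters, is what "deep learning with big data sets" amounts to in practice.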


Hands-On Machine Learning With Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurelien Geron

AlphaGo, Amazon Mechanical Turk, Bayesian statistics, centre right, combinatorial explosion, constrained optimization, correlation coefficient, crowdsourcing, data science, deep learning, DeepMind, duck typing, en.wikipedia.org, Geoffrey Hinton, iterative process, Netflix Prize, NP-complete, optical character recognition, P = NP, p-value, pattern recognition, performance metric, recommendation engine, self-driving car, SpamAssassin, speech recognition, statistical model

A simpler approach to maximizing the ELBO is called black box stochastic variational inference (BBSVI): at each iteration, a few samples are drawn from q and they are used to estimate the gradients of the ELBO with regards to the variational parameters λ, which are then used in a gradient ascent step. This approach makes it possible to use Bayesian inference with any kind of model (provided it is differentiable), even deep neural networks: this is called Bayesian deep learning. Tip If you want to dive deeper into Bayesian statistics, check out the Bayesian Data Analysis book by Andrew Gelman, John Carlin, Hal Stern, David Dunson, Aki Vehtari, and Donald Rubin. Gaussian mixture models work great on clusters with ellipsoidal shapes, but if you try to fit a dataset with different shapes, you may have bad surprises.
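The passage compresses the whole BBSVI recipe into one sentence, so a toy numerical sketch may help. The NumPy code below is my own illustration, not the book's: it fits a Gaussian q(z) = N(mu, sigma²) to the posterior of a Gaussian mean under a N(0, 1) prior, drawing a few reparameterized samples per iteration to estimate the ELBO gradient with respect to the variational parameters and then taking a gradient ascent step.

```python
# Toy sketch of the BBSVI recipe described above (illustrative, not from the book).
# Model: prior z ~ N(0, 1), likelihood x_i | z ~ N(z, 1).
# Variational family: q(z) = N(mu, sigma^2), parameters lambda = (mu, log_sigma).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=50)          # observed data

mu, log_sigma = 0.0, 0.0                   # variational parameters lambda
lr, n_samples = 0.01, 5                    # step size; samples drawn per iteration

def dlogp_dz(z):
    # d/dz of log p(x, z) = log N(z; 0, 1) + sum_i log N(x_i; z, 1)
    return -z + np.sum(x - z)

for _ in range(3000):
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=n_samples)
    z = mu + sigma * eps                   # reparameterization trick: z = z(eps, lambda)

    g = np.array([dlogp_dz(zi) for zi in z])
    grad_mu = g.mean()                              # estimated d ELBO / d mu
    grad_log_sigma = (g * sigma * eps).mean() + 1   # +1 comes from the Gaussian entropy

    mu += lr * grad_mu                     # gradient ASCENT on the ELBO
    log_sigma += lr * grad_log_sigma

# This toy model is conjugate, so the exact posterior N(sum(x)/(n+1), 1/(n+1)) is known.
n = len(x)
print(mu, np.exp(log_sigma), x.sum() / (n + 1), (1 / (n + 1)) ** 0.5)
```

Because the exact posterior is available here, it is easy to check that the variational parameters settle near the right values; in practice one would rely on a probabilistic-programming library rather than hand-coded gradients.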


pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking by Matthew Syed

adjacent possible, agricultural Revolution, Alfred Russel Wallace, algorithmic bias, behavioural economics, Bletchley Park, Boeing 747, call centre, Cass Sunstein, classic study, cognitive load, computer age, crowdsourcing, cuban missile crisis, deep learning, delayed gratification, drone strike, Elon Musk, Erik Brynjolfsson, Fairchild Semiconductor, fake news, Ferguson, Missouri, Filter Bubble, Firefox, invention of writing, James Dyson, Jeff Bezos, knowledge economy, lateral thinking, market bubble, mass immigration, microbiome, Mitch Kapor, persistent metabolic adaptation, Peter Thiel, post-truth, Richard Thaler, Ronald Reagan, Second Machine Age, self-driving car, seminal paper, Silicon Valley, social intelligence, Steve Jobs, Steve Wozniak, Stuart Kauffman, tech worker, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, traveling salesman, vertical integration

Further research is taking place, not just in Segal’s lab but elsewhere, seeking to build more evidence.22 The goal is to use not merely the microbiome and genome to make dietary recommendations, but other personal factors such as medication, sleep and stress. Topol writes: What we really need to do is pull in multiple types of data . . . from multiple devices, like skin patches and smartwatches. With advanced algorithms, this is eminently doable. In the next few years, you could have a virtual health coach that is deep learning about your relevant health metrics and providing you with customized dietary recommendations. Yet diet is merely one branch of this conceptual revolution. In almost all areas of our lives, we will find ourselves moving from the era of standardisation to the era of personalisation. If this transformation is guided with wisdom, it has the potential to improve health, happiness and productivity, too.


The Smartphone Society by Nicole Aschoff

"Susan Fowler" uber, 4chan, A Declaration of the Independence of Cyberspace, Airbnb, algorithmic bias, algorithmic management, Amazon Web Services, artificial general intelligence, autonomous vehicles, barriers to entry, Bay Area Rapid Transit, Bernie Sanders, Big Tech, Black Lives Matter, blockchain, carbon footprint, Carl Icahn, Cass Sunstein, citizen journalism, cloud computing, correlation does not imply causation, crony capitalism, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, degrowth, Demis Hassabis, deplatforming, deskilling, digital capitalism, digital divide, do what you love, don't be evil, Donald Trump, Downton Abbey, Edward Snowden, Elon Musk, Evgeny Morozov, fake news, feminist movement, Ferguson, Missouri, Filter Bubble, financial independence, future of work, gamification, gig economy, global value chain, Google Chrome, Google Earth, Googley, green new deal, housing crisis, income inequality, independent contractor, Jaron Lanier, Jeff Bezos, Jessica Bruder, job automation, John Perry Barlow, knowledge economy, late capitalism, low interest rates, Lyft, M-Pesa, Mark Zuckerberg, minimum wage unemployment, mobile money, moral panic, move fast and break things, Naomi Klein, Network effects, new economy, Nicholas Carr, Nomadland, occupational segregation, Occupy movement, off-the-grid, offshore financial centre, opioid epidemic / opioid crisis, PageRank, Patri Friedman, peer-to-peer, Peter Thiel, pets.com, planned obsolescence, quantitative easing, Ralph Waldo Emerson, RAND corporation, Ray Kurzweil, RFID, Richard Stallman, ride hailing / ride sharing, Rodney Brooks, Ronald Reagan, Salesforce, Second Machine Age, self-driving car, shareholder value, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Sidewalk Labs, Silicon Valley, single-payer health, Skype, Snapchat, SoftBank, statistical model, Steve Bannon, Steve Jobs, surveillance capitalism, TaskRabbit, tech worker, technological determinism, TED Talk, the scientific method, The Structural Transformation of the Public Sphere, TikTok, transcontinental railway, transportation-network company, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, upwardly mobile, Vision Fund, W. E. B. Du Bois, wages for housework, warehouse robotics, WikiLeaks, women in the workforce, yottabyte

Popular weariness and distrust of Silicon Valley and the technology it is developing are eloquently expressed in works of popular culture: television shows such as Silicon Valley, Westworld, and Black Mirror and novels such as Whiskey Tango Foxtrot, The Circle, and the uncannily prescient Super Sad True Love Story. These pop explorations of how technology is shaping society range from dyspeptic satire to terrified (and terrifying) dystopian depictions of the future, should we continue down our current path. They function as real-time critique. Despite all the mystique surrounding deep learning and the dark web, our pop culture dystopias reveal a society well on its way to articulating a clear critique of what we don’t like about the Silicon Valley vision of the future.29 Our clarity is in part linked to the peculiar fact that many Silicon Valley visions about what technology should look like, and what our aspirations regarding technology should be, were originally located in science fiction.


pages: 256 words: 73,068

12 Bytes: How We Got Here. Where We Might Go Next by Jeanette Winterson

"Margaret Hamilton" Apollo, "World Economic Forum" Davos, 3D printing, Ada Lovelace, Airbnb, Albert Einstein, Alignment Problem, Amazon Mechanical Turk, Anthropocene, Apollo 11, Apple's 1984 Super Bowl advert, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, basic income, Big Tech, bitcoin, Bletchley Park, blockchain, Boston Dynamics, call centre, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, cashless society, Charles Babbage, computer age, Computing Machinery and Intelligence, coronavirus, COVID-19, CRISPR, cryptocurrency, dark matter, Dava Sobel, David Graeber, deep learning, deskilling, digital rights, discovery of DNA, Dominic Cummings, Donald Trump, double helix, driverless car, Elon Musk, fake news, flying shuttle, friendly AI, gender pay gap, global village, Grace Hopper, Gregor Mendel, hive mind, housing crisis, Internet of things, Isaac Newton, Jacquard loom, James Hargreaves, Jeff Bezos, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, Kickstarter, Large Hadron Collider, life extension, lockdown, lone genius, Mark Zuckerberg, means of production, microdosing, more computing power than Apollo, move fast and break things, natural language processing, Nick Bostrom, Norbert Wiener, off grid, OpenAI, operation paperclip, packet switching, Peter Thiel, pink-collar, Plato's cave, public intellectual, QAnon, QWERTY keyboard, Ray Kurzweil, rewilding, ride hailing / ride sharing, Rutger Bregman, Sam Altman, self-driving car, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, Skype, Snapchat, SoftBank, SpaceX Starlink, speech recognition, spinning jenny, stem cell, Stephen Hawking, Steve Bannon, Steve Jobs, Steven Levy, Steven Pinker, superintelligent machines, surveillance capitalism, synthetic biology, systems thinking, tech billionaire, tech worker, TED Talk, telepresence, telepresence robot, TikTok, trade route, Turing test, universal basic income, Virgin Galactic, Watson beat the top human players on Jeopardy!, women in the workforce, Y Combinator

While most chatbots are narrow AI – an algorithm designed to do one thing only, like order the pizza or run through your ‘choices’ before being transferred to a human – some chatbots seem smarter. Google engineer, inventor and futurist Ray Kurzweil’s Ramona will chat with you on a variety of topics. She’s a deep-learning system whose data-set is continuously augmented by her chats with humans. Kurzweil believes that Ramona will pass the Turing Test by 2029 – that is, she will be indistinguishable, online, from a human being. And that will be the big difference, because communication is not just about asking for information or issuing commands: humans like to do exactly what chatbots don’t do well right now – which is chat – and that implies purposeless, not goal-oriented, diverse, random, often low-level, yet pleasurable communication.


pages: 282 words: 81,873

Live Work Work Work Die: A Journey Into the Savage Heart of Silicon Valley by Corey Pein

"World Economic Forum" Davos, 23andMe, 4chan, affirmative action, Affordable Care Act / Obamacare, Airbnb, Amazon Mechanical Turk, Anne Wojcicki, artificial general intelligence, bank run, barriers to entry, Benevolent Dictator For Life (BDFL), Bernie Sanders, Big Tech, bitcoin, Bitcoin Ponzi scheme, Build a better mousetrap, California gold rush, cashless society, colonial rule, computer age, cryptocurrency, data is the new oil, deep learning, digital nomad, disruptive innovation, Donald Trump, Douglas Hofstadter, driverless car, Elon Musk, Evgeny Morozov, Extropian, fail fast, fake it until you make it, fake news, gamification, gentrification, gig economy, Google bus, Google Glasses, Google X / Alphabet X, Greyball, growth hacking, hacker house, Hacker News, hive mind, illegal immigration, immigration reform, independent contractor, intentional community, Internet of things, invisible hand, Isaac Newton, Jeff Bezos, job automation, Kevin Kelly, Khan Academy, Larry Ellison, Law of Accelerating Returns, Lean Startup, life extension, Lyft, Mahatma Gandhi, Marc Andreessen, Mark Zuckerberg, Menlo Park, minimum viable product, move fast and break things, mutually assured destruction, Neal Stephenson, obamacare, Parker Conrad, passive income, patent troll, Patri Friedman, Paul Graham, peer-to-peer lending, Peter H. Diamandis: Planetary Resources, Peter Thiel, platform as a service, plutocrats, Ponzi scheme, post-work, public intellectual, Ray Kurzweil, regulatory arbitrage, rent control, RFID, Robert Mercer, rolodex, Ronald Reagan, Ross Ulbricht, Ruby on Rails, Sam Altman, Sand Hill Road, Scientific racism, self-driving car, selling pickaxes during a gold rush, sharing economy, side project, Silicon Valley, Silicon Valley billionaire, Silicon Valley startup, Singularitarianism, Skype, Snapchat, Social Justice Warrior, social software, software as a service, source of truth, South of Market, San Francisco, Startup school, stealth mode startup, Steve Bannon, Steve Jobs, Steve Wozniak, TaskRabbit, tech billionaire, tech bro, tech worker, TechCrunch disrupt, technological singularity, technoutopianism, telepresence, too big to fail, Travis Kalanick, tulip mania, Tyler Cowen, Uber for X, uber lyft, ubercab, unit 8200, upwardly mobile, Vernor Vinge, vertical integration, Virgin Galactic, X Prize, Y Combinator, Zenefits

But SU also staged two-day traveling seminars called Singularity Summits. The next summit was to be held in Amsterdam. The promotional materials promised fantastical revelations to all who attended. Sessions on the “revolution in robotics and artificial intelligence” would cover the latest in drones, “telepresence,” and something called “deep learning.” Other speakers would explore the possibilities of bodily implants, exoskeletons, 3-D-printed organs, and nanomedicine. There would be sessions on “organizing society for accelerating change,” which was to tackle government and the relationship of technology to “unemployment and inequality.” Finally, for those looking to cash in on this sneak peek at the future, the summit would feature sessions on startups and entrepreneurship in the era of “exponential technology”—a shorthand phrase describing, per Kurzweil’s theories, how the pace of invention has allegedly accelerated through history, bringing us to this moment on the cusp of the Singularity.


pages: 472 words: 80,835

Life as a Passenger: How Driverless Cars Will Change the World by David Kerrigan

3D printing, Airbnb, airport security, Albert Einstein, autonomous vehicles, big-box store, Boeing 747, butterfly effect, call centre, car-free, Cesare Marchetti: Marchetti’s constant, Chris Urmson, commoditize, computer vision, congestion charging, connected car, DARPA: Urban Challenge, data science, deep learning, DeepMind, deskilling, disruptive innovation, Donald Shoup, driverless car, edge city, Elon Musk, en.wikipedia.org, fake news, Ford Model T, future of work, General Motors Futurama, hype cycle, invention of the wheel, Just-in-time delivery, Lewis Mumford, loss aversion, Lyft, Marchetti’s constant, Mars Rover, megacity, Menlo Park, Metcalfe’s law, Minecraft, Nash equilibrium, New Urbanism, QWERTY keyboard, Ralph Nader, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Rodney Brooks, Sam Peltzman, self-driving car, sensor fusion, Silicon Valley, Simon Kuznets, smart cities, Snapchat, Stanford marshmallow experiment, Steve Jobs, technological determinism, technoutopianism, TED Talk, the built environment, Thorstein Veblen, traffic fines, transit-oriented development, Travis Kalanick, trolley problem, Uber and Lyft, Uber for X, uber lyft, Unsafe at Any Speed, urban planning, urban sprawl, warehouse robotics, Yogi Berra, young professional, zero-sum game, Zipcar

Driverless car makers have put a lot of effort into solving this seemingly simple problem. There is also the risk of a computer being confused by signs that it could mistake for an actual traffic signal. However, driverless cars are being taught through deep learning to identify such signs so as not to confuse them with traffic lights. The best solution to date is for driverless car manufacturers to create a prior map of traffic signals, enabling the driverless car and its perception systems to anticipate the locations of traffic lights and improve detection of the light state.[270] Thus the vehicle can predict when it should expect traffic lights and concentrate its search.
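A hypothetical sketch of that "prior map" idea, with every name, class, and number invented for illustration: given the vehicle's pose and a list of surveyed traffic-signal coordinates, the car selects only the signals expected to be in view and concentrates its detector there.

```python
# Hypothetical sketch of the prior-map idea described above. All identifiers and
# values (TrafficSignal, SIGNAL_MAP, ranges) are invented for illustration.
from dataclasses import dataclass
import math

@dataclass
class TrafficSignal:
    signal_id: str
    x: float          # surveyed map coordinates, metres
    y: float

# The prior map: traffic-signal positions recorded ahead of time.
SIGNAL_MAP = [
    TrafficSignal("sig-001", 120.0, 45.0),
    TrafficSignal("sig-002", 410.5, -12.3),
]

def expected_signals(vehicle_x, vehicle_y, heading_rad,
                     max_range=80.0, fov_rad=math.radians(60)):
    """Return mapped signals that should be visible from the current pose."""
    visible = []
    for sig in SIGNAL_MAP:
        dx, dy = sig.x - vehicle_x, sig.y - vehicle_y
        dist = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx) - heading_rad
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))  # wrap to [-pi, pi]
        if dist <= max_range and abs(bearing) <= fov_rad / 2:
            visible.append(sig)
    return visible

# The perception system would then run its deep-learning light-state detector only
# on the image regions where these signals are expected, not on the whole scene.
print([s.signal_id for s in expected_signals(100.0, 40.0, heading_rad=0.0)])
```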


pages: 247 words: 81,135

The Great Fragmentation: And Why the Future of All Business Is Small by Steve Sammartino

3D printing, additive manufacturing, Airbnb, augmented reality, barriers to entry, behavioural economics, Bill Gates: Altair 8800, bitcoin, BRICs, Buckminster Fuller, citizen journalism, collaborative consumption, cryptocurrency, data science, David Heinemeier Hansson, deep learning, disruptive innovation, driverless car, Dunbar number, Elon Musk, fiat currency, Frederick Winslow Taylor, game design, gamification, Google X / Alphabet X, haute couture, helicopter parent, hype cycle, illegal immigration, index fund, Jeff Bezos, jimmy wales, Kickstarter, knowledge economy, Law of Accelerating Returns, lifelogging, market design, Mary Meeker, Metcalfe's law, Minecraft, minimum viable product, Network effects, new economy, peer-to-peer, planned obsolescence, post scarcity, prediction markets, pre–internet, profit motive, race to the bottom, random walk, Ray Kurzweil, recommendation engine, remote working, RFID, Rubik’s Cube, scientific management, self-driving car, sharing economy, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, social graph, social web, software is eating the world, Steve Jobs, subscription business, survivorship bias, The Home Computer Revolution, the long tail, too big to fail, US Airways Flight 1549, vertical integration, web application, zero-sum game

The new general practitioners of business need to make the decision to keep up, just as doctors must with their journals and conferences. There’s no choice. It’s what the new market demands; every youngster entering our industry keeps up to date by default. It’s not even a task for them; they enjoy it. It’s the world they were born into. So unless we decide to enjoy it too, and go for deep learning by using the tools, we’ll not only be left behind, but probably replaced. Now that we’re escaping the industrial machine, it’s about time marketers realised that people are not interchangeable widgets and that they would rather be spoken to and about by a human voice.

What is fragmenting

We’re being freed from our life in boxes.


pages: 308 words: 84,713

The Glass Cage: Automation and Us by Nicholas Carr

Airbnb, Airbus A320, Andy Kessler, Atul Gawande, autonomous vehicles, Bernard Ziegler, business process, call centre, Captain Sullenberger Hudson, Charles Lindbergh, Checklist Manifesto, cloud computing, cognitive load, computerized trading, David Brooks, deep learning, deliberate practice, deskilling, digital map, Douglas Engelbart, driverless car, drone strike, Elon Musk, Erik Brynjolfsson, Evgeny Morozov, Flash crash, Frank Gehry, Frank Levy and Richard Murnane: The New Division of Labor, Frederick Winslow Taylor, future of work, gamification, global supply chain, Google Glasses, Google Hangouts, High speed trading, human-factors engineering, indoor plumbing, industrial robot, Internet of things, Ivan Sutherland, Jacquard loom, James Watt: steam engine, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Kevin Kelly, knowledge worker, low interest rates, Lyft, machine readable, Marc Andreessen, Mark Zuckerberg, means of production, natural language processing, new economy, Nicholas Carr, Norbert Wiener, Oculus Rift, pattern recognition, Peter Thiel, place-making, plutocrats, profit motive, Ralph Waldo Emerson, RAND corporation, randomized controlled trial, Ray Kurzweil, recommendation engine, robot derives from the Czech word robota Czech, meaning slave, scientific management, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley ideology, software is eating the world, Stephen Hawking, Steve Jobs, systems thinking, tacit knowledge, TaskRabbit, technological determinism, technological solutionism, technoutopianism, TED Talk, The Wealth of Nations by Adam Smith, turn-by-turn navigation, Tyler Cowen, US Airways Flight 1549, Watson beat the top human players on Jeopardy!, William Langewiesche

That’s what computer automation often does today, and it’s why Whitehead’s observation has become misleading as a guide to technology’s consequences. Rather than extending the brain’s innate capacity for automaticity, automation too often becomes an impediment to automatization. In relieving us of repetitive mental exercise, it also relieves us of deep learning. Both complacency and bias are symptoms of a mind that is not being challenged, that is not fully engaged in the kind of real-world practice that generates knowledge, enriches memory, and builds skill. The problem is compounded by the way computer systems distance us from direct and immediate feedback about our actions.


pages: 304 words: 80,143

The Autonomous Revolution: Reclaiming the Future We’ve Sold to Machines by William Davidow, Michael Malone

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, agricultural Revolution, Airbnb, AlphaGo, American Society of Civil Engineers: Report Card, Automated Insights, autonomous vehicles, basic income, benefit corporation, bitcoin, blockchain, blue-collar work, Bob Noyce, business process, call centre, Cambridge Analytica, cashless society, citizen journalism, Clayton Christensen, collaborative consumption, collaborative economy, collective bargaining, creative destruction, crowdsourcing, cryptocurrency, deep learning, DeepMind, disintermediation, disruptive innovation, distributed ledger, en.wikipedia.org, Erik Brynjolfsson, fake news, Filter Bubble, Ford Model T, Francis Fukuyama: the end of history, general purpose technology, Geoffrey West, Santa Fe Institute, gig economy, Gini coefficient, high-speed rail, holacracy, Hyperloop, income inequality, industrial robot, Internet of things, invention of agriculture, invention of movable type, invention of the printing press, invisible hand, Jane Jacobs, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, license plate recognition, low interest rates, Lyft, Mark Zuckerberg, mass immigration, Network effects, new economy, peer-to-peer lending, QWERTY keyboard, ransomware, Richard Florida, Robert Gordon, robo advisor, Ronald Reagan, Second Machine Age, self-driving car, sharing economy, Shoshana Zuboff, Silicon Valley, Simon Kuznets, Skinner box, Snapchat, speech recognition, streetcar suburb, Stuxnet, surveillance capitalism, synthetic biology, TaskRabbit, The Death and Life of Great American Cities, The Rise and Fall of American Growth, the scientific method, trade route, Turing test, two and twenty, Uber and Lyft, uber lyft, universal basic income, uranium enrichment, urban planning, vertical integration, warehouse automation, zero day, zero-sum game, Zipcar

But some kinds of institutions are almost totally information proxies in disguise. Retail, banking, finance, and monetary systems are examples of institutions with extremely high information proxy content. One would expect them to be significantly transformed. INTELLIGENCE EQUIVALENCE Advances in artificial intelligence, deep learning, neural network processing, and big data have unleashed the forces of intelligence equivalence. Machines are now capable of intelligent behavior. In many applications, they can substitute for humans’ brains, minds, and senses. For more than one hundred years, technologists involved in computation have speculated about and attempted to construct machines that would exhibit intelligent behavior.


pages: 289 words: 86,165

Ten Lessons for a Post-Pandemic World by Fareed Zakaria

"there is no alternative" (TINA), 15-minute city, AlphaGo, An Inconvenient Truth, anti-fragile, Asian financial crisis, basic income, Bernie Sanders, Boris Johnson, butterfly effect, Capital in the Twenty-First Century by Thomas Piketty, car-free, carbon tax, central bank independence, clean water, cloud computing, colonial rule, contact tracing, coronavirus, COVID-19, Credit Default Swap, David Graeber, Day of the Dead, deep learning, DeepMind, deglobalization, Demis Hassabis, Deng Xiaoping, digital divide, Dominic Cummings, Donald Trump, Edward Glaeser, Edward Jenner, Elon Musk, Erik Brynjolfsson, failed state, financial engineering, Francis Fukuyama: the end of history, future of work, gentrification, George Floyd, gig economy, Gini coefficient, global pandemic, global reserve currency, global supply chain, green new deal, hiring and firing, housing crisis, imperial preference, income inequality, Indoor air pollution, invention of the wheel, Jane Jacobs, Jeff Bezos, Jeremy Corbyn, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Snow's cholera map, junk bonds, lockdown, Long Term Capital Management, low interest rates, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Martin Wolf, means of production, megacity, Mexican peso crisis / tequila crisis, middle-income trap, Monroe Doctrine, Nate Silver, Nick Bostrom, oil shock, open borders, out of africa, Parag Khanna, Paris climate accords, Peter Thiel, plutocrats, popular capitalism, Productivity paradox, purchasing power parity, remote working, reserve currency, reshoring, restrictive zoning, ride hailing / ride sharing, Ronald Reagan, secular stagnation, Silicon Valley, social distancing, software is eating the world, South China Sea, Steve Bannon, Steve Jobs, Steven Pinker, Suez crisis 1956, TED Talk, the built environment, The Death and Life of Great American Cities, The inhabitant of London could order by telephone, sipping his morning tea in bed, the various products of the whole earth, The Spirit Level, The Wealth of Nations by Adam Smith, Thomas L Friedman, Tim Cook: Apple, trade route, UNCLOS, universal basic income, urban planning, Washington Consensus, white flight, Works Progress Administration, zoonotic diseases

PublicationDocumentID=6322. 104 “25% of our workforce”: Sonal Khetarpal, “Post-COVID, 75% of 4.5 Lakh TCS Employees to Permanently Work from Home by ’25; from 20%,” Business Today India, April 30, 2020. 104 issued a correction: Saunak Chowdhury, “TCS Refutes Claims of 75% Employees Working from Home Post Lock-Down,” Indian Wire, April 28, 2020. 104 450,000 employees: Tata Consultancy Services, “About Us,” https://www.tcs.com/about-us. 106 up one billion: Jeff Becker and Arielle Trzcinski, “US Virtual Care Visits to Soar to More Than 1 Billion,” Forrester Analytics, April 10, 2020, https://go.forrester.com/press-newsroom/us-virtual-care-visits-to-soar-to-more-than-1-billion/. 106 “greatest contribution to mankind”: Lizzy Gurdus, “Tim Cook: Apple’s Greatest Contribution Will Be ‘About Health,’ ” CNBC Mad Money, January 8, 2019. 107 97% accuracy: “Using Artificial Intelligence to Classify Lung Cancer Types, Predict Mutations,” National Cancer Institute, October 10, 2018, https://www.cancer.gov/news-events/cancer-currents-blog/2018/artificial-intelligence-lung-cancer-classification. 107 up to 11% fewer false positives: D. Ardila, A. P. Kiraly, S. Bharadwaj et al., “End-to-End Lung Cancer Screening with Three-Dimensional Deep Learning on Low-Dose Chest Computed Tomography,” Nature Medicine 25 (2019): 954–61, https://doi.org/10.1038/s41591-019-0447-x. 107 designing proteins to block the virus: Kim Martineau, “Marshaling Artificial Intelligence in the Fight Against Covid-19,” MIT Quest for Intelligence, MIT News, May 19, 2020, http://news.mit.edu/2020/mit-marshaling-artificial-intelligence-fight-against-covid-19-0519. 108 hoped that AI might find solutions . . .


pages: 263 words: 81,527

The Mind Is Flat: The Illusion of Mental Depth and the Improvised Mind by Nick Chater

Albert Einstein, battle of ideas, behavioural economics, classic study, computer vision, Daniel Kahneman / Amos Tversky, deep learning, double helix, Geoffrey Hinton, Henri Poincaré, Jacquard loom, lateral thinking, loose coupling, machine translation, speech recognition, tacit knowledge

Hopfield (1982), ‘Neural networks and physical systems with emergent collective computational abilities’, Proceedings of the National Academy of Sciences of the United States of America, 79(8), 2554–8). Importantly, there are powerful theoretical ideas concerning how such networks learn the constraints that govern the external world from experience (e.g. Y. LeCun, Y. Bengio and G. Hinton (2015), ‘Deep learning’, Nature, 521(7553): 436–44.). 4 Although in a digital computer, cooperative computation across the entire web of constraints is not so straightforward – more sequential methods of searching the web are often used instead. 5 The idea of ‘direct’ perception, which has been much discussed in psychology, is appealing, I think, precisely because we are only ever aware of the output of the cycle of thought: we are oblivious to the calculations involved, and the speed with which the cycle of thought can generate the illusion that our conscious experience must be in immediate contact with reality. 6 H. von Helmholtz, Handbuch der physiologischen Optik, vol. 3 (Leipzig: Voss, 1867).
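As a small illustration of the Hopfield (1982) idea cited above (my own toy example, not from the book): patterns are stored as mutual constraints in a weight matrix via a Hebbian rule, and recall is cooperative computation in which units repeatedly update to satisfy those constraints until the state settles on a stored pattern.

```python
# Tiny Hopfield-style network (illustrative sketch of the 1982 idea cited above).
# Storage: a Hebbian rule turns each pattern into pairwise constraints in W.
# Recall: units update asynchronously to satisfy the constraints until stable.
import numpy as np

patterns = np.array([
    [ 1, -1,  1, -1,  1, -1],
    [ 1,  1,  1, -1, -1, -1],
])

n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p)        # strengthen connections between agreeing units
np.fill_diagonal(W, 0)

def recall(state, steps=20):
    state = state.copy()
    for _ in range(steps):
        for i in range(n):     # each unit flips to agree with its weighted neighbours
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

corrupted = np.array([-1, -1, 1, -1, 1, -1])   # first pattern with its first unit flipped
print(recall(corrupted))                        # settles back to [ 1, -1, 1, -1, 1, -1]
```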


pages: 290 words: 85,847

A Brief History of Motion: From the Wheel, to the Car, to What Comes Next by Tom Standage

accelerated depreciation, active transport: walking or cycling, autonomous vehicles, back-to-the-city movement, bike sharing, car-free, carbon footprint, Cesare Marchetti: Marchetti’s constant, Chris Urmson, City Beautiful movement, Clapham omnibus, congestion charging, coronavirus, COVID-19, deep learning, Didi Chuxing, Donald Shoup, driverless car, Elaine Herzberg, Elon Musk, flex fuel, Ford Model T, Ford paid five dollars a day, garden city movement, General Motors Futurama, Ida Tarbell, Induced demand, interchangeable parts, invention of the wheel, James Watt: steam engine, Jane Jacobs, jitney, Joan Didion, John Zimmer (Lyft cofounder), Lewis Mumford, lockdown, Lyft, Marshall McLuhan, minimum wage unemployment, oil shock, Own Your Own Home, peak oil, prompt engineering, Ralph Nader, Richard Florida, ride hailing / ride sharing, Rosa Parks, safety bicycle, self-driving car, social distancing, Steve Jobs, streetcar suburb, tech bro, The Death and Life of Great American Cities, trade route, Travis Kalanick, Uber and Lyft, uber lyft, unbiased observer, Unsafe at Any Speed, Upton Sinclair, urban planning, urban sprawl, Victor Gruen, W. E. B. Du Bois, walkable city, white flight, wikimedia commons, Yom Kippur War, Zipcar

Having gathered and combined the data from its sensors, the car needs to work out what everything is. In particular, it must identify other vehicles, pedestrians, cyclists, road markings, traffic lights, road signs, and so forth. Humans find this easy, and machines used to find it difficult. But machine vision has in recent years improved enormously, thanks to the use of deep learning, an artificial-intelligence technique in which systems learn to perform particular tasks by analyzing thousands of labeled examples. For autonomous cars, this means getting hold of thousands of images of street scenes, with each element carefully labeled, so that a perception system can be trained to recognize them.
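A minimal sketch of that training recipe, under the assumption that CIFAR-10 (which happens to include classes such as "automobile" and "truck") stands in for a hand-labeled street-scene dataset; a real perception stack would use far more data, far larger networks, and object detection rather than whole-image classification.

```python
# Minimal sketch: a small convolutional network learns from labeled images.
# CIFAR-10 stands in for a labeled street-scene dataset; sizes are illustrative.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per labeled class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Learning from thousands of labeled examples; accuracy improves with data and epochs.
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```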


pages: 335 words: 86,900

Empire of Ants: The Hidden Worlds and Extraordinary Lives of Earth's Tiny Conquerors by Susanne Foitzik, Olaf Fritsche

deep learning, epigenetics, megacity, microbiome, phenotype, random walk, trade route

parasitic fungi can turn ants into zombies Araújo, J. P. M. et al. (2018). Zombie-ant fungi across continents: 15 new species and new combinations within Ophiocordyceps. I. Myrmecophilous hirsutelloid species. Studies in Mycology, 90, 119–60. Fredericksen, M. A. et al. (2017). Three-dimensional visualization and a deep-learning model reveal complex fungal parasite networks in behaviorally manipulated ants. Proceedings of the National Academy of Sciences USA, 114, 12590–95. Hughes, D. P. et al. (2011). Ancient death-grip leaf scars reveal ant-fungal parasitism. Biology Letters, 7, 67–70. Kobmoo, N. et al. (2019). Population genomics revealed cryptic species within host-specific zombie-ant fungi (Ophiocordyceps unilateralis).


pages: 295 words: 81,861

Road to Nowhere: What Silicon Valley Gets Wrong About the Future of Transportation by Paris Marx

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, A Declaration of the Independence of Cyberspace, Airbnb, An Inconvenient Truth, autonomous vehicles, back-to-the-land, Berlin Wall, Bernie Sanders, bike sharing, Californian Ideology, car-free, carbon credits, carbon footprint, cashless society, clean tech, cloud computing, colonial exploitation, computer vision, congestion pricing, corporate governance, correlation does not imply causation, COVID-19, DARPA: Urban Challenge, David Graeber, deep learning, degrowth, deindustrialization, deskilling, Didi Chuxing, digital map, digital rights, Donald Shoup, Donald Trump, Douglas Engelbart, Douglas Engelbart, driverless car, Elaine Herzberg, Elon Musk, energy transition, Evgeny Morozov, Extinction Rebellion, extractivism, Fairchild Semiconductor, Ford Model T, frictionless, future of work, General Motors Futurama, gentrification, George Gilder, gig economy, gigafactory, global pandemic, global supply chain, Google Glasses, Google X / Alphabet X, green new deal, Greyball, high-speed rail, Hyperloop, independent contractor, Induced demand, intermodal, Jane Jacobs, Jeff Bezos, jitney, John Perry Barlow, Kevin Kelly, knowledge worker, late capitalism, Leo Hollis, lockdown, low interest rates, Lyft, Marc Benioff, market fundamentalism, minimum viable product, Mother of all demos, move fast and break things, Murray Bookchin, new economy, oil shock, packet switching, Pacto Ecosocial del Sur, Peter Thiel, pre–internet, price mechanism, private spaceflight, quantitative easing, QWERTY keyboard, Ralph Nader, Richard Florida, ride hailing / ride sharing, Ronald Reagan, safety bicycle, Salesforce, School Strike for Climate, self-driving car, Sidewalk Labs, Silicon Valley, Silicon Valley billionaire, Silicon Valley ideology, Silicon Valley startup, smart cities, social distancing, Southern State Parkway, Steve Jobs, Stewart Brand, Stop de Kindermoord, streetcar suburb, tech billionaire, tech worker, techlash, technological determinism, technological solutionism, technoutopianism, the built environment, The Death and Life of Great American Cities, TikTok, transit-oriented development, transportation-network company, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, Unsafe at Any Speed, urban planning, urban renewal, VTOL, walkable city, We are as Gods, We wanted flying cars, instead we got 140 characters, WeWork, Whole Earth Catalog, Whole Earth Review, work culture , Yom Kippur War, young professional

As the date when autonomous vehicles were supposed to arrive came and went, and the challenges facing the technology became apparent both to those in the industry who were trying to make progress with their driving systems and to the public who began to see a growing number of stories about autonomous vehicles crashing in troubling ways, some experts started to discuss how more than just a smart AI would be necessary to bring their fantasies to life. In 2018, the Verge reported that Andrew Ng, one of the co-founders of the Google Brain deep-learning AI team, said, “the problem is less about building a perfect driving system than training bystanders to anticipate self-driving behavior.” He added, “we should partner with the government to ask people to be lawful and considerate. Safety isn’t just about the quality of the AI technology.”15 What Ng described was a far cry from what people like Brin or Musk were saying autonomous vehicles would do.


pages: 366 words: 94,209

Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity by Douglas Rushkoff

activist fund / activist shareholder / activist investor, Airbnb, Alan Greenspan, algorithmic trading, Amazon Mechanical Turk, Andrew Keen, bank run, banking crisis, barriers to entry, benefit corporation, bitcoin, blockchain, Burning Man, business process, buy and hold, buy low sell high, California gold rush, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, centralized clearinghouse, citizen journalism, clean water, cloud computing, collaborative economy, collective bargaining, colonial exploitation, Community Supported Agriculture, corporate personhood, corporate raider, creative destruction, crowdsourcing, cryptocurrency, data science, deep learning, disintermediation, diversified portfolio, Dutch auction, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, fiat currency, Firefox, Flash crash, full employment, future of work, gamification, Garrett Hardin, gentrification, gig economy, Gini coefficient, global supply chain, global village, Google bus, Howard Rheingold, IBM and the Holocaust, impulse control, income inequality, independent contractor, index fund, iterative process, Jaron Lanier, Jeff Bezos, jimmy wales, job automation, Joseph Schumpeter, Kickstarter, Large Hadron Collider, loss aversion, low interest rates, Lyft, Marc Andreessen, Mark Zuckerberg, market bubble, market fundamentalism, Marshall McLuhan, means of production, medical bankruptcy, minimum viable product, Mitch Kapor, Naomi Klein, Network effects, new economy, Norbert Wiener, Oculus Rift, passive investing, payday loans, peer-to-peer lending, Peter Thiel, post-industrial society, power law, profit motive, quantitative easing, race to the bottom, recommendation engine, reserve currency, RFID, Richard Stallman, ride hailing / ride sharing, Ronald Reagan, Russell Brand, Satoshi Nakamoto, Second Machine Age, shareholder value, sharing economy, Silicon Valley, Snapchat, social graph, software patent, Steve Jobs, stock buybacks, TaskRabbit, the Cathedral and the Bazaar, The Future of Employment, the long tail, trade route, Tragedy of the Commons, transportation-network company, Turing test, Uber and Lyft, Uber for X, uber lyft, unpaid internship, Vitalik Buterin, warehouse robotics, Wayback Machine, Y Combinator, young professional, zero-sum game, Zipcar

Other companies have opted to become what are known as “flexible purpose” corporations, which allows them to emphasize pretty much any priority over profits—it doesn’t even have to be explicitly beneficial to society at large.74 Flexible purpose corporations also enjoy looser reporting standards than do benefit corporations.75 Vicarious, a tech startup based in the Bay Area, is the sort of business for which the flex corp structure works well. Vicarious operates in the field of artificial intelligence and deep learning; its most celebrated project to date is an attempt to crack CAPTCHAs (those annoying tests of whether a user is human) using AI. Vicarious claims to have succeeded, and its first Turing test demonstrations appear to back up its claim.76 How would such a technology be deployed or monetized? Vicarious doesn’t need to worry about that just yet.


pages: 299 words: 88,375

Gray Day: My Undercover Mission to Expose America's First Cyber Spy by Eric O'Neill

active measures, autonomous vehicles, Berlin Wall, bitcoin, computer age, cryptocurrency, deep learning, disinformation, Dissolution of the Soviet Union, Edward Snowden, Fall of the Berlin Wall, false flag, fear of failure, full text search, index card, information security, Internet of things, Kickstarter, messenger bag, Mikhail Gorbachev, operational security, PalmPilot, ransomware, rent control, Robert Hanssen: Double agent, Ronald Reagan, Skype, thinkpad, Timothy McVeigh, web application, white picket fence, WikiLeaks, young professional

Imagine a collaboration of consumers, organizations, agencies, and businesses aligned in a common network of shared information. Any attempt to breach a single laptop or execute malware through an unfortunate mouse click by one member will instantly inoculate every other device on the network. Deep-learning analysis of all these devices in the cloud will allow a cybersecurity AI to identify and even predict attacks. An entire community of cybersecurity operations will simplify down to a single recurring OODA loop, one that continually resets and defeats attackers. Nobody wants to go back to file cabinets and typewriters.


pages: 374 words: 94,508

Infonomics: How to Monetize, Manage, and Measure Information as an Asset for Competitive Advantage by Douglas B. Laney

3D printing, Affordable Care Act / Obamacare, banking crisis, behavioural economics, blockchain, book value, business climate, business intelligence, business logic, business process, call centre, carbon credits, chief data officer, Claude Shannon: information theory, commoditize, conceptual framework, crowdsourcing, dark matter, data acquisition, data science, deep learning, digital rights, digital twin, discounted cash flows, disintermediation, diversification, en.wikipedia.org, endowment effect, Erik Brynjolfsson, full employment, hype cycle, informal economy, information security, intangible asset, Internet of things, it's over 9,000, linked data, Lyft, Nash equilibrium, Neil Armstrong, Network effects, new economy, obamacare, performance metric, profit motive, recommendation engine, RFID, Salesforce, semantic web, single source of truth, smart meter, Snapchat, software as a service, source of truth, supply-chain management, tacit knowledge, technological determinism, text mining, uber lyft, Y2K, yield curve

In the examples of Dollar General and Kroger, consumer packaged goods (CPG) companies and other suppliers may find they prefer doing business with these retailers because of the transparency and value afforded by the sales and other data made available. And maybe Amazon’s new brick-and-mortar store will outmaneuver grocery giants by using cameras, sensors, deep learning, and automatic payments to track what shoppers are selecting and eliminate the checkout process altogether. This information is monetized also by eliminating the cost of checkers and point-of-sale systems, saving shoppers time during which they’re likely to shop more, and licensing or generating insights from the data.


Learn Algorithmic Trading by Sebastien Donadio

active measures, algorithmic trading, automated trading system, backtesting, Bayesian statistics, behavioural economics, buy and hold, buy low sell high, cryptocurrency, data science, deep learning, DevOps, en.wikipedia.org, fixed income, Flash crash, Guido van Rossum, latency arbitrage, locking in a profit, market fundamentalism, market microstructure, martingale, natural language processing, OpenAI, p-value, paper trading, performance metric, prediction markets, proprietary trading, quantitative trading / quantitative finance, random walk, risk tolerance, risk-adjusted returns, Sharpe ratio, short selling, sorting algorithm, statistical arbitrage, statistical model, stochastic process, survivorship bias, transaction costs, type inference, WebSocket, zero-sum game

Other Books You May Enjoy

If you enjoyed this book, you may be interested in these other books by Packt:

Mastering Python for Finance - Second Edition
James Ma Weiming
ISBN: 9781789346466

- Solve linear and nonlinear models representing various financial problems
- Perform principal component analysis on the DOW index and its components
- Analyze, predict, and forecast stationary and non-stationary time series processes
- Create an event-driven backtesting tool and measure your strategies
- Build a high-frequency algorithmic trading platform with Python
- Replicate the CBOT VIX index with SPX options for studying VIX-based strategies
- Perform regression-based and classification-based machine learning tasks for prediction
- Use TensorFlow and Keras in deep learning neural network architecture

Hands-On Machine Learning for Algorithmic Trading
Stefan Jansen
ISBN: 9781789346411

- Implement machine learning techniques to solve investment and trading problems
- Leverage market, fundamental, and alternative data to research alpha factors
- Design and fine-tune supervised, unsupervised, and reinforcement learning models
- Optimize portfolio risk and performance using pandas, NumPy, and scikit-learn
- Integrate machine learning models into a live trading strategy on Quantopian
- Evaluate strategies using reliable backtesting methodologies for time series
- Design and evaluate deep neural networks using Keras, PyTorch, and TensorFlow
- Work with reinforcement learning for trading strategies in the OpenAI Gym

Leave a review - let other readers know what you think

Please share your thoughts on this book with others by leaving a review on the site that you bought it from.


pages: 279 words: 87,875

Underwater: How Our American Dream of Homeownership Became a Nightmare by Ryan Dezember

"RICO laws" OR "Racketeer Influenced and Corrupt Organizations", activist fund / activist shareholder / activist investor, Airbnb, Bear Stearns, business cycle, call centre, Carl Icahn, Cesare Marchetti: Marchetti’s constant, cloud computing, collateralized debt obligation, company town, coronavirus, corporate raider, COVID-19, Credit Default Swap, credit default swaps / collateralized debt obligations, data science, deep learning, Donald Trump, Home mortgage interest deduction, housing crisis, interest rate swap, low interest rates, margin call, McMansion, mortgage debt, mortgage tax deduction, negative equity, opioid epidemic / opioid crisis, pill mill, rent control, rolodex, Savings and loan crisis, sharing economy, sovereign wealth fund, transaction costs

She was working on a follow-up to the 2017 paper Stefania Albanesi, “Investors in the 2007–2009 Housing Crisis: An Anatomy,” University of Pittsburgh, National Bureau of Economic Research, and Center for Economic and Policy Research, September 17, 2018. The surge of people going from two first mortgages to three Stefania Albanesi and Domonkos F. Vamossy, “Predicting Consumer Default: A Deep Learning Approach,” Working Papers 26165, National Bureau of Economic Research, August 20, 2019. “The great misnomer of the 2008 crisis” Manuel Adelino, Antoinette Schoar, and Felipe Severino, “The Role of Housing and Mortgage Markets in the Financial Crisis,” Annual Review of Financial Economics 10 (2018): 25–41. 25.


pages: 345 words: 92,063

Power, for All: How It Really Works and Why It's Everyone's Business by Julie Battilana, Tiziana Casciaro

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, Abraham Maslow, affirmative action, agricultural Revolution, Albert Einstein, algorithmic bias, Andy Rubin, Asperger Syndrome, benefit corporation, Big Tech, BIPOC, Black Lives Matter, blood diamond, Boris Johnson, British Empire, call centre, Cass Sunstein, classic study, clean water, cognitive dissonance, collective bargaining, conceptual framework, coronavirus, COVID-19, CRISPR, deep learning, different worldview, digital rights, disinformation, Elon Musk, Erik Brynjolfsson, fake news, feminist movement, fundamental attribution error, future of work, George Floyd, gig economy, Greta Thunberg, hiring and firing, impact investing, income inequality, informal economy, Intergovernmental Panel on Climate Change (IPCC), invention of movable type, Jeff Bezos, job satisfaction, Joshua Gans and Andrew Leigh, Mahatma Gandhi, means of production, mega-rich, meritocracy, meta-analysis, Milgram experiment, moral hazard, Naomi Klein, Nelson Mandela, Occupy movement, Panopticon Jeremy Bentham, principal–agent problem, profit maximization, Ralph Waldo Emerson, ride hailing / ride sharing, Salesforce, School Strike for Climate, Second Machine Age, shareholder value, sharing economy, Sheryl Sandberg, Shoshana Zuboff, Silicon Valley, social distancing, Social Justice Warrior, Social Responsibility of Business Is to Increase Its Profits, Steven Pinker, surveillance capitalism, tacit knowledge, tech worker, the scientific method, The Wisdom of Crowds, TikTok, Tim Cook: Apple, transatlantic slave trade, union organizing, zero-sum game

As one of them put it, “It is one thing to see a VP get a promotion in the corporate sector; it is another to see a woman who hit rock bottom blossom. How do you measure that!?” Lia—who was completely dependent on the certified coaches to achieve her mission—had finally sorted out what the coaches valued most: inspirational purpose, transformative impact, deep learning, and a community of like-minded colleagues. Over time, she had made Up With Women irreplaceable for the coaches to access those valued resources all at once. It’s no wonder that you couldn’t find a more loyal group of volunteers if you tried. By understanding what the coaches needed and wanted, and then figuring out how she could give them access to those resources, Lia introduced a level of mutual dependence into the relationship.


pages: 340 words: 91,416

Lost in Math: How Beauty Leads Physics Astray by Sabine Hossenfelder

Adam Curtis, Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, clockwork universe, cognitive bias, cosmic microwave background, cosmological constant, cosmological principle, crowdsourcing, dark matter, data science, deep learning, double helix, game design, Henri Poincaré, Higgs boson, income inequality, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Johannes Kepler, Large Hadron Collider, Murray Gell-Mann, Nick Bostrom, random walk, Richard Feynman, Schrödinger's Cat, Skype, Stephen Hawking, sunk-cost fallacy, systematic bias, TED Talk, the scientific method

Finding patterns and organizing information are tasks that are central to science, and those are the exact tasks that artificial neural networks are built to excel at. Such computers, designed to mimic the function of natural brains, now analyze data sets that no human can comprehend and search for correlations using deep-learning algorithms. There is no doubt that technological progress is changing what we mean by “doing science.” I try to imagine the day when we’ll just feed all cosmological data to an artificial intelligence (AI). We now wonder what dark matter and dark energy are, but this question might not even make sense to the AI.


pages: 292 words: 94,660

The Loop: How Technology Is Creating a World Without Choices and How to Fight Back by Jacob Ward

2021 United States Capitol attack, 4chan, Abraham Wald, AI winter, Albert Einstein, Albert Michelson, Amazon Mechanical Turk, assortative mating, autonomous vehicles, availability heuristic, barriers to entry, Bayesian statistics, Benoit Mandelbrot, Big Tech, bitcoin, Black Lives Matter, Black Swan, blockchain, Broken windows theory, call centre, Cass Sunstein, cloud computing, contact tracing, coronavirus, COVID-19, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, dark matter, data science, deep learning, Donald Trump, drone strike, endowment effect, George Akerlof, George Floyd, hindsight bias, invisible hand, Isaac Newton, Jeffrey Epstein, license plate recognition, lockdown, longitudinal study, Lyft, mandelbrot fractal, Mark Zuckerberg, meta-analysis, natural language processing, non-fungible token, nudge unit, OpenAI, opioid epidemic / opioid crisis, pattern recognition, QAnon, RAND corporation, Richard Thaler, Robert Shiller, selection bias, self-driving car, seminal paper, shareholder value, smart cities, social contagion, social distancing, Steven Levy, survivorship bias, TikTok, Turing test

And then there were the fundamental limitations of the era: a lack of computing power to crunch numbers, insufficient database capacity for the awesome amounts of information necessary to train an algorithm, and the fact that those limitations made it impossible to scale up small, one-off experiments into useful real-world systems. It took two funding winters, countless research dead ends, and exponentially greater computing power and data-storage capabilities to arrive at the present moment. Today, various flavors of machine learning, from deep-learning neural networks to the generative adversarial networks that pit two neural nets against one another, can do everything from read a printed menu to steer a car along a winding mountain road. Again, this all sounds very hot—and it is amazing stuff. But what is actually being delivered into your life needs to be understood clearly, so we can see what it does and, more important, what it doesn’t do.
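The phrase "pit two neural nets against one another" compresses the whole training loop of a generative adversarial network. As a rough, hypothetical illustration (toy one-dimensional data, arbitrary layer sizes, written with TensorFlow's GradientTape rather than any particular published recipe), the alternation looks like this:

```python
# Toy GAN loop (illustrative only): a generator learns to mimic a simple 1-D "real"
# distribution while a discriminator learns to tell real samples from generated ones.
import tensorflow as tf
from tensorflow import keras

latent_dim = 8
generator = keras.Sequential([keras.Input(shape=(latent_dim,)),
                              keras.layers.Dense(16, activation="relu"),
                              keras.layers.Dense(1)])               # fake 1-D sample
discriminator = keras.Sequential([keras.Input(shape=(1,)),
                                  keras.layers.Dense(16, activation="relu"),
                                  keras.layers.Dense(1)])           # real/fake logit

g_opt, d_opt = keras.optimizers.Adam(1e-3), keras.optimizers.Adam(1e-3)
bce = keras.losses.BinaryCrossentropy(from_logits=True)

real_data = tf.random.normal((1024, 1), mean=4.0, stddev=0.5)        # toy "real" data
batch = 64

for step in range(500):
    idx = tf.random.uniform((batch,), 0, real_data.shape[0], dtype=tf.int32)
    real = tf.gather(real_data, idx)
    noise = tf.random.normal((batch, latent_dim))

    # Discriminator step: real samples labeled 1, generated samples labeled 0.
    with tf.GradientTape() as tape:
        fake = generator(noise, training=True)
        d_loss = (bce(tf.ones((batch, 1)), discriminator(real, training=True)) +
                  bce(tf.zeros((batch, 1)), discriminator(fake, training=True)))
    d_grads = tape.gradient(d_loss, discriminator.trainable_variables)
    d_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))

    # Generator step: try to make the discriminator label its output as real.
    noise = tf.random.normal((batch, latent_dim))
    with tf.GradientTape() as tape:
        fake = generator(noise, training=True)
        g_loss = bce(tf.ones((batch, 1)), discriminator(fake, training=True))
    g_grads = tape.gradient(g_loss, generator.trainable_variables)
    g_opt.apply_gradients(zip(g_grads, generator.trainable_variables))

# Generated samples should drift toward the "real" distribution centred near 4.0.
print(generator(tf.random.normal((5, latent_dim))).numpy().ravel())
```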


pages: 328 words: 96,141

Rocket Billionaires: Elon Musk, Jeff Bezos, and the New Space Race by Tim Fernholz

Amazon Web Services, Apollo 13, autonomous vehicles, business climate, Charles Lindbergh, Clayton Christensen, cloud computing, Colonization of Mars, corporate governance, corporate social responsibility, deep learning, disruptive innovation, Donald Trump, Elon Musk, fail fast, fulfillment center, Gene Kranz, high net worth, high-speed rail, Iridium satellite, Jeff Bezos, Kickstarter, Kim Stanley Robinson, Kwajalein Atoll, low earth orbit, Marc Andreessen, Mark Zuckerberg, Mars Society, Masayoshi Son, megaproject, military-industrial complex, minimum viable product, multiplanetary species, mutually assured destruction, Neal Stephenson, Neil Armstrong, new economy, no-fly zone, nuclear paranoia, paypal mafia, Peter H. Diamandis: Planetary Resources, Peter Thiel, pets.com, planetary scale, private spaceflight, profit maximization, RAND corporation, Richard Feynman, Richard Feynman: Challenger O-ring, Ronald Reagan, satellite internet, Scaled Composites, shareholder value, Silicon Valley, skunkworks, SoftBank, sovereign wealth fund, space junk, SpaceShipOne, Stephen Hawking, Steve Jobs, Strategic Defense Initiative, trade route, undersea cable, vertical integration, Virgin Galactic, VTOL, We wanted flying cars, instead we got 140 characters, X Prize, Y2K

“Entertainment turns out to be the driver of technologies that then become very practical and utilitarian for other things,” the Amazon founder said in 2017. “Even in the early days of aviation, one of the first uses of the very first planes was barnstorming; they would go around and land in farmers’ fields and sell tickets. Likewise, more recently, the GPUs that are now used for machine learning and deep learning: they were really invented by Nvidia for video games. New Shepard, that tourism mission, because we can fly it so frequently, is going to be a real driver of our technology.” Sercel’s estimates of the potential space tourism market are bigger than you might think. “There are nearly 250,000 people on planet earth who have more than $30 million in spare change,” he says.


pages: 326 words: 103,170

The Seventh Sense: Power, Fortune, and Survival in the Age of Networks by Joshua Cooper Ramo

air gap, Airbnb, Alan Greenspan, Albert Einstein, algorithmic trading, barriers to entry, Berlin Wall, bitcoin, Bletchley Park, British Empire, cloud computing, Computing Machinery and Intelligence, crowdsourcing, Danny Hillis, data science, deep learning, defense in depth, Deng Xiaoping, drone strike, Edward Snowden, Fairchild Semiconductor, Fall of the Berlin Wall, financial engineering, Firefox, Google Chrome, growth hacking, Herman Kahn, income inequality, information security, Isaac Newton, Jeff Bezos, job automation, Joi Ito, Laura Poitras, machine translation, market bubble, Menlo Park, Metcalfe’s law, Mitch Kapor, Morris worm, natural language processing, Neal Stephenson, Network effects, Nick Bostrom, Norbert Wiener, Oculus Rift, off-the-grid, packet switching, paperclip maximiser, Paul Graham, power law, price stability, quantitative easing, RAND corporation, reality distortion field, Recombinant DNA, recommendation engine, Republic of Letters, Richard Feynman, road to serfdom, Robert Metcalfe, Sand Hill Road, secular stagnation, self-driving car, Silicon Valley, Skype, Snapchat, Snow Crash, social web, sovereign wealth fund, Steve Jobs, Steve Wozniak, Stewart Brand, Stuxnet, superintelligent machines, systems thinking, technological singularity, The Coming Technological Singularity, The Wealth of Nations by Adam Smith, too big to fail, Vernor Vinge, zero day

For a discussion of the relation between coding and the real world, see Bret Victor, “Inventing on Principle,” Speech, Canadian University Software Engineering Conference, January 20, 2012, Montreal, Quebec, available at https://www.youtube.com/watch?v=PUv66718DII. “Early intercontinental travellers”: Peter Sloterdijk, In the World Interior of Capital: Towards a Philosophical Theory of Globalization, trans. Wieland Hoban (Cambridge: Polity Press, 2013), 77. “When building”: Andrew Ng, “Deep Learning: What’s Next” (speech at GPU Technology Conference, San Jose, CA, March 19, 2015). The French philosopher: Bruno Latour, “On Technical Mediation—Philosophy, Sociology, Genealogy,” Common Knowledge 3, no. 2 (Fall 1994): 37. The immense possibility: Ryan Gallagher, “Profiled: From Radio to Porn, British Spies Track Web Users’ Online Identities,” The Intercept, September 25, 2015; GCHQ documents, “PullThrough Steering Group Meeting #16,” at https://theintercept.com/document/2015/09/25/pull-steering-group-minutes/.


pages: 463 words: 105,197

Radical Markets: Uprooting Capitalism and Democracy for a Just Society by Eric Posner, E. Weyl

3D printing, activist fund / activist shareholder / activist investor, Affordable Care Act / Obamacare, Airbnb, Amazon Mechanical Turk, anti-communist, augmented reality, basic income, Berlin Wall, Bernie Sanders, Big Tech, Branko Milanovic, business process, buy and hold, carbon footprint, Cass Sunstein, Clayton Christensen, cloud computing, collective bargaining, commoditize, congestion pricing, Corn Laws, corporate governance, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, Donald Trump, Elon Musk, endowment effect, Erik Brynjolfsson, Ethereum, feminist movement, financial deregulation, Francis Fukuyama: the end of history, full employment, gamification, Garrett Hardin, George Akerlof, global macro, global supply chain, guest worker program, hydraulic fracturing, Hyperloop, illegal immigration, immigration reform, income inequality, income per capita, index fund, informal economy, information asymmetry, invisible hand, Jane Jacobs, Jaron Lanier, Jean Tirole, Jeremy Corbyn, Joseph Schumpeter, Kenneth Arrow, labor-force participation, laissez-faire capitalism, Landlord’s Game, liberal capitalism, low skilled workers, Lyft, market bubble, market design, market friction, market fundamentalism, mass immigration, negative equity, Network effects, obamacare, offshore financial centre, open borders, Pareto efficiency, passive investing, patent troll, Paul Samuelson, performance metric, plutocrats, pre–internet, radical decentralization, random walk, randomized controlled trial, Ray Kurzweil, recommendation engine, rent-seeking, Richard Thaler, ride hailing / ride sharing, risk tolerance, road to serfdom, Robert Shiller, Ronald Coase, Rory Sutherland, search costs, Second Machine Age, second-price auction, self-driving car, shareholder value, sharing economy, Silicon Valley, Skype, special economic zone, spectrum auction, speech recognition, statistical model, stem cell, telepresence, Thales and the olive presses, Thales of Miletus, The Death and Life of Great American Cities, The Future of Employment, The Market for Lemons, The Nature of the Firm, The Rise and Fall of American Growth, The Theory of the Leisure Class by Thorstein Veblen, The Wealth of Nations by Adam Smith, Thorstein Veblen, trade route, Tragedy of the Commons, transaction costs, trickle-down economics, Tyler Cowen, Uber and Lyft, uber lyft, universal basic income, urban planning, Vanguard fund, vertical integration, women in the workforce, Zipcar

Chris Anderson, Free: The Future of a Radical Price (Hyperion, 2009). 13. Jakob Nielsen, The Case for Micropayments, Nielsen Norman Group (January 25, 1998), https://www.nngroup.com/articles/the-case-for-micropayments/. 14. Daniela Hernandez, Facebook’s Quest to Build an Artificial Brain Depends on this Guy, Wired (2014), https://www.wired.com/2014/08/deep-learning-yann-lecun/. 15. “Complexity” is often used in academic parlance to refer to the difficulty of a problem in the worst case. Often these worst-case bounds are very “conservative” in the sense that they dramatically overstate the requirements in typical real-world applications. With a slight abuse of nomenclature, we use complexity to refer to what a problem requires in a typical or “average” case in practice rather than what it can be proven to require in the worst case. 16. https://news.microsoft.com/features/democratizing-ai/. 17.
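The note's distinction between worst-case and average-case cost is easy to make concrete. The sketch below is my own illustration (not code from the book): a naive quicksort that always picks the first element as its pivot does fine on typical, shuffled input but degrades badly on already-sorted input, which is exactly the gap between the two notions of complexity.

```python
# Minimal sketch (not from the book): worst-case vs. typical-case cost for a
# naive quicksort that always uses the first element as the pivot.
import random

def quicksort(xs, counter):
    """Sort xs, tallying element comparisons in counter[0]."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    counter[0] += len(rest)
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 300
cases = {
    "typical (shuffled)": random.sample(range(n), n),
    "worst (pre-sorted)": list(range(n)),
}
for name, data in cases.items():
    counter = [0]
    quicksort(data, counter)
    print(f"{name}: {counter[0]} comparisons on {n} items")
```

On the shuffled input the comparison count grows roughly like n log n; on the pre-sorted input it grows like n squared, even though both runs solve the same problem.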


Falter: Has the Human Game Begun to Play Itself Out? by Bill McKibben

"Hurricane Katrina" Superdome, 23andMe, Affordable Care Act / Obamacare, Airbnb, Alan Greenspan, American Legislative Exchange Council, An Inconvenient Truth, Anne Wojcicki, Anthropocene, Apollo 11, artificial general intelligence, Bernie Sanders, Bill Joy: nanobots, biodiversity loss, Burning Man, call centre, Cambridge Analytica, carbon footprint, carbon tax, Charles Lindbergh, clean water, Colonization of Mars, computer vision, CRISPR, David Attenborough, deep learning, DeepMind, degrowth, disinformation, Donald Trump, double helix, driverless car, Easter island, Edward Snowden, Elon Musk, ending welfare as we know it, energy transition, Extinction Rebellion, Flynn Effect, gigafactory, Google Earth, Great Leap Forward, green new deal, Greta Thunberg, Hyperloop, impulse control, income inequality, Intergovernmental Panel on Climate Change (IPCC), James Bridle, Jane Jacobs, Jaron Lanier, Jeff Bezos, job automation, Kim Stanley Robinson, life extension, light touch regulation, Mark Zuckerberg, mass immigration, megacity, Menlo Park, moral hazard, Naomi Klein, Neil Armstrong, Nelson Mandela, Nick Bostrom, obamacare, ocean acidification, off grid, oil shale / tar sands, paperclip maximiser, Paris climate accords, pattern recognition, Peter Thiel, plutocrats, profit motive, Ralph Waldo Emerson, Ray Kurzweil, Robert Mercer, Ronald Reagan, Sam Altman, San Francisco homelessness, self-driving car, Silicon Valley, Silicon Valley startup, smart meter, Snapchat, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, supervolcano, tech baron, tech billionaire, technoutopianism, TED Talk, The Wealth of Nations by Adam Smith, traffic fines, Tragedy of the Commons, Travis Kalanick, Tyler Cowen, urban sprawl, Virgin Galactic, Watson beat the top human players on Jeopardy!, Y Combinator, Y2K, yield curve

Ayn Rand, Fountainhead, p. 11. PART THREE: THE NAME OF THE GAME CHAPTER 13 1. Personal conversation, November 22, 2017. 2. James Bridle, “Known Unknowns,” Harper’s, July 2018. 3. “Rise of the Machines,” The Economist, May 22, 2017. 4. “On Welsh Corgis, Computer Vision, and the Power of Deep Learning,” microsoft.com, July 14, 2014. 5. Andrew Roberts, “Elon Musk Says to Forget North Korea Because Artificial Intelligence Is the Real Threat to Humanity,” uproxx.com, August 12, 2017. 6. Tom Simonite, “What Is Ray Kurzweil Up to at Google? Writing Your Emails,” Wired, August 2, 2017. 7.


Mindf*ck: Cambridge Analytica and the Plot to Break America by Christopher Wylie

4chan, affirmative action, Affordable Care Act / Obamacare, air gap, availability heuristic, Berlin Wall, Bernie Sanders, Big Tech, big-box store, Boris Johnson, Brexit referendum, British Empire, call centre, Cambridge Analytica, Chelsea Manning, chief data officer, cognitive bias, cognitive dissonance, colonial rule, computer vision, conceptual framework, cryptocurrency, Daniel Kahneman / Amos Tversky, dark pattern, dark triade / dark tetrad, data science, deep learning, desegregation, disinformation, Dominic Cummings, Donald Trump, Downton Abbey, Edward Snowden, Elon Musk, emotional labour, Etonian, fake news, first-past-the-post, gamification, gentleman farmer, Google Earth, growth hacking, housing crisis, income inequality, indoor plumbing, information asymmetry, Internet of things, Julian Assange, Lyft, Marc Andreessen, Mark Zuckerberg, Menlo Park, move fast and break things, Network effects, new economy, obamacare, Peter Thiel, Potemkin village, recommendation engine, Renaissance Technologies, Robert Mercer, Ronald Reagan, Rosa Parks, Sand Hill Road, Scientific racism, Shoshana Zuboff, side project, Silicon Valley, Skype, Stephen Fry, Steve Bannon, surveillance capitalism, tech bro, uber lyft, unpaid internship, Valery Gerasimov, web application, WikiLeaks, zero-sum game

He had been doing his Ph.D. research on modeling and predicting the behavior of C. elegans roundworms and said that he simply swapped out the worms for people. Jucikas proposed pulling a wide variety of data by building automated data-harvesting utilities, using algorithmic imputations to consolidate different data sources into a single unified identity for each individual, and then using deep-learning neural networks to predict our desired behaviors. We would still need a team of psychologists, he said, to create the narratives needed to change behaviors, but his pipeline served as the first sketch of the targeting system. But what I loved most was that he color-coded it to make the journey look like the London tube map.


pages: 349 words: 98,868

Nervous States: Democracy and the Decline of Reason by William Davies

active measures, Affordable Care Act / Obamacare, Amazon Web Services, Anthropocene, bank run, banking crisis, basic income, Black Lives Matter, Brexit referendum, business cycle, Cambridge Analytica, Capital in the Twenty-First Century by Thomas Piketty, citizen journalism, Climategate, Climatic Research Unit, Colonization of Mars, continuation of politics by other means, creative destruction, credit crunch, data science, decarbonisation, deep learning, DeepMind, deindustrialization, digital divide, discovery of penicillin, Dominic Cummings, Donald Trump, drone strike, Elon Musk, failed state, fake news, Filter Bubble, first-past-the-post, Frank Gehry, gig economy, government statistician, housing crisis, income inequality, Isaac Newton, Jeff Bezos, Jeremy Corbyn, Johannes Kepler, Joseph Schumpeter, knowledge economy, loss aversion, low skilled workers, Mahatma Gandhi, Mark Zuckerberg, mass immigration, meta-analysis, Mont Pelerin Society, mutually assured destruction, Northern Rock, obamacare, Occupy movement, opioid epidemic / opioid crisis, Paris climate accords, pattern recognition, Peace of Westphalia, Peter Thiel, Philip Mirowski, planetary scale, post-industrial society, post-truth, quantitative easing, RAND corporation, Ray Kurzweil, Richard Florida, road to serfdom, Robert Mercer, Ronald Reagan, sentiment analysis, Silicon Valley, Silicon Valley billionaire, Silicon Valley startup, smart cities, Social Justice Warrior, statistical model, Steve Bannon, Steve Jobs, tacit knowledge, the scientific method, Turing machine, Uber for X, universal basic income, University of East Anglia, Valery Gerasimov, W. E. B. Du Bois, We are the 99%, WikiLeaks, women in the workforce, zero-sum game

As the former Pentagon employee Rosa Brooks has observed, one reason why the US military spreads its tentacles ever further into American policymaking is that “Americans increasingly treat the military as an all-purpose tool for fixing anything that happens to be broken.”14 The challenge of fixing a violent and rapidly self-destructive relationship to the natural environment has greater historic importance than any other. Whatever confronts this task, if not the actual military, will have to be something with many of the same characteristics as the military. Making promises Thanks to the sudden progress of “neural networking” techniques of AI (or deep learning), we now face the potential prospect of computers matching the powers of the human mind. This is perhaps the most daunting prospect for expertise today, threatening to replace a wide range of “white collar” and “knowledge-intensive” jobs. The professional work of journalists, lawyers, accountants, and architects is already vulnerable to automation, as machine learning grows in sophistication, thanks partly to the vast quantities of data we produce.


pages: 346 words: 97,330

Ghost Work: How to Stop Silicon Valley From Building a New Global Underclass by Mary L. Gray, Siddharth Suri

"World Economic Forum" Davos, Affordable Care Act / Obamacare, AlphaGo, Amazon Mechanical Turk, Apollo 13, augmented reality, autonomous vehicles, barriers to entry, basic income, benefit corporation, Big Tech, big-box store, bitcoin, blue-collar work, business process, business process outsourcing, call centre, Capital in the Twenty-First Century by Thomas Piketty, cloud computing, cognitive load, collaborative consumption, collective bargaining, computer vision, corporate social responsibility, cotton gin, crowdsourcing, data is the new oil, data science, deep learning, DeepMind, deindustrialization, deskilling, digital divide, do well by doing good, do what you love, don't be evil, Donald Trump, Elon Musk, employer provided health coverage, en.wikipedia.org, equal pay for equal work, Erik Brynjolfsson, fake news, financial independence, Frank Levy and Richard Murnane: The New Division of Labor, fulfillment center, future of work, gig economy, glass ceiling, global supply chain, hiring and firing, ImageNet competition, independent contractor, industrial robot, informal economy, information asymmetry, Jeff Bezos, job automation, knowledge economy, low skilled workers, low-wage service sector, machine translation, market friction, Mars Rover, natural language processing, new economy, operational security, passive income, pattern recognition, post-materialism, post-work, power law, race to the bottom, Rana Plaza, recommendation engine, ride hailing / ride sharing, Ronald Coase, scientific management, search costs, Second Machine Age, sentiment analysis, sharing economy, Shoshana Zuboff, side project, Silicon Valley, Silicon Valley startup, Skype, software as a service, speech recognition, spinning jenny, Stephen Hawking, TED Talk, The Future of Employment, The Nature of the Firm, Tragedy of the Commons, transaction costs, two-sided market, union organizing, universal basic income, Vilfredo Pareto, Wayback Machine, women in the workforce, work culture , Works Progress Administration, Y Combinator, Yochai Benkler

The annual ImageNet competition saw a roughly 10x reduction in error and a roughly 3x increase in precision in recognizing images over the course of eight years. Eventually the vision algorithms achieved a lower error rate than the human workers. The algorithmic and engineering advances that scientists achieved over the eight years of competition fueled much of the recent success of neural networks, the so-called deep learning revolution, which would impact a variety of fields and problem domains. 13. Djellel Difallah, Elena Filatova, and Panos Ipeirotis, “Demographics and Dynamics of Mechanical Turk Workers,” in Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining (New York: ACM, 2018), 135–43, https://doi.org/10.1145/3159652.3159661.


pages: 405 words: 103,723

The Government of No One: The Theory and Practice of Anarchism by Ruth Kinna

Anthropocene, Berlin Wall, British Empire, complexity theory, creative destruction, critical race theory, David Graeber, deep learning, degrowth, en.wikipedia.org, Fall of the Berlin Wall, feminist movement, friendly fire, ghettoisation, Herbert Marcuse, intentional community, John Gilmore, Kickstarter, late capitalism, means of production, meritocracy, moral panic, Murray Bookchin, New Journalism, Occupy movement, post scarcity, public intellectual, rewilding, Steven Pinker, Ted Kaczynski, union organizing, wage slave

Anarchists typically understand education as an approach to life, tapping into long-established conventions that emphasize processes of socialization and moral development as well as learning or knowledge acquisition.50 Expressing a widely held anarchist view, Lucy Parsons defined education as creation of ‘self-thinking individuals’.51 Working on the other side of the Pacific in late Qing dynasty China, the foremost anarchist organizer Shifu likewise distinguished ‘formal education’ from ‘education in the transformation of quotidian life’.52 Distancing himself from campaigns his comrades promoted to instruct people about the basics of anarchism, he pushed for an education that demanded understanding of the ‘causes of the vileness of society’, the abandonment of ‘false morality and corrupt systems’. This kind of deep learning required the eradication of ‘the clever people’ and the disregard of ‘the teachings of so-called sages’. Shifu’s was a programme of disobedience and anti-government activism intended to restore ‘the essential beauty’ of ‘human morality’.53 ‘We must learn to think differently,’ said, in a similar vein, Alexander Berkman, editor of the Blast, ‘before the revolution can come’.


pages: 285 words: 98,832

The Premonition: A Pandemic Story by Michael Lewis

"World Economic Forum" Davos, Airbnb, contact tracing, coronavirus, COVID-19, dark matter, data science, deep learning, Donald Davies, Donald Trump, double helix, energy security, facts on the ground, failed state, gentleman farmer, global supply chain, illegal immigration, Marc Benioff, Mark Zuckerberg, out of africa, precautionary principle, QAnon, rolling blackouts, Ronald Reagan, Salesforce, Silicon Valley, social distancing, Social Justice Warrior, stem cell, tech bro, telemarketer, the new new thing, working poor, young professional

To inject a snake’s heart with a virus requires two postdocs and one full professor: one to hold the snake in a death grip, one to use a Doppler radar to find the snake’s heart, and a third to plunge the needle into it. It seemed exactly the sort of mission that might test the loyalty of a graduate student. The postdocs who spent time in the DeRisi Lab were, at Joe’s insistence, a mixed bag: biologists, chemists, deep learning specialists, medical doctors of every sort. But they had one thing in common: they were up for anything. “I try to recruit all kinds of people,” said Joe. “But the people who are attracted to us would have zero reservations about jumping onto that ship.” The professor and the students injected many boa constrictors, and many pythons, with the arenavirus.


pages: 328 words: 96,678

MegaThreats: Ten Dangerous Trends That Imperil Our Future, and How to Survive Them by Nouriel Roubini

"World Economic Forum" Davos, 2021 United States Capitol attack, 3D printing, 9 dash line, AI winter, AlphaGo, artificial general intelligence, asset allocation, assortative mating, autonomous vehicles, bank run, banking crisis, basic income, Bear Stearns, Big Tech, bitcoin, Bletchley Park, blockchain, Boston Dynamics, Bretton Woods, British Empire, business cycle, business process, call centre, carbon tax, Carmen Reinhart, cashless society, central bank independence, collateralized debt obligation, Computing Machinery and Intelligence, coronavirus, COVID-19, creative destruction, credit crunch, crony capitalism, cryptocurrency, currency manipulation / currency intervention, currency peg, data is the new oil, David Ricardo: comparative advantage, debt deflation, decarbonisation, deep learning, DeepMind, deglobalization, Demis Hassabis, democratizing finance, Deng Xiaoping, disintermediation, Dogecoin, Donald Trump, Elon Musk, en.wikipedia.org, energy security, energy transition, Erik Brynjolfsson, Ethereum, ethereum blockchain, eurozone crisis, failed state, fake news, family office, fiat currency, financial deregulation, financial innovation, financial repression, fixed income, floating exchange rates, forward guidance, Fractional reserve banking, Francis Fukuyama: the end of history, full employment, future of work, game design, geopolitical risk, George Santayana, Gini coefficient, global pandemic, global reserve currency, global supply chain, GPS: selective availability, green transition, Greensill Capital, Greenspan put, Herbert Marcuse, high-speed rail, Hyman Minsky, income inequality, inflation targeting, initial coin offering, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of movable type, Isaac Newton, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, junk bonds, Kenneth Rogoff, knowledge worker, Long Term Capital Management, low interest rates, low skilled workers, low-wage service sector, M-Pesa, margin call, market bubble, Martin Wolf, mass immigration, means of production, meme stock, Michael Milken, middle-income trap, Mikhail Gorbachev, Minsky moment, Modern Monetary Theory, money market fund, money: store of value / unit of account / medium of exchange, moral hazard, mortgage debt, Mustafa Suleyman, Nash equilibrium, natural language processing, negative equity, Nick Bostrom, non-fungible token, non-tariff barriers, ocean acidification, oil shale / tar sands, oil shock, paradox of thrift, pets.com, Phillips curve, planetary scale, Ponzi scheme, precariat, price mechanism, price stability, public intellectual, purchasing power parity, quantitative easing, race to the bottom, Ralph Waldo Emerson, ransomware, Ray Kurzweil, regulatory arbitrage, reserve currency, reshoring, Robert Shiller, Ronald Reagan, Salesforce, Satoshi Nakamoto, Savings and loan crisis, Second Machine Age, short selling, Silicon Valley, smart contracts, South China Sea, sovereign wealth fund, Stephen Hawking, TED Talk, The Great Moderation, the payments system, Thomas L Friedman, TikTok, too big to fail, Turing test, universal basic income, War on Poverty, warehouse robotics, Washington Consensus, Watson beat the top human players on Jeopardy!, working-age population, Yogi Berra, Yom Kippur War, zero-sum game, zoonotic diseases

“You can get into semantics about what does reasoning mean, but clearly the AI system was reasoning at that point,” says New York Times journalist Craig Smith, who now hosts the podcast Eye on AI.5 A year later, AlphaGo Zero bested AlphaGo by learning the rules of the game and then generating billions of data points in just three days. Deep learning has progressed with mind-bending speed. In 2020, DeepMind’s AlphaFold2 revolutionized the field of biology by solving “the protein-folding problem” that had stumped medical researchers for five decades. Besides probing massive volumes of molecular data on protein structures, AlphaFold deployed “transformers,” an innovative neural network that Google Brain scientists unveiled in a 2017 paper.


pages: 599 words: 98,564

The Mutant Project: Inside the Global Race to Genetically Modify Humans by Eben Kirksey

23andMe, Abraham Maslow, Affordable Care Act / Obamacare, Albert Einstein, Bernie Sanders, bioinformatics, bitcoin, Black Lives Matter, blockchain, Buckminster Fuller, clean water, coronavirus, COVID-19, CRISPR, cryptocurrency, data acquisition, deep learning, Deng Xiaoping, Donald Trump, double helix, epigenetics, Ethereum, ethereum blockchain, experimental subject, fake news, gentrification, George Floyd, Jeff Bezos, lockdown, Mark Zuckerberg, megacity, microdosing, moral panic, move fast and break things, personalized medicine, phenotype, placebo effect, randomized controlled trial, Recombinant DNA, Shenzhen special economic zone , Shenzhen was a fishing village, Silicon Valley, Silicon Valley billionaire, Skype, special economic zone, statistical model, stem cell, surveillance capitalism, tech billionaire, technological determinism, upwardly mobile, urban planning, young professional

Many IVF clinics in the world still do not have such a fancy setup, so most embryologists are still eyeballing it. The company that makes this incubator, Vitrolife, claims to have “the world’s largest morphokinetic database,” meaning that they keep track of the embryos as they grow, move, and develop a form. Deep-learning technologies are constantly feeding the algorithm new data, linking real-world events like a failed implantation, a miscarriage, and a live birth back into their patented EmbryoScope system. Embryos deemed “high-risk” are flagged for biopsy procedures and DNA testing.4 Adding CRISPR into the workflow would be easy, O’Neill said.


pages: 320 words: 95,629

Decoding the World: A Roadmap for the Questioner by Po Bronson

23andMe, 3D printing, 4chan, Abraham Maslow, Affordable Care Act / Obamacare, altcoin, Apple's 1984 Super Bowl advert, Asilomar, autonomous vehicles, basic income, Big Tech, bitcoin, blockchain, Burning Man, call centre, carbon credits, carbon tax, cognitive bias, cognitive dissonance, coronavirus, COVID-19, CRISPR, cryptocurrency, decarbonisation, deep learning, deepfake, DeepMind, dematerialisation, Donald Trump, driverless car, dumpster diving, edge city, Ethereum, ethereum blockchain, Eyjafjallajökull, factory automation, fake news, financial independence, Google X / Alphabet X, green new deal, income inequality, industrial robot, Isaac Newton, Jeff Bezos, Kevin Kelly, Kickstarter, Mars Rover, mass immigration, McMansion, means of production, microbiome, microplastics / micro fibres, oil shale / tar sands, opioid epidemic / opioid crisis, Paul Graham, paypal mafia, phenotype, Ponzi scheme, power law, quantum entanglement, Ronald Reagan, Sand Hill Road, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, smart contracts, source of truth, stem cell, Steve Jobs, Steve Jurvetson, sustainable-tourism, synthetic biology, Tesla Model S, too big to fail, trade route, universal basic income, Watson beat the top human players on Jeopardy!, women in the workforce

There is now a large variety of algorithms, but they all use the concept of Perceptrons at their heart. Cutting-edge research in artificial intelligence today is about getting computers to create new goals for themselves and seek ways to attain them. This approach has led to the development of deep-learning techniques that have created programs that defeated grand masters at Go and chess, and fighter pilots in aerial dogfights. Today’s computers have billions of transistors to process the data they ingest. They are trained with a goal in mind, on millions of games or dogfights, and they exhibit nonhuman creativity, trying things humans never have.
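The Perceptron the passage refers to is simple enough to write down. The snippet below is a toy illustration of my own (not from the book): a single artificial neuron that weights its inputs, thresholds the sum, and nudges its weights only when it misclassifies an example.

```python
# Toy perceptron sketch (illustrative, not from the book): weights are
# nudged toward any example the neuron gets wrong.
import random

def train_perceptron(samples, epochs=50, lr=0.1):
    """samples: list of ((x1, x2), label) pairs with label in {-1, +1}."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            activation = w[0] * x1 + w[1] * x2 + b
            predicted = 1 if activation >= 0 else -1
            if predicted != label:          # update only on mistakes
                w[0] += lr * label * x1
                w[1] += lr * label * x2
                b += lr * label
    return w, b

# Toy, linearly separable data: +1 above the line x2 = x1, -1 below it.
random.seed(0)
data = [((x1, x2), 1 if x2 > x1 else -1)
        for x1, x2 in ((random.uniform(-1, 1), random.uniform(-1, 1))
                       for _ in range(200))]

w, b = train_perceptron(data)
mistakes = sum(1 for (x1, x2), y in data
               if (1 if w[0] * x1 + w[1] * x2 + b >= 0 else -1) != y)
print(f"weights={w}, bias={b:.3f}, remaining mistakes={mistakes}")
```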


pages: 337 words: 96,666

Practical Doomsday: A User's Guide to the End of the World by Michal Zalewski

accounting loophole / creative accounting, AI winter, anti-communist, artificial general intelligence, bank run, big-box store, bitcoin, blockchain, book value, Buy land – they’re not making it any more, capital controls, Capital in the Twenty-First Century by Thomas Piketty, Carrington event, clean water, coronavirus, corporate governance, COVID-19, cryptocurrency, David Graeber, decentralized internet, deep learning, distributed ledger, diversification, diversified portfolio, Dogecoin, dumpster diving, failed state, fiat currency, financial independence, financial innovation, fixed income, Fractional reserve banking, Francis Fukuyama: the end of history, Haber-Bosch Process, housing crisis, index fund, indoor plumbing, information security, inventory management, Iridium satellite, Joan Didion, John Bogle, large denomination, lifestyle creep, mass immigration, McDonald's hot coffee lawsuit, McMansion, medical bankruptcy, Modern Monetary Theory, money: store of value / unit of account / medium of exchange, moral panic, non-fungible token, nuclear winter, off-the-grid, Oklahoma City bombing, opioid epidemic / opioid crisis, paperclip maximiser, passive investing, peak oil, planetary scale, ransomware, restrictive zoning, ride hailing / ride sharing, risk tolerance, Ronald Reagan, Satoshi Nakamoto, Savings and loan crisis, self-driving car, shareholder value, Silicon Valley, supervolcano, systems thinking, tech worker, Ted Kaczynski, TED Talk, Tunguska event, underbanked, urban sprawl, Wall-E, zero-sum game, zoonotic diseases

On this topic, the history of AI research offers a cautionary tale: after the initial exuberance and some stunning early successes of artificial neural networks in the 1950s and 1960s, the field slid into a prolonged “AI winter” of broken promises and constant disappointments. Funding dwindled, and few academics would take pride in associating themselves with the discipline. It wasn’t until the late 2000s that AI research made a comeback, aided with vastly superior computing resources and significant refinements to the architecture of neural networks and to deep learning algorithms. But the field focused on humble, utilitarian goals: building systems custom-tailored to perform highly specialized tasks, such as voice recognition, image classification, or the translation of text. Such architectures, although quite successful, still require quite a few quantum leaps to get anywhere close to AGI, and tellingly, the desire to build a digital “brain in a jar” is not an immediate goal for any serious corporate or academic research right now.


pages: 385 words: 111,113

Augmented: Life in the Smart Lane by Brett King

23andMe, 3D printing, additive manufacturing, Affordable Care Act / Obamacare, agricultural Revolution, Airbnb, Albert Einstein, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, Apollo 11, Apollo Guidance Computer, Apple II, artificial general intelligence, asset allocation, augmented reality, autonomous vehicles, barriers to entry, bitcoin, Bletchley Park, blockchain, Boston Dynamics, business intelligence, business process, call centre, chief data officer, Chris Urmson, Clayton Christensen, clean water, Computing Machinery and Intelligence, congestion charging, CRISPR, crowdsourcing, cryptocurrency, data science, deep learning, DeepMind, deskilling, different worldview, disruptive innovation, distributed generation, distributed ledger, double helix, drone strike, electricity market, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, fiat currency, financial exclusion, Flash crash, Flynn Effect, Ford Model T, future of work, gamification, Geoffrey Hinton, gig economy, gigafactory, Google Glasses, Google X / Alphabet X, Hans Lippershey, high-speed rail, Hyperloop, income inequality, industrial robot, information asymmetry, Internet of things, invention of movable type, invention of the printing press, invention of the telephone, invention of the wheel, James Dyson, Jeff Bezos, job automation, job-hopping, John Markoff, John von Neumann, Kevin Kelly, Kickstarter, Kim Stanley Robinson, Kiva Systems, Kodak vs Instagram, Leonard Kleinrock, lifelogging, low earth orbit, low skilled workers, Lyft, M-Pesa, Mark Zuckerberg, Marshall McLuhan, megacity, Metcalfe’s law, Minecraft, mobile money, money market fund, more computing power than Apollo, Neal Stephenson, Neil Armstrong, Network effects, new economy, Nick Bostrom, obamacare, Occupy movement, Oculus Rift, off grid, off-the-grid, packet switching, pattern recognition, peer-to-peer, Ray Kurzweil, retail therapy, RFID, ride hailing / ride sharing, Robert Metcalfe, Salesforce, Satoshi Nakamoto, Second Machine Age, selective serotonin reuptake inhibitor (SSRI), self-driving car, sharing economy, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart transportation, Snapchat, Snow Crash, social graph, software as a service, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, synthetic biology, systems thinking, TaskRabbit, technological singularity, TED Talk, telemarketer, telepresence, telepresence robot, Tesla Model S, The future is already here, The Future of Employment, Tim Cook: Apple, trade route, Travis Kalanick, TSMC, Turing complete, Turing test, Twitter Arab Spring, uber lyft, undersea cable, urban sprawl, V2 rocket, warehouse automation, warehouse robotics, Watson beat the top human players on Jeopardy!, white picket fence, WikiLeaks, yottabyte

In 1997, Bill Gates was pretty bullish on speech recognition, predicting that “In this 10-year time frame, I believe that we’ll not only be using the keyboard and the mouse to interact, but during that time we will have perfected speech recognition and speech output well enough that those will become a standard part of the interface.”25 In the year 2000, it was still a decade away. The big breakthroughs came with the application of Markov models and Deep Learning models or neural networks, basically better computer performance and bigger source databases. However, the models that we have today are limited because they still don’t learn language. These algorithms don’t learn language like a human; they identify a phrase through recognition, look it up on a database and then deliver an appropriate response.
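The passage's point that these systems look phrases up rather than learn language can be pictured with a toy sketch (entirely hypothetical, not any real assistant's code): normalize the recognized text, match it against a table of known phrases, and return a canned reply.

```python
# Hypothetical lookup-style "understanding": recognize a phrase, find it in a
# table, return a canned response -- no model of language behind it.
RESPONSES = {
    "what time is it": "It is 9:41 AM.",
    "set an alarm": "For what time would you like the alarm?",
    "play some music": "Playing your favourites mix.",
}

def respond(recognized_text: str) -> str:
    key = recognized_text.lower().strip(" ?!.")
    if key in RESPONSES:                      # exact match first
        return RESPONSES[key]
    for phrase, reply in RESPONSES.items():   # crude substring fallback
        if phrase in key:
            return reply
    return "Sorry, I can't help with that."

print(respond("What time is it?"))
print(respond("Could you play some music"))
print(respond("Explain why that joke was funny"))
```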


pages: 344 words: 104,077

Superminds: The Surprising Power of People and Computers Thinking Together by Thomas W. Malone

Abraham Maslow, agricultural Revolution, Airbnb, Albert Einstein, Alvin Toffler, Amazon Mechanical Turk, Apple's 1984 Super Bowl advert, Asperger Syndrome, Baxter: Rethink Robotics, bitcoin, blockchain, Boeing 747, business process, call centre, carbon tax, clean water, Computing Machinery and Intelligence, creative destruction, crowdsourcing, data science, deep learning, Donald Trump, Douglas Engelbart, Douglas Engelbart, driverless car, drone strike, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, experimental economics, Exxon Valdez, Ford Model T, future of work, Future Shock, Galaxy Zoo, Garrett Hardin, gig economy, happiness index / gross national happiness, independent contractor, industrial robot, Internet of things, invention of the telegraph, inventory management, invisible hand, Jeff Rulifson, jimmy wales, job automation, John Markoff, Joi Ito, Joseph Schumpeter, Kenneth Arrow, knowledge worker, longitudinal study, Lyft, machine translation, Marshall McLuhan, Nick Bostrom, Occupy movement, Pareto efficiency, pattern recognition, prediction markets, price mechanism, radical decentralization, Ray Kurzweil, Rodney Brooks, Ronald Coase, search costs, Second Machine Age, self-driving car, Silicon Valley, slashdot, social intelligence, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, technological singularity, The Nature of the Firm, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Tim Cook: Apple, Tragedy of the Commons, transaction costs, Travis Kalanick, Uber for X, uber lyft, Vernor Vinge, Vilfredo Pareto, Watson beat the top human players on Jeopardy!

One of the most impressive recent examples of a computer doing unsupervised learning was when a group of Stanford University and Google researchers gave a computer system 10 million digital images from YouTube videos and let the system look for patterns. Without the researchers ever telling the system what to look for, it learned to identify 20,000 categories of objects, including human faces, human bodies, and… cat faces.19 This system used a particularly promising approach to machine learning called deep learning, which loosely simulates the way the different layers of neurons in a brain are connected to one another. Neuromorphic Computing Still another intriguing approach to creating more intelligent computers is to create new kinds of computer hardware that more closely resemble the structure of a human brain.


pages: 363 words: 109,077

The Raging 2020s: Companies, Countries, People - and the Fight for Our Future by Alec Ross

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, Affordable Care Act / Obamacare, air gap, air traffic controllers' union, Airbnb, Albert Einstein, An Inconvenient Truth, autonomous vehicles, barriers to entry, benefit corporation, Bernie Sanders, Big Tech, big-box store, British Empire, call centre, capital controls, clean water, collective bargaining, computer vision, coronavirus, corporate governance, corporate raider, COVID-19, deep learning, Deng Xiaoping, Didi Chuxing, disinformation, Dissolution of the Soviet Union, Donald Trump, Double Irish / Dutch Sandwich, drone strike, dumpster diving, employer provided health coverage, Francis Fukuyama: the end of history, future of work, general purpose technology, gig economy, Gini coefficient, global supply chain, Goldman Sachs: Vampire Squid, Gordon Gekko, greed is good, high-speed rail, hiring and firing, income inequality, independent contractor, information security, intangible asset, invisible hand, Jeff Bezos, knowledge worker, late capitalism, low skilled workers, Lyft, Marc Andreessen, Marc Benioff, mass immigration, megacity, military-industrial complex, minimum wage unemployment, mittelstand, mortgage tax deduction, natural language processing, Oculus Rift, off-the-grid, offshore financial centre, open economy, OpenAI, Parag Khanna, Paris climate accords, profit motive, race to the bottom, RAND corporation, ride hailing / ride sharing, Robert Bork, rolodex, Ronald Reagan, Salesforce, self-driving car, shareholder value, side hustle, side project, Silicon Valley, smart cities, Social Responsibility of Business Is to Increase Its Profits, sovereign wealth fund, sparse data, special economic zone, Steven Levy, stock buybacks, strikebreaker, TaskRabbit, tech bro, tech worker, transcontinental railway, transfer pricing, Travis Kalanick, trickle-down economics, Uber and Lyft, uber lyft, union organizing, Upton Sinclair, vertical integration, working poor

Uber China was sold to Didi: Alyssa Abkowitz and Rick Carew, “Uber Sells China Operations to Didi Chuxing,” Wall Street Journal, August 1, 2016, https://www.wsj.com/articles/china-s-didi-chuxing-to-acquire-rival-uber-s-chinese-operations-1470024403. After beating out Uber: Sarah Dai, “‘China’s Uber’ Ramps Up AI Arms Race, Says It Will Open Third Deep Learning Research Lab,” South China Morning Post, January 26, 2018, https://www.scmp.com/tech/start-ups/article/2130793/didi-chuxing-ramps-artificial-intelligence-arms-race-says-it-will; Jonathan Cheng, “China’s Ride-Hailing Giant Didi to Test Beijing’s New Digital Currency,” Wall Street Journal, July 8, 2020, https://www.wsj.com/articles/chinas-ride-hailing-giant-didi-to-test-beijings-new-digital-currency-11594206564.


pages: 363 words: 109,834

The Crux by Richard Rumelt

activist fund / activist shareholder / activist investor, air gap, Airbnb, AltaVista, AOL-Time Warner, Bayesian statistics, behavioural economics, biodiversity loss, Blue Ocean Strategy, Boeing 737 MAX, Boeing 747, Charles Lindbergh, Clayton Christensen, cloud computing, cognitive bias, commoditize, coronavirus, corporate raider, COVID-19, creative destruction, crossover SUV, Crossrail, deep learning, Deng Xiaoping, diversified portfolio, double entry bookkeeping, drop ship, Elon Musk, en.wikipedia.org, financial engineering, Ford Model T, Herman Kahn, income inequality, index card, Internet of things, Jeff Bezos, Just-in-time delivery, Larry Ellison, linear programming, lockdown, low cost airline, low earth orbit, Lyft, Marc Benioff, Mark Zuckerberg, Masayoshi Son, meta-analysis, Myron Scholes, natural language processing, Neil Armstrong, Network effects, packet switching, PageRank, performance metric, precision agriculture, RAND corporation, ride hailing / ride sharing, Salesforce, San Francisco homelessness, search costs, selection bias, self-driving car, shareholder value, sharing economy, Silicon Valley, Skype, Snapchat, social distancing, SoftBank, software as a service, statistical model, Steve Ballmer, Steve Jobs, stochastic process, Teledyne, telemarketer, TSMC, uber lyft, undersea cable, union organizing, vertical integration, WeWork

These technologies represented engineering problems of mixing hardware and software, closeness to customers, and lower margins that were not in synch with the bulk of the company’s x86 high-margin, down-the-nodes business. A majority felt that the AI challenge was both important and addressable. The group judged that if Habana were kept separate, it had a chance to carve out a good position in the deep-learning market. But how big would this specialized market be? The culture issue was intertwined with each upside challenge. The group believed that internal beliefs, habits, and processes had been ingrained over many years by market dominance, high margins, and clockwork scaling down through the steps of the Moore’s Law nodes.


pages: 358 words: 106,951

Diverse Bodies, Diverse Practices: Toward an Inclusive Somatics by Don Hanlon Johnson

BIPOC, clean water, colonial rule, complexity theory, conceptual framework, deep learning, emotional labour, epigenetics, imposter syndrome, liberation theology, mass incarceration, microaggression, phenotype, randomized controlled trial, W. E. B. Du Bois, working poor

There is work for us to do, to change our practices so more types of bodies feel invited to this work. A place of beginning might be deepening our awareness to how much release or constriction we need in any given moment. While it hurts and angers me when I think about all the ways I navigate whether or not my body can physically fit somewhere, there is also a deep learning. It has made me very aware of how my body interacts with my environment and the people around me. It has offered me the chance to explore in deep ways the physical and energetic space I take up. I do not take for granted that I am impacting those around me, and that I need to be paying attention to how my behavior and presence do that.


pages: 371 words: 107,141

You've Been Played: How Corporations, Governments, and Schools Use Games to Control Us All by Adrian Hon

"hyperreality Baudrillard"~20 OR "Baudrillard hyperreality", 4chan, Adam Curtis, Adrian Hon, Airbnb, Amazon Mechanical Turk, Amazon Web Services, Astronomia nova, augmented reality, barriers to entry, Bellingcat, Big Tech, bitcoin, bread and circuses, British Empire, buy and hold, call centre, computer vision, conceptual framework, contact tracing, coronavirus, corporate governance, COVID-19, crowdsourcing, cryptocurrency, David Graeber, David Sedaris, deep learning, delayed gratification, democratizing finance, deplatforming, disinformation, disintermediation, Dogecoin, electronic logging device, Elon Musk, en.wikipedia.org, Ethereum, fake news, fiat currency, Filter Bubble, Frederick Winslow Taylor, fulfillment center, Galaxy Zoo, game design, gamification, George Floyd, gig economy, GitHub removed activity streaks, Google Glasses, Hacker News, Hans Moravec, Ian Bogost, independent contractor, index fund, informal economy, Jeff Bezos, job automation, jobs below the API, Johannes Kepler, Kevin Kelly, Kevin Roose, Kickstarter, Kiva Systems, knowledge worker, Lewis Mumford, lifelogging, linked data, lockdown, longitudinal study, loss aversion, LuLaRoe, Lyft, Marshall McLuhan, megaproject, meme stock, meta-analysis, Minecraft, moral panic, multilevel marketing, non-fungible token, Ocado, Oculus Rift, One Laptop per Child (OLPC), orbital mechanics / astrodynamics, Parler "social media", passive income, payment for order flow, prisoner's dilemma, QAnon, QR code, quantitative trading / quantitative finance, r/findbostonbombers, replication crisis, ride hailing / ride sharing, Robinhood: mobile stock trading app, Ronald Coase, Rubik’s Cube, Salesforce, Satoshi Nakamoto, scientific management, shareholder value, sharing economy, short selling, short squeeze, Silicon Valley, SimCity, Skinner box, spinning jenny, Stanford marshmallow experiment, Steve Jobs, Stewart Brand, TED Talk, The Nature of the Firm, the scientific method, TikTok, Tragedy of the Commons, transaction costs, Twitter Arab Spring, Tyler Cowen, Uber and Lyft, uber lyft, urban planning, warehouse robotics, Whole Earth Catalog, why are manhole covers round?, workplace surveillance

But other school-based scoring and reward systems have taken their place. Take the facial recognition systems being deployed in schools in China, as reported by Xue Yujie on the Sixth Tone news site.90 Hanwang Education is responsible for the Class Care System (CCS) which claims to use cameras and deep-learning algorithms to identify students and classify their behaviour into categories including listening, writing, sleeping, answering questions, and interacting with other students. The sum of this behaviour becomes a score, accessible to teachers via an app. Zhang Haopeng, general manager at Hanwang Education, demonstrated the system used by Chifeng No. 4 Middle School in 2018 to Xue: “The parents can see [the score], too.


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, animal electricity, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, Bletchley Park, British Empire, business process, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, complexity theory, Computing Machinery and Intelligence, continuous integration, Conway's Game of Life, cosmological principle, dark matter, data science, deep learning, DeepMind, dematerialisation, double helix, Douglas Hofstadter, driverless car, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, Hans Moravec, income inequality, index card, industrial robot, intentional community, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, Kickstarter, liberal capitalism, lifelogging, machine translation, millennium bug, mirror neurons, Moravec's paradox, natural language processing, Nick Bostrom, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, Plato's cave, post-industrial society, power law, precautionary principle, prediction markets, Ray Kurzweil, Recombinant DNA, Rodney Brooks, Second Machine Age, self-driving car, seminal paper, Silicon Valley, social intelligence, speech recognition, stem cell, Stephen Hawking, Steven Pinker, Strategic Defense Initiative, strong AI, Stuart Kauffman, synthetic biology, systems thinking, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

These global companies move towards smarter machine technologies because they understand the challenges and opportunities entailed in owning big data. They also understand that it is not enough to own the data. The real game changer lies in understanding the data’s true significance. Take, for instance, Professor LeCun, a pioneer in developing deep learning algorithms that can interpret meanings and contexts of symbols and images. This technology is valuable for Facebook as it aspires to increase the ways in which it serves its billions of customers – and the advertising industry – by extracting meaning from its colossal and ever-expanding archive of user-generated content.


pages: 443 words: 112,800

The Third Industrial Revolution: How Lateral Power Is Transforming Energy, the Economy, and the World by Jeremy Rifkin

3D printing, additive manufacturing, Albert Einstein, American ideology, An Inconvenient Truth, barriers to entry, behavioural economics, bike sharing, borderless world, carbon footprint, centre right, clean tech, collaborative consumption, collaborative economy, Community Supported Agriculture, corporate governance, decarbonisation, deep learning, distributed generation, electricity market, en.wikipedia.org, energy security, energy transition, Ford Model T, global supply chain, Great Leap Forward, high-speed rail, hydrogen economy, income inequality, industrial cluster, informal economy, Intergovernmental Panel on Climate Change (IPCC), invisible hand, Isaac Newton, job automation, knowledge economy, manufacturing employment, marginal employment, Martin Wolf, Masdar, megacity, Mikhail Gorbachev, new economy, off grid, off-the-grid, oil shale / tar sands, oil shock, open borders, peak oil, Ponzi scheme, post-oil, purchasing power parity, Ray Kurzweil, rewilding, Robert Solow, Ronald Reagan, scientific management, scientific worldview, Silicon Valley, Simon Kuznets, Skype, smart grid, smart meter, Spread Networks laid a new fibre optics cable between New York and Chicago, supply-chain management, systems thinking, tech billionaire, the market place, The Wealth of Nations by Adam Smith, Thomas Malthus, too big to fail, transaction costs, trickle-down economics, urban planning, urban renewal, Yom Kippur War, Zipcar

Knowledge, according to Bruffee and other educational reformers, is a social construct, a consensus among the members of a learning community.17 If knowledge is something that exists between people and comes out of their shared experiences, then the way our educational process is set up is inimical to deep learning. Our schooling is often little more than a stimulus-response process, a robotic affair in which students are programmed to respond to the instructions fed into them—much like the standard operating procedures of scientific management that created the workers of the First and Second Industrial Revolutions.


pages: 407 words: 116,726

Infinite Powers: How Calculus Reveals the Secrets of the Universe by Steven Strogatz

Albert Einstein, Asperger Syndrome, Astronomia nova, Bernie Sanders, clockwork universe, complexity theory, cosmological principle, Dava Sobel, deep learning, DeepMind, double helix, Edmond Halley, Eratosthenes, four colour theorem, fudge factor, Henri Poincaré, invention of the telescope, Isaac Newton, Islamic Golden Age, Johannes Kepler, John Harrison: Longitude, Khan Academy, Laplace demon, lone genius, music of the spheres, pattern recognition, Paul Erdős, Pierre-Simon Laplace, precision agriculture, retrograde motion, Richard Feynman, Socratic dialogue, Steve Jobs, the rule of 72, the scientific method

They like to capture material. They defend like iron. But although they are far stronger than any human player, they are not creative or insightful. All that changed with the rise of machine learning. On December 5, 2017, the DeepMind team at Google stunned the chess world with its announcement of a deep-learning program called AlphaZero. The program taught itself chess by playing millions of games against itself and learning from its mistakes. In a matter of hours, it became the best chess player in history. Not only could it easily defeat all the best human masters (it didn’t even bother to try), it crushed the reigning computer world champion of chess.


pages: 309 words: 114,984

The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age by Robert Wachter

activist fund / activist shareholder / activist investor, Affordable Care Act / Obamacare, AI winter, Airbnb, Atul Gawande, Captain Sullenberger Hudson, Checklist Manifesto, Chuck Templeton: OpenTable:, Clayton Christensen, cognitive load, collapse of Lehman Brothers, computer age, creative destruction, crowdsourcing, deep learning, deskilling, disruptive innovation, driverless car, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, Firefox, Frank Levy and Richard Murnane: The New Division of Labor, general purpose technology, Google Glasses, human-factors engineering, hype cycle, Ignaz Semmelweis: hand washing, Internet of things, job satisfaction, Joseph Schumpeter, Kickstarter, knowledge worker, lifelogging, Marc Benioff, medical malpractice, medical residency, Menlo Park, minimum viable product, natural language processing, Network effects, Nicholas Carr, obamacare, pattern recognition, peer-to-peer, personalized medicine, pets.com, pneumatic tube, Productivity paradox, Ralph Nader, RAND corporation, Richard Hendricks, Robert Solow, Salesforce, Second Machine Age, self-driving car, seminal paper, Silicon Valley, Silicon Valley startup, six sigma, Skype, Snapchat, software as a service, Steve Jobs, Steven Levy, TED Talk, The future is already here, the payments system, The Wisdom of Crowds, Thomas Bayes, Toyota Production System, Uber for X, US Airways Flight 1549, Watson beat the top human players on Jeopardy!, Yogi Berra

Studies have shown that computers can detect significant numbers of breast cancers and pulmonary emboli missed by radiologists, although nobody has yet taken the bold step of having the computers completely supplant the humans, partly because there are armadas of malpractice attorneys waiting to pounce, and partly because, at least for now, the combination of human and machine seems to perform better than either alone. But over the long haul, I wouldn’t bet on the humans here, particularly since one of the hottest areas in artificial intelligence research is “deep learning”—research that has created computers that are reasonably skilled at “reading,” “hearing,” and, yes, “seeing.” The same kind of software that now allows Facebook to guess that a certain collection of pixels is a picture of you, or that alerts the casino’s security guards to keep an eye on that guy, is likely to eventually crack the code in radiology, and in similar areas such as dermatology and pathology.


pages: 424 words: 114,820

Neurodiversity at Work: Drive Innovation, Performance and Productivity With a Neurodiverse Workforce by Amanda Kirby, Theo Smith

affirmative action, Albert Einstein, autism spectrum disorder, Automated Insights, barriers to entry, Black Lives Matter, call centre, commoditize, conceptual framework, corporate social responsibility, COVID-19, deep learning, digital divide, double empathy problem, epigenetics, fear of failure, future of work, gamification, global pandemic, iterative process, job automation, lockdown, longitudinal study, meta-analysis, Minecraft, neurotypical, phenotype, remote work: asynchronous communication, remote working, seminal paper, the built environment, traumatic brain injury, work culture

Students expect to be taught and to learn using modern technology and methods and at a pace that they have chosen – not one that is mandated to them. They also demand an education that is tailored to their unique needs and acknowledges the diverse range of factors that need to be considered in nurturing deep learning in its cohorts.’ Gary describes one initiative he has been involved with. ‘Ravensbourne is a small specialist design and media university based in London that attracts a broad and diverse student cohort that largely reflects the ethnic diversity of the capital itself (around 44 per cent). But it also welcomes neurodivergent creatives who require specialist support and an inclusive pedagogy.


pages: 399 words: 118,576

Ageless: The New Science of Getting Older Without Getting Old by Andrew Steele

Alfred Russel Wallace, assortative mating, bioinformatics, caloric restriction, caloric restriction, clockwatching, coronavirus, correlation does not imply causation, COVID-19, CRISPR, dark matter, deep learning, discovery of penicillin, double helix, Easter island, epigenetics, Hans Rosling, Helicobacter pylori, life extension, lone genius, megastructure, meta-analysis, microbiome, mouse model, parabiotic, Peter Thiel, phenotype, precautionary principle, radical life extension, randomized controlled trial, Silicon Valley, stealth mode startup, stem cell, TED Talk, zero-sum game

Revisiting the ratio of bacterial to host cells in humans’, Cell 164, 337–40 (2016). DOI: 10.1016/j.cell.2016.01.013 ageless.link/9oeph4 … your intestines can come to be dominated … Buford, 2017 ageless.link/y49t3u … we have managed to build ‘microbial clocks’ … Fedor Galkin et al., ‘Human microbiome aging clocks based on deep learning and tandem of permutation feature importance and accumulated local effects’, bioRxiv (2018). DOI: 10.1101/507780 ageless.link/3wtnuz One study … without a microbiome … Marisa Stebegg et al., ‘Heterochronic faecal transplantation boosts gut germinal centres in aged mice’, Nat. Commun. 10, 2443 (2019).


pages: 409 words: 112,055

The Fifth Domain: Defending Our Country, Our Companies, and Ourselves in the Age of Cyber Threats by Richard A. Clarke, Robert K. Knake

"World Economic Forum" Davos, A Declaration of the Independence of Cyberspace, Affordable Care Act / Obamacare, air gap, Airbnb, Albert Einstein, Amazon Web Services, autonomous vehicles, barriers to entry, bitcoin, Black Lives Matter, Black Swan, blockchain, Boeing 737 MAX, borderless world, Boston Dynamics, business cycle, business intelligence, call centre, Cass Sunstein, cloud computing, cognitive bias, commoditize, computer vision, corporate governance, cryptocurrency, data acquisition, data science, deep learning, DevOps, disinformation, don't be evil, Donald Trump, Dr. Strangelove, driverless car, Edward Snowden, Exxon Valdez, false flag, geopolitical risk, global village, immigration reform, information security, Infrastructure as a Service, Internet of things, Jeff Bezos, John Perry Barlow, Julian Assange, Kubernetes, machine readable, Marc Benioff, Mark Zuckerberg, Metcalfe’s law, MITM: man-in-the-middle, Morris worm, move fast and break things, Network effects, open borders, platform as a service, Ponzi scheme, quantum cryptography, ransomware, Richard Thaler, Salesforce, Sand Hill Road, Schrödinger's Cat, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, Skype, smart cities, Snapchat, software as a service, Steven Levy, Stuxnet, technoutopianism, The future is already here, Tim Cook: Apple, undersea cable, unit 8200, WikiLeaks, Y2K, zero day

ANNs can adjust their own “wiring” and weighting based upon patterns in data, in order to improve themselves and their predictive abilities. For instance, an ANN that classifies pictures of cats or dogs will, over time, learn the distinguishing factors of the two types of animals and adjust its “wiring” so that its cat or dog predictions are more accurate. The second type of ML you will also hear about is deep learning, which is in turn a type of ANN that uses multiple layers of “neurons” to analyze data, allowing it to perform very complex analysis. Enough with the definitions. What can AI do defensively or offensively in security and warfare? Artificially Intelligent About Security? AI/ML for the Defense On a large corporate network today, there are between three and six dozen separate cybersecurity software applications in use, each contributing to the security of the overall network in a specific capacity.
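The two ideas in this passage, weights that adjust themselves from data and "deep" meaning several stacked layers of neurons, fit in a short toy sketch. The code below is my own illustration (not from the book): a two-layer network that learns XOR by repeatedly nudging its weights to shrink the prediction error.

```python
# Toy two-layer neural network sketch (illustrative only): stacked layers of
# "neurons" whose weights are adjusted from data, here learning XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights: 2 inputs -> 8 hidden neurons -> 1 output neuron.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1)); b2 = np.zeros((1, 1))

lr = 0.5
for _ in range(10000):
    # Forward pass: each layer is a weighted sum followed by a nonlinearity.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: push the error back through the layers and nudge weights.
    err_out = (out - y) * out * (1 - out)
    err_h = (err_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ err_h
    b1 -= lr * err_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should end up close to [[0], [1], [1], [0]]
```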


pages: 444 words: 118,393

The Nature of Software Development: Keep It Simple, Make It Valuable, Build It Piece by Piece by Ron Jeffries

Amazon Web Services, anti-pattern, bitcoin, business cycle, business intelligence, business logic, business process, c2.com, call centre, cloud computing, continuous integration, Conway's law, creative destruction, dark matter, data science, database schema, deep learning, DevOps, disinformation, duck typing, en.wikipedia.org, fail fast, fault tolerance, Firefox, Hacker News, industrial robot, information security, Infrastructure as a Service, Internet of things, Jeff Bezos, Kanban, Kubernetes, load shedding, loose coupling, machine readable, Mars Rover, microservices, Minecraft, minimum viable product, MITM: man-in-the-middle, Morris worm, move fast and break things, OSI model, peer-to-peer lending, platform as a service, power law, ransomware, revision control, Ruby on Rails, Schrödinger's Cat, Silicon Valley, six sigma, software is eating the world, source of truth, SQL injection, systems thinking, text mining, time value of money, transaction costs, Turing machine, two-pizza team, web application, zero day

Can there be any bigger waste of system resources than burning cycles and clock time only to throw away the result? If the system can determine in advance that it will fail at an operation, it’s always better to fail fast. That way, the caller doesn’t have to tie up any of its capacity waiting and can get on with other work. How can the system tell whether it will fail? Do we need Deep Learning? Don’t worry, you won’t need to hire a cadre of data scientists. It’s actually much more mundane than that. There’s a large class of “resource unavailable” failures. For example, when a load balancer gets a connection request but not one of the servers in its service pool is functioning, it should immediately refuse the connection.
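The load-balancer behaviour described here, refusing at once when no backend could possibly serve the request, takes only a few lines to sketch. The snippet below is an illustrative toy (class and attribute names are my own, not the book's): a cheap precondition check that fails fast instead of making the caller wait on work that is doomed.

```python
# Illustrative fail-fast sketch (names are hypothetical, not from the book):
# check cheap preconditions up front and refuse work that cannot succeed.
class NoHealthyBackendError(Exception):
    """Raised immediately when no server in the pool can take the request."""

class LoadBalancer:
    def __init__(self, backends):
        # Each backend is assumed to expose .healthy and .handle(request).
        self.backends = backends

    def dispatch(self, request):
        healthy = [b for b in self.backends if b.healthy]
        if not healthy:
            # Fail fast: don't accept the connection or tie up the caller's
            # capacity waiting on an operation that is certain to fail.
            raise NoHealthyBackendError("no servers available in pool")
        # Real balancers would pick by round-robin, least load, and so on.
        return healthy[0].handle(request)
```

The point is not the selection policy but the early refusal: the caller learns immediately that the resource is unavailable and can get on with other work.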


Entangled Life: How Fungi Make Our Worlds, Change Our Minds & Shape Our Futures by Merlin Sheldrake

Anthropocene, Apollo 11, biofilm, buy low sell high, carbon footprint, CRISPR, crowdsourcing, cuban missile crisis, dark matter, deep learning, discovery of penicillin, Easter island, experimental subject, Fellow of the Royal Society, Isaac Newton, Kickstarter, late capitalism, low earth orbit, Mason jar, meta-analysis, microbiome, moral panic, NP-complete, phenotype, randomized controlled trial, Ronald Reagan, seminal paper, TED Talk, the built environment, Thomas Bayes, Thomas Malthus, traveling salesman, two and twenty

On the nutritional dependence of certain trees on root symbiosis with belowground fungi (an English translation of A. B. Frank’s classic paper of 1885). Mycorrhiza 15: 267–75. Fredericksen MA, Zhang Y, Hazen ML, Loreto RG, Mangold CA, Chen DZ, Hughes DP. 2017. Three-dimensional visualization and a deep-learning model reveal complex fungal parasite networks in behaviorally manipulated ants. Proceedings of the National Academy of Sciences 114: 12590–595. Fricker MD, Boddy L, Bebber DP. 2007a. “Network Organisation of Mycelial Fungi.” In Biology of the Fungal Cell. Howard RJ, Gow NAR, eds. Springer International Publishing, pp. 309–30.


pages: 416 words: 129,308

The One Device: The Secret History of the iPhone by Brian Merchant

Airbnb, animal electricity, Apollo Guidance Computer, Apple II, Apple's 1984 Super Bowl advert, Black Lives Matter, Charles Babbage, citizen journalism, Citizen Lab, Claude Shannon: information theory, computer vision, Computing Machinery and Intelligence, conceptual framework, cotton gin, deep learning, DeepMind, Douglas Engelbart, Dynabook, Edward Snowden, Elon Musk, Ford paid five dollars a day, Frank Gehry, gigafactory, global supply chain, Google Earth, Google Hangouts, Higgs boson, Huaqiangbei: the electronics market of Shenzhen, China, information security, Internet of things, Jacquard loom, John Gruber, John Markoff, Jony Ive, Large Hadron Collider, Lyft, M-Pesa, MITM: man-in-the-middle, more computing power than Apollo, Mother of all demos, natural language processing, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shock, pattern recognition, peak oil, pirate software, profit motive, QWERTY keyboard, reality distortion field, ride hailing / ride sharing, rolodex, Shenzhen special economic zone , Silicon Valley, Silicon Valley startup, skeuomorphism, skunkworks, Skype, Snapchat, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, TED Talk, Tim Cook: Apple, Tony Fadell, TSMC, Turing test, uber lyft, Upton Sinclair, Vannevar Bush, zero day

When Gruber says knowledge, I think he means a firm, robust grasp on how the world works and how to reason. Today, researchers are less interested in developing AI’s ability to reason and more intent on having them do more and more complex machine learning, which is not unlike automated data mining. You might have heard the term deep learning. Projects like Google’s DeepMind neural network work essentially by hoovering up as much data as possible, then getting better and better at simulating desired outcomes. By processing immense amounts of data about, say, Van Gogh’s paintings, a system like this can be instructed to create a Van Gogh painting—and it will spit out a painting that looks kinda-sorta like a Van Gogh.


pages: 527 words: 147,690

Terms of Service: Social Media and the Price of Constant Connection by Jacob Silverman

"World Economic Forum" Davos, 23andMe, 4chan, A Declaration of the Independence of Cyberspace, Aaron Swartz, Airbnb, airport security, Amazon Mechanical Turk, augmented reality, basic income, Big Tech, Brian Krebs, California gold rush, Californian Ideology, call centre, cloud computing, cognitive dissonance, commoditize, company town, context collapse, correlation does not imply causation, Credit Default Swap, crowdsourcing, data science, deep learning, digital capitalism, disinformation, don't be evil, driverless car, drone strike, Edward Snowden, Evgeny Morozov, fake it until you make it, feminist movement, Filter Bubble, Firefox, Flash crash, game design, global village, Google Chrome, Google Glasses, Higgs boson, hive mind, Ian Bogost, income inequality, independent contractor, informal economy, information retrieval, Internet of things, Jacob Silverman, Jaron Lanier, jimmy wales, John Perry Barlow, Kevin Kelly, Kevin Roose, Kickstarter, knowledge economy, knowledge worker, Larry Ellison, late capitalism, Laura Poitras, license plate recognition, life extension, lifelogging, lock screen, Lyft, machine readable, Mark Zuckerberg, Mars Rover, Marshall McLuhan, mass incarceration, meta-analysis, Minecraft, move fast and break things, national security letter, Network effects, new economy, Nicholas Carr, Occupy movement, off-the-grid, optical character recognition, payday loans, Peter Thiel, planned obsolescence, postindustrial economy, prediction markets, pre–internet, price discrimination, price stability, profit motive, quantitative hedge fund, race to the bottom, Ray Kurzweil, real-name policy, recommendation engine, rent control, rent stabilization, RFID, ride hailing / ride sharing, Salesforce, self-driving car, sentiment analysis, shareholder value, sharing economy, Sheryl Sandberg, Silicon Valley, Silicon Valley ideology, Snapchat, social bookmarking, social graph, social intelligence, social web, sorting algorithm, Steve Ballmer, Steve Jobs, Steven Levy, systems thinking, TaskRabbit, technological determinism, technological solutionism, technoutopianism, TED Talk, telemarketer, transportation-network company, Travis Kalanick, Turing test, Uber and Lyft, Uber for X, uber lyft, universal basic income, unpaid internship, women in the workforce, Y Combinator, yottabyte, you are the product, Zipcar

In addition to fake accounts, people also post things that are intentionally insincere and misleading, including in their profiles, which further complicates the effort to divide people into the kinds of highly specific categories (e.g., single dads from major cities who don’t belong to gyms) that market researchers like. Of course, these analytical tools are getting better, incorporating the latest discoveries in computational linguistics and deep learning, a form of artificial intelligence in which computers are taught to understand colloquial speech and recognize objects (such as people’s faces). Some sentiment analysis software now applies several different filters to each piece of text in order to consider not only the tone and meaning of the utterance but also whether the source is reliable or somehow biased.
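To make the "several different filters" idea tangible, here is a deliberately crude sketch (not the vendors' actual software): one filter scores tone from keyword lists, a second discounts the score when the source looks unreliable or biased, and the two are combined into a single adjusted sentiment value. The word lists and source labels are invented.

POSITIVE = {"love", "great", "amazing"}
NEGATIVE = {"hate", "awful", "terrible"}
SUSPECT_SOURCES = {"bot_network", "brand_account"}   # hypothetical reliability labels

def tone_filter(text):
    # Filter 1: a naive tone score from keyword counts.
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def reliability_filter(source):
    # Filter 2: discount sources that look automated or self-interested.
    return 0.3 if source in SUSPECT_SOURCES else 1.0

def adjusted_sentiment(text, source):
    return tone_filter(text) * reliability_filter(source)

print(adjusted_sentiment("I love this amazing phone", "bot_network"))    # 0.6
print(adjusted_sentiment("I love this amazing phone", "verified_user"))  # 2.0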


pages: 589 words: 147,053

The Age of Em: Work, Love and Life When Robots Rule the Earth by Robin Hanson

8-hour work day, artificial general intelligence, augmented reality, Berlin Wall, bitcoin, blockchain, brain emulation, business cycle, business process, Clayton Christensen, cloud computing, correlation does not imply causation, creative destruction, deep learning, demographic transition, Erik Brynjolfsson, Ethereum, ethereum blockchain, experimental subject, fault tolerance, financial intermediation, Flynn Effect, Future Shock, Herman Kahn, hindsight bias, information asymmetry, job automation, job satisfaction, John Markoff, Just-in-time delivery, lone genius, Machinery of Freedom by David Friedman, market design, megaproject, meta-analysis, Nash equilibrium, new economy, Nick Bostrom, pneumatic tube, power law, prediction markets, quantum cryptography, rent control, rent-seeking, reversible computing, risk tolerance, Silicon Valley, smart contracts, social distancing, statistical model, stem cell, Thomas Malthus, trade route, Turing test, Tyler Cowen, Vernor Vinge, William MacAskill

Many of these people expect traditional artificial intelligence, that is, hand-coded software, to achieve broad human level abilities before brain emulations appear. I think that past rates of progress in coding smart software suggest that at previous rates it will take two to four centuries to achieve broad human level abilities via this route. These critics often point to exciting recent developments, such as advances in “deep learning,” that they think make prior trends irrelevant. More generally, some critics fault me for insufficiently crediting new trends that they expect will soon revolutionize society, even if we don’t yet see strong supporting evidence of these trends. Such revolutions include robots taking most jobs, local sourcing replacing mass production, small firms replacing big ones, worker cooperatives replacing for-profits, ability tests replacing school degrees, and 3D printers replacing manufacturing plants.


pages: 486 words: 150,849

Evil Geniuses: The Unmaking of America: A Recent History by Kurt Andersen

"Friedman doctrine" OR "shareholder theory", "World Economic Forum" Davos, affirmative action, Affordable Care Act / Obamacare, air traffic controllers' union, airline deregulation, airport security, Alan Greenspan, always be closing, American ideology, American Legislative Exchange Council, An Inconvenient Truth, anti-communist, Apple's 1984 Super Bowl advert, artificial general intelligence, autonomous vehicles, basic income, Bear Stearns, Bernie Sanders, blue-collar work, Bonfire of the Vanities, bonus culture, Burning Man, call centre, Capital in the Twenty-First Century by Thomas Piketty, carbon tax, Cass Sunstein, centre right, computer age, contact tracing, coronavirus, corporate governance, corporate raider, cotton gin, COVID-19, creative destruction, Credit Default Swap, cryptocurrency, deep learning, DeepMind, deindustrialization, Donald Trump, Dr. Strangelove, Elon Musk, ending welfare as we know it, Erik Brynjolfsson, feminist movement, financial deregulation, financial innovation, Francis Fukuyama: the end of history, future of work, Future Shock, game design, General Motors Futurama, George Floyd, George Gilder, Gordon Gekko, greed is good, Herbert Marcuse, Herman Kahn, High speed trading, hive mind, income inequality, industrial robot, interchangeable parts, invisible hand, Isaac Newton, It's morning again in America, James Watt: steam engine, Jane Jacobs, Jaron Lanier, Jeff Bezos, jitney, Joan Didion, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, junk bonds, Kevin Roose, knowledge worker, lockdown, low skilled workers, Lyft, Mark Zuckerberg, market bubble, mass immigration, mass incarceration, Menlo Park, Naomi Klein, new economy, Norbert Wiener, Norman Mailer, obamacare, Overton Window, Peter Thiel, Picturephone, plutocrats, post-industrial society, Powell Memorandum, pre–internet, public intellectual, Ralph Nader, Right to Buy, road to serfdom, Robert Bork, Robert Gordon, Robert Mercer, Ronald Reagan, Saturday Night Live, Seaside, Florida, Second Machine Age, shareholder value, Silicon Valley, social distancing, Social Responsibility of Business Is to Increase Its Profits, Steve Jobs, Stewart Brand, stock buybacks, strikebreaker, tech billionaire, The Death and Life of Great American Cities, The Future of Employment, The Rise and Fall of American Growth, The Wealth of Nations by Adam Smith, Tim Cook: Apple, too big to fail, trickle-down economics, Tyler Cowen, Tyler Cowen: Great Stagnation, Uber and Lyft, uber lyft, union organizing, universal basic income, Unsafe at Any Speed, urban planning, urban renewal, very high income, wage slave, Wall-E, War on Poverty, We are all Keynesians now, Whole Earth Catalog, winner-take-all economy, women in the workforce, working poor, young professional, éminence grise

” There’s a whole new subdiscipline of technologists and economists predicting and debating what work can or can’t be automated partly or entirely and, depending on the cost, what jobs will or won’t be done mainly by smart machines by what year in the twenty-first century. Even discounting for digital enthusiasts’ habitual overoptimism, the recent rate of progress in AI and robotics has been astounding. The exponential growth of digital data and cheapening of computer power reached a point in the last decade that allowed so-called deep learning on so-called neural networks—extremely smart machines—to achieve remarkable technical feat after remarkable technical feat. A common task in creating AI software, for instance, is training a system to recognize and classify millions of images. In the fall of 2017 that task typically took engineers three hours to do, but by the summer of 2019 it took only eighty-eight seconds and thus cost 99 percent less.


The New Map: Energy, Climate, and the Clash of Nations by Daniel Yergin

"RICO laws" OR "Racketeer Influenced and Corrupt Organizations", 3D printing, 9 dash line, activist fund / activist shareholder / activist investor, addicted to oil, Admiral Zheng, Albert Einstein, American energy revolution, Asian financial crisis, autonomous vehicles, Ayatollah Khomeini, Bakken shale, Bernie Sanders, BRICs, British Empire, carbon tax, circular economy, clean tech, commodity super cycle, company town, coronavirus, COVID-19, decarbonisation, deep learning, Deng Xiaoping, Didi Chuxing, disruptive innovation, distributed generation, Donald Trump, driverless car, Edward Snowden, Elon Musk, energy security, energy transition, failed state, Ford Model T, geopolitical risk, gig economy, global pandemic, global supply chain, green new deal, Greta Thunberg, hydraulic fracturing, Indoor air pollution, Intergovernmental Panel on Climate Change (IPCC), inventory management, James Watt: steam engine, John Zimmer (Lyft cofounder), Kickstarter, LNG terminal, Lyft, Malacca Straits, Malcom McLean invented shipping containers, Masayoshi Son, Masdar, mass incarceration, megacity, megaproject, middle-income trap, Mikhail Gorbachev, mutually assured destruction, new economy, off grid, oil rush, oil shale / tar sands, oil shock, open economy, paypal mafia, peak oil, pension reform, power law, price mechanism, purchasing power parity, RAND corporation, rent-seeking, ride hailing / ride sharing, rolling blackouts, Ronald Reagan, Russian election interference, self-driving car, Silicon Valley, smart cities, social distancing, South China Sea, sovereign wealth fund, Suez crisis 1956, super pumped, supply-chain management, TED Talk, trade route, Travis Kalanick, Twitter Arab Spring, Uber and Lyft, uber lyft, ubercab, UNCLOS, UNCLOS, uranium enrichment, vertical integration, women in the workforce

Department of Transportation, Assuring America’s Leadership in Automated Vehicles Technologies: Automated Vehicles 4.0 (Washington D.C., 2020); Rebecca Yergin, “NHTSA Continues to Ramp Up Exploration of Automated Driving Technologies,” Covington & Burling, Blog, April 2020. 13. Marco della Cava, “Garage Startup Uses Deep Learning to Teach Cars to Drive,” USA Today, August 30, 2016. Chapter 39: Hailing the Future 1. Interview with Garrett Camp; “UberCab” pitch deck, December 2008. 2. Adam Lashinsky, Wild Ride: Inside Uber’s Quest for World Domination (New York: Portfolio/Penguin, 2017), pp. 80–81, 91. 3.


Spies, Lies, and Algorithms by Amy B. Zegart

2021 United States Capitol attack, 4chan, active measures, air gap, airport security, Apollo 13, Bellingcat, Bernie Sanders, Bletchley Park, Chelsea Manning, classic study, cloud computing, cognitive bias, commoditize, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, cryptocurrency, cuban missile crisis, Daniel Kahneman / Amos Tversky, deep learning, deepfake, DeepMind, disinformation, Donald Trump, drone strike, dual-use technology, Edward Snowden, Elon Musk, en.wikipedia.org, end-to-end encryption, failed state, feminist movement, framing effect, fundamental attribution error, Gene Kranz, global pandemic, global supply chain, Google Earth, index card, information asymmetry, information security, Internet of things, job automation, John Markoff, lockdown, Lyft, Mark Zuckerberg, Nate Silver, Network effects, off-the-grid, openstreetmap, operational security, Parler "social media", post-truth, power law, principal–agent problem, QAnon, RAND corporation, Richard Feynman, risk tolerance, Robert Hanssen: Double agent, Ronald Reagan, Rubik’s Cube, Russian election interference, Saturday Night Live, selection bias, seminal paper, Seymour Hersh, Silicon Valley, Steve Jobs, Stuxnet, synthetic biology, uber lyft, unit 8200, uranium enrichment, WikiLeaks, zero day, zero-sum game

For nuclear threat intelligence, machine learning techniques offer particular promise in analyzing satellite imagery of known missile sites or facilities to detect changes over time.53 In 2017, for example, U.S. intelligence officials from the National Geospatial-Intelligence Agency (NGA) asked researchers at the University of Missouri to develop machine learning tools to see how fast and accurately they could identify surface-to-air missile sites over a huge area in Southwest China. The research team developed a deep learning neural network (essentially, a collection of algorithms working together) and used only commercially available satellite imagery with one-meter resolution. Both the computer and the human team correctly identified 90 percent of the missile sites. But the computer completed the job eighty times faster than humans, taking just forty-two minutes to scan an area of approximately ninety thousand square kilometers (about three-fourths the size of North Korea).54 As noted in chapter 5, machine learning also holds promise for faster sifting of large quantities of written information—everything from trade documents that might suggest illicit financing schemes to the metadata of photos online—such as the date and time stamp on the picture, the type of camera used, the software that processed the image, and where the camera was placed when the picture was taken.55 In addition, computer modeling is enabling analysts to better understand the specifications and functions of structures already built.
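The NGA/University of Missouri system itself is not public, so the sketch below only illustrates the general shape of the task described: cut a large scene into tiles, train a small multi-layer classifier, and flag suspicious tiles for human review. It uses synthetic random data, and scikit-learn's MLPClassifier stands in for the actual deep learning network purely for illustration.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in data: 16x16 "imagery" tiles flattened to vectors,
# label 1 = site present, 0 = background (all values are random placeholders).
X_train = rng.random((500, 16 * 16))
y_train = rng.integers(0, 2, size=500)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

# Sweep a large scene tile by tile and keep only the flagged tiles for analysts;
# automating this sweep is where the speed advantage over manual search comes from.
scene_tiles = rng.random((10_000, 16 * 16))
flagged = np.flatnonzero(clf.predict(scene_tiles) == 1)
print(f"{len(flagged)} of {len(scene_tiles)} tiles flagged for human review")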


Alpha Trader by Brent Donnelly

Abraham Wald, algorithmic trading, Asian financial crisis, Atul Gawande, autonomous vehicles, backtesting, barriers to entry, beat the dealer, behavioural economics, bitcoin, Boeing 747, buy low sell high, Checklist Manifesto, commodity trading advisor, coronavirus, correlation does not imply causation, COVID-19, crowdsourcing, cryptocurrency, currency manipulation / currency intervention, currency risk, deep learning, diversification, Edward Thorp, Elliott wave, Elon Musk, endowment effect, eurozone crisis, fail fast, financial engineering, fixed income, Flash crash, full employment, global macro, global pandemic, Gordon Gekko, hedonic treadmill, helicopter parent, high net worth, hindsight bias, implied volatility, impulse control, Inbox Zero, index fund, inflation targeting, information asymmetry, invisible hand, iterative process, junk bonds, Kaizen: continuous improvement, law of one price, loss aversion, low interest rates, margin call, market bubble, market microstructure, Market Wizards by Jack D. Schwager, McMansion, Monty Hall problem, Network effects, nowcasting, PalmPilot, paper trading, pattern recognition, Peter Thiel, prediction markets, price anchoring, price discovery process, price stability, quantitative easing, quantitative trading / quantitative finance, random walk, Reminiscences of a Stock Operator, reserve currency, risk tolerance, Robert Shiller, secular stagnation, Sharpe ratio, short selling, side project, Stanford marshmallow experiment, Stanford prison experiment, survivorship bias, tail risk, TED Talk, the scientific method, The Wisdom of Crowds, theory of mind, time dilation, too big to fail, transaction costs, value at risk, very high income, yield curve, you are the product, zero-sum game

PART THREE METHODOLOGY AND MATHEMATICS “Nothing in the world is worth having or worth doing unless it means effort, pain, difficulty… I have never in my life envied a human being who led an easy life.” THEODORE ROOSEVELT In Part Three I will outline the specific steps you must take to master your market. Your goal must be to become an expert in the product or markets that you trade. Stay narrow (focus on one or two markets) and go deep (learn as much as possible, right down to the nitty-grittiest details). This approach gives you the best shot at a sustainable edge. Wide and shallow can work, but has a much lower probability of success than narrow and deep. Here is the abstract of an interesting study of Twitter sentiment that supports the idea that narrow focus yields higher returns99.


pages: 1,239 words: 163,625

The Joys of Compounding: The Passionate Pursuit of Lifelong Learning, Revised and Updated by Gautam Baid

Abraham Maslow, activist fund / activist shareholder / activist investor, Airbnb, Alan Greenspan, Albert Einstein, Alvin Toffler, Andrei Shleifer, asset allocation, Atul Gawande, availability heuristic, backtesting, barriers to entry, beat the dealer, Benoit Mandelbrot, Bernie Madoff, bitcoin, Black Swan, book value, business process, buy and hold, Cal Newport, Cass Sunstein, Checklist Manifesto, Clayton Christensen, cognitive dissonance, collapse of Lehman Brothers, commoditize, corporate governance, correlation does not imply causation, creative destruction, cryptocurrency, Daniel Kahneman / Amos Tversky, deep learning, delayed gratification, deliberate practice, discounted cash flows, disintermediation, disruptive innovation, Dissolution of the Soviet Union, diversification, diversified portfolio, dividend-yielding stocks, do what you love, Dunning–Kruger effect, Edward Thorp, Elon Musk, equity risk premium, Everything should be made as simple as possible, fear index, financial independence, financial innovation, fixed income, follow your passion, framing effect, George Santayana, Hans Rosling, hedonic treadmill, Henry Singleton, hindsight bias, Hyman Minsky, index fund, intangible asset, invention of the wheel, invisible hand, Isaac Newton, it is difficult to get a man to understand something, when his salary depends on his not understanding it, Jeff Bezos, John Bogle, Joseph Schumpeter, junk bonds, Kaizen: continuous improvement, Kickstarter, knowledge economy, Lao Tzu, Long Term Capital Management, loss aversion, Louis Pasteur, low interest rates, Mahatma Gandhi, mandelbrot fractal, margin call, Mark Zuckerberg, Market Wizards by Jack D. Schwager, Masayoshi Son, mental accounting, Milgram experiment, moral hazard, Nate Silver, Network effects, Nicholas Carr, offshore financial centre, oil shock, passive income, passive investing, pattern recognition, Peter Thiel, Ponzi scheme, power law, price anchoring, quantitative trading / quantitative finance, Ralph Waldo Emerson, Ray Kurzweil, Reminiscences of a Stock Operator, reserve currency, Richard Feynman, Richard Thaler, risk free rate, risk-adjusted returns, Robert Shiller, Savings and loan crisis, search costs, shareholder value, six sigma, software as a service, software is eating the world, South Sea Bubble, special economic zone, Stanford marshmallow experiment, Steve Jobs, Steven Levy, Steven Pinker, stocks for the long run, subscription business, sunk-cost fallacy, systems thinking, tail risk, Teledyne, the market place, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, time value of money, transaction costs, tulip mania, Upton Sinclair, Walter Mischel, wealth creators, Yogi Berra, zero-sum game

Identify the core ideas and learn them deeply. This deeply ingrained knowledge base can serve as a meaningful springboard for more advanced learning and action in your field. Be brutally honest with yourself. If you do not understand something, revisit the core concepts again and again. Remember that merely memorizing stuff is not deep learning. 2. Make mistakes. Mistakes highlight unforeseen opportunities as well as gaps in our understanding. And mistakes are great teachers. As Michael Jordan once said, “I’ve missed more than nine thousand shots in my career. I’ve lost almost three hundred games. Twenty-six times, I’ve been trusted to take the game-winning shot and missed.


Blueprint: The Evolutionary Origins of a Good Society by Nicholas A. Christakis

Abraham Maslow, agricultural Revolution, Alfred Russel Wallace, AlphaGo, Amazon Mechanical Turk, assortative mating, autism spectrum disorder, Cass Sunstein, classic study, CRISPR, crowdsourcing, data science, David Attenborough, deep learning, different worldview, disruptive innovation, domesticated silver fox, double helix, driverless car, Easter island, epigenetics, experimental economics, experimental subject, Garrett Hardin, intentional community, invention of agriculture, invention of gunpowder, invention of writing, iterative process, job satisfaction, Joi Ito, joint-stock company, land tenure, language acquisition, Laplace demon, longitudinal study, Mahatma Gandhi, Marc Andreessen, means of production, mental accounting, meta-analysis, microbiome, out of africa, overview effect, phenotype, Philippa Foot, Pierre-Simon Laplace, placebo effect, race to the bottom, Ralph Waldo Emerson, replication crisis, Rubik’s Cube, Silicon Valley, Skinner box, social intelligence, social web, stem cell, Steven Pinker, the scientific method, theory of mind, Tragedy of the Commons, twin studies, ultimatum game, zero-sum game

,” Behavioral Ecology 16 (2005): 656–660. 26. S. A. Adamo, “The Strings of the Puppet Master: How Parasites Change Host Behavior,” in D. P. Hughes, J. Brodeur, and F. Thomas, eds., Host Manipulation by Parasites (Oxford: Oxford University Press, 2012), pp. 36–53. 27. M. A. Fredericksen et al., “Three-Dimensional Visualization and a Deep-Learning Model Reveal Complex Fungal Parasite Networks in Behaviorally Manipulated Ants,” PNAS: Proceedings of the National Academy of Sciences 114 (2017): 12590–12595. 28. D. P. Hughes, T. Wappler, and C. C. Labandeira, “Ancient Death-Grip Leaf Scars Reveal Ant-Fungal Parasitism,” Biology Letters 7 (2011): 67–70. 29.


pages: 618 words: 180,430

The Making of Modern Britain by Andrew Marr

anti-communist, antiwork, Arthur Marwick, banking crisis, Bernie Madoff, Bletchley Park, British Empire, business climate, Corn Laws, deep learning, Etonian, garden city movement, guns versus butter model, illegal immigration, imperial preference, Kickstarter, lateral thinking, New Journalism, New Urbanism, plutocrats, public intellectual, Red Clydeside, rent control, strikebreaker, trade liberalization, V2 rocket, wage slave, women in the workforce

Not refreshed enough, however, for in 1908 he became ill, resigned and died three weeks later in Downing Street – the only prime minister to do so. Asquith’s succession meant a steelier, tougher figure at the top, and a much harder time ahead for the monarch. Known to admiring cartoonists as ‘the Last of the Romans’ and later, to colleagues alarmed by his convivial habits, as ‘old Squiffy’, his deep learning awed other MPs just as much as his private life intrigued them. He had been happily married to Helen, a quiet, gentle woman who had produced five children; but in the same year he met Margot Tennant at a Commons dinner party, Helen caught typhoid and died on holiday in Scotland. Asquith was by then a rising political star and successful barrister, but Margot was slow to yield.


pages: 685 words: 203,949

The Organized Mind: Thinking Straight in the Age of Information Overload by Daniel J. Levitin

Abraham Maslow, airport security, Albert Einstein, Amazon Mechanical Turk, Anton Chekhov, autism spectrum disorder, Bayesian statistics, behavioural economics, big-box store, business process, call centre, Claude Shannon: information theory, cloud computing, cognitive bias, cognitive load, complexity theory, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, data science, deep learning, delayed gratification, Donald Trump, en.wikipedia.org, epigenetics, Eratosthenes, Exxon Valdez, framing effect, friendly fire, fundamental attribution error, Golden Gate Park, Google Glasses, GPS: selective availability, haute cuisine, How many piano tuners are there in Chicago?, human-factors engineering, if you see hoof prints, think horses—not zebras, impulse control, index card, indoor plumbing, information retrieval, information security, invention of writing, iterative process, jimmy wales, job satisfaction, Kickstarter, language acquisition, Lewis Mumford, life extension, longitudinal study, meta-analysis, more computing power than Apollo, Network effects, new economy, Nicholas Carr, optical character recognition, Pareto efficiency, pattern recognition, phenotype, placebo effect, pre–internet, profit motive, randomized controlled trial, Rubik’s Cube, Salesforce, shared worldview, Sheryl Sandberg, Skype, Snapchat, social intelligence, statistical model, Steve Jobs, supply-chain management, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, traumatic brain injury, Turing test, Twitter Arab Spring, ultimatum game, Wayback Machine, zero-sum game

For example, in English, silverfish is an insect, not a type of fish; prairie dog is a rodent, not a dog; and a toadstool is neither a toad nor a stool that a toad might sit on. Our hunger for knowledge can be at the roots of our failings or our successes. It can distract us or it can keep us engaged in a lifelong quest for deep learning and understanding. Some learning enhances our lives, some is irrelevant and simply distracts us—tabloid stories probably fall into this latter category (unless your profession is as a tabloid writer). Successful people are expert at categorizing useful versus distracting knowledge. How do they do it?


pages: 677 words: 206,548

Future Crimes: Everything Is Connected, Everyone Is Vulnerable and What We Can Do About It by Marc Goodman

23andMe, 3D printing, active measures, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, airport security, Albert Einstein, algorithmic trading, Alvin Toffler, Apollo 11, Apollo 13, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Joy: nanobots, bitcoin, Black Swan, blockchain, borderless world, Boston Dynamics, Brian Krebs, business process, butterfly effect, call centre, Charles Lindbergh, Chelsea Manning, Citizen Lab, cloud computing, Cody Wilson, cognitive dissonance, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, data acquisition, data is the new oil, data science, Dean Kamen, deep learning, DeepMind, digital rights, disinformation, disintermediation, Dogecoin, don't be evil, double helix, Downton Abbey, driverless car, drone strike, Edward Snowden, Elon Musk, Erik Brynjolfsson, Evgeny Morozov, Filter Bubble, Firefox, Flash crash, Free Software Foundation, future of work, game design, gamification, global pandemic, Google Chrome, Google Earth, Google Glasses, Gordon Gekko, Hacker News, high net worth, High speed trading, hive mind, Howard Rheingold, hypertext link, illegal immigration, impulse control, industrial robot, information security, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jaron Lanier, Jeff Bezos, job automation, John Harrison: Longitude, John Markoff, Joi Ito, Jony Ive, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, Kiva Systems, knowledge worker, Kuwabatake Sanjuro: assassination market, Large Hadron Collider, Larry Ellison, Laura Poitras, Law of Accelerating Returns, Lean Startup, license plate recognition, lifelogging, litecoin, low earth orbit, M-Pesa, machine translation, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Metcalfe’s law, MITM: man-in-the-middle, mobile money, more computing power than Apollo, move fast and break things, Nate Silver, national security letter, natural language processing, Nick Bostrom, obamacare, Occupy movement, Oculus Rift, off grid, off-the-grid, offshore financial centre, operational security, optical character recognition, Parag Khanna, pattern recognition, peer-to-peer, personalized medicine, Peter H. Diamandis: Planetary Resources, Peter Thiel, pre–internet, printed gun, RAND corporation, ransomware, Ray Kurzweil, Recombinant DNA, refrigerator car, RFID, ride hailing / ride sharing, Rodney Brooks, Ross Ulbricht, Russell Brand, Salesforce, Satoshi Nakamoto, Second Machine Age, security theater, self-driving car, shareholder value, Sheryl Sandberg, Silicon Valley, Silicon Valley startup, SimCity, Skype, smart cities, smart grid, smart meter, Snapchat, social graph, SoftBank, software as a service, speech recognition, stealth mode startup, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, subscription business, supply-chain management, synthetic biology, tech worker, technological singularity, TED Talk, telepresence, telepresence robot, Tesla Model S, The future is already here, The Future of Employment, the long tail, The Wisdom of Crowds, Tim Cook: Apple, trade route, uranium enrichment, Virgin Galactic, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, Wave and Pay, We are Anonymous. We are Legion, web application, Westphalian system, WikiLeaks, Y Combinator, you are the product, zero day

Characteristics of a strong AI would include the ability to reason, make judgments, plan, learn, communicate, and unify these skills toward achieving common goals across a variety of domains, and commercial interest is growing. In 2014, Google purchased DeepMind Technologies for more than $500 million in order to strengthen its already strong capabilities in deep learning AI. In the same vein, Facebook created a new internal division specifically focused on advanced AI. Optimists believe that the arrival of AGI may bring with it a period of unprecedented abundance in human history, eradicating war, curing all disease, radically extending human life, and ending poverty.


pages: 562 words: 201,502

Elon Musk by Walter Isaacson

4chan, activist fund / activist shareholder / activist investor, Airbnb, Albert Einstein, AltaVista, Apollo 11, Apple II, Apple's 1984 Super Bowl advert, artificial general intelligence, autism spectrum disorder, autonomous vehicles, basic income, Big Tech, blockchain, Boston Dynamics, Burning Man, carbon footprint, ChatGPT, Chuck Templeton: OpenTable:, Clayton Christensen, clean tech, Colonization of Mars, computer vision, Computing Machinery and Intelligence, coronavirus, COVID-19, crowdsourcing, cryptocurrency, deep learning, DeepMind, Demis Hassabis, disinformation, Dogecoin, Donald Trump, Douglas Engelbart, drone strike, effective altruism, Elon Musk, estate planning, fail fast, fake news, game design, gigafactory, GPT-4, high-speed rail, hiring and firing, hive mind, Hyperloop, impulse control, industrial robot, information security, Jeff Bezos, Jeffrey Epstein, John Markoff, John von Neumann, Jony Ive, Kwajalein Atoll, lab leak, large language model, Larry Ellison, lockdown, low earth orbit, Marc Andreessen, Marc Benioff, Mars Society, Max Levchin, Michael Shellenberger, multiplanetary species, Neil Armstrong, Network effects, OpenAI, packet switching, Parler "social media", paypal mafia, peer-to-peer, Peter Thiel, QAnon, Ray Kurzweil, reality distortion field, remote working, rent control, risk tolerance, Rubik’s Cube, Salesforce, Sam Altman, Sam Bankman-Fried, San Francisco homelessness, Sand Hill Road, Saturday Night Live, self-driving car, seminal paper, short selling, Silicon Valley, Skype, SpaceX Starlink, Stephen Hawking, Steve Jobs, Steve Jurvetson, Steve Wozniak, Steven Levy, Streisand effect, supply-chain management, tech bro, TED Talk, Tesla Model S, the payments system, Tim Cook: Apple, universal basic income, Vernor Vinge, vertical integration, Virgin Galactic, wikimedia commons, William MacAskill, work culture , Y Combinator

The OpenAI team rejected that idea, and Altman stepped in as president of the lab, starting a for-profit arm that was able to raise equity funding. So Musk decided to forge ahead with building a rival AI team to work on Tesla Autopilot. Even as he was struggling with the production hell surges in Nevada and Fremont, he recruited Andrej Karpathy, a specialist in deep learning and computer vision, away from OpenAI. “We realized that Tesla was going to become an AI company and would be competing for the same talent as OpenAI,” Altman says. “It pissed some of our team off, but I fully understood what was happening.” Altman would turn the tables in 2023 by hiring Karpathy back after he became exhausted working for Musk. 41 The Launch of Autopilot Tesla, 2014–2016 Franz von Holzhausen with an early “Robotaxi” Radar Musk had discussed with Larry Page the possibility of Tesla and Google working together to build an autopilot system that would allow cars to be self-driving.


pages: 1,237 words: 227,370

Designing Data-Intensive Applications: The Big Ideas Behind Reliable, Scalable, and Maintainable Systems by Martin Kleppmann

active measures, Amazon Web Services, billion-dollar mistake, bitcoin, blockchain, business intelligence, business logic, business process, c2.com, cloud computing, collaborative editing, commoditize, conceptual framework, cryptocurrency, data science, database schema, deep learning, DevOps, distributed ledger, Donald Knuth, Edward Snowden, end-to-end encryption, Ethereum, ethereum blockchain, exponential backoff, fake news, fault tolerance, finite state, Flash crash, Free Software Foundation, full text search, functional programming, general-purpose programming language, Hacker News, informal economy, information retrieval, Infrastructure as a Service, Internet of things, iterative process, John von Neumann, Ken Thompson, Kubernetes, Large Hadron Collider, level 1 cache, loose coupling, machine readable, machine translation, Marc Andreessen, microservices, natural language processing, Network effects, no silver bullet, operational security, packet switching, peer-to-peer, performance metric, place-making, premature optimization, recommendation engine, Richard Feynman, self-driving car, semantic web, Shoshana Zuboff, social graph, social web, software as a service, software is eating the world, sorting algorithm, source of truth, SPARQL, speech recognition, SQL injection, statistical model, surveillance capitalism, systematic bias, systems thinking, Tragedy of the Commons, undersea cable, web application, WebSocket, wikimedia commons

Porup: “‘Internet of Things’ Security Is Hilariously Broken and Getting Worse,” arstechnica.com, January 23, 2016. [96] Bruce Schneier: Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. W. W. Norton, 2015. ISBN: 978-0-393-35217-7 [97] The Grugq: “Nothing to Hide,” grugq.tumblr.com, April 15, 2016. [98] Tony Beltramelli: “Deep-Spying: Spying Using Smartwatch and Deep Learning,” Masters Thesis, IT University of Copenhagen, December 2015. Available at arxiv.org/abs/1512.05616 [99] Shoshana Zuboff: “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization,” Journal of Information Technology, volume 30, number 1, pages 75–89, April 2015. doi:10.1057/jit.2015.5 [100] Carina C.


pages: 768 words: 252,874

A History of Judaism by Martin Goodman

British Empire, classic study, deep learning, liberation theology, mass immigration, place-making, spice trade, the market place, trade route, wikimedia commons, Yom Kippur War

Following his death in 1038, authority within rabbinic Judaism was dispersed to a number of new centres in the Mediterranean world and northern Europe, where Jews came under the hegemony not just of Islamic rulers in Palestine, Egypt, North Africa and Spain but also of a multiplicity of Christian states united by recognition of papal jurisdiction in religious matters from Rome. In Spain, France and Germany, rabbis with a shared respect for, and deep learning in, the Babylonian Talmud as well as the biblical texts consolidated the expression of the law as guidance for everyday life while evolving, through mystical speculation as well as philosophical analysis, novel theologies about the relation of God to his creation. The connection of intellectual talmudic scholarship to the practical concerns of European Jews was facilitated by a new role for individual rabbis as local communal arbitrators in Jewish communities in the Rhineland and in France from the eleventh century.


pages: 1,330 words: 372,940

Kissinger: A Biography by Walter Isaacson

Alan Greenspan, Apollo 13, belling the cat, Berlin Wall, Charles Lindbergh, cuban missile crisis, deep learning, Deng Xiaoping, Dr. Strangelove, Great Grain Robbery, haute couture, Herman Kahn, index card, Khyber Pass, long peace, Mikhail Gorbachev, Monroe Doctrine, Norman Mailer, oil shock, out of africa, Plato's cave, RAND corporation, restrictive zoning, rolodex, Ronald Reagan, Seymour Hersh, Socratic dialogue, Ted Sorensen, Yom Kippur War

When a civilization does decay, a new one with higher values tends to be erected on the ruins of the old. Toynbee ultimately failed, according to Kissinger, because he claimed to view human progress in a Christian framework but he relied on empirical methods that left no room for the role of free will. It was an approach “whose exhibition of deep learning tends to obscure its methodological shallowness,” Kissinger wrote.16 Man’s knowledge of freedom, Kissinger argued, must come from an inner intuition. This led him to Immanuel Kant, the German philosopher whose main treatises were written in the 1780s. Kissinger got off to a troublesome start by asserting that the connections between causes and effects exist only in the human mind: “Causality expresses the pattern which the mind imposes on a sequence of events in order to make their appearance comprehensible.”