Von Neumann architecture



pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science by Chris Bernhardt

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Bletchley Park, British Empire, cellular automata, Charles Babbage, Claude Shannon: information theory, complexity theory, Computing Machinery and Intelligence, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Ken Thompson, Norbert Wiener, Paul Erdős, Reflections on Trusting Trust, Turing complete, Turing machine, Turing test, Von Neumann architecture

Turing Machines Examples of Turing Machines Computable Functions and Calculations Church-Turing Thesis Computational Power Machines That Don’t Halt 5. Other Systems for Computation The Lambda Calculus Tag Systems One-Dimensional Cellular Automata 6. Encodings and the Universal Machine A Method of Encoding Finite Automata Universal Machines Construction of Universal Machines Modern Computers Are Universal Machines Von Neumann Architecture Random Access Machines RAMs Can Be Emulated by Turing Machines Other Universal Machines What Happens When We Input 〈M〉 into M? 7. Undecidable Problems Proof by Contradiction Russell’s Barber Finite Automata That Do Not Accept Their Encodings Turing Machines That Do Not Accept Their Encodings Does a Turing Machine Diverge on Its Encoding?

That we can simulate Turing machines on modern computers is not surprising. What is surprising is that we can design a Turing machine to simulate a modern computer, showing that Turing machines are equivalent in computing power to modern computers. We will sketch how this is done. The first step is to get a concrete description of the modern computer.

Von Neumann Architecture

Later we will talk more about John von Neumann, but it is important to know a few facts before we proceed. The First Draft of a Report on the EDVAC is probably the most important paper on the design of modern computers. It was written in 1945, as the first electronic computers were being built.

His machines were theoretical constructs meant to incorporate the basic computational steps of human computers. Von Neumann was interested in building a physical machine. His focus was not on the theory of computation, but on the design of an efficient machine for doing actual computations. The resulting design outlined in the report is often referred to as von Neumann architecture, and most modern computers are based on it. Von Neumann’s design built on the ideas of many people. The First Draft, as its name suggests, was a draft of a paper, and it was only meant to be circulated to a small number of people. The fact that von Neumann was listed as the sole author and that other people’s work was not properly credited would not have been a problem had the readership been restricted, as originally intended, to just a few colleagues; but the First Draft was widely circulated and became enormously influential in the design of all subsequent computers.


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, backpropagation, Bletchley Park, British Empire, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, computer age, Computing Machinery and Intelligence, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, independent contractor, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, launch on warning, low earth orbit, machine readable, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, spectrum auction, strong AI, synthetic biology, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

Turing’s automatic computing engine, like Babbage’s analytical engine, was never built. Turing’s proposal “synthesized the concepts of a stored-program universal computer, a floating-point subroutine library, artificial intelligence, details such as a hardware bootstrap loader, and much else.”36 At a time when no such machines were in existence and the von Neumann architecture had only just been proposed, Turing produced a complete description of a million-cycle-per-second computer that foreshadowed the RISC (Reduced Instruction Set Computer) architecture that has now gained prominence after fifty years. The report was accompanied by circuit diagrams, a detailed physical and logical analysis of the internal storage system, sample programs, detailed (if bug-ridden) subroutines, and even an estimated (if unrealistic) cost of £11,200.

Progress reports were disseminated not only among the participating funding agencies and to a half-dozen groups that were duplicating the IAS design, but to any location where the potential of high-speed digital computers might fall on fertile ground. It is no accident that the vast majority of computers in circulation today follow the von Neumann architecture—characterized by a central processing unit operating in parallel on the multiple bits of one word of data at a time, a hierarchical memory ranging from fast but limited random-access memory to slow but unlimited media, such as floppy disks or tape, and a distinction between hardware and software that enabled robust computers (and a robust computer industry) to advance by a leapfrog process with each element evolving freely on its own.

“Quite often the likelihood of getting actual numerical results was very much larger if he was not in the computer room, because everybody got so nervous when he was there,” reported Martin Schwarzschild. “But when you were in real thinking trouble, you would go to von Neumann and nobody else.”43 Von Neumann’s reputation, after fifty years, has been injured less by his critics than by his own success. The astounding proliferation of the von Neumann architecture has obscured von Neumann’s contributions to massively parallel computing, distributed information processing, evolutionary computation, and neural nets. Because his deathbed notes for his canceled Silliman lectures at Yale were published posthumously (and for a popular audience) as The Computer and the Brain (1958), von Neumann’s work has been associated with the claims of those who were exaggerating the analogies between the digital computer and the brain.


When Computers Can Think: The Artificial Intelligence Singularity by Anthony Berglas, William Black, Samantha Thalind, Max Scratchmann, Michelle Estes

3D printing, Abraham Maslow, AI winter, air gap, anthropic principle, artificial general intelligence, Asilomar, augmented reality, Automated Insights, autonomous vehicles, availability heuristic, backpropagation, blue-collar work, Boston Dynamics, brain emulation, call centre, cognitive bias, combinatorial explosion, computer vision, Computing Machinery and Intelligence, create, read, update, delete, cuban missile crisis, David Attenborough, DeepMind, disinformation, driverless car, Elon Musk, en.wikipedia.org, epigenetics, Ernest Rutherford, factory automation, feminist movement, finite state, Flynn Effect, friendly AI, general-purpose programming language, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, industrial robot, Isaac Newton, job automation, John von Neumann, Law of Accelerating Returns, license plate recognition, Mahatma Gandhi, mandelbrot fractal, natural language processing, Nick Bostrom, Parkinson's law, patent troll, patient HM, pattern recognition, phenotype, ransomware, Ray Kurzweil, Recombinant DNA, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, sorting algorithm, speech recognition, statistical model, stem cell, Stephen Hawking, Stuxnet, superintelligent machines, technological singularity, Thomas Malthus, Turing machine, Turing test, uranium enrichment, Von Neumann architecture, Watson beat the top human players on Jeopardy!, wikimedia commons, zero day

Then in 1948 ENIAC was modified to have what is essentially a von Neumann architecture. This made it much easier to program. However, it also made the computer six times slower than it had been previously because it could now only execute one instruction at a time. Even on that ancient computer that ran thousands of times slower than modern computers, the trade-off was considered worthwhile. Being easy to program was and is generally far more important than being very efficient. Today there are variations of the basic von Neumann architecture. Graphics Processing Units (GPUs) contain hundreds of von Neumann subsystems that can compute at the same time and so render complex scenes in real time.

Reasoning about program logic 5. Automating program generation 6. High-level models 7. Learning first order concepts 8. Evolutionary algorithms 9. Artificial life 10. Evolutionary programming Computer Hardware 1. Introduction 2. Transistors 3. Logic Elements 4. Programmable Logic Arrays 5. Von Neumann Architecture 6. PLAs vs von Neumann 7. Analog Computers 8. Neurons Brains 1. Gross anatomy 2. Neocortex 3. Brain activity 4. Brain function and size 5. Brain simulation 6. Worms 13. Computational Neuroscience 1. Neurons 2. Neuron synapse 3. Integrate and fire (IF) neurons 4. Hebbian learning 5. Plasticity 6.

Neurons are also relatively slow, with only roughly 200 firings per second, so they have to work concurrently to produce results in a timely manner. On the other hand, ordinary personal computers might contain 4 billion bytes of fast memory, and several thousand billion bytes of slower disk storage. Unlike a neuron, a byte of computer memory is passive, and a conventional “von Neumann” architecture can only process a few dozen bytes at any one time. That said, the computer can perform several billion operations per second, which is millions of times faster than neurons. Specialized hardware and advanced architectures can perform many operations simultaneously, but we also know from experience that it is difficult to write highly concurrent programs that utilize that hardware efficiently.


pages: 118 words: 35,663

Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia Business School Publishing) by John E. Kelly III

AI winter, book value, call centre, carbon footprint, Computing Machinery and Intelligence, crowdsourcing, demand response, discovery of DNA, disruptive innovation, Erik Brynjolfsson, Fairchild Semiconductor, future of work, Geoffrey West, Santa Fe Institute, global supply chain, Great Leap Forward, Internet of things, John von Neumann, Large Hadron Collider, Mars Rover, natural language processing, optical character recognition, pattern recognition, planetary scale, RAND corporation, RFID, Richard Feynman, smart grid, smart meter, speech recognition, TED Talk, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!

A cognitive computer employing these systems will respond to inquiries more quickly than today’s computers; less data movement will be required and less energy will be used. Today’s von Neumann–style computing won’t go away when cognitive systems come online. New chip and computing technologies will extend its life far into the future. In many cases, the cognitive architecture and the von Neumann architecture will be employed side by side in hybrid systems. Traditional computing will become ever more capable while cognitive technologies will do things that were not possible before. Already, cloud, social networking, mobile, and new ways to interact with computing from tablets to glasses are fueling the desire for cognitive systems that will, for example, both harvest insights from social networks and enhance our experiences within them.

He concluded that while it was futile in the short term to try to invent a new technology for a cognitive machine, that didn’t mean the project should be abandoned. Instead, the team needed to refocus on CMOS chip technology and on digital circuitry rather than analog circuitry. They would create an entirely new non–von Neumann architecture in both silicon and software that would simulate the functions of neurons and synapses. Using that architecture, they would produce chips for sense-making tasks that would be vastly more efficient than today’s standard digital processors.7 Dharmendra called a meeting of all of the participants in the project.

Computing intelligence will be too costly to be practical. Scientists at IBM Research believe that to make computing sustainable in the era of big data, we will need a different kind of machine—the data-centric computer. Today’s computers are processor-centric. The microprocessor, which is the central processing unit in the von Neumann architecture, is where much of the action happens in computing. Working hand in hand with the operating system, the microprocessor sends out instructions to various components within the computer, requesting data from where it’s stored, including memory chips and disk drives. If the computer is part of a larger network, the processor fetches data from storage systems located out on the network.


pages: 253 words: 80,074

The Man Who Invented the Computer by Jane Smiley

1919 Motor Transport Corps convoy, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Arthur Eddington, Bletchley Park, British Empire, c2.com, Charles Babbage, computer age, Computing Machinery and Intelligence, Fellow of the Royal Society, Ford Model T, Henri Poincaré, IBM and the Holocaust, Isaac Newton, John von Neumann, Karl Jansky, machine translation, Norbert Wiener, Norman Macrae, Pierre-Simon Laplace, punch-card reader, RAND corporation, Turing machine, Vannevar Bush, Von Neumann architecture

Partisans of von Neumann make the case that, as with everything else von Neumann did, he took the raw material of another man’s ideas and immediately transcended it, or, as Macrae says, “Johnny grabbed other people’s ideas, then by his clarity leapt five blocks ahead of them, and helped put them into practical effect.” The most important contribution of the “First Draft” to computer design was that it laid out what came to be known as “von Neumann architecture”—that is, that the computer could contain a set of instructions in its memory like the set of instructions that Turing’s human “computer” would have been given and would have to follow day after day forever. The instructions would be stored in the memory, which the electronic computer could readily access (not like a paper tape or a deck of punch cards).

Flowers promised the machine by August, but postwar repairs and improvements to the telephone system superseded the project, and by February 1947 the ACE was going nowhere because Turing could not persuade Womersley to commit himself to Turing’s ideas—for example, an engineering department was set up, but made no progress. Possibly, Womersley was the sort of administrator who thinks contradictory ideas constitute a backup plan, but in the end they constituted no plan at all because what had come to be called “von Neumann architecture”—the principles of computer design set out in the “First Draft”—was simply taking over by coming to seem tried and tested.3 Turing quit. In the autumn of 1947, he returned to Cambridge. 1. One reason that Zuse’s autobiography is interesting is that it gives Americans a perspective on life in Nazi Germany that we rarely get.

Eckert was still with Sperry Rand (he stayed with Sperry, and then Unisys, until 1989). Neither Mauchly nor Eckert had profited directly from the ENIAC patent, but they did get credit (and they did seek that credit) for inventing the computer. Eckert, in particular, was vocal about the inaccuracy of the phrase “von Neumann architecture”—he thought it should be called “Eckert architecture.” But the vagaries of patent law and the delay in awarding the Eckert and Mauchly patents seemed to be working for Sperry. If the patent had been awarded in 1947, it would have run out by 1964, before computers became big business. However, in 1960, the patent was still being challenged.


pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

Bill Gates: Altair 8800, Charles Babbage, Claude Shannon: information theory, computer age, Dennis Ritchie, digital divide, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Eratosthenes, Fairchild Semiconductor, Free Software Foundation, Gary Kildall, Grace Hopper, invention of the telegraph, Isaac Newton, Ivan Sutherland, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Ken Thompson, Louis Daguerre, millennium bug, Multics, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

These instructions should be sequential in memory and addressed with a program counter but should also allow conditional jumps. This design came to be known as the stored-program concept. These design decisions were such an important evolutionary step that today we speak of von Neumann architecture. The computer that we built in the last chapter was a classic von Neumann machine. But with von Neumann architecture comes the von Neumann bottleneck. A von Neumann machine generally spends a significant amount of time just fetching instructions from memory in preparation for executing them. You'll recall that the final design of the Chapter 17 computer required that three-quarters of the time it spent on each instruction be involved in the instruction fetch.
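To make the fetch step concrete, here is a minimal sketch in C of a stored-program machine: one array holds both the program and its data, a program counter walks through it, and every cycle begins by reading the next instruction from that shared memory. The five-opcode instruction set and the memory layout are invented for illustration (this is not the book's Chapter 17 machine); the point is only that each executed instruction costs at least one trip over the single memory path, which is what the von Neumann bottleneck refers to.

```c
#include <stdio.h>
#include <stdint.h>

/*
 * A toy stored-program machine. Instructions and data share one memory
 * array, and execution is a loop of fetch / decode / execute.
 * The instruction set below is invented for this sketch:
 *   0 HALT   1 LOAD addr   2 ADD addr   3 STORE addr   4 OUT
 */

static uint8_t mem[32] = {
    1, 16,   /* LOAD  16 : accumulator = mem[16]   */
    2, 17,   /* ADD   17 : accumulator += mem[17]  */
    3, 18,   /* STORE 18 : mem[18] = accumulator   */
    4,       /* OUT      : print accumulator       */
    0,       /* HALT                               */
    [16] = 2, [17] = 3, [18] = 0,   /* data, in the same memory as the code */
};

static unsigned pc = 0;        /* program counter                 */
static unsigned fetches = 0;   /* memory reads of the code stream */

static uint8_t fetch(void) {   /* every instruction byte comes over this path */
    fetches++;
    return mem[pc++];
}

int main(void) {
    uint8_t acc = 0;
    for (;;) {
        uint8_t op = fetch();                        /* instruction fetch */
        if (op == 0) break;                          /* HALT  */
        else if (op == 1) acc  = mem[fetch()];       /* LOAD  */
        else if (op == 2) acc += mem[fetch()];       /* ADD   */
        else if (op == 3) mem[fetch()] = acc;        /* STORE */
        else if (op == 4) printf("acc = %u\n", acc); /* OUT   */
    }
    printf("code-stream fetches: %u\n", fetches);
    return 0;
}
```

Running it prints the computed sum and the number of code-stream reads; in a real processor those instruction fetches compete with data accesses for the same memory path.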

It makes it sound as if computer architecture were patterned after an office. The truth is that the distinction between memory and storage is an artificial one and exists solely because we don't have a single storage medium that is both fast and vast as well as nonvolatile. What we know today as "von Neumann architecture"—the dominant computer architecture for over 50 years—is a direct result of this technical deficiency. Here's another question that someone once asked me: "Why can't you run Macintosh programs under Windows?" My mouth opened to begin an answer when I realized that it involved many more technical issues than I'm sure my questioner was prepared to deal with in one sitting.

Because C has operations that parallel many common processor instructions, C is sometimes categorized as a high-level assembly language. More than any ALGOL-like language, C closely mimics common processor instruction sets. Yet all ALGOL-like languages—which really means most commonly used programming languages—were designed based on von Neumann architecture computers. Breaking out of the von Neumann mind-set when designing a computer language isn't easy, and getting other people to use such a language is even harder. One such non–von Neumann language is LISP (which stands for List Processing), which was designed by John McCarthy in the late 1950s and is useful for work in the field of artificial intelligence.


Turing's Cathedral by George Dyson

1919 Motor Transport Corps convoy, Abraham Wald, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Benoit Mandelbrot, Bletchley Park, British Empire, Brownian motion, cellular automata, Charles Babbage, cloud computing, computer age, Computing Machinery and Intelligence, Danny Hillis, dark matter, double helix, Dr. Strangelove, fault tolerance, Fellow of the Royal Society, finite state, Ford Model T, Georg Cantor, Henri Poincaré, Herman Kahn, housing crisis, IFF: identification friend or foe, indoor plumbing, Isaac Newton, Jacquard loom, John von Neumann, machine readable, mandelbrot fractal, Menlo Park, Murray Gell-Mann, Neal Stephenson, Norbert Wiener, Norman Macrae, packet switching, pattern recognition, Paul Erdős, Paul Samuelson, phenotype, planetary scale, RAND corporation, random walk, Richard Feynman, SETI@home, social graph, speech recognition, The Theory of the Leisure Class by Thorstein Veblen, Thorstein Veblen, Turing complete, Turing machine, Von Neumann architecture

., adopts “Johnny” nickname (1930) appointed to IAS (1933) resigns positions in Germany (1933–1935) becomes U.S. citizen (1937) marries Klára (Klári) Dan (1938) mission to England during World War II (1943), 4.1, 13.1 joins Manhattan Project at Los Alamos (1943), 4.1, 5.1 appointed to Atomic Energy Commission (1954) accepts position at UCLA (1954) diagnosed with cancer (1955) final 18 months of life, 14.1, 18.1 death (1957) and alternative models of computing, 14.1, 14.2 on analog vs. digital, 9.1, 12.1, 14.1 and applied mathematics, 3.1, 4.1, 5.1, 10.1 and Atomic Energy Commission (AEC), 1.1, 11.1, 14.1, 14.2, 14.3, 15.1, 15.2, 16.1, 18.1 on axiomatization 49–50 and Nils Barricelli and beginnings of ECP, 6.1, 7.1 and Julian Bigelow, 7.1, 14.1, 18.1 with Niels Bohr in Copenhagen (1938) on bottlenecks in computing, 5.1, 5.2 and the brain in Cambridge with Turing (1935) on communism, 4.1, 10.1, 10.2 character of, 3.1, 4.1, 4.2, 4.3, 4.4, 5.1, 5.2, 5.3, 7.1, 7.2, 8.1, 8.2, 8.3, 10.1, 10.2, 10.3, 11.1, 11.2, 11.3, 12.1, 13.1, 13.2, 14.1, 17.1, 18.1, 18.2, 18.3 and Cybernetics movement driving habits of, 4.1, 10.1, 15.1 and economics, 4.1, 4.2, 15.1 and the EDVAC and engineers, 4.1, 8.1, 10.1 and the ENIAC, 5.1, 5.2, 5.3, 9.1, 9.2, 10.1, 11.1 and foundations of mathematics frequent travels of, 4.1, 4.2, 5.1, 10.1, 10.2, 13.1, 16.1 on future of computing and future of technology and gambling, 10.1, 10.2, 10.3 and game theory, 4.1, 4.2, 4.3, 8.1, 10.1, 15.1, 18.1 and Gödel, 4.1, 6.1, 6.2, 6.3, 6.4, 8.1, 13.1, 15.1, 18.1 and Herman Goldstine and IBM, 4.1, 5.1, 5.2, 8.1, 9.1, 14.1, 18.1 and incompleteness on infinity, 4.1, 16.1 and Los Alamos, prf.1, 1.1, 1.2, 4.1, 5.1, 5.2, 10.1, 10.2, 10.3, 11.1, 11.2, 11.3, 15.1, 18.1 mental abilities of, 1.1, 4.1, 4.2, 4.3, 4.4, 7.1, 7.2, 11.1, 14.1, 18.1 and military, 1.1, 4.1, 4.2, 10.1, 10.2, 10.3, 11.1, 14.1, 16.1, 17.1 and molecular biology, 12.1, 15.1 on “a network of computers that would span the world” (1945) on non-linear coding and nuclear weapons, prf.1, 1.1, 1.2, 4.1, 6.1, 10.1, 10.2, 11.1, 11.2, 11.3, 11.4, 15.1, 16.1 and numerical weather prediction, 9.1, 9.2, 9.3, 18.1 and Oppenheimer, 9.1, 14.1, 14.2 and origins of ECP, 5.1, 5.2, 5.3 and patent rights, 5.1, 5.2, 8.1 and preventive (“quick”) war, 6.1, 10.1, 10.2, 15.1, 18.1 and punched cards, 4.1, 9.1 and RCA, 5.1, 5.2, 9.1 on reliability, 7.1, 12.1 religious beliefs of, 4.1, 14.1 on remote input/output and Selectron and shock waves, 4.1, 12.1, 16.1, 16.2 on singularity (technological) and stored-program computing, 5.1, 5.2, 6.1, 8.1, 10.1, 16.1 superstitions of and theory of self-reproducing automata, 1.1, 15.1, 15.2 and Turing (and Universal Turing Machine), 1.1, 3.1, 5.1, 6.1, 8.1, 13.1, 13.2, 13.3, 13.4, 13.5, 15.1 and Stan Ulam, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 10.1, 10.2, 11.1, 11.2, 14.1, 14.2 and Oswald Veblen, 4.1, 14.1, 18.1 work habits, 4.1, 7.1, 10.1 and World War II, 4.1, 6.1 von Neumann, Klári (Klára Dán, 1911–1963), 4.1, 10.1, 10.2, 14.1, 14.2, 16.1, 18.1, 18.2 childhood in “Roaring Twenties” Budapest meets von Neumann in Monte Carlo (1930s) meets von Neumann in Budapest (1937) marries John von Neumann (1938) arrives in U.S. 
(1938) suicide of father (1939) pregnancy and miscarriage (1942) death, in La Jolla (1963) and computer programming, 10.1, 10.2 and depression, 10.1, 10.2 on ECP, 5.1, 5.2, 7.1, 7.2, 18.1 and ENIAC, 9.1, 10.1 on Abraham Flexner, 3.1, 3.2 on Hungary and Hungarians on IAS, 3.1, 6.1, 7.1, 7.2, 11.1 marriages of, 10.1, 10.2, 18.1, nts.1n and Monte Carlo (code), 10.1, 10.2, 16.1 and nuclear weapons design, 10.1, 18.1 and population research, 10.1, 10.2, 10.3 and Princeton, 4.1, 10.1, 14.1, 18.1 on John von Neumann, 3.1, 4.1, 4.2, 4.3, 4.4, 5.1, 5.2, 10.1, 11.1, 14.1, 18.1 on von Neumann and destruction of Europe, 4.1, 4.2, 10.1, 10.2, 10.3 on von Neumann and Morgenstern, 4.1, 4.2 on von Neumann and U.S. landscape, 4.1, 10.1 on Oswald Veblen, 3.1, 4.1, 4.2, 14.1 Vonneumann, Nicholas (1911–2011), 4.1, 4.2, 4.3 on life under Béla Kun on family life in Budapest on John von Neumann, 4.1, 4.2, 4.3, 4.4, 14.1 on Max von Neumann von Neumann, Max (1873–1928), 4.1, 4.2, 4.3, 4.4 awarded nobility, 1913 von Neumann, Michael (1907–1989) von Neumann (Whitman), Marina, 4.1, 10.1, 10.2 on John von Neumann, 4.1, 4.2, 10.1, 14.1 von Neumann architecture and non–von Neumann architecture von Neumann bottleneck Wald, Abraham, 7.1, 7.2 Walter Reed Hospital, 4.1, 14.1 Ware, Willis, 1.1, 5.1, 7.1, 7.2, 7.3, 7.4, 8.1, 8.2, 18.1, 18.2 on Bigelow, 7.1, 8.1, 14.1 on duplication of IAS computer joins ECP, 1946 on numerical testing routines on opposition to ECP at IAS on Presper Eckert on Selectron memory on von Neumann, prf.1, prf.2, 5.1, 10.1 Warren, Robert B.

The “First Draft of a Report on the EDVAC,” reproduced by mimeograph and released into limited distribution by the Moore School on June 30, 1945, outlined the design of a high-speed stored-program electronic digital computer, including the requisite formulation and interpretation of coded instructions—“which must be given to the device in absolutely exhaustive detail.”42 The functional elements of the computer were separated into a hierarchical memory, a control organ, a central arithmetic unit, and input/output channels, making distinctions still known as the “von Neumann architecture” today. A fast internal memory, coupled to a larger secondary memory, and linked in turn to an unlimited supply of punched cards or paper tape, gave the unbounded storage that Turing had prescribed. The impediment of a single channel between memory and processor is memorialized as the “von Neumann bottleneck,” although its namesake attempted, unsuccessfully, to nip this in the bud.

However, it is believed to be not far from an important central truth, that highly recursive, conditional and repetitive routines are used because they are notationally efficient (but not necessarily unique) as descriptions of underlying processes.”40 Bigelow questioned the persistence of the von Neumann architecture and challenged the central dogma of digital computing: that without programmers, computers cannot compute. He (and von Neumann) had speculated from the very beginning about “the possibility of causing various elementary pieces of information situated in the cells of a large array (say, of memory) to enter into a computation process without explicitly generating a coordinate address in ‘machine-space’ for selecting them out of the array.”41 Biology has been doing this all along.


pages: 339 words: 94,769

Possible Minds: Twenty-Five Ways of Looking at AI by John Brockman

AI winter, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alignment Problem, AlphaGo, artificial general intelligence, Asilomar, autonomous vehicles, basic income, Benoit Mandelbrot, Bill Joy: nanobots, Bletchley Park, Buckminster Fuller, cellular automata, Claude Shannon: information theory, Computing Machinery and Intelligence, CRISPR, Daniel Kahneman / Amos Tversky, Danny Hillis, data science, David Graeber, deep learning, DeepMind, Demis Hassabis, easy for humans, difficult for computers, Elon Musk, Eratosthenes, Ernest Rutherford, fake news, finite state, friendly AI, future of work, Geoffrey Hinton, Geoffrey West, Santa Fe Institute, gig economy, Hans Moravec, heat death of the universe, hype cycle, income inequality, industrial robot, information retrieval, invention of writing, it is difficult to get a man to understand something, when his salary depends on his not understanding it, James Watt: steam engine, Jeff Hawkins, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, Kickstarter, Laplace demon, Large Hadron Collider, Loebner Prize, machine translation, market fundamentalism, Marshall McLuhan, Menlo Park, military-industrial complex, mirror neurons, Nick Bostrom, Norbert Wiener, OpenAI, optical character recognition, paperclip maximiser, pattern recognition, personalized medicine, Picturephone, profit maximization, profit motive, public intellectual, quantum cryptography, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, Richard Feynman, Rodney Brooks, self-driving car, sexual politics, Silicon Valley, Skype, social graph, speech recognition, statistical model, Stephen Hawking, Steven Pinker, Stewart Brand, strong AI, superintelligent machines, supervolcano, synthetic biology, systems thinking, technological determinism, technological singularity, technoutopianism, TED Talk, telemarketer, telerobotics, The future is already here, the long tail, the scientific method, theory of mind, trolley problem, Turing machine, Turing test, universal basic income, Upton Sinclair, Von Neumann architecture, Whole Earth Catalog, Y2K, you are the product, zero-sum game

A second foundational piece of work was in a 1945 “First Draft” report on the design for a digital computer, wherein von Neumann advocated for a memory that could contain both instructions and data.* This is now known as a von Neumann architecture computer—as distinct from a Harvard architecture computer, where there are two separate memories, one for instructions and one for data. The vast majority of computer chips built in the era of Moore’s Law are based on the von Neumann architecture, including those powering our data centers, our laptops, and our smartphones. Von Neumann’s digital-computer architecture is conceptually the same generalization—from early digital computers constructed with electromagnetic relays at both Harvard University and Bletchley Park—that occurs in going from a special-purpose Turing Machine to a Universal Turing Machine.
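A compact way to see the distinction in code (the struct names and memory sizes are invented for this sketch): a von Neumann machine fetches instructions and data from one shared memory, while a Harvard machine keeps two physically separate memories.

```c
#include <stdint.h>

/* Von Neumann: a single address space holds instructions and data alike. */
struct von_neumann_machine {
    uint8_t mem[256];   /* program bytes and data bytes intermixed */
    uint8_t pc;
};

/* Harvard: two separate memories; code cannot be read or written as data. */
struct harvard_machine {
    uint8_t code[256];  /* instruction memory (often read-only) */
    uint8_t data[256];  /* data memory                          */
    uint8_t pc;
};

/* Both fetch the next instruction, but from different places. */
static uint8_t vn_fetch(struct von_neumann_machine *m) { return m->mem[m->pc++]; }
static uint8_t hv_fetch(struct harvard_machine *m)     { return m->code[m->pc++]; }

int main(void) {
    struct von_neumann_machine vn = {{0}};
    struct harvard_machine hv = {{0}};
    return vn_fetch(&vn) + hv_fetch(&hv);   /* compile-and-run check only */
}
```

The contrast matters for the excerpt that follows: the single shared memory is what lets data written at runtime end up where instructions are kept.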

In the fifty-year Moore’s Law–fueled race to produce software that could exploit the doubling of computer capability every two years, the typical care and certification of engineering disciplines were thrown by the wayside. Software engineering was fast and prone to failures. This rapid development of software without standards of correctness has opened up many routes to exploit von Neumann architecture’s storage of data and instructions in the same memory. One of the most common routes, known as “buffer overrun,” involves an input number (or long string of characters) that is bigger than the programmer expected and overflows into where the instructions are stored. By carefully designing an input number that is too big by far, someone using a piece of software can infect it with instructions not intended by the programmer, and thus change what it does.
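A minimal C sketch of the pattern described above; the function, buffer size and inputs are invented for illustration, and the exact effect of an overrun depends on how the compiler lays out the stack (modern compilers often abort with a stack-protector error, which is itself a defence against this attack).

```c
#include <stdio.h>
#include <string.h>

/* Classic unchecked-copy bug: strcpy() keeps writing until it sees a
 * terminating zero byte, so an input longer than the buffer spills past
 * its end into whatever the compiler placed next to it. In a real attack
 * the overflowing bytes are chosen so that they overwrite a return
 * address, redirecting the processor into attacker-supplied instructions. */
static void greet(const char *name) {
    char status[8] = "OK";     /* an innocent neighbouring variable        */
    char buffer[8];            /* the programmer expected short names only */
    strcpy(buffer, name);      /* no length check: the bug                 */
    printf("hello %s (status: %s)\n", buffer, status);
}

int main(void) {
    greet("Ada");   /* fits in the buffer: behaves as intended */

    /* The call below would overrun the 8-byte buffer. What happens next is
     * undefined behaviour: it may clobber `status`, corrupt the stack, or
     * trigger a stack-smashing abort. Uncomment to observe.               */
    /* greet("a-string-far-longer-than-eight-bytes"); */
    return 0;
}
```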


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, animal electricity, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, Bletchley Park, British Empire, business process, carbon-based life, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, complexity theory, Computing Machinery and Intelligence, continuous integration, Conway's Game of Life, cosmological principle, dark matter, data science, deep learning, DeepMind, dematerialisation, double helix, Douglas Hofstadter, driverless car, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, Hans Moravec, income inequality, index card, industrial robot, intentional community, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, Kickstarter, liberal capitalism, lifelogging, machine translation, millennium bug, mirror neurons, Moravec's paradox, natural language processing, Nick Bostrom, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, Plato's cave, post-industrial society, power law, precautionary principle, prediction markets, Ray Kurzweil, Recombinant DNA, Rodney Brooks, Second Machine Age, self-driving car, seminal paper, Silicon Valley, social intelligence, speech recognition, stem cell, Stephen Hawking, Steven Pinker, Strategic Defense Initiative, strong AI, Stuart Kauffman, synthetic biology, systems thinking, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

Von Neumann was fascinated by the design of ENIAC, and wondered how the computer might be easily reprogrammed to perform a different set of operations – not involving artillery ballistics this time, but to predict the results of a hydrogen bomb explosion. Invited by the team that developed ENIAC to advise them, von Neumann produced a landmark report,7 which described a machine that could store both data and programs.8 The ‘von Neumann architecture’ – as it has since been known – demonstrated how computers could be reprogrammed easily. Until then computers had fixed programs, and had to be physically rewired in order to be reprogrammed. Von Neumann’s architecture allowed the code in a computer to modify itself. One could thus write programs that write programs, an idea that makes possible the host of automated tools that computer engineers have nowadays at their disposal, such as assemblers and compilers.
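As a toy illustration of "programs that write programs" (the generated program and its message are invented here; real assemblers and compilers are of course far more elaborate), the C program below emits the source code of another C program.

```c
#include <stdio.h>

/* A program whose output is itself a program: the essence of treating
 * code as data that another program can produce or transform. */
int main(void) {
    const char *message = "I was written by another program";

    printf("#include <stdio.h>\n");
    printf("\n");
    printf("int main(void) {\n");
    printf("    puts(\"%s\");\n", message);
    printf("    return 0;\n");
    printf("}\n");
    return 0;
}
```

Redirecting its output to a file and compiling that file with a C compiler yields a second executable that prints the embedded message; an assembler or a compiler is the same idea at scale, turning one program text into another.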

This is a very hard question to answer, since we do not yet have a way to collect credible evidence.26 Nevertheless, I personally would be inclined to bet that the spontaneous emergence of self-awareness in current technological cyberspace is highly improbable. Since the 1940s, we have been locked in a specific approach to computer technology that separates hardware from software, and which is mostly based on a specific hardware architecture called the ‘von Neumann architecture’, as we saw in the previous chapter. There were many other paths we could have taken in computer evolution (for instance, advanced analogue computers), but we did not. The word ‘evolution’ is of great importance here. The pseudo-cybernetic assumption of the AI Singularity hypothesis essentially claims that an evolutionary kind of emergence of self-awareness is involved.

., McAfee A. (2014), The Second Machine Age. New York: W.W. Norton & Co. 13In Turing’s description the tape with the symbols (the ‘data’) is separate from the table of instructions (the ‘program’). In modern computers data and programs are stored in the same storage, a key insight that is part of the ‘von Neumann architecture’. 14According to historians Robert Friedel and Paul Israel at least twenty-two other inventors ‘discovered’ the incandescent lamp prior to Thomas Edison. However, it was Edison who developed the lamp into an effective source of electric lighting by selecting an effective incandescent material, achieving a higher vacuum and using a higher resistance filament. 15Konrad Zuse invented the world’s first programmable computer, the Z3, which became operational in May 1941. 14 From Bletchley Park to Google Campus 1‘Global Information Report 2013’, World Economic Forum (www.weforum.com). 2This is a phrase from Greek philosopher Heraclitus (535–475 BC).


pages: 352 words: 120,202

Tools for Thought: The History and Future of Mind-Expanding Technology by Howard Rheingold

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bletchley Park, card file, cellular automata, Charles Babbage, Claude Shannon: information theory, combinatorial explosion, Compatible Time-Sharing System, computer age, Computer Lib, Computing Machinery and Intelligence, conceptual framework, Conway's Game of Life, Douglas Engelbart, Dynabook, experimental subject, Hacker Ethic, heat death of the universe, Howard Rheingold, human-factors engineering, interchangeable parts, invention of movable type, invention of the printing press, Ivan Sutherland, Jacquard loom, John von Neumann, knowledge worker, machine readable, Marshall McLuhan, Menlo Park, Neil Armstrong, Norbert Wiener, packet switching, pattern recognition, popular electronics, post-industrial society, Project Xanadu, RAND corporation, Robert Metcalfe, Silicon Valley, speech recognition, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, telemarketer, The Home Computer Revolution, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

He was one of history's most brilliant physicists, logicians, and mathematicians, as well as the software genius who invented the first electronic digital computer. John von Neumann was the center of the group who created the "stored program" concept that made truly powerful computers possible, and he specified a template that is still used to design almost all computers--the "von Neumann architecture." When he died, the Secretaries of Defense, the Army, Air Force, and Navy and the Joint Chiefs of staff were all gathered around his bed, attentive to his last gasps of technical and policy advice. Norbert Wiener, raised to be a prodigy, graduated from Tufts at fourteen, earned his Ph.D. from Harvard at eighteen, and studied with Bertrand Russell at nineteen.

All such machines, the authors of the "Preliminary Report" declared, must have a unit where arithmetic and logical operations can be performed (the processing unit where actual calculation takes place, equivalent to Babbage's "mill"), a unit where instructions and data for the current problem can be stored (like Babbage's "store," a kind of temporary memory device), a unit that executes the instructions according to the specified sequential order (like the "read/write head" of Turing's theoretical machine), and a unit where the human operator can enter raw information or see the computed output (what we now call "input-output devices"). Any machine that adheres to these principles -- no matter what physical technology is used to implement these logical functions -- is an example of what has become known as "the von Neumann architecture." It doesn't matter whether you build such a machine out of gears and springs, vacuum tubes, or transistors, as long as its operations follow this logical sequence. This theoretical template was first implemented in the United States at the Institute for Advanced Study. Modified copies of the IAS machine were made for the Rand Corporation, an Air Force spinoff "think tank" that was responsible for keeping track of targets for the nation's new but fast-growing nuclear armory, and for the Los Alamos Laboratory.

"He reminds me of Moses parting the Red Sea," is the way Alan Kay describes Engelbart's gentle charisma. Of course, the original Moses never set foot in the promised Land. And he never had the reputation of being an easy man to work with. In 1951, Engelbart quit his job at Ames and went to graduate school at the University of California at Berkeley, where one of the first von Neumann architecture computers was being built. That was when he began to notice that not only didn't people know what he was talking about, but some presumably "objective" scientists were overly hostile. He started saying the wrong things to people who could affect his career, things that simply sounded strange to the other electrical engineers.


pages: 239 words: 70,206

Data-Ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else by Steve Lohr

"World Economic Forum" Davos, 23andMe, Abraham Maslow, Affordable Care Act / Obamacare, Albert Einstein, Alvin Toffler, Bear Stearns, behavioural economics, big data - Walmart - Pop Tarts, bioinformatics, business cycle, business intelligence, call centre, Carl Icahn, classic study, cloud computing, computer age, conceptual framework, Credit Default Swap, crowdsourcing, Daniel Kahneman / Amos Tversky, Danny Hillis, data is the new oil, data science, David Brooks, driverless car, East Village, Edward Snowden, Emanuel Derman, Erik Brynjolfsson, everywhere but in the productivity statistics, financial engineering, Frederick Winslow Taylor, Future Shock, Google Glasses, Ida Tarbell, impulse control, income inequality, indoor plumbing, industrial robot, informal economy, Internet of things, invention of writing, Johannes Kepler, John Markoff, John von Neumann, lifelogging, machine translation, Mark Zuckerberg, market bubble, meta-analysis, money market fund, natural language processing, obamacare, pattern recognition, payday loans, personalized medicine, planned obsolescence, precision agriculture, pre–internet, Productivity paradox, RAND corporation, rising living standards, Robert Gordon, Robert Solow, Salesforce, scientific management, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley startup, SimCity, six sigma, skunkworks, speech recognition, statistical model, Steve Jobs, Steven Levy, The Design of Experiments, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Tony Fadell, unbanked and underbanked, underbanked, Von Neumann architecture, Watson beat the top human players on Jeopardy!, yottabyte

The big-data era is the next evolutionary upheaval in the landscape of computing. The things people want to do with data, like real-time analysis of data streams or continuously running machine-learning software, pose a threat to the traditional computer industry. Conventional computing—the Von Neumann architecture, named for mathematician and computer scientist John von Neumann—operates according to discrete steps of program, store, and process. Major companies and markets were built around those tiers of computing—software, disk drives, and microprocessors, respectively. Modern data computing, according to John Kelly, IBM’s senior vice president in charge of research, will “completely disrupt the industry as we know it, creating new platforms and players.”

., 5–6 Snyder, Steven, 165–67, 170 social networks, research using human behavior and, 86–94 retail use, 153–62 spread of information and, 73–74 Twitter posts and, 197–202 see also privacy concerns Social Security numbers, data used to predict person’s, 187–88 software, origin of term, 96 Solow, Robert, 72 Speakeasy programming language, 160 Spee (Harvard club), 28–30 Spohrer, Jim, 25 Stanford University, 211–12 Starbucks, 157 Stockholm, rush-hour pricing in, 47 storytelling, computer algorithms and, 120–21, 149, 165–66, 205, 214 structural racism, in big data racial profiling, 194–95 Structure of Scientific Revolutions, The (Kuhn), 175 Sweeney, Latanya, 193–95 System S, at IBM, 40 Tarbell, Ida, 208 Taylor, Frederick Winslow, 207–8 Tecco, Halle, 16, 25, 28, 168–69 Tetlock, Philip, 67–68 thermostats, learning by, 143–45, 147–53 Thinking, Fast and Slow (Kahneman), 66–67 toggling, 84 Truth in Lending Act (1968), 185 T-shaped people, 25 Tukey, John, 96–97 Turing, Alan, 178–79 Tversky, Amos, 66 Twitter, 85 posts studied for personal information, 197–202 “Two Cultures, The” (Snow), 5–6 “universal machine” (Turing’s theoretical computer), 179 universities, data science and, 15–16, 97–98, 211–12 Unlocking the Value of Personal Data: From Collection to Usage (World Economic Forum), 203 “Unreasonable Effectiveness of Data, The” (Norvig), 116 use-only restrictions, on data, 203 Uttamchandani, Menka, 77–78, 80, 212 VALS (Values, Attitudes, and Lifestyles), 155 Van Alstyne, Marshall, 74 Vance, Ashlee, 85 Vargas, Veronica, 159–60 Varma, Anil, 136–37 Veritas, 91 vineyards, data used for precision agriculture in, 123–33, 212 Vivero, David, 29 Vladeck, David, 203, 204 von Neumann, John, 54 Von Neumann architecture, 54 Walker, Donald, 2, 63, 212 Walmart, 104, 154 Watson, Thomas Jr., 49 Watson technology, of IBM, 45, 66–67, 120, 205 as cloud service, 9, 54 Jeopardy and, 7, 40, 111, 114 medical diagnoses and, 69–70, 109 Watts, Duncan J., 86 weather analysis, with big data, 129–32 Weitzner, Daniel, 184 “Why ask Why?”


pages: 476 words: 121,460

The Man From the Future: The Visionary Life of John Von Neumann by Ananyo Bhattacharya

Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Alvin Roth, Andrew Wiles, Benoit Mandelbrot, business cycle, cellular automata, Charles Babbage, Claude Shannon: information theory, clockwork universe, cloud computing, Conway's Game of Life, cuban missile crisis, Daniel Kahneman / Amos Tversky, DeepMind, deferred acceptance, double helix, Douglas Hofstadter, Dr. Strangelove, From Mathematics to the Technologies of Life and Death, Georg Cantor, Greta Thunberg, Gödel, Escher, Bach, haute cuisine, Herman Kahn, indoor plumbing, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, Jacquard loom, Jean Tirole, John Conway, John Nash: game theory, John von Neumann, Kenneth Arrow, Kickstarter, linear programming, mandelbrot fractal, meta-analysis, mutually assured destruction, Nash equilibrium, Norbert Wiener, Norman Macrae, P = NP, Paul Samuelson, quantum entanglement, RAND corporation, Ray Kurzweil, Richard Feynman, Ronald Reagan, Schrödinger's Cat, second-price auction, side project, Silicon Valley, spectrum auction, Steven Levy, Strategic Defense Initiative, technological singularity, Turing machine, Von Neumann architecture, zero-sum game

He congratulated von Neumann for providing the first ‘complete logical framework for the machine’ and contrasted the streamlined design with the ENIAC, which was ‘chuck full of gadgets that have as their only raison d’etre that they appealed to John Mauchly’.43 Computer designers now refer to the whole configuration as the ‘von Neumann architecture’, and nearly all computers in use today – smart phones, laptops, desktops – are built according to its precepts. The design’s fundamental drawback, now called the ‘von Neumann bottleneck’, is that instructions or data have to be found and fetched serially from memory – like standing in a line, and being able to pass messages only forwards or backwards.

How, then, do brains far more mundane than von Neumann’s accomplish incredible feats that defeat today’s most sophisticated computers, such as making up an amusing pun? The answer is that neurons do not fire one after the other, but do their work simultaneously: they are not serial, like von Neumann architecture computers, but parallel – massively so. It was a lasting insight. The artificial neural networks that power today’s best-performing artificial intelligence systems, like those of Google’s DeepMind, are also a kind of parallel processor: they seem to ‘learn’ in a somewhat similar way to the human brain – altering the various weights of each artificial neuron until they can perform a particular task.
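To make "altering the weights until the task is performed" concrete, here is a minimal single-neuron sketch in C using the classic perceptron update rule; the task (logical AND), the learning rate and the epoch count are chosen purely for illustration, and modern networks such as DeepMind's adjust vastly more weights, in parallel, with more sophisticated rules like backpropagation.

```c
#include <stdio.h>

/* One artificial neuron learns logical AND. Each time the neuron's
 * output is wrong, every weight is nudged in the direction that
 * reduces the error on that example (the perceptron learning rule). */
int main(void) {
    double w1 = 0.0, w2 = 0.0, bias = 0.0;
    const double rate = 0.1;                /* learning rate   */
    const int x1[4]     = {0, 0, 1, 1};
    const int x2[4]     = {0, 1, 0, 1};
    const int target[4] = {0, 0, 0, 1};     /* AND truth table */

    for (int epoch = 0; epoch < 20; epoch++) {
        for (int i = 0; i < 4; i++) {
            int out = (w1 * x1[i] + w2 * x2[i] + bias > 0.0) ? 1 : 0;
            int err = target[i] - out;      /* +1, 0 or -1                 */
            w1   += rate * err * x1[i];     /* adjust each weight a little */
            w2   += rate * err * x2[i];
            bias += rate * err;
        }
    }

    printf("learned: w1=%.2f w2=%.2f bias=%.2f\n", w1, w2, bias);
    for (int i = 0; i < 4; i++)
        printf("%d AND %d -> %d\n", x1[i], x2[i],
               (w1 * x1[i] + w2 * x2[i] + bias > 0.0) ? 1 : 0);
    return 0;
}
```

After a handful of passes over the four examples the weights stop changing and the neuron reproduces the AND table.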

‘Bertus’ 22, 149, 150 Budapest anti-Semitism 2, 5 Jewish population 2 Kann-Heller 2 Lenin Boys 13 Romanian occupation 13–14 von Neumann’s life in 1–9, 5, 7 Budapest, University of 6–7, 11, 25, 65 Burks, Arthur 125, 130, 231, 235, 243, 258, 265 Burroughs, William S. xiv Bush, Vannevar 77, 107 California Institute of Technology 184, 245 Cambridge, University of, King’s College 70 Cantor, Georg 20, 20–21 carbon dioxide emissions 283 cardinality 23 Carleton University, Ottawa 225 Carter, Jimmy 264 causality 29, 48–9, 51, 60, 76, 298n63 Cayley, Arthur 32 cellular automata see self-reproducing automata Cellular Automata Machine 245 chain reaction, nuclears, Monte Carlo bomb simulations 8, 80, 81–2, 87–8, 133–4 Champernowne, David 151 Champlain, SS 77 chance 144–5 Chandrasekhar, Subrahmanyan 79 Chebyshev polynomials 12–13, 295n32 chess 4, 142–3, 144, 162–3, 164, 257, 289–90n8, 313n51 China 209 Church, Alonzo 118–9 Churchill, Winston 80, 90 cities, segregation 270, 271, 272 ‘clanking replicators’ 261–3, 262 Clauser, John 57 Cliff, Rodger 2654 climate change 283, 284 Clippinger, Richard 135 closed subroutine, the 138 Cockcroft, John 301n23 Codd, Edgar 258 Cold War 218–20183, 203, 208–12 counterforce strategy 222 doctrine of preventive war 208–10 game theory analysis 212–16 ICBM threat 216–18 kill-or-be-killed paranoia 203 little wars strategy 222–4 nuclear deterrence 212–16, 221–224 paranoia 203 preventive war 208–10 Soviet aggression 222 Soviet Union hydrogen bomb test 216–17 VNs view of 208–9 Collbohm, Frank 185–6, 217 Columbia University 78, 213, 214 Communist Party of Germany (KPD) 99 Compleat Strategyst, The (Williams) 189 Computer and the Brain, The (von Neumann) 275–6 computer programmer job born 131–2 ‘low class individuals’ 278 Neumann, Klára Dán von, as one of the first 133, 135, 136–138 computer science, foundations of 70, 121 computers xiii approach to programming 134–5 Atanasoff–Berry 127 Automatic Computing Engine (ACE) 121, 125 Automatic Sequence Controlled Calculator (ASCC) see Harvard Mark I birth of 16, 28, 102–40 closed subroutine 138 coding 115–16 comparison to the brain 273–6 delay-lines 124–5 differential analysers 107–8 earliest 73 quantum 59 women’s role 85–6, 108–9, 120–1 differential analyser 107–8 EDVAC patent dispute 125–6, 127–8 EDVAC report 121–7 see First Draft of a Report on the EDVAC (von Neumann) ENIAC xi, 1054–11, 106, 120, 123, 124, 126, 127–8 ENIAC conversion 130–5, 309–10n68 First Draft of a Report on the EDVAC (von Neumann) 111, 121–7111 for-loops 116 Gödel’s contribution 111–18 Harvard Mark I 104 human 120–1 IAS project 127, 128–30, 131, 138–9, 140, 193 IBM 701 14039 JOHNNIAC 193, 194 Manchester Baby 138 MANIAC I 137, 139, 310n77 origin of 28 program-controlled 119–20 Project PY 1101 proliferation 272–3 RAND Corporation and 192–3 size 106 Small-Scale Experimental Machine (SSEM) see Manchester Baby 138 storage capacity 123 stored-program 120, 121, 122 see also First Draft of a Report on the EDVAC subroutines 119 Turing’s contribution 118–21 the universal Turing machine 118–21, 306–7n35, 307n37 virus, first 236 VNs contribution 122, 125–76, 129–130, 131, 139–140, 308n48 VNs early interests in 79–80, 103–5 von Neumann architecture 123, 128, 275 von Neumann bottleneck 123 Conan Doyle, Arthur, ‘The Final Problem’ 153–4, 165–6, 165, 314n52 Conferences on Cybernetics 227 Connes, Alain 62 Conus textile sea snail 249 Conway, John Horton 237–41, 243 hexagonal packing of circles 237, 238 Life 239–41, 239, 240, 242, 243, 244, 245, 257 Universal Turing Machine within 241, 
243 survey of life forms 240 Universal Turing machine 241, 243 cooperative game theory 172–3, 176, 178, 196–7 Copeland, B.


pages: 294 words: 96,661

The Fourth Age: Smart Robots, Conscious Computers, and the Future of Humanity by Byron Reese

"World Economic Forum" Davos, agricultural Revolution, AI winter, Apollo 11, artificial general intelligence, basic income, bread and circuses, Buckminster Fuller, business cycle, business process, Charles Babbage, Claude Shannon: information theory, clean water, cognitive bias, computer age, CRISPR, crowdsourcing, dark matter, DeepMind, Edward Jenner, Elon Musk, Eratosthenes, estate planning, financial independence, first square of the chessboard, first square of the chessboard / second half of the chessboard, flying shuttle, full employment, Hans Moravec, Hans Rosling, income inequality, invention of agriculture, invention of movable type, invention of the printing press, invention of writing, Isaac Newton, Islamic Golden Age, James Hargreaves, job automation, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, lateral thinking, life extension, Louis Pasteur, low interest rates, low skilled workers, manufacturing employment, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Mary Lou Jepsen, Moravec's paradox, Nick Bostrom, On the Revolutions of the Heavenly Spheres, OpenAI, pattern recognition, profit motive, quantum entanglement, radical life extension, Ray Kurzweil, recommendation engine, Rodney Brooks, Sam Altman, self-driving car, seminal paper, Silicon Valley, Skype, spinning jenny, Stephen Hawking, Steve Wozniak, Steven Pinker, strong AI, technological singularity, TED Talk, telepresence, telepresence robot, The Future of Employment, the scientific method, Timothy McVeigh, Turing machine, Turing test, universal basic income, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, women in the workforce, working poor, Works Progress Administration, Y Combinator

Who could have guessed that such a humble little device could do all that? Well, Turing could, of course. But no one else seems to have had that singular idea. Exit Turing. Enter John von Neumann, whom we call the father of modern computing. In 1945, he developed the von Neumann architecture for computers. While Turing machines are purely theoretical, designed to frame the question of what computers can do, the von Neumann architecture is about how to build actual computers. He suggested an internal processor and computer memory that holds both programs and data. In addition to the computer’s memory, there might also be external storage to hold data and information not currently needed.


pages: 339 words: 92,785

I, Warbot: The Dawn of Artificially Intelligent Conflict by Kenneth Payne

Abraham Maslow, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, AlphaGo, anti-communist, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, Asperger Syndrome, augmented reality, Automated Insights, autonomous vehicles, backpropagation, Black Lives Matter, Bletchley Park, Boston Dynamics, classic study, combinatorial explosion, computer age, computer vision, Computing Machinery and Intelligence, coronavirus, COVID-19, CRISPR, cuban missile crisis, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, disinformation, driverless car, drone strike, dual-use technology, Elon Musk, functional programming, Geoffrey Hinton, Google X / Alphabet X, Internet of things, job automation, John Nash: game theory, John von Neumann, Kickstarter, language acquisition, loss aversion, machine translation, military-industrial complex, move 37, mutually assured destruction, Nash equilibrium, natural language processing, Nick Bostrom, Norbert Wiener, nuclear taboo, nuclear winter, OpenAI, paperclip maximiser, pattern recognition, RAND corporation, ransomware, risk tolerance, Ronald Reagan, self-driving car, semantic web, side project, Silicon Valley, South China Sea, speech recognition, Stanislav Petrov, stem cell, Stephen Hawking, Steve Jobs, strong AI, Stuxnet, technological determinism, TED Talk, theory of mind, TikTok, Turing machine, Turing test, uranium enrichment, urban sprawl, V2 rocket, Von Neumann architecture, Wall-E, zero-sum game

He demonstrated that logical propositions, like AND/OR and IF/THEN, and binary numbers could be implemented by switches in a telephone circuit.13 Switches, binary code and Boolean logic soon became part of the basic architecture of modern digital computing. Another significant figure early in the history of computing was the charismatic American, John von Neumann.14 His contribution is rather less clear-cut; others can also lay claim to what has become known as the von Neumann architecture for modern computers. By this stage, spurred by the war and the problems of fire control, communications and code-breaking, computing was a large and rapidly expanding field. Still, von Neumann, a brilliant mathematician and consummate committee man, certainly played an outsized part in the story of the computer, advocating a design where the instructions for the computer were separate from the machinery of computing (rather like Turing’s logic processor and ticker tape).

In Turing’s example, the ‘software’ program, inscribed on the tape, did a huge amount of heavy lifting. The hardware itself was pretty basic. But the balance could shift, and the boundary wasn’t always clear-cut, as with the idea of ‘firmware’, semi-permanent software that helps run the machinery, which in turn implements the software. And there was another enduring design feature in the von Neumann architecture—memory, on which to store the program and information that the computer would use in its calculations. In Turing’s machine, the tape itself served as the memory. In the real world, creating memory was an engineering challenge, initially solved using cathode-ray tubes appropriated from television screens—discrete ‘bits’ of information would circulate in the tube, being read off the screen with a foil covering and being looped back into the tube—in other words, being ‘held’ in memory.


pages: 346 words: 97,890

The Road to Conscious Machines by Michael Wooldridge

Ada Lovelace, AI winter, algorithmic bias, AlphaGo, Andrew Wiles, Anthropocene, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, Bletchley Park, Boeing 747, British Empire, call centre, Charles Babbage, combinatorial explosion, computer vision, Computing Machinery and Intelligence, DARPA: Urban Challenge, deep learning, deepfake, DeepMind, Demis Hassabis, don't be evil, Donald Trump, driverless car, Elaine Herzberg, Elon Musk, Eratosthenes, factory automation, fake news, future of work, gamification, general purpose technology, Geoffrey Hinton, gig economy, Google Glasses, intangible asset, James Watt: steam engine, job automation, John von Neumann, Loebner Prize, Minecraft, Mustafa Suleyman, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, P = NP, P vs NP, paperclip maximiser, pattern recognition, Philippa Foot, RAND corporation, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stephen Hawking, Steven Pinker, strong AI, technological singularity, telemarketer, Tesla Model S, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, trolley problem, Turing machine, Turing test, universal basic income, Von Neumann architecture, warehouse robotics

Across the Atlantic in Pennsylvania, a team led by John Mauchly and J. Presper Eckert developed a machine called ENIAC to compute artillery tables. With some tweaks by the brilliant Hungarian mathematician John von Neumann, ENIAC established the fundamental architecture of the modern computer (the architecture of conventional computers is called the Von Neumann architecture, in his honour). Over in post-war England, Fred Williams and Tom Kilburn built the Manchester Baby, which led directly to the world’s first commercial computer, the Ferranti Mark 1 – Turing himself joined the staff of Manchester University in 1948, and wrote some of the first programs to run on it.



The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic bias, algorithmic trading, AlphaGo, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, backpropagation, Baxter: Rethink Robotics, behavioural economics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, data science, deep learning, DeepMind, delayed gratification, Demis Hassabis, Dennis Ritchie, discovery of DNA, Donald Trump, Douglas Engelbart, driverless car, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Geoffrey Hinton, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, Jim Simons, John Conway, John Markoff, John von Neumann, language acquisition, Large Hadron Collider, machine readable, Mark Zuckerberg, Minecraft, natural language processing, Neil Armstrong, Netflix Prize, Norbert Wiener, OpenAI, orbital mechanics / astrodynamics, PageRank, pattern recognition, pneumatic tube, prediction markets, randomized controlled trial, Recombinant DNA, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, world market for maybe five computers, X Prize, Yogi Berra

But there was one problem: coming to equilibrium and collecting statistics became increasingly slow to simulate, and larger networks took much longer to reach equilibrium. In principle, it is possible to build a computer with a massively parallel architecture that is much faster than one with a traditional von Neumann architecture that makes one update at a time. Digital computers in the 1980s could perform only a million operations per second. Today’s computers perform billions of operations per second, and, by linking together many thousands of cores, high-performance computers are a million times faster than before—an unprecedented increase in technological power.

The race is on to design and build a new generation of chips to run learning algorithms, whether deep, reinforcement, or other, thousands of times faster and more efficiently than the way they are now simulated on general-purpose computers. The new very large-scale integration (VLSI) chips have parallel processing architectures, with memory onboard to alleviate the bottleneck between memory and the central processing unit (CPU) in the sequential von Neumann architectures that have dominated computing for the last fifty years. We are still in an exploratory phase with regard to hardware, and each type of special-purpose VLSI chip has different strengths and limitations. Massive amounts of computer power will be needed to run the large-scale networks that are being developed for AI applications, and there is tremendous potential for profit in building efficient hardware.


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

"World Economic Forum" Davos, Ada Lovelace, adjacent possible, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, algorithmic management, AlphaGo, Amazon Mechanical Turk, animal electricity, autonomous vehicles, behavioural economics, Black Swan, Brexit referendum, British Empire, Cambridge Analytica, cellular automata, Charles Babbage, citizen journalism, Claude Shannon: information theory, combinatorial explosion, Computing Machinery and Intelligence, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, fake news, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Geoffrey Hinton, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, Linda problem, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, post-truth, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, Stuart Kauffman, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, warehouse robotics, women in the workforce, Yochai Benkler

The others include devising a way to make ENIAC (arguably the world’s first real computer) programmable; making substantial contributions to quantum physics6 and equilibrium theories in economics; and, inventing game theory, an area of mathematical research which shaped cold war politics for a generation through his descriptions of a game-theoretic construct he called “mutually assured destruction.” Inspired by Turing’s papers on computation, von Neumann also came up with the modern conception of Babbage’s ‘store’ and ‘mill’ computer structure, in what is now called the ‘von Neumann architecture’, the architecture at the heart of almost all modern computers. Amongst this world-changing productivity, von Neumann also speculated about how computer programs, like genetic organisms, might be able to self-replicate. His ‘cellular automata’ theory closely parallels the actual replication methods of biological DNA, despite the fact that von Neumann’s work was done in advance of the actual structure of DNA being discovered by Watson and Crick in 1953.7 At the same time, statistician George Box suggested ‘evolutionary operations’8 as a methodology for optimizing industrial processes in the late 1950s, though he never implemented the procedure as a computer algorithm, and there are a number of other scientists who also struck close to the ideas that would eventually emerge as evolutionary computation.
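
Von Neumann's self-reproducing automaton is far too large to show here, but a one-dimensional cellular automaton gives the flavour of the idea the excerpt refers to: each cell updates from a fixed local rule applied to itself and its neighbours, yet intricate global structure emerges. The sketch below uses Rule 30, an arbitrary choice made purely for illustration.

    # A one-dimensional, two-state cellular automaton: every cell looks only
    # at itself and its two neighbours and applies the same fixed rule.
    # Rule 30 is one of the 256 possible rules of this kind.

    RULE = 30
    WIDTH, STEPS = 63, 30

    row = [0] * WIDTH
    row[WIDTH // 2] = 1          # start from a single live cell

    for _ in range(STEPS):
        print("".join("#" if cell else " " for cell in row))
        row = [
            (RULE >> (4 * row[(i - 1) % WIDTH] + 2 * row[i] + row[(i + 1) % WIDTH])) & 1
            for i in range(WIDTH)
        ]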



pages: 420 words: 119,928

The Three-Body Problem (Remembrance of Earth's Past) by Cixin Liu

Apollo 13, back-to-the-land, cosmic microwave background, Deng Xiaoping, game design, Henri Poincaré, horn antenna, information security, invisible hand, Isaac Newton, Norbert Wiener, Panamax, quantum entanglement, RAND corporation, Search for Extraterrestrial Intelligence, Von Neumann architecture

In the future, any malfunctions will be dealt with the same way!” Von Neumann glanced at Newton, disgusted. They watched as a few riders dashed into the motherboard with their swords unsheathed. After they “repaired” the faulty component, the order to restart was given. This time, the operation went very smoothly. Twenty minutes later, Three Body’s Von Neumann architecture human-formation computer had begun full operations under the Qin 1.0 operating system. “Run solar orbit computation software ‘Three Body 1.0’!” Newton screamed at the top of his lungs. “Start the master computing module! Load the differential calculus module! Load the finite element analysis module!

Against the background of the three suns in syzygy, text appeared: Civilization Number 184 was destroyed by the stacked gravitational attractions of a tri-solar syzygy. This civilization had advanced to the Scientific Revolution and the Industrial Revolution. In this civilization, Newton established nonrelativistic classical mechanics. At the same time, due to the invention of calculus and the Von Neumann architecture computer, the foundation was set for the quantitative mathematical analysis of the motion of three bodies. After a long time, life and civilization will begin once more, and progress through the unpredictable world of Three Body. We invite you to log on again. * * * Just as Wang logged out of the game, a stranger called.


pages: 720 words: 197,129

The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson

1960s counterculture, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, Alvin Toffler, Apollo Guidance Computer, Apple II, augmented reality, back-to-the-land, beat the dealer, Bill Atkinson, Bill Gates: Altair 8800, bitcoin, Bletchley Park, Bob Noyce, Buckminster Fuller, Byte Shop, c2.com, call centre, Charles Babbage, citizen journalism, Claude Shannon: information theory, Clayton Christensen, commoditize, commons-based peer production, computer age, Computing Machinery and Intelligence, content marketing, crowdsourcing, cryptocurrency, Debian, desegregation, Donald Davies, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, driverless car, Dynabook, El Camino Real, Electric Kool-Aid Acid Test, en.wikipedia.org, eternal september, Evgeny Morozov, Fairchild Semiconductor, financial engineering, Firefox, Free Software Foundation, Gary Kildall, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, Haight Ashbury, Hans Moravec, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, index card, Internet Archive, Ivan Sutherland, Jacquard loom, Jaron Lanier, Jeff Bezos, jimmy wales, John Markoff, John von Neumann, Joseph-Marie Jacquard, Leonard Kleinrock, Lewis Mumford, linear model of innovation, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Mother of all demos, Neil Armstrong, new economy, New Journalism, Norbert Wiener, Norman Macrae, packet switching, PageRank, Paul Terrell, pirate software, popular electronics, pre–internet, Project Xanadu, punch-card reader, RAND corporation, Ray Kurzweil, reality distortion field, RFC: Request For Comment, Richard Feynman, Richard Stallman, Robert Metcalfe, Rubik’s Cube, Sand Hill Road, Saturday Night Live, self-driving car, Silicon Valley, Silicon Valley startup, Skype, slashdot, speech recognition, Steve Ballmer, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Steven Pinker, Stewart Brand, Susan Wojcicki, technological singularity, technoutopianism, Ted Nelson, Teledyne, the Cathedral and the Bazaar, The Coming Technological Singularity, The Nature of the Firm, The Wisdom of Crowds, Turing complete, Turing machine, Turing test, value engineering, Vannevar Bush, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, Whole Earth Review, wikimedia commons, William Shockley: the traitorous eight, Yochai Benkler

To facilitate this, von Neumann came up with a variable-address program language that enabled an easy switch to substitute instructions while the program was running.57 The team at Penn proposed to the Army that a new and improved ENIAC be built along these lines. It would be binary rather than decimal, use mercury delay lines for memory, and include much, though not all, of what became known as “von Neumann architecture.” In the original proposal to the Army, this new machine was called the Electronic Discrete Variable Automatic Calculator. Increasingly, however, the team started referring to it as a computer, because it would do so much more than merely calculate. Not that it mattered. Everyone simply called it EDVAC.

Watson was a harbinger of a third wave of computing, one that blurred the line between augmented human intelligence and artificial intelligence. “The first generation of computers were machines that counted and tabulated,” Rometty says, harking back to IBM’s roots in Herman Hollerith’s punch-card tabulators used for the 1890 census. “The second generation involved programmable machines that used the von Neumann architecture. You had to tell them what to do.” Beginning with Ada Lovelace, people wrote algorithms that instructed these computers, step by step, how to perform tasks. “Because of the proliferation of data,” Rometty adds, “there is no choice but to have a third generation, which are systems that are not programmed, they learn.”27 But even as this occurs, the process could remain one of partnership and symbiosis with humans rather than one designed to relegate humans to the dustbin of history.

Dyson, Turing’s Cathedral, 1957. See also Aspray, John von Neumann and the Origins of Modern Computing. 62. Eckert oral history, Charles Babbage Institute. See also McCartney, ENIAC, 125, quoting Eckert: “We were clearly suckered by John von Neumann, who succeeded in some circles at getting my ideas called the ‘von Neumann architecture.’ ” 63. Jennings Bartik, Pioneer Programmer, 518. 64. Charles Duhigg and Steve Lohr, “The Patent, Used as a Sword,” New York Times, Oct. 7, 2012. 65. McCartney, ENIAC, 103. 66. C. Dianne Martin, “ENIAC: The Press Conference That Shook the World,” IEEE Technology and Society, Dec. 1995. 67.


pages: 481 words: 125,946

What to Think About Machines That Think: Today's Leading Thinkers on the Age of Machine Intelligence by John Brockman

Adam Curtis, agricultural Revolution, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic trading, Anthropocene, artificial general intelligence, augmented reality, autism spectrum disorder, autonomous vehicles, backpropagation, basic income, behavioural economics, bitcoin, blockchain, bread and circuses, Charles Babbage, clean water, cognitive dissonance, Colonization of Mars, complexity theory, computer age, computer vision, constrained optimization, corporate personhood, cosmological principle, cryptocurrency, cuban missile crisis, Danny Hillis, dark matter, data science, deep learning, DeepMind, Demis Hassabis, digital capitalism, digital divide, digital rights, discrete time, Douglas Engelbart, driverless car, Elon Musk, Emanuel Derman, endowment effect, epigenetics, Ernest Rutherford, experimental economics, financial engineering, Flash crash, friendly AI, functional fixedness, global pandemic, Google Glasses, Great Leap Forward, Hans Moravec, hive mind, Ian Bogost, income inequality, information trail, Internet of things, invention of writing, iterative process, James Webb Space Telescope, Jaron Lanier, job automation, Johannes Kepler, John Markoff, John von Neumann, Kevin Kelly, knowledge worker, Large Hadron Collider, lolcat, loose coupling, machine translation, microbiome, mirror neurons, Moneyball by Michael Lewis explains big data, Mustafa Suleyman, natural language processing, Network effects, Nick Bostrom, Norbert Wiener, paperclip maximiser, pattern recognition, Peter Singer: altruism, phenotype, planetary scale, Ray Kurzweil, Recombinant DNA, recommendation engine, Republic of Letters, RFID, Richard Thaler, Rory Sutherland, Satyajit Das, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Skype, smart contracts, social intelligence, speech recognition, statistical model, stem cell, Stephen Hawking, Steve Jobs, Steven Pinker, Stewart Brand, strong AI, Stuxnet, superintelligent machines, supervolcano, synthetic biology, systems thinking, tacit knowledge, TED Talk, the scientific method, The Wisdom of Crowds, theory of mind, Thorstein Veblen, too big to fail, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, We are as Gods, Y2K

What distinguishes natural from artificial intelligence is not what it is but only how it’s made. Of course, that little word only is doing some heavy lifting here. Brains use a highly parallel architecture and mobilize many noisy analog units (i.e., neurons) firing simultaneously, while most computers use von Neumann architecture, with serial operation of much faster digital units. These distinctions are blurring, however, from both ends. Neural-net architectures are built in silicon, and brains interact ever more seamlessly with external digital organs. Already I feel that my laptop is an extension of my self—in particular, it is a repository for both visual and narrative memory, a sensory portal into the outside world, and a big part of my mathematical digestive system. 2.

This is far from obvious; we lack any data, either way. I personally think that consciousness is incredibly more complex than is currently assumed by the “experts.” A human being is not merely x numbers of axons and synapses, and we have no reason to assume that we can count our flops-per-second in a plain Von Neumann architecture, reach a certain number, and suddenly out pops a thinking machine. If true consciousness can emerge, let’s be clear what that could entail. If the machine is truly aware, it will, by definition, develop a “personality.” It may be irascible, flirtatious, maybe the ultimate know-it-all, possibly incredibly full of itself.


pages: 211 words: 57,618

Quantum Computing for Everyone by Chris Bernhardt

Albert Einstein, complexity theory, correlation does not imply causation, discrete time, John von Neumann, low earth orbit, P = NP, quantum cryptography, quantum entanglement, reversible computing, Richard Feynman, selection bias, Turing machine, Von Neumann architecture

There is nothing special about this. Exactly the same argument can be used whatever value is chosen for this bit. The inability to clone a qubit has many important consequences. We want to be able to back up files and send copies of files to other people. Copying is ubiquitous. Our everyday computers are based on von Neumann architecture, which is heavily based on the ability to copy. When we run a program we are always copying bits from one place to another. In quantum computing this is not possible for general qubits. So, if programmable quantum computers are designed they will not be based on our current architecture.
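
The standard linearity argument behind the no-cloning claim can be sketched in a few lines; what follows is a textbook-style derivation, not a quotation from the book. Suppose some unitary operation \(U\) could copy an arbitrary qubit into a blank register:

\[
U\bigl(|\psi\rangle \otimes |0\rangle\bigr) = |\psi\rangle \otimes |\psi\rangle \quad \text{for every state } |\psi\rangle .
\]

Applying this to \(|0\rangle\) and \(|1\rangle\) gives \(U|00\rangle = |00\rangle\) and \(U|10\rangle = |11\rangle\). For the superposition \(|{+}\rangle = \tfrac{1}{\sqrt{2}}(|0\rangle + |1\rangle)\), linearity then forces

\[
U\bigl(|{+}\rangle \otimes |0\rangle\bigr) = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr),
\]

whereas a genuine copy would have to be \(|{+}\rangle \otimes |{+}\rangle = \tfrac{1}{2}\bigl(|00\rangle + |01\rangle + |10\rangle + |11\rangle\bigr)\). The two states differ, so no single \(U\) can clone every qubit, which is why the copy-everywhere habits of von Neumann machines cannot simply be carried over.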


pages: 894 words: 190,485

Write Great Code, Volume 1 by Randall Hyde

AltaVista, business process, Donald Knuth, John von Neumann, level 1 cache, locality of reference, machine readable, Von Neumann architecture, Y2K

Writing great code requires a strong knowledge of the computer’s architecture.

6.1 The Basic System Components

The basic operational design of a computer system is called its architecture. John von Neumann, a pioneer in computer design, is given credit for the principal architecture in use today. For example, the 80x86 family uses the von Neumann architecture (VNA). A typical von Neumann system has three major components: the central processing unit (CPU), memory, and input/output (I/O), as shown in Figure 6-1.

Figure 6-1. Typical von Neumann machine

In VNA machines, like the 80x86, the CPU is where all the action takes place. All computations occur within the CPU.

The extra pins needed on the processor to support two physically separate buses increase the cost of the processor and introduce many other engineering problems. However, microprocessor designers have discovered that they can obtain many benefits of the Harvard architecture with few of the disadvantages by using separate on-chip caches for data and instructions. Advanced CPUs use an internal Harvard architecture and an external von Neumann architecture. Figure 9-9 shows the structure of the 80x86 with separate data and instruction caches. Each path between the sections inside the CPU represents an independent bus, and data can flow on all paths concurrently. This means that the prefetch queue can be pulling instruction opcodes from the instruction cache while the execution unit is writing data to the data cache.


pages: 968 words: 224,513

The Art of Assembly Language by Randall Hyde

Donald Knuth, Free Software Foundation, machine readable, P = NP, p-value, sorting algorithm, Von Neumann architecture, Y2K

There hasn't been any real assembly language. Before we can progress any further and learn some real assembly language, a detour is necessary; unless you understand the basic structure of the Intel 80x86 CPU family, the machine instructions will make little sense. The Intel CPU family is generally classified as a Von Neumann Architecture Machine. Von Neumann computer systems contain three main building blocks: the central processing unit (CPU), memory, and input/output (I/O) devices. These three components are interconnected using the system bus (consisting of the address, data, and control buses). The block diagram in Figure 1-4 shows this relationship.

The first form we've been using throughout this chapter, so there is little need to discuss it here. The second form, the register indirect call, calls the procedure whose address is held in the specified 32-bit register. The address of a procedure is the byte address of the first instruction to execute within that procedure. Remember, on a Von Neumann architecture machine (like the 80x86), the system stores machine instructions in memory along with other data. The CPU fetches the instruction opcode values from memory prior to executing them. When you execute the register indirect call instruction, the 80x86 first pushes the return address onto the stack and then begins fetching the next opcode byte (instruction) from the address specified by the register's value.


pages: 332 words: 93,672

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy by George Gilder

23andMe, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AlphaGo, AltaVista, Amazon Web Services, AOL-Time Warner, Asilomar, augmented reality, Ben Horowitz, bitcoin, Bitcoin Ponzi scheme, Bletchley Park, blockchain, Bob Noyce, British Empire, Brownian motion, Burning Man, business process, butterfly effect, carbon footprint, cellular automata, Claude Shannon: information theory, Clayton Christensen, cloud computing, computer age, computer vision, crony capitalism, cross-subsidies, cryptocurrency, Danny Hillis, decentralized internet, deep learning, DeepMind, Demis Hassabis, disintermediation, distributed ledger, don't be evil, Donald Knuth, Donald Trump, double entry bookkeeping, driverless car, Elon Musk, Erik Brynjolfsson, Ethereum, ethereum blockchain, fake news, fault tolerance, fiat currency, Firefox, first square of the chessboard, first square of the chessboard / second half of the chessboard, floating exchange rates, Fractional reserve banking, game design, Geoffrey Hinton, George Gilder, Google Earth, Google Glasses, Google Hangouts, index fund, inflation targeting, informal economy, initial coin offering, Internet of things, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, Jim Simons, Joan Didion, John Markoff, John von Neumann, Julian Assange, Kevin Kelly, Law of Accelerating Returns, machine translation, Marc Andreessen, Mark Zuckerberg, Mary Meeker, means of production, Menlo Park, Metcalfe’s law, Money creation, money: store of value / unit of account / medium of exchange, move fast and break things, Neal Stephenson, Network effects, new economy, Nick Bostrom, Norbert Wiener, Oculus Rift, OSI model, PageRank, pattern recognition, Paul Graham, peer-to-peer, Peter Thiel, Ponzi scheme, prediction markets, quantitative easing, random walk, ransomware, Ray Kurzweil, reality distortion field, Recombinant DNA, Renaissance Technologies, Robert Mercer, Robert Metcalfe, Ronald Coase, Ross Ulbricht, Ruby on Rails, Sand Hill Road, Satoshi Nakamoto, Search for Extraterrestrial Intelligence, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Singularitarianism, Skype, smart contracts, Snapchat, Snow Crash, software is eating the world, sorting algorithm, South Sea Bubble, speech recognition, Stephen Hawking, Steve Jobs, Steven Levy, Stewart Brand, stochastic process, Susan Wojcicki, TED Talk, telepresence, Tesla Model S, The Soul of a New Machine, theory of mind, Tim Cook: Apple, transaction costs, tulip mania, Turing complete, Turing machine, Vernor Vinge, Vitalik Buterin, Von Neumann architecture, Watson beat the top human players on Jeopardy!, WikiLeaks, Y Combinator, zero-sum game

A fifty-seven-year-old, brown-haired engineer with a black hat and backpack and hiking boots, he is dressed Silicon-Valley-mountaineer style to take me on a high-altitude adventure in microchips and software, ideas and speculations, Google maps and Elon Musk “reality distortion fields” down Route 101 at five o’clock on a late-August Friday evening. It’s not quite Doctor Brown’s Back to the Future ride in a DeLorean, but it will suffice for some modest time-travel in the history of computing. Since writing his college thesis in the late 1970s, Dally has rebelled against the serial step-by-step computing regime known as the von Neumann architecture. After working on the “Cosmic Cube” under Chuck Seitz for his Ph.D. at Caltech (1983), Dally has led design of parallel machines at MIT (the J-machine and the M-machine), introduced massive parallelism to Cray supercomputers (the T-3D and 3E), and pioneered parallel graphics at Stanford (the Imagine project, a streaming parallel device incorporating programmable “shaders,” now ubiquitous in the industry’s graphic processors from Nvidia and others).


pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden

Benevolent Dictator For Life (BDFL), business intelligence, business logic, business process, cellular automata, cloud computing, cognitive load, commoditize, complexity theory, conceptual framework, continuous integration, data acquisition, Dennis Ritchie, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, functional programming, general-purpose programming language, Guido van Rossum, higher-order functions, history of Unix, HyperCard, industrial research laboratory, information retrieval, information security, iterative process, Ivan Sutherland, John von Neumann, Ken Thompson, Larry Ellison, Larry Wall, linear programming, loose coupling, machine readable, machine translation, Mars Rover, millennium bug, Multics, NP-complete, Paul Graham, performance metric, Perl 6, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Ruby on Rails, Sapir-Whorf hypothesis, seminal paper, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, SQL injection, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

To me, that all seems like wishful thinking. At the same time, I don’t know what will happen. There could be a quantum jump where, even though the computers that we know don’t actually change, a different kind of platform suddenly becomes much more prevalent and the rules are different. Perhaps a shift away from the von Neumann architecture? Guido: I wasn’t even thinking of that, but that’s certainly also a possibility. I was more thinking of what if mobile phones become the ubiquitous computing device. Mobile phones are only a few years behind the curve of the power of regular laptops, which suggests that in a few years, mobile phones, apart from the puny keyboard and screen, will have enough computing power so that you don’t need a laptop anymore.

They were very innovative and spawned a lot of follow-on work over the years. Unfortunately, there were a few problems that I didn’t solve, and neither did anybody else. So here was a promising idea, but it just didn’t quite work in the long run. I pulled some of those ideas into UML, but data flow architecture doesn’t seem to replace von Neumann architecture in most cases. So I had my shot and didn’t quite make it. There are also cellular automata. I think over half of my fellow grad students tried to build on them a highly parallel computer. That has to be the right approach, because that’s how the universe is constructed. (Or maybe not. Modern physics is stranger than fiction.


pages: 317 words: 101,074

The Road Ahead by Bill Gates, Nathan Myhrvold, Peter Rinearson

Albert Einstein, Apple's 1984 Super Bowl advert, Berlin Wall, Bill Gates: Altair 8800, Bob Noyce, Bonfire of the Vanities, business process, California gold rush, Charles Babbage, Claude Shannon: information theory, computer age, Donald Knuth, first square of the chessboard, first square of the chessboard / second half of the chessboard, glass ceiling, global village, informal economy, invention of movable type, invention of the printing press, invention of writing, John von Neumann, knowledge worker, medical malpractice, Mitch Kapor, new economy, packet switching, popular electronics, Richard Feynman, Ronald Reagan, SimCity, speech recognition, Steve Ballmer, Steve Jobs, Steven Pinker, Ted Nelson, telemarketer, the scientific method, The Wealth of Nations by Adam Smith, transaction costs, Turing machine, Turing test, Von Neumann architecture

John von Neumann, a brilliant Hungarian-born American, who is known for many things, including the development of game theory and his contributions to nuclear weaponry, is credited with the leading role in figuring out a way around this problem. He created the paradigm that all digital computers still follow. The "von Neumann architecture," as it is known today, is based on principles he articulated in 1945—including the principle that a computer could avoid cabling changes by storing instructions in its memory. As soon as this idea was put into practice, the modern computer was born. Today the brains of most computers are descendants of the microprocessor Paul Allen and I were so knocked out by in the seventies, and personal computers often are rated according to how many bits of information (one switch in the lighting example) their microprocessor can process at a time, or how many bytes (a cluster of eight bits) of memory or disk-based storage they have.


pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive by Brian Christian

"Friedman doctrine" OR "shareholder theory", 4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bertrand Russell: In Praise of Idleness, Blue Ocean Strategy, carbon footprint, cellular automata, Charles Babbage, Claude Shannon: information theory, cognitive dissonance, commoditize, complexity theory, Computing Machinery and Intelligence, crowdsourcing, David Heinemeier Hansson, Donald Trump, Douglas Hofstadter, George Akerlof, Gödel, Escher, Bach, high net worth, Isaac Newton, Jacques de Vaucanson, Jaron Lanier, job automation, Kaizen: continuous improvement, Ken Thompson, l'esprit de l'escalier, language acquisition, Loebner Prize, machine translation, Menlo Park, operational security, Ray Kurzweil, RFID, Richard Feynman, Ronald Reagan, SimCity, Skype, Social Responsibility of Business Is to Increase Its Profits, starchitect, statistical model, Stephen Hawking, Steve Jobs, Steven Pinker, Thales of Miletus, theory of mind, Thomas Bayes, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero-sum game

Ray Kurzweil (in 2005’s The Singularity Is Near), among several other computer scientists, speaks of a utopian future where we shed our bodies and upload our minds into computers and live forever, virtual, immortal, disembodied. Heaven for hackers. To Ackley’s point, most work on computation has not traditionally been on dynamic systems, or interactive ones, or ones integrating data from the real world in real time. Indeed, theoretical models of the computer—the Turing machine, the von Neumann architecture—seem like reproductions of an idealized version of conscious, deliberate reasoning. As Ackley puts it, “The von Neumann machine is an image of one’s conscious mind where you tend to think: you’re doing long division, and you run this algorithm step-by-step. And that’s not how brains operate.


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

"Friedman doctrine" OR "shareholder theory", Ada Lovelace, AI winter, air gap, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, AlphaGo, Andy Rubin, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Bayesian statistics, behavioural economics, Bernie Sanders, Big Tech, bioinformatics, Black Lives Matter, blockchain, Bretton Woods, business intelligence, Cambridge Analytica, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, Computing Machinery and Intelligence, CRISPR, cross-border payments, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, Demis Hassabis, Deng Xiaoping, disinformation, distributed ledger, don't be evil, Donald Trump, Elon Musk, fail fast, fake news, Filter Bubble, Flynn Effect, Geoffrey Hinton, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Herman Kahn, high-speed rail, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, machine translation, Mark Zuckerberg, Menlo Park, move fast and break things, Mustafa Suleyman, natural language processing, New Urbanism, Nick Bostrom, one-China policy, optical character recognition, packet switching, paperclip maximiser, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, Recombinant DNA, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Salesforce, Sand Hill Road, Second Machine Age, self-driving car, seminal paper, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, surveillance capitalism, technological singularity, The Coming Technological Singularity, the long tail, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

If you don’t have enough of either, the machine will start running hot, or you’ll get an error message, or it will simply shut down. It’s a problem known as the “von Neumann bottleneck.” No matter how fast the processor is capable of working, instructions and data have to be shuttled between memory and the processor over a shared path, and the speed of that transfer, not the speed of the processor, sets the ceiling. Just about all of our current computers are based on the von Neumann architecture, and the problem is that existing processors can’t execute programs any faster than they’re able to retrieve instructions and data from memory. The bottleneck is a big problem for AI. Right now, when you talk to your Alexa or Google Home, your voice is being recorded, parsed, and then transmitted to the cloud for a response—given the physical distance between you and the various data centers involved, it’s mind-blowing that Alexa can talk back within a second or two.
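
A back-of-envelope calculation shows why the shared path matters; the bandwidth and instruction-size figures below are round numbers chosen only for illustration.

    # If every instruction and its operands must cross one path between
    # memory and the processor, that path caps the instruction rate no
    # matter how fast the processor itself is. Illustrative figures only.

    bus_bytes_per_second = 25e9      # what the shared memory path can move
    bytes_per_instruction = 8        # say 4 bytes of instruction + 4 bytes of data

    ceiling = bus_bytes_per_second / bytes_per_instruction
    print(f"at most {ceiling:.1e} instructions per second")   # ~3.1e9

    # A core that could otherwise retire 10e9 instructions per second is
    # throttled to roughly a third of that by the memory path, which is
    # why caches, wider buses and near-memory compute matter.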


pages: 377 words: 97,144

Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World by James D. Miller

23andMe, affirmative action, Albert Einstein, artificial general intelligence, Asperger Syndrome, barriers to entry, brain emulation, cloud computing, cognitive bias, correlation does not imply causation, crowdsourcing, Daniel Kahneman / Amos Tversky, David Brooks, David Ricardo: comparative advantage, Deng Xiaoping, en.wikipedia.org, feminist movement, Flynn Effect, friendly AI, hive mind, impulse control, indoor plumbing, invention of agriculture, Isaac Newton, John Gilmore, John von Neumann, knowledge worker, Larry Ellison, Long Term Capital Management, low interest rates, low skilled workers, Netflix Prize, neurotypical, Nick Bostrom, Norman Macrae, pattern recognition, Peter Thiel, phenotype, placebo effect, prisoner's dilemma, profit maximization, Ray Kurzweil, recommendation engine, reversible computing, Richard Feynman, Rodney Brooks, Silicon Valley, Singularitarianism, Skype, statistical model, Stephen Hawking, Steve Jobs, sugar pill, supervolcano, tech billionaire, technological singularity, The Coming Technological Singularity, the scientific method, Thomas Malthus, transaction costs, Turing test, twin studies, Vernor Vinge, Von Neumann architecture

To understand why, let me tell you a bit about von Neumann. Although a fantastic scientist, a pathbreaking economist, and one of the best mathematicians of the twentieth century, von Neumann also possessed fierce practical skills. He was, arguably, the creator of the modern digital computer.11 The computer architecture he developed, now called “von Neumann architecture,” lies at the heart of most computers.12 Von Neumann’s brains took him to the centers of corporate power, and he did high-level consulting work for many private businesses, including Standard Oil, for which he helped to extract more resources from dried-out wells.13 Johnny (as his biographer often calls him in tribute to von Neumann’s unpretentious nature) was described as having “the invaluable faculty of being able to take the most difficult problem, separate it into its components, whereupon everything looked brilliantly simple. . . .”14 During World War II, von Neumann became the world’s leading expert on explosives and used this knowledge to help build better conventional bombs, thwart German sea mines, and determine the optimal altitude for airborne detonations. 15 Johnny functioned as a human computer as a part of the Manhattan Project’s efforts to create fission bombs. 16 Whereas atomic weapons developers today use computers to decipher the many mathematical equations that challenge their trade, the Manhattan Project’s scientists had to rely on human intellect alone.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, AlphaGo, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, behavioural economics, Bletchley Park, blockchain, Boston Dynamics, brain emulation, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, complexity theory, computer vision, Computing Machinery and Intelligence, connected car, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, fake news, Flash crash, full employment, future of work, Garrett Hardin, Geoffrey Hinton, Gerolamo Cardano, Goodhart's law, Hans Moravec, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, luminiferous ether, machine readable, machine translation, Mark Zuckerberg, multi-armed bandit, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, OpenAI, openstreetmap, P = NP, paperclip maximiser, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, trolley problem, Turing machine, Turing test, universal basic income, uranium enrichment, vertical integration, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, web application, zero-sum game

It was all the more remarkable for the fact that, unlike monetary amounts, the utility values of various bets and prizes are not directly observable; instead, utilities are to be inferred from the preferences exhibited by an individual. It would be two centuries before the implications of the idea were fully worked out and it became broadly accepted by statisticians and economists. In the middle of the twentieth century, John von Neumann (a great mathematician after whom the standard “von Neumann architecture” for computers was named16) and Oskar Morgenstern published an axiomatic basis for utility theory.17 What this means is the following: as long as the preferences exhibited by an individual satisfy certain basic axioms that any rational agent should satisfy, then necessarily the choices made by that individual can be described as maximizing the expected value of a utility function.
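
A minimal sketch of what "maximizing expected utility" means in practice, with the lotteries and the logarithmic utility function invented purely for illustration:

    # Expected utility: weight the utility of each outcome by its probability
    # and choose the lottery with the larger total. The concave (logarithmic)
    # utility models diminishing returns to wealth; all numbers are invented.

    import math

    def utility(money):
        return math.log1p(money)

    lottery_a = [(0.5, 0), (0.5, 100)]   # a coin flip: nothing or 100
    lottery_b = [(1.0, 45)]              # a sure 45

    def expected_utility(lottery):
        return sum(p * utility(x) for p, x in lottery)

    for name, lottery in (("A", lottery_a), ("B", lottery_b)):
        print(name, round(expected_utility(lottery), 3))       # B scores higher

Ranked by expected money the gamble wins (50 versus 45), yet an agent with this utility function prefers the sure thing; the von Neumann–Morgenstern result says only that, for preferences obeying the axioms, some utility function makes the agent's choices look like expected-utility maximization.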


The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Atkinson, Bill Duvall, Bill Gates: Altair 8800, Bletchley Park, Boeing 747, Byte Shop, Charles Babbage, Claude Shannon: information theory, Compatible Time-Sharing System, computer age, Computing Machinery and Intelligence, conceptual framework, cuban missile crisis, Dennis Ritchie, do well by doing good, Donald Davies, double helix, Douglas Engelbart, Douglas Engelbart, Dynabook, experimental subject, Fairchild Semiconductor, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, functional programming, Gary Kildall, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, Ivan Sutherland, James Watt: steam engine, Jeff Rulifson, John von Neumann, Ken Thompson, Leonard Kleinrock, machine translation, Marc Andreessen, Menlo Park, Multics, New Journalism, Norbert Wiener, packet switching, pink-collar, pneumatic tube, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Skinner box, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, The Soul of a New Machine, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game

Moreover, he had a point: it was only in the late 1970s, with the availability of reliable and inexpensive microchips, that computer scientists would begin serious experimentation with "parallel" computers that could carry out many operations simultaneously. To this day, the vast majority of computers in the world, including essentially all personal computers, are still based on the serial, step-by-step "von Neumann" architecture. Von Neumann mailed off his handwritten manuscript to Goldstine at the Moore School in late June 1945. He may well have felt rushed at that point, since the Trinity test of the plutonium bomb was less than three weeks away (it would take place on July 16). But in any case, he left numerous blank spaces for names, references, and other information that he planned to insert after his colleagues had had a chance to comment.

Once they were in, moreover, assign each of them a securely walled-off piece of the computer's memory where they could store data and programming code without anybody else's horning in. And finally, when the users needed some actual processing power, dole it out to them via an artful trick. You couldn't literally divide a computer's central processing unit, McCarthy knew; the standard von Neumann architecture allowed for only one such unit, which could carry out only one operation at a time. However, even the slowest electronic computer was very, very fast on any human time scale. So, McCarthy wondered, why not let the CPU skip from one user's memory area to the next user's in sequence, executing a few steps of each task as it went?
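The passage above describes McCarthy's time-sharing trick: one CPU hops from user to user, executing a few steps of each task per turn. Here is a toy Python sketch of that round-robin idea; the user names, step counts, and the three-step time slice are illustrative assumptions:

```python
# Toy illustration of time-sharing: one "CPU" interleaves a few steps of
# each user's task in turn, so every user sees steady progress.
def user_task(name, total_steps):
    for step in range(1, total_steps + 1):
        yield f"{name}: step {step}/{total_steps}"

def round_robin(tasks, slice_size=3):
    """Give each task slice_size steps, then move to the next, repeatedly."""
    queue = list(tasks)
    while queue:
        task = queue.pop(0)
        for _ in range(slice_size):
            try:
                print(next(task))
            except StopIteration:
                break                   # this task is finished; drop it
        else:
            queue.append(task)          # not finished: back to the end of the line

round_robin([user_task("alice", 5), user_task("bob", 7), user_task("carol", 4)])
```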


pages: 492 words: 118,882

The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory by Kariappa Bheemaiah

"World Economic Forum" Davos, accounting loophole / creative accounting, Ada Lovelace, Adam Curtis, Airbnb, Alan Greenspan, algorithmic trading, asset allocation, autonomous vehicles, balance sheet recession, bank run, banks create money, Basel III, basic income, behavioural economics, Ben Bernanke: helicopter money, bitcoin, Bletchley Park, blockchain, Bretton Woods, Brexit referendum, business cycle, business process, call centre, capital controls, Capital in the Twenty-First Century by Thomas Piketty, cashless society, cellular automata, central bank independence, Charles Babbage, Claude Shannon: information theory, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, complexity theory, constrained optimization, corporate governance, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, cross-border payments, crowdsourcing, cryptocurrency, data science, David Graeber, deep learning, deskilling, Diane Coyle, discrete time, disruptive innovation, distributed ledger, diversification, double entry bookkeeping, Ethereum, ethereum blockchain, fiat currency, financial engineering, financial innovation, financial intermediation, Flash crash, floating exchange rates, Fractional reserve banking, full employment, George Akerlof, Glass-Steagall Act, Higgs boson, illegal immigration, income inequality, income per capita, inflation targeting, information asymmetry, interest rate derivative, inventory management, invisible hand, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Joseph Schumpeter, junk bonds, Kenneth Arrow, Kenneth Rogoff, Kevin Kelly, knowledge economy, large denomination, Large Hadron Collider, Lewis Mumford, liquidity trap, London Whale, low interest rates, low skilled workers, M-Pesa, machine readable, Marc Andreessen, market bubble, market fundamentalism, Mexican peso crisis / tequila crisis, Michael Milken, MITM: man-in-the-middle, Money creation, money market fund, money: store of value / unit of account / medium of exchange, mortgage debt, natural language processing, Network effects, new economy, Nikolai Kondratiev, offshore financial centre, packet switching, Pareto efficiency, pattern recognition, peer-to-peer lending, Ponzi scheme, power law, precariat, pre–internet, price mechanism, price stability, private sector deleveraging, profit maximization, QR code, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, Real Time Gross Settlement, rent control, rent-seeking, robo advisor, Satoshi Nakamoto, Satyajit Das, Savings and loan crisis, savings glut, seigniorage, seminal paper, Silicon Valley, Skype, smart contracts, software as a service, software is eating the world, speech recognition, statistical model, Stephen Hawking, Stuart Kauffman, supply-chain management, technology bubble, The Chicago School, The Future of Employment, The Great Moderation, the market place, The Nature of the Firm, the payments system, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, trade liberalization, transaction costs, Turing machine, Turing test, universal basic income, Vitalik Buterin, Von Neumann architecture, Washington Consensus

It was during the time of developing ENIAC that he met the renowned polymath, John von Neumann, and with his help went on to design a stored-program computer, the EDVAC (Electronic Discrete Variable Automatic Computer), the first binary computer (ENIAC was decimal). See Figure 4-11. [Figure 4-11: General design of the Electronic Discrete Variable Automatic Computer. Source: ‘The von Neumann Architecture’, The Computing Universe, 2014.] From an abstract architecture perspective, von Neumann’s design is logically equivalent to Turing’s Universal Turing Machine. In fact, von Neumann had read Turing’s theoretical papers prior to designing his machine. Ultimately it was this simple design that was built upon by successive generations of computer scientists and led to the design of computers with multiple processors and the creation of parallel computing.


pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise by Nathan L. Ensmenger

barriers to entry, business process, Charles Babbage, Claude Shannon: information theory, computer age, deskilling, Donald Knuth, Firefox, Frederick Winslow Taylor, functional programming, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, Larry Ellison, loose coupling, machine readable, new economy, no silver bullet, Norbert Wiener, pattern recognition, performance metric, Philip Mirowski, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, scientific management, Shoshana Zuboff, sorting algorithm, Steve Jobs, Steven Levy, systems thinking, tacit knowledge, technological determinism, the market place, The Theory of the Leisure Class by Thorstein Veblen, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, world market for maybe five computers, Y2K

In 1945–1946, von Neumann circulated an informal “First Draft of a Report on the EDVAC,” which described the EDVAC in terms of its logical structure, using notation borrowed from neurophysiology. Ignoring most of the physical details of the EDVAC design, such as its vacuum tube circuitry, von Neumann focused instead on the main functional units of the computer: its arithmetic unit, memory, and input and output. The “von Neumann architecture,” as it came to be known, served as the logical basis for almost all computers designed in subsequent decades. By abstracting the logical design of the digital computer from any particular physical implementation, von Neumann took a crucial first step in the development of a modern theory of computation.55 His was not the only contribution; in 1937, for example, Turing had described, for the purposes of demonstrating the limits of computation, what would become known as the Universal Turing Machine.
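The functional split described above (arithmetic unit, memory, input and output, with the program stored alongside the data) can be made concrete with a small sketch. The following Python toy machine is an illustrative assumption of this note, not von Neumann's actual design: a single memory holds both instructions and data, and a fetch-decode-execute loop drives an accumulator-style arithmetic unit.

```python
# Minimal stored-program machine: one memory holds both instructions and data,
# and a fetch-decode-execute loop drives an accumulator.
def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]             # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":                 # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share one address space: cells 0-3 hold code, cells 4-6 hold data.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])   # prints 5: the two data cells were added and stored
```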


pages: 436 words: 127,642

When Einstein Walked With Gödel: Excursions to the Edge of Thought by Jim Holt

Ada Lovelace, Albert Einstein, Andrew Wiles, anthropic principle, anti-communist, Arthur Eddington, Benoit Mandelbrot, Bletchley Park, Brownian motion, cellular automata, Charles Babbage, classic study, computer age, CRISPR, dark matter, David Brooks, Donald Trump, Dr. Strangelove, Eddington experiment, Edmond Halley, everywhere but in the productivity statistics, Fellow of the Royal Society, four colour theorem, Georg Cantor, George Santayana, Gregor Mendel, haute couture, heat death of the universe, Henri Poincaré, Higgs boson, inventory management, Isaac Newton, Jacquard loom, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, Large Hadron Collider, Long Term Capital Management, Louis Bachelier, luminiferous ether, Mahatma Gandhi, mandelbrot fractal, Monty Hall problem, Murray Gell-Mann, new economy, Nicholas Carr, Norbert Wiener, Norman Macrae, Paradox of Choice, Paul Erdős, Peter Singer: altruism, Plato's cave, power law, probability theory / Blaise Pascal / Pierre de Fermat, quantum entanglement, random walk, Richard Feynman, Robert Solow, Schrödinger's Cat, scientific worldview, Search for Extraterrestrial Intelligence, selection bias, Skype, stakhanovite, Stephen Hawking, Steven Pinker, Thorstein Veblen, Turing complete, Turing machine, Turing test, union organizing, Vilfredo Pareto, Von Neumann architecture, wage slave

It thus fell short of the modern computer, which stores its instructions in the form of coded numbers, or “software.” Von Neumann aspired to create a truly universal machine, one that (as Dyson aptly puts it) “broke the distinction between numbers that mean things and numbers that do things.” A report sketching the architecture for such a machine—still known as the von Neumann architecture—was drawn up and circulated toward the end of the war. Although the report contained design ideas from the ENIAC inventors, von Neumann was listed as the sole author, which occasioned some grumbling among the uncredited. And the report had another curious omission. It failed to mention the man who, as von Neumann well knew, had originally worked out the possibility of a universal computer: Alan Turing.


pages: 303 words: 67,891

Advances in Artificial General Intelligence: Concepts, Architectures and Algorithms: Proceedings of the Agi Workshop 2006 by Ben Goertzel, Pei Wang

AI winter, artificial general intelligence, backpropagation, bioinformatics, brain emulation, classic study, combinatorial explosion, complexity theory, computer vision, Computing Machinery and Intelligence, conceptual framework, correlation coefficient, epigenetics, friendly AI, functional programming, G4S, higher-order functions, information retrieval, Isaac Newton, Jeff Hawkins, John Conway, Loebner Prize, Menlo Park, natural language processing, Nick Bostrom, Occam's razor, p-value, pattern recognition, performance metric, precautionary principle, Ray Kurzweil, Rodney Brooks, semantic web, statistical model, strong AI, theory of mind, traveling salesman, Turing machine, Turing test, Von Neumann architecture, Y2K

Since the number of processing units is constant, as is the capacity of each unit, they will need to be shared by the concepts, because the system as a whole will produce new concepts from time to time, and their number will soon exceed the number of processing units. Consequently, the system still needs time-sharing and space-sharing; the only difference is that what is shared is not a single CPU and RAM, but many processing units. Some people blame the von Neumann architecture of the computer for the past failures of AI, but the argument is not convincing. It is true that the current computer architecture was not designed especially for AI, but it has not been proved that it cannot be used to implement a truly intelligent system. Special hardware is optional for NARS, since the system can be fully implemented on the current hardware/software platform, though special hardware would surely make it work better. 3.6 Evolution Under the assumption of insufficient knowledge, all object-level knowledge in NARS can be modified by the system’s various learning mechanisms.


pages: 502 words: 132,062

Ways of Being: Beyond Human Intelligence by James Bridle

Ada Lovelace, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Anthropocene, Any sufficiently advanced technology is indistinguishable from magic, autonomous vehicles, behavioural economics, Benoit Mandelbrot, Berlin Wall, Big Tech, Black Lives Matter, blockchain, Californian Ideology, Cambridge Analytica, carbon tax, Charles Babbage, cloud computing, coastline paradox / Richardson effect, Computing Machinery and Intelligence, corporate personhood, COVID-19, cryptocurrency, DeepMind, Donald Trump, Douglas Hofstadter, Elon Musk, experimental subject, factory automation, fake news, friendly AI, gig economy, global pandemic, Gödel, Escher, Bach, impulse control, James Bridle, James Webb Space Telescope, John von Neumann, Kickstarter, Kim Stanley Robinson, language acquisition, life extension, mandelbrot fractal, Marshall McLuhan, microbiome, music of the spheres, negative emissions, Nick Bostrom, Norbert Wiener, paperclip maximiser, pattern recognition, peer-to-peer, planetary scale, RAND corporation, random walk, recommendation engine, self-driving car, SETI@home, shareholder value, Silicon Valley, Silicon Valley ideology, speech recognition, statistical model, surveillance capitalism, techno-determinism, technological determinism, technoutopianism, the long tail, the scientific method, The Soul of a New Machine, theory of mind, traveling salesman, trolley problem, Turing complete, Turing machine, Turing test, UNCLOS, undersea cable, urban planning, Von Neumann architecture, wikimedia commons, zero-sum game

In his proposal for the EDVAC, the first all-digital, stored-program computer, von Neumann specified a particular architecture: a single connection, or ‘bus’, between the memory and the central processor, meaning that the computer could not fetch data and execute commands at the same time. Today, just as almost all computers are based on Turing’s a-machine, almost all computers use the von Neumann architecture. But a problem results from this: the central processing unit (CPU) is constantly forced to wait on required information as it is moved into or out of memory, which can result in a serious drag on its processing speed. The original decision to build computers this way was made for reasons of simplicity, but it means that significant amounts of computer time, of software design and of electrical energy are expended on moving information around rather than doing anything with it.
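A back-of-the-envelope sketch of the bottleneck just described, in Python; the per-iteration instruction count, the one-word-per-bus-transfer assumption, and the example loop are illustrative guesses rather than measurements of any real machine:

```python
# Rough illustration of the von Neumann bottleneck: for a simple running sum,
# count words moved over the single memory bus versus useful additions.
n = 1_000                      # assumed number of data values in memory

instruction_fetches = 3 * n    # assume ~3 instruction words fetched per loop iteration
data_reads = n                 # one operand read per addition
useful_additions = n

bus_words_moved = instruction_fetches + data_reads
print(f"{bus_words_moved} words over the bus for {useful_additions} additions")
print(f"ratio: {bus_words_moved / useful_additions:.1f} bus transfers per useful operation")
# With a single shared bus, the processor sits idle during most of these transfers.
```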


pages: 528 words: 146,459

Computer: A History of the Information Machine by Martin Campbell-Kelly, William Aspray, Nathan L. Ensmenger, Jeffrey R. Yost

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, Bletchley Park, borderless world, Buckminster Fuller, Build a better mousetrap, Byte Shop, card file, cashless society, Charles Babbage, cloud computing, combinatorial explosion, Compatible Time-Sharing System, computer age, Computer Lib, deskilling, don't be evil, Donald Davies, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Jenner, Evgeny Morozov, Fairchild Semiconductor, fault tolerance, Fellow of the Royal Society, financial independence, Frederick Winslow Taylor, game design, garden city movement, Gary Kildall, Grace Hopper, Herman Kahn, hockey-stick growth, Ian Bogost, industrial research laboratory, informal economy, interchangeable parts, invention of the wheel, Ivan Sutherland, Jacquard loom, Jeff Bezos, jimmy wales, John Markoff, John Perry Barlow, John von Neumann, Ken Thompson, Kickstarter, light touch regulation, linked data, machine readable, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Mitch Kapor, Multics, natural language processing, Network effects, New Journalism, Norbert Wiener, Occupy movement, optical character recognition, packet switching, PageRank, PalmPilot, pattern recognition, Pierre-Simon Laplace, pirate software, popular electronics, prediction markets, pre–internet, QWERTY keyboard, RAND corporation, Robert X Cringely, Salesforce, scientific management, Silicon Valley, Silicon Valley startup, Steve Jobs, Steven Levy, Stewart Brand, Ted Nelson, the market place, Turing machine, Twitter Arab Spring, Vannevar Bush, vertical integration, Von Neumann architecture, Whole Earth Catalog, William Shockley: the traitorous eight, women in the workforce, young professional

Although the 101-page report was in draft form, with many references left incomplete, twenty-four copies were immediately distributed to people closely associated with Project PY. Von Neumann’s sole authorship of the report seemed unimportant at the time, but it later led to his being given sole credit for the invention of the modern computer. Today, computer scientists routinely speak of “the von Neumann architecture” in preference to the more prosaic “stored-program concept”; this has done an injustice to von Neumann’s co-inventors. Although von Neumann’s EDVAC Report was a masterly synthesis, it had the effect of driving the engineers and logicians further apart. For example, in the report von Neumann had pursued the biological metaphor by eliminating all the electronic circuits in favor of logical elements using the “neurons” of brain science.


pages: 500 words: 146,240

Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay, Peter Molyneux

Any sufficiently advanced technology is indistinguishable from magic, augmented reality, Bill Atkinson, Bob Noyce, book value, collective bargaining, Colossal Cave Adventure, do what you love, financial engineering, game design, Golden age of television, Ian Bogost, independent contractor, index card, Mark Zuckerberg, oil shock, pirate software, RAND corporation, risk tolerance, Silicon Valley, SimCity, Skype, Steve Jobs, Von Neumann architecture

I had been starting companies all of my life, so it just seemed like the natural thing to do. I really didn’t question any alternatives to starting the company and then licensing the hardware. Ramsay: Was there a lot of focus on hardware then? Bushnell: It was all hardware. As it turned out, the first video games didn’t have Von Neumann architectures at all. They had what we called “digital-state machines.” These machines were, essentially, clocked output signal generators that created waveforms that drove the television monitor. If you wanted to change anything, you had to change the hardware. There was no software at all. In fact, the very first game that executed a program was Asteroids in 1979.


pages: 489 words: 148,885

Accelerando by Stross, Charles

book value, business cycle, call centre, carbon-based life, cellular automata, cognitive dissonance, commoditize, Conway's Game of Life, dark matter, disinformation, dumpster diving, Extropian, financial engineering, finite state, flag carrier, Flynn Effect, Future Shock, glass ceiling, gravity well, John von Neumann, junk bonds, Kickstarter, knapsack problem, Kuiper Belt, machine translation, Magellanic Cloud, mandelbrot fractal, market bubble, means of production, military-industrial complex, MITM: man-in-the-middle, Neal Stephenson, orbital mechanics / astrodynamics, packet switching, performance metric, phenotype, planetary scale, Pluto: dwarf planet, quantum entanglement, reversible computing, Richard Stallman, satellite internet, SETI@home, Silicon Valley, Singularitarianism, Skinner box, slashdot, South China Sea, stem cell, technological singularity, telepresence, The Chicago School, theory of mind, Turing complete, Turing machine, Turing test, upwardly mobile, Vernor Vinge, Von Neumann architecture, warehouse robotics, web of trust, Y2K, zero-sum game

She reaches over for Boris's pitcher of jellyfish juice, but frowns as she does so: "Aineko wasn't conscious back then, but later … when SETI@home finally received that message back, oh, however many years ago, Aineko remembered the lobsters. And cracked it wide open while all the CETI teams were still thinking in terms of von Neumann architectures and concept-oriented programming. The message was a semantic net designed to mesh perfectly with the lobster broadcast all those years ago, and provide a high-level interface to a communications network we're going to visit." She squeezes Boris's fingertips. "SETI@home logged these coordinates as the origin of the transmission, even though the public word was that the message came from a whole lot farther away – they didn't want to risk a panic if people knew there were aliens on our cosmic doorstep.


pages: 523 words: 154,042

Fancy Bear Goes Phishing: The Dark History of the Information Age, in Five Extraordinary Hacks by Scott J. Shapiro

3D printing, 4chan, active measures, address space layout randomization, air gap, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, availability heuristic, Bernie Sanders, bitcoin, blockchain, borderless world, Brian Krebs, business logic, call centre, carbon tax, Cass Sunstein, cellular automata, cloud computing, cognitive dissonance, commoditize, Compatible Time-Sharing System, Computing Machinery and Intelligence, coronavirus, COVID-19, CRISPR, cryptocurrency, cyber-physical system, Daniel Kahneman / Amos Tversky, Debian, Dennis Ritchie, disinformation, Donald Trump, double helix, Dr. Strangelove, dumpster diving, Edward Snowden, en.wikipedia.org, Evgeny Morozov, evil maid attack, facts on the ground, false flag, feminist movement, Gabriella Coleman, gig economy, Hacker News, independent contractor, information security, Internet Archive, Internet of things, invisible hand, John Markoff, John von Neumann, Julian Assange, Ken Thompson, Larry Ellison, Laura Poitras, Linda problem, loss aversion, macro virus, Marc Andreessen, Mark Zuckerberg, Menlo Park, meta-analysis, Minecraft, Morris worm, Multics, PalmPilot, Paul Graham, pirate software, pre–internet, QWERTY keyboard, Ralph Nader, RAND corporation, ransomware, Reflections on Trusting Trust, Richard Stallman, Richard Thaler, Ronald Reagan, Satoshi Nakamoto, security theater, Shoshana Zuboff, side hustle, Silicon Valley, Skype, SoftBank, SQL injection, Steve Ballmer, Steve Jobs, Steven Levy, Stuxnet, supply-chain attack, surveillance capitalism, systems thinking, TaskRabbit, tech billionaire, tech worker, technological solutionism, the Cathedral and the Bazaar, the new new thing, the payments system, Turing machine, Turing test, Unsafe at Any Speed, vertical integration, Von Neumann architecture, Wargames Reagan, WarGames: Global Thermonuclear War, Wayback Machine, web application, WikiLeaks, winner-take-all economy, young professional, zero day, éminence grise

(The vacuum tubes were arranged in rings of ten, and only one tube was on at a time, representing one digit.) Von Neumann understood that binary symbols are easier to encode electronically. Open circuits would count as zeros, closed circuits as ones. EDVAC became the world’s first digital computer. Von Neumann is also credited with inventing the “stored program” computer, now known as the “von Neumann architecture.” For all its virtues, the ENIAC had one problem: code was hardwired into the machine. Whenever a user wanted to run a new program, a team of women, known as the programmers, manually changed ENIAC’s internal wiring to implement the code. A program might take two weeks just to load and test before it could run.
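One way to see the appeal of binary, following the ring-of-ten description above, is to count the elements each encoding needs. The Python sketch below is an illustrative simplification (one tube per ring position, one two-state element per binary bit), not a description of the actual ENIAC or EDVAC circuitry:

```python
# Compare element counts for decimal rings of ten versus binary encoding,
# for numbers up to a given size (illustrative simplification only).
import math

def decimal_ring_elements(max_value):
    digits = len(str(max_value))
    return digits * 10            # a ring of ten tubes per decimal digit

def binary_elements(max_value):
    return math.ceil(math.log2(max_value + 1))

for max_value in (9_999, 9_999_999_999):
    print(max_value, decimal_ring_elements(max_value), binary_elements(max_value))
# e.g. counting up to 9,999 takes 40 ring tubes but only 14 binary elements
```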


pages: 578 words: 168,350

Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies by Geoffrey West

"World Economic Forum" Davos, Alfred Russel Wallace, Anthropocene, Anton Chekhov, Benoit Mandelbrot, Black Swan, British Empire, butterfly effect, caloric restriction, caloric restriction, carbon footprint, Cesare Marchetti: Marchetti’s constant, clean water, coastline paradox / Richardson effect, complexity theory, computer age, conceptual framework, continuous integration, corporate social responsibility, correlation does not imply causation, cotton gin, creative destruction, dark matter, Deng Xiaoping, double helix, driverless car, Dunbar number, Edward Glaeser, endogenous growth, Ernest Rutherford, first square of the chessboard, first square of the chessboard / second half of the chessboard, Frank Gehry, Geoffrey West, Santa Fe Institute, Great Leap Forward, Guggenheim Bilbao, housing crisis, Index librorum prohibitorum, invention of agriculture, invention of the telephone, Isaac Newton, Jane Jacobs, Jeff Bezos, Johann Wolfgang von Goethe, John von Neumann, Kenneth Arrow, laissez-faire capitalism, Large Hadron Collider, Larry Ellison, Lewis Mumford, life extension, Mahatma Gandhi, mandelbrot fractal, Marc Benioff, Marchetti’s constant, Masdar, megacity, Murano, Venice glass, Murray Gell-Mann, New Urbanism, Oklahoma City bombing, Peter Thiel, power law, profit motive, publish or perish, Ray Kurzweil, Richard Feynman, Richard Florida, Salesforce, seminal paper, Silicon Valley, smart cities, Stephen Hawking, Steve Jobs, Stewart Brand, Suez canal 1869, systematic bias, systems thinking, technological singularity, The Coming Technological Singularity, The Death and Life of Great American Cities, the scientific method, the strength of weak ties, time dilation, too big to fail, transaction costs, urban planning, urban renewal, Vernor Vinge, Vilfredo Pareto, Von Neumann architecture, Whole Earth Catalog, Whole Earth Review, wikimedia commons, working poor

The great John von Neumann, mathematician, physicist, computer scientist, and polymath, a man whose ideas and accomplishments have had a huge influence on your life, made the following remarkably prescient observation more than seventy years ago: “The ever accelerating progress of technology and changes in the mode of human life . . . gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”7 Among von Neumann’s many accomplishments before he died at the relatively young age of fifty-three in 1957 are his seminal role in the early development of quantum mechanics, his invention of game theory, which is a major tool in economic modeling, and the conceptual design of modern computers universally referred to as the von Neumann architecture. So can we imagine making an innovation as powerful and influential as the invention of the Internet every fifteen, ten, or even five years? This is a classic reductio ad absurdum argument showing that regardless of how ingenious we are, how many marvelous gadgets and devices we invent, we simply won’t be able to overcome the threat of the ultimate singularity if we continue business as usual.


pages: 798 words: 240,182

The Transhumanist Reader by Max More, Natasha Vita-More

"World Economic Forum" Davos, 23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, Computing Machinery and Intelligence, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Douglas Engelbart, Drosophila, en.wikipedia.org, endogenous growth, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, Future Shock, game design, germ theory of disease, Hans Moravec, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, Louis Pasteur, Menlo Park, meta-analysis, moral hazard, Network effects, Nick Bostrom, Norbert Wiener, pattern recognition, Pepto Bismol, phenotype, positional goods, power law, precautionary principle, prediction markets, presumed consent, Project Xanadu, public intellectual, radical life extension, Ray Kurzweil, reversible computing, RFID, Ronald Reagan, scientific worldview, silicon-based life, Singularitarianism, social intelligence, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, synthetic biology, systems thinking, technological determinism, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, VTOL, Whole Earth Review, women in the workforce, zero-sum game

Now, the rapidly increasing number of processing cores in general-purpose CPUs and GPU arrays is indicative of a drive toward parallel computation. Parallel computation is a more natural fit to neural computation. It is essential for the acquisition and analysis of data from the brain. Of course, compared with a sequential Von Neumann architecture, parallel computing platforms, and in particular neuromorphic platforms, are a much better target for the implementation of a whole brain emulation. An example of neuromorphic processor hardware is the chip developed at IBM as an outcome of research in the DARPA SyNAPSE program led by Dharmendra Modha.