Claude Shannon: information theory

55 results


pages: 550 words: 154,725

The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, back-to-the-land, Black Swan, business climate, Claude Shannon: information theory, Clayton Christensen, complexity theory, corporate governance, cuban missile crisis, Edward Thorp, horn antenna, Hush-A-Phone, information retrieval, invention of the telephone, James Watt: steam engine, Karl Jansky, knowledge economy, Leonard Kleinrock, Metcalfe’s law, Nicholas Carr, Norbert Wiener, Picturephone, Richard Feynman, Richard Feynman, Robert Metcalfe, Sand Hill Road, Silicon Valley, Skype, Steve Jobs, Telecommunications Act of 1996, traveling salesman, uranium enrichment, William Shockley: the traitorous eight

Shannon Collection, Library of Congress. 4 Erico Marui Guizzo, “The Essential Message: Claude Shannon and the Making of Information Theory” (master’s thesis, MIT, 2003). 5 Len Kleinrock, a former student of Shannon’s, author interview. 6 Liversidge, “Profile of Claude Shannon.” 7 Claude Shannon, letter to Dr. V. Bush, December 13, 1939. Shannon Collection, Library of Congress. 8 Liversidge, “Profile of Claude Shannon.” Biographical facts relating to Shannon’s father are in a personal letter Shannon wrote, October 20, 1981, to Ms. Shari Bukowski: “[My father] was born in Oxford, New Jersey in 1862, came to Ovid, Michigan when very young and was raised and graduated there. He was a traveling salesman for a period and came to Gaylord shortly after 1900. There he bought a furniture and undertaking business, and, having confidence in Gaylord’s future, built the Shannon Block and Post Office building on Main Street.”

Bush, March 8, 1940. Shannon Collection, Library of Congress. 13 Claude Shannon, letter to Dr. V. Bush, February 16, 1939. “Off and on I have been working on an analysis of some of the fundamental properties of general systems for the transmission of intellegence [sic], including telephony, radio, television, telegraphy, etc.” Shannon, Collected Papers, p. 455. 14 Claude Shannon, oral history conducted in July 1982 by Robert Price, IEEE History Center, New Brunswick, NJ. 15 Claude Shannon, interview with Robert Price, December 20, 1983. Shannon said: “… a lot of information theory … I had worked out before, during the five years between 1940 and 1945. Much of that work I did at home.” Shannon’s penciled manuscripts: author’s review, Shannon Collection, Library of Congress. 16 Betty Shannon, author interview. 17 T.

It seemed lost on Shannon that the scientist who had declared that any message could be sent through any noisy channel with almost perfect fidelity was now himself a proven exception. Transmissions could reach Claude Shannon. But then they would fail to go any farther. Information theory, in the meantime, was getting ready for the masses. In 1953, one of the premier science journalists of the era, Francis Bello of Fortune magazine, profiled Shannon along with Norbert Wiener, an MIT mathematician who was putting forward theories on the command and control of machines, a discipline closely related to Shannon’s work on information. Wiener called his work cybernetics. “Within the last five years a new theory has appeared that seems to bear some of the same hallmarks of greatness,” Bello wrote. “The new theory, still almost unknown to the general public, goes under either of two names: communication theory or information theory. Whether or not it will ultimately rank with the enduring great is a question now being resolved in a score of major laboratories here and abroad.”11 Bello didn’t say whether Shannon was working toward any kind of resolution, and in fact he wasn’t.


pages: 389 words: 109,207

Fortune's Formula: The Untold Story of the Scientific Betting System That Beat the Casinos and Wall Street by William Poundstone

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, anti-communist, asset allocation, beat the dealer, Benoit Mandelbrot, Black-Scholes formula, Brownian motion, buy low sell high, capital asset pricing model, Claude Shannon: information theory, computer age, correlation coefficient, diversified portfolio, Edward Thorp, en.wikipedia.org, Eugene Fama: efficient market hypothesis, high net worth, index fund, interest rate swap, Isaac Newton, Johann Wolfgang von Goethe, John Meriwether, John von Neumann, Kenneth Arrow, Long Term Capital Management, Louis Bachelier, margin call, market bubble, market fundamentalism, Marshall McLuhan, Myron Scholes, New Journalism, Norbert Wiener, offshore financial centre, Paul Samuelson, publish or perish, quantitative trading / quantitative finance, random walk, risk tolerance, risk-adjusted returns, Robert Shiller, Robert Shiller, Ronald Reagan, Rubik’s Cube, short selling, speech recognition, statistical arbitrage, The Predators' Ball, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, value at risk, zero-coupon bond, zero-sum game

Many of these artists were acquainted with at least the name of Claude Shannon and the conceptual gist of his theory. To people like Cage and Rauschenberg, who were exploring how minimal a work of music or art may be, information theory appeared to have something to say—even if no one was ever entirely sure what. Shannon came to feel that information theory had been over-sold. In a 1956 editorial he gently derided the information theory “bandwagon.” People who did not understand the theory deeply were seizing on it as a trendy metaphor and overstating its relevance to fields remote from its origin. Other theorists such as Norbert Wiener and Peter Elias took up this theme. It was time, Elias acidly wrote, to stop publishing papers with titles like “Information Theory, Photosynthesis, and Religion.” To Shannon, Wiener, and Elias, the question of information theory’s relevance was more narrowly defined than it was for Marshall McLuhan.

In 1956 a young scientist pondering his employer’s ambivalent relationship with bookmaking devised the most successful gambling system of all time. PART ONE Entropy Claude Shannon LIFE IS A GAMBLE. There are few sure things, least of all in the competitive world of academic recruitment. Claude Shannon was as close to a sure thing as existed. That is why the Massachusetts Institute of Technology was prepared to do what was necessary to lure Shannon away from AT&T’s Bell Labs, and why the institute was delighted when Shannon became a visiting professor in 1956. Shannon had done what practically no one else had done since the Renaissance. He had single-handedly invented an important new science. Shannon’s information theory is an abstract science of communication that lies behind computers, the Internet, and all digital media. “It’s said that it is one of the few times in history where somebody founded the field, asked all the right questions, and proved most of them and answered them all at once,” noted Cornell’s Toby Berger.

“This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior.” These words were written by Shannon’s former employer Warren Weaver. Weaver’s essay presented information theory as a humanistic discipline—perhaps misleadingly so. Strongly influenced by Shannon, media theorist Marshall McLuhan coined the term “information age” in Understanding Media (1964). Oracular as some of his pronouncements were, McLuhan spoke loud and clear with that concise coinage. It captured the way the electronic media (still analog in the 1960s) were changing the world. It implied, more presciently than McLuhan could have known, that Claude Shannon was a prime mover in that revolution. There were earnest attempts to apply information theory to semantics, linguistics, psychology, economics, management, quantum physics, literary criticism, garden design, music, the visual arts, and even religion.


pages: 855 words: 178,507

The Information: A History, a Theory, a Flood by James Gleick

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, AltaVista, bank run, bioinformatics, Brownian motion, butterfly effect, citation needed, Claude Shannon: information theory, clockwork universe, computer age, conceptual framework, crowdsourcing, death of newspapers, discovery of DNA, Donald Knuth, double helix, Douglas Hofstadter, en.wikipedia.org, Eratosthenes, Fellow of the Royal Society, Gödel, Escher, Bach, Henri Poincaré, Honoré de Balzac, index card, informal economy, information retrieval, invention of the printing press, invention of writing, Isaac Newton, Jacquard loom, Jacquard loom, Jaron Lanier, jimmy wales, John von Neumann, Joseph-Marie Jacquard, lifelogging, Louis Daguerre, Marshall McLuhan, Menlo Park, microbiome, Milgram experiment, Network effects, New Journalism, Norbert Wiener, On the Economy of Machinery and Manufactures, PageRank, pattern recognition, phenotype, Pierre-Simon Laplace, pre–internet, Ralph Waldo Emerson, RAND corporation, reversible computing, Richard Feynman, Richard Feynman, Rubik’s Cube, Simon Singh, Socratic dialogue, Stephen Hawking, Steven Pinker, stochastic process, talking drums, the High Line, The Wisdom of Crowds, transcontinental railway, Turing machine, Turing test, women in the workforce

♦ “THE MERE SOUNDS OF SPEECH”: Edward Sapir, Language: An Introduction to the Study of Speech (New York: Harcourt, Brace, 1921), 21. ♦ “D MEASURES, IN A SENSE, HOW MUCH A TEXT”: “Communication Theory of Secrecy Systems,” in Claude Shannon, Collected Papers, 85. ♦ “THE ENEMY IS NO BETTER OFF”: Ibid., 97. ♦ “THE ‘MEANING’ OF A MESSAGE IS GENERALLY IRRELEVANT”: “Communication Theory—Exposition of Fundamentals,” IRE Transactions on Information Theory, no. 1 (February 1950), in Claude Shannon, Collected Papers, 173. ♦ “WHAT GIBBS DID FOR PHYSICAL CHEMISTRY”: Warren Weaver letter to Claude Shannon, 27 January 1949, Manuscript Division, Library of Congress. ♦ “SOMETHING OF A DELAYED ACTION BOMB”: John R. Pierce, “The Early Days of Information Theory,” IEEE Transactions on Information Theory 19, no. 1 (1973): 4. ♦ “THE FUNDAMENTAL PROBLEM OF COMMUNICATION”: Claude Elwood Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949), 31

New Wires, New Logic Chapter 7. Information Theory Chapter 8. The Informational Turn Chapter 9. Entropy and Its Demons Chapter 10. Life’s Own Code Chapter 11. Into the Meme Pool Chapter 12. The Sense of Randomness Chapter 13. Information Is Physical Chapter 14. After the Flood Chapter 15. New News Every Day Epilogue Acknowledgments Notes Bibliography Index A Note About The Author Illustration Credits PROLOGUE The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning. —Claude Shannon (1948) AFTER 1948, which was the crucial year, people thought they could see the clear purpose that inspired Claude Shannon’s work, but that was hindsight.

♦ “BY THE SPEED OF TRANSMISSION OF INTELLIGENCE IS MEANT”: H. Nyquist, “Certain Factors Affecting Telegraph Speed,” 333. ♦ “THE CAPACITY OF A SYSTEM TO TRANSMIT”: R. V. L. Hartley, “Transmission of Information,” 537. 7. INFORMATION THEORY ♦ “PERHAPS COMING UP WITH A THEORY”: Jon Barwise, “Information and Circumstance,” Notre Dame Journal of Formal Logic 27, no. 3 (1986): 324. ♦ SAID NOTHING TO EACH OTHER ABOUT THEIR WORK: Shannon interview with Robert Price: “A Conversation with Claude Shannon: One Man’s Approach to Problem Solving,” IEEE Communications Magazine 22 (1984): 125; cf. Alan Turing to Claude Shannon, 3 June 1953, Manuscript Division, Library of Congress. ♦ “NO, I’M NOT INTERESTED IN DEVELOPING A POWERFUL BRAIN”: Andrew Hodges, Alan Turing: The Enigma (London: Vintage, 1992), 251. ♦ “A CONFIRMED SOLITARY”: Max H.


pages: 229 words: 67,599

The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age by Paul J. Nahin

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Claude Shannon: information theory, conceptual framework, Edward Thorp, Fellow of the Royal Society, finite state, four colour theorem, Georg Cantor, Grace Hopper, Isaac Newton, John von Neumann, knapsack problem, New Journalism, Pierre-Simon Laplace, reversible computing, Richard Feynman, Richard Feynman, Schrödinger's Cat, Steve Jobs, Steve Wozniak, thinkpad, Thomas Bayes, Turing machine, Turing test, V2 rocket

See diode potentiometer probability: conditional; conservation of; constraint; inclusion-exclusion theorem; independence; mutually exclusive events; theorem of total product-of-sums pull-down resistor pull-up resistor qubit quantum: logic; mechanics Radó, Tibor. See also Busy Beaver Game redundancy Reilly, Sidney relay: crummy; theory of Riordan, John RS flip-flop sample (point); (space) shannon (information unit). See also bit Shannon, Catherine (sister of Claude) Shannon, Claude (father of Claude) Shannon, Claude Elwood; codes by; life of; on probability; his salesmen and engineers puzzle; and switches; and time machines; and Turing machines Shannon, Mabel (mother of Claude) Shannon-Hagelbarger theorem Sheffer, Henry Shestakov, Victor Shor, Peter. See also algorithm (Shor’s) Sklansky, Jack source rate sphere-packing spooky-action-at-a-distance. See also quantum mechanics square-root of NOT state-transition state-vector; collapse of. See also quantum mechanics stored-program stroke function sum-of-products switch; bounce; parallel; series; voltage-controlled.

The Logician and the Engineer Frontispiece: Reproduced by arrangement with the artist. The Logician and the Engineer How George Boole and Claude Shannon Created the Information Age PAUL J. NAHIN PRINCETON UNIVERSITY PRESS PRINCETON AND OXFORD Copyright © 2013 by Princeton University Press Published by Princeton University Press, 41 William Street, Princeton, New Jersey 08540 In the United Kingdom: Princeton University Press, 6 Oxford Street, Woodstock, Oxfordshire OX20 1TW press.princeton.edu All Rights Reserved Library of Congress Cataloging-in-Publication Data Nahin, Paul J. The logician and the engineer : how George Boole and Claude Shannon created the information age / Paul J. Nahin. pages cm. Includes bibliographical references and index. ISBN 978-0-691-15100-7 1. Boole, George, 1815–1864. 2.

The second part of Chapter 3 is, therefore, based only on several short biographical remembrances, interviews Shannon gave some years before his death, and obituary notices. The most useful to me were (a) Robert G. Gallager, “Claude Elwood Shannon,” Proceedings of the American Philosophical Society, June 2003, pp. 187–191. (b) Anthony Liversidge, “Profile of Claude Shannon,” Omni, August 1987 (reprinted in Shannon’s Collected Papers, N.J.A. Sloane and Aaron D. Wyner, editors, IEEE Press, 1993). (c) Solomon W. Golomb et al., “Claude Elwood Shannon (1916–2001),” Notices of the American Mathematical Society, January 2002, pp. 8–16. (d) James F. Crow, “Shannon’s Brief Foray into Genetics,” Genetics, November 2001, pp. 915–917. 10. Liversidge, “Profile of Claude Shannon.” 4 Boolean Algebra They who are acquainted with the present state of the theory of Symbolical Algebra, are aware that the validity of the processes of analysis does not depend upon the interpretation of the symbols which are employed, but solely upon the laws of their combination


pages: 370 words: 94,968

The Most Human Human: What Talking With Computers Teaches Us About What It Means to Be Alive by Brian Christian

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Bertrand Russell: In Praise of Idleness, carbon footprint, cellular automata, Claude Shannon: information theory, cognitive dissonance, commoditize, complexity theory, crowdsourcing, David Heinemeier Hansson, Donald Trump, Douglas Hofstadter, George Akerlof, Gödel, Escher, Bach, high net worth, Isaac Newton, Jacques de Vaucanson, Jaron Lanier, job automation, l'esprit de l'escalier, Loebner Prize, Menlo Park, Ray Kurzweil, RFID, Richard Feynman, Richard Feynman, Ronald Reagan, Skype, statistical model, Stephen Hawking, Steve Jobs, Steven Pinker, theory of mind, Thomas Bayes, Turing machine, Turing test, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero-sum game

The Turing test, bless it, has now given us a yardstick for this shame. A Mathematical Theory of Communication It seems, at first glance, that information theory—the science of data transmission, data encryption, and data compression—would be mostly a question of engineering, having little to do with the psychological and philosophical questions that surround the Turing test and AI. But these two ships turn out to be sailing quite the same seas. The landmark paper that launched information theory is Claude Shannon’s 1948 “A Mathematical Theory of Communication,” and as it happens, this notion of scientifically evaluating “communication” binds information theory and the Turing test to each other from the get-go. What is it, exactly, that Shannon identified as the essence of communication? How do you measure it?
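
Shannon's answer was to measure it in bits: the entropy of a message source, the average number of binary yes/no distinctions needed per symbol. A minimal sketch of that measurement (my illustration, not drawn from Christian's text):

from collections import Counter
from math import log2

def entropy_bits_per_symbol(text):
    """Shannon entropy of the text's character frequencies, in bits per symbol."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A perfectly predictable message carries no information per symbol;
# eight equally likely symbols carry three bits each.
print(entropy_bits_per_symbol("aaaaaaaa"))  # 0.0
print(entropy_bits_per_symbol("abcdefgh"))  # 3.0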

For more, see Hofstadter’s I Am a Strange Loop. 56 Benjamin Seider, Gilad Hirschberger, Kristin Nelson, and Robert Levenson, “We Can Work It Out: Age Differences in Relational Pronouns, Physiology, and Behavior in Marital Conflict,” Psychology and Aging 24, no. 3 (September 2009), pp. 604–13. 10. High Surprisal 1 Claude Shannon, “A Mathematical Theory of Communication,” Bell System Technical Journal 27 (1948), pp. 379–423, 623–56. 2 average American teenager: Katie Hafner, “Texting May Be Taking a Toll,” New York Times, May 25, 2009. 3 The two are in fact related: For more information on the connections between Shannon (information) entropy and thermodynamic entropy, see, e.g., Edwin Jaynes, “Information Theory and Statistical Mechanics,” Physical Review 106, no. 4, (May 1957), pp. 620–30; and Edwin Jaynes, “Information Theory and Statistical Mechanics II,” Physical Review 108, no. 2 (October 1957), pp. 171–90. 4 Donald Barthelme, “Not-Knowing,” in Not-Knowing: The Essays and Interviews of Donald Barthelme, edited by Kim Herzinger (New York: Random House, 1997). 5 Jonathan Safran Foer, Extremely Loud and Incredibly Close (Boston: Houghton Mifflin, 2005). 6 The cloze test comes originally from W.

Epilogue: The Unsung Beauty of the Glassware Cabinet Acknowledgments Notes The beautiful changes as a forest is changed By a chameleon’s tuning his skin to it; As a mantis, arranged On a green leaf, grows Into it, makes the leaf leafier … –RICHARD WILBUR I think metaphysics is good if it improves everyday life; otherwise forget it. –ROBERT PIRSIG As President, I believe that robotics can inspire young people to pursue science and engineering. And I also want to keep an eye on those robots, in case they try anything. –BARACK OBAMA 0. Prologue Claude Shannon, artificial intelligence pioneer and founder of information theory, met his wife, Mary Elizabeth, at work. This was Bell Labs in Murray Hill, New Jersey, the early 1940s. He was an engineer, working on wartime cryptography and signal transmission. She was a computer. 1. Introduction: The Most Human Human I wake up five thousand miles from home in a hotel room with no shower: for the first time in fifteen years, I take a bath.


pages: 405 words: 117,219

In Our Own Image: Savior or Destroyer? The History and Future of Artificial Intelligence by George Zarkadakis

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3D printing, Ada Lovelace, agricultural Revolution, Airbnb, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, anthropic principle, Asperger Syndrome, autonomous vehicles, barriers to entry, battle of ideas, Berlin Wall, bioinformatics, British Empire, business process, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, complexity theory, continuous integration, Conway's Game of Life, cosmological principle, dark matter, dematerialisation, double helix, Douglas Hofstadter, Edward Snowden, epigenetics, Flash crash, Google Glasses, Gödel, Escher, Bach, income inequality, index card, industrial robot, Internet of things, invention of agriculture, invention of the steam engine, invisible hand, Isaac Newton, Jacquard loom, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, job automation, John von Neumann, Joseph-Marie Jacquard, liberal capitalism, lifelogging, millennium bug, Moravec's paradox, natural language processing, Norbert Wiener, off grid, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Erdős, post-industrial society, prediction markets, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, speech recognition, stem cell, Stephen Hawking, Steven Pinker, strong AI, technological singularity, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Tyler Cowen: Great Stagnation, Vernor Vinge, Von Neumann architecture, Watson beat the top human players on Jeopardy!, Y2K

We have already met the first two. Norbert Wiener was the grand visionary of cybernetics. Inspired by mechanical control systems, such as artillery targeting and servomechanisms, as well as Claude Shannon’s mathematical theory of communication and information, he articulated the theory of cybernetics in his landmark book, Cybernetics, of 1948.4 Godfather number two, Claude Shannon, was the genius who gave us information theory. We saw how Wiener and Shannon pondered on the ontology of information, and how they decided to regard it as something beyond matter and energy. The legacy of their decision is still with us today, in the disembodiment of information that defines post-humanism. The man who demonstrated the direct connection between neurons and computers was Professor Warren S. McCulloch (1898–1969), the American neurophysiologist who loved writing sonnets and laid the foundations of many contemporary brain theories.

In a strange twist of history, it appears that after the Second World War computer scientists and engineers dismissed Aristotelian empiricism and became Neoplatonists. Instead of adhering to the principle that universals are always predicated by physical objects they adopted the opposite, Platonic, idea. In Part III of this book, we will explore in more detail the technological reasons behind this conceptual shift. But before we do so, let us first examine the concept of information. The father of information theory is the American mathematician, electronic engineer and cryptographer Claude Shannon (1916–2001). He worked as a cryptanalyst in the Second World War, and in early 1943 he met Alan Turing, who had been posted to Washington to work with the Americans on breaking the German naval codes. Like his English counterpart, Shannon is one of the great heroes of computer science, a man whose work has shaped the world we live in. He was a rather playful person, too.

From the late nineteenth century till the late 1930s a number of other, seemingly unrelated inventions were made that, with time, would define modern computing. Two of them were, arguably, the most significant. In 1876 Alexander Graham Bell invented the telephone. And three years later, in 1879, Thomas Edison invented the incandescent lamp.14 These two inventions would become fundamental in the birth and evolution of electronics and telecommunications. By the time Claude Shannon began to formulate his information theory, there was a pressing need for better telecommunications systems because of the ubiquity of the telephone. Meanwhile, Edison’s incandescent lamp had evolved into sophisticated vacuum tubes that acted as electronic amplifiers, rectifiers, switches and oscillators. Shannon was the genius who combined logic with electronics. In 1937, as a twenty-one-year-old student at the Massachusetts Institute of Technology (MIT), he demonstrated how Boole’s logic could be applied in electronic circuits to construct and resolve logical or numerical relationships.
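
The heart of that 1937 demonstration is the correspondence between switch networks and Boolean operations: switches in series conduct only when both are closed (AND), while switches in parallel conduct when either is closed (OR). A rough sketch of the correspondence, using Python booleans as stand-ins for open and closed switches (my illustration, not Shannon's own notation):

def series(a, b):
    """Two switches in series: current flows only if both are closed (Boolean AND)."""
    return a and b

def parallel(a, b):
    """Two switches in parallel: current flows if either is closed (Boolean OR)."""
    return a or b

# Truth table over open (False) and closed (True) switches.
for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))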


pages: 193 words: 19,478

Memory Machines: The Evolution of Hypertext by Belinda Barnet

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

augmented reality, Benoit Mandelbrot, Bill Duvall, British Empire, Buckminster Fuller, Claude Shannon: information theory, collateralized debt obligation, computer age, conceptual framework, Douglas Engelbart, Douglas Engelbart, game design, hiring and firing, Howard Rheingold, HyperCard, hypertext link, information retrieval, Internet Archive, John Markoff, linked data, mandelbrot fractal, Marshall McLuhan, Menlo Park, nonsequential writing, Norbert Wiener, publish or perish, Robert Metcalfe, semantic web, Steve Jobs, Stewart Brand, technoutopianism, Ted Nelson, the scientific method, Vannevar Bush, wikimedia commons

It was an underpaid woman sitting there with a hand calculator, and they’d have rooms full of them, that’s how they got their computing done. So you’d say, ‘What’s your job?’ ‘I’m a computer.’ (Engelbart 1986) The Analyzer evaluated and solved these equations by mechanical integration. It automated a task that was previously done by human beings and created a small revolution at MIT. Many of the people who worked on the machine (for example Harold Hazen, Gordon Brown and Claude Shannon) later made contributions to feedback control, information theory and computing (Mindell 2000). The machine was a huge success that brought prestige and a flood of federal money to MIT and Bush. However, by the spring of 1950 the Analyzer was gathering dust in a storeroom; the project had died. Why did it fail? Why did the world’s most important analogue computer end up obsolescing in a backroom? It is not uncommon for a successful engineering prototype to become obsolescent within twenty years.

Microfilm would provide ultra-fast input and inexpensive mass memory, photoelectricity would allow high-speed sensing and reproduction, and digital electronics would allow astonishingly fast and inexpensive control and calculation. (Burke 1991, 147) Bush transferred these three technologies to the new design. This decision was not pure genius on his part; they were perfect analogues for a popular conception of how the brain worked at the time. The scientific community at MIT were developing a pronounced interest in man-machine analogues, and although Claude Shannon had not yet published his information theory (Dutta 1995), it was already being formulated. Much discussion also took place around MIT about how the brain might process information in the manner of an analogue machine. Bush thought and designed in terms of analogies between brain and machine, electricity and information. He shared this central research agenda with Norbert Wiener and Warren McCulloch, both at MIT, who were at the time ‘working on parallels they saw between neural structure and process and computation’ (Nyce and Kahn 1991, 63; see also Hayles 1999).

In 1935 the US Navy came to Bush for advice on machines to crack coding devices like the new Japanese cipher machines (Burke 1991). They wanted a long-term project that would give the United States the most technically advanced cryptanalytic capabilities in the world, a superfast machine to count the coincidences of letters in two messages or copies of a single message. Bush assembled a research team for this project that included Claude Shannon, one of the early information theorists and a significant part of the emerging cybernetics community (Nyce and Kahn 1991). Three new technologies were emerging at the time that handled information: photoelectricity, microfilm and digital electronics. All three were just emerging, but, unlike the fragile magnetic recording [Bush’s] students were exploring, they appeared to be ready to use in calculation machines.


pages: 505 words: 142,118

A Man for All Markets by Edward O. Thorp

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

3Com Palm IPO, Albert Einstein, asset allocation, beat the dealer, Bernie Madoff, Black Swan, Black-Scholes formula, Brownian motion, buy low sell high, carried interest, Chuck Templeton: OpenTable, Claude Shannon: information theory, cognitive dissonance, collateralized debt obligation, compound rate of return, Credit Default Swap, credit default swaps / collateralized debt obligations, diversification, Edward Thorp, Erdős number, Eugene Fama: efficient market hypothesis, financial innovation, George Santayana, German hyperinflation, Henri Poincaré, high net worth, High speed trading, index arbitrage, index fund, interest rate swap, invisible hand, Jarndyce and Jarndyce, Jeff Bezos, John Meriwether, John Nash: game theory, Kenneth Arrow, Livingstone, I presume, Long Term Capital Management, Louis Bachelier, margin call, Mason jar, merger arbitrage, Murray Gell-Mann, Myron Scholes, NetJets, Norbert Wiener, passive investing, Paul Erdős, Paul Samuelson, Pluto: dwarf planet, Ponzi scheme, price anchoring, publish or perish, quantitative trading / quantitative finance, race to the bottom, random walk, Renaissance Technologies, RFID, Richard Feynman, Richard Feynman, risk-adjusted returns, Robert Shiller, Robert Shiller, rolodex, Sharpe ratio, short selling, Silicon Valley, statistical arbitrage, stem cell, survivorship bias, The Myth of the Rational Market, The Predators' Ball, the rule of 72, The Wisdom of Crowds, too big to fail, Upton Sinclair, value at risk, Vanguard fund, Vilfredo Pareto, Works Progress Administration

Before Ed Thorp, mathematicians of gambling had their love of chance largely unrequited. Thorp’s method is as follows: He cuts to the chase in identifying a clear edge (that is something that in the long run puts the odds in his favor). The edge has to be obvious and uncomplicated. For instance, calculating the momentum of a roulette wheel, which he did with the first wearable computer (and with no less a coconspirator than the great Claude Shannon, father of information theory), he estimated a typical edge of roughly 40 percent per bet. But that part is easy, very easy. It is capturing the edge, converting it into dollars in the bank, restaurant meals, interesting cruises, and Christmas gifts to friends and family—that’s the hard part. It is the dosage of your betting—not too little, not too much—that matters in the end. For that, Ed did great work on his own, before the theoretical refinement that came from a third member of the Information Trio: John Kelly, originator of the famous Kelly Criterion, a formula for placing bets that we discuss today because Ed Thorp made it operational.
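
For reference, the Kelly criterion Taleb alludes to has a simple closed form for a single repeated bet: stake the fraction f* = (bp - q) / b of the bankroll, where p is the probability of winning, q = 1 - p, and b is the net odds received. A hedged sketch with made-up numbers (not Thorp's actual figures):

def kelly_fraction(p, b):
    """Fraction of bankroll to stake on a bet won with probability p at b-to-1 net odds."""
    q = 1.0 - p
    return (b * p - q) / b

# Illustrative only: a 55% chance of winning an even-money (1-to-1) bet.
print(kelly_fraction(0.55, 1.0))  # 0.10, i.e., bet 10% of the bankroll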

To protect myself from this happening with my work on blackjack, I settled on Proceedings of the National Academy of Sciences, as it was the quickest to publish of any journal I knew, taking as little as two or three months, and was also very prestigious. This required a member of the academy to approve and forward my work, so I sought out the only mathematics member of the academy at MIT, Claude Shannon. Claude was famous for the creation of information theory, which is crucial for modern computing, communications, and much more. The department secretary arranged a short appointment with a reluctant Shannon at noon. However, she warned me that Shannon was going to be in for only a few minutes, that I shouldn’t expect more, and that he didn’t spend time on topics or people that didn’t interest him. A little in awe but feeling lucky, I arrived at Shannon’s office to find a thinnish alert man of middle height and build, somewhat sharp-featured.

During the long ride back I wondered how my research into the mathematical theory of a game might change my life. In the abstract, life is a mixture of chance and choice. Chance can be thought of as the cards you are dealt in life. Choice is how you play them. I chose to investigate blackjack. As a result, chance offered me a new set of unexpected opportunities. Ever since my first meeting with Claude Shannon in September, we had been working on the roulette project approximately twenty hours a week. Meanwhile, I was teaching courses, doing research in pure mathematics, attending department functions, writing up my blackjack research, and adjusting to being a new father. Following a roulette work session at the Shannons’, Claude asked me at dinner if I thought anything would ever top this in my life.


pages: 250 words: 73,574

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers by John MacCormick, Chris Bishop

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Ada Lovelace, AltaVista, Claude Shannon: information theory, fault tolerance, information retrieval, Menlo Park, PageRank, pattern recognition, Richard Feynman, Richard Feynman, Silicon Valley, Simon Singh, sorting algorithm, speech recognition, Stephen Hawking, Steve Jobs, Steve Wozniak, traveling salesman, Turing machine, Turing test, Vannevar Bush

So it is not altogether surprising that the two major events triggering the creation of error-correcting codes both occurred in the research laboratories of the Bell Telephone Company. The two heroes of our story, Claude Shannon and Richard Hamming, were both researchers at Bell Labs. Hamming we have met already: it was his annoyance at the weekend crashes of a company computer that led directly to his invention of the first error-correcting codes, now known as Hamming codes. However, error-correcting codes are just one part of a larger discipline called information theory, and most computer scientists trace the birth of the field of information theory to a 1948 paper by Claude Shannon. This extraordinary paper, entitled “A Mathematical Theory of Communication,” is described in one biography of Shannon as “the Magna Carta of the information age.”
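
Hamming's original construction, the (7,4) code, protects four data bits with three parity bits so that any single flipped bit can be located and corrected. A compact sketch of the idea (my illustration, not MacCormick's presentation):

def hamming74_encode(d):
    """Encode 4 data bits (0/1 values) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming74_correct(c):
    """Locate and fix a single flipped bit using the three parity checks."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4, 5, 6, 7
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if error_pos:
        c[error_pos - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1                          # flip one bit "in transit"
print(hamming74_correct(word) == hamming74_encode([1, 0, 1, 1]))  # True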

THE ORIGINS OF COMPRESSION ALGORITHMS The same-as-earlier trick described in this chapter—one of the main compression methods used in ZIP files—is known to computer scientists as the LZ77 algorithm. It was invented by two Israeli computer scientists, Abraham Lempel and Jacob Ziv, and published in 1977. To trace the origins of compression algorithms, however, we need to delve three decades further back into scientific history. We have already met Claude Shannon, the Bell Labs scientist who founded the field of information theory with his 1948 paper. Shannon was one of the two main heroes in our story of error-correcting codes (chapter 5), but he and his 1948 paper also figure importantly in the emergence of compression algorithms. This is no coincidence. In fact, error-correcting codes and compression algorithms are two sides of the same coin. It all comes down to the notion of redundancy, which featured quite heavily in chapter 5.
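
The "same-as-earlier trick" can be sketched in a few lines: scan the text and, whenever the upcoming characters already occurred recently, emit a back-reference (distance, length) instead of repeating them. A toy version for illustration only, not the actual ZIP/DEFLATE implementation:

def lz77_compress(text, window=255):
    """Toy LZ77: emit literal characters or (distance, length) back-references."""
    out, i = [], 0
    while i < len(text):
        best_len, best_dist = 0, 0
        for j in range(max(0, i - window), i):
            length = 0
            while (i + length < len(text) and j + length < i and
                   text[j + length] == text[i + length]):
                length += 1
            if length > best_len:
                best_len, best_dist = length, i - j
        if best_len >= 3:                   # short matches aren't worth a reference
            out.append(("ref", best_dist, best_len))
            i += best_len
        else:
            out.append(("lit", text[i]))
            i += 1
    return out

# "abc" is stored once; the later repeats become back-references.
print(lz77_compress("abcabcabcabc"))
# [('lit', 'a'), ('lit', 'b'), ('lit', 'c'), ('ref', 3, 3), ('ref', 6, 6)]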

The quotation from Hamming on page 60 is also given in this book and derives from a 1977 interview of Hamming by Thompson. Mathematicians will greatly enjoy Thompson's delightful book, but it definitely assumes the reader has a healthy dose of college math. Dewdney's book (see above) has two interesting chapters on coding theory. The two quotations about Shannon on pages 77-78 are taken from a brief biography by N. J. A. Sloane and A. D. Wyner, appearing in Claude Shannon: Collected Papers edited by Sloane and Wyner (1993). Pattern recognition (chapter 6). Bishop's lectures (see above) have some interesting material that nicely complements this chapter. The geographical data about political donations is taken from the Fundrace project of the Huffington Post. All the handwritten digit data is taken from a dataset provided by Yann LeCun, of New York University's Courant Institute, and his collaborators.


pages: 210 words: 62,771

Turing's Vision: The Birth of Computer Science by Chris Bernhardt

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, British Empire, cellular automata, Claude Shannon: information theory, complexity theory, Conway's Game of Life, discrete time, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Henri Poincaré, Internet Archive, Jacquard loom, Jacquard loom, John Conway, John von Neumann, Joseph-Marie Jacquard, Norbert Wiener, Paul Erdős, Turing complete, Turing machine, Turing test, Von Neumann architecture

As Robert Soare has commented, you can think of your laptop as being a Turing machine and the oracle as being the web. After Turing completed his Ph.D., von Neumann offered him a position as his assistant, but Turing decided not to accept and instead returned to England. During the time that Turing was working on his Ph.D., another breakthrough paper was written. This was on logic and switching circuits and was written by Claude Shannon. Claude Shannon In 1936, Claude Shannon graduated from the University of Michigan with two undergraduate degrees: one in electrical engineering and one in mathematics. He then went to M.I.T. for graduate school. At M.I.T. he worked on an early analog computer. This work led him to consider switches and digital computing. Switches have the property that they are either on or off. It is clear that they can represent 0 or 1, but Shannon showed they could do much more: that all of Boolean algebra and binary arithmetic could be done using electrical circuits and switches.
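
Binary arithmetic follows directly: a one-bit "half adder" is nothing more than an XOR for the sum and an AND for the carry, both of which a switch network can realize. A small sketch (my example, not Bernhardt's):

def half_adder(a, b):
    """Add two bits with Boolean operations: XOR gives the sum, AND gives the carry."""
    return a ^ b, a & b              # (sum, carry)

def full_adder(a, b, carry_in):
    """Chain two half adders (plus an OR) to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2               # (sum, carry_out)

print(full_adder(1, 1, 1))           # (1, 1): binary 11, i.e., three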

This idea, along with ideas from group theory and probability theory enabled him, like Rejewski, to design a machine containing multiple copies of Enigmas wired together in such a way that they performed the search through a restricted set of possibilities. His first bombe was operational in the spring of 1940.7 In 1942, after the United States joined the war, Turing went to America to work with cryptologists there, helping with the design of American bombes. While in America, he visited Bell Labs, which was then in lower Manhattan, where he met Claude Shannon. In addition to working on cryptology, they discussed their work on computing.8 At the end of the war there were a large number of bombes working in both the US and England. Being able to read Germany’s messages enabled the allies to locate German U-boats and the routes of ships. It helped with the choice of when and where to attack. Turing and Rejewski had not only played an important part in winning the war, but had ushered cryptology into a new age.

Why the Poles called their machine bomba (which translates as bomb in English) seems to be a mystery. Rejewski, when asked about this, is said to have replied that he couldn’t think of a better name. 8. Both Shannon and Turing were interested in using ideas from probability to extract information from data. Shannon would later extend some of his wartime work and write the groundbreaking paper “A Mathematical Theory of Communication” that is one of the foundations of Information Theory. Turing wrote several articles on the application of probability to cryptography. These were classified and only now are being made available to the public (two papers were declassified in 2012. They are available at http://www.nationalarchives.gov.uk). 9. The English use the term valve where the Americans use vacuum tube. 10. The National Museum of Computing in Bletchley Park has reconstructed a Colossus.


pages: 518 words: 107,836

How Not to Network a Nation: The Uneasy History of the Soviet Internet (Information Policy) by Benjamin Peters

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Albert Einstein, Andrei Shleifer, Benoit Mandelbrot, bitcoin, Brownian motion, Claude Shannon: information theory, cloud computing, cognitive dissonance, computer age, conceptual framework, continuation of politics by other means, crony capitalism, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Graeber, Dissolution of the Soviet Union, Donald Davies, double helix, Drosophila, Francis Fukuyama: the end of history, From Mathematics to the Technologies of Life and Death, hive mind, index card, informal economy, information asymmetry, invisible hand, Jacquard loom, Jacquard loom, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, linear programming, mandelbrot fractal, Marshall McLuhan, means of production, Menlo Park, Mikhail Gorbachev, mutually assured destruction, Network effects, Norbert Wiener, packet switching, Pareto efficiency, pattern recognition, Paul Erdős, Peter Thiel, Philip Mirowski, RAND corporation, rent-seeking, road to serfdom, Ronald Coase, scientific mainstream, Steve Jobs, Stewart Brand, stochastic process, technoutopianism, The Structural Transformation of the Public Sphere, transaction costs, Turing machine

These include Claude Lévi-Strauss’s treatment of language as a technologically ordered series (after meeting with Macy Conference attendee Roman Jakobson in Paris in 1950); Jacques Lacan’s turning to mathematical concepts; Roland Barthes’s turn to schematic accounts of communication; Gilles Deleuze’s abandonment of meaning, with Claude Shannon’s information theory in hand; Felix Guattari’s, Michel Foucault’s, and other French theorists’ experimentation with terms such as encoding, decoding, information, and communication.34 Postmodern French theory owes a deep debt to postwar information theory and the cybernetic sciences. In England, cybernetics took on a different character in the form of the Ratio Club, a small but potent gathering of British cybernetic figures who gathered regularly in the basement of the National Hospital for Nervous Diseases in London from 1949 through 1955.

Gerovitch details the translation of their terms: “What Wiener called ‘the feedback mechanism’ they called ‘the theory of feedback’ … ‘basic principles of digital computing’ became ‘the theory of automatic high-speed electronic calculating machines’; ‘cybernetic models of human thinking’ became the ‘theory of self-organizing logical processes.’”70 In fact, the coauthors used the word theory six times in their definition of cybernetics to emphasize the theoretical nature of the new science, possibly as a way to avoid having to discuss the political implications of introducing a practical field of human-machine applications into a society well suited to adopt them. The coauthors also integrated and expanded the stochastic analysis of Claude Shannon’s information theory while simultaneously stripping Wiener’s organism-machine analogy of its political potency.71 Wiener’s core analogies between animal and machine, machine and mind were stressed as analogies—or how “self-organizing logical processes [appeared] similar to the processes of human thought” but were not synonyms. At the same time, the article scripts his language of control, feedback, and automated systems in the machine and organism into the common language of information, or Shannon’s mathematical theory of communication.

The Macy Conferences, as they were informally known, staked out a spacious interdisciplinary purview for cybernetic research.11 In addition to McCulloch, who directed the conferences, a few noted participants included Wiener himself, the mathematician and game theorist John von Neumann, leading anthropologist Margaret Mead and her then husband Gregory Bateson, founding information theorist and engineer Claude Shannon, sociologist-statistician and communication theorist Paul Lazarsfeld, psychologist and computer scientist J.C.R. Licklider, as well as influential psychiatrists, psychoanalysts, and philosophers such as Kurt Lewin, F.S.C. Northrop, Molly Harrower, and Lawrence Kubie, among others. Relying on mathematical and formal definitions of communication, participants rendered permeable the boundaries that distinguished humans, machines, and animals as information systems.

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

Ada Lovelace, air freight, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, Apple II, battle of ideas, Berlin Wall, Bill Duvall, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, conceptual framework, cuban missile crisis, Donald Davies, double helix, Douglas Engelbart, Douglas Engelbart, Dynabook, experimental subject, fault tolerance, Frederick Winslow Taylor, friendly fire, From Mathematics to the Technologies of Life and Death, Haight Ashbury, Howard Rheingold, information retrieval, invisible hand, Isaac Newton, James Watt: steam engine, Jeff Rulifson, John von Neumann, Leonard Kleinrock, Marc Andreessen, Menlo Park, New Journalism, Norbert Wiener, packet switching, pink-collar, popular electronics, RAND corporation, RFC: Request For Comment, Robert Metcalfe, Silicon Valley, Steve Crocker, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, Ted Nelson, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture, Wiener process, zero-sum game

And on June 30, Bell Labs announced a strange new amplifying device based on the cutting-edge physics of crystals and quantum mechanics—something called a transistor. And yet even in the midst of all that, Claude Shannon's long-delayed opus on information theory exploded like a bomb. His analysis of communication was breathtaking in scope, masterful in execution—and, for most people, totally unexpected. "It was like a bolt out of the blue, a really unique thing," recalls his Bell Labs colleague John Pierce. "I don't know of any other theory that came in a complete form like that, with very few antecedents or history." "It was a revelation," agrees Oliver Selfridge. "Around MIT the reaction was 'Brilliant! Why didn't I think of that?' Information theory gave us a whole conceptual vocabulary, as well as a technical vocabulary." Indeed, there was something about this notion of quantifying information that fired the imagination, much as the notion of quantifying "chaos" would do in the 1980s.

And on almost every page he used yet another psychological term: "information." Miller kept on reading. And by the time he was finished, he now says, he knew that his life had changed. THE CONJURER Legend has it that Claude Shannon published "A Mathematical Theory of Communication" in 1948 only because his boss at Bell Labs finally badgered him into it. And whatever the truth of that story, the point is that no one who knew Shannon has any trouble believing it. "He wrote beautiful papers—when he wrote," says Robert Fano, who became a leader of MIT's information-theory group in the 1950s and still has a reverential photograph of Shannon hanging in his office. "And he gave beautiful talks—when he gave a talk. But he hated to do it." It wasn't a matter of Shannon's being lazy, Fano says; he was constantly filling up notebooks with ideas, theorems, and calculations.

Miller's own work on chunking had already shown him that the mind organized concepts into a hierarchy. "But Chomsky's model seemed to make that idea much more general," he says. Indeed, his meeting with Chomsky was as pivotal as the moment eight years earlier when he'd picked up that July 1948 issue of the Bell System Technical Journal containing Claude Shannon's article—and all the more so because Chomsky's message was to be strongly reinforced just a short time later. Miller remembers the day very clearly: Tuesday, September 11, 1956, the second day of the second international conference on information theory. Actually, the whole conference was good. Held in MIT's Sloan Building, right on the riverfront, it included talks by Jerry Wiesner, Bob Fano, Peter Elias, Oliver Selfridge, Walter Rosenblith, and even Shannon himself. But that second day was what Miller had really been looking forward to.


pages: 137 words: 36,231

Information: A Very Short Introduction by Luciano Floridi

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

agricultural Revolution, Albert Einstein, bioinformatics, carbon footprint, Claude Shannon: information theory, conceptual framework, double helix, Douglas Engelbart, Douglas Engelbart, George Akerlof, Gordon Gekko, industrial robot, information asymmetry, intangible asset, Internet of things, invention of writing, John Nash: game theory, John von Neumann, moral hazard, Nash equilibrium, Norbert Wiener, Pareto efficiency, phenotype, Pierre-Simon Laplace, prisoner's dilemma, RAND corporation, RFID, Thomas Bayes, Turing machine, Vilfredo Pareto

The mathematical theory of communication (MTC) is by far the most important, influential, and widely known. The name for this branch of probability theory comes from Claude Shannon's seminal work. Shannon pioneered the field of mathematical studies of information and obtained many of its principal results, even though he acknowledged the importance of previous work done by other researchers and colleagues at Bell Laboratories. After Shannon, MTC became known as information theory. Today, Shannon is considered 'the father of information theory', and the kind of information MTC deals with is often qualified as Shannon information. The term 'information theory' is an appealing but unfortunate label, which continues to cause endless misunderstandings. Shannon came to regret its widespread popularity, and I shall avoid it in this context.

My hope is that it will help the reader to make sense of the large variety of informational phenomena with which we deal on a daily basis, of their profound and fundamental importance, and hence of the information society in which we live. Information is notorious for coming in many forms and having many meanings. It can be associated with several explanations, depending on the perspective adopted and the requirements and desiderata one has in mind. The father of information theory, Claude Shannon (1916-2001), for one, was very cautious: The word 'information' has been given different meanings by various writers in the general field of information theory. It is likely that at least a number of these will prove sufficiently useful in certain applications to deserve further study and permanent recognition. It is hardly to be expected that a single concept of information would satisfactorily account for the numerous possible applications of this general field.

Seligman, Information Flow: The Logic of Distributed Systems (Cambridge: Cambridge University Press, 1997). G. Bateson, Steps to an Ecology of Mind (Frogmore, St Albans: Paladin, 1973). T. M. Cover and J. A. Thomas, Elements of Information Theory (New York; Chichester: Wiley, 1991). F. I. Dretske, Knowledge and the Flow of Information (Oxford: Blackwell, 1981). D. S. Jones, Elementary Information Theory (Oxford: Clarendon Press, 1979). D. M. MacKay, Information, Mechanism and Meaning (Cambridge, MA: MIT Press, 1969). J. R. Pierce, An Introduction to Information Theory: Symbols, Signals and Noise, 2nd edn (New York: Dover Publications, 1980). A. M. Turing, 'Computing Machinery and Intelligence', Mind, 1950, 59, 433-60. Chapter 3 C. Cherry, On Human Communication: A Review, a Survey, and a Criticism, 3rd edn (Cambridge, MA; London: MIT Press, 1978).


pages: 339 words: 57,031

From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism by Fred Turner

Amazon: amazon.com, amazon.co.uk, amazon.de, amazon.fr

1960s counterculture, A Declaration of the Independence of Cyberspace, Apple's 1984 Super Bowl advert, back-to-the-land, bioinformatics, Buckminster Fuller, Claude Shannon: information theory, complexity theory, computer age, conceptual framework, Danny Hillis, dematerialisation, distributed generation, Douglas Engelbart, Douglas Engelbart, Dynabook, Electric Kool-Aid Acid Test, From Mathematics to the Technologies of Life and Death, future of work, game design, George Gilder, global village, Golden Gate Park, Hacker Ethic, Haight Ashbury, hive mind, Howard Rheingold, informal economy, invisible hand, Jaron Lanier, John Markoff, John von Neumann, Kevin Kelly, knowledge economy, knowledge worker, market bubble, Marshall McLuhan, mass immigration, means of production, Menlo Park, Mother of all demos, new economy, Norbert Wiener, peer-to-peer, post-industrial society, postindustrial economy, Productivity paradox, QWERTY keyboard, Ralph Waldo Emerson, RAND corporation, Richard Stallman, Robert Shiller, Robert Shiller, Ronald Reagan, Shoshana Zuboff, Silicon Valley, Silicon Valley ideology, South of Market, San Francisco, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Ted Nelson, Telecommunications Act of 1996, theory of mind, urban renewal, Vannevar Bush, Whole Earth Catalog, Whole Earth Review, Yom Kippur War

In his book Cybernetics; or, Control and Communication in the Animal and the Machine, he defined cybernetics as a field focused on “the study of messages as a means of controlling machinery and society,” with machinery seeming to include, by analogy at least, biological organisms. For Wiener, the world, like the anti-aircraft predictor, was composed of systems linked by, and to some extent made out of, messages. Drawing on Claude Shannon’s information theory (published in 1948, but likely familiar to Wiener much earlier), Wiener defined messages as “forms of pattern and organization.”37 Like Shannon’s information, Wiener’s messages were surrounded by “noise,” yet they somehow maintained their integrity. So too did organisms and machines: incorporating and responding to feedback through structural mechanisms, Wiener explained, both kept themselves in a state of homeostasis.

Cybernetics, as the theory of control mechanisms in technology and nature and founded on the concepts of information and feedback, is but a part of a general theory of systems; cybernetic systems are a special case, however important, of systems showing self-regulation.” Bertalanffy, General System Theory, 3. For Bertalanffy, cybernetics was only one root of systems theory, albeit an important one. Others included the servomechanisms of the nineteenth century, Claude Shannon’s information theory, Von Neumann and Morgenstern’s game theory, and the increasing need in the post–World War II world to monitor and control large systems for social functions such as traffic and finance. For a critical analysis of the relationship between cybernetics and other systems theories, see Lilienfeld, Rise of Systems Theory. 44. Rau, “Adoption of Operations Research,” 57, 6. For a fascinating demonstration of the ways systems analysis helped set the aesthetic terms of planning for nuclear war, see Ghamari-Tabrizi, Worlds of Herman Kahn, esp. 54 –57, 128 –30. 45.

Sirius), 164 Goldstein, Emmanuel, 168, 169 Gore, Al, 219 Graham, Bill, 66 graphical user interface, 111 Grateful Dead, 13, 65, 66, 166 Great Society, 26 Greenblatt, Richard, 136 Greenspan, Alan, 215 Griesemer, James, 72 [ 319 ] Grooms, Red, 48 Gullichsen, Eric, 163 Gurdjieff, Georges Ivanovitch, Meetings with Remarkable Men, 187 hacker ethic, 134 –35, 136 hackers, 117, 132 –35, 133 Hackers’ Conference, 132, 137–38, 168, 169, 171, 219, 249, 254 hacking, as a free-speech issue, 169 Hafner, Katie, 143, 145, 221, 252 Hagel, John, Net Gain, 234 Haight-Ashbury, 32, 48, 66 – 67 Hapgood, Fred, 221–22 “happenings,” 48, 49, 67, 269n14 “hardware hackers,” 133 Harman, Willis, 61, 185, 274n12 Harper’s Magazine, 167 Hart, Pam, 117 Harvard Business Review, analysis of Out of Control, 204 –5 Harvey, David, 242 Hawken, Paul, 128, 185, 188 Hayles, Katherine, 26, 122 Hedgepeth, William, 77 Hefner, Christy, 211 Heims, Steve, 26, 122 Helmreich, Stefan, 198 Hermosillo, Carmen, 155 Herring, Susan, 152 Hertzfeld, Andy, 135 heterarchy, 156 Hewlett-Packard, 138 Hickman, Berry, 96 High Frontiers (’zine), 164 Hillis, Danny, 182, 183, 189 hippies, 32 Hiroshima, 16 Hitt, Jack, 167 Hofmann, Albert, 164 Hog Farm commune, 110 Holm, Richard, 44 Homebrew Computer Club, 70, 102, 106, 114 homeostat, 26, 146, 178 Horowitz, Ed, 208 Hoyt, Brad, 193 HTML code, 222 Hudson Institute, 186 Hudson Review, 47 human-machine collaboration, 108 –9, 111 hyperlinks, 213 [ 320 ] Index I Ching, 65, 82, 93 I. M. Pei, 178 Industry Standard, 207 information: economic paradox of, 136 –37; free dissemination of, 137 Information Processing Techniques Office, 108 Information Superhighway, 219 information system, material world imagined as, 15 information theory: and American art, 268n13; of Claude Shannon, 265n43; and microbiology, 43 – 44 Information Week, 131 Innis, Harold, 52, 269n21 Institute for Advanced Study, 185 Intel, 212 Intercontinental Ballistic Missile, 24 interdisciplinary migration, 58 International Federation for Advanced Study (IFAS), 61 Internet, 247; growth of, 160, 214; as infrastructure and symbol of new economic era, 7; as the New Millennium, 232 –36; privatization of backbone, 213; as symbol of a post-Fordist economic order, 202; utopian claims surrounding the emergence of the, 1–3, 33 Internet stocks, 214, 232 Inuit, 53 IT-290, 60 Jackson, Charles, 211 Jennings, Lois, 70 Jerome, Judson, 32 Jobs, Steve, 133, 138 Johnson, Lyndon, 26 Joselit, David, 46 journalism, shaping of public perceptions, 253 Joy, Bill, 220 juxtaposition, 84 Kahn, Herman, 130, 181, 186, 197 Kahn, Lloyd, 94, 95, 97 Kanter, Rosabeth Moss, 76, 271n9 Kapor, Mitch, 171–72, 218 Kaprow, Allan, 46, 48, 58, 67 Katz, Bruce, 211, 277n1 Kay, Alan, 111–13, 117, 177, 246 Kay, Lily, 44 Kelly, Kevin, 3, 16, 131–32; account of “vivisystems,” 200; commercial sphere as a site of social change, 202 –3; “computational metaphor,” 216; concept of “hive mind,” 201, 202, 204; doctrine of cyberevolutionism, 204; editorial model, 195; editor of Signal, 196; editor of Whole Earth Review, 177, 195 –96; as executive director of Wired, 7, 206, 209, 212, 217; and first Hackers’ Conference, 195; forum on hacking on the WELL, 168 –70; and Gilder, 223; and hacking community, 135; Internet as symbol of post-Fordist economy, 202; longing to return to an egalitarian world, 248; as network entrepreneur, 194 –99; “New Rules for the New Economy,” 15, 234 –35; Out of Control: The Rise of Neo-Biological Civilization, 176, 195, 199 –206; response to 1987 conference on artificial life, 199; review of 
Electric Word, 211; underplayed the work of embodied labor, 204; and the WELL, 148; on WELL design goals, 143 Keniston, Kenneth, 31; The Young Radicals, 261– 62 Kennedy, Alison (aka Queen Mu), 163 Kennedy, John F., 229, 271n10 Kent State University killings, 98, 118 Kepler’s bookstore, 70 Kerouac, Jack, 62 Kerr, Clark, 11, 12 Kesey, Ken: and geodesic dome, 94; leadership of Merry Pranksters, 63, 65, 67; and LSD, 61, 63; notion of Acid Test, 65; One Flew Over the Cuckoo’s Nest, 59 – 60, 64; rejection of agonistic politics, 64; subject of CIA experimental drug protocols, 60 – 61; and the Supplement, 81; and Trips Festival, 66; at Vietnam Day in 1965, 98 Keyworth, George, 222; “Magna Carta for the Knowledge Age,” 228 –30 Kleiner, Art, 131, 132, 135, 185 Kline, David, 287n37 Korzybski, Alfred, 62 Kravitz, Henry, 211 Kubrick, Stanley, 186 Kuhr, Barbara, 211, 285n2 Lama Foundation, 75, 76, 94, 97, 109, 119 Lampson, Butler, 111 Langton, Christopher, 198 Language Technology (magazine), 211 Lanier, Jaron, 163, 165, 172, 195 laser printer, 111 Index Last Whole Earth Catalog, 70, 81, 98, 112, 118 Learning Conferences, 181– 84 Leary, Timothy, 51, 163, 164, 165 legitimacy exchange, 25 –26, 84, 85, 88, 95, 250 Lehr, Stan, 210 Levy, Steven, 137, 139, 195; Hackers: Heroes of the Computer Revolution, 132 –35 Leyden, Peter, 233 –34 Libertarianism, 210, 249, 259, 287n49 Libre commune, 81, 94, 96, 109 Licklider, Joseph C.


pages: 242 words: 68,019

Why Information Grows: The Evolution of Order, From Atoms to Economies by Cesar Hidalgo

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Ada Lovelace, Albert Einstein, Arthur Eddington, assortative mating, Claude Shannon: information theory, David Ricardo: comparative advantage, Douglas Hofstadter, Everything should be made as simple as possible, frictionless, frictionless market, George Akerlof, Gödel, Escher, Bach, income inequality, income per capita, industrial cluster, information asymmetry, invention of the telegraph, invisible hand, Isaac Newton, James Watt: steam engine, Jane Jacobs, job satisfaction, John von Neumann, New Economic Geography, Norbert Wiener, p-value, Paul Samuelson, phenotype, price mechanism, Richard Florida, Ronald Coase, Rubik’s Cube, Silicon Valley, Simon Kuznets, Skype, statistical model, Steve Jobs, Steve Wozniak, Steven Pinker, The Market for Lemons, The Nature of the Firm, The Wealth of Nations by Adam Smith, total factor productivity, transaction costs, working-age population

It is physical order, like what distinguishes different shuffles of a deck of cards. What is surprising to most people, however, is that information is meaningless, even though the meaningless nature of information, much like its physicality, is often misunderstood. In 1949 Claude Shannon and Warren Weaver published a short book entitled The Mathematical Theory of Communication. In its first section, Weaver described the conceptual aspects of information. In the second section, Shannon described the mathematics of what we now know as information theory. For information theory to be properly understood, Shannon and Weaver needed to detach the word information from its colloquial meaning. Weaver made this distinction early on in his essay: “The word information, in this theory, is used in a special sense that must not be confused with its ordinary usage.

As the parts that made the Bugatti were pulled apart and twisted, the information that was embodied in the Bugatti was largely destroyed. This is another way of saying that the $2.5 million worth of value was stored not in the car’s atoms but in the way those atoms were arranged.3 That arrangement is information.4 So the value of the Bugatti is connected to physical order, which is information, even though people still debate what information is.5 According to Claude Shannon, the father of information theory, information is a measure of the minimum volume of communication required to uniquely specify a message. That is, it’s the number of bits we need to communicate an arrangement, like the arrangement of atoms that made the Bugatti. To grasp Shannon’s definition of information firmly, however, it is better to start with something simpler than a Bugatti. Here I will use a tweet.
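Shannon’s counting idea can be made concrete in a few lines of code. This is only an illustrative sketch of the tweet example the excerpt sets up: the 32-symbol alphabet and 140-character length are assumptions chosen for round numbers, not figures taken from the book.

```python
from math import log2

# Bits needed to single out one message from all equally likely alternatives:
# log2 of the number of alternatives.
alphabet_size = 32      # assumed: letters plus a few punctuation marks
tweet_length = 140      # assumed: an old-style 140-character tweet

bits_per_character = log2(alphabet_size)            # 5 bits per symbol
bits_for_tweet = tweet_length * bits_per_character  # 700 bits in total

print(f"{bits_per_character:.0f} bits per character, "
      f"{bits_for_tweet:.0f} bits to specify the whole tweet")
```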

Encoding and decoding messages was a mathematical problem that was too interesting to be abandoned as the war dwindled. Mathematicians continued to formalize the idea of information, but they framed their efforts in the context of communication technologies, transcending the efforts to decipher intercepted messages. The mathematicians who triumphed became known as the world’s first information theorists or cyberneticists. These pioneers included Claude Shannon, Warren Weaver, Alan Turing, and Norbert Wiener. In the 1950s and 1960s the idea of information took science by storm. Information was welcomed in all academic fields as a powerful concept that cut across scientific boundaries. Information was neither microscopic nor macroscopic.3 It could be inscribed sparsely on clay tablets or packed densely in a strand of DNA. For many practical purposes, the scale at which information was embodied was not crucial.


pages: 322 words: 88,197

Wonderland: How Play Made the Modern World by Steven Johnson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Ada Lovelace, Alfred Russel Wallace, Antoine Gombaud: Chevalier de Méré, Berlin Wall, bitcoin, Book of Ingenious Devices, Buckminster Fuller, Claude Shannon: information theory, Clayton Christensen, colonial exploitation, computer age, conceptual framework, crowdsourcing, cuban missile crisis, Drosophila, Edward Thorp, Fellow of the Royal Society, game design, global village, Hedy Lamarr / George Antheil, HyperCard, invention of air conditioning, invention of the printing press, invention of the telegraph, Islamic Golden Age, Jacquard loom, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, Jane Jacobs, John von Neumann, joint-stock company, Joseph-Marie Jacquard, land value tax, Landlord’s Game, lone genius, mass immigration, megacity, Minecraft, moral panic, Murano, Venice glass, music of the spheres, Necker cube, New Urbanism, Oculus Rift, On the Economy of Machinery and Manufactures, pattern recognition, peer-to-peer, pets.com, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, QWERTY keyboard, Ray Oldenburg, spice trade, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, supply-chain management, talking drums, the built environment, The Great Good Place, the scientific method, The Structural Transformation of the Public Sphere, trade route, Turing machine, Turing test, Upton Sinclair, urban planning, Victor Gruen, Watson beat the top human players on Jeopardy!, white flight, white picket fence, Whole Earth Catalog, working poor, Wunderkammern

He was, instead, a computer scientist from MIT named Edward Thorp, who had come to Vegas not to break the bank but rather to test a brand-new device: the very first wearable computer ever designed. Thorp had an accomplice at the roulette table, standing unobserved at the other end, pretending not to know his partner. He would have been unrecognizable to the average casino patron, but he was in fact one of the most important minds of the postwar era: Claude Shannon, the father of information theory and one of the key participants in the invention of digital computers. Thorp had begun thinking about beating the odds at roulette as a graduate student in physics at UCLA in 1955. Unlike card games like blackjack or poker where strategy could make a profound difference in outcomes, roulette was supposed to be a game of pure chance; the ball was equally likely to end up on any number on the wheel.

Turing’s speculations form a kind of origin point for two parallel paths that would run through the rest of the century: building intelligence into computers by teaching them to play chess, and studying humans playing chess as a way of understanding our own intelligence. Those interpretative paths would lead to some extraordinary breakthroughs: from the early work on cybernetics and game theory from people like Claude Shannon and John von Neumann, to machines like IBM’s Deep Blue that could defeat grandmasters with ease. In cognitive science, the litany of insights that derived from the study of chess could almost fill an entire textbook, insights that have helped us understand the human capacity for problem solving, pattern recognition, visual memory, and the crucial skill that scientists call, somewhat awkwardly, chunking, which involves grouping a collection of ideas or facts into a single “chunk” so that they can be processed and remembered as a unit.

In 1960, having moved on to MIT, Thorp decided to try publishing his blackjack analysis in the Proceedings of the National Academy of Sciences, and sought out the advice of Shannon, the only mathematician at MIT who was also a member of the academy. Impressed by Thorp’s blackjack system, Shannon inquired whether Thorp was working on anything else “in the gambling area.” Dormant for five years, Thorp’s roulette investigation was suddenly reawakened as the two men began a furious year of activity, seeking a predictable pattern in the apparent randomness of the roulette wheel.
[Figure: Claude Shannon with an electronic mouse]
In his old, rambling wooden house outside of Cambridge, Shannon had created a basement exploratorium that would have astounded Merlin and Babbage. Thorp later described it as a “gadgeteer’s paradise”: It had perhaps a hundred thousand dollars (about six hundred thousand 1998 dollars) worth of electronic, electrical and mechanical items. There were hundreds of mechanical and electrical categories, such as motors, transistors, switches, pulleys, gears, condensers, transformers, and on and on.


pages: 238 words: 46

When Things Start to Think by Neil A. Gershenfeld

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, Ada Lovelace, Bretton Woods, cellular automata, Claude Shannon: information theory, Dynabook, Hedy Lamarr / George Antheil, I think there is a world market for maybe five computers, invention of movable type, Iridium satellite, Isaac Newton, Jacquard loom, Jacquard loom, John von Neumann, means of production, new economy, Nick Leeson, packet switching, RFID, speech recognition, Stephen Hawking, Steve Jobs, telemarketer, the medium is the message, Turing machine, Turing test, Vannevar Bush

Turing thought so; in 1947 he was able to describe a chess-playing computer program. Since then computer chess has been studied by a who’s who of computing pioneers who took it to be a defining challenge for what came to be known as Artificial Intelligence. It was thought that if a machine could win at chess it would have to draw on fundamental insights into how humans think. Claude Shannon, the inventor of Information Theory, which provides the foundation for modern digital communications, designed a simple chess program in 1949 and was able to get it running to play endgames. The first program that could play a full game of chess was developed at IBM in 1957, and an MIT computer won the first tournament match against a human player in 1967. The first grandmaster lost a game to a computer in 1977.

There is a disconnect between the breathless pronouncements of cyber gurus and the experience of ordinary people left perpetually upgrading hardware to meet the demands of new software, or wondering where their files have gone, or trying to understand why they can’t connect to the network. The revolution so far has been for the computers, not the people. Digital data of all kinds, whether an e-mail message or a movie, is encoded as a string of 0’s and 1’s because of a remarkable discovery by Claude Shannon and John von Neumann in the 1940s. Prior to their work, it was obvious that engineered systems degraded with time and use. A tape recording sounds worse after it is duplicated, a photocopy is less satisfactory than an original, a telephone call becomes more garbled the farther it has to travel. They showed that this is not so for a digital representation. Errors still occur in digital systems, but instead of continuously degrading the performance there is a threshold below which errors can be corrected with near certainty.
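One classic way a digital system buys that near-certain correction is simple redundancy. The sketch below is an assumed illustration (a three-fold repetition code with majority voting), not the specific scheme Gershenfeld has in mind; it shows how a 5 percent raw error rate collapses to well under 1 percent after decoding, and longer codes push the residual error arbitrarily close to zero.

```python
import random

def encode(bits):
    """Repeat every bit three times (adds redundancy)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, error_rate):
    """Flip each bit independently with probability error_rate."""
    return [bit ^ (random.random() < error_rate) for bit in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(1)
message = [random.randint(0, 1) for _ in range(10_000)]
received = decode(noisy_channel(encode(message), error_rate=0.05))
errors = sum(a != b for a, b in zip(message, received))
print(f"residual errors after decoding: {errors} of {len(message)} bits")
```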

Many people spent many years unsuccessfully trying to exorcise Maxwell's demon. An important step came in 1929 when Leo Szilard reduced the problem to its essence with a single molecule that could be on either side of a partition. While he wasn't able to solve the demon paradox, this introduced the notion of a "bit" of information. Szilard's one-bit analysis of Maxwell's demon provided the inspiration for Claude Shannon's theory of information in 1948. Just as the steam engine powered the Industrial Revolution, electronic communications was powering an information revolution. And just as finding the capacity of a steam engine was a matter of some industrial import, the growing demand for communications links required an understanding of how many messages could be sent through a wire. Thanks to Szilard, Shannon realized that entropy could measure the capacity of a telephone wire as well as an engine.


pages: 561 words: 120,899

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant From Two Centuries of Controversy by Sharon Bertsch McGrayne

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Bayesian statistics, bioinformatics, British Empire, Claude Shannon: information theory, Daniel Kahneman / Amos Tversky, double helix, Edmond Halley, Fellow of the Royal Society, full text search, Henri Poincaré, Isaac Newton, John Markoff, John Nash: game theory, John von Neumann, linear programming, meta analysis, meta-analysis, Nate Silver, p-value, Pierre-Simon Laplace, placebo effect, prediction markets, RAND corporation, recommendation engine, Renaissance Technologies, Richard Feynman, Richard Feynman, Richard Feynman: Challenger O-ring, Ronald Reagan, speech recognition, statistical model, stochastic process, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, uranium enrichment, Yom Kippur War

While Bayes has helped revolutionize modern life on the web, it is also helping to finesse the Tower of Babel that has separated linguistic communities for millennia. During the Second World War, Warren Weaver of the Rockefeller Foundation was impressed with how “a multiplicity of languages impedes cultural interchange between the peoples of the earth and is a serious deterrent to international understanding.”6 Struck by the power of mechanized cryptography and by Claude Shannon’s new information theory, Weaver suggested that computerized statistical methods could treat translation as a cryptography problem. In the absence of computer power and a wealth of machine-readable text, Weaver’s idea lay fallow for decades. Ever since, the holy grail of translators has been a universal machine that can transform written and spoken words from one language into any other. As part of this endeavor, linguists like Noam Chomsky developed structural rules for English sentences, subjects, verbs, adjectives, and grammar but failed to produce an algorithm that could explain why one string of words makes an English sentence while another string does not.

Yet at the very same time practical problem solvers relied on it to deal with real-world emergencies. One spectacular success occurred during the Second World War, when Alan Turing developed Bayes to break Enigma, the German navy’s secret code, and in the process helped to both save Britain and invent modern electronic computers and software. Other leading mathematical thinkers—Andrei Kolmogorov in Russia and Claude Shannon in New York—also rethought Bayes for wartime decision making. During the years when ivory tower theorists thought they had rendered Bayes taboo, it helped start workers’ compensation insurance in the United States; save the Bell Telephone system from the financial panic of 1907; deliver Alfred Dreyfus from a French prison; direct Allied artillery fire and locate German U-boats; and locate earthquake epicenters and deduce (erroneously) that Earth’s core consists of molten iron.

In a fundamental breakthrough, Turing realized he could not systematize his hunches or compare their probabilities without a unit of measurement. He named his unit a ban for Banburismus and defined it as “about the smallest change in weight of evidence that is directly perceptible to human intuition.”13 One ban represented odds of 10 to 1 in favor of a guess, but Turing normally dealt with much smaller quantities, decibans and even centibans. The ban was basically the same as the bit, the measure of information Claude Shannon discovered by using Bayes’ rule at roughly the same time at Bell Telephone Laboratories. Turing’s measure of belief, the ban, and its supporting mathematical framework have been called his greatest intellectual contribution to Britain’s defense. To estimate the probability of a guess when information was arriving piecemeal, Turing used bans to discriminate between sequential hypotheses. He was thus one of the first to develop what came to be called sequential analysis.
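Turing’s ban is, in the usual reading, a base-10 logarithm of an odds (likelihood) ratio, so it differs from Shannon’s bit only by a constant factor of log2(10) ≈ 3.32. The sketch below assumes that standard definition; the example ratios are illustrative.

```python
from math import log10, log2

def weight_of_evidence(likelihood_ratio):
    """Evidence carried by one observation, in bans, decibans, and bits.

    likelihood_ratio = P(observation | hypothesis) / P(observation | not hypothesis);
    a ratio of 10 is Turing's one ban, odds of 10 to 1 in favour of the guess.
    """
    bans = log10(likelihood_ratio)
    return bans, 10 * bans, bans * log2(10)

for ratio in (10, 1.26, 1.023):   # roughly one ban, one deciban, one centiban
    bans, decibans, bits = weight_of_evidence(ratio)
    print(f"ratio {ratio:>6}: {bans:6.3f} bans = {decibans:6.2f} decibans = {bits:5.3f} bits")
```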


pages: 463 words: 118,936

Darwin Among the Machines by George Dyson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anti-communist, British Empire, carbon-based life, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer age, Danny Hillis, Donald Davies, fault tolerance, Fellow of the Royal Society, finite state, IFF: identification friend or foe, invention of the telescope, invisible hand, Isaac Newton, Jacquard loom, Jacquard loom, James Watt: steam engine, John Nash: game theory, John von Neumann, Menlo Park, Nash equilibrium, Norbert Wiener, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, phenotype, RAND corporation, Richard Feynman, Richard Feynman, spectrum auction, strong AI, the scientific method, The Wealth of Nations by Adam Smith, Turing machine, Von Neumann architecture, zero-sum game

By repeated sorting and other iterated functions, primitive punched-card machines could perform complex operations, but, like the original Turing machine, they had only a small number of possible states. The fundamental unit of information was the bit; its explicit definition as the contraction of “binary digit” was first noted in an internal Bell Laboratories memo written by John W. Tukey on 9 January 1947,20 and first published in Claude Shannon’s Mathematical Theory of Communication in 1948.21 Shannon’s definition was foreshadowed by Vannevar Bush’s analysis, in 1936, of the number of “bits of information” that could be stored on a punched card. In those days bits were assigned only fleetingly to electrical or electronic form. Most bits, most of the time, were bits of paper (or bits of missing paper, represented by the chad that was carted off to landfills by the ton).

The trustees did not want to lose von Neumann to another institution, so they gave him the go-ahead. In November 1945, a committee led by von Neumann and including Herman Goldstine (still attached to the ENIAC project at the Moore School) held its first meeting in Vladimir Zworykin’s office at RCA. Also in attendance was John Tukey, a statistician with Bell Telephone Laboratories (and the originator of information theorist Claude Shannon’s landmark contraction of “binary digit” to “bit”). Von Neumann issued a memorandum of objectives, concluding that “it is to be expected that the future evolution of high-speed computing will be decisively influenced by the experiences gained.”22 By the spring of 1946 the project was under way and staff, led by Goldstine, were signing on. Goldstine, Arthur Burks, and von Neumann set to work developing the logical plan of the computer, released in June 1946 as Preliminary Discussion of the Logical Design of an Electronic Computing Instrument, revised in September 1947, and followed by a three-volume report, Planning and Coding of Problems for an Electronic Computing Instrument (1947–1948).

Most early digital computers—from the Colossus to the IAS machine—used paper-tape teletype equipment for input and output between the computer and the outside world, augmented by the ubiquitous punched-card equipment from Hollerith and IBM. It was only natural that the first computers incorporated high-speed telegraphic equipment, and it is no accident that the genesis of the Colossus within the British Telecommunications Research Establishment was mirrored in the United States by early steps taken toward computers by Claude Shannon and others within Bell Laboratories and RCA. Only later did the communications industry and the computer industry become temporarily estranged. The solitary computers of the early 1950s exchanged code sequences by means of mutually intelligible storage media and, before the end of the decade, by connecting directly or, in language that has now been extended to human beings, on-line. But no matter what the medium of exchange, the code itself and the protocols that regulate its flow remain directly descended from the first strings of telegraphic bits.


pages: 319 words: 90,965

The End of College: Creating the Future of Learning and the University of Everywhere by Kevin Carey

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, barriers to entry, Bayesian statistics, Berlin Wall, business intelligence, carbon-based life, Claude Shannon: information theory, complexity theory, David Heinemeier Hansson, declining real wages, deliberate practice, discrete time, double helix, Douglas Engelbart, Douglas Engelbart, Downton Abbey, Drosophila, Firefox, Frank Gehry, Google X / Alphabet X, informal economy, invention of the printing press, inventory management, John Markoff, Khan Academy, Kickstarter, low skilled workers, Lyft, Marc Andreessen, Mark Zuckerberg, meta analysis, meta-analysis, natural language processing, Network effects, open borders, pattern recognition, Peter Thiel, pez dispenser, ride hailing / ride sharing, Ronald Reagan, Ruby on Rails, Sand Hill Road, self-driving car, Silicon Valley, Silicon Valley startup, social web, South of Market, San Francisco, speech recognition, Steve Jobs, technoutopianism, transcontinental railway, Vannevar Bush

Bill was enacted, the director of the national Office of Scientific Research and Development, Vannevar Bush, sent a report to President Truman titled Science: The Endless Frontier. Bush had a doctorate in electrical engineering from MIT, where he had served as a scientist and administrator. He and his colleagues had made important contributions to the emerging development of computer science; his student Claude Shannon helped develop the information theory that sits at the heart of modern computing. Science, Bush said, was a source of great good for humanity. Penicillin and other medical advances had saved countless lives. “In 1939 millions of people were employed in industries which did not even exist at the close of the last war—radio, air conditioning, rayon and other synthetic fibers, and plastics. . . . But these things do not mark the end of progress—they are but the beginning if we make full use of our scientific resources.”

Six months later, Simon attended a conference at Dartmouth College, where he and a small group of scientists gave a name to this new field of research: artificial intelligence. The study of the human mind and the exploding power of information technology were coming together, and the smartest people in the world were in the middle of the action. Among the Dartmouth participants was Claude Shannon, a former student of Vannevar Bush at MIT and one of the fathers of modern information theory. In addition to creating the blueprint for the Cold War university, Bush had also seen the future technological revolution. In a 1945 Atlantic article titled “As We May Think,” Bush observed that various fields of manufacturing and computing were on trajectories of improvement that would soon lead to changes that, although they might be unknowable in the specific, were highly predictable in general.

The university relished becoming a breeding ground for thunder lizards because those unruly beasts always left their nest unscathed. Until, that is, the summer of 2011, when a computer science professor named Sebastian Thrun had a kind of inspiration. Thrun was born in Germany and trained in computer science and statistics at the University of Bonn. His specialty was artificial intelligence, continuing the project first outlined at the Dartmouth conference by Herbert Simon, Claude Shannon, and others back in 1956. Carnegie Mellon hired Thrun as a professor in 1995, and he spent most of the next decade in Pittsburgh working at the intersection of computer science, statistics, and machines. Artificial intelligence had come in and out of fashion over the years. Initial hopes for replicating the human mind in silicon had proved highly optimistic as the parallel march of neuroscience and cognitive psychology revealed how fantastically complicated human cognition truly was.


pages: 566 words: 122,184

Code: The Hidden Language of Computer Hardware and Software by Charles Petzold

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Bill Gates: Altair 8800, Claude Shannon: information theory, computer age, Donald Knuth, Douglas Engelbart, Douglas Engelbart, Dynabook, Eratosthenes, Grace Hopper, invention of the telegraph, Isaac Newton, Jacquard loom, Jacquard loom, James Watt: steam engine, John von Neumann, Joseph-Marie Jacquard, Louis Daguerre, millennium bug, Norbert Wiener, optical character recognition, popular electronics, Richard Feynman, Richard Feynman, Richard Stallman, Silicon Valley, Steve Jobs, Turing machine, Turing test, Vannevar Bush, Von Neumann architecture

Long after core memory had been replaced by other technologies, it was common to hear older programmers refer to the memory that the processor accessed as core. John von Neumann wasn't the only person doing some major conceptual thinking about the nature of computers in the 1940s. Claude Shannon (born 1916) was another influential thinker. In Chapter 11, I discussed his 1938 master's thesis, which established the relationship between switches, relays, and Boolean algebra. In 1948, while working for Bell Telephone Laboratories, he published a paper in the Bell System Technical Journal entitled "A Mathematical Theory of Communication" that not only introduced the word bit in print but established a field of study today known as information theory. Information theory is concerned with transmitting digital information in the presence of noise (which usually prevents all the information from getting through) and how to compensate for that.

In this case, the input switches correspond to 4 bits of information that describe a cat. The output device is the lightbulb. This bulb lights up if the switches describe a satisfactory cat. The switches shown in the control panel on page 104 are set for a female unneutered black cat. This satisfies your criteria, so the lightbulb is lit. Now all we have to do is design a circuit that makes this control panel work. You'll recall that Claude Shannon's thesis was entitled "A Symbolic Analysis of Relay and Switching Circuits." The relays he was referring to were quite similar to the telegraph relays that we encountered in Chapter 6. By the time of Shannon's paper, however, relays were being used for other purposes and, in particular, in the vast network of the telephone system. Like switches, relays can be connected in series and in parallel to perform simple tasks in logic.
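The series/parallel correspondence Shannon exploited can be mimicked directly with Boolean values. This is a toy sketch of the general idea only; the actual wiring of the cat-selection panel is worked out in the book, not reproduced here.

```python
def series(switch_a, switch_b):
    # Current flows through two switches in series only if both are closed: AND.
    return switch_a and switch_b

def parallel(switch_a, switch_b):
    # Current flows through two switches in parallel if either is closed: OR.
    return switch_a or switch_b

# The truth tables match Boolean AND / OR exactly, which is Shannon's observation.
for a in (False, True):
    for b in (False, True):
        print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))
```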

This is the same as combining the two operands with the Boolean OR operator and then inverting the result (which is the NOR). In the second expression, the two operands are inverted and then combined with the Boolean OR operator. This is the same as combining the operands with the Boolean AND operator and then inverting (which is the NAND). De Morgan's Laws are an important tool for simplifying Boolean expressions and hence, for simplifying circuits. Historically, this was what Claude Shannon's paper really meant for electrical engineers. But obsessively simplifying circuits won't be a major concern in this book. It's preferable to get things working rather than to get things working as simply as possible. And what we're going to get working next is nothing less than an adding machine. Chapter 12. A Binary Adding Machine Addition is the most basic of arithmetic operations, so if we want to build a computer (and that is my hidden agenda in this book), we must first know how to build something that adds two numbers together.
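Both identities can be checked exhaustively, since each has only four input combinations. The snippet below is an added illustration, not from the book.

```python
from itertools import product

# De Morgan's Laws, as described above:
#   NOT (A OR B)  == (NOT A) AND (NOT B)   -- the NOR form
#   NOT (A AND B) == (NOT A) OR  (NOT B)   -- the NAND form
for a, b in product((False, True), repeat=2):
    assert (not (a or b)) == ((not a) and (not b))
    assert (not (a and b)) == ((not a) or (not b))
print("De Morgan's Laws hold for every combination of inputs")
```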


pages: 223 words: 52,808

Intertwingled: The Work and Influence of Ted Nelson (History of Computing) by Douglas R. Dechow

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, Apple II, Bill Duvall, Brewster Kahle, Buckminster Fuller, Claude Shannon: information theory, cognitive dissonance, computer age, conceptual framework, Douglas Engelbart, Douglas Engelbart, Dynabook, Edward Snowden, game design, HyperCard, hypertext link, information retrieval, Internet Archive, Jaron Lanier, knowledge worker, linked data, Marc Andreessen, Marshall McLuhan, Menlo Park, Mother of all demos, pre–internet, RAND corporation, semantic web, Silicon Valley, software studies, Steve Jobs, Steve Wozniak, Stewart Brand, Ted Nelson, the medium is the message, Vannevar Bush, Wall-E, Whole Earth Catalog

Xanadu is a daring design that presented awesome challenges: how to link to evolving documents, how to track changes to a document, how to manipulate linkages, how to organize archival storage, how to name the target of a link, how to track micro-copyright royalties, how to organize the physical storage of a universe of discourse and how to scale storage and processing around the world. Many people are still skeptical of the need for bi-directional links. I am one who suspects links might only occasionally need to be bi-directional, or that a pair of one-way links could simulate a bi-directional link. Claude Shannon’s popular demonstration of his computer-controlled maze-navigating mouse was essential to the success of his project. Shannon’s demonstration appeared as a segment in the television show, Time Machine: Robots [2]. Shannon went to a lot of trouble to prepare a tabletop maze and to eliminate any arm or cord connecting the mouse to the computer. This greatly enhanced the impact of his presentation, and of his theory.
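The remark about simulating a bi-directional link with a pair of one-way links can be sketched in a few lines. This is a toy illustration of that one point, with made-up document names; it is not a description of Xanadu’s actual link model.

```python
# Two one-way link tables kept in sync simulate a bi-directional link.
forward_links = {}   # source document -> set of targets
back_links = {}      # target document -> set of sources

def add_link(source, target):
    forward_links.setdefault(source, set()).add(target)
    back_links.setdefault(target, set()).add(source)   # the mirrored one-way link

add_link("computer-lib.txt", "dream-machines.txt")
print(forward_links["computer-lib.txt"])   # follow the link forward
print(back_links["dream-machines.txt"])    # ...or discover who links here
```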

Open Access This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited. References 1. Nelson TH (1974) Computer lib: you can and must understand computers now / Dream machines. Hugo’s Book Service, Chicago 2. Time Machine: Robots. History Channel. Aug.–Sept. 2000. Television. See segment on Claude Shannon, 1952, Theseus Maze-Solving Mouse (begins at 9:16 in the video). http://youtu.be/KmURvu4x0Do Part III Hypertext and Ted Nelson-Influenced Research © The Author(s) 2015 Douglas R. Dechow and Daniele C. Struppa (eds.), Intertwingled, History of Computing, 10.1007/978-3-319-16925-5_9. 9. The Importance of Ted’s Vision. Belinda Barnet (1) Media and Communications, Faculty of Health, Arts and Design, School of Arts, Social Sciences and Humanities, Department of Media and Communication, Swinburne University of Technology, Hawthorn, Australia. Email: BBarnet@groupwise.swin.edu.au Sometimes a journey makes itself necessary, as Anne Carson puts it in Autobiography of Red [3, p. 46].

[Figure: From Nelson, Computer Lib/Dream Machines (Courtesy of Theodor Holm Nelson)]
The problem of the relationship between coding and thinking has always been central to the work of Theodor Holm Nelson, and a key aspect of his influence both inside and outside computer fields has been his unwavering insistence on the epistemological consequences of this relationship, often discussed under the rubric he calls “systems humanism.” While there is every reason to read Nelson as a figure in the modern history of information theory and design, there are as many reasons to read him in the stream of the contemporary humanities. More concretely, there are excellent reasons to consider Nelson’s work—from his earliest efforts such as the literary journal, Nothing, through to his visionary samizdat manifesto, Computer Lib/Dream Machines, and his recent work reconceptualizing the spreadsheet—as a guide to the universe of paper as it is to that of the screen.


pages: 634 words: 185,116

From eternity to here: the quest for the ultimate theory of time by Sean M. Carroll

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, Albert Michelson, anthropic principle, Arthur Eddington, Brownian motion, cellular automata, Claude Shannon: information theory, Columbine, cosmic microwave background, cosmological constant, cosmological principle, dark matter, dematerialisation, double helix, en.wikipedia.org, gravity well, Harlow Shapley and Heber Curtis, Henri Poincaré, Isaac Newton, John von Neumann, Lao Tzu, lone genius, New Journalism, Norbert Wiener, pets.com, Pierre-Simon Laplace, Richard Feynman, Richard Feynman, Richard Stallman, Schrödinger's Cat, Slavoj Žižek, Stephen Hawking, stochastic process, the scientific method, wikimedia commons

There’s nothing wrong with that; after all, Boltzmann and Gibbs were proposing definitions to supersede Clausius’s perfectly good definition of entropy, which is still used today under the rubric of “thermodynamic” entropy. After quantum mechanics came on the scene, John von Neumann proposed a formula for entropy that is specifically adapted to the quantum context. As we’ll discuss in the next chapter, Claude Shannon suggested a definition of entropy that was very similar in spirit to Gibbs’s, but in the framework of information theory rather than physics. The point is not to find the one true definition of entropy; it’s to come up with concepts that serve useful functions in the appropriate contexts. Just don’t let anyone bamboozle you by pretending that one definition or the other is the uniquely correct meaning of entropy. Just as there are many definitions of entropy, there are many different “arrows of time,” another source of potential bamboozlement.

But the question remains with us, only in a more specific form: Why did the universe have a low entropy near the Big Bang? 9 INFORMATION AND LIFE You should call it entropy, for two reasons. In the first place, your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage. —John von Neumann, to Claude Shannon144 In a celebrated episode in Swann’s Way, Marcel Proust’s narrator is feeling cold and somewhat depressed. His mother offers him tea, which he reluctantly accepts. He is then pulled into an involuntary recollection of his childhood by the taste of a traditional French teatime cake, the madeleine. And suddenly the memory appeared. That taste was the taste of the little piece of madeleine which on Sunday mornings at Combray . . . when I went to say good morning to her in her bedroom, my aunt Léonie would give me after dipping it in her infusion of tea or lime blossom . . .

To be clear: In the final analysis, none of these thought experiments are letting us violate the Second Law. Rather, they provide ways that we could appear to violate the Second Law, if we didn’t properly account for the crucial role played by information. The information collected and processed by the Demon must somehow be accounted for in any consistent story of entropy. The concrete relationship between entropy and information was developed in the 1940s by Claude Shannon, an engineer/mathematician working for Bell Labs.153 Shannon was interested in finding efficient and reliable ways of sending signals across noisy channels. He had the idea that some messages carry more effective information than others, simply because the message is more “surprising” or unexpected. If I tell you that the Sun is going to rise in the East tomorrow morning, I’m not actually conveying much information, because you already expected that was going to happen.
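Shannon’s “surprise” intuition has a simple quantitative form: the information in a single message can be scored as −log2 of its probability, so near-certain news carries almost no bits. The probabilities below are illustrative assumptions, chosen only to echo the sunrise example.

```python
from math import log2

def surprisal_bits(probability):
    """Shannon's measure of how informative one message is: -log2(p)."""
    return -log2(probability)

# Illustrative probabilities, not data from the book.
print(f"'The Sun will rise in the East':   {surprisal_bits(0.999999):.6f} bits")
print(f"'A fair coin came up heads':       {surprisal_bits(0.5):.1f} bits")
print(f"'A 1-in-a-million event occurred': {surprisal_bits(1e-6):.1f} bits")
```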


pages: 396 words: 112,748

Chaos by James Gleick

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Benoit Mandelbrot, butterfly effect, cellular automata, Claude Shannon: information theory, discrete time, Edward Lorenz: Chaos theory, experimental subject, Georg Cantor, Henri Poincaré, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Murray Gell-Mann, Norbert Wiener, pattern recognition, Richard Feynman, Richard Feynman, Stephen Hawking, stochastic process, trade route

Ottino and others with real fluids, the process of mixing—ubiquitous in nature and industry, yet still poorly understood—proved intimately bound up with the mathematics of chaos. The patterns revealed a stretching and folding that led back to the horseshoe map of Smale. THE MOST CHARACTERISTICALLY Santa Cruzian imprint on chaos research involved a piece of mathematics cum philosophy known as information theory, invented in the late 1940s by a researcher at the Bell Telephone Laboratories, Claude Shannon. Shannon called his work “The Mathematical Theory of Communication,” but it concerned a rather special quantity called information, and the name information theory stuck. The theory was a product of the electronic age. Communication lines and radio transmissions were carrying a certain thing, and computers would soon be storing this same thing on punch cards or magnetic cylinders, and the thing was neither knowledge nor meaning.

I knew hardly anything about Richard Feynman, but he has a cameo here (see here). Isaac Newton has more than a cameo: he seems to be the antihero of chaos, or the god to be overthrown. I discovered only later, reading his notebooks and letters, how wrong I’d been about him. And for twenty years I’ve been pursuing a thread that began with something Rob Shaw told me, about chaos and information theory, as invented by Claude Shannon. Chaos is a creator of information—another apparent paradox. This thread connects with something Bernardo Huberman said: that he was seeing complex behaviors emerge unexpectedly in information networks. Something was dawning, and we’re finally starting to see what it is. James Gleick Key West February 2008 Notes on Sources and Further Reading THIS BOOK DRAWS on the words of about two hundred scientists, in public lectures, in technical writing, and most of all in interviews conducted from April 1984 to December 1986.

Because information was stored in binary on-off switches newly designated as bits, bits became the basic measure of information. From a technical point of view, information theory became a handle for grasping how noise in the form of random errors interfered with the flow of bits. It gave a way of predicting the necessary carrying capacity of communication lines or compact disks or any technology that encoded language, sounds, or images. It offered a theoretical means of reckoning the effectiveness of different schemes for correcting errors—for example, using some bits as checks on others. It put teeth into the crucial notion of “redundancy.” In terms of Shannon’s information theory, ordinary language contains greater than fifty percent redundancy in the form of sounds or letters that are not strictly necessary to conveying a message.
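A crude way to see where the redundancy figure comes from is to compare the entropy of letter frequencies in a text sample against the log2(26) maximum for an alphabet used uniformly. The sketch below is an assumed, zeroth-order version (single-letter statistics only, on a stand-in sample), so it understates the more-than-fifty-percent figure, which requires accounting for spelling and grammar as well.

```python
from collections import Counter
from math import log2

def letter_entropy(text):
    """Zeroth-order entropy: bits per letter from single-letter frequencies only."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return -sum((n / total) * log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog " * 50   # stand-in corpus
h = letter_entropy(sample)
max_h = log2(26)
print(f"entropy: {h:.2f} bits/letter, single-letter redundancy: {1 - h / max_h:.0%}")
# Longer-range structure (spelling, grammar) pushes the true figure past 50%.
```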


pages: 329 words: 88,954

Emergence by Steven Johnson

A Pattern Language, agricultural Revolution, Brewster Kahle, British Empire, Claude Shannon: information theory, complexity theory, Danny Hillis, Douglas Hofstadter, edge city, epigenetics, game design, garden city movement, Gödel, Escher, Bach, hive mind, Howard Rheingold, hypertext link, invisible hand, Jane Jacobs, Kevin Kelly, late capitalism, Marshall McLuhan, mass immigration, Menlo Park, Murano, Venice glass, Naomi Klein, new economy, New Urbanism, Norbert Wiener, pattern recognition, pez dispenser, phenotype, Potemkin village, price mechanism, profit motive, Ray Kurzweil, slashdot, Socratic dialogue, stakhanovite, Steven Pinker, The Death and Life of Great American Cities, The Wealth of Nations by Adam Smith, theory of mind, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, trickle-down economics, Turing machine, Turing test, urban planning, urban renewal, Vannevar Bush

Early in his visit to Bell Labs, Turing hit upon the idea of using another Bell invention, the Vocoder—later used by rock musicians such as Peter Frampton to combine the sounds of a guitar and the human voice—as a way of encrypting speech. (By early 1943, Turing’s ideas had enabled the first secure voice transmission to cross the Atlantic, unintelligible to German eavesdroppers.) Bell Labs was the home base for another genius, Claude Shannon, who would go on to found the influential discipline of information theory, and whose work had explored the boundaries between noise and information. Shannon had been particularly intrigued by the potential for machines to detect and amplify patterns of information in noisy communication channels—a line of inquiry that promised obvious value to a telephone company, but could also save thousands of lives in a war effort that relied so heavily on the sending and breaking of codes.

Drawing upon research in molecular biology, genetics, physics, computer science, and Shannon’s information theory, Weaver divided the last few centuries of scientific inquiry into three broad camps. First, the study of simple systems: two or three variable problems, such as the rotation of planets, or the connection between an electric current and its voltage and resistance. Second, problems of “disorganized complexity”: problems characterized by millions or billions of variables that can only be approached by the methods of statistical mechanics and probability theory. These tools helped explain not only the behavior of molecules in a gas, or the patterns of heredity in a gene pool, but also helped life insurance companies turn a profit despite their limited knowledge about any individual human’s future health. Thanks to Claude Shannon’s work, the statistical approach also helped phone companies deliver more reliable and intelligible long-distance service.

And that learning too will be a kind of emergence, a higher-level order forming out of relatively simple component parts. Five years after his interactions with Turing, Shannon published a long essay in the Bell System Technical Journal that was quickly repackaged as a book called The Mathematical Theory of Communication. Dense with equations and arcane chapter titles such as “Discrete Noiseless Systems,” the book managed to become something of a cult classic, and the discipline it spawned—information theory—had a profound impact on scientific and technological research that followed, on both a theoretical and practical level. The Mathematical Theory of Communication contained an elegant, layman’s introduction to Shannon’s theory, penned by the esteemed scientist Warren Weaver, who had early on grasped the significance of Shannon’s work. Weaver had played a leading role in the Natural Sciences division of the Rockefeller Foundation since 1932, and when he retired in the late fifties, he composed a long report for the foundation, looking back at the scientific progress that had been achieved over the preceding quarter century.


pages: 460 words: 107,712

A Devil's Chaplain: Selected Writings by Richard Dawkins

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, Alfred Russel Wallace, Buckminster Fuller, butterfly effect, Claude Shannon: information theory, complexity theory, Desert Island Discs, double helix, Douglas Hofstadter, epigenetics, experimental subject, Fellow of the Royal Society, gravity well, Necker cube, out of africa, phenotype, placebo effect, random walk, Richard Feynman, Richard Feynman, Silicon Valley, stem cell, Stephen Hawking, Steven Pinker, the scientific method

Rather than engage in further recriminations and disputes about exactly what happened at the time of the interview, I shall try to redress the matter now in constructive fashion by answering the original question, the ‘Information Challenge’, at adequate length – the sort of length you can achieve in a proper article. The technical definition of ‘information’ was introduced by the American engineer Claude Shannon in 1948. An employee of the Bell Telephone Company, Shannon was concerned to measure information as an economic commodity. It is costly to send messages along a telephone line. Much of what passes in a message is not information: it is redundant. You could save money by recoding the message to remove the redundancy. Redundancy was a second technical term introduced by Shannon, as the inverse of information.

You could spend a lifetime reading in this ancient library and die unsated by the wonder of it. 1 See ‘Unfinished Correspondence with a Darwinian Heavyweight’ (pp. 256–62). 2 The producers never deigned to send me a copy: I completely forgot about it until an American colleague called it to my attention. 3 See Barry Williams, ‘Creationist deception exposed’, the Skeptic 18 (1998), 3, pp. 7–10, for an account of how my long pause (trying to decide whether to throw them out) was made to look like hesitant inability to answer the question, followed by an apparently evasive answer to a completely different question. 4 It is important not to blame Shannon for my verbal and intuitive way of expressing what I think of as the essence of his idea. Mathematical readers should go straight to the original, C. Shannon and W. Weaver, The Mathematical Theory of Communication (University of Illinois Press, 1949). Claude Shannon, by the way, had an imaginative sense of humour. He once built a box with a single switch on the outside. If you threw the switch, the lid of the box slowly opened, a mechanical hand appeared, reached down and switched off the box. It then put itself away and the lid closed. As Arthur C. Clarke said, ‘There is something unspeakably sinister about a machine that does nothing – absolutely nothing – except switch itself off.’ 5 These round figures are all decimal approximations.

When the prior uncertainty is some mixture of alternatives that are not equiprobable, Shannon’s formula becomes a slightly more elaborate weighted average, but it is essentially similar. By the way, Shannon’s weighted average is the same formula as physicists have used, since the nineteenth century, for entropy. The point has interesting implications but I shall not pursue them here.8 That’s enough background on information theory. It is a theory which has long held a fascination for me, and I have used it in several of my research papers over the years. Let’s now think how we might use it to ask whether the information content of genomes increases in evolution. First, recall the three-way distinction between total information capacity, the capacity that is actually used, and the true information content when stored in the most economical way possible.
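For readers who want the formula itself: the “slightly more elaborate weighted average” is usually written H = −Σ pᵢ log2 pᵢ, summed over the alternatives. The sketch below uses example probabilities of my own choosing to contrast equiprobable and lopsided cases.

```python
from math import log2

def shannon_entropy(probabilities):
    """Weighted-average information over alternatives: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Eight equally likely alternatives: exactly 3 bits of prior uncertainty.
print(shannon_entropy([1 / 8] * 8))
# Eight alternatives, one overwhelmingly likely: far less uncertainty to resolve.
print(shannon_entropy([0.93] + [0.01] * 7))
```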


pages: 272 words: 19,172

Hedge Fund Market Wizards by Jack D. Schwager

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

asset-backed security, backtesting, banking crisis, barriers to entry, beat the dealer, Bernie Madoff, Black-Scholes formula, British Empire, Claude Shannon: information theory, cloud computing, collateralized debt obligation, commodity trading advisor, computerized trading, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, delta neutral, diversification, diversified portfolio, Edward Thorp, family office, financial independence, fixed income, Flash crash, hindsight bias, implied volatility, index fund, intangible asset, James Dyson, Long Term Capital Management, margin call, market bubble, market fundamentalism, merger arbitrage, money market fund, oil shock, pattern recognition, pets.com, Ponzi scheme, private sector deleveraging, quantitative easing, quantitative trading / quantitative finance, Right to Buy, risk tolerance, risk-adjusted returns, risk/return, riskless arbitrage, Rubik’s Cube, Sharpe ratio, short selling, statistical arbitrage, Steve Jobs, systematic trading, technology bubble, transaction costs, value at risk, yield curve

The problem from the regulatory standpoint is that the parking concealed who really owned the positions. 6The basic strategy is a set of rules of when to stand, hit, split, and double, depending on the player’s card and the dealer’s card. For example, the basic strategy indicates standing with 16 if the dealer’s card is between two and six, and hitting otherwise. The basic strategy does not involve any card counting. 7The opening paragraph in the Wikipedia entry for Claude Shannon provides the following synopsis: Claude Elwood Shannon (April 30, 1916–February 24, 2001) was an American mathematician, electronics engineer, and cryptographer known as “the father of information theory.” Shannon is famous for having founded information theory with one landmark paper published in 1948. But he is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master’s student at MIT, he wrote a thesis demonstrating that electrical application of Boolean algebra could construct and resolve any logical, numerical relationship.
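The single rule quoted in that footnote can be written as a one-line function. This covers only that one case (a hard 16), not the full basic-strategy table.

```python
def play_hard_16(dealer_upcard):
    """Only the rule quoted above: stand on a hard 16 against a dealer upcard
    of two through six, hit otherwise. Not the full basic-strategy table."""
    return "stand" if 2 <= dealer_upcard <= 6 else "hit"

for upcard in (2, 6, 7, 10):
    print(f"dealer shows {upcard}: {play_hard_16(upcard)}")
```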

Track records such as Thorp’s prove conclusively that it is possible to beat the market and that the large group of economists who insist otherwise are choosing to believe theory over evidence.2 The contention that it is possible to beat the markets, however, does not say anything about the difficulty of the task. In fact, it is the difficulty in beating the market (the vast majority of market participants fail to do so) that helps create the illusion that markets are efficient. Thorp’s career encompasses an extraordinary number of first achievements: He co-developed (along with Claude Shannon) the first wearable computer that could be used to win at roulette. He developed the first blackjack betting strategy that provided a positive edge to the player, which he divulged in his global best seller, Beat the Dealer. The book changed the way casinos operate. Thorp along with Sheen Kassouf developed the first known systematic approach to trading warrants and other convertible securities (e.g., options, convertible bonds, convertible preferred stocks) by hedging them with offsetting stock positions, an approach they detailed in their book, Beat the Market.3 He was the first to formulate an option-pricing model that was equivalent to the Black-Scholes model.

The best way to do that was to get it published in the National Academy of Sciences, but you had to find a member who would submit the paper for you or else they wouldn’t take it. I researched the Cambridge area where I was located and found that there were two members. One member was an algebraist at Harvard who wouldn’t have any idea what I was talking about and probably wouldn’t have cared if he did. The other member was Claude Shannon at MIT. Shannon was a joint professor of mathematics and engineering, and one of only two Distinguished Professors at MIT. I went to his secretary and asked if I could get an appointment. She said, “He might see you for five minutes, but he doesn’t talk to people if he is not interested. So don’t expect more than a very brief interview.” Did you know anything about his reputation when you went to see him?


pages: 360 words: 85,321

The Perfect Bet: How Science and Math Are Taking the Luck Out of Gambling by Adam Kucharski

Ada Lovelace, Albert Einstein, Antoine Gombaud: Chevalier de Méré, beat the dealer, Benoit Mandelbrot, butterfly effect, call centre, Chance favours the prepared mind, Claude Shannon: information theory, collateralized debt obligation, correlation does not imply causation, diversification, Edward Lorenz: Chaos theory, Edward Thorp, Everything should be made as simple as possible, Flash crash, Gerolamo Cardano, Henri Poincaré, Hibernia Atlantic: Project Express, if you build it, they will come, invention of the telegraph, Isaac Newton, John Nash: game theory, John von Neumann, locking in a profit, Louis Pasteur, Nash equilibrium, Norbert Wiener, p-value, performance metric, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Richard Feynman, Richard Feynman, Ronald Reagan, Rubik’s Cube, statistical model, The Design of Experiments, Watson beat the top human players on Jeopardy!, zero-sum game

On one occasion, Thorp failed to come to the door when his in-laws arrived for dinner. They eventually found him inside rolling marbles along the kitchen floor in the midst of an experiment to find out how far each would travel. After completing his PhD, Thorp headed east to work at the Massachusetts Institute of Technology. There he met Claude Shannon, one of the university’s academic giants. Over the previous decade, Shannon had pioneered the field of “information theory,” which revolutionized how data are stored and communicated; the work would later help pave the way for space missions, mobile phones, and the Internet. Thorp told Shannon about the roulette predictions, and the professor suggested they continue the work at his house a few miles outside the city. When Thorp entered Shannon’s basement, it became clear quite how much Shannon liked gadgets.

Bet too much, and you risk wiping out your savings on an event that still has only a 50 percent chance of success; bet too little, and you won’t be fully exploiting your advantage. After Thorp put together his winning blackjack system, he turned his attention to the problem of such bankroll management. Given a particular edge over the casino, what was the optimal amount to bet? He found the answer in a formula known as the Kelly criterion. The formula is named after John Kelly, a gunslinging Texan physicist who worked with Claude Shannon in the 1950s. Kelly argued that, in the long run, you should wager a percentage of your bankroll equal to your expected profit divided by the amount you’ll receive if you win. For the coin toss above, the Kelly criterion would be the expected payoff ($0.50) divided by the potential winnings ($2.00). This works out to 0.25, which means you should bet a quarter of your available bankroll. In theory, wagering this amount will ensure good profits while limiting the risk of chipping away at your funds.
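Using only the numbers quoted in the passage, the calculation looks like this; the underlying coin-toss setup is described more fully in the book than in this excerpt, so treat the sketch as illustrative.

```python
def kelly_fraction(expected_profit, winnings_if_win):
    """Fraction of bankroll to stake, per the rule described above:
    expected profit divided by the amount received on a win."""
    return expected_profit / winnings_if_win

# The excerpt's figures: $0.50 expected profit, $2.00 collected on a win
# -> stake a quarter of the bankroll.
print(kelly_fraction(0.50, 2.00))   # 0.25
```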

.), 188 heads-up poker, 172, 186, 188, 195 See also Texas hold’em poker hedge funds, 96, 97, 99–100, 101, 120, 122, 204 hedging bets, 95, 96, 99 Heeb, Randal, 198, 199, 200, 201 Henry, Thierry, 104 Heuer, Andreas, 78 Hibbs, Albert, 3–4, 7, 21 Hibernia Atlantic, 113 Hilbert, David, 150 hockey, 85, 204, 205, 207 Holt, Matthew, 88 Hong Kong Jockey Club, 43, 90 horse racing bankroll management in, 65–67 betting syndicates and, 35, 52–53, 54, 56, 57–58, 64, 66–69, 103, 206 compared to golf, 84 fading of data availability limitations in, 73 favorite-long-shot bias in, 45, 46, 57 gambling law and, 198 and the Hong Kong tracks, 43–44, 54–55, 66–69, 74, 114 measuring individual performance in, 103 and the Monte Carlo method, 61–62 pari-mutuel system in, 43–45, 66, 114 Pinnacle Sports and, 93 prediction methods in, 46, 49–50, 51–54, 55–58, 64, 68, 69, 74, 206, 207, 216, 218 and quality measurement, 50–51, 74 and regression analysis, 49, 50 robotic betting and, 115–117 university courses studying, 215 “How to Gamble If You Must” (MIT course), 213–214 human insight, importance of, 105 human language, analyzing, 166 hybrid bots, 184 hydrogen bomb, 58, 59, 60, 61, 83, 202 IBM, 166–167 iceberg order, 115 ignorance first degree of, 2 second degree of, 3, 8, 11, 21–22 third degree of, 3, 8, 12, 21 illegal betting/gambling, 42, 81–82, 90–91, 101, 198 Illegal Gambling Business Act, 200 illusion, 218 imitation game, 170–171, 177, 194 incompleteness, 150, 168, 173 incompleteness theorem, 150, 176 inconsistencies, 150 Indian Premier League, 90 infinite monkey theorem, 156–157, 199 information theory, 11 information-processing problem, 168 inheritance, 47–49, 51, 106 in-play analysis, 97–98 Internet, benefits of, 72–73 investing, 96, 97–100, 204 See also hedge funds; stock/financial markets Irish National Lottery, 33 Jackson, Eric, 167 Jagger, Joseph, 7 Jennings, Ken, 165–166 Jeopardy! 
(TV game show), 165–166, 167, 171, 190 Johanson, Michael, 177, 183, 188, 189, 192 Johnson, Neil, 124 Journal of Applied Statistics (journal), 74 Julius, George, 44 Kalashnikov, Mikhail, 83 Kaplan, Michael, 172, 177 Kasparov, Garry, 166, 171, 176, 185 Kelly, John, 65 Kelly criterion, 65–67, 144 Kent, Michael, 79, 80–81, 82, 86–87, 102–103, 105, 107, 167 Kentucky Derby, 43 Keynes, John Maynard, 122, 123 King, Brian, 205 Klein, Matthew, 96 Klincewicz, Stefan, 33–34 Knight Capital, 117–118, 119 Laak, Phil, 185–186, 187 Laplace, Pierre Simon, 217 Larsen, Svend Egil, 119 Las Vegas Sports Consultants, 87–88 Latin square, 24 Le Monaco (newspaper), 5, 7 learning ability, 151, 161, 163, 171–172, 173, 174, 176, 177, 187, 188, 190, 217 legalization, 100–102 Leitner, Christoph, 76 Lickel, Charles, 166 lie detection, 190–191 limited-stakes poker, 172, 177, 185, 187, 189 limits, betting, 22, 140 Liverpool soccer club, 104 logistic map, 125–128 Lorenz, Edward, 9, 53, 124 Lorenz cipher, 26 lotteries betting syndicates and, 30, 31–32, 33–34, 202 consultant for, 141 controlled randomness and, 25–26 loopholes found in, 28, 30, 32, 218 roll downs in, 29–32, 33 rollovers in, 29, 33–34, 204 and roulette, biased view of, 98 scooping up jackpots in, 197 security team, 27 traditional, 28–34 university courses studying, 215 Lovelace, Ada, 175 luck in checkers, 156, 157 in chess, 168, 176, 202 habitual, 202 in life, 216 line between skill and, 204 measuring, 205, 217 notion of, exploring, by studying gambling, 216 in poker, 168, 186, 198, 200, 201, 215 probability and, 98 questioning, 28 rarity of profits coming from, 218 in roulette, 202 runs of, 5 skill versus, debate over, 198–200, 201–202, 203 in sports, 84, 85, 204–205 Ma, Will, 214–215 magic formula, 105 magic trick, 217–218 Maisel, Herbert, 37 Major League Baseball playoffs, 209 major sports, problem with focusing on, 107 Maldini, Paolo, 103 Management Science (journal), 50 Mandelbrot, Benoit, 162 man-machine competitions in checkers, 156, 159, 160 in chess, 166, 171, 176 in poker, 185–187 in TV game shows, 165–166, 171 Man-Machine World Championship, 156 maps as abstractions, 210, 211 logistic, 125–128 Markov, Andrei, 62 Markov chain, 62, 64 Markov chain Monte Carlo technique, 64 Markov property, 62, 63 Martinez, Roberto, 209 Massachusetts State Lottery, 29–32 May, Robert, 13, 125, 127, 128, 129, 131 Mazrooei, Parisa, 182 Mazur, Matt, 192–196 McDermott, James, 37 McHale, Ian, 89 mediocrity, regression to, 205 Mega-Millions, 29 memorizing, 179, 180 memory and competitiveness, 161–162 memory capacity, 179–181 Meston, A.

The End of Accounting and the Path Forward for Investors and Managers (Wiley Finance) by Feng Gu

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

active measures, Affordable Care Act / Obamacare, barriers to entry, business process, Claude Shannon: information theory, Clayton Christensen, commoditize, conceptual framework, corporate governance, creative destruction, Daniel Kahneman / Amos Tversky, discounted cash flows, diversified portfolio, double entry bookkeeping, Exxon Valdez, financial innovation, fixed income, hydraulic fracturing, index fund, information asymmetry, intangible asset, inventory management, Joseph Schumpeter, Kenneth Arrow, knowledge economy, moral hazard, new economy, obamacare, quantitative easing, quantitative trading / quantitative finance, QWERTY keyboard, race to the bottom, risk/return, Robert Shiller, Robert Shiller, shareholder value, Steve Jobs, The Great Moderation, value at risk

To see that, we first have to explain our emphasis on “new” information in the previous sentence. Why new? Simply, because as far as information usefulness is concerned, newness and timeliness are of the essence. This is a subtle issue that requires elaboration. Pardon the following brief tutorial, aimed at clarifying an important information principle, central to information (communication) theory, which was developed in the 1940s by Claude Shannon and Warren Weaver and played an important role in the development of computers and communication systems.1 The theory provides a measure of the amount of information conveyed by a message. For example: “It will start raining at 3:00 p.m. tomorrow.” This measure is based on the extent of surprise, or unexpectedness of the message to the receiver. If, for example, it is now November in Seattle, and it rained heavily the whole week, a forecast of rain continuation tomorrow isn’t that surprising, implying a low information content of the “rain tomorrow” forecast.

For a given information source, we include only events that do not coincide with the events of other information sources (e.g., we exclude managers’ forecasts that occur on the same day as quarterly earnings are announced). Sample firms included in Figure 4.1 are all US-listed companies with the required data, obtained from Compustat, CRSP, I/B/E/S First Call, and the S&P SEC Filings Database. NOTES 1. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Champaign–Urbana: University of Illinois Press, 1949). 2. Mathematically, the amount of information conveyed by a message is measured in communication theory by the logarithm of the ratio of the posterior (after the message was received) to the prior (before the message was received) probabilities of the event (e.g., rain at 3:00 pm tomorrow) occurring.
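To make note 2 concrete, here is a small sketch using the base-2 logarithm, so the answer comes out in bits; the prior probabilities below are invented for illustration and are not figures from the text:

```python
import math

def information_content(prior: float, posterior: float = 1.0) -> float:
    """Bits of information in a message that moves an event's probability
    from `prior` to `posterior` (the log-ratio measure described in note 2)."""
    return math.log2(posterior / prior)

# Rainy Seattle in November: rain tomorrow was already expected (assumed prior 0.9),
# so a confident "rain at 3:00 p.m." forecast conveys little information.
print(information_content(prior=0.9))   # ~0.15 bits
# The same forecast for a desert city (assumed prior 0.05) is a genuine surprise.
print(information_content(prior=0.05))  # ~4.3 bits
```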

See Book value of equity Subject Index Capitalization 89 Capital lease 222 Capital market-free test 67 Cash burn rate 175 Cash flows analyst prediction 24f concept, accounting impact (intention) 17–18 earnings, domination 18–19 metric 17 prediction, ease 22–24 increase 23f residual cash flows, usage 238f statement 5 strategies, earnings (contrast) 19f Cash flows per share (CPS) 23–24 Causal factors 231–232 Causation, examination 107–108 Cease and desist agreement 174 China National Petroleum, household name 180 Cisco, stockholder equity 126 Claims expenses 154 Coca-Cola assets 83 consumer product business 88 patents 78 Combined ratio 148 increase 150–151 Commission, sins 104 Commodities 78 Communication theory (information theory) 42 Companies balance sheet, inventory account 233 company-related information sources 44 earnings investor reaction 30–31 prediction 16f 8-K filings 106–107 estimates (above-median number), ROE prediction errors (increase) 101f market values 32–33 differences 33 249 patents information 171, 173 portfolios 218 performance prediction, ability (decline) 56–57 prospects, uncertainty (decrease) 65 trademarks, information 171, 173 Comparability 203 Competition enhancement 134 implementation concerns 205–206 Competitive advantage, achievement 120–121, 134 Competitive edge, assessment 232 Complexity.


pages: 374 words: 114,600

The Quants by Scott Patterson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, asset allocation, automated trading system, beat the dealer, Benoit Mandelbrot, Bernie Madoff, Bernie Sanders, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Brownian motion, buttonwood tree, buy low sell high, capital asset pricing model, centralized clearinghouse, Claude Shannon: information theory, cloud computing, collapse of Lehman Brothers, collateralized debt obligation, commoditize, computerized trading, Credit Default Swap, credit default swaps / collateralized debt obligations, diversification, Donald Trump, Doomsday Clock, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, fixed income, Gordon Gekko, greed is good, Haight Ashbury, I will remember that I didn’t make the world, and it doesn’t satisfy my equations, index fund, invention of the telegraph, invisible hand, Isaac Newton, job automation, John Meriwether, John Nash: game theory, law of one price, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, margin call, merger arbitrage, money market fund, Myron Scholes, NetJets, new economy, offshore financial centre, old-boy network, Paul Lévy, Paul Samuelson, Ponzi scheme, quantitative hedge fund, quantitative trading / quantitative finance, race to the bottom, random walk, Renaissance Technologies, risk-adjusted returns, Rod Stewart played at Stephen Schwarzman birthday party, Ronald Reagan, Sergey Aleynikov, short selling, South Sea Bubble, speech recognition, statistical arbitrage, The Chicago School, The Great Moderation, The Predators' Ball, too big to fail, transaction costs, value at risk, volatility smile, yield curve, éminence grise

The freshly minted mathematics professor shuddered, and his nerves jangled at the very thought of sitting down face-to-face with Claude Shannon. Few figures at MIT were more intimidating. Shannon was the brains behind two of the twentieth century’s greatest intellectual advances. The first was the application of the binary number system to electronic circuits, which laid the groundwork for the birth of the computer. Shannon’s great breakthrough had been to take a two-symbol logic in which problems are resolved by the manipulation of two numbers, 1 and 0, and apply it to a circuit in which a 1 is represented by a switch that is turned on and a 0 by a switch that is turned off. Sequences of on and off switches—essentially strings of 1s and 0s—could represent nearly any kind of information. Shannon was also a founding father of information theory: how to encode information and transmit it from point A to point B.
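A brief illustrative sketch (not from the book) of the two ideas in that passage: any message reduces to a string of 1s and 0s, and logic over those bits can be built entirely from the AND, OR, and NOT operations Shannon mapped onto switching circuits.

```python
# Represent a message as the on/off switch settings a Shannon-style circuit manipulates.
def to_bits(text: str) -> str:
    return "".join(format(byte, "08b") for byte in text.encode("ascii"))

print(to_bits("Hi"))  # '0100100001101001'

# Single-bit logic gates, the building blocks of a switching circuit.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# A small composite "circuit": exclusive-or built only from AND, OR, and NOT.
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

print([XOR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```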

In 1985, Ax moved the operation to Huntington Beach, California. Axcom was to act as the trading advisor for the fund, which was nominally run as an investing firm owned by a company Simons had founded in July 1982 called Renaissance Technologies. Soon Simons’s growing crew of quants added another math wizard, Elwyn Berlekamp, a game theory expert at Berkeley. Like Ed Thorp, Berlekamp had worked with Claude Shannon and John Kelly at MIT. He’d briefly met Simons during a stint at IDA in the 1960s. The fund put up solid returns for several years, even managing to trade through Black Monday with relatively little damage. In 1988, Ax and Simons renamed the fund Medallion in honor of a math award they’d both won. Almost as soon as they’d renamed the fund, things started going south for Medallion. In the second half of 1988, losses were piling up, and getting steeper every month.

“Continue.” Several hours later, Thorp left Shannon’s office and stepped out into the darkening November night. Thorp started paying regular visits to Shannon’s home later that November as the two scientists set to work on the roulette problem. Shannon called his home “Entropy House,” a nod to a core concept in information theory, borrowed from the second law of thermodynamics. The law of entropy essentially means everything in the universe will eventually turn into a homogeneous, undifferentiated goop. In information theory, Shannon used entropy as a way to discover order within the apparent chaos of strings of seemingly random numbers. Shannon’s three-story wooden house overlooked the Mystic Lakes, several miles northwest of Cambridge. One look indoors told Thorp why Shannon likened it to a theory about the inexorable slide of the universe into utter randomness.
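For readers who want the formula behind the metaphor, here is a hedged sketch of Shannon entropy applied to the frequencies of symbols in a string; this first-order measure is a simplification, and the example strings are invented:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Entropy, in bits per symbol, of the empirical symbol frequencies of s."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("0000000000000001"))  # ~0.34 bits: heavily biased, little surprise
print(shannon_entropy("0110100111010010"))  # 1.0 bits: an even mix of 0s and 1s
```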


pages: 236 words: 50,763

The Golden Ticket: P, NP, and the Search for the Impossible by Lance Fortnow

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Andrew Wiles, Claude Shannon: information theory, cloud computing, complexity theory, Donald Knuth, Erdős number, four colour theorem, Gerolamo Cardano, Isaac Newton, John von Neumann, linear programming, new economy, NP-complete, Occam's razor, P = NP, Paul Erdős, Richard Feynman, Richard Feynman, Rubik’s Cube, smart grid, Stephen Hawking, traveling salesman, Turing machine, Turing test, Watson beat the top human players on Jeopardy!, William of Occam

The satisfiability problem that started off chapter 4 asks about logical formulas combining the basic operations of AND, OR, and NOT. We can actually express computation as a series of AND, OR, and NOT operations. Some problems need only a small “circuit” of ANDs, ORs, and NOTs and others require huge circuits to compute. Sergey Yablonsky in the early 1950s investigated this notion of what we now call circuit complexity. Claude Shannon, an American who founded the field of information theory, showed that some logical functions require very high circuit complexity. Yablonsky looked at the difficulty of generating such functions. This sounds like a difficult task, but actually a strong version of P ≠ NP would mean that some simple-to-state search problems do not have small circuits that compute them. Yablonsky first noted that by Shannon’s results, randomly generated functions have near maximal circuit complexity.
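A rough sketch of the counting argument behind Shannon's result; the bound on the number of circuits is a deliberately crude assumption, used only to show the shape of the argument. There are 2^(2^n) Boolean functions of n inputs but far fewer small circuits, so below the printed gate count some function cannot be computed, and that threshold grows roughly like 2^n / n.

```python
import math

def log2_num_functions(n: int) -> float:
    """log2 of the number of Boolean functions on n inputs, i.e. 2**n."""
    return float(2 ** n)

def log2_num_circuits(n: int, s: int, ops: int = 16) -> float:
    """Crude upper bound (in log2) on circuits with s two-input gates: each gate
    picks one of `ops` operations and two inputs among the n variables and the
    earlier gates, giving at most (ops * (n + s)**2)**s circuits."""
    return s * math.log2(ops * (n + s) ** 2)

def gates_forced_by_counting(n: int) -> int:
    """Smallest s at which the circuit count could even cover all functions;
    for any smaller s, some function needs more than s gates."""
    s = 1
    while log2_num_circuits(n, s) < log2_num_functions(n):
        s += 1
    return s

for n in range(2, 13):
    print(n, gates_forced_by_counting(n))  # grows roughly like 2**n / n
```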

Diffie and Hellman realized that computers could make complex protocols cheaply available as software, but computers also brought new challenges. Computer-to-computer networks would become commonplace, and efficient and low-cost methods to secure communication over these networks would be needed. Talking about the development of the P versus NP problem earlier that decade, Diffie and Hellman stated, “At the same time, theoretical developments in information theory and computer science show promise of providing provably secure cryptosystems, changing this ancient art into a science.” Before Diffie and Hellman, to decrypt a secret message you needed the same key as the one used to encrypt the message. Both parties would have to meet in advance to agree on a secret key. If a general needed to send a message to a lieutenant in the field using Enigma machines, they both had to know the settings for that machine, particularly how the rotors were initially set at the beginning of the message.

Chapter 8 My account of the early history of cryptography draws heavily on David Kahn’s work, The Codebreakers: The Story of Secret Writing (New York: Macmillan, 1967). Zero-Knowledge Sudoku examples are taken from an August 3, 2006, post on my blog, Computational Complexity (http://blog.computationalcomplexity.org/2006/08/zero-knowledge-sudoku.html). Works Cited Whitfield Diffie and Martin Hellman, “New Directions in Cryptography,” IEEE Transactions on Information Theory 22, no. 6 (November 1976): 644–54. Craig Gentry, “Fully Homomorphic Encryption Using Ideal Lattices,” in Proceedings of the 41st Annual ACM Symposium on Theory of Computing (New York: ACM, 2009), 169–78. Ronald Rivest, Adi Shamir, and Leonard Adleman, “A Method for Obtaining Digital Signatures and Public-Key Cryptosystems,” Communications of the ACM 21, no. 2 (February 1978): 120–26.


pages: 230 words: 61,702

The Internet of Us: Knowing More and Understanding Less in the Age of Big Data by Michael P. Lynch

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Affordable Care Act / Obamacare, Amazon Mechanical Turk, big data - Walmart - Pop Tarts, bitcoin, Cass Sunstein, Claude Shannon: information theory, crowdsourcing, Edward Snowden, Firefox, Google Glasses, hive mind, income inequality, Internet of things, John von Neumann, meta analysis, meta-analysis, Nate Silver, new economy, patient HM, prediction markets, RFID, sharing economy, Steve Jobs, Steven Levy, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, WikiLeaks

But it is also true that in other respects we know less, that the walls of our digital life make real objective knowledge harder to come by, and that the Internet has promoted a more passive, more deferential way of knowing.13 Like our imaginary neuromedians, we are in danger of depending too much on one way of accessing the world and letting our other senses dull. Socrates on the Way to Larissa Data is not the same thing as information. As the founding father of information theory, Claude Shannon, put it back in the 1940s, data signals are noisy, and if you want to filter out what’s meaningful from those signals, you have to filter out at least some of that noise. For Shannon the noise was literal. His groundbreaking work concerned how to extract discernible information from the signals sent across telephone lines.14 But the moral is entirely general: bits of code aren’t themselves information; information is what we extract from those bits.

., 102 French Revolution, 58 Freud, Sigmund, 184 Fricker, Miranda, 146–48, 201 Galileo, 34, 68 Galton, Francis, 120 games, gaming, 20, 191 gatekeeping, 128, 134, 146 gender, 162 in marriage, 53–54, 72 in problem solving, 137 Georgetown University, 77–78 Gilbert, Margaret, 117–19, 200 Glass, Ira, 78 Glaucon, 54 Glauconian reasoning, 54–55, 56–58 global economy, 139, 142, 152 global warming, 56, 100, 124, 144, 185, 198 Goldberg, Sandy, 115 Goldman, Alvin, 194 Google, 5, 23, 30, 113, 128, 130, 135, 163, 174, 182, 203 business model of, 9 data collection and tracking by, 90, 155–56, 158, 161 as hypothetical “guy,” 24 monopolization by, 145–46 propaganda disseminated on, 66 in reinforcement of one’s own beliefs, 56 Google Complete, 155 Google Flu Trends, 158, 183 Google Glass, 149, 186 Google-knowing, xvi, 21–40, 25 defined, 23 limitations of, 174, 180 reliance on, 6–7, 23, 25–26, 30–31, 36, 113, 116, 153, 163, 179–80 Google Maps, 116 Google Street View, 23 Gordon, Lewis, 148 gorilla suit experiment, 30 government: autonomy limited by, 109 closed politics of, 144–45 data mining and analysis used by, 9, 90–91, 93, 104, 107 online manipulation used by, 81 purpose of, 38 transparency of, 137–38 Greece, classical philosophy of, 13, 47, 166–67, 171–72 Grimm, Stephen, 164 Guardian, 81 Gulf of Mexico, oil spill in, 118 H1N1 flu outbreak, tracking of, 158 Haidt, Jonathan, 51–54, 56, 57, 60, 196–97 Halpern, Sue, 106 Harvard Law Review, 89 Hazlett, Allan, 49 HBO GO, 145 Heidegger, Martin, 177 Hemingway, Mark, 46 Higher Order Thinking Skills (HOTS), 61 Hippocrates, 13 hive-mind, 4, 136 HM (patient), 168–69 Hobbes, Thomas, 38, 109 holiness, logical debate over, 166–67 homosexuality, changing attitudes toward, 53–54 Houla massacre, 83 Howe, Jeff, 136 Huffington Post, 43 human dignity: autonomy and, 58, 59–60 information technology as threat to, 187 interconnectedness and, 184–88 privacy and, 101–9 human rights, 54, 60 digital equality as, 142–48 protection of, 145 Hume, David, 48 hyperconnectivity, 184–88 identity: digital reshaping of, 73–74 manufactured online, 80–81 “scrubbing” of, 74 illegal searches, 93 illusion, distinguishing truth from, 67–74 incidental data collection, 95–96, 99 inclusivity, 135–37 income inequality, 142 inference, 29, 60, 172 information: accuracy and reliability of, 14, 27–30, 39–40, 44–45 collected pools of, 95–100, 107–9 distribution vs. creation of, 24 immediate, unlimited access to, 3–4, 23, 30, 42, 56, 113–16, 135–36, 141, 149, 153, 180 as interconnective, 184–88 vs. 
knowledge, 14 sorting and filtering of, 12, 26–29, 44–45, 127–28 information age, 111 information analysis, techniques of, 8–9 information cascades, 36, 66, 121 defined, 32 information coordination problem, 38–39, 56 information “glut,” 9–10, 44 information privacy, 94–100 and autonomy, 102–7 information sharing, coordination in, 4–5 information technology: costs of, 145 data trail in, 9 democratization through, 133–38, 148 devices and platforms of, xvii–xviii, 3, 7–8, 10, 41–43, 69, 70, 77–78, 90–91, 106–7, 144, 148–49, 156, 180, 185–87 disquieting questions about, 6 in education, 148–54 experience vs., 173–74 hypothetical loss of, 5 paradox of, 6, 12, 179 pool of data in, 95–100 surveillance and, 89–109 typified and dephysicalized objects in, 69 unequal distribution of, 144–45 see also Internet of Things information theory, 12 infosphere: defined, 10 feedback loop of social constructs in, 72–73 network of, 180 pollution of, 148 vastness of, 128 InnoCentive, 136–37, 141 institutions, cooperative, 60–61 intellectual labor, 139–40 International Telecommunications Union, 135 Internet: author’s experiment in circumventing, 21–24, 25, 35 in challenges to reasonableness, 41–63 changes wrought by, xv–xviii, 6–7, 10–11, 23, 180, 184–88 as a construction, 69 cost and profit debate over, 145 as epistemic resource, 143–45 expectations of, 80–83 as force for cohesion and democracy, 55–63 freedom both limited and enhanced by, 92–93 international rates of access to, 135, 144–45 monopolization and hegemony in, 145–46 as network, 111–13 “third wave” of, 7 see also World Wide Web; specific applications Internet of Everything, 184 Internet of Things: blurring of online and offline in, 71 defined, 7–8 integration of, 10 shared economy in, 140–41 threat from, 107, 153, 184–88 Internet of Us, digital form of life as, 10, 39, 73, 83–86, 106, 179–88 interracial marriage, 54 interrogation techniques, 105 In the Plex (Levy), 5–6 Intrade, 122–23, 136 intuition, 15, 51–53 iPhone, production of, 77–78, 80, 139, 144 IQ, 52 Iraq, 83 Iraq War, 137 ISIS, 128 isolation, polarization and, 42–43 I think, I exist, 127 James, William, 11 Jefferson, Thomas, 143 Jeppesen, Lars Bo, 137 joint commitments, defined, 117–18 journalism, truth and, 84 judgment, 51–55, 57 collective vs. individual, 117, 120–25 justice, 54 “just so” stories, 27–28 Kahneman, Daniel, 29, 51 Kant, Immanuel, 34, 58–60, 62, 85 Kitcher, Philip, 182 knowing-which, as term, 171 knowledge: in big data revolution, 87–190 changing structure of, 125–32 common, 117–19 defined and explained, xvii, 12–17 democratization of, 133–38 digital, see digital knowledge; Google-knowing distribution of, 134–35, 138, 141 diverse forms of, 130 economy of, 138–45 hyperconnectivity of, 184–88 individual vs. aggregate, 120–24 information vs., 14 Internet revolution in, xv–xviii minimal definition of, 14–15 as networked, 111–32 new aspects in old problems of, 1–86, 90 personal observation in, 33–35 political economy of, 133–54 as power, 9, 98–99, 133, 185–86 practical vs. theoretical, 169, 172 procedural, 167–74 recording and storage of, 127–28 reliability of sources of, 14, 27–31, 39–40, 44–45, 114–16 as a resource, 38–39 shared cognitive process in attainment of, 114–25 three forms of, 15–17 three simple points about, 14–17 truth and, 19, 126 understanding vs. 
other forms of, 6, 16–17, 90, 154, 155–73, 181 value and importance of, 12–13 knowledge-based education, 61 Kodak camera, 89 Koran, 48, 61 Kornblith, Hilary, 194 Krakauer, John, 169 Kuhn, Thomas, 159–60 Lakhani, Karim, 137 Larissa, Greece, 13, 15, 182 Leonhardt, David, 122–23 Levy, Steven, 5–6 liberals, 43 libraries, 22, 134, 153–54 of Alexandria, 8 digital form of life compared to, xvi, 17, 20, 44–45, 56, 63, 128 as epistemic resource, 145 Google treated as, 24 “Library of Babel” (Borges), 17 “Lies, Damned Lies, and ‘Fact-Checking’: The Liberal Media’s Latest Attempt to Control the Discourse” (Hemingway), 46 Lifespan of a Fact, The (D’Agata), 79 literacy, 35, 134 literal artifacts: defined, 69 social artifacts and, 71, 72 lobectomy, 168 Locke, John, 33–36, 39, 60, 67–70, 85, 127, 143 “Locke’s command,” 33–34 London Underground, mapping of, 112–13 machines, control by, 116 “mainstream” media, 32 censorship of, 66 majority rule, 120 manipulation: data mining and, 97, 104–6 of expectations, 80–82 persuasion and, 55, 57–58, 81–83, 86 manuals, 22 manufacturing, 138–39 maps, 21–22 marine chronometer, 137 marketing: bots in, 82 Glauconian, 58 targeted, 9, 90, 91, 105 marriage: changing attitudes toward, 53–54 civil vs. religious, 58–59 as social construct, 72 martial arts, 170 mass, as primary quality, 68 Massive Open Online Courses (MOOCs), 150–53 mathematics, in data analysis, 160, 161 Matrix, The, 18–19, 75 Mayer-Schönberger, Viktor, 8, 158–59 measles vaccine, 7, 124 Mechanical Turk, 136, 141 media, 134 diversity in, 42 opinion affected by, 53 sensationalist, 77 memory: accessing of, 114, 115 in educational models, 152 loss of, 168–69 superceded by information technology, xv–xvi, 3, 4, 6, 94, 149 trust in, 28, 33 Meno, 13 merchandising, online vs. brick and mortar, 70 Mercier, Hugo, 54 metrics, 112 Milner, Brenda, 168–69 mirror drawing experiment, 169 misinformation, 6–7, 31–32 in support of moral truth, 78–80, 82 mob mentality, 32–33 MOOCs (Massive Open Online Courses), 150–53 moral dumbfounding, 52 morality, moral values, xvii, 6, 44, 53–54, 195 “Moses Illusion,” 29–30 motor acuity, mastery of, 170–71, 173 motor skills, 167–74 Murray, Charles J., 147 music, as dephysicalized object, 69–70 Nagel, Thomas, 84 naming, identification by, 94 narrative license, truth and falsehood in, 78–79 National Endowment for the Humanities, 61 National Science Foundation, 61 Nature, 158, 161 Netflix, 69, 145 Net neutrality, defined, 145 netography, 112–13 of knowledge, 125–32 networked age, 111 networks, 111–32 collective knowledge of, 116–25, 180 knowledge reshaped and altered by, 125–32, 133, 140 in problem solving, 136 use of term, 111–12 neural system, 26 neural transplants, 3, 5 Neurath, Otto, 128–29 neuromedia, 3–5, 12, 17–19, 113–14, 132, 149, 168, 180–82, 184 limitations of, 174 as threat to education, 153–54 Newton, Isaac, 175 New Yorker, 25, 26 New York Times, 122, 174 Nietzsche, Friedrich, 111 Nobel laureates, 149 noble lie, 83, 86 nonfiction, 79–80 NPR, 78, 80 NSA: alleged privacy abuses by, 98–100, 138 data mining by, 9, 91, 95–96, 108, 167 proposed limitations on, 109 Ntrepid, 81 nuclear weapons technology, xvii nullius in verba (take nobody’s word for it), 34 Obama, Barack, 7, 100 administration, 109 objectivity, objective truth, 45, 74 as anchor for belief, 131 in constructed world, 83–86 as foundation for knowledge, 127 observation, 49, 60 affected by expectations, 159–60 behavior affected by, 91, 97 “oceanic feeling,” 184 “offlife,” 70 OkCupid, 157 “onlife,” 70 online identity creation, 
73–74 online ranking, 119–21, 136 open access research sharing sites, 135–36 open society: closed politics vs., 144–45 values of, 41–43, 62 open source software, 135 Operation Earnest Voice, 81 Operation Ivy, ix opinion: knowledge vs., 13, 14, 126 in online ranking, 119–20 persuasion and, 50–51 truth as constructed by, 85–86 optical illusions, 67 Oracle of Delphi, 16–17, 171 Outcome-Based Education (OBE), 61–62 ownership, changing concept of, 73 ox, experiment on weight of, 120 Oxford, 168 Page, Larry, 5–6 Panopticon, 91, 92, 97 perception: acuity of, 173 distinguishing truth in, 67–74 expectations and, 159–60 misleading, 29–30, 67 as relative, 67–68 perceptual incongruity, 159–60 personal freedom, 101 persuasion, 50–51, 54–55, 56–58 by bots, 82 phone books, 22 phone data collection, 95, 108 photography: privacy and, 89, 93 sexually-explicit, 99 photo-sharing, manipulation in, 82–83 Plato, 13–14, 16–17, 54, 59, 83, 126, 165–67 polarization, 7 herd mentality in, 66 isolated tribes in, 43–46 politics, 162, 196 accessibility in, 23 activism in, 66, 67 bias in, 43–46 closed, 144–45 elections in, 120–23 of knowledge, 133–54 opposition to critical thinking in, 61–62 persuasion in, 57–58, 82–83 power in, 86, 133 prediction market in, 122–23 Politifact, 46 Popper, Karl, 41–43 Postman, L.


pages: 413 words: 119,587

Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots by John Markoff

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

A Declaration of the Independence of Cyberspace, AI winter, airport security, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, basic income, Baxter: Rethink Robotics, Bill Duvall, bioinformatics, Brewster Kahle, Burning Man, call centre, cellular automata, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, collective bargaining, computer age, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deskilling, don't be evil, Douglas Engelbart, Douglas Engelbart, Douglas Hofstadter, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, factory automation, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, Google Glasses, Google X / Alphabet X, Grace Hopper, Gunnar Myrdal, Gödel, Escher, Bach, Hacker Ethic, haute couture, hive mind, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Marc Andreessen, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, Mother of all demos, natural language processing, new economy, Norbert Wiener, PageRank, pattern recognition, pre–internet, RAND corporation, Ray Kurzweil, Richard Stallman, Robert Gordon, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Nelson, telemarketer, telepresence, telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Turing test, Vannevar Bush, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, William Shockley: the traitorous eight, zero-sum game

He found that another graduate student was a double agent in their games, plotting with McCarthy against Nash while at the same time plotting with Nash against McCarthy. Game theory was in fashion at the time and Nash later received his Nobel Prize in economics for contributions to that field. During the summer of 1952 both McCarthy and Minsky were hired as research assistants by mathematician and electrical engineer Claude Shannon at Bell Labs. Shannon, known as the father of “information theory,” had created a simple chess-playing machine in 1950, and there was early interest in biological-growth simulating programs known as “automata,” of which John Conway’s 1970 Game of Life would become the most famous. Minsky was largely distracted by his impending wedding, but McCarthy made the most of his time at Bell Labs, working with Shannon on a collection of mathematical papers that was named at Shannon’s insistence Automata Studies.11 Using the word “automata” was a source of frustration for McCarthy because it shifted the focus of the submitted papers away from the more concrete artificial intelligence ideas and toward more esoteric mathematics.

In contrast, Turkle describes a world in which computer networks increasingly drive a wedge between humans, leaving them lonely and isolated. For Weizenbaum, computing systems risked fundamentally diminishing the human experience. In very much the same vein that Marxist philosopher Herbert Marcuse attacked advanced industrial society, he was concerned that the approaching Information Age might bring about a “One-Dimensional Man.” In the wake of the creation of Eliza, a group of MIT scientists, including information theory pioneer Claude Shannon, met in Concord, Massachusetts, to discuss the social implications of the phenomenon.8 The seductive quality of the interactions with Eliza concerned Weizenbaum, who believed that an obsessive reliance on technology was indicative of a moral failing in society, an observation rooted in his experiences as a child growing up in Nazi Germany. In 1976, he sketched out a humanist critique of computer technology in his book Computer Power and Human Reason: From Judgment to Calculation.


pages: 429 words: 114,726

The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise by Nathan L. Ensmenger

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

barriers to entry, business process, Claude Shannon: information theory, computer age, deskilling, Donald Knuth, Firefox, Frederick Winslow Taylor, future of work, Grace Hopper, informal economy, information retrieval, interchangeable parts, Isaac Newton, Jacquard loom, Jacquard loom, job satisfaction, John von Neumann, knowledge worker, loose coupling, new economy, Norbert Wiener, pattern recognition, performance metric, Philip Mirowski, post-industrial society, Productivity paradox, RAND corporation, Robert Gordon, Shoshana Zuboff, sorting algorithm, Steve Jobs, Steven Levy, the market place, Thomas Kuhn: the structure of scientific revolutions, Thorstein Veblen, Turing machine, Von Neumann architecture, Y2K

In 1965, the ACM Curriculum Committee attempted to bring unity to computer science by defining it in terms of a single fundamental unit of analysis: computer science “is concerned with information in much the same sense that physics is concerned with energy; it is devoted to the representation, storage, manipulation, and presentation of information.”60 This redefinition of computer science around the study of information offered several immediate benefits. Not only did it lay claim to the valuable intellectual territory suggested by the commonsense understanding of information as knowledge or data but it also linked the discipline to the specific formulation of information developed in the late 1940s by the mathematician Claude Shannon. In his seminal book with Warren Weaver from 1949, A Mathematical Theory of Communication, Shannon had defined information in terms of the physical concept of negative entropy.61 His information theory appealed to scientists in a wide variety of disciplines, and for a time it appeared as if information might serve as a broadly unifying concept in the sciences.62 But despite its intellectual appeal, Shannon’s mathematical definition of information was never widely applicable outside of communications engineering.

Atchison and Hamblen, “Status of Computer Sciences Curricula in Colleges and Universities.” 59. Michael Mahoney, “Software as Science–Science as Software,” in Mapping the History of Computing: Software Issues, ed. Ulf Hashagen, Reinhard Keil-Slawik, and Arthur Norberg (Berlin: Springer-Verlag, 2002), 25–48. 60. ACM Curriculum Committee, “An Undergraduate Program in Computer Science.” 61. Claude Shannon and Warren Weaver, A Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949). 62. Lily Kay, “Who Wrote the Book of Life? Information and the Transformation of Molecular Biology,” Science in Context 8 (1995): 609–634; Ronald Kline, “Cybernetics, Management Science, and Technology Policy: The Emergence of ‘Information Technology’ as a Keyword, 1948–1985,” Technology and Culture 47, no. 3 (2006): 513–535. 63.


pages: 397 words: 110,130

Smarter Than You Think: How Technology Is Changing Our Minds for the Better by Clive Thompson

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, 4chan, A Declaration of the Independence of Cyberspace, augmented reality, barriers to entry, Benjamin Mako Hill, butterfly effect, citizen journalism, Claude Shannon: information theory, conceptual framework, corporate governance, crowdsourcing, Deng Xiaoping, discovery of penicillin, Douglas Engelbart, Douglas Engelbart, drone strike, Edward Glaeser, Edward Thorp, en.wikipedia.org, experimental subject, Filter Bubble, Freestyle chess, Galaxy Zoo, Google Earth, Google Glasses, Gunnar Myrdal, Henri Poincaré, hindsight bias, hive mind, Howard Rheingold, information retrieval, iterative process, jimmy wales, Kevin Kelly, Khan Academy, knowledge worker, lifelogging, Mark Zuckerberg, Marshall McLuhan, Menlo Park, Netflix Prize, Nicholas Carr, patent troll, pattern recognition, pre–internet, Richard Feynman, Richard Feynman, Ronald Coase, Ronald Reagan, Rubik’s Cube, sentiment analysis, Silicon Valley, Skype, Snapchat, Socratic dialogue, spaced repetition, telepresence, telepresence robot, The Nature of the Firm, the scientific method, The Wisdom of Crowds, theory of mind, transaction costs, Vannevar Bush, Watson beat the top human players on Jeopardy!, WikiLeaks, X Prize, éminence grise

He was frustrated by a paradox in note taking: Whenever he wrote a note, he stopped paying attention to the professor and risked missing out on new information. But if he didn’t take notes? He’d have nothing to study from. He wanted to split the difference—to have a computer that was so integrated into his body and his field of vision that he could keep looking at the lecturer while he typed. Starner knew about similar prototypes. The first wearable was cocreated in 1960 by Claude Shannon, the founder of information theory, for a purpose both whimsical and mathematically ambitious: He wanted to beat the Las Vegas roulette wheels. Using sensors built into his shoes, Shannon would “type” information about how the wheel had been spun, and a circuit in his pocket would send beeps to an earpiece telling him how to bet. (The question of whether wearable computing constitutes amplified smarts or blatant cheating hinges, I suppose, on whether you identify with the casino owner or the players.)

Perlow describes similarly positive effects in Sleeping with Your Smartphone (Boston: Harvard Business Press, 2012), in documenting how a group of consultants with the Boston Consulting Group agreed to stay off their devices for a set number of hours per day, which they called predictable time off, and which in saner times was referred to as evenings and weekends. “an overarching ability to watch and understand your own mind”: Maggie Jackson and Bill McKibben, Distracted: The Erosion of Attention and the Coming Dark Age (Amherst, NY: Prometheus Books, 2008), Kindle edition. The first wearable was cocreated in 1960 by Claude Shannon: Edward O. Thorp, “The Invention of the First Wearable Computer,” Proceedings of the 2nd IEEE International Symposium on Wearable Computers (1998): 4–8, accessed March 23, 2013, graphics.cs.columbia.edu/courses/mobwear/resources/thorp-iswc98.pdf. Critics have already noted how unsettling it might feel: Mark Hurst, “The Google Glass Feature No One Is Talking About,” Creative Good (blog), February 28, 2013, accessed March 24, 2013, creativegood.com/blog/the-google-glass-feature-no-one-is-talking-about/; Adrian Chen, “If You Wear Google’s New Glasses You Are an Asshole,” Gawker, March 3, 2013, accessed March 24, 2013, http://gawker.com/5990395.


pages: 218 words: 63,471

How We Got Here: A Slightly Irreverent History of Technology and Markets by Andy Kessler

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, Andy Kessler, automated trading system, bank run, Big bang: deregulation of the City of London, Bob Noyce, Bretton Woods, British Empire, buttonwood tree, Claude Shannon: information theory, Corn Laws, Douglas Engelbart, Edward Lloyd's coffeehouse, fiat currency, fixed income, floating exchange rates, Fractional reserve banking, full employment, Grace Hopper, invention of the steam engine, invention of the telephone, invisible hand, Isaac Newton, Jacquard loom, Jacquard loom, James Hargreaves, James Watt: steam engine, John von Neumann, joint-stock company, joint-stock limited liability company, Joseph-Marie Jacquard, Leonard Kleinrock, Marc Andreessen, Maui Hawaii, Menlo Park, Metcalfe's law, Metcalfe’s law, packet switching, price mechanism, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, railway mania, RAND corporation, Robert Metcalfe, Silicon Valley, Small Order Execution System, South Sea Bubble, spice trade, spinning jenny, Steve Jobs, supply-chain management, supply-chain management software, trade route, transatlantic slave trade, transatlantic slave trade, tulip mania, Turing machine, Turing test, William Shockley: the traitorous eight

In 1941, a University of Michigan undergrad and MIT grad, Claude Shannon, joined Bell Labs. As per custom, no one told him what to do. So he started trying to apply mathematics to communications. He wasn’t so much interested in the signals themselves as in the probability that what one shoved into one end of a channel would come out the other end intact, through all the noise that impairs the signal. And what if you put an encoder and a decoder on either end? How much information could you transmit? Or how clear could the voice signal be? Through a series of papers starting in 1948 came Shannon’s Law, which calculated the maximum throughput of error-free information through a channel with a certain amount of noise in it. Put another way, Shannon laid the foundation of modern information theory, turned encryption into a science (the fallibility of Enigma-like machines is behind us), and set limits on what wireless networks could be used for.
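The usual statement of that limit is the Shannon-Hartley capacity, C = B log2(1 + S/N), bits per second through a channel of bandwidth B and signal-to-noise ratio S/N. A back-of-the-envelope sketch for a voice-grade phone line follows; the 3 kHz bandwidth and 30 dB signal-to-noise figures are illustrative assumptions, not numbers from the text:

```python
import math

def channel_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit on error-free throughput through a noisy channel."""
    snr_linear = 10 ** (snr_db / 10)              # convert decibels to a power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# A voice-grade phone line: roughly 3 kHz of bandwidth and a 30 dB signal-to-noise ratio.
print(round(channel_capacity_bps(3000, 30)))      # ~29,900 bps, close to late dial-up modem speeds
```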


pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Asian financial crisis, bank run, beat the dealer, Benoit Mandelbrot, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, Brownian motion, butterfly effect, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, collateralized debt obligation, collective bargaining, dark matter, Edward Lorenz: Chaos theory, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial innovation, fixed income, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, John Nash: game theory, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, martingale, Myron Scholes, new economy, Paul Lévy, Paul Samuelson, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk-adjusted returns, Robert Gordon, Robert Shiller, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, The Chicago School, The Myth of the Rational Market, tulip mania, V2 rocket, Vilfredo Pareto, volatility smile

Moore instructorship in mathematics at MIT — a position held a decade earlier by John Nash, the pioneering mathematician profiled by Sylvia Nasar in her book A Beautiful Mind. Thorp and his wife, Vivian, left Southern California and moved to Cambridge, Massachusetts. They spent only two years on the East Coast before moving back west, to New Mexico. But it was enough to set their lives on a different track: it was at MIT that Thorp met Claude Shannon. Shannon may be the only person in the twentieth century who can claim to have founded an entirely new science. The field he invented, information theory, is essentially the mathematics behind the digital revolution. It undergirds computer science, modern telecommunications, cryptography, and code-breaking. The basic object of study is data: bits (a term suggested to Shannon by John Tukey) of information. The study of things such as how light waves move through air or how human languages work is very old; Shannon’s groundbreaking idea was that you could study the information itself — the stuff that’s carried by the light waves from objects in the world to your retinas, or the stuff that passes from one person to another when they speak — independently of the waves and the words.

Ed Thorp is a prime example. He accomplished what Bachelier and Osborne never could: he showed that physics and mathematics could be used to profit from financial markets. Building on the work of Bachelier and Osborne, and on his own experience with gambling systems, Thorp invented the modern hedge fund — by applying ideas from a new field that combined mathematical physics and electrical engineering. Information theory, as it’s known, was as much a part of the 1960s as the Vegas Strip. And in Thorp’s hands, it proved to be the missing link between the statistics of market prices and a winning strategy on Wall Street. Thorp was born at the peak of the Depression, on August 14, 1932. His father was a retired army officer, a veteran of the First World War. When Thorp was born, his father was fortunate enough to have found work as a bank guard, but money was still tight and the young Thorp developed an early instinct for thrift and financial savvy.

It is hard to overstate how important this idea would become. Information theory grew out of a project Shannon worked on during World War II, as a staff scientist at Bell Labs, AT&T’s research division in Murray Hill, New Jersey. The goal of the project was to build an encrypted telephone system so that generals at the front could safely communicate with central command. Unfortunately, this was hard to do. There is only one code system that can be mathematically proven to be unbreakable.


pages: 791 words: 85,159

Social Life of Information by John Seely Brown, Paul Duguid

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

AltaVista, business process, Claude Shannon: information theory, computer age, cross-subsidies, disintermediation, double entry bookkeeping, Frank Gehry, frictionless, frictionless market, future of work, George Gilder, George Santayana, global village, Howard Rheingold, informal economy, information retrieval, invisible hand, Isaac Newton, John Markoff, Just-in-time delivery, Kenneth Arrow, Kevin Kelly, knowledge economy, knowledge worker, loose coupling, Marshall McLuhan, medical malpractice, moral hazard, Network effects, new economy, Productivity paradox, Robert Metcalfe, rolodex, Ronald Coase, shareholder value, Shoshana Zuboff, Silicon Valley, Steve Jobs, Superbowl ad, Ted Nelson, telepresence, the medium is the message, The Nature of the Firm, The Wealth of Nations by Adam Smith, Thomas Malthus, transaction costs, Turing test, Vannevar Bush, Y2K

Sometime around 2012, it has been predicted, Moore's Law will come up against the physical limitations of current microchip components, though by then solid-state components may well have been replaced. 6. Kelly, 1997. 7. Negroponte, 1995. John Tukey coined the term bit. It stands for "binary digit." 8. It's worth remembering that formal information theory, while it holds the bit as a central concern, is indifferent to meaning. Claude Shannon, who with Warren Weaver laid the foundations of modern information theory, is quite clear about this: "the semantic aspects of communication are irrelevant to the engineering aspects" (Shannon and Weaver, 1964, p. 8). 9. The pervasive image of the new open frontier only works if we forget the presence on the old frontier of the U.S. Army, the Church of Latter-Day Saints, and myriad other organizations and institutions large and small (let alone its native inhabitants) that shaped the frontier. 10.

Thus while one person often has conflicting information, he or she will not usually have conflicting knowledge. And while it seems quite reasonable to say, "I've got the information, but I don't understand it," it seems less reasonable to say, "I know, but I don't understand," or "I have the knowledge, but I can't see what it means.'' (Indeed, while conventional uses of information don't necessarily coincide with the specialist uses, as we noted earlier, "information theory" holds information to be independent of meaning.)5 Where Is the Knower Lost in the Information? Knowledge's personal attributes suggest that the shift toward knowledge may (or should) represent a shift toward people. Focusing on process, as we argued, draws attention away from people, Page 121 concentrating instead on disembodied processes and the information that drives them.

In turn, the identity under development shapes what that person comes to know, how he or she assimilates knowledge and information. So, even when people are learning about, in Bruner's terms, the identity they are developing determines what they pay attention to and what they learn. What people learn about, then, is always refracted through who they are and what they are learning to be. 27 So information, while a critical part of learning, is only one among many forces at work. Information theory portrays information as a change registered in an otherwise steady state. It's a light flashing out on a dark hillside (to borrow an example from the philosopher Fred Dretske28) or the splash of a pebble breaking the calm of a still lake. In either case, the result, as the anthropologist Gregory Bateson puts it neatly, is "a difference that makes a difference."29 The importance of disturbance or change makes it almost inevitable that we focus on these.


pages: 378 words: 110,518

Postcapitalism: A Guide to Our Future by Paul Mason

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Alfred Russel Wallace, bank run, banking crisis, banks create money, Basel III, basic income, Bernie Madoff, Bill Gates: Altair 8800, bitcoin, Branko Milanovic, Bretton Woods, BRICs, British Empire, business process, butterfly effect, call centre, capital controls, Cesare Marchetti: Marchetti’s constant, Claude Shannon: information theory, collaborative economy, collective bargaining, Corn Laws, corporate social responsibility, creative destruction, credit crunch, currency manipulation / currency intervention, currency peg, David Graeber, deglobalization, deindustrialization, deskilling, discovery of the americas, Downton Abbey, drone strike, en.wikipedia.org, energy security, eurozone crisis, factory automation, financial repression, Firefox, Fractional reserve banking, Frederick Winslow Taylor, full employment, future of work, game design, income inequality, inflation targeting, informal economy, information asymmetry, intangible asset, Intergovernmental Panel on Climate Change (IPCC), Internet of things, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, Joseph Schumpeter, Kenneth Arrow, Kevin Kelly, knowledge economy, knowledge worker, late capitalism, low skilled workers, market clearing, means of production, Metcalfe's law, money: store of value / unit of account / medium of exchange, mortgage debt, Network effects, new economy, Norbert Wiener, Occupy movement, oil shale / tar sands, oil shock, Paul Samuelson, payday loans, Pearl River Delta, post-industrial society, precariat, price mechanism, profit motive, quantitative easing, race to the bottom, RAND corporation, rent-seeking, reserve currency, RFID, Richard Stallman, Robert Gordon, Robert Metcalfe, secular stagnation, sharing economy, Stewart Brand, structural adjustment programs, supply-chain management, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, Transnistria, union organizing, universal basic income, urban decay, urban planning, Vilfredo Pareto, wages for housework, women in the workforce

Competition and patent ownership issues were put on hold.20 These were remarkable things to achieve within capitalism: to treat research as public property, to suppress competition and to plan not just production but the direction of research. And though the USA perfected it, all major combatant states attempted it. The result was to stimulate an unprecedented culture of cross-fertilization in strategic disciplines. The new approach inserted maths and science into the heart of the industrial process; economics and data management into political decision-making. It was the OSRD that took Claude Shannon, the founder of information theory, out of Princeton and put him into Bell Labs to design algorithms for anti-aircraft guns.21 There, he would meet Alan Turing and discuss the possibility of ‘thinking machines’. Turing, too, had been scooped out of academia by the British government to run the Enigma codebreaking operation at Bletchley Park. This culture of innovation survived the transition to peacetime, even as individual corporations tried to monopolize the results and scrapped over patent rights.

But one key technology was invisible: information. Though the ‘information economy’ lay decades in the future, the post-war economies saw information used on an industrial scale. It flowed as science, as management theory, as data, as mass communications and even – in a few hallowed places – out of a computer and into a tray of folding paper. A transistor is simply a switch with no moving parts. Information theory plus transistors gives you the ability to automate physical processes. So factories throughout the West were re-tooled with semi-automated machinery: pneumatic presses, drills, cutters, lathes, sewing machines and production lines. What they lacked was sophisticated feedback mechanisms: electronic sensors and automated logic systems were so crude that the latter used compressed air to do what we now do with iPhone apps.

Deloitte, who did these calculations, describes the falling price of basic info-tech as exponential: ‘The current pace of technological advance is unprecedented in history and shows no signs of stabilizing as other historical technological innovations, such as electricity, eventually did.’26 It has become commonplace to think of information as ‘immaterial’. Norbert Wiener, one of the founders of information theory, once claimed: ‘Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.’27 But this is a fallacy. In 1961, IBM physicist Rolf Landauer proved, logically, that information is physical.28 He wrote: ‘Information is not a disembodied abstract entity; it is always tied to a physical representation. This ties the handling of information to all the possibilities and restrictions of our real physical world, its laws of physics and its storehouse of available parts.’29 Specifically, he showed that information processing consumes energy and that it should be possible to measure the amount of energy used in deleting one ‘bit’ of information.
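That minimum is now usually called the Landauer limit, k_B T ln 2 joules per erased bit. A quick sketch of the number at an assumed room temperature of 300 K:

```python
import math

BOLTZMANN_J_PER_K = 1.380649e-23  # Boltzmann constant, joules per kelvin

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Minimum energy dissipated when one bit of information is erased."""
    return BOLTZMANN_J_PER_K * temperature_kelvin * math.log(2)

print(landauer_limit_joules(300.0))  # ~2.9e-21 joules per erased bit at room temperature
```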


pages: 332 words: 109,213

The Scientist as Rebel by Freeman Dyson

Albert Einstein, Asilomar, British Empire, Claude Shannon: information theory, dark matter, double helix, Edmond Halley, Ernest Rutherford, experimental subject, Fellow of the Royal Society, From Mathematics to the Technologies of Life and Death, Henri Poincaré, Isaac Newton, John von Neumann, kremlinology, Mikhail Gorbachev, Norbert Wiener, Paul Erdős, Richard Feynman, Richard Feynman, Ronald Reagan, Silicon Valley, Stephen Hawking, Thomas Kuhn: the structure of scientific revolutions, traveling salesman

In spite of his original intentions, cybernetics became a theory of analog processes. Meanwhile, also in 1948, Claude Shannon published his classic pair of papers with the title “A Mathematical Theory of Communication,” in The Bell System Technical Journal. Shannon’s theory was a theory of digital communication, using many of Wiener’s ideas but applying them in a new direction. Shannon’s theory was mathematically elegant, clear, and easy to apply to practical problems of communication. It was far more user-friendly than cybernetics. It became the basis of a new discipline called “information theory.” During the next ten years, digital computers began to operate all over the world, and analog computers rapidly became obsolete. Electronic engineers learned information theory, the gospel according to Shannon, as part of their basic training, and cybernetics was forgotten.


pages: 297 words: 89,820

The Perfect Thing: How the iPod Shuffles Commerce, Culture, and Coolness by Steven Levy

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Apple II, British Empire, Claude Shannon: information theory, en.wikipedia.org, indoor plumbing, Internet Archive, Jeff Bezos, John Markoff, Jony Ive, Kevin Kelly, Sand Hill Road, Saturday Night Live, Silicon Valley, social web, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, technology bubble, Thomas L Friedman

Robbin is talking randomness in terms that software can reasonably produce, which is not perfect randomness. True randomness, it turns out, is very difficult to produce. This is well known to cryptographers. A well-funded, sophisticated cryptanalyst will seize on any variation from a random distribution as a means of attacking a code. This subject was most famously examined by Claude Shannon, arguably the Father of Randomness. Shannon himself expressed some random behavior: the MIT math professor was known for his eccentric habits, which included riding a unicycle. But his papers on information theory are rock solid. Basically, he defined randomness as a question of unpredictability. If a series of numbers is truly random, you have no possible way of guessing what comes next. If something isn't random (as in the case of what letter might follow another in a message written in English), you have a better chance of figuring out what comes next.
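A hedged sketch of that unpredictability idea: the more lopsided the distribution over what comes next, the fewer bits of surprise each symbol carries and the easier it is to guess. The letter probabilities below are rough illustrative figures, not data from the text:

```python
import math

def entropy_bits(probs: dict) -> float:
    """Shannon entropy, in bits, of a next-symbol probability distribution."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

# A truly random choice among 26 letters: nothing helps you guess the next one.
uniform = {chr(ord("a") + i): 1 / 26 for i in range(26)}
print(entropy_bits(uniform))   # ~4.70 bits per letter

# After a 'q' in English text, 'u' is overwhelmingly likely: far less surprise.
after_q = {"u": 0.97, "a": 0.01, "i": 0.01, "w": 0.01}
print(entropy_bits(after_q))   # ~0.24 bits per letter
```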


pages: 372 words: 101,174

How to Create a Mind: The Secret of Human Thought Revealed by Ray Kurzweil

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, Albert Michelson, anesthesia awareness, anthropic principle, brain emulation, cellular automata, Claude Shannon: information theory, cloud computing, computer age, Dean Kamen, discovery of DNA, double helix, en.wikipedia.org, epigenetics, George Gilder, Google Earth, Isaac Newton, iterative process, Jacquard loom, Jacquard loom, John von Neumann, Law of Accelerating Returns, linear programming, Loebner Prize, mandelbrot fractal, Norbert Wiener, optical character recognition, pattern recognition, Peter Thiel, Ralph Waldo Emerson, random walk, Ray Kurzweil, reversible computing, selective serotonin reuptake inhibitor (SSRI), self-driving car, speech recognition, Steven Pinker, strong AI, the scientific method, theory of mind, Turing complete, Turing machine, Turing test, Wall-E, Watson beat the top human players on Jeopardy!, X Prize

Even within the arithmetic unit, there is communication from one bit register to another. Communication is pervasive at every level. If we consider that error rates escalate rapidly with increased communication and that a single-bit error can destroy the integrity of a process, digital computation was doomed—or so it seemed at the time. Remarkably, that was the common view until American mathematician Claude Shannon (1916–2001) came along and demonstrated how we can create arbitrarily accurate communication using even the most unreliable communication channels. What Shannon stated in his landmark paper “A Mathematical Theory of Communication,” published in the Bell System Technical Journal in July and October 1948, and in particular in his noisy-channel coding theorem, was that if you have a channel available with any error rate (except for exactly 50 percent per bit, which would mean that the channel was just transmitting pure noise), you are able to transmit a message in which the error rate is as low as you desire.

As a simple example, if I transmit each bit three times and take the majority vote, I will have substantially increased the reliability of the result. If that is not good enough, simply increase the redundancy until you get the reliability you need. Simply repeating information is the easiest way to achieve arbitrarily high accuracy rates from low-accuracy channels, but it is not the most efficient approach. Shannon’s paper, which established the field of information theory, presented optimal methods of error detection and correction codes that can achieve any target accuracy through any nonrandom channel. Older readers will recall telephone modems, which transmitted information through noisy analog phone lines. These lines featured audibly obvious hisses and pops and many other forms of distortion, but nonetheless were able to transmit digital data with very high accuracy rates, thanks to Shannon’s noisy channel theorem.
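As a rough sketch of the triple-repetition scheme just described (an illustration under simple assumptions, not Kurzweil's code), the snippet below sends each bit three times across a channel that flips bits with 10 percent probability and decodes by majority vote.

```python
import random

def noisy_channel(bits, flip_probability):
    """Flip each transmitted bit independently with the given probability."""
    return [b ^ (random.random() < flip_probability) for b in bits]

def send_with_repetition(bits, flip_probability, copies=3):
    """Transmit each bit `copies` times, then decode by majority vote."""
    sent = [b for b in bits for _ in range(copies)]
    received = noisy_channel(sent, flip_probability)
    decoded = []
    for i in range(0, len(received), copies):
        votes = received[i:i + copies]
        decoded.append(1 if sum(votes) > copies // 2 else 0)
    return decoded

message = [random.randint(0, 1) for _ in range(10_000)]
decoded = send_with_repetition(message, flip_probability=0.1)
errors = sum(a != b for a, b in zip(message, decoded))
# With a 10% raw error rate, triple repetition leaves roughly 2.8% errors;
# more copies push the residual error rate as low as desired, at a cost in throughput.
print(errors / len(message))
```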


pages: 313 words: 101,403

My Life as a Quant: Reflections on Physics and Finance by Emanuel Derman

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Berlin Wall, bioinformatics, Black-Scholes formula, Brownian motion, capital asset pricing model, Claude Shannon: information theory, Donald Knuth, Emanuel Derman, fixed income, Gödel, Escher, Bach, haute couture, hiring and firing, implied volatility, interest rate derivative, Jeff Bezos, John Meriwether, John von Neumann, law of one price, linked data, Long Term Capital Management, moral hazard, Murray Gell-Mann, Myron Scholes, Paul Samuelson, pre–internet, publish or perish, quantitative trading / quantitative finance, Richard Feynman, Sharpe ratio, statistical arbitrage, statistical model, Stephen Hawking, Steve Jobs, stochastic volatility, technology bubble, the new new thing, transaction costs, value at risk, volatility smile, Y2K, yield curve, zero-coupon bond, zero-sum game

Area 10 had played a large role in evangelizing the dual view of programs as both tools and text, written not only to control electronic machines but also to be understood and manipulated by people. At the labs people were proud of programming and viewed it as an art. In physics and engineering the Labs was an experimental and theoretical powerhouse, producing research in electronics and information theory that made possible many of the subsequent advances in communications. Bardeen, Brattain, and Shockley had invented the transistor there in 1947, and Claude Shannon published his landmark paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948. There were fundamental discoveries made, too: Penzias and Wilson won the Nobel Prize for discovering the cosmic radiation left behind by the Big Bang, as predicted by Robert Herman. Even during the period I worked there, Horst Stormer, now at Columbia University, did the research on the quantum Hall effect that recently won him a share of the Nobel Prize.


pages: 396 words: 107,814

Is That a Fish in Your Ear?: Translation and the Meaning of Everything by David Bellos

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Clapham omnibus, Claude Shannon: information theory, Douglas Hofstadter, Etonian, European colonialism, haute cuisine, invention of the telephone, invention of writing, natural language processing, Republic of Letters, Sapir-Whorf hypothesis, speech recognition

In a famous memorandum written in July 1949, Warren Weaver, then a senior official with the Rockefeller Foundation, found it “very tempting to say that a book written in Chinese is simply a book in English which was coded into the ‘Chinese code.’ If we have useful methods for solving almost any cryptographic problem, may it not be that with proper interpretation we already have useful methods for translation?”2 Weaver was aware of the pioneering work of Claude Shannon and others in the nascent disciplines of information theory and cybernetics and could see that if language could be treated as a code, then there would be huge development contracts available for mathematicians, logicians, and engineers working on the new and exciting number-crunching devices that had only just acquired their modern name of “computers.” But the temptation to see “language as code” comes from much deeper sources than just an intuition that it would create interesting jobs for very smart boys.


pages: 396 words: 117,149

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World by Pedro Domingos

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

3D printing, Albert Einstein, Amazon Mechanical Turk, Arthur Eddington, basic income, Bayesian statistics, Benoit Mandelbrot, bioinformatics, Black Swan, Brownian motion, cellular automata, Claude Shannon: information theory, combinatorial explosion, computer vision, constrained optimization, correlation does not imply causation, creative destruction, crowdsourcing, Danny Hillis, data is the new oil, double helix, Douglas Hofstadter, Erik Brynjolfsson, experimental subject, Filter Bubble, future of work, global village, Google Glasses, Gödel, Escher, Bach, information retrieval, job automation, John Markoff, John Snow's cholera map, John von Neumann, Joseph Schumpeter, Kevin Kelly, lone genius, mandelbrot fractal, Mark Zuckerberg, Moneyball by Michael Lewis explains big data, Narrative Science, Nate Silver, natural language processing, Netflix Prize, Network effects, NP-complete, off grid, P = NP, PageRank, pattern recognition, phenotype, planetary scale, pre–internet, random walk, Ray Kurzweil, recommendation engine, Richard Feynman, Richard Feynman, Second Machine Age, self-driving car, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steven Levy, Steven Pinker, superintelligent machines, the scientific method, The Signal and the Noise by Nate Silver, theory of mind, Thomas Bayes, transaction costs, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, white flight, zero-sum game

The simplest algorithm is: flip a switch. The state of one transistor is one bit of information: one if the transistor is on, and zero if it’s off. One bit somewhere in your bank’s computers says whether your account is overdrawn or not. Another bit somewhere in the Social Security Administration’s computers says whether you’re alive or dead. The second simplest algorithm is: combine two bits. Claude Shannon, better known as the father of information theory, was the first to realize that what transistors are doing, as they switch on and off in response to other transistors, is reasoning. (That was his master’s thesis at MIT—the most important master’s thesis of all time.) If transistor A turns on only when transistors B and C are both on, it’s doing a tiny piece of logical reasoning. If A turns on when either B or C is on, that’s another tiny logical operation.
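A minimal sketch of the two-bit logic described above, using ordinary Boolean operators as stand-ins for transistors; combining these tiny operations already yields less trivial reasoning, such as exclusive-or.

```python
def AND(b, c):  # A turns on only when B and C are both on
    return int(b and c)

def OR(b, c):   # A turns on when either B or C is on
    return int(b or c)

def NOT(b):     # A turns on exactly when B is off
    return int(not b)

# Exclusive-or built from the pieces above: "B or C, but not both."
def XOR(b, c):
    return AND(OR(b, c), NOT(AND(b, c)))

for b in (0, 1):
    for c in (0, 1):
        print(b, c, XOR(b, c))
```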


We repeat this for each new node we induce until all the examples in a branch have the same class, at which point we label that branch with the class. One salient question is how to pick the best attribute to test at a node. Accuracy—the number of correctly predicted examples—doesn’t work very well, because we’re not trying to predict a particular class; rather, we’re trying to gradually separate the classes until each branch is “pure.” This brings to mind the concept of entropy from information theory. The entropy of a set of objects is a measure of the amount of disorder in it. If a group of 150 people includes 50 Republicans, 50 Democrats, and 50 independents, its political entropy is maximum. On the other hand, if they’re all Republican then the entropy is zero (as far as party affiliation goes). So to learn a good decision tree, we pick at each node the attribute that on average yields the lowest class entropy across all its branches, weighted by how many examples go into each branch.
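A small sketch of that selection rule (illustrative only; the attributes, labels, and data are made up): compute the entropy of the class labels, then, for each candidate attribute, the branch entropies weighted by branch size, and pick the attribute with the lowest weighted average.

```python
import math
from collections import Counter

def entropy(labels):
    """Class entropy in bits of a list of labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total) for n in Counter(labels).values())

def split_entropy(examples, labels, attribute):
    """Average entropy after splitting on `attribute`, weighted by branch size."""
    branches = {}
    for example, label in zip(examples, labels):
        branches.setdefault(example[attribute], []).append(label)
    total = len(labels)
    return sum(len(branch) / total * entropy(branch) for branch in branches.values())

# Hypothetical toy data: pick the attribute whose split leaves the least disorder.
examples = [{"party": "R", "state": "TX"}, {"party": "R", "state": "CA"},
            {"party": "D", "state": "CA"}, {"party": "D", "state": "TX"},
            {"party": "I", "state": "TX"}, {"party": "I", "state": "CA"}]
labels = ["conservative", "conservative", "liberal", "liberal", "swing", "swing"]

print(entropy(labels))  # maximum here: three equally likely classes, log2(3) ~ 1.58 bits
best = min(("party", "state"), key=lambda a: split_entropy(examples, labels, a))
print(best)  # "party" separates the classes perfectly, so its split entropy is 0
```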


pages: 509 words: 132,327

Rise of the Machines: A Cybernetic History by Thomas Rid

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

1960s counterculture, A Declaration of the Independence of Cyberspace, agricultural Revolution, Albert Einstein, Alistair Cooke, Apple II, Apple's 1984 Super Bowl advert, back-to-the-land, Berlin Wall, British Empire, Brownian motion, Buckminster Fuller, business intelligence, Claude Shannon: information theory, conceptual framework, connected car, domain-specific language, Douglas Engelbart, Douglas Engelbart, dumpster diving, Extropian, full employment, game design, global village, Haight Ashbury, Howard Rheingold, Jaron Lanier, job automation, John Markoff, John von Neumann, Kevin Kelly, Marshall McLuhan, Menlo Park, Mother of all demos, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shale / tar sands, pattern recognition, RAND corporation, Silicon Valley, Simon Singh, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, technoutopianism, Telecommunications Act of 1996, telepresence, V2 rocket, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Y2K, Yom Kippur War, Zimmermann PGP

Ron Hubbard, July 8, 1950, Norbert Wiener Papers, MC 22, box 8 (“Correspondence 1950”), folder 121, Institute Archives and Special Collections, MIT Libraries, Cambridge, MA. 9. L. Ron Hubbard to Norbert Wiener, July 26, 1950, Norbert Wiener Papers, MC 22, box 8 (“Correspondence 1950”), folder 121, Institute Archives and Special Collections, MIT Libraries, Cambridge, MA. 10. L. Ron Hubbard to Claude Shannon, December 6, 1949, Claude Elwood Shannon Papers, box 1, MSS84831, Library of Congress, Washington, DC. 11. Norbert Wiener to William Schlecht, July 8, 1950. 12. Norbert Wiener, “Some Maxims for Biologists and Psychologists,” Dialectica 4, no. 3 (September 15, 1950): 190. 13. Ibid., 191. 14. Ibid. 15. William Grey Walter, The Living Brain (London: Duckworth, 1953), 223. 16. Ibid. 17. Maxwell Maltz, Psycho-Cybernetics (New York: Pocket Books/Simon & Schuster, 1969), cover. 18. The figure of thirty million is provided by the book’s publisher.

Unsurprisingly, the breakthrough happened in the San Francisco Bay Area of the mid-1970s, with its inspiring mix of counterculture and tech entrepreneurship. These pioneers were Whitfield Diffie and Martin Hellman of Stanford University, and Ralph Merkle of UC Berkeley. Their discovery resembled what the British spy agency had already found in secret. In November 1976, a history-changing article appeared in an obscure journal, IEEE Transactions on Information Theory. It was titled “New Directions in Cryptography.” Diffie and Hellman knew that computers would be coming to the people, as Stewart Brand had just reported from the mesmerized Spacewar players on their own campus. And they knew that these computers would be networked. “The development of computer controlled communication networks promises effortless and inexpensive contact between people or computers on opposite sides of the world,” they wrote in the introduction to their landmark paper.

NM-1B (Washington, DC: Historic American Engineering Record, National Park Service, Department of the Interior, 1986), sheet 1-6. 2. Malcolm Macdonald and Viorel Bedesco, The International Handbook of Space Technology (Heidelberg: Springer, 2014), 8. 3. Norbert Wiener, “A Scientist Rebels,” Atlantic 179, no. 1 (January 1947): 46. 4. Norbert Wiener, I Am a Mathematician (Cambridge, MA: MIT Press, 1956), 308. 5. Ibid. 6. Ibid. 7. Wiesner, quoted in David Jerison, I. M. Singer, and Daniel W. Strook, eds., The Legacy of Norbert Wiener: A Centennial Symposium in Honor of the 100th Anniversary of Norbert Wiener’s Birth, October 8–14, 1994, Massachusetts Institute of Technology, Cambridge, Massachusetts (Providence, RI: American Mathematical Society, 1997), 19. 8. Entropy is a fundamental and related concept in physics as well as in information theory. See James Gleick, The Information (New York: Pantheon, 2011), chap. 9. 9. Norbert Wiener, The Human Use of Human Beings (New York: Houghton Mifflin, 1954), 263. 10. Ibid., 33. 11. Ibid., 24. 12. Norbert Wiener, Cybernetics (Cambridge, MA: MIT Press, 1948), 43. 13. Norbert Wiener, God and Golem, Inc. (Cambridge, MA: MIT Press, 1963), 74. 14. Ibid., 76. 15. Ibid. 16. Wiener, Cybernetics, 157–58; Harry Davis, “An Interview with Norbert Wiener,” New York Times, April 10, 1949, BR23. 17. William Laurence, “Cybernetics, a New Science, Seeks the Common Elements in Human and Machine,” New York Times, December 19, 1948, E9. 18. John Pfeiffer, “The Stuff That Dreams Are Made on: Cybernetics,” New York Times, January 23, 1949, BR27. 19.


pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising by Jim Jansen

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

AltaVista, barriers to entry, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, correlation does not imply causation, en.wikipedia.org, first-price auction, information asymmetry, information retrieval, intangible asset, inventory management, life extension, linear programming, megacity, Nash equilibrium, Network effects, PageRank, place-making, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social web, software as a service, stochastic process, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, Vilfredo Pareto, yield management

In other words, the ability of technology to provide information has increased exponentially (i.e., Moore’s Law), but the ability of humans to process it has remained stable. Potpourri: Why are most query terms short? Some of it may have to do with simple information needs. It may have to do with the cognitive limits of our memory (i.e., the human processing and storage power). Based on Claude Shannon and Norbert Wiener’s information theory research [21], Harvard psychologist George A. Miller [22] proposed the theoretical construct, known as the Magical Number Seven, Plus or Minus Two. The gist of the construct is that the typical human can process about seven chunks of similar information at any instant in time (sometimes a couple more, sometimes a couple fewer). The concept of chunking is now a foundational element in many theories of memory.

As wealth rises, consumers will move away from less costly, perhaps inferior, goods and services, choosing higher-priced alternatives for a variety of perceived benefits, such as quality or status. Concerning their perceived lack of searching, consumers have a trade-off between the cost of search, usually measured by time, and the benefit of that search [9]. A strict consumer search model assumes that consumers are likely to search for information as long as they believe that the benefits of acquiring information outweigh the cost of information search as indicated in the economics of information theory [11]. However, a more workable consumer search model should assume that consumers are likely to search for information until they believe they have a reasonable solution, regardless of the cost-benefit ratio. This behavior, again, illustrates the concept of satisficing [12, 13] and the principle of least effort [14]. Additionally, the searcher may actually be doing more searching than it first appears from just looking at their online activities.

., 116 Hunter, Lee, 19 IdeaLab, 11 imperfect information, 179, 194 impression, 12, 14, 24, 75–77, 133, 209 impulse buying, 93, 97–98 income effect, 91 information access, 54–55, 70, 72 information acquisition, 35, 63 information asymmetry, 63–64 information foraging, 42, 44, 63, 66–67, 70, 72, 80, 96, 213 information foraging theory, 42, 67, 70 information imbalance, 63–64 information load, 63 275 information obtainability, 42–43, 49, 70–71, 211, 213 information overload, 68 information processing theory, 65, 94, 96 information retrieval, 2, 20, 37, 51, 70–71, 154, 211 information scent, 66–67 information science, xiii, 14, 35, 71, 87 information searching, 20, 37, 211 information theory, 42, 92 informational, xi, 44–46, 64, 93, 96 informivore, 66 Infoseek, 38 Intangible content, 131 internal information search, 88 Internet television, 224 iProspect, xvi, 167 iWon, 21 Jansen, Bernard J., 20, 44, 113–114, 116, 118, 158 Johnson, Samuel, 229 Kamangar, Salar, 181 key drivers of marketing, 128 key performance indicators, 24. See€key performance indicators Knowledge uncertainty, 99 known-unknown and aware-unaware, 150 Kotler, P., 128, 138 Kraatz, Mark, 12 La Gazette de France, 122 law of diminishing marginal returns, 50 law of probability, 165 Levine, Rick, 129 Levinson, Danny, 44 Lewis, E.


The Blockchain Alternative: Rethinking Macroeconomic Policy and Economic Theory by Kariappa Bheemaiah

accounting loophole / creative accounting, Ada Lovelace, Airbnb, algorithmic trading, asset allocation, autonomous vehicles, balance sheet recession, bank run, banks create money, Basel III, basic income, Ben Bernanke: helicopter money, bitcoin, blockchain, Bretton Woods, business process, call centre, capital controls, Capital in the Twenty-First Century by Thomas Piketty, cashless society, cellular automata, central bank independence, Claude Shannon: information theory, cloud computing, cognitive dissonance, collateralized debt obligation, commoditize, complexity theory, constrained optimization, corporate governance, creative destruction, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, cryptocurrency, David Graeber, deskilling, Diane Coyle, discrete time, distributed ledger, diversification, double entry bookkeeping, ethereum blockchain, fiat currency, financial innovation, financial intermediation, Flash crash, floating exchange rates, Fractional reserve banking, full employment, George Akerlof, illegal immigration, income inequality, income per capita, inflation targeting, information asymmetry, interest rate derivative, inventory management, invisible hand, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Joseph Schumpeter, Kenneth Arrow, Kenneth Rogoff, Kevin Kelly, knowledge economy, labour market flexibility, large denomination, liquidity trap, London Whale, low skilled workers, M-Pesa, Marc Andreessen, market bubble, market fundamentalism, Mexican peso crisis / tequila crisis, money market fund, money: store of value / unit of account / medium of exchange, mortgage debt, natural language processing, Network effects, new economy, Nikolai Kondratiev, offshore financial centre, packet switching, Pareto efficiency, pattern recognition, peer-to-peer lending, Ponzi scheme, precariat, pre–internet, price mechanism, price stability, private sector deleveraging, profit maximization, QR code, quantitative easing, quantitative trading / quantitative finance, Ray Kurzweil, Real Time Gross Settlement, rent control, rent-seeking, Satoshi Nakamoto, Satyajit Das, savings glut, seigniorage, Silicon Valley, Skype, smart contracts, software as a service, software is eating the world, speech recognition, statistical model, Stephen Hawking, supply-chain management, technology bubble, The Chicago School, The Future of Employment, The Great Moderation, the market place, The Nature of the Firm, the payments system, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, too big to fail, trade liberalization, transaction costs, Turing machine, Turing test, universal basic income, Von Neumann architecture, Washington Consensus

It consisted of multiple rotating disks and cylinders driven by electric motors linked together with metal rods that were manually set up (sometimes taking up to two days) to solve any differential equation problem. Vannevar had recruited Claude Shannon (known today as the father of information theory), a young graduate who specialised in symbolic logic. Although the Differential Analyzer was a mechanical machine with moving parts, Shannon identified it as a complicated control circuit with relays. Shannon thus began creating the first generation of circuit designs and, in the process, was able to transform information into a quantity that could be subjected to manipulation by a machine. Using Boolean algebra, logic gates and binary arithmetic (bits and bytes), Shannon was able to represent all types of information by numbers and in the process created the foundations for today’s modern information theory. It is for this reason that he is referred to as the father of information theory. As World War Two began in 1939, these advances in information technology had been adopted by various militaries to communicate sensitive information.


pages: 437 words: 132,041

Alex's Adventures in Numberland by Alex Bellos

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Andrew Wiles, Antoine Gombaud: Chevalier de Méré, beat the dealer, Black Swan, Black-Scholes formula, Claude Shannon: information theory, computer age, Daniel Kahneman / Amos Tversky, Edward Thorp, family office, forensic accounting, game design, Georg Cantor, Henri Poincaré, Isaac Newton, Myron Scholes, pattern recognition, Paul Erdős, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, random walk, Richard Feynman, Richard Feynman, Rubik’s Cube, SETI@home, Steve Jobs, The Bell Curve by Richard Herrnstein and Charles Murray, traveling salesman

If we knew exactly the speed and angle of the flip, the density of the air and any other relevant physical data, we would be able to calculate exactly the face on which it would land. In the mid 1950s a young mathematician named Ed Thorp began to ponder what set of information would be required to predict where a ball would land in roulette. Thorp was helped in his endeavour by Claude Shannon, his colleague at the Massachusetts Institute of Technology. He couldn’t have wished for a better co-conspirator. Shannon was a prolific inventor with a garage full of electronic and mechanical gadgets. He was also one of the most important mathematicians in the world, as the father of information theory, a crucial academic breakthrough that led to the development of the computer. The men bought a roulette wheel and conducted experiments in Shannon’s basement. They discovered that if they knew the speed of the ball as it went around the stationary outer rim, and the speed of the inner wheel (which is spun in the opposite direction of the ball), they could make pretty good estimates regarding which segment of the wheel the ball would fall in.


pages: 416 words: 129,308

The One Device: The Secret History of the iPhone by Brian Merchant

Airbnb, Apple II, Apple's 1984 Super Bowl advert, citizen journalism, Claude Shannon: information theory, computer vision, conceptual framework, Douglas Engelbart, Dynabook, Edward Snowden, Elon Musk, Ford paid five dollars a day, Frank Gehry, global supply chain, Google Earth, Google Hangouts, Internet of things, Jacquard loom, Jacquard loom, John Gruber, John Markoff, Jony Ive, Lyft, M-Pesa, more computing power than Apollo, Mother of all demos, natural language processing, new economy, New Journalism, Norbert Wiener, offshore financial centre, oil shock, pattern recognition, peak oil, pirate software, profit motive, QWERTY keyboard, ride hailing / ride sharing, rolodex, Silicon Valley, Silicon Valley startup, skunkworks, Skype, Snapchat, special economic zone, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Tim Cook: Apple, Turing test, Upton Sinclair, Vannevar Bush, zero day

With the resonant opening line “I propose to consider the question, ‘Can machines think?’” in his 1950 paper “Computing Machinery and Intelligence,” Alan Turing framed much of the debate to come. That work discusses his famous Imitation Game, now colloquially known as the Turing Test, which describes criteria for judging whether a machine may be considered sufficiently “intelligent.” Claude Shannon, the communication theorist, published his seminal work on information theory, introducing the concept of the bit as well as a language through which humans might speak to computers. In 1956, Stanford’s John McCarthy and his colleagues coined the term artificial intelligence for a new discipline, and we were off to the races. Over the next decade, as the scientific investigation of AI began to draw interest from the public and as, simultaneously, computer terminals became a more ubiquitous machine-human interface, the two future threads—screen-based interfaces and AI—wound into one, and the servile human-shaped robots of yore became disembodied.


pages: 685 words: 203,949

The Organized Mind: Thinking Straight in the Age of Information Overload by Daniel J. Levitin

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

airport security, Albert Einstein, Amazon Mechanical Turk, Anton Chekhov, Bayesian statistics, big-box store, business process, call centre, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, cuban missile crisis, Daniel Kahneman / Amos Tversky, delayed gratification, Donald Trump, en.wikipedia.org, epigenetics, Eratosthenes, Exxon Valdez, framing effect, friendly fire, fundamental attribution error, Golden Gate Park, Google Glasses, haute cuisine, impulse control, index card, indoor plumbing, information retrieval, invention of writing, iterative process, jimmy wales, job satisfaction, Kickstarter, life extension, meta analysis, meta-analysis, more computing power than Apollo, Network effects, new economy, Nicholas Carr, optical character recognition, Pareto efficiency, pattern recognition, phenotype, placebo effect, pre–internet, profit motive, randomized controlled trial, Rubik’s Cube, Skype, Snapchat, statistical model, Steve Jobs, supply-chain management, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Turing test, ultimatum game, zero-sum game

Separate research by Kahneman and Tversky shows that people are unable to ignore information that is not relevant to them, so there is a real neural cost of being presented with information you don’t care about and can’t use. Then the question becomes not one of how many things you can do at once, but how orderly you can make the information environment. There is considerable research into the difference in utility between simple and complex information. Claude Shannon, an electrical engineer who worked at Bell Laboratories, developed information theory in the 1940s. Shannon information theory is among the most important mathematical ideas of the twentieth century; it has profoundly affected computing and telecommunications, and is the basis for the compression of sound, image, and movie files (e.g., MP3, JPEG, and MP4 respectively). A fundamental problem in telecommunications, signaling, and security is how to transmit a message as briefly as possible, to pack the maximum amount of data into the minimum amount of time or space; this packing is called data compression.

information you don’t care about and can’t use Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press. developed information theory in the 1940s Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379–423, 623–656. See also, Cover, T. M., & Thomas, J. A. (2006). Elements of information theory (2nd ed.). New York, NY: Wiley-Interscience. and, Hartley, R. V. L. (1928). Transmission of information. The Bell System Technical Journal, 7(3), 535–563. Pierce, J. R. (1980) An introduction to information theory: Symbols, signals, and noise. New York, NY: Dover Publications. transmitted only 300–3300 hertz Anderson, H., & Yull, S. (2002). BTEC nationals—IT practioners tutor resource pack.

This bandwidth limitation is most apparent if you try to listen to music over the telephone—the low frequencies of the bass and the high frequencies of cymbals are almost completely absent. Information theory came up in Chapter 1 in discussing the number of simultaneous conversations that a person can follow, with the information-processing limits of human attention estimated at around 120 bits per second. It is a way to quantify the amount of information contained in any transmission, instruction, or sensory stimulus. It can apply to music, speech, paintings, and military orders. The application of information theory generates a number that allows us to compare the amount of information contained in one transmission with that contained in another. Suppose that you want to convey instructions to someone on how to construct a chessboard.


pages: 781 words: 226,928

Commodore: A Company on the Edge by Brian Bagnall

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

Apple II, belly landing, Bill Gates: Altair 8800, Byte Shop, Claude Shannon: information theory, computer age, Douglas Engelbart, Douglas Engelbart, Firefox, game design, index card, inventory management, Isaac Newton, low skilled workers, Menlo Park, packet switching, pink-collar, popular electronics, prediction markets, pre–internet, QWERTY keyboard, Robert Metcalfe, Robert X Cringely, Silicon Valley, special economic zone, Steve Jobs, Steve Wozniak, Ted Nelson

One of Shannon’s cultists would soon spread the word to Peddle at the University of Maine. During Peddle’s senior year, the University of Maine accepted a lecturer from MIT who studied under Claude Shannon. According to Peddle, “He had a nervous breakdown, so he left MIT. The University of Maine was so happy to get him because he was so superior to the type of instructor they could normally get. They gave him the opportunity to teach only four classes per week between the hours of eleven o’clock and noon. The guy was being totally babied and should have been since he was a great instructor. He decided to put together a class to teach people about Information Theory.” At the time, Peddle was enrolling for his junior year. The new Information Theory class happened to fit into his schedule. As Peddle recalls, “It changed my life.” The class began with the instructor discussing the eyes and ears as the primary sensors for receiving information.

In 1948, while working at Bell Labs, Shannon produced a groundbreaking paper, A Mathematical Theory of Communication. In it, he rigorously analyzed the concept of Information Theory and how pictures, words, sounds and other media are transmitted using a stream of ones and zeros. He even coined the word “bit.” Peddle was enchanted with his theories. “Today, you take this for granted, but you have to remember that someone had to dream all this up,” he says. “Everyone else’s work stands on his shoulders and most people don’t even know it.”

[Photo caption: Chuck Peddle, father of the 6502 (Das neue P.M. Computerheft).]

In 1956, Shannon returned to MIT at Lincoln Labs as a lecturer and Artificial Intelligence researcher. While there, he spread his concepts on Information Theory. “He changed the world,” says Peddle. “Shannon was not only a pioneer, but a prophet. He effectively developed a following, almost like a cult.”

The class began with the instructor discussing the eyes and ears as the primary sensors for receiving information. “He started teaching us about Boolean algebra and binary logic, and the concept of Information Theory,” recalls Peddle. “I just fell in love. This was where I was going to spend my life.” However, the topic that interested Peddle the most was computers. “Information Theory was interesting, and I’ve used it from time to time, but the computer stuff this guy taught me was life changing.” Peddle immersed himself in computer theory. “I got an A on my senior paper in physics class by giving a discussion on binary and Boolean arithmetic. I was trying to build an AND gate in my senior class [using early transistors] and the top electrical engineers on campus couldn’t help me figure out the structures and why my AND gate didn’t work.”


pages: 829 words: 186,976

The Signal and the Noise: Why So Many Predictions Fail-But Some Don't by Nate Silver

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

airport security, availability heuristic, Bayesian statistics, Benoit Mandelbrot, Berlin Wall, Bernie Madoff, big-box store, Black Swan, Broken windows theory, Carmen Reinhart, Claude Shannon: information theory, Climategate, Climatic Research Unit, cognitive dissonance, collapse of Lehman Brothers, collateralized debt obligation, complexity theory, computer age, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, cuban missile crisis, Daniel Kahneman / Amos Tversky, diversification, Donald Trump, Edmond Halley, Edward Lorenz: Chaos theory, en.wikipedia.org, equity premium, Eugene Fama: efficient market hypothesis, everywhere but in the productivity statistics, fear of failure, Fellow of the Royal Society, Freestyle chess, fudge factor, George Akerlof, haute cuisine, Henri Poincaré, high batting average, housing crisis, income per capita, index fund, Intergovernmental Panel on Climate Change (IPCC), Internet Archive, invention of the printing press, invisible hand, Isaac Newton, James Watt: steam engine, John Nash: game theory, John von Neumann, Kenneth Rogoff, knowledge economy, locking in a profit, Loma Prieta earthquake, market bubble, Mikhail Gorbachev, Moneyball by Michael Lewis explains big data, Monroe Doctrine, mortgage debt, Nate Silver, negative equity, new economy, Norbert Wiener, PageRank, pattern recognition, pets.com, Pierre-Simon Laplace, prediction markets, Productivity paradox, random walk, Richard Thaler, Robert Shiller, Robert Shiller, Rodney Brooks, Ronald Reagan, Saturday Night Live, savings glut, security theater, short selling, Skype, statistical model, Steven Pinker, The Great Moderation, The Market for Lemons, the scientific method, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, too big to fail, transaction costs, transfer pricing, University of East Anglia, Watson beat the top human players on Jeopardy!, wikimedia commons

The Birth of the Chess Computer The Spanish engineer Leonardo Torres y Quevedo built a version of the Mechanical Turk in 1912, which he called El Ajedrecista (the chess player). Although El Ajedrecista is sometimes regarded as the first computer game,7 it was extremely limited in its functionality, restricted to determining positions in an endgame in which there are just three pieces left on the board. (El Ajedrecista also did not have any stereotypical Turkish headgear.) The father of the modern chess computer was MIT’s Claude Shannon, a mathematician regarded as the founder of information theory, who in 1950 published a paper called “Programming a Computer for Playing Chess.”8 Shannon identified some of the algorithms and techniques that form the backbone of chess programs today. He also recognized why chess is such an interesting problem for testing the powers of information-processing machines. Chess, Shannon realized, has an exceptionally clear and distinct goal—achieving checkmate.
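The core idea Shannon laid out in that paper, searching the tree of moves to a fixed depth, scoring the leaf positions with an evaluation function, and assuming each side plays its best reply, is the minimax procedure sketched below. This is an illustrative outline, not Shannon's code; the evaluation function and move generator are placeholders supplied by the caller, and the demonstration at the end uses a made-up stand-in game.

```python
def minimax(position, depth, maximizing, evaluate, legal_moves, apply_move):
    """Score `position` by searching `depth` plies ahead and backing up values.

    `evaluate`, `legal_moves`, and `apply_move` are placeholders for a real
    engine's evaluation function and move generator.
    """
    moves = legal_moves(position)
    if depth == 0 or not moves:
        return evaluate(position)
    scores = [minimax(apply_move(position, move), depth - 1, not maximizing,
                      evaluate, legal_moves, apply_move)
              for move in moves]
    return max(scores) if maximizing else min(scores)

# Toy stand-in "game": positions are integers, each move adds 1 or 2,
# and a position's score is simply its value.
score = minimax(position=0, depth=3, maximizing=True,
                evaluate=lambda p: p,
                legal_moves=lambda p: [1, 2],
                apply_move=lambda p, m: p + m)
print(score)  # 5: the maximizer adds 2, the minimizer adds 1, the maximizer adds 2
```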

., 187 incentives, 184, 250, 313, 333, 356, 357, 501 Inconvenient Truth, An, 385 index funds, 344, 370 India, 210n Indiana Pacers, 236n, 489 indicators: lagging, 187–88 leading, 186–88, 196–97 Indonesia, 209 Industrial Revolution, 2, 5–6, 112, 212 infectious disease, 16, 204–31 SIR model of, 220–21, 221, 223, 225, 389 see also specific diseases inflation, 11, 186n, 191, 198, 202 information, 1, 451 analyzing, 232 asymmetrical, 35 collecting, 232 exclusive, 98, 99–101 explosion of, 3–4, 6, 7, 12, 13, 17, 45 forecasting and, 5 quality of, 2, 13, 17 information age, 7, 45, 358 information overload, 12 information processing, 449 information technology, 411 revolution in, 1–4 information theory, 265 InfoSpace, 353, 360 initial condition uncertainty, 390–92 insider trading, 341–42 intelligent extraterrestrial species, 488 interest rates, 186n, 190, 202 International Panel on Climate Change (IPCC), 382, 383–84, 393 1990 predictions of, 373–76, 389, 393, 397–99, 397, 399, 401, 507 “uncertainty” in, 389 Internet, 13, 223, 250, 514 poker on, 296–97, 310 quality of information on, 3 Intrade, 333, 334, 335, 336–37, 358, 367, 497 invisible hand, 332 Ioannidis, John P.

The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

additive manufacturing, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, Benoit Mandelbrot, Bill Joy: nanobots, bioinformatics, brain emulation, Brewster Kahle, Brownian motion, business intelligence, c2.com, call centre, carbon-based life, cellular automata, Claude Shannon: information theory, complexity theory, conceptual framework, Conway's Game of Life, cosmological constant, cosmological principle, cuban missile crisis, data acquisition, Dava Sobel, David Brooks, Dean Kamen, disintermediation, double helix, Douglas Hofstadter, en.wikipedia.org, epigenetics, factory automation, friendly AI, George Gilder, Gödel, Escher, Bach, informal economy, information retrieval, invention of the telephone, invention of the telescope, invention of writing, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, lifelogging, linked data, Loebner Prize, Louis Pasteur, mandelbrot fractal, Mikhail Gorbachev, mouse model, Murray Gell-Mann, mutually assured destruction, natural language processing, Network effects, new economy, Norbert Wiener, oil shale / tar sands, optical character recognition, pattern recognition, phenotype, premature optimization, randomized controlled trial, Ray Kurzweil, remote working, reversible computing, Richard Feynman, Richard Feynman, Robert Metcalfe, Rodney Brooks, Search for Extraterrestrial Intelligence, selection bias, semantic web, Silicon Valley, Singularitarianism, speech recognition, statistical model, stem cell, Stephen Hawking, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Kaczynski, telepresence, The Coming Technological Singularity, Thomas Bayes, transaction costs, Turing machine, Turing test, Vernor Vinge, Y2K, Yogi Berra

See Theodore Modis, "Forecasting the Growth of Complexity and Change," Technological Forecasting and Social Change 69.4 (2002), http://ourworld.compuserve.com/homepages/tmodis/TedWEB.htm. 3. Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper "A Mathematical Theory of Communication," The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data. For example, silence in an audio file could be replaced by a value that indicates the duration of the silence, and letter combinations in a text file could be replaced with coded identifiers in the compressed file.
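A minimal illustration of the redundancy idea in that note (a sketch, not Shannon's analysis or anything from the book): run-length encoding replaces a stretch of repeated values, such as silence in an audio file, with the value and a count of how long the run lasts.

```python
def run_length_encode(samples):
    """Collapse runs of identical values into (value, run_length) pairs."""
    encoded = []
    for sample in samples:
        if encoded and encoded[-1][0] == sample:
            encoded[-1][1] += 1
        else:
            encoded.append([sample, 1])
    return encoded

def run_length_decode(encoded):
    """Expand (value, run_length) pairs back into the original sequence."""
    return [value for value, count in encoded for _ in range(count)]

audio = [3, 7, 0, 0, 0, 0, 0, 0, 5, 5, 2]   # the zeros stand in for silence
packed = run_length_encode(audio)
print(packed)                                # [[3, 1], [7, 1], [0, 6], [5, 2], [2, 1]]
assert run_length_decode(packed) == audio    # the compression here is lossless
```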

Data Mining: Concepts and Techniques: Concepts and Techniques by Jiawei Han, Micheline Kamber, Jian Pei

Amazon: amazon.comamazon.co.ukamazon.deamazon.fr

bioinformatics, business intelligence, business process, Claude Shannon: information theory, cloud computing, computer vision, correlation coefficient, cyber-physical system, database schema, discrete time, distributed generation, finite state, information retrieval, iterative process, knowledge worker, linked data, natural language processing, Netflix Prize, Occam's razor, pattern recognition, performance metric, phenotype, random walk, recommendation engine, RFID, semantic web, sentiment analysis, speech recognition, statistical model, stochastic process, supply-chain management, text mining, thinkpad, Thomas Bayes, web application

Let D, the data partition, be a training set of class-labeled tuples. Suppose the class label attribute has m distinct values defining m distinct classes, Ci (for i = 1, …, m). Let Ci,D be the set of tuples of class Ci in D. Let |D| and |Ci,D| denote the number of tuples in D and Ci,D, respectively.

Information Gain

ID3 uses information gain as its attribute selection measure. This measure is based on pioneering work by Claude Shannon on information theory, which studied the value or “information content” of messages. Let node N represent or hold the tuples of partition D. The attribute with the highest information gain is chosen as the splitting attribute for node N. This attribute minimizes the information needed to classify the tuples in the resulting partitions and reflects the least randomness or “impurity” in these partitions.

Perception-based classification (PBC), a visual and interactive approach to decision tree construction, is presented in Ankerst, Elsen, Ester, and Kriegel [AEEK99]. For a detailed discussion on attribute selection measures, see Kononenko and Hong [KH97]. Information gain was proposed by Quinlan [Qui86] and is based on pioneering work on information theory by Shannon and Weaver [SW49]. The gain ratio, proposed as an extension to information gain, is described as part of C4.5 (Quinlan [Qui93]). The Gini index was proposed for CART in Breiman, Friedman, Olshen, and Stone [BFOS84]. The G-statistic, based on information theory, is given in Sokal and Rohlf [SR81]. Comparisons of attribute selection measures include Buntine and Niblett [BN92], Fayyad and Irani [FI92], Kononenko [Kon95], Loh and Shih [LS97] and Shih [Shi99]. Fayyad and Irani [FI92] show limitations of impurity-based measures such as information gain and the Gini index.

A good weighting function should obey the following properties: (1) the best semantic indicator of a pattern, p, is itself, (2) assign the same score to two patterns if they are equally strong, and (3) if two patterns are independent, neither can indicate the meaning of the other. The meaning of a pattern, p, can be inferred from either the appearance or absence of indicators. Mutual information is one of several possible weighting functions. It is widely used in information theory to measure the mutual independency of two random variables. Intuitively, it measures how much information a random variable tells about the other. Given two frequent patterns, pα and pβ, let X = {0, 1} and Y = {0, 1} be two random variables representing the appearance of pα and pβ, respectively. Mutual information I(X; Y) is computed as

I(X; Y) = Σ_{x∈X} Σ_{y∈Y} P(x, y) log [P(x, y) / (P(x) P(y))]   (7.18)

where the joint probabilities P(x, y) and the marginals P(x) and P(y) are estimated from the co-occurrence counts of pα and pβ in the data. Standard Laplace smoothing can be used to avoid zero probability.
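A small sketch of that computation, with made-up co-occurrence counts: estimate the four joint probabilities of the two binary variables from counts, apply Laplace smoothing, and sum the mutual-information terms.

```python
import math

def mutual_information(counts, smoothing=1.0):
    """Mutual information, in bits, of two binary variables.

    `counts[x][y]` is the number of observations with X = x and Y = y,
    e.g. how often two patterns appear together, separately, or not at all.
    """
    smoothed = [[counts[x][y] + smoothing for y in (0, 1)] for x in (0, 1)]
    total = sum(sum(row) for row in smoothed)
    p_x = [sum(smoothed[x]) / total for x in (0, 1)]
    p_y = [sum(smoothed[x][y] for x in (0, 1)) / total for y in (0, 1)]
    mi = 0.0
    for x in (0, 1):
        for y in (0, 1):
            p_xy = smoothed[x][y] / total
            mi += p_xy * math.log2(p_xy / (p_x[x] * p_y[y]))
    return mi

# Hypothetical co-occurrence counts: the two patterns almost always appear together.
print(mutual_information([[40, 5], [5, 50]]))
```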