probability theory / Blaise Pascal / Pierre de Fermat

25 results


pages: 266 words: 86,324

The Drunkard's Walk: How Randomness Rules Our Lives by Leonard Mlodinow

Albert Einstein, Alfred Russel Wallace, Antoine Gombaud: Chevalier de Méré, Atul Gawande, behavioural economics, Brownian motion, butterfly effect, correlation coefficient, Daniel Kahneman / Amos Tversky, data science, Donald Trump, feminist movement, forensic accounting, Gary Kildall, Gerolamo Cardano, Henri Poincaré, index fund, Isaac Newton, law of one price, Monty Hall problem, pattern recognition, Paul Erdős, Pepto Bismol, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, Richard Feynman, Ronald Reagan, Stephen Hawking, Steve Jobs, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Bayes, V2 rocket, Watson beat the top human players on Jeopardy!

Though the new ideas would again be developed in the context of gambling, the first of this new breed was more a mathematician turned gambler than, like Cardano, a gambler turned mathematician. His name was Blaise Pascal. Pascal was born in June 1623 in Clermont-Ferrand, a little more than 250 miles south of Paris. Realizing his son’s brilliance, and having moved to Paris, Blaise’s father introduced him at age thirteen to a newly founded discussion group there that insiders called the Académie Mersenne after the black-robed friar who had founded it. Mersenne’s society included the famed philosopher-mathematician René Descartes and the amateur mathematics genius Pierre de Fermat. The strange mix of brilliant thinkers and large egos, with Mersenne present to stir the pot, must have had a great influence on the teenage Blaise, who developed personal ties to both Fermat and Descartes and picked up a deep grounding in the new scientific method.

And so in 1654 began one of the great correspondences in the history of mathematics, between Pascal and Pierre de Fermat. In 1654, Fermat held a high position in the Tournelle, or criminal court, in Toulouse. When the court was in session, a finely robed Fermat might be found condemning errant functionaries to be burned at the stake. But when the court was not in session, he would turn his analytic skills to the gentler pursuit of mathematics. He may have been an amateur, but Pierre de Fermat is usually considered the greatest amateur mathematician of all time. Fermat had not gained his high position through any particular ambition or accomplishment.

Henk Tijms, Understanding Probability: Chance Rules in Everyday Life (Cambridge: Cambridge University Press, 2004), p. 16. 4. Ibid., p. 80. 5. David, Gods, Games and Gambling, p. 65. 6. Blaise Pascal, quoted in Jean Steinmann, Pascal, trans. Martin Turnell (New York: Harcourt, Brace & World, 1962), p. 72. 7. Gilberte Pascal, quoted in Morris Bishop, Pascal: The Life of a Genius (1936; repr., New York: Greenwood Press, 1968), p. 47. 8. Ibid., p. 137. 9. Gilberte Pascal, quoted ibid., p. 135. 10. See A.W.F. Edwards, Pascal’s Arithmetical Triangle: The Story of a Mathematical Idea (Baltimore: Johns Hopkins University Press, 2002). 11. Blaise Pascal, quoted in Herbert Westren Turnbull, The Great Mathematicians (New York: New York University Press, 1961), p. 131. 12.


pages: 415 words: 125,089

Against the Gods: The Remarkable Story of Risk by Peter L. Bernstein

Alan Greenspan, Albert Einstein, Alvin Roth, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, behavioural economics, Big bang: deregulation of the City of London, Bretton Woods, business cycle, buttonwood tree, buy and hold, capital asset pricing model, cognitive dissonance, computerized trading, Daniel Kahneman / Amos Tversky, diversified portfolio, double entry bookkeeping, Edmond Halley, Edward Lloyd's coffeehouse, endowment effect, experimental economics, fear of failure, Fellow of the Royal Society, Fermat's Last Theorem, financial deregulation, financial engineering, financial innovation, full employment, Great Leap Forward, index fund, invention of movable type, Isaac Newton, John Nash: game theory, John von Neumann, Kenneth Arrow, linear programming, loss aversion, Louis Bachelier, mental accounting, moral hazard, Myron Scholes, Nash equilibrium, Norman Macrae, Paul Samuelson, Philip Mirowski, Post-Keynesian economics, probability theory / Blaise Pascal / Pierre de Fermat, prudent man rule, random walk, Richard Thaler, Robert Shiller, Robert Solow, spectrum auction, statistical model, stocks for the long run, The Bell Curve by Richard Herrnstein and Charles Murray, The Wealth of Nations by Adam Smith, Thomas Bayes, trade route, transaction costs, tulip mania, Vanguard fund, zero-sum game

In 1654, a time when the Renaissance was in full flower, the Chevalier de Mere, a French nobleman with a taste for both gambling and mathematics, challenged the famed French mathematician Blaise Pascal to solve a puzzle. The question was how to divide the stakes of an unfinished game of chance between two players when one of them is ahead. The puzzle had confounded mathematicians since it was posed some two hundred years earlier by the monk Luca Paccioli. This was the man who brought double-entry bookkeeping to the attention of the business managers of his day, and tutored Leonardo da Vinci in the multiplication tables. Pascal turned for help to Pierre de Fermat, a lawyer who was also a brilliant mathematician. The outcome of their collaboration was intellectual dynamite.

By 1700, as we mentioned earlier, the English government was financing its budget deficits through the sale of life annuities. The story of the three Frenchmen begins with an unlikely trio who saw beyond the gaming tables and fashioned the systematic and theoretical foundations for measuring probability. The first, Blaise Pascal, was a brilliant young dissolute who subsequently became a religious zealot and ended up rejecting the use of reason. The second, Pierre de Fermat, was a successful lawyer for whom mathematics was a sideline. The third member of the group was a nobleman, the Chevalier de Mere, who combined his taste for mathematics with an irresistible urge to play games of chance; his fame rests simply on his having posed the question that set the other two on the road to discovery.

In addition, Omar Khayyam used technical mathematical observations to reform the calendar and to devise a triangular rearrangement of numbers that facilitated the figuring of squares, cubes, and higher powers of mathematics; this triangle formed the basis of concepts developed by the seventeenth-century French mathematician Blaise Pascal, one of the fathers of the theory of choice, chance, and probability. The impressive achievements of the Arabs suggest once again that an idea can go so far and still stop short of a logical conclusion. Why, given their advanced mathematical ideas, did the Arabs not proceed to probability theory and risk management? The answer, I believe, has to do with their view of life. Who determines our future: the fates, the gods, or ourselves? The idea of risk management emerges only when people believe that they are to some degree free agents.


pages: 289 words: 85,315

Fermat’s Last Theorem by Simon Singh

Albert Einstein, Andrew Wiles, Antoine Gombaud: Chevalier de Méré, Arthur Eddington, Augustin-Louis Cauchy, Bletchley Park, Fellow of the Royal Society, Georg Cantor, Henri Poincaré, Isaac Newton, John Conway, John von Neumann, kremlinology, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, Rubik’s Cube, Simon Singh, Wolfskehl Prize

Mathematical Magic Show, by Martin Gardner, 1977, Knopf. A collection of mathematical puzzles and riddles. River meandering as a self-organization process, by Hans-Henrik Støllum, Science 271 (1996), 1710-1713. Chapter 2 The Mathematical Career of Pierre de Fermat, by Michael Mahoney, 1994, Princeton University Press. A detailed investigation into the life and work of Pierre de Fermat. Archimedes’ Revenge, by Paul Hoffman, 1988, Penguin. Fascinating tales which describe the joys and perils of mathematics. Chapter 3 Men of Mathematics, by E.T. Bell, Simon and Schuster, 1937. Biographies of history’s greatest mathematicians, including Euler, Fermat, Gauss, Cauchy and Kummer.

Once published, proofs would be examined and argued over by everyone and anyone who knew anything about the subject. When Blaise Pascal pressed him to publish some of his work, the recluse replied: ‘Whatever of my work is judged worthy of publication, I do not want my name to appear there.’ Fermat was the secretive genius who sacrificed fame in order not to be distracted by petty questions from his critics. This exchange of letters with Pascal, the only occasion when Fermat discussed ideas with anyone but Mersenne, concerned the creation of an entirely new branch of mathematics – probability theory. The mathematical hermit was introduced to the subject by Pascal, and so, despite his desire for isolation, he felt obliged to maintain a dialogue.

I built up an extraordinary picture around Andrew Wiles, and the puzzle that dominated his life, but I seemed destined never to meet the man himself. Although the maths involved in Wiles’s proof is some of the toughest in the world, I found that the beauty of Fermat’s Last Theorem lies in the fact that the problem itself is supremely simple to understand. It is a puzzle that is stated in terms familiar to every schoolchild. Pierre de Fermat was a man in the Renaissance tradition, who was at the centre of the rediscovery of ancient Greek knowledge, but he asked a question that the Greeks would not have thought to ask, and in so doing produced what became the hardest problem on earth for others to solve. Tantalisingly, he left a note for posterity suggesting that he had an answer, but not what it was.


pages: 437 words: 132,041

Alex's Adventures in Numberland by Alex Bellos

Andrew Wiles, Antoine Gombaud: Chevalier de Méré, beat the dealer, Black Swan, Black-Scholes formula, Claude Shannon: information theory, computer age, Daniel Kahneman / Amos Tversky, digital rights, Edward Thorp, family office, forensic accounting, game design, Georg Cantor, Henri Poincaré, Isaac Newton, Johannes Kepler, lateral thinking, Myron Scholes, pattern recognition, Paul Erdős, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, random walk, Richard Feynman, Rubik’s Cube, SETI@home, Steve Jobs, The Bell Curve by Richard Herrnstein and Charles Murray, traveling salesman, two and twenty

He had a couple of questions about gambling, though, that he was unable to answer himself, so in 1654 he approached the distinguished mathematician Blaise Pascal. His chance enquiry was the random event that set in motion the proper study of randomness. Blaise Pascal was only 31 when he received de Méré’s queries, but he had been known in intellectual circles for almost two decades. Pascal had shown such gifts as a young child that at 13 his father had let him attend the scientific salon organized by Marin Mersenne, the friar and prime-number enthusiast, which brought together many famous mathematicians, including René Descartes and Pierre de Fermat. While still a teenager, Pascal proved important theorems in geometry and invented an early mechanical calculation machine, which was called the Pascaline.

In 1621, a Latin translation of Diophantus’s masterpiece Arithmetica was published in France. The new edition rekindled interest in ancient problem-solving techniques, which, combined with better numerical and symbolic notation, ushered in a new era of mathematical thought. Less convoluted notation allowed greater clarity in describing problems. Pierre de Fermat, a civil servant and judge living in Toulouse, was an enthusiastic amateur mathematician who filled his own copy of Arithmetica with numerical musings. Next to a section dealing with Pythagorean triples – any set of natural numbers a, b and c such that a² + b² = c², for example 3, 4 and 5 – Fermat scribbled some notes in the margin.

The Pythagoreans saw 220 and 284 as symbols of friendship. During the Middle Ages talismans with these numbers were made, to promote love. One Arab wrote that he tried to test the erotic effect of eating something labelled with the number 284, while a partner was eating something labelled 220. It was only in 1636 that Pierre de Fermat discovered the second set of amicable numbers: 17,296 and 18,416. Because of the advent of computer processing, more than 11 million amicable pairs are now known. The largest pair has more than 24,000 digits each, which makes them tricky to write on a slice of baklava. In 1918 the French mathematician Paul Poulet coined the term sociable for a new type of numerical friendship.
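
As a quick illustration of the definition (each number in an amicable pair equals the sum of the other's proper divisors), a brute-force check might look like the sketch below; the function name and the trial-division approach are illustrative, not anything from the book.

    # Sketch: verify the two amicable pairs mentioned above by brute force.
    # An amicable pair (a, b) satisfies: the proper divisors of a sum to b,
    # and the proper divisors of b sum to a.

    def proper_divisor_sum(n):
        """Sum of the divisors of n that are strictly smaller than n."""
        return sum(d for d in range(1, n // 2 + 1) if n % d == 0)

    for a, b in [(220, 284), (17_296, 18_416)]:
        amicable = proper_divisor_sum(a) == b and proper_divisor_sum(b) == a
        print(a, b, amicable)   # both print True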


pages: 233 words: 62,563

Zero: The Biography of a Dangerous Idea by Charles Seife

Albert Einstein, Albert Michelson, Arthur Eddington, Cepheid variable, cosmological constant, dark matter, Eddington experiment, Edmond Halley, Georg Cantor, heat death of the universe, Isaac Newton, Johannes Kepler, John Conway, machine readable, Pierre-Simon Laplace, place-making, probability theory / Blaise Pascal / Pierre de Fermat, retrograde motion, Richard Feynman, Stephen Hawking

The Divine Wager What is man in nature? Nothing in relation to the infinite, everything in relation to nothing, a mean between nothing and everything. —BLAISE PASCAL, PENSÉES Pascal was a mathematician as well as a scientist. In science Pascal investigated the vacuum—the nature of the void. In mathematics Pascal helped invent a whole new branch of the field: probability theory. When Pascal combined probability theory with zero and with infinity, he found God. Probability theory was invented to help rich aristocrats win more money with their gambling. Pascal’s theory was extremely successful, but his mathematical career was not to last.

The steepness of the tangent line—its slope—has some important properties in physics: for instance, if you’ve got a curve that represents the position of, say, a bicycle, then the slope of the tangent line to that curve at any given point tells you how fast that bicycle is going when it reaches that spot. Figure 24: Flying off at a tangent For this reason, several seventeenth-century mathematicians—like Evangelista Torricelli, René Descartes, the Frenchman Pierre de Fermat (famous for his last theorem), and the Englishman Isaac Barrow—created different methods for calculating the tangent line to any given point on a curve. However, like Cavalieri, all of them came up against the infinitesimal. To draw a tangent line at any given point, it’s best to make a guess.

No matter the dimensions of the tube that Torricelli used, the mercury would sink down until its highest point was about 30 inches above the dish; or, looking at it another way, mercury could only rise 30 inches to combat the vacuum above it. Nature only abhorred a vacuum as far as 30 inches. It would take an anti-Descartes to explain why. In 1623, Descartes was twenty-seven, and Blaise Pascal, who would become Descartes’s opponent, was zero years old. Pascal’s father, Étienne, was an accomplished scientist and mathematician; the young Blaise was a genius equal to his father. As a young man, Blaise invented a mechanical calculating machine, named the Pascaline, which is similar to some of the mechanical calculators that engineers used before the invention of the electronic calculator.


pages: 262 words: 65,959

The Simpsons and Their Mathematical Secrets by Simon Singh

Albert Einstein, Andrew Wiles, Apollo 13, Benoit Mandelbrot, Bletchley Park, cognitive dissonance, Donald Knuth, Erdős number, Georg Cantor, Grace Hopper, Higgs boson, Isaac Newton, John Nash: game theory, Kickstarter, mandelbrot fractal, Menlo Park, Norbert Wiener, Norman Mailer, P = NP, Paul Erdős, probability theory / Blaise Pascal / Pierre de Fermat, quantum cryptography, Richard Feynman, Rubik’s Cube, Schrödinger's Cat, Simon Singh, Stephen Hawking, Wolfskehl Prize, women in the workforce

“The Saga of Carl” opens with Marge dragging her family away from the television and taking them on an educational trip to the Hall of Probability at Springfield’s Science Museum. There, they watch a video introduced by an actor playing the role of Blaise Pascal (1623–62), the father of probability theory, and they also see an experimental demonstration of probability theory known as the Galton board. This involves marbles rolling down a slope and ricocheting off a series of pins. At each pin, the marbles bounce randomly to the left or right, only to hit the next row of pins and be met by the same random opportunity.
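
The left-or-right bounces are just repeated coin flips, so a few lines of code reproduce the bell-shaped pile the museum exhibit demonstrates. This is a minimal sketch with made-up parameters (12 rows of pins, 10,000 marbles), not anything taken from the episode.

    import random
    from collections import Counter

    # Minimal Galton-board sketch: each marble takes 12 independent left/right
    # bounces, so its final slot is the number of "rights" -- a binomial count
    # that piles up near the middle slots.

    ROWS, MARBLES = 12, 10_000

    def drop_marble():
        return sum(random.choice((0, 1)) for _ in range(ROWS))  # slot 0..ROWS

    counts = Counter(drop_marble() for _ in range(MARBLES))
    for slot in range(ROWS + 1):
        print(f"{slot:2d} {'#' * (counts[slot] // 50)}")   # crude text histogram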

The second line on Homer’s blackboard is perhaps the most interesting, as it contains the following equation: 3,987¹² + 4,365¹² = 4,472¹². The equation appears to be innocuous at first sight, unless you know something about the history of mathematics, in which case you are about to smash up your slide rule in disgust. For Homer seems to have achieved the impossible and found a solution to the notorious mystery of Fermat’s last theorem! Pierre de Fermat first proposed this theorem in about 1637. Despite being an amateur who only solved problems in his spare time, Fermat was one of the greatest mathematicians in history. Working in isolation at his home in southern France, his only mathematical companion was a book called Arithmetica, written by Diophantus of Alexandria in the third century A.D.
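
Exact integer arithmetic shows why the blackboard line is only a tease; the check below is an illustrative sketch, not anything from the show or the book.

    # Homer's "solution" fails once every digit is kept, as Fermat's Last
    # Theorem demands for any exponent greater than 2 -- it is a near-miss
    # that only looks right at calculator precision.
    lhs = 3987**12 + 4365**12
    rhs = 4472**12
    print(lhs == rhs)   # False
    print(lhs / rhs)    # about 1.0000000000: the sides agree to roughly ten decimal places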

He began by stating that there are no whole number solutions for any of the infinite number of equations above, and then he confidently added this second sentence: “Cuius rei demonstrationem mirabilem sane detexi, hanc marginis exiguitas non caperet.” (I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.) Pierre de Fermat had found a proof, but he did not bother to write it down. This is perhaps the most frustrating note in the history of mathematics, particularly as Fermat took his secret to the grave. Fermat’s son Clément-Samuel later found his father’s copy of Arithmetica and noticed this intriguing marginal note.


pages: 442 words: 94,734

The Art of Statistics: Learning From Data by David Spiegelhalter

Abraham Wald, algorithmic bias, Anthropocene, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Brexit referendum, Carmen Reinhart, Charles Babbage, complexity theory, computer vision, confounding variable, correlation coefficient, correlation does not imply causation, dark matter, data science, deep learning, DeepMind, Edmond Halley, Estimating the Reproducibility of Psychological Science, government statistician, Gregor Mendel, Hans Rosling, Higgs boson, Kenneth Rogoff, meta-analysis, Nate Silver, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, seminal paper, sparse data, speech recognition, statistical model, sugar pill, systematic bias, TED Talk, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Two Sigma

People have always counted and measured, but modern statistics as a discipline really began in the 1650s when, as we shall see in Chapter 8, probability was properly understood for the first time by Blaise Pascal and Pierre de Fermat. Given this solid mathematical basis for dealing with variability, progress was then remarkably rapid. When combined with data on the ages at which people die, the theory of probability provided a firm basis for calculating pensions and annuities. Astronomy was revolutionized when scientists grasped how probability theory could handle variability in measurements. Victorian enthusiasts became obsessed with collecting data about the human body (and everything else), and established a strong connection between statistical analysis and genetics, biology and medicine.

This went against his (erroneous) attempts at calculating the chances of winning,fn1 and so he turned to the fashionable Mersenne Salon in Paris for help. Fortunately the philosopher Blaise Pascal was also a member of the Salon, and Pascal in turn wrote to his friend Pierre de Fermat (the one with the famous Last Theorem) about the problems presented by the Chevalier. Together they developed the first steps in probability theory. Despite the fact that for millennia humans have gambled on the way bits of bone or dice would turn up when thrown, the formal theory of probability is a comparatively recent idea.

Traditionally a statistics course would start with probability – that is how I have always begun when teaching in Cambridge – but this rather mathematical initiation can be an obstruction to grasping all the important ideas in the preceding chapters that did not require probability theory. In contrast, this book is part of what could be called a new wave in statistics teaching, in which formal probability theory as a basis for statistical inference does not come in till much later.2 We have seen that computer simulation is a very powerful tool for both exploring possible future events and bootstrapping historical data, but it is a rather clumsy and brute-force way of carrying out statistical analysis. So although we have got a long way while avoiding formal probability theory, it is time to face up to its vital role in providing ‘the language of uncertainty’.


pages: 404 words: 92,713

The Art of Statistics: How to Learn From Data by David Spiegelhalter

Abraham Wald, algorithmic bias, Antoine Gombaud: Chevalier de Méré, Bayesian statistics, Brexit referendum, Carmen Reinhart, Charles Babbage, complexity theory, computer vision, confounding variable, correlation coefficient, correlation does not imply causation, dark matter, data science, deep learning, DeepMind, Edmond Halley, Estimating the Reproducibility of Psychological Science, government statistician, Gregor Mendel, Hans Rosling, Higgs boson, Kenneth Rogoff, meta-analysis, Nate Silver, Netflix Prize, Northpointe / Correctional Offender Management Profiling for Alternative Sanctions, p-value, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, randomized controlled trial, recommendation engine, replication crisis, self-driving car, seminal paper, sparse data, speech recognition, statistical model, sugar pill, systematic bias, TED Talk, The Design of Experiments, The Signal and the Noise by Nate Silver, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Two Sigma

People have always counted and measured, but modern statistics as a discipline really began in the 1650s when, as we shall see in Chapter 8, probability was properly understood for the first time by Blaise Pascal and Pierre de Fermat. Given this solid mathematical basis for dealing with variability, progress was then remarkably rapid. When combined with data on the ages at which people die, the theory of probability provided a firm basis for calculating pensions and annuities. Astronomy was revolutionized when scientists grasped how probability theory could handle variability in measurements. Victorian enthusiasts became obsessed with collecting data about the human body (and everything else), and established a strong connection between statistical analysis and genetics, biology and medicine.

This went against his (erroneous) attempts at calculating the chances of winning,* and so he turned to the fashionable Mersenne Salon in Paris for help. Fortunately the philosopher Blaise Pascal was also a member of the Salon, and Pascal in turn wrote to his friend Pierre de Fermat (the one with the famous Last Theorem) about the problems presented by the Chevalier. Together they developed the first steps in probability theory. Despite the fact that for millennia humans have gambled on the way bits of bone or dice would turn up when thrown, the formal theory of probability is a comparatively recent idea.

Traditionally a statistics course would start with probability—that is how I have always begun when teaching in Cambridge—but this rather mathematical initiation can be an obstruction to grasping all the important ideas in the preceding chapters that did not require probability theory. In contrast, this book is part of what could be called a new wave in statistics teaching, in which formal probability theory as a basis for statistical inference does not come in till much later.2 We have seen that computer simulation is a very powerful tool for both exploring possible future events and bootstrapping historical data, but it is a rather clumsy and brute-force way of carrying out statistical analysis. So although we have got a long way while avoiding formal probability theory, it is time to face up to its vital role in providing ‘the language of uncertainty’.


pages: 218 words: 63,471

How We Got Here: A Slightly Irreverent History of Technology and Markets by Andy Kessler

Albert Einstein, Andy Kessler, animal electricity, automated trading system, bank run, Big bang: deregulation of the City of London, Black Monday: stock market crash in 1987, Bletchley Park, Bob Noyce, Bretton Woods, British Empire, buttonwood tree, Charles Babbage, Claude Shannon: information theory, Corn Laws, cotton gin, Dennis Ritchie, Douglas Engelbart, Edward Lloyd's coffeehouse, Fairchild Semiconductor, fiat currency, fixed income, floating exchange rates, flying shuttle, Fractional reserve banking, full employment, GPS: selective availability, Grace Hopper, invention of the steam engine, invention of the telephone, invisible hand, Isaac Newton, Jacquard loom, James Hargreaves, James Watt: steam engine, John von Neumann, joint-stock company, joint-stock limited liability company, Joseph-Marie Jacquard, Ken Thompson, Kickstarter, Leonard Kleinrock, Marc Andreessen, Mary Meeker, Maui Hawaii, Menlo Park, Metcalfe's law, Metcalfe’s law, military-industrial complex, Mitch Kapor, Multics, packet switching, pneumatic tube, price mechanism, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, proprietary trading, railway mania, RAND corporation, Robert Metcalfe, Silicon Valley, Small Order Execution System, South Sea Bubble, spice trade, spinning jenny, Steve Jobs, Suez canal 1869, supply-chain management, supply-chain management software, systems thinking, three-martini lunch, trade route, transatlantic slave trade, tulip mania, Turing machine, Turing test, undersea cable, UUNET, Wayback Machine, William Shockley: the traitorous eight

Sure, they had a gut feel for the likelihood of a caravan being robbed or a squall hitting a ship of Grecian urns, but they were just making it up. Our hero Blaise Pascal did more than invent the calculator for his tax collecting father, lighting a slow fuse on the computer revolution. He also proved that vacuums exist and that pressure could be measured with a tube of inverted mercury – two phenomena that James Watt needed to get his steam engine working. But Pascal’s rapidly firing mathematical mind would go to other, more near-term pursuits. He was a vicious gambler. He had to know when to hold ‘em and when to fold them, as well as calculate the odds on any given roll of the dice. Along with Pierre de Fermat, whose Last Theorem puzzled math-heads, Pascal struggled on one particular gambling puzzle: How two dice rollers would split the stakes or bets if they left before finishing the game.

Why does the U.S. have any industrial businesses left? There are too many questions to answer. So instead, I wrote this primer. Enjoy. Send me feedback, ideas and suggestions at akessler@velcap.com with HWGH in the subject. Logic and Memory I hate to admit it, but it was taxes that got it all started. In 1642, 18-year-old Blaise Pascal, the son of a French tax collector, tired of waiting for his dad to come play a game of “le catch”. Blaise’s dad was what is known as a tax farmer, sort of a 17th century version of a loan shark, threat of broken bones and all. Tax farmers advanced tax money to the government and then had a license to collect taxes, hopefully “harvesting” more than they advanced.

In 1649, King Louis XIV granted Pascal a patent for his odd device but it failed to effect much change over the next 45 years. Pascal, by the way, would contribute more than a mechanical calculator to this tale. He proved that vacuums exist; that one could measure pressure by inverting a tube of mercury; and in figuring out how others could beat the house at gambling, ended up inventing probability theory. 8 HOW WE GOT HERE In 1694, a German, Gottfried Wilhelm von Leibniz, created a box similar to Pascal’s but his could actually multiply. Leibniz used something called a stepped drum, a cylinder with a number of cogs carved into it, and gears that would engage a different number of cogs depending on their position.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, AlphaGo, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, behavioural economics, Bletchley Park, blockchain, Boston Dynamics, brain emulation, Cass Sunstein, Charles Babbage, Claude Shannon: information theory, complexity theory, computer vision, Computing Machinery and Intelligence, connected car, CRISPR, crowdsourcing, Daniel Kahneman / Amos Tversky, data science, deep learning, deepfake, DeepMind, delayed gratification, Demis Hassabis, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, fake news, Flash crash, full employment, future of work, Garrett Hardin, Geoffrey Hinton, Gerolamo Cardano, Goodhart's law, Hans Moravec, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, luminiferous ether, machine readable, machine translation, Mark Zuckerberg, multi-armed bandit, Nash equilibrium, Nick Bostrom, Norbert Wiener, NP-complete, OpenAI, openstreetmap, P = NP, paperclip maximiser, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, Recombinant DNA, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, trolley problem, Turing machine, Turing test, universal basic income, uranium enrichment, vertical integration, Von Neumann architecture, Wall-E, warehouse robotics, Watson beat the top human players on Jeopardy!, web application, zero-sum game

Mathematical Guarantees We will want, eventually, to prove theorems to the effect that a particular way of designing AI systems ensures that they will be beneficial to humans. A theorem is just a fancy name for an assertion, stated precisely enough so that its truth in any particular situation can be checked. Perhaps the most famous theorem is Fermat’s Last Theorem, which was conjectured by the French mathematician Pierre de Fermat in 1637 and finally proved by Andrew Wiles in 1994 after 357 years of effort (not all of it by Wiles).1 The theorem can be written in one line, but the proof is over one hundred pages of dense mathematics. Proofs begin from axioms, which are assertions whose truth is simply assumed. Often, the axioms are just definitions, such as the definitions of integers, addition, and exponentiation needed for Fermat’s theorem.

It turns out that gambling played a central role in generalizing Aristotle’s proposal to account for uncertainty. In the 1560s, the Italian mathematician Gerolamo Cardano developed the first mathematically precise theory of probability—using dice games as his main example. (Unfortunately, his work was not published until 1663.13) In the seventeenth century, French thinkers including Antoine Arnauld and Blaise Pascal began—for assuredly mathematical reasons—to study the question of rational decisions in gambling.14 Consider the following two bets: A: 20 percent chance of winning $10 B: 5 percent chance of winning $100 The proposal the mathematicians came up with is probably the same one you would come up with: compare the expected values of the bets, which means the average amount you would expect to get from each bet.
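
Spelled out with the numbers in the text, the comparison is a one-line calculation (a trivial sketch, nothing from the book itself):

    # Expected value = probability of winning times the amount won.
    ev_a = 0.20 * 10     # bet A: $2.00 per play on average
    ev_b = 0.05 * 100    # bet B: $5.00 per play on average
    print(ev_a, ev_b)    # a repeat player maximizing expected value prefers B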

Intelligent Computers The development of logic by Aristotle and others made available precise rules for rational thought, but we do not know whether Aristotle ever contemplated the possibility of machines that implemented these rules. In the thirteenth century, the influential Catalan philosopher, seducer, and mystic Ramon Llull came much closer: he actually made paper wheels inscribed with symbols, by means of which he could generate logical combinations of assertions. The great seventeenth-century French mathematician Blaise Pascal was the first to develop a real and practical mechanical calculator. Although it could only add and subtract and was used mainly in his father’s tax-collecting office, it led Pascal to write, “The arithmetical machine produces effects which appear nearer to thought than all the actions of animals.”


pages: 283 words: 81,376

The Doomsday Calculation: How an Equation That Predicts the Future Is Transforming Everything We Know About Life and the Universe by William Poundstone

Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, Arthur Eddington, Bayesian statistics, behavioural economics, Benoit Mandelbrot, Berlin Wall, bitcoin, Black Swan, conceptual framework, cosmic microwave background, cosmological constant, cosmological principle, CRISPR, cuban missile crisis, dark matter, DeepMind, digital map, discounted cash flows, Donald Trump, Doomsday Clock, double helix, Dr. Strangelove, Eddington experiment, Elon Musk, Geoffrey Hinton, Gerolamo Cardano, Hans Moravec, heat death of the universe, Higgs boson, if you see hoof prints, think horses—not zebras, index fund, Isaac Newton, Jaron Lanier, Jeff Bezos, John Markoff, John von Neumann, Large Hadron Collider, mandelbrot fractal, Mark Zuckerberg, Mars Rover, Neil Armstrong, Nick Bostrom, OpenAI, paperclip maximiser, Peter Thiel, Pierre-Simon Laplace, Plato's cave, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, Richard Feynman, ride hailing / ride sharing, Rodney Brooks, Ronald Reagan, Ronald Reagan: Tear down this wall, Sam Altman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, Skype, Stanislav Petrov, Stephen Hawking, strong AI, tech billionaire, Thomas Bayes, Thomas Malthus, time value of money, Turing test

Cardano’s short treatise on probability was an attempt to understand how so much money had slipped through his fingers. Gamblers already knew how cards, dice, and roulette wheels worked. They needed to know the odds: how to calculate the chance of drawing two aces, rolling a 7, or winning repeated bets on red. Cardano and his French successors Pierre de Fermat and Blaise Pascal supplied that long before Bayes’s time. Bayes took up the opposite issue: inverse probability or the probability of causes. Suppose we already know the outcomes (the hands we’ve been dealt). What can we conclude about the causes (whether the dealer is honest or a cheat)? This too is a pressing question for any serious gambler.
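
A toy version of that honest-or-cheat question makes the "inverse" direction concrete; every number below is invented for illustration and is not from Poundstone's book.

    # Bayes's rule run backwards from outcome to cause: update a prior belief
    # that the dealer cheats after observing one suspiciously bad hand.
    prior_cheat = 0.05        # assumed: 1 dealer in 20 cheats
    p_bad_if_cheat = 0.50     # assumed: chance of a bad hand from a cheat
    p_bad_if_honest = 0.20    # assumed: chance of a bad hand from an honest deal

    evidence = prior_cheat * p_bad_if_cheat + (1 - prior_cheat) * p_bad_if_honest
    posterior_cheat = prior_cheat * p_bad_if_cheat / evidence
    print(round(posterior_cheat, 3))   # ~0.116: one bad hand raises suspicion only a little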

Pressed to characterize Musk’s position as “hysterical” or “valid,” Zuckerberg picked the former. Musk tweeted in response: “I’ve talked to Mark about this. His understanding of the subject is limited.” The AI safety debate has become the Pascal’s Wager of a secular industry. In the seventeenth century Blaise Pascal decided that he should believe in God, even though he had serious doubts, because the stakes are so high. Why miss out on heaven, or get sent to hell, just to be right about atheism? In its general form, Pascal’s Wager is a classic problem of decision theory. Should a rational person be willing to pay a small cost in order to avoid a chance of a great loss?

“Time Without End: Physics and Biology in an Open Universe.” Reviews of Modern Physics 51 (1979): 447–460. Dyson, Lisa, Matthew Kleban, and Leonard Susskind. “Disturbing Implications of a Cosmological Constant.” 2002. arXiv:hep-th/0208013. Eckhardt, William. Paradoxes in Probability Theory. Dordrecht: Springer, 2013. . “Probability Theory and the Doomsday Argument.” Mind 102 (1993): 483–488. . “A Shooting-Room View of Doomsday.” Journal of Philosophy 94 (1997): 244–259. Eddington, Arthur. The Nature of the Physical World. Cambridge: Cambridge University Press, 1928. . The Philosophy of Physical Science.


pages: 360 words: 85,321

The Perfect Bet: How Science and Math Are Taking the Luck Out of Gambling by Adam Kucharski

Ada Lovelace, Albert Einstein, Antoine Gombaud: Chevalier de Méré, beat the dealer, behavioural economics, Benoit Mandelbrot, Bletchley Park, butterfly effect, call centre, Chance favours the prepared mind, Claude Shannon: information theory, collateralized debt obligation, Computing Machinery and Intelligence, correlation does not imply causation, diversification, Edward Lorenz: Chaos theory, Edward Thorp, Everything should be made as simple as possible, Flash crash, Gerolamo Cardano, Henri Poincaré, Hibernia Atlantic: Project Express, if you build it, they will come, invention of the telegraph, Isaac Newton, Johannes Kepler, John Nash: game theory, John von Neumann, locking in a profit, Louis Pasteur, Nash equilibrium, Norbert Wiener, p-value, performance metric, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Richard Feynman, Ronald Reagan, Rubik’s Cube, statistical model, The Design of Experiments, Watson beat the top human players on Jeopardy!, zero-sum game

Which is more likely: throwing a single six in four rolls of a single die, or throwing double sixes in twenty-four rolls of two dice? Gombaud believed the two events would occur equally often but could not prove it. He wrote to his mathematician friend Blaise Pascal, asking if this was indeed the case. To tackle the dice problem, Pascal enlisted the help of Pierre de Fermat, a wealthy lawyer and fellow mathematician. Together, they built on Cardano’s earlier work on randomness, gradually pinning down the basic laws of probability. Many of the new concepts would become central to mathematical theory. Among other things, Pascal and Fermat defined the “expected value” of a game, which measured how profitable it would be on average if played repeatedly.

Other options, such as predicting which half of the table the ball lands in, were less sensitive to initial conditions. It would therefore take a lot of spins before the result becomes as good as random. Fortunately for gamblers, a roulette ball does not spin for an extremely long period of time (although there is an oft-repeated myth that mathematician Blaise Pascal invented roulette while trying to build a perpetual motion machine). As a result, gamblers can—in theory—avoid falling into Poincaré’s second degree of ignorance by measuring the initial path of the roulette ball. They just need to work out what measurements to take. THE RITZ WASN’T THE first time a story of roulette-tracking technology emerged.

Their research showed that Gombaud had been wrong: he was more likely to get a six in four rolls of one die than double sixes in twenty-four rolls of two dice. Still, thanks to Gombaud’s gambling puzzle, mathematics had gained an entirely new set of ideas. According to mathematician Richard Epstein, “Gamblers can rightly claim to be the godfathers of probability theory.” As well as helping researchers understand how much a bet is worth in purely mathematical terms, wagers have also revealed how we value decisions in real life. During the eighteenth century, Daniel Bernoulli wondered why people would often prefer low-risk bets to ones that were, in theory, more profitable.
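
The margin is small, which is why it took Pascal and Fermat to settle it; a quick exact calculation (a sketch using Python fractions, not a line from the book) confirms the two probabilities.

    from fractions import Fraction

    # Chance of at least one six in four rolls of one die,
    # versus at least one double-six in twenty-four rolls of two dice.
    p_single = 1 - Fraction(5, 6) ** 4       # 671/1296, about 0.5177
    p_double = 1 - Fraction(35, 36) ** 24    # about 0.4914
    print(float(p_single), float(p_double))  # the first bet wins at even odds, the second loses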


pages: 315 words: 93,628

Is God a Mathematician? by Mario Livio

Albert Einstein, Alvin Toffler, Antoine Gombaud: Chevalier de Méré, Brownian motion, cellular automata, correlation coefficient, correlation does not imply causation, cosmological constant, Dava Sobel, double helix, Edmond Halley, Eratosthenes, Future Shock, Georg Cantor, Gerolamo Cardano, Gregor Mendel, Gödel, Escher, Bach, Henri Poincaré, Isaac Newton, Johannes Kepler, John von Neumann, music of the spheres, Myron Scholes, Plato's cave, probability theory / Blaise Pascal / Pierre de Fermat, Russell's paradox, seminal paper, Thales of Miletus, The Design of Experiments, the scientific method, traveling salesman

In particular, in the middle of the seventeenth century, a French nobleman—the Chevalier de Méré—who was also a reputed gamester, addressed a series of questions about gambling to the famous French mathematician and philosopher Blaise Pascal (1623–62). The latter conducted in 1654 an extensive correspondence about these questions with the other great French mathematician of the time, Pierre de Fermat (1601–65). The theory of probability was essentially born in this correspondence. Let’s examine one of the fascinating examples discussed by Pascal in a letter dated July 29, 1654. Imagine two noblemen engaged in a game involving the roll of a single die.

Because of his relatively fragile health, Descartes was excused from having to get up at the brutal hour of five a.m., and he was allowed to spend the morning hours in bed. Later in life, he continued to use the early part of the day for contemplation, and he once told the French mathematician Blaise Pascal that the only way for him to stay healthy and be productive was to never get up before he felt comfortable doing so. As we shall soon see, this statement turned out to be tragically prophetic. After La Flèche, Descartes graduated from the University of Poitiers as a lawyer, but he never actually practiced law.

The serious study of probability: Recently published, entertaining popular accounts of probability, its history, and its uses include Aczel 2004, Kaplan and Kaplan 2006, Connor 2006, Burger and Starbird 2005, and Tabak 2004. in a letter dated July 29, 1654: Todhunter 1865, Hald 1990. The essence of probability theory: An excellent, popular, brief description of some of the essential principles of probability theory can be found in Kline 1967. Probability theory provides us with accurate information: The relevance of probability theory to many real-life situations is beautifully described in Rosenthal 2006. The person who brought probability: For an excellent biography, see Orel 1996. Mendel published his paper: Mendel 1865.


pages: 231 words: 64,734

Safe Haven: Investing for Financial Storms by Mark Spitznagel

Albert Einstein, Antoine Gombaud: Chevalier de Méré, asset allocation, behavioural economics, bitcoin, Black Swan, blockchain, book value, Brownian motion, Buckminster Fuller, cognitive dissonance, commodity trading advisor, cryptocurrency, Daniel Kahneman / Amos Tversky, data science, delayed gratification, diversification, diversified portfolio, Edward Thorp, fiat currency, financial engineering, Fractional reserve banking, global macro, Henri Poincaré, hindsight bias, Long Term Capital Management, Mark Spitznagel, Paul Samuelson, phenotype, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, rent-seeking, Richard Feynman, risk free rate, risk-adjusted returns, Schrödinger's Cat, Sharpe ratio, spice trade, Steve Jobs, tail risk, the scientific method, transaction costs, value at risk, yield curve, zero-sum game

Julius Caesar's famous line “the die has been cast” was not a statement about probability. (For all the wisdom that we ascribe to the ancients, if you had the ability to go back in time, you would totally clean up gambling against them.) It wasn't until much later, by the seventeenth century, that Galileo and then Blaise Pascal and Pierre de Fermat became gambling advisor‐mercenaries to various noblemen. For instance, the Chevalier de Méré needed advice on his costly observation that betting with even odds on the appearance of a 6 in four rolls of a single die was profitable in the long run, whereas betting with the same odds on a double‐6 in 24 rolls of two dice was not.

He finds ways to furtively inflict his musical tastes on his coworkers (Mahler, mainly, with performances by von Karajan) and in the early days, as in a ritual, the conversations used to start and end with Karl Popper and central (Black Swan) asymmetries in the scientific method. There is this insistence that we are not in the business of trading, but partaking of an intellectual enterprise, that is, both applying proper inference and probability theory to the business world and, without any modesty, improving these fields according to feedback from markets. And there is all this German terminology, such as Gedankenexperiment. I suspect that there was a nonrandom geography of origin for the authors and topics that have invaded the office: prewar Vienna and its Weltanschauung.

On the other hand the neither‐this‐nor‐that MBA in finance with year‐end evaluation filed by the personnel department needs a helping hand—they can neither connect to the intuitions nor to the mathematics. At the time when I met Mark, we both were at the intersection of pit trading and novel branches of probability theory (such as Extreme Value Theory), an intersection that at the time (and still, presently) included no more than two persons. MUTUA MULI Now what was the dominant idea to emerge? There are activities with remote payoff and no feedback that are ignored by the common crowd. With the associated corollary: Never underestimate the effect of absence of feedback on the unconscious behavior and choices of people.


pages: 199 words: 64,272

Money: The True Story of a Made-Up Thing by Jacob Goldstein

Alan Greenspan, Antoine Gombaud: Chevalier de Méré, back-to-the-land, bank run, banks create money, Bear Stearns, Berlin Wall, Bernie Sanders, bitcoin, blockchain, break the buck, card file, central bank independence, collective bargaining, coronavirus, COVID-19, cryptocurrency, David Graeber, Edmond Halley, Fall of the Berlin Wall, fiat currency, financial innovation, Fractional reserve banking, full employment, German hyperinflation, Glass-Steagall Act, index card, invention of movable type, invention of writing, Isaac Newton, life extension, M-Pesa, Marc Andreessen, Martin Wolf, Menlo Park, Mikhail Gorbachev, mobile money, Modern Monetary Theory, money market fund, probability theory / Blaise Pascal / Pierre de Fermat, Ronald Reagan, Ross Ulbricht, Satoshi Nakamoto, Second Machine Age, side hustle, Silicon Valley, software is eating the world, Steven Levy, the new new thing, the scientific method, The Wealth of Nations by Adam Smith, too big to fail, transaction costs

It can be any game of chance—dice rolls, coin flips, whatever. The players start the game, but they have to stop before they’ve completed the agreed number of rounds. What’s a fair system for splitting the pot based on the score when the game is stopped? These questions inspired Pascal to write to Pierre de Fermat, a lawyer who moonlighted as a math genius. They sent letters back and forth for a few months, working on the problems. The problem of rolling double sixes was easy. The problem of the points took a little longer—and the solution Pascal and Fermat worked out had a more profound impact on the history of money, and of human thought.
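
Their solution amounts to splitting the pot in proportion to each player's chance of going on to win, which can be written as a short recursion; the sketch below assumes a fair 50/50 round and uses names of my own choosing rather than anything from the book.

    from functools import lru_cache

    # Problem of the points: split the pot according to each player's probability
    # of winning the match from the current score. Player A needs `a` more rounds,
    # player B needs `b` more, and A wins each round with probability p.

    @lru_cache(maxsize=None)
    def prob_a_wins(a, b, p=0.5):
        if a == 0:
            return 1.0      # A has already won
        if b == 0:
            return 0.0      # B has already won
        return p * prob_a_wins(a - 1, b, p) + (1 - p) * prob_a_wins(a, b - 1, p)

    # First to 3 points, interrupted at 2-1 in A's favour: A needs 1 more, B needs 2.
    print(prob_a_wins(1, 2))   # 0.75 -> A takes three quarters of the pot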

Now, finally, gamblers started to calculate exactly how likely they were to win or lose. At a time when most people still thought of outcomes as luck or divine providence, doing the math at the gambling table was like having a superpower. One of the most important of the gambler-mathematicians was the weirdo genius Blaise Pascal. As a teenager, he had written a treatise on geometry that was good enough to impress Descartes (who was in the midst of inventing a branch of modern geometry). He invented a mechanical calculator, which he named after himself (the Pascaline), which never took off, possibly because it was too expensive to produce.

It doesn’t seem like he cheated. Law won because he discovered an intellectual discipline—a way of looking at the world—that was emerging during his lifetime and that would eventually shape the way millions of people thought about God, money, death, and the unknown future. The discipline is probability theory. It’s the basis of much of modern finance and, for that matter, much of modern thought. It was invented by gamblers. People had been gambling forever; four-sided knucklebones used as dice have been found in ancient archaeological sites around the world. But, almost unbelievably to our modern sensibility, gamblers had never really done the math.


pages: 665 words: 159,350

Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else by Jordan Ellenberg

Albert Einstein, AlphaGo, Andrew Wiles, autonomous vehicles, British Empire, Brownian motion, Charles Babbage, Claude Shannon: information theory, computer age, coronavirus, COVID-19, deep learning, DeepMind, Donald Knuth, Donald Trump, double entry bookkeeping, East Village, Edmond Halley, Edward Jenner, Elliott wave, Erdős number, facts on the ground, Fellow of the Royal Society, Geoffrey Hinton, germ theory of disease, global pandemic, government statistician, GPT-3, greed is good, Henri Poincaré, index card, index fund, Isaac Newton, Johannes Kepler, John Conway, John Nash: game theory, John Snow's cholera map, Louis Bachelier, machine translation, Mercator projection, Mercator projection distort size, especially Greenland and Africa, Milgram experiment, multi-armed bandit, Nate Silver, OpenAI, Paul Erdős, pets.com, pez dispenser, probability theory / Blaise Pascal / Pierre de Fermat, Ralph Nelson Elliott, random walk, Rubik’s Cube, self-driving car, side hustle, Snapchat, social distancing, social graph, transcontinental railway, urban renewal

But the next time the dove went out on a random flight, it came back with an olive leaf in its beak, and by this Noah was able to infer that the ark was close to the water’s edge.* And random walks have appeared in the study of games for centuries, especially in games of chance, where the walk through the tree is always random, at least in part. Pierre de Fermat, when he wasn’t writing letters about prime numbers, was corresponding with the mathematician and mystic Blaise Pascal about the problem of the gambler’s ruin. In this game, Akbar and Jeff go head to head at dice, each starting with a stake of twelve coins and rolling three dice at each turn. Every time Akbar rolls an 11, he gets one of Jeff’s coins; every time Jeff rolls a 14, he takes one from Akbar.
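
The excerpt breaks off before giving the stopping rule, so the simulation below assumes the usual one (play until one player holds all twenty-four coins); it is a Monte Carlo sketch of the random walk, not Fermat's or Pascal's own calculation.

    import random

    # Gambler's ruin as a biased random walk on Akbar's coin count:
    # +1 when the three dice total 11 (27 of 216 outcomes),
    # -1 when they total 14 (15 of 216 outcomes), no change otherwise.

    def one_game():
        akbar = 12
        while 0 < akbar < 24:
            total = sum(random.randint(1, 6) for _ in range(3))
            if total == 11:
                akbar += 1
            elif total == 14:
                akbar -= 1
        return akbar == 24          # True if Akbar ends up with everything

    games = 2_000
    wins = sum(one_game() for _ in range(games))
    print(wins / games)             # well above 1/2: 11 is an easier total than 14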

We thought we were screwing around with spinning bracelets, but in fact we were using the geometry of the circle and its rotations to prove a fact about prime numbers, which on the face of it you’d never have thought geometric at all. Geometry is hiding everywhere, deep in the gears of things. Our observation about primes isn’t just a fact, it’s a fact with a name: it’s called Fermat’s Little Theorem, after Pierre de Fermat, the first person to write it down.* No matter which prime number n you take, however large it may be, 2 raised to the nth power is 2 more than a multiple of n. Fermat wasn’t a professional mathematician (in seventeenth-century France there were hardly any such people) but a provincial lawyer, a comfortable member of the Toulouse bourgeoisie.
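
The statement is easy to spot-check by machine; the snippet below is an illustrative sketch (Python's three-argument pow does the modular arithmetic without building the enormous power).

    # Fermat's Little Theorem, base 2: for any prime n, 2**n leaves remainder 2
    # when divided by n (remainder 0 in the special case n = 2).
    for n in [2, 3, 5, 7, 11, 101, 104_729]:
        assert pow(2, n, n) == 2 % n

    # The converse does not hold: 341 = 11 * 31 is composite, yet it passes.
    print(pow(2, 341, 341))   # 2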

He first stated the Little Theorem in a 1640 letter to Bernard Frénicle de Bessy, with whom Fermat was engaged in a vigorous exchange on the subject of perfect numbers.* Fermat stated the theorem but didn’t set down a proof; he had one, he told Frénicle, which he would definitely have included in the letter “if he did not fear being too long.” This move is classic Pierre de Fermat. If you’ve heard his name at all, it’s not because of Fermat’s Little Theorem, but the other FLT, Fermat’s Last Theorem, which was neither his theorem nor the last thing he did; it was a conjecture about numbers Fermat jotted down in the margin of his copy of Diophantus’s Arithmetic, sometime in the 1630s.


pages: 893 words: 199,542

Structure and interpretation of computer programs by Harold Abelson, Gerald Jay Sussman, Julie Sussman

Andrew Wiles, conceptual framework, Donald Knuth, Douglas Hofstadter, Eratosthenes, Fermat's Last Theorem, functional programming, Gödel, Escher, Bach, higher-order functions, industrial robot, information retrieval, iterative process, Ivan Sutherland, Johannes Kepler, loose coupling, machine translation, Multics, probability theory / Blaise Pascal / Pierre de Fermat, Richard Stallman, Turing machine

Tabulation can sometimes be used to transform processes that require an exponential number of steps (such as count-change) into processes whose space and time requirements grow linearly with the input. See exercise 3.27. 35 The elements of Pascal's triangle are called the binomial coefficients, because the nth row consists of the coefficients of the terms in the expansion of (x + y)n. This pattern for computing the coefficients appeared in Blaise Pascal's 1653 seminal work on probability theory, Traité du triangle arithmétique. According to Knuth (1973), the same pattern appears in the Szu-yuen Yü-chien (“The Precious Mirror of the Four Elements”), published by the Chinese mathematician Chu Shih-chieh in 1303, in the works of the twelfth-century Persian poet and mathematician Omar Khayyam, and in the works of the twelfth-century Hindu mathematician Bháscara Áchárya. 36 These statements mask a great deal of oversimplification.
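
The additive rule behind the triangle takes only a few lines to state; this is a small Python sketch (the book's own code is Scheme), generating each row from the one above.

    # Each interior entry is the sum of the two entries above it; row n holds
    # the binomial coefficients of (x + y)**n.

    def pascal_rows(n_rows):
        row = [1]
        for _ in range(n_rows):
            yield row
            row = [1] + [a + b for a, b in zip(row, row[1:])] + [1]

    for row in pascal_rows(6):
        print(row)
    # [1], [1, 1], [1, 2, 1], [1, 3, 3, 1], [1, 4, 6, 4, 1], [1, 5, 10, 10, 5, 1]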

By our induction hypotheses, we have b_{k-1} ≥ Fib(k - 1) and b_k ≥ Fib(k). Thus, applying the claim we just proved together with the definition of the Fibonacci numbers gives b_{k+1} ≥ b_k + b_{k-1} ≥ Fib(k) + Fib(k - 1) = Fib(k + 1), which completes the proof of Lamé's Theorem. 44 If d is a divisor of n, then so is n/d. But d and n/d cannot both be greater than √n. 45 Pierre de Fermat (1601-1665) is considered to be the founder of modern number theory. He obtained many important number-theoretic results, but he usually announced just the results, without providing his proofs. Fermat's Little Theorem was stated in a letter he wrote in 1640. The first published proof was given by Euler in 1736 (and an earlier, identical proof was discovered in the unpublished manuscripts of Leibniz).

evening star, see Venus event-driven simulation evlis tail recursion exact integer exchange exclamation point in names execute execute-application metacircular nondeterministic execution procedure in analyzing evaluator in nondeterministic evaluator, [2], [3] in register-machine simulator, [2] exp register expand-clauses explicit-control evaluator for Scheme assignments combinations compound procedures conditionals controller data paths definitions derived expressions driver loop error handling, [2] expressions with no subexpressions to evaluate as machine-language program machine model modified for compiled code monitoring performance (stack use) normal-order evaluation operand evaluation operations optimizations (additional) primitive procedures procedure application registers running sequences of expressions special forms (additional), [2] stack usage tail recursion, [2], [3] as universal machine expmod, [2], [3] exponential growth of tree-recursive Fibonacci-number computation exponentiation modulo n expression, see also compound expression; primitive expression algebraic, see algebraic expressions self-evaluating symbolic, see also symbol(s) expression-oriented vs. imperative programming style expt linear iterative version linear recursive version register machine for extend-environment, [2] extend-if-consistent extend-if-possible external-entry extract-labels, [2] #f factorial, see also factorial infinite stream with letrec without letrec or define factorial as an abstract machine compilation of, [2] environment structure in evaluating linear iterative version linear recursive version register machine for (iterative), [2] register machine for (recursive), [2] stack usage, compiled stack usage, interpreted, [2] stack usage, register machine with assignment with higher-order procedures failure continuation (nondeterministic evaluator), [2] constructed by amb constructed by assignment constructed by driver loop failure, in nondeterministic computation bug vs. searching and false false false? fast-expt fast-prime? feedback loop, modeled with streams Feeley, Marc Feigenbaum, Edward Fenichel, Robert Fermat, Pierre de Fermat test for primality variant of Fermat's Little Theorem alternate form proof fermat-test fetch-assertions fetch-rules fib linear iterative version logarithmic version register machine for (tree-recursive), [2] stack usage, compiled stack usage, interpreted tree-recursive version, [2] with memoization with named let Fibonacci numbers, see also fib Euclid's GCD algorithm and infinite stream of, see fibs fibs (infinite stream) implicit definition FIFO buffer filter, [2] filter filtered-accumulate find-assertions find-divisor first-agenda-item, [2] first-class elements in language first-exp first-frame first-operand first-segment first-term, [2] fixed point computing with calculator of cosine cube root as fourth root as golden ratio as as iterative improvement in Newton's method nth root as square root as, [2], [3] of transformed function unification and fixed-length code fixed-point as iterative improvement fixed-point-of-transform flag register flatmap flatten-stream flip-horiz, [2] flip-vert, [2] flipped-pairs, [2], [3] Floyd, Robert fold-left fold-right for-each, [2] for-each-except Forbus, Kenneth D.


The Book of Why: The New Science of Cause and Effect by Judea Pearl, Dana Mackenzie

affirmative action, Albert Einstein, AlphaGo, Asilomar, Bayesian statistics, computer age, computer vision, Computing Machinery and Intelligence, confounding variable, correlation coefficient, correlation does not imply causation, Daniel Kahneman / Amos Tversky, data science, deep learning, DeepMind, driverless car, Edmond Halley, Elon Musk, en.wikipedia.org, experimental subject, Great Leap Forward, Gregor Mendel, Isaac Newton, iterative process, John Snow's cholera map, Loebner Prize, loose coupling, Louis Pasteur, Menlo Park, Monty Hall problem, pattern recognition, Paul Erdős, personalized medicine, Pierre-Simon Laplace, placebo effect, Plato's cave, prisoner's dilemma, probability theory / Blaise Pascal / Pierre de Fermat, randomized controlled trial, Recombinant DNA, selection bias, self-driving car, seminal paper, Silicon Valley, speech recognition, statistical model, Stephen Hawking, Steve Jobs, strong AI, The Design of Experiments, the scientific method, Thomas Bayes, Turing test

For example, until about four hundred years ago, people were quite happy with their natural ability to manage the uncertainties in daily life, from crossing a street to risking a fistfight. Only after gamblers invented intricate games of chance, sometimes carefully designed to trick us into making bad choices, did mathematicians like Blaise Pascal (1654), Pierre de Fermat (1654), and Christiaan Huygens (1657) find it necessary to develop what we today call probability theory. Likewise, only when insurance organizations demanded accurate estimates of life annuity did mathematicians like Edmond Halley (1693) and Abraham de Moivre (1725) begin looking at mortality tables to calculate life expectancies. Similarly, astronomers’ demands for accurate predictions of celestial motion led Jacob Bernoulli, Pierre-Simon Laplace, and Carl Friedrich Gauss to develop a theory of errors to help us extract signals from noise.

Armed with this judgment, Frost ends the poem pleased with his choice, realizing that it “made all the difference.” (Source: Drawing by Maayan Harel.) 8 COUNTERFACTUALS: MINING WORLDS THAT COULD HAVE BEEN Had Cleopatra’s nose been shorter, the whole face of the world would have changed. —BLAISE PASCAL (1669) AS we prepare to move up to the top rung of the Ladder of Causation, let’s recapitulate what we have learned from the second rung. We have seen several ways to ascertain the effect of an intervention in various settings and under a variety of conditions. In Chapter 4, we discussed randomized controlled trials, the widely cited “gold standard” for medical trials.

With nothing to lean on except his internal conviction that path coefficients tell you what correlations do not, he still declared, “And yet it moves!” Colleagues tell me that when Bayesian networks fought against the artificial intelligence establishment (see Chapter 3), I acted stubbornly, single-mindedly, and uncompromisingly. Indeed, I recall being totally convinced of my approach, with not an iota of hesitation. But I had probability theory on my side. Wright didn’t have even one theorem to lean on. Scientists had abandoned causation, so Wright could not fall back on any theoretical framework. Nor could he rely on authorities, as Niles did, because there was no one for him to quote; the gurus had already pronounced their verdicts three decades earlier.


pages: 322 words: 88,197

Wonderland: How Play Made the Modern World by Steven Johnson

"hyperreality Baudrillard"~20 OR "Baudrillard hyperreality", Ada Lovelace, adjacent possible, Alfred Russel Wallace, Antoine Gombaud: Chevalier de Méré, Berlin Wall, bitcoin, Book of Ingenious Devices, Buckminster Fuller, Charles Babbage, Claude Shannon: information theory, Clayton Christensen, colonial exploitation, computer age, Computing Machinery and Intelligence, conceptual framework, cotton gin, crowdsourcing, cuban missile crisis, Drosophila, Edward Thorp, Fellow of the Royal Society, flying shuttle, game design, global village, Great Leap Forward, Hedy Lamarr / George Antheil, HyperCard, invention of air conditioning, invention of the printing press, invention of the telegraph, Islamic Golden Age, Jacquard loom, Jacques de Vaucanson, James Watt: steam engine, Jane Jacobs, John von Neumann, joint-stock company, Joseph-Marie Jacquard, land value tax, Landlord’s Game, Lewis Mumford, lone genius, mass immigration, megacity, Minecraft, moral panic, Murano, Venice glass, music of the spheres, Necker cube, New Urbanism, Oculus Rift, On the Economy of Machinery and Manufactures, pattern recognition, peer-to-peer, pets.com, placebo effect, pneumatic tube, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, QWERTY keyboard, Ray Oldenburg, SimCity, spice trade, spinning jenny, statistical model, Steve Jobs, Steven Pinker, Stewart Brand, supply-chain management, talking drums, the built environment, The Great Good Place, the scientific method, The Structural Transformation of the Public Sphere, trade route, Turing machine, Turing test, Upton Sinclair, urban planning, vertical integration, Victor Gruen, Watson beat the top human players on Jeopardy!, white flight, white picket fence, Whole Earth Catalog, working poor, Wunderkammern

Written in 1564, Cardano’s book wasn’t published for another century. By the time his ideas got into wider circulation, an even more important breakthrough had emerged out of a famous correspondence between Blaise Pascal and Pierre de Fermat in 1654. This, too, was prompted by a compulsive gambler, the French aristocrat Antoine Gombaud, who had written Pascal for advice about the most equitable way to predict the outcome of a dice game that had been interrupted. Their exchange put probability theory on a solid footing and created the platform for the modern science of statistics. Within a few years, Edward Halley (of comet legend) was using these new tools to calculate mortality rates for the average Englishman, and the Dutch scientist Christiaan Huygens and his brother Lodewijk had set about to answer “the question . . . to what age a newly conceived child will naturally live.”
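The interrupted dice game that de Méré asked about is usually identified with the "problem of points": how to divide the stakes fairly when play stops before either side has won. A minimal recursive sketch of that calculation follows, assuming each unplayed round is a fair 50/50 toss; the function name and the example scores are illustrative, not details from the actual correspondence.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def p_first_player_wins(a_needs, b_needs):
        # Chance that the first player takes the match when she still needs
        # a_needs rounds and her opponent needs b_needs, each round a fair toss.
        if a_needs == 0:
            return 1.0
        if b_needs == 0:
            return 0.0
        return 0.5 * (p_first_player_wins(a_needs - 1, b_needs)
                      + p_first_player_wins(a_needs, b_needs - 1))

    # Example: interrupted when one player needs 2 more wins and the other needs 3.
    share = p_first_player_wins(2, 3)
    print(share)        # 0.6875, i.e. 11/16 of the stakes to the leader
    print(1 - share)    # 0.3125, i.e. 5/16 to the other player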

Lodewijk even went so far as to calculate that his brother, then aged forty, was likely to live for another sixteen years. (He beat the odds and lived a decade beyond that, as it turns out.) It was the first time anyone had begun talking, mathematically at least, about what we now call life expectancy. Probability theory served as a kind of conceptual fossil fuel for the modern world. It gave rise to the modern insurance industry, which for the first time could calculate with some predictive power the claims it could expect when insuring individuals or industries. Capital markets—for good and for bad—rely extensively on elaborate statistical models that predict future risk.

That regularity may have foiled the swindlers in the short term, but it had a much more profound effect that had never occurred to dice-making guilds: it made the patterns of the dice games visible, which enabled Cardano, Pascal, and Fermat to begin to think systematically about probability. Ironically, making the object of the die itself more uniform ultimately enabled people like Huygens and Halley to analyze the decidedly nonuniform experience of human mortality using the new tools of probability theory. No longer mere playthings, the dice had become, against all odds, tools for thinking. — The logic of games is ethereal. We have no idea how most ancient games were played, either because written rule books did not survive to modern times, or because the rules themselves evolved and then died out before the game’s players adopted the technology of writing.


pages: 807 words: 154,435

Radical Uncertainty: Decision-Making for an Unknowable Future by Mervyn King, John Kay

Airbus A320, Alan Greenspan, Albert Einstein, Albert Michelson, algorithmic trading, anti-fragile, Antoine Gombaud: Chevalier de Méré, Arthur Eddington, autonomous vehicles, availability heuristic, banking crisis, Barry Marshall: ulcers, battle of ideas, Bear Stearns, behavioural economics, Benoit Mandelbrot, bitcoin, Black Swan, Boeing 737 MAX, Bonfire of the Vanities, Brexit referendum, Brownian motion, business cycle, business process, capital asset pricing model, central bank independence, collapse of Lehman Brothers, correlation does not imply causation, credit crunch, cryptocurrency, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Ricardo: comparative advantage, DeepMind, demographic transition, discounted cash flows, disruptive innovation, diversification, diversified portfolio, Donald Trump, Dutch auction, easy for humans, difficult for computers, eat what you kill, Eddington experiment, Edmond Halley, Edward Lloyd's coffeehouse, Edward Thorp, Elon Musk, Ethereum, Eugene Fama: efficient market hypothesis, experimental economics, experimental subject, fear of failure, feminist movement, financial deregulation, George Akerlof, germ theory of disease, Goodhart's law, Hans Rosling, Helicobacter pylori, high-speed rail, Ignaz Semmelweis: hand washing, income per capita, incomplete markets, inflation targeting, information asymmetry, invention of the wheel, invisible hand, Jeff Bezos, Jim Simons, Johannes Kepler, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Snow's cholera map, John von Neumann, Kenneth Arrow, Kōnosuke Matsushita, Linda problem, Long Term Capital Management, loss aversion, Louis Pasteur, mandelbrot fractal, market bubble, market fundamentalism, military-industrial complex, Money creation, Moneyball by Michael Lewis explains big data, Monty Hall problem, Nash equilibrium, Nate Silver, new economy, Nick Leeson, Northern Rock, nudge theory, oil shock, PalmPilot, Paul Samuelson, peak oil, Peter Thiel, Philip Mirowski, Phillips curve, Pierre-Simon Laplace, popular electronics, power law, price mechanism, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, railway mania, RAND corporation, reality distortion field, rent-seeking, Richard Feynman, Richard Thaler, risk tolerance, risk-adjusted returns, Robert Shiller, Robert Solow, Ronald Coase, sealed-bid auction, shareholder value, Silicon Valley, Simon Kuznets, Socratic dialogue, South Sea Bubble, spectrum auction, Steve Ballmer, Steve Jobs, Steve Wozniak, Suez crisis 1956, Tacoma Narrows Bridge, Thales and the olive presses, Thales of Miletus, The Chicago School, the map is not the territory, The Market for Lemons, The Nature of the Firm, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Bayes, Thomas Davenport, Thomas Malthus, Toyota Production System, transaction costs, ultimatum game, urban planning, value at risk, world market for maybe five computers, World Values Survey, Yom Kippur War, zero-sum game

Does he not define probability to be that which the many think?’ PHAEDRUS: ‘Certainly, he does.’ —PLATO , Phaedrus 1 T he ‘probabilistic turn’ in human reasoning reportedly began when the Chevalier de Méré, an inveterate gambler, sought the advice of the mathematician and philosopher Blaise Pascal. Pascal in turn consulted an even more distinguished French polymath, Pierre de Fermat. The resulting exchange of letters between Pascal and Fermat in the winter of 1653–4 represents the first formal analysis of probability. 2 Historians of mathematics have speculated on why the discoveries of Pascal and Fermat came so late in the history of human thought.

We shall call this problem of model failure the Viniar problem, in honour of the former Goldman Sachs executive. 24 5 A FORGOTTEN DISPUTE This is what I see and what troubles me. I look on all sides, and I see only darkness everywhere. Nature presents to me nothing which is not matter of doubt and concern . . . The true course is not to wager at all. —BLAISE PASCAL , Pensées 1 B y the early twentieth century, the uses of probability were well established in understanding games of chance, such as cards, roulette, or Let’s Make a Deal . The theory had also proved valuable in the analysis of data generated by a more or less stationary process, such as mortality, for which extensive frequency data were available.

Arguably the two most brilliant economists of the post-war period, Paul Samuelson and Robert Solow, occupied adjoining offices at MIT for over half a century. As Samuelson relates, ‘When young he [Solow] would say, if you don’t regard probability theory as the most interesting subject in the world, then I feel sorry for you. I always agreed with that.’ 9 The appeal of probability theory is understandable. But we suspect the reason that such mathematics was, as we shall see, not developed until the seventeenth century is that few real-world problems can properly be represented in this way. The most compelling extension of probabilistic reasoning is to situations where the possible outcomes are well defined, the underlying processes which give rise to them change little over time, and there is a wealth of historic information.


pages: 790 words: 150,875

Civilization: The West and the Rest by Niall Ferguson

Admiral Zheng, agricultural Revolution, Albert Einstein, Andrei Shleifer, Atahualpa, Ayatollah Khomeini, Berlin Wall, BRICs, British Empire, business cycle, clean water, collective bargaining, colonial rule, conceptual framework, Copley Medal, corporate governance, creative destruction, credit crunch, David Ricardo: comparative advantage, Dean Kamen, delayed gratification, Deng Xiaoping, discovery of the americas, Dissolution of the Soviet Union, Easter island, European colonialism, Fall of the Berlin Wall, financial engineering, Francisco Pizarro, full employment, Great Leap Forward, Gregor Mendel, guns versus butter model, Hans Lippershey, haute couture, Hernando de Soto, income inequality, invention of movable type, invisible hand, Isaac Newton, James Hargreaves, James Watt: steam engine, John Harrison: Longitude, joint-stock company, Joseph Schumpeter, Kickstarter, Kitchen Debate, land reform, land tenure, liberal capitalism, Louis Pasteur, Mahatma Gandhi, market bubble, Martin Wolf, mass immigration, means of production, megacity, Mikhail Gorbachev, new economy, Pearl River Delta, Pierre-Simon Laplace, power law, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, purchasing power parity, quantitative easing, rent-seeking, reserve currency, retail therapy, road to serfdom, Ronald Reagan, savings glut, Scramble for Africa, Silicon Valley, South China Sea, sovereign wealth fund, special economic zone, spice trade, spinning jenny, Steve Jobs, Steven Pinker, subprime mortgage crisis, Suez canal 1869, Suez crisis 1956, The Great Moderation, the market place, the scientific method, The Wealth of Nations by Adam Smith, Thomas Kuhn: the structure of scientific revolutions, Thomas Malthus, Thorstein Veblen, total factor productivity, trade route, transaction costs, transatlantic slave trade, undersea cable, upwardly mobile, uranium enrichment, wage slave, Washington Consensus, women in the workforce, work culture , World Values Survey

1530 Paracelsus pioneers the application of chemistry to physiology and pathology
1543 Nicolaus Copernicus’ De revolutionibus orbium coelestium states the heliocentric theory of the solar system; Andreas Vesalius’ De humani corporis fabrica supplants Galen’s anatomical textbook
1546 Agricola’s De natura fossilium classifies minerals and introduces the term ‘fossil’
1572 Tycho Brahe records the first European observation of a supernova
1589 Galileo’s tests of falling bodies (published in De motu) revolutionize the experimental method
1600 William Gilbert’s De magnete, magnetisque corporibus describes the magnetic properties of the earth and electricity
1604 Galileo discovers that a free-falling body increases its distance as the square of the time
1608 Hans Lippershey and Zacharias Jansen independently invent the telescope
1609 Galileo conducts the first telescopic observations of the night sky
1610 Galileo discovers four of Jupiter’s moons and infers that the earth is not at the centre of the universe
1614 John Napier’s Mirifici logarithmorum canonis descriptio introduces logarithms
1628 William Harvey writes Exercitatio anatomica de motu cordis et sanguinis in animalibus, accurately describing the circulation of blood
1637 René Descartes’ ‘La Géométrie’, an appendix to his Discours de la méthode, founds analytic geometry
1638 Galileo’s Discorsi e dimonstrazioni matematiche founds modern mechanics
1640 Pierre de Fermat founds number theory
1654 Fermat and Blaise Pascal found probability theory
1661 Robert Boyle’s Skeptical Chymist defines elements and chemical analysis
1662 Boyle states Boyle’s Law that the volume occupied by a fixed mass of gas in a container is inversely proportional to the pressure it exerts
1669 Isaac Newton’s De analysi per aequationes numero terminorum infinitas presents the first systematic account of the calculus, independently developed by Gottfried Leibniz
1676 Antoni van Leeuwenhoek discovers micro-organisms
1687 Newton’s Philosophiae naturalis principia mathematica states the law of universal gravitation and the laws of motion
1735 Carolus Linnaeus’ Systema naturae introduces systematic classification of genera and species of organisms
1738 Daniel Bernoulli’s Hydrodynamica states Bernoulli’s Principle and founds the mathematical study of fluid flow and the kinetic theory of gases
1746 Jean-Etienne Guettard prepares the first true geological maps
1755 Joseph Black identifies carbon dioxide
1775 Antoine Lavoisier accurately describes combustion
1785 James Hutton’s ‘Concerning the System of the Earth’ states the uniformitarian view of the earth’s development
1789 Lavoisier’s Traité élémentaire de chimie states the law of conservation of matter
By the mid-1600s this kind of scientific knowledge was spreading as rapidly as had the doctrine of the Protestant Reformers a century before.

The great Dutch-Jewish philosopher Baruch or Benedict Spinoza, who hypothesized that there is only a material universe of substance and deterministic causation, and that ‘God’ is that universe’s natural order as we dimly apprehend it and nothing more, died in 1677 at the age of forty-four, probably from the particles of glass he had inhaled doing his day-job as a lens grinder. Blaise Pascal, the pioneer of probability theory and hydrodynamics and the author of the Pensées, the greatest of all apologias for the Christian faith, lived to be just thirty-nine; he would have died even younger had the road accident that reawakened his spiritual side been fatal. Who knows what other great works these geniuses might have brought forth had they been granted the lifespans enjoyed by, for example, the great humanists Erasmus (sixty-nine) and Montaigne (fifty-nine)?


pages: 338 words: 106,936

The Physics of Wall Street: A Brief History of Predicting the Unpredictable by James Owen Weatherall

Alan Greenspan, Albert Einstein, algorithmic trading, Antoine Gombaud: Chevalier de Méré, Apollo 11, Asian financial crisis, bank run, Bear Stearns, beat the dealer, behavioural economics, Benoit Mandelbrot, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, Bonfire of the Vanities, book value, Bretton Woods, Brownian motion, business cycle, butterfly effect, buy and hold, capital asset pricing model, Carmen Reinhart, Claude Shannon: information theory, coastline paradox / Richardson effect, collateralized debt obligation, collective bargaining, currency risk, dark matter, Edward Lorenz: Chaos theory, Edward Thorp, Emanuel Derman, Eugene Fama: efficient market hypothesis, financial engineering, financial innovation, Financial Modelers Manifesto, fixed income, George Akerlof, Gerolamo Cardano, Henri Poincaré, invisible hand, Isaac Newton, iterative process, Jim Simons, John Nash: game theory, junk bonds, Kenneth Rogoff, Long Term Capital Management, Louis Bachelier, mandelbrot fractal, Market Wizards by Jack D. Schwager, martingale, Michael Milken, military-industrial complex, Myron Scholes, Neil Armstrong, new economy, Nixon triggered the end of the Bretton Woods system, Paul Lévy, Paul Samuelson, power law, prediction markets, probability theory / Blaise Pascal / Pierre de Fermat, quantitative trading / quantitative finance, random walk, Renaissance Technologies, risk free rate, risk-adjusted returns, Robert Gordon, Robert Shiller, Ronald Coase, Sharpe ratio, short selling, Silicon Valley, South Sea Bubble, statistical arbitrage, statistical model, stochastic process, Stuart Kauffman, The Chicago School, The Myth of the Rational Market, tulip mania, Vilfredo Pareto, volatility smile

No one had an answer, or much interest in looking for one, until de Méré tried his problem out on Blaise Pascal. Pascal had been a child prodigy, working out most of classical geometry on his own by drawing pictures as a child. By his late teens he was a regular at the most important salon, run by a Jesuit priest named Marin Mersenne, and it was here that de Méré and Pascal met. Pascal didn’t know the answer, but he was intrigued. In particular, he agreed with de Méré’s appraisal that the problem should have a mathematical solution. Pascal began to work on de Méré’s problem. He enlisted the help of another mathematician, Pierre de Fermat. Fermat was a lawyer and polymath, fluent in a half-dozen languages and one of the most capable mathematicians of his day.

Savage had just finished writing a textbook on probability and statistics and had developed an interest in the history of probability theory along the way. He had been poking around the university library for early-twentieth-century work on probability when he came across a textbook from 1914 that he had never seen before. When he flipped through it, Savage realized that, in addition to some pioneering work on probability, the book had a few chapters dedicated to what the author called “speculation” — literally, probability theory as applied to market speculation. Savage guessed (correctly) that if he had never come across this work before, his friends in economics departments likely hadn’t either, and so he sent out a series of postcards asking if anyone knew of Bachelier.

Bachelier knew the rudiments of the French financial system, but little more. The Bourse did not seem like the right place for a quiet boy, a mathematician with a scholar’s temperament. But there was no turning back. It’s just a game, he told himself. Bachelier had always been fascinated by probability theory, the mathematics of chance (and, by extension, gambling). If he could just imagine the French financial markets as a glorified casino, a game whose rules he was about to learn, it might not seem so scary. He repeated the mantra — just an elaborate game of chance — as he pushed forward into the throng.


pages: 626 words: 181,434

I Am a Strange Loop by Douglas R. Hofstadter

Albert Einstein, Andrew Wiles, Benoit Mandelbrot, Brownian motion, Charles Babbage, double helix, Douglas Hofstadter, Georg Cantor, Gödel, Escher, Bach, Hans Moravec, Isaac Newton, James Watt: steam engine, John Conway, John von Neumann, language acquisition, mandelbrot fractal, pattern recognition, Paul Erdős, place-making, probability theory / Blaise Pascal / Pierre de Fermat, publish or perish, random walk, Ronald Reagan, self-driving car, Silicon Valley, telepresence, Turing machine

The two phenomena involved — integer powers with arbitrarily large exponents, on the one hand, and Fibonacci numbers on the other — simply seemed (like gemstones and the Caspian Sea) to be too conceptually remote from each other to have any deep, systematic, inevitable interrelationship. And then along came a vast team of mathematicians who had set their collective bead on the “big game” of Fermat’s Last Theorem (the notorious claim, originally made by Pierre de Fermat in the middle of the seventeenth century, that no positive integers a, b, c exist such that a^n + b^n equals c^n, with the exponent n being an integer greater than 2). This great international relay team, whose final victorious lap was magnificently sprinted by Andrew Wiles (his sprint took him about eight years), was at last able to prove Fermat’s centuries-old claim by using amazing techniques that combined ideas from all over the vast map of contemporary mathematics.

Ted entelechy entrenchedness of “I”: in main brain; in other brains entwinement: of feedback loops; of human souls envelopes in box; not perceivable individually Epi (apparent marble in envelope box); parameters determining reality of; poem about; possible explanatory power of; seeming reality of Epimenides epiphenomena: in brain; in careenium; in envelope box; in minerals; in video feedback episodes as concepts episodic memory: central role of, in “I”-ness; containing precedents for new situations; of dogs; of human beings episodic projectory episodic subjunctory Erdös, Paul Ernst, Tom errors, study of Escher, Maurits Cornelis essence: extraction of, in brain; pinpointing of, as the goal of thought essential incompleteness essentially self-referential quality required to make an “I” esthetic pressures as affecting content études (Chopin) etymologies of words Euclid’s Elements Euclid’s proof of infinitude of primes Eugene Onegin (Pushkin) Euler, Leonhard Everest, Mount everyday concepts defining human reality; blurriness of evolution: of brain complexity; of careenium; and efficiency; of hearts; producing meanings in brains as accidental by-product; producing universality in brains as accidental by-product; throwing consciousness in as a bonus feature existence: blurriness of, of “I” experiences, as co-present with “I”-ness; as determining “I”, not vice versa; “pure”, as unrelated to physics experiencers vs. non-experiencers explanations, proper level of Exploratorium Museum exponential explosions extensible category system of humans; and representational universality; yielding consciousness extra bonus feature of consciousness extra-physical nature of consciousness F F numbers; see also Fibonacci fading afterglow of a soul failures, perception of one’s own faith in one’s own thought processes falafel, savored by two brains Falen, James falsity in mathematics, assumed equivalent to lacking proof Fauconnier, Gilles Fauré, Gabriel fear: of feedback loops; of self-reference; of self-representation in art feedback loops; and central goals of living creatures; content-free; and exponential growth; as germ of consciousness; in growth of human self-symbol; as instinctive taboo; irrelevant to hereness, for SL #642; level-crossing; see also strange loops feeding a formula its own Gödel number feeling one is elsewhere feeling posited to be independent of physics feeling vs. nonfeeling machines feelium, as stuff of experience and sensation Feigen, George Femme du boulanger, La (Pagnol), indirect meanings carried by analogy in Fermat, Pierre de Fermat’s Last Theorem, proof of fetus having no soul Fibonacci (Leonardo di Pisa) Fibonacci numbers; perfect mutual avoidance with powers fidelity of copy of another’s interiority Fields, W. C. fine-grained vs. coarse-grained loops first-person vs. third-person view of “I”; see also The Hard Problem first-person writing style fish, respect for life of flap loop; photo of flexibility, maximal, of machines flip side of Mathematician’s Credo flipping-around of perceptual system flirting with infinity float-ball in flush toilet fluid pointers flush-toilet fill mechanism food: for people in virtual environments; for virtual creatures fooled by the realism of Drawing Hands football: sensation swapped with baseball sensation; on television forces: in the brain; in physics; vs. desires forest vs. 
trees form–content interplay formula fragment in PM 43 and 49, as uninteresting integers foundations of mathematics, quest for stable fractalic gestalts of video feedback frames as brain structures Frank, Anne Franklin, Benjamin free will: of machine; as opposed to will; overridden by itself Free Willie Frege, Gottlob French people having flipped color qualia French translation: of Bach aria words; of “my leg is asleep” Freud, Sigmund friends: constituting threshold for consciousness; giving rise to “I”; personal identity of; self-inventorying sentences as; sharing of joys and pains of Frucht, William frustration of one’s will funerals, purpose of funneling of complex input into few symbols fusion, psychic: of Chaplin twins; of halves in Twinwirld; of husband and wife; of souls, as inevitable consequence of long-term intimate sharing G g (Gödel number of Gödel’s formula); bypassing of; code number of unpennable line in play “g is not prim”; see also KG galaxies: colliding in cyberspace; colliding in space; emerging from video feedback Galileo Galilei Galois, Évariste, as radical Gandhi, (Mahatma) Mohandas gargantuan integers gases, behavior of gemstones found in the Caspian Sea generosity, as bedrock cause for humans genes: as abstract entities; human, as source of potential strange loop; as patterns copied in different organisms; as physical entities; as remote from real life genetic code Gershwin, George Giant Electronic Brain gleb/knurk fence; see also knurking Glosspan, Aunt (in “Pig”) Glover, Henry glue: on dollops of Consciousness; on envelopes as source of Epi gluons gluttony vs. weight-watching goal-lacking and goal-possessing entities, perceived schism between goal-orientedness of systems, blurriness of goals of living creatures goals, shared God: elusiveness of mathematical truth and; not a player of dice; reality of Göd Gödel, Escher, Bach: central message of; dialogues in; linking author with Bach; typesetting of; video feedback photos in; writing of Gödel, Kurt; arithmetizes PM; believer in PM ’s consistency; birthyear of; as black belt; bypasses indexicality; code for PM symbols and formulas; concocts self-referential statement in PM; discovery of strange loop by; “God” in name of; growing up in Brünn; 1931 article of; as re-analyzer of what meaning is; respect for power of PM; sees analogy between Fibonacci numbers and PM theorems; sees causal potency of meaning of strings; sees representational richness of whole numbers; umlaut in name of; as young mathematician Gödel numbering; freedom in; revealing secondary meaning of Imp Gödel rays Gödelian formulas, infinitude of Gödelian swirl; ease of transportability; isomorphism of any two, at coarse-grained level; pointers in; see also strange loop Gödel’s formula KG: condensability of; described through abbreviations; downward causality and; hiddenness of higher-level view of; high-level and low-level meanings of; higher-level view fantasized as being obvious; inconceivable length of; lack of indexicals in; as repugnant to Bertrand Russell; translated with the indexical pronoun “I”; truth of; unprovability of Gödel’s formula KH, for Super-PM Gödel’s incompleteness theorem: as proven by Chaitin; as proven by Gödel Gödel’s Proof (Nagel and Newman) Gödel’s strange loop: as inevitable as video feedback; as prototype for “I”-ness Gödel–Turing threshold for universality “God’s-eye point of view” Goldbach Conjecture; decided by Göru Golden Rule Goldsmith, John good and bad, sense of, in growing self Göru: falling short of hopes; machine 
for solving all mathematical problems; machine for telling prims from non-prims; non-realizability of Gott ist unsre Zu versicht (Bach); counterfactual extension of Gould, Glenn gradations of consciousness Graham’s constant graininess: in representations of others; in video loops grandmother cells grand-music neuron grass and souls grasshopper rescued gravitation grazing of paradox Greater Metropolitan You “great soul” as hidden meaning of “magnanimous” and of “Mahatma” green button, fear of pushing Greg and Karen, Twinwirld couple Greg’l and Greg’r griffins, uncertain reality of “grocery store checkout stand”: causality attached to the concept of; as nested concept; vignette involving grocery stores’ meat displays growth rules, recursive, defining sequences of numbers growth rules, typographical, for strings in PM guinea pigs sacrificed for science gulf between truth and provability Gunkel, Pat guns fashioned from sandwiches Guru, machine that tells primes from non-primes Gutman, Hattie Gutman, Kellie H h (Gödel number of Gödel’s formula KH) “h is not a super-prim number” H1 (robot vehicle) halflings in Twinwirld hallucination hallucinated by hallucination halo: counterfactual, around personal identity; of each soul; of national souls halves, left and right, in Twinwirld hammerhead shark, location of hands-in-water experiment hangnails, reality of one’s own hardness and roundness of Epi Hard Problem, The; see also first-person hardware vs. patterns “hard-wiring”: of hemispheric links; hypothetically tweaked; of motor control as source of identity; of perceptual hardware as source of identity Hastorf, Albert H.

The point is just that this rule shows how one can mechanically shunt symbols and ignore their meanings, and yet preserve truth while doing so. This rule is rather trivial, but there are subtler ones that do real work. That, indeed, is the whole idea of symbolic logic, first suggested by Aristotle and then developed piecemeal over many centuries by such thinkers as Blaise Pascal, Gottfried Wilhelm von Leibniz, George Boole, Augustus De Morgan, Gottlob Frege, Giuseppe Peano, David Hilbert, and many others. Russell and Whitehead were simply developing the ancient dream of totally mechanizing reasoning in a more ambitious fashion than any of their predecessors had. Mechanizing the Mathematician’s Credo If you apply PM’s rules of inference to its axioms (the seeds that constitute the “zeroth generation” of theorems), you will produce some “progeny” — theorems of the “first generation”.
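The mechanical "generations of theorems" picture can be mimicked in a few lines of code. The toy system below is only an illustration of blind rule-application (two made-up string-rewriting rules in the spirit of the MIU puzzle from Gödel, Escher, Bach), not the axioms or inference rules of Principia Mathematica.

    def apply_rules(s):
        # Two made-up, purely syntactic rules over strings of M, I and U:
        # if the string ends in I, append U; if it starts with M, double the rest.
        out = set()
        if s.endswith("I"):
            out.add(s + "U")
        if s.startswith("M"):
            out.add("M" + s[1:] * 2)
        return out

    def generations(axiom, n):
        # Generation 0 is the axiom; each later generation is whatever the rules
        # produce from everything derived so far that has not been seen before.
        known, frontier = {axiom}, {axiom}
        for _ in range(n):
            produced = set()
            for s in frontier:
                produced |= apply_rules(s)
            frontier = produced - known
            known |= frontier
            yield frontier

    for i, gen in enumerate(generations("MI", 3), start=1):
        print(f"generation {i}:", sorted(gen))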


pages: 436 words: 127,642

When Einstein Walked With Gödel: Excursions to the Edge of Thought by Jim Holt

Ada Lovelace, Albert Einstein, Andrew Wiles, anthropic principle, anti-communist, Arthur Eddington, Benoit Mandelbrot, Bletchley Park, Brownian motion, cellular automata, Charles Babbage, classic study, computer age, CRISPR, dark matter, David Brooks, Donald Trump, Dr. Strangelove, Eddington experiment, Edmond Halley, everywhere but in the productivity statistics, Fellow of the Royal Society, four colour theorem, Georg Cantor, George Santayana, Gregor Mendel, haute couture, heat death of the universe, Henri Poincaré, Higgs boson, inventory management, Isaac Newton, Jacquard loom, Johannes Kepler, John von Neumann, Joseph-Marie Jacquard, Large Hadron Collider, Long Term Capital Management, Louis Bachelier, luminiferous ether, Mahatma Gandhi, mandelbrot fractal, Monty Hall problem, Murray Gell-Mann, new economy, Nicholas Carr, Norbert Wiener, Norman Macrae, Paradox of Choice, Paul Erdős, Peter Singer: altruism, Plato's cave, power law, probability theory / Blaise Pascal / Pierre de Fermat, quantum entanglement, random walk, Richard Feynman, Robert Solow, Schrödinger's Cat, scientific worldview, Search for Extraterrestrial Intelligence, selection bias, Skype, stakhanovite, Stephen Hawking, Steven Pinker, Thorstein Veblen, Turing complete, Turing machine, Turing test, union organizing, Vilfredo Pareto, Von Neumann architecture, wage slave

Just as you did when rescuing the drowning person, the light beam considers its destination, then chooses the trajectory that gets it there in the least time, given its differential rate of progress in the two elements through which it must travel. But this can’t be right, can it? Our explanation for the route taken by the light beam—first formulated by Pierre de Fermat in the seventeenth century as the “principle of least time”—assumes that the light somehow knows where it is going in advance and that it acts purposefully in getting there. This is what’s called a teleological explanation. The idea that things in nature behave in goal-directed ways goes back to Aristotle.

All three were edging away from the barren structure of Euclidean geometry toward a fertile, if freewheeling and unrigorous, science of motion, one that represented bodies as moving through infinitely divisible space and time. But there was a certain theological nettle to be grasped by these natural philosophers: How could the real infinite, which was supposed to be an attribute of God alone, be present in the finite world he created? It was Blaise Pascal who was most agitated by this question. None of his contemporaries embraced the idea of the infinite more passionately than did Pascal. And no one has ever written with more conviction of the awe that the infinite vastness and minuteness of nature can evoke. Nature proposes the two infinities to us as mysteries, “not to understand, but to admire,” Pascal wrote—and to use in our reasoning, he might have added.

The French trio “lost their nerve”; they “confronted an intellectual abyss before which they came to a halt.” And the price they paid for their qualms, we are given to believe, was psychological as well as mathematical. Borel retreated from the abstractions of set theory to the safer ground of probability theory. “Je vais pantoufler dans les probabilités,” as he charmingly put it (“I’m going to dally with probability”; pantoufler literally means “play around in my slippers”). Lebesgue in his “frustration” became “somewhat sour.” Baire, whose physical and mental health were always delicate, ended his life in solitude and suicide.


pages: 471 words: 124,585

The Ascent of Money: A Financial History of the World by Niall Ferguson

Admiral Zheng, Alan Greenspan, An Inconvenient Truth, Andrei Shleifer, Asian financial crisis, asset allocation, asset-backed security, Atahualpa, bank run, banking crisis, banks create money, Bear Stearns, Black Monday: stock market crash in 1987, Black Swan, Black-Scholes formula, Bonfire of the Vanities, Bretton Woods, BRICs, British Empire, business cycle, capital asset pricing model, capital controls, Carmen Reinhart, Cass Sunstein, central bank independence, classic study, collateralized debt obligation, colonial exploitation, commoditize, Corn Laws, corporate governance, creative destruction, credit crunch, Credit Default Swap, credit default swaps / collateralized debt obligations, currency manipulation / currency intervention, currency peg, Daniel Kahneman / Amos Tversky, deglobalization, diversification, diversified portfolio, double entry bookkeeping, Edmond Halley, Edward Glaeser, Edward Lloyd's coffeehouse, equity risk premium, financial engineering, financial innovation, financial intermediation, fixed income, floating exchange rates, Fractional reserve banking, Francisco Pizarro, full employment, Future Shock, German hyperinflation, Greenspan put, Herman Kahn, Hernando de Soto, high net worth, hindsight bias, Home mortgage interest deduction, Hyman Minsky, income inequality, information asymmetry, interest rate swap, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, iterative process, James Carville said: "I would like to be reincarnated as the bond market. You can intimidate everybody.", John Meriwether, joint-stock company, joint-stock limited liability company, Joseph Schumpeter, junk bonds, Kenneth Arrow, Kenneth Rogoff, knowledge economy, labour mobility, Landlord’s Game, liberal capitalism, London Interbank Offered Rate, Long Term Capital Management, low interest rates, market bubble, market fundamentalism, means of production, Mikhail Gorbachev, Modern Monetary Theory, Money creation, money market fund, money: store of value / unit of account / medium of exchange, moral hazard, mortgage debt, mortgage tax deduction, Myron Scholes, Naomi Klein, National Debt Clock, negative equity, Nelson Mandela, Nick Bostrom, Nick Leeson, Northern Rock, Parag Khanna, pension reform, price anchoring, price stability, principal–agent problem, probability theory / Blaise Pascal / Pierre de Fermat, profit motive, quantitative hedge fund, RAND corporation, random walk, rent control, rent-seeking, reserve currency, Richard Thaler, risk free rate, Robert Shiller, rolling blackouts, Ronald Reagan, Savings and loan crisis, savings glut, seigniorage, short selling, Silicon Valley, South Sea Bubble, sovereign wealth fund, spice trade, stocks for the long run, structural adjustment programs, subprime mortgage crisis, tail risk, technology bubble, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, Thomas Bayes, Thomas Malthus, Thorstein Veblen, tontine, too big to fail, transaction costs, two and twenty, undersea cable, value at risk, W. E. B. Du Bois, Washington Consensus, Yom Kippur War

Then, in a remarkable rush of intellectual innovation, beginning in around 1660, that theoretical basis was created. In essence, there were six crucial breakthroughs:1. Probability. It was to a monk at Port-Royal that the French mathematician Blaise Pascal attributed the insight (published in Pascal’s Ars Cogitandi) that ‘fear of harm ought to be proportional not merely to the gravity of the harm, but also to the probability of the event.’ Pascal and his friend Pierre de Fermat had been toying with problems of probability for many years, but for the evolution of insurance, this was to be a critical point. 2. Life expectancy. In the same year that Ars Cogitandi appeared (1662), John Graunt published his ‘Natural and Political Observations . . .

Or riskier but probably higher yielding stocks? Insurance, in other words, is where the risks and uncertainties of daily life meet the risks and uncertainties of finance. To be sure, actuarial science gives insurance companies an in-built advantage over policy-holders. Before the dawn of modern probability theory, insurers were the gamblers; now they are the casino. The case can be made, as it was by Dickie Scruggs before his fall from grace, that the odds are now stacked unjustly against the punters/policy-holders. But as the economist Kenneth Arrow long ago pointed out, most of us prefer a gamble that has a 100 per cent chance of a small loss (our annual premium) and a small chance of a large gain (the insurance payout after disaster) to a gamble that has a 100 per cent chance of a small gain (no premiums) but an uncertain chance of a huge loss (no payout after a disaster).


pages: 532 words: 133,143

To Explain the World: The Discovery of Modern Science by Steven Weinberg

Albert Einstein, Alfred Russel Wallace, Astronomia nova, Brownian motion, Commentariolus, cosmological constant, dark matter, Dava Sobel, double helix, Edmond Halley, Eratosthenes, Ernest Rutherford, fudge factor, invention of movable type, Isaac Newton, James Watt: steam engine, Johannes Kepler, music of the spheres, On the Revolutions of the Heavenly Spheres, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, retrograde motion, Thomas Kuhn: the structure of scientific revolutions

For Descartes’ purposes his failure to explain the value of n didn’t really matter, because he could and did take the value of n from experiment (perhaps from the data in Ptolemy’s Optics), which of course gives n greater than 1. A more convincing derivation of the law of refraction was given by the mathematician Pierre de Fermat (1601–1665), along the lines of the derivation by Hero of Alexandria of the equal-angles rule governing reflection, but now making the assumption that light rays take the path of least time, rather than of least distance. This assumption (as shown in Technical Note 28) leads to the correct formula, that n is the ratio of the speed of light in medium A to its speed in medium B, and is therefore greater than 1 when A is air and B is glass or water.
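The claim that least time reproduces the refraction law, with n equal to the ratio of the speeds, is easy to check numerically. The sketch below is not the book's Technical Note 28; the geometry and the two speeds are invented for illustration, and the least-time crossing point is found by a brute-force scan.

    import math

    v_a, v_b = 1.0, 0.75   # invented speeds of light in media A (above) and B (below)

    def travel_time(x):
        # Path from (0, 1) in A to (1, -1) in B, crossing the interface y = 0 at (x, 0).
        return math.hypot(x, 1.0) / v_a + math.hypot(1.0 - x, 1.0) / v_b

    # Brute-force search for the least-time crossing point.
    best_x = min((i / 100000 for i in range(100001)), key=travel_time)

    sin_a = best_x / math.hypot(best_x, 1.0)                  # sine of the angle in A
    sin_b = (1.0 - best_x) / math.hypot(1.0 - best_x, 1.0)    # sine of the angle in B
    print(sin_a / sin_b)   # close to 1.333...
    print(v_a / v_b)       # 1.333..., the ratio the refraction law predicts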

In 1643 he observed that if a vertical glass tube longer than this and closed at the top end is filled with mercury, then some mercury will flow out until the height of the mercury in the tube is about 30 inches. This leaves empty space on top, now known as a “Torricellian vacuum.” Such a tube can then serve as a barometer, to measure changes in ambient air pressure; the higher the air pressure, the higher the column of mercury that it can support. The French polymath Blaise Pascal is best known for his work of Christian theology, the Pensées, and for his defense of the Jansenist sect against the Jesuit order, but he also contributed to geometry and to the theory of probability, and explored the pneumatic phenomena studied by Torricelli. Pascal reasoned that if the column of mercury in a glass tube open at the bottom is held up by the pressure of the air, then the height of the column should decrease when the tube is carried to high altitude on a mountain, where there is less air overhead and hence lower air pressure.
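As a rough check on the 30-inch figure (a back-of-envelope calculation with standard modern values for the constants, not a computation from the book), the pressure exerted by such a mercury column comes out close to one atmosphere.

    # Pressure at the base of a mercury column, p = rho * g * h.
    rho_mercury = 13_546        # kg per cubic metre, approximate density of mercury
    g = 9.81                    # metres per second squared
    h = 30 * 0.0254             # 30 inches expressed in metres
    pressure = rho_mercury * g * h
    print(round(pressure))      # about 101,000 Pa, roughly one standard atmosphere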

., 22–23 astronomy and, 19–20, 51, 61, 91, 117, 151, 155, 264 deduction vs. observation and, 132 Francis Bacon and, 202 homework problem, 79–80 Kepler and, 163, 167, 171 magnetism and, 257 mathematics and, 17–20, 203 matter and, 10–13, 111, 274–79 medieval Europe and, 124, 127, 132 religion and, 45, 47, 50 Plato’s Universe (Vlastos), 6n Pleiades, 55, 176 Plotinus, 47 Plutarch, 70, 153–54 Pneumatics (Philo), 35 poetry, 1, 12–14 Pogson, Norman, 88n Polaris (North Star), 75 polygon, 275–76, 294–95 polyhedrons, regular, 10–12, 15, 18, 162–64, 171, 274–79 Pope, Alexander, 252–53 precession of apparent orbit of Sun around Earth, 113–14 of equinoxes, 74–75, 107, 118, 153, 241–42, 244, 248 of perihelia, 241, 244, 250 prediction, 154, 242–43, 265 Priestley, Joseph, 11 Principles of Philosophy (Descartes), 203–4, 212–13 Prior Analytics (Aristotle), 17 prism, 211, 218–20 probability, theory of, 199 Proclus, 51, 97–98 projectiles, thrown objects and, 27, 51, 71, 133, 135, 161, 193–94, 213, 342–46 proportions, theory of, 17 Protestantism, 156–57, 166, 253 protons, 243, 262–64 Prutenic Tables, 158, 166, 172 Ptolemy, Claudius, 48, 51, 79, 330 Arabs and, 105–7, 110, 112–14, 117–18, 141, 160 Aristotelian models vs., 95–99, 106, 112–14, 141–43, 160 chords vs. sines and, 309–11 Copernicus vs., 149–55, 255 Descartes and, 204 Earth’s rotation and, 135 epicycles and, 87, 255, 303–7 equal-angles and equants, 37, 87, 254–55, 324–25 experiment and, 189 Francis Bacon and, 201 Galileo and, 172–73, 179–80, 185 Kepler, 165, 167–68 lunar parallax and, 237n, 307–9 medieval Europe and, 126, 128–29, 141 planetary motion and, 51, 71–74, 87–96, 100, 137, 254–55 refraction and, 37, 79, 137, 330 Tycho and, 159–61, 165 Ptolemy I, 31–32, 35 Ptolemy II, 32 Ptolemy III, 40, 75 Ptolemy IV, 40, 75 Ptolemy XV, 31 pulmonary circulation, 118 Punic War, Second, 39 Pythagoras, 15, 47, 72 Pythagoreans, 15–20, 72, 78–79, 111, 141n, 151, 153–54, 279–82 Pythagorean theorem, 17, 283–84 quadratic equations, 15 quantum chromodynamics, 243 quantum electrodynamics, 180, 263, 268 quantum field theory, 262, 263–64 quantum mechanics, 21, 34, 152, 220, 248–49, 261–65, 268 quarks, 243, 263–65 quaternions, 163 Questiones quandam philosophicae (Newton), 217, 218 Qutb, Sayyid, 123 radiation, 261 radioactivity, 260–62, 264 radio astronomers, 159 radio waves, 259 Ragep, F.


The Art of Computer Programming: Fundamental Algorithms by Donald E. Knuth

Charles Babbage, discrete time, distributed generation, Donald Knuth, fear of failure, Fermat's Last Theorem, G4S, Gerard Salton, Isaac Newton, Ivan Sutherland, Jacquard loom, Johannes Kepler, John von Neumann, linear programming, linked data, Menlo Park, probability theory / Blaise Pascal / Pierre de Fermat, sorting algorithm, stochastic process, Turing machine

The idea behind mathematical induction is thus intimately related to the concept of number. The first European to apply mathematical induction to rigorous proofs was the Italian scientist Francesco Maurolico, in 1575. Pierre de Fermat made further improvements, in the early 17th century; he called it the "method of infinite descent." The notion also appears clearly in the later writings of Blaise Pascal (1653). The phrase "mathematical induction" apparently was coined by A. De Morgan in the early nineteenth century. [See AMM 24 (1917), 199-207; 25 (1918), 197-201; Arch. Hist. Exact Sci. 9 (1972), 1-21.] Further discussion of mathematical induction can be found in G.

(r choose k) = r(r - 1) ... (r - k + 1) / (k(k - 1) ... 1) = the product over 1 ≤ j ≤ k of (r + 1 - j)/j, integer k ≥ 0;
(r choose k) = 0, integer k < 0.   (3)
In particular cases we have (r choose 0) = 1, (r choose 1) = r, (r choose 2) = r(r - 1)/2.   (4)
Table 1 gives values of the binomial coefficients for small integer values of r and k; the values for 0 ≤ r ≤ 4 should be memorized. Binomial coefficients have a long and interesting history. Table 1 is called "Pascal's triangle" because it appeared in Blaise Pascal's Traité du Triangle Arithmétique in 1653. This treatise was significant because it was one of the first works on probability theory, but Pascal did not invent the binomial coefficients (which were well-known in Europe at that time). Table 1 also appeared in the treatise Szu-yüan Yü-chien ("The Precious Mirror of the Four Elements") by the Chinese mathematician Shih-Chieh Chu in 1303, where they were said to be an old invention.
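The product formula and the Pascal's-triangle addition rule behind Table 1 are easy to cross-check in a few lines. The sketch below is an illustration in Python, not code from the book; it verifies that both routes give the same coefficients for 0 ≤ r ≤ 4.

    from math import prod

    def binomial(r, k):
        # The product form: r(r-1)...(r-k+1) / k!, taken as 0 for negative integer k.
        if k < 0:
            return 0
        return round(prod((r + 1 - j) / j for j in range(1, k + 1))) if k else 1

    def pascal_triangle(rows):
        # The addition rule: each interior entry is the sum of the two entries above it.
        triangle = [[1]]
        for _ in range(rows - 1):
            prev = triangle[-1]
            triangle.append([1] + [prev[i] + prev[i + 1] for i in range(len(prev) - 1)] + [1])
        return triangle

    # The two routes agree on the values for 0 <= r <= 4.
    for r, row in enumerate(pascal_triangle(5)):
        assert row == [binomial(r, k) for k in range(r + 1)]
        print(row)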

(16) The notation used in Eq. (16) will be used to describe the statistical characteristics of other probabilistic quantities throughout this book. We have completed the analysis of Algorithm M; the new feature that has appeared in this analysis is the introduction of probability theory. Elementary probability theory is sufficient for most of the applications in this book: The simple counting techniques and the definitions of mean, variance, and standard deviation already given will answer most of the questions we want to ask. More complicated algorithms will help us develop an ability to reason fluently about probabilities.
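Algorithm M here is the routine for finding the maximum of n elements, and the quantity whose mean, variance, and standard deviation are reported is the number of times the running maximum gets replaced. The simulation below is an illustrative companion in Python, not the book's analytical derivation (which gives the average as H_n - 1); for n = 10 that exact mean is about 1.93.

    import random
    import statistics

    def max_updates(values):
        # Count how often a left-to-right scan has to replace its current maximum.
        current, updates = values[0], 0
        for x in values[1:]:
            if x > current:
                current, updates = x, updates + 1
        return updates

    def simulate(n, trials=20000):
        counts = []
        for _ in range(trials):
            perm = list(range(n))
            random.shuffle(perm)
            counts.append(max_updates(perm))
        variance = statistics.pvariance(counts)
        return statistics.mean(counts), variance, variance ** 0.5

    mean, variance, deviation = simulate(10)
    print(mean, variance, deviation)   # mean should come out near 1.93 for n = 10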


Statistics in a Nutshell by Sarah Boslaugh

Antoine Gombaud: Chevalier de Méré, Bayesian statistics, business climate, computer age, confounding variable, correlation coefficient, experimental subject, Florence Nightingale: pie chart, income per capita, iterative process, job satisfaction, labor-force participation, linear programming, longitudinal study, meta-analysis, p-value, pattern recognition, placebo effect, probability theory / Blaise Pascal / Pierre de Fermat, publication bias, purchasing power parity, randomized controlled trial, selection bias, six sigma, sparse data, statistical model, systematic bias, The Design of Experiments, the scientific method, Thomas Bayes, Two Sigma, Vilfredo Pareto

Many historians trace the beginning of modern probability theory to the Chevalier de Mere, a gentleman gambler in seventeenth-century France. He was fond of betting that he would roll at least one six in four rolls of a single die: the wisdom of this bet will be demonstrated in the following paragraphs. However, he also believed that it was a good bet to propose that he would roll one or more double sixes in 24 rolls of a pair of dice: this turned out to be a losing proposition. Fortunately for future statisticians, the Chevalier took this problem to his friend, the philosopher Blaise Pascal, who discussed it with his friend, the mathematician Pierre de Fermat.

Fortunately for future statisticians, the Chevalier took this problem to his friend, the philosopher Blaise Pascal, who discussed it with his friend, the mathematician Pierre de Fermat. Consideration of this type of question led to the development of, among other things, Pascal’s triangle, the binomial distribution, and the modern concept of probability. In an even bet among friends, when there is no “house” taking a percentage of the proceeds, a good bet is one you are likely to win more than 50% of the time. To put it another way, a good bet is one in which your likelihood of winning is 0.5 or greater. The Chevalier’s first bet met this standard: the probability of rolling at least one six in four rolls of a die is 0.518. This is easily calculated by considering the probability of rolling no sixes in four trials, which is (5/6)^4.
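Both of the Chevalier's wagers can be checked with the complement rule just described; the few lines of Python below (an illustration, not the book's own code) reproduce the 0.518 figure and the losing double-six probability.

    # First wager: at least one six in 4 throws of one die.
    single_six = 1 - (5 / 6) ** 4
    # Second wager: at least one double six in 24 throws of two dice.
    double_six = 1 - (35 / 36) ** 24
    print(round(single_six, 3))   # 0.518, better than even money
    print(round(double_six, 3))   # 0.491, slightly worse than even money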

What the Tribune did not anticipate was that Truman would carry many western states, including California, and thus amass sufficient electoral votes to win the election. Chapter 2. Probability Probability theory is fundamental to statistics. Some people find probability to be an intimidating topic, but there’s no reason anyone willing to put in the time can’t come to understand it at the level necessary to succeed in statistics. As is the case in many fields of study, advanced probability theory can become very complex and difficult to understand, but the basic principles of probability are intuitive and easy to comprehend. What’s more, most people are already familiar with probabilistic statements, from the weather report that tells you there is a 30% chance of rain this afternoon to the warning on cigarette packages that smoking increases your risk of developing lung cancer.


pages: 1,387 words: 202,295

Structure and Interpretation of Computer Programs, Second Edition by Harold Abelson, Gerald Jay Sussman, Julie Sussman

Andrew Wiles, conceptual framework, Donald Knuth, Douglas Hofstadter, Eratosthenes, functional programming, Gödel, Escher, Bach, higher-order functions, industrial robot, information retrieval, iterative process, Ivan Sutherland, Johannes Kepler, loose coupling, machine translation, Multics, probability theory / Blaise Pascal / Pierre de Fermat, Richard Stallman, Turing machine, wikimedia commons

Tabulation can sometimes be used to transform processes that require an exponential number of steps (such as count-change) into processes whose space and time requirements grow linearly with the input. See Exercise 3.27. 35 The elements of Pascal’s triangle are called the binomial coefficients, because the nth row consists of the coefficients of the terms in the expansion of (x + y)^n. This pattern for computing the coefficients appeared in Blaise Pascal’s 1653 seminal work on probability theory, Traité du triangle arithmétique. According to Knuth (1973), the same pattern appears in the Szu-yuen Yü-chien (“The Precious Mirror of the Four Elements”), published by the Chinese mathematician Chu Shih-chieh in 1303, in the works of the twelfth-century Persian poet and mathematician Omar Khayyam, and in the works of the twelfth-century Hindu mathematician Bháscara Áchárya. 36 These statements mask a great deal of oversimplification.

Let (a_k, b_k) be successive pairs in the reduction process. By our induction hypotheses, we have b_{k-1} ≥ Fib(k - 1) and b_k ≥ Fib(k). Thus, applying the claim we just proved together with the definition of the Fibonacci numbers gives b_{k+1} ≥ b_k + b_{k-1} ≥ Fib(k) + Fib(k - 1) = Fib(k + 1), which completes the proof of Lamé’s Theorem. 44 If d is a divisor of n, then so is n/d. But d and n/d cannot both be greater than √n. 45 Pierre de Fermat (1601-1665) is considered to be the founder of modern number theory. He obtained many important number-theoretic results, but he usually announced just the results, without providing his proofs. Fermat’s Little Theorem was stated in a letter he wrote in 1640. The first published proof was given by Euler in 1736 (and an earlier, identical proof was discovered in the unpublished manuscripts of Leibniz).

In Future Generation Computer Systems, vol. 9, pp. 105-117. –› Feeley, Marc. 1986. Deux approches à l’implantation du language Scheme. Masters thesis, Université de Montréal. –› Feeley, Marc and Guy Lapalme. 1987. Using closures for code generation. Journal of Computer Languages 12(1): 47-66. –› Feller, William. 1957. An Introduction to Probability Theory and Its Applications, volume 1. New York: John Wiley & Sons. Fenichel, R., and J. Yochelson. 1969. A Lisp garbage collector for virtual memory computer systems. Communications of the ACM 12(11): 611-612. –› Floyd, Robert. 1967. Nondeterministic algorithms. JACM, 14(4): 636-644. –› Forbus, Kenneth D., and Johan deKleer. 1993.


The Art of Computer Programming by Donald Ervin Knuth

Abraham Wald, Brownian motion, Charles Babbage, complexity theory, correlation coefficient, Donald Knuth, Eratosthenes, G4S, Georg Cantor, information retrieval, Isaac Newton, iterative process, John von Neumann, Louis Pasteur, mandelbrot fractal, Menlo Park, NP-complete, P = NP, Paul Erdős, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, sorting algorithm, Turing machine, Y2K

Other values of c do not seem to lead to simple relationships mod p, and they should all be satisfactory when used with suitable starting values. Richard Brent used a modification of Algorithm B to discover the prime factor 1238926361552897 of 2^256 + 1. [See Math. Comp. 36 (1981), 627-630; 38 (1982), 253-255.] Fermat's method. Another approach to the factoring problem, which was used by Pierre de Fermat in 1643, is more suited to finding large factors than small ones. [Fermat's original description of his method, translated into English, can be found in L. E. Dickson's monumental History of the Theory of Numbers 1 (Carnegie Inst. of Washington, 1919), 357.] Assume that N = uv, where u < v.
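Fermat's idea is to write an odd N = uv as a difference of squares x² - y², with x = (u + v)/2 and y = (v - u)/2, and then to search upward from about √N for an x that makes x² - N a perfect square. The sketch below is a minimal illustration of that search in Python, not the algorithm as given in the book or its optimizations, and the sample numbers are chosen only for demonstration.

    import math

    def fermat_factor(n):
        # Search for x with x*x - n a perfect square y*y; then n = (x - y)(x + y).
        # Assumes n is odd; for a prime n it ends at the trivial split 1 * n.
        x = math.isqrt(n)
        if x * x < n:
            x += 1
        while True:
            y2 = x * x - n
            y = math.isqrt(y2)
            if y * y == y2:
                return x - y, x + y
            x += 1

    print(fermat_factor(8051))    # (83, 97): found quickly because the factors are close
    print(fermat_factor(10403))   # (101, 103)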

For many years astronomers had occasionally used sexagesimal arithmetic both for the integer and the fractional parts of numbers, primarily when performing multiplication [see John Wallis, Treatise of Algebra (Oxford: 1685), 18-22, 30]. The fact that any integer greater than 1 could serve as radix was apparently first stated in print by Blaise Pascal in De Numeris Multiplicibus, which was written about 1658 [see Pascal's Œuvres Complètes (Paris: Éditions du Seuil, 1963), 84-89]. Pascal wrote, "Denaria enim ex instituto hominum, non ex necessitate naturae ut vulgus arbitratur, et sane satis inepte, posita est"; i.e., "The decimal system has been established, somewhat foolishly to be sure, according to man's custom, not from a natural necessity as most people would think."

Although courses on "Random Numbers" and on "Arithmetic" are not presently a part of many college curricula, I believe the reader will find that the subject matter of these chapters lends itself nicely to a unified treatment of material that has real educational value. My own experience has been that these courses are a good means of introducing elementary probability theory and number theory to college students. Nearly all of the topics usually treated in such introductory courses arise naturally in connection with applications, and the presence of these applications can be an important motivation that helps the student to learn and to appreciate the theory. Furthermore, each chapter gives a few hints of more advanced topics that will whet the appetite of many students for further mathematical study.