history of Unix

9 results


pages: 612 words: 187,431

The Art of UNIX Programming by Eric S. Raymond

A Pattern Language, Albert Einstein, Apple Newton, barriers to entry, bioinformatics, Boeing 747, Clayton Christensen, combinatorial explosion, commoditize, Compatible Time-Sharing System, correlation coefficient, David Brooks, Debian, Dennis Ritchie, domain-specific language, don't repeat yourself, Donald Knuth, end-to-end encryption, Everything should be made as simple as possible, facts on the ground, finite state, Free Software Foundation, general-purpose programming language, George Santayana, history of Unix, Innovator's Dilemma, job automation, Ken Thompson, Larry Wall, level 1 cache, machine readable, macro virus, Multics, MVC pattern, Neal Stephenson, no silver bullet, OSI model, pattern recognition, Paul Graham, peer-to-peer, premature optimization, pre–internet, publish or perish, revision control, RFC: Request For Comment, Richard Stallman, Robert Metcalfe, Steven Levy, the Cathedral and the Bazaar, transaction costs, Turing complete, Valgrind, wage slave, web application

Those who cannot remember the past are condemned to repeat it. -- George Santayana, The Life of Reason (1905) The past informs practice. Unix has a long and colorful history, much of which is still live as folklore, assumptions, and (too often) battle scars in the collective memory of Unix programmers. In this chapter we'll survey the history of Unix, with an eye to explaining why, in 2003, today's Unix culture looks the way it does. Origins and History of Unix, 1969-1995 A notorious ‘second-system effect’ often afflicts the successors of small experimental prototypes. The urge to add everything that was left out the first time around all too frequently leads to huge and overcomplicated design.

Contents Frontmatter Title Page Dedication Preface Preface Who Should Read This Book How to Use This Book Related References Conventions Used in This Book Our Case Studies Author's Acknowledgements Context Philosophy Philosophy Culture? What Culture? The Durability of Unix The Case against Learning Unix Culture What Unix Gets Wrong What Unix Gets Right Basics of the Unix Philosophy The Unix Philosophy in One Lesson Applying the Unix Philosophy Attitude Matters Too History History Origins and History of Unix, 1969-1995 Origins and History of the Hackers, 1961-1995 The Open-Source Movement: 1998 and Onward The Lessons of Unix History Contrasts Contrasts The Elements of Operating-System Style Operating-System Comparisons What Goes Around, Comes Around Design Modularity Modularity Encapsulation and Optimal Module Size Compactness and Orthogonality Software Is a Many-Layered Thing Libraries Unix and Object-Oriented Languages Coding for Modularity Textuality Textuality The Importance of Being Textual Data File Metaformats Application Protocol Design Application Protocol Metaformats Transparency Transparency Studying Cases Designing for Transparency and Discoverability Designing for Maintainability Multiprogramming Multiprogramming Separating Complexity Control from Performance Tuning Taxonomy of Unix IPC Methods Problems and Methods to Avoid Process Partitioning at the Design Level Minilanguages Minilanguages Understanding the Taxonomy of Languages Applying Minilanguages Designing Minilanguages Generation Generation Data-Driven Programming Ad-hoc Code Generation Configuration Configuration What Should Be Configurable?

The other (and more important) intention behind “open source” was to present the hacker community's methods to the rest of the world (especially the business mainstream) in a more market-friendly, less confrontational way. In this role, fortunately, it proved an unqualified success — and led to a revival of interest in the Unix tradition from which it sprang. The Lessons of Unix History The largest-scale pattern in the history of Unix is this: when and where Unix has adhered most closely to open-source practices, it has prospered. Attempts to proprietarize it have invariably resulted in stagnation and decline. In retrospect, this should probably have become obvious much sooner than it did. We lost ten years after 1984 learning our lesson, and it would probably serve us very ill to ever again forget it.


pages: 1,202 words: 144,667

The Linux kernel primer: a top-down approach for x86 and PowerPC architectures by Claudia Salzberg Rodriguez, Gordon Fischer, Steven Smolski

Debian, Dennis Ritchie, domain-specific language, en.wikipedia.org, Free Software Foundation, G4S, history of Unix, Ken Thompson, level 1 cache, Multics, recommendation engine, Richard Stallman

It takes you through an overview of the components and features of the kernel and introduces some of the features that make Linux so appealing. To understand the concepts of the Linux kernel, you need to have a basic understanding of its intended purpose. 1.1. History of UNIX We mentioned that Linux is a type of UNIX. Although Linux did not develop directly from an existing UNIX, the fact that it implements common UNIX standards makes the history of UNIX relevant to our discussion. MULTiplexed Information and Computing Service (MULTICS), which is considered the precursor of the UNIX operating systems, came about from a joint venture between MIT, Bell Laboratories, and the General Electric Company (GEC), which was involved in the computer-manufacturing business at that time.

The Linux® Kernel Primer: A Top-Down Approach for x86 and PowerPC Architectures Table of Contents Copyright Prentice Hall: Open Source Software Development Series Foreword Acknowledgments About the Authors Preface Intended Audience Organization of Material Our Approach Conventions Chapter 1. Overview Section 1.1. History of UNIX Section 1.2. Standards and Common Interfaces Section 1.3. Free Software and Open Source Section 1.4. A Quick Survey of Linux Distributions Section 1.5. Kernel Release Information Section 1.6. Linux on Power Section 1.7. What Is an Operating System? Section 1.8. Kernel Organization Section 1.9.

Publisher: Prentice Hall PTR Pub Date: September 21, 2005 ISBN: 0-13-118163-7 Pages: 648


Smart Mobs: The Next Social Revolution by Howard Rheingold

"hyperreality Baudrillard"~20 OR "Baudrillard hyperreality", A Pattern Language, Alvin Toffler, AOL-Time Warner, augmented reality, barriers to entry, battle of ideas, Brewster Kahle, Burning Man, business climate, citizen journalism, computer vision, conceptual framework, creative destruction, Dennis Ritchie, digital divide, disinformation, Douglas Engelbart, Douglas Engelbart, experimental economics, experimental subject, Extropian, Free Software Foundation, Garrett Hardin, Hacker Ethic, Hedy Lamarr / George Antheil, Herman Kahn, history of Unix, hockey-stick growth, Howard Rheingold, invention of the telephone, inventory management, Ivan Sutherland, John Markoff, John von Neumann, Joi Ito, Joseph Schumpeter, Ken Thompson, Kevin Kelly, Lewis Mumford, Metcalfe's law, Metcalfe’s law, more computing power than Apollo, move 37, Multics, New Urbanism, Norbert Wiener, packet switching, PalmPilot, Panopticon Jeremy Bentham, pattern recognition, peer-to-peer, peer-to-peer model, pez dispenser, planetary scale, pre–internet, prisoner's dilemma, radical decentralization, RAND corporation, recommendation engine, Renaissance Technologies, RFID, Richard Stallman, Robert Metcalfe, Robert X Cringely, Ronald Coase, Search for Extraterrestrial Intelligence, seminal paper, SETI@home, sharing economy, Silicon Valley, skunkworks, slashdot, social intelligence, spectrum auction, Steven Levy, Stewart Brand, the Cathedral and the Bazaar, the scientific method, Tragedy of the Commons, transaction costs, ultimatum game, urban planning, web of trust, Whole Earth Review, Yochai Benkler, zero-sum game

William Henry Gates III, “An Open Letter to Hobbyists,” Altair Users’ Newsletter, 3 February 1976. 53. Dennis M. Ritchie, “The Evolution of the Unix Time-Sharing System,” AT&T Bell Laboratories Technical Journal 63 (October 1984): 1577–1593. 54. Nick Moffit, “Nick Moffit’s $7 History of Unix,” <http://crackmonkey.org/unix.html> (29 January 2002). 55. Ritchie, “The Evolution of the Unix Time-Sharing System.” 56. Moffit, “Nick Moffit’s $7 History of Unix.” 57. Richard Stallman, “The Free Software Definition,” The GNU Project, Free Software Foundation, 2000, <http://www.gnu.org/philosophy/free-sw.html> (17 June 2001). 58. Ibid. See also: Michael Stutz, “Freed Software Winning Support, Making Waves,” Wired News, 30 January 1998, <http://www.wired.com/news/technology/0,1282,9966,00.html> (5 February 2002). 59.


pages: 494 words: 142,285

The Future of Ideas: The Fate of the Commons in a Connected World by Lawrence Lessig

AltaVista, Andy Kessler, AOL-Time Warner, barriers to entry, Bill Atkinson, business process, Cass Sunstein, commoditize, computer age, creative destruction, dark matter, decentralized internet, Dennis Ritchie, disintermediation, disruptive innovation, Donald Davies, Erik Brynjolfsson, Free Software Foundation, Garrett Hardin, George Gilder, Hacker Ethic, Hedy Lamarr / George Antheil, history of Unix, Howard Rheingold, Hush-A-Phone, HyperCard, hypertext link, Innovator's Dilemma, invention of hypertext, inventory management, invisible hand, Jean Tirole, Jeff Bezos, John Gilmore, John Perry Barlow, Joseph Schumpeter, Ken Thompson, Kenneth Arrow, Larry Wall, Leonard Kleinrock, linked data, Marc Andreessen, Menlo Park, Mitch Kapor, Network effects, new economy, OSI model, packet switching, peer-to-peer, peer-to-peer model, price mechanism, profit maximization, RAND corporation, rent control, rent-seeking, RFC: Request For Comment, Richard Stallman, Richard Thaler, Robert Bork, Ronald Coase, Search for Extraterrestrial Intelligence, SETI@home, Silicon Valley, smart grid, software patent, spectrum auction, Steve Crocker, Steven Levy, Stewart Brand, systematic bias, Ted Nelson, Telecommunications Act of 1996, the Cathedral and the Bazaar, The Chicago School, tragedy of the anticommons, Tragedy of the Commons, transaction costs, vertical integration, Yochai Benkler, zero-sum game

The success of this would depend upon how well structured the original code was. 2 Ceruzzi, A History of Modern Computing, 108. 3 For a brief history of Unix, see William Shattuck, “The Meaning of UNIX,” in The Unix System Encyclopedia, 2d ed., Yates Ventures, eds. (Palo Alto, Calif.: Yates Ventures, 1985), 89, 93-94; Peter H. Salus, A Quarter Century of UNIX (Reading, Mass.: Addison-Wesley Publishing Company, 1994), 5-61; Ronda Hauben, “The History of UNIX,” available at http://www.dei.isep.ipp.pt/docs/unix.html (last visited June 12, 2001). 4 Robert Young and Wendy Goldman Rohm, Under the Radar: How Red Hat Changed the Software Business—and Took Microsoft by Surprise (Scottsdale, Ariz.: Coriolis Group Books, 1999), 21; Donald K.


pages: 1,201 words: 233,519

Coders at Work by Peter Seibel

Ada Lovelace, Bill Atkinson, bioinformatics, Bletchley Park, Charles Babbage, cloud computing, Compatible Time-Sharing System, Conway's Game of Life, Dennis Ritchie, domain-specific language, don't repeat yourself, Donald Knuth, fallacies of distributed computing, fault tolerance, Fermat's Last Theorem, Firefox, Free Software Foundation, functional programming, George Gilder, glass ceiling, Guido van Rossum, history of Unix, HyperCard, industrial research laboratory, information retrieval, Ken Thompson, L Peter Deutsch, Larry Wall, loose coupling, Marc Andreessen, Menlo Park, Metcalfe's law, Multics, no silver bullet, Perl 6, premature optimization, publish or perish, random walk, revision control, Richard Stallman, rolodex, Ruby on Rails, Saturday Night Live, side project, slashdot, speech recognition, systems thinking, the scientific method, Therac-25, Turing complete, Turing machine, Turing test, type inference, Valgrind, web application

Every time you type to the shell and it creates a new process and runs whatever you typed and when that dies you come back so that you're at arm's length from the thing you're running. Seibel: So those are all things you did take; there's nothing you left behind that you now regret? Thompson: No. Seibel: From what I've read about the history of Unix, it sounds like you used the design process that you described earlier. You thought about it for a while and then your wife and kid went away for a month and you said, “Oh, great—now I can write the code.” Thompson: Yeah.... A group of us sat down and talked about a file system. There were about three or four of us.
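A minimal sketch in C of the mechanism Thompson describes above: the shell forks a new process for each command, the child execs the program, and the parent waits for it to die before prompting again. This is not Thompson's code and not from the interview; it assumes a POSIX system and, for brevity, runs commands without arguments.

    /* Hypothetical toy shell illustrating the fork/exec/wait loop that keeps
       the shell "at arm's length" from the program it runs. Error handling is
       kept to the bare minimum. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        char line[1024];

        for (;;) {
            fputs("$ ", stdout);
            fflush(stdout);                     /* make sure the prompt appears */
            if (fgets(line, sizeof line, stdin) == NULL)
                break;                          /* EOF: leave the shell */
            line[strcspn(line, "\n")] = '\0';   /* strip the trailing newline */
            if (line[0] == '\0')
                continue;

            pid_t pid = fork();
            if (pid == 0) {
                /* Child: become the requested program (no arguments here). */
                execlp(line, line, (char *)NULL);
                perror("exec");
                _exit(127);
            } else if (pid > 0) {
                /* Parent: wait for the child to die, then prompt again. */
                waitpid(pid, NULL, 0);
            } else {
                perror("fork");
            }
        }
        return 0;
    }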

Nothing much new has happened in computers that you couldn't have predicted. The last significant thing, I think, was the Internet, and that was certainly in place in '99. Everything has expanded—the speed of individual computers is still expanding exponentially, but what's different? Seibel: Reading the history of Unix, it seems like you guys basically invented an operating system because you wanted a way to play with this computer. So in order to do what today might be a very basic thing, such as write a game or something on a computer, well, you had to write a whole operating system. You needed to write compilers and build a lot of infrastructure to be able to do anything.


From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry by Martin Campbell-Kelly

Apple II, Apple's 1984 Super Bowl advert, barriers to entry, Bill Gates: Altair 8800, business process, card file, Charles Babbage, computer age, computer vision, continuous integration, Dennis Ritchie, deskilling, Donald Knuth, Gary Kildall, Grace Hopper, history of Unix, hockey-stick growth, independent contractor, industrial research laboratory, information asymmetry, inventory management, John Markoff, John von Neumann, Larry Ellison, linear programming, longitudinal study, machine readable, Menlo Park, Mitch Kapor, Multics, Network effects, popular electronics, proprietary trading, RAND corporation, Robert X Cringely, Ronald Reagan, seminal paper, Silicon Valley, SimCity, software patent, Steve Jobs, Steve Wozniak, Steven Levy, Thomas Kuhn: the structure of scientific revolutions, vertical integration

Among the many examples are Unisys’s EXEC-8 and MCP, DEC’s VMS, and NEC’s MODE IV. Unix and Open Systems Unix is the only non-proprietary operating system of major significance and longevity. It originated in the early 1970s, and it became popular for use with the minicomputers that were then coming on the market. It is now available for all computer platforms. The history of Unix is well documented.29 Like the origins of the Internet (with which it is closely associated), it has entered the folklore of post-mainframe computing. The Unix project was initiated in 1969 by two researchers at Bell Labs, Ken Thompson and Dennis Ritchie. “Unix” was a pun on “MULTICS,” the name of a multi-access time-sharing system then being developed by a consortium of Bell Labs, General Electric, and MIT.


pages: 483 words: 145,225

Rebel Code: Linux and the Open Source Revolution by Glyn Moody

barriers to entry, business logic, commoditize, Compatible Time-Sharing System, Debian, Dennis Ritchie, Donald Knuth, Eben Moglen, Free Software Foundation, ghettoisation, Guido van Rossum, history of Unix, hypertext link, Johann Wolfgang von Goethe, John Gilmore, Ken Thompson, Kickstarter, Larry Ellison, Larry Wall, Marc Andreessen, MITM: man-in-the-middle, Multics, Network effects, new economy, packet switching, RFC: Request For Comment, Richard Stallman, Silicon Valley, skunkworks, slashdot, SoftBank, Steve Ballmer, Steve Jobs, Steven Levy, the Cathedral and the Bazaar, thinkpad, VA Linux

Token gesture or not, the sources came with surprisingly full release notes–some 1,800 words. These emphasized that “this version is meant mostly for reading”–hacker’s fare, that is, building on a tradition of code legibility that had largely begun with the creation of C, as Dennis Ritchie had noted in his 1979 history of Unix. And highly readable the code is, too. As well as being well laid out with ample use of space and indentation to delineate the underlying structure, it is also fully commented, something that many otherwise fine hackers omit. Some of the annotations are remarkably chirpy, and they convey well the growing excitement that Linus obviously felt as the kernel began to take shape beneath his fingers: This is GOOD CODE!


UNIX® Network Programming, Volume 1: The Sockets Networking API, 3rd Edition by W. Richard Stevens, Bill Fenner, Andrew M. Rudoff

Dennis Ritchie, exponential backoff, failed state, fudge factor, global macro, history of Unix, information retrieval, OpenAI, OSI model, p-value, RFC: Request For Comment, Richard Stallman, UUNET, web application

Unification of Standards The above brief backgrounds on POSIX and The Open Group both continue with The Austin Group’s publication of The Single Unix Specification Version 3, as mentioned at the beginning of this section. Getting over 50 companies to agree on a single standard is certainly a landmark in the history of Unix. Most Unix systems today conform to some version of POSIX.1 and POSIX.2; many comply with The Single Unix Specification Version 3. Historically, most Unix systems show either a Berkeley heritage or a System V heritage, but these differences are slowly disappearing as most vendors adopt the standards.

Our wrapper functions all begin with a capital letter. The Single Unix Specification Version 3, known by several other names and called simply The POSIX Specification by us, is the confluence of two long-running standards efforts, finally drawn together by The Austin Group. Readers interested in the history of Unix networking should consult [Salus 1994] for a description of Unix history, and [Salus 1995] for the history of TCP/IP and the Internet. Exercises 1.1 Go through the steps at the end of Section 1.9 to discover information about your network topology. 1.2 Obtain the source code for the examples in this text (see the Preface).
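As a hedged illustration of the conformance claims mentioned above (this is not an example from the book), a program can ask which POSIX / Single Unix Specification level the host claims to support, both at compile time through the standard _POSIX_VERSION and _XOPEN_VERSION macros and at run time through sysconf():

    /* Sketch: report the POSIX / SUS level this system claims to support.
       Assumes a POSIX system with <unistd.h>. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Compile-time claims from the system headers. */
    #ifdef _POSIX_VERSION
        printf("_POSIX_VERSION (headers): %ld\n", (long)_POSIX_VERSION);
    #endif
    #ifdef _XOPEN_VERSION
        printf("_XOPEN_VERSION (headers): %d\n", _XOPEN_VERSION);  /* 600 = SUSv3 */
    #endif

        /* Run-time claims from the C library and kernel. */
        printf("sysconf(_SC_VERSION): %ld\n", sysconf(_SC_VERSION));
    #ifdef _SC_XOPEN_VERSION
        printf("sysconf(_SC_XOPEN_VERSION): %ld\n", sysconf(_SC_XOPEN_VERSION));
    #endif
        return 0;
    }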


pages: 496 words: 174,084

Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden

Benevolent Dictator For Life (BDFL), business intelligence, business logic, business process, cellular automata, cloud computing, cognitive load, commoditize, complexity theory, conceptual framework, continuous integration, data acquisition, Dennis Ritchie, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, functional programming, general-purpose programming language, Guido van Rossum, higher-order functions, history of Unix, HyperCard, industrial research laboratory, information retrieval, information security, iterative process, Ivan Sutherland, John von Neumann, Ken Thompson, Larry Ellison, Larry Wall, linear programming, loose coupling, machine readable, machine translation, Mars Rover, millennium bug, Multics, NP-complete, Paul Graham, performance metric, Perl 6, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Ruby on Rails, Sapir-Whorf hypothesis, seminal paper, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, SQL injection, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application

Brian, Peter, and I had certain classes of application programs we wanted to write, but we wanted to write them with really short programs. Did the presence of tools and the rapidity of practical feedback push people to research better tools and better algorithms? Al: If you look at the early history of Unix and my early research career, I was very strongly motivated by Knuth’s statement that the best theory is motivated by practice, and the best practice by theory. I wrote dozens of papers looking at how to make parsing more efficient and being able to parse constructs that appear in real programming languages in a convenient and efficient way.