The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of "smart" hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broader and more useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization--the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by "Moore's Law"; and the human-machine interface. Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word "digital" in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general purpose computer; and ARPANET, the Internet's precursor. Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a "minicomputer" to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
Paul E. Ceruzzi is Curator at the National Air and Space Museum at the Smithsonian Institution. He is the author of Computing: A Concise History, A History of Modern Computing, and Internet Alley: High Technology in Tysons Corner, 1945–2005, all published by the MIT Press, and other books.
Nothing against this book. It certainly delivers the CONCISE history of computing promised by the title (~150 pages). The writing is clear and skims over a lot of details rapidly. And it's largely trivia and facts -- names, places, and dates -- and not much of the theory or mathematical basis of computing (although the author gallantly attempts to synthesize some of the history, bookending the book with an analogy to Zeno's Paradox: the pace of change in computing makes it impossible to fully synthesize). And therein lies the problem: it is concise, but almost shockingly so. For instance, the transistor, arguably the single most important invention of the 20th century and the technological basis of modern computing electronics, is first introduced in passing and then, 20-30 pages later, given about one or two pages of discussion. Ah! But I know I shouldn't be so upset about that; this is a concise history, after all.
A boring book in the beginning (I even gave up on it once), but an inspiring one toward the end, where more familiar companies come into focus.
The book does what it's supposed to do: it offers you a concise but decent account of the history of computers and computing, starting from the Second World War up until the emergence of Facebook and Twitter.
We had to read this for a random IAP class @MIT and it delivers precisely what the title promises: a brisk and accessible overview of the milestones in computing history, from the early punched-card systems to the transformative advent of microprocessors.
The book races through key developments with clarity but little depth, making it good for the quick pre-reading intro it was intended as, rather than an exhaustive analysis. Unsurprisingly for an MIT Press publication, the book's emphasis on MIT's pivotal role in computing -- contributions from the Lincoln Laboratory, breakthroughs in transistor and mainframe design -- raises the question of institutional bias, though given the undeniable impact of MIT-affiliated innovations, the prominence of its contributions may reflect historical fact more than partiality.
Overall, it reads like a quick pamphlet that captures the sweep of computing history with an efficiency that mirrors its subject matter, though it isn't really meant to stand alone as a complete account, and you'd need to seek further reading for a deeper dive into specific events or a less institutional focus.
British robots fought Nazi robots during the Battle of Britain.
Well, depending on your definition of robot. The Nazi "Buzz Bombs" were arguably robotic in nature, using a simple guidance system, and so were the automated guns we made to shoot them down. By wrapping wire around a camshaft we could program the gun to follow a certain path, and update it on the fly using data from our radar system. The guns were pretty much unmanned. We just programmed them and let 'em go.
I was looking for an account of the decades when computing started to change the world, the 1940s - '70s. I was especially curious about the '50s - '60s, when computers were still behind the scenes to most people but were already advanced enough to fly rockets to the moon. Ceruzzi did a good job of tracking the breakthroughs and trends in computer hardware, software, and networking. Some of that history is inherently a little technical and dry; you can't blame Ceruzzi for that. But contrary to what some reviewers implied, this was certainly not a monograph of computer languages and model numbers. One of the book's main themes was the importance of ergonomics, the machine / human interface. He also concluded with the interesting observation that today's technology is the collision of top-down military design and bottom-up amateur tinkering.
I previewed other recent books on the subject, namely this author's "History of Modern Computing" and Isaacson's "The Innovators". They both seemed to be high-quality books too, though of the three this title was the most concise and least expensive. True, the socio-cultural references were more cursory in a book of this size. There were hints of hippie culture and the cold war, without much in-depth discussion. I would recommend this title if you are looking for a brief overview or if you would like to supplement it with other reading (the Computer History Museum's timeline is actually very substantive and a good supplement). It will not be an epic history like some other books on the subject. I am still seeking a little more detail about how businesses actually used their computers to improve operations in the mid-century before PCs brought them out into the open.
I was concerned upon starting this book that the Author would neglect the early attempts at inventing calculating machines that were the real roots of the Computers we know. I was happy to find out that he does refer to these devices, successful and unsuccessful though they were, in looking backward in his early chapters.
Since my career in Computing began with the IBM 360 Series in 1969, I found his coverage of Computing’s History since WWII very accurate. It was useful to me as a trip down Memory Lane, but I would recommend it to a General Public whose World is dominated by Amazon, Google and TikTok, who might benefit from remembering the role of AOL, Blackberry, and The Source.
Highly readable and thorough, I'd give this work Four Stars. ****
I saw this book at the library in the new book section and decided to borrow it. I have worked in computing all my working life and thought this could be interesting. So far, it is a light introduction.
----
Finished reading this book and felt very underwhelmed with the presentation of the material. Not worth reading - back it goes to the library.
This book moved concisely through the history of computing. It's written in a "plain English" style that gives even a layperson like me a good grasp of how the modern Information Age was birthed, and the basic principles by which it will mature in years to come.
- George Stibitz, a Bell Telephone Labs mathematician, disliked the term “electrical pulses” when discussing anti-aircraft gun machines in 1942, so he suggested another term: digital.
- The true mechanization of calculation began when inventors devised ways not only to record numbers but to add them, in particular the capability to carry a digit from one column to the next (999+1; see the first sketch after these notes). This began with Pascal’s adding machine of 1642, or with a device by Wilhelm Schickard in 1623. Leibniz extended Pascal’s invention to be able to multiply as well. The Felt Comptometer, invented in the 1880s, was one of the first commercially successful calculators.
- Charles Babbage proposed in the 1830s the use of punched cards for his Analytical Engine, an idea he borrowed from the Jacquard loom, which used punched cards to control the weaving of cloth by selectively lifting threads according to a predetermined pattern. Herman Hollerith used this technique to develop a machine for the 1890 US Census. The crucial difference between Jacquard’s and Hollerith’s systems: Jacquard used cards for control, whereas Hollerith used them for storage of data. Eventually IBM’s punched card installations would also use the cards for control. Before WWII, the control function in a punched card installation was carried out manually by people. The concept of automatic control is the ancestor of what we now call software.
- (1) control, (2) storage, (3) calculation, (4) the use of electric or electronic circuits: these attributes, when combined, make a computer. The fifth attribute, (5) communication, was lacking in the early computers built in the 1930s and 1940s. It was ARPA’s project in the 1960s that reoriented the digital computer to be a device that was inherently networked.
- In 1936, Alan Turing proposed the Turing machine, and in 1937, Konrad Zuse (unaware of Turing’s paper) created the first universal machine. Babbage never got that far, but he did anticipate the notion of the universality of a programmable machine. John von Neumann conceived the stored-program principle, in which physical memory stores both the instructions (program) and the data without any distinction (see the second sketch after these notes). All of the machines so far used electricity to carry signals, but none used electronic devices to do the actual calculations. In 1938, J. V. Atanasoff did so, using vacuum tubes to solve linear algebra equations (and nothing else), the first machine of its kind. In doing so, he said he considered “analogue” techniques but discarded them (likely the origin of the term analog in computing). Then came the Colossus in 1944 and the ENIAC in 1946 (the first reprogrammable computer, though it took days to reprogram it).
- The 1950s and 1960s were the decades of the mainframe, so called because of the large metal frames on which the computer circuits were mounted. The UNIVAC inaugurated the era, but IBM quickly came to dominate the industry. In the 1950s, mainframes used thousands of vacuum tubes; by 1960 these were replaced by transistors, which consumed less power and were more compact. A person skilled in mathematical analysis would lay out the steps needed to solve a problem, translate those steps into a code that was intrinsic to the machine’s design, punch those codes into paper tape or its equivalent, and run the machine. Through the 1950s, a special kind of program, a compiler, pioneered by Grace Hopper, who led the UNIVAC team, would take human-friendly instructions and translate them into machine code.
- Programming computers remained in the hands of a few specialists who were comfortable with machine code. The breakthrough came in 1957 with Fortran, introduced by IBM, which had a syntax close to ordinary algebra and was therefore familiar to engineers. The Fortran compiler generated machine code that was as efficient and fast as code that human beings wrote (it was around this time that these codes came to be called “languages”). Its major competitor was COBOL, in part because the US DoD adopted it officially; as a result, COBOL became one of the first languages to be standardized to the point where the same program could run on different computers, from different vendors, and produce the same results. As computers took on work of greater complexity, another type of program emerged, around 1956, to replace the human operators who managed the flow of work in a punched card installation: the operating system.
- Vacuum tubes worked through the switching of electrons excited by a hot filament, moving at high speeds within a vacuum. They were very fast but cumbersome, since tubes tended to burn out (like lightbulbs) AND required a hot filament. A new invention from Bell Labs in 1947 (using high-purity germanium produced at Purdue, my alma mater) made it possible to move past those two bottlenecks: the transistor. Manufacturers began selling transistorized computers in the mid-1950s. The chip was a breakthrough of the early 1960s because it provided a way of placing multiple transistors and other devices on a single wafer of silicon or germanium. As chip density increased, the functions a chip performed became more specialized, and the likelihood that a particular logic chip would find common use among a wide variety of customers got smaller and smaller. In this context, IBM started using the term architecture to refer to the overall design of a computer.
- Those computers lacked communication capabilities. The first computers of the ARPANET were linked to one another in 1969, and by 1971 there were 15 computers linked to it. However, as things progressed, there were multiple networks of computers independent of each other (many “internets”). ARPANET was the first, for research and scientific purposes, but there were also AOL, Usenet, BITNET, and many others with different purposes and uses. The main difference between the modern Internet and ARPANET is the social and cultural dimension that we have today. Things changed when the National Science Foundation (NSF) made 3 key decisions in 1986: (1) create a general-purpose network, available to researchers in general; (2) adopt the TCP/IP protocol promulgated by ARPA; (3) fund the construction of a high-speed backbone to connect supercomputer centers and local/regional networks. One of the main recipients of these construction contracts was MCI (today a subsidiary of Verizon), which to this day is the principal carrier of Internet backbone traffic. Soon BITNET and Usenet established connections to this major network, along with other domestic and international networks. The original ARPANET became obsolete and was decommissioned in 1990. By 1995, all Internet backbone services were operated by commercial entities.
- In milestone legislation of 1992, “the foundation is authorized to foster and support access by the research and education communities to computer networks which may be used substantially for purposes in addition to research and education in the sciences and engineering.” With those three words, “in addition to,” the modern Internet was born and the NSF’s role in it receded.
- Then Tim Berners-Lee invented the World Wide Web at CERN. It made the Internet more accessible. It had 3 main components: the URL, HTTP, and HTML. He wrote a program, called a browser, that users installed on their computers to display information transmitted over the Web. And it spread; a group created Mosaic, a browser with rich graphics that was integrated with the mouse.
- Another big invention was the specialized search website, since up to that point the web had been indexed manually and poorly. AltaVista was the first big one, but it was displaced by Google and its higher-quality algorithm.
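A minimal sketch of the carry mechanism mentioned in the Pascal/Schickard note above -- my own Python illustration, not an example from the book. Numbers are held as lists of decimal digits, least significant digit first, roughly the way a column of dials holds them:

```python
# Hypothetical illustration of mechanical carrying, done in software.
# 999 is represented as [9, 9, 9] (least significant digit first).

def add_with_carry(a, b):
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        da = a[i] if i < len(a) else 0
        db = b[i] if i < len(b) else 0
        total = da + db + carry
        result.append(total % 10)   # digit that stays in this column
        carry = total // 10         # digit carried into the next column
    if carry:
        result.append(carry)        # a new column appears, as in 999 + 1
    return result

print(add_with_carry([9, 9, 9], [1]))   # [0, 0, 0, 1], i.e. 1000
```

Running it on 999 + 1 shows the carry rippling through every column, which is exactly the case the note singles out as the hard part to mechanize.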
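And a second sketch, also my own rather than the book's, for the von Neumann note: a toy stored-program machine whose single memory array holds both the instructions and the data they operate on. The opcodes and addresses are invented purely for illustration.

```python
# Hypothetical toy machine: program and data share one memory array.
LOAD, ADD, STORE, HALT = "LOAD", "ADD", "STORE", "HALT"

memory = [
    (LOAD, 8),    # 0: acc <- memory[8]
    (ADD, 9),     # 1: acc <- acc + memory[9]
    (STORE, 10),  # 2: memory[10] <- acc
    (HALT, 0),    # 3: stop
    None, None, None, None,   # 4-7: unused
    2, 3,         # 8-9: data operands
    0,            # 10: result goes here
]

acc, pc = 0, 0                 # accumulator and program counter
while True:
    op, addr = memory[pc]      # fetch: instructions come out of the same memory
    pc += 1
    if op == LOAD:
        acc = memory[addr]
    elif op == ADD:
        acc += memory[addr]
    elif op == STORE:
        memory[addr] = acc
    elif op == HALT:
        break

print(memory[10])   # 5 -- program and data lived in one undifferentiated store
```

The only point is that nothing distinguishes a program word from a data word except how the machine happens to use it, which is the distinction-free storage the note describes.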
This book definitely delivers on the "concise" part of the title, coming in at a little over 150 small pages, but concision means making choices about what to exclude, and in this case that's most of the technical detail. Ceruzzi's thesis links computing to four major concepts: the digital representation of characters and logic as binary 0s and 1s; the combination of data and control in a unified representation; communication between computers; and the increasing fluidity of the user interface.
Proto-computers, calculating machines of one sort or another, had been around for ages, using punched cards to tabulate census data, automate business accounting, or help solve scientific problems. The Second World War revealed an astounding number of scientific problems that needed hefty numerical overhead, from code breaking to ballistic tables to the atomic bomb, and visionary engineers began thinking of a single flexible machine that could solve all of them.
Computing did not immediately catch on after the war. The early ENIACs and similar machines were science projects that ran on thousands of balky vacuum tubes, with uptimes measured in hours. Even as solid-state transistors replaced vacuum tubes, computers were still massively expensive installations tended by elite operators, with users submitting programs on stacks of punch cards to be run in batches. The DEC PDP-1 minicomputer was the first break from this model, a machine cheap enough ($120k in 1960, about $1 million today) that regular people at labs could use it.
The 1960s and 1970s saw mechanical input via punched cards relegated in favor of much faster electronic inputs like tape decks, experiments in networking on ARPANET, and the golden bullet, the integrated circuit, which put a computer on a chip rather than painstakingly assembled components on boards. Integrated circuits led to the personal computing revolution of the 1980s and the triumph of the mobile internet that we know today.
As a first pass, this book is fine, and I look forward to delving into the sources, but I think I need a complete history, rather than a concise one.
Computing: A Concise History offers an overview of the development of computing, from the early mechanical devices to modern digital systems. While the book succeeds in presenting a broad picture of the history of computing, it struggles with some structural issues that may leave readers feeling disoriented. The plot of technological advancement jumps back and forth in time, making it difficult to follow the chronological flow of events. The frequent shifts in timeline can confuse readers, especially those unfamiliar with the subject, as they try to piece together how different developments are connected over time.
Another downside of the book is its somewhat convoluted explanation of the mechanisms behind the various computing tools. The author often dives into technical details that may overwhelm a casual reader, without offering clear, step-by-step explanations. For those looking to gain a deep understanding of how these early computing machines worked, the book may not provide enough clarity. The technical jargon and complex descriptions may require more background knowledge to fully grasp, which could hinder its accessibility for beginners.
That being said, Computing: A Concise History is still a valuable resource for those seeking a quick overview of the entire history of computing. It serves as a great "flash reading" guide, offering a broad look at key milestones and developments in the field. For readers looking for a more in-depth understanding of the inner workings of computing tools, however, this book may fall short, as it focuses more on the timeline and significant events rather than the mechanisms behind the technology.
Ceruzzi’s concise account is too swift to be legible. The book is crammed solid with names and inventions, details that would be worthwhile in a larger work but just muddy things in a story this terse. He is good at showing his 4 themes -- the digital paradigm, the convergence of different technologies, the advances in solid-state electronics, and the relationship between humans and machines -- as recurrent patterns in the history of computing (x-xvi). Yet his narrative is too wishy-washy and has a hard time explaining causality, especially for the personal computer revolution.
In general, Ceruzzi’s story suggests a tripartite framework: 1) in the beginning, down to the thirties, computation (calculation, data processing) was a business application; 2) in the mid-20th century it was developed for military and scientific uses; and 3) finally it was adopted by hobbyists and consumers, sparking the current boom in consumer electronics.
Ceruzzi’s work is a decade old and so I will not begrudge too much the lack of introspection on the impacts of technology on us, but it seems to me a history of computing which doesn’t mention the ways that computers, the web, and social media in particular change our behaviors is woefully outdated.
"Computing: A Concise History" stays true to its name by providing a concise history of computer from the digital age to the time of Google, Facebook, and Twitter. Ceruzzi lays out the history of computing around four major themes, i.e., the digital paradigm, convergence, solid-state electronics, and the human-machine interface, and he does a great job at providing evidence for them.
Despite being a relatively young field, computing has a wealth of knowledge and Ceruzzi does a great job at balancing knowledge and narrative without being too pedantic or shallow.
Unrelated to the book's prose, but I found a few copyediting errors, for example, "was soon be obsolete", "Sperry-UINIVAC", "instead. whenever". The book also has a few black display pages that show an excerpt; however, these disrupted the reading at times. For example, page 143 ends with "one of the", pages 144 and 145 are black pages, and page 146 starts with "it", so something happened to the sentence that begins on page 143.
The book has an extensive "Further Reading" list so the interested reader looking for more details can dive into those resources.
I found this book to be quite interesting. While some of the technical talk went a bit over my head at times, there was much I didn’t know about the development of technology. I found it fascinating that the first computer was inspired by a revolutionary fabric loom from the 19th century. Nor did I know that IBM was that old a company. Also, that icons and using a mouse were innovations to come from Xerox Labs - I typically associate Xerox with copy machines. So many advances were funded because of World War II and the military wanting better, smarter, faster weapons. The overall message the author has for his readers is that, all too often, we take technology for granted. He likens the complexity of computers to that of a biological living thing, though he does point out to his readers that he recognizes computers are not living things. What we see is the result of “a complex network of servers, routers, databases, fibre optic cables, satellites, mass storage arrays, and sophisticated software.”
The essential or interesting points could have been condensed into a 20-page journal article. Apart from that, it's a rapidly rotating cast of projects, names, companies, and computer models.
That being said, it does the following effectively: a) tell you what a computer *actually* is; b) contextualise the development of the computer within a wider social / technological background; and c) make a compelling case for the 4 fundamental dynamics driving the history of the computer (the digital paradigm, electronic engineering, the convergence of communications, calculation, data storage and control, and the human-machine interface).
I found the human-machine interface parts the most interesting, as they relate most closely to my work. Also, it's interesting to wonder whether, once you cut through the AI bs, the last 20 years of the computer's history have actually been comparatively static. Who can say.
💻Paul E. Ceruzzi’s Computing: A Concise History traces the intellectual and technological evolution of computing from its roots in mathematics and logic to its central role in the modern digital age.
Spanning from the 19th century innovations of Charles Babbage and Ada Lovelace, through Alan Turing’s theoretical machines, to the rise of modern software, algorithms, and networks, the book explains not just how computers work but why they matter. Ceruzzi highlights key breakthroughs, such as the development of programming languages, the advent of personal computing, and the internet’s transformative power.
Rather than focusing solely on hardware or engineering, this book explores the ideas behind computing—logic, abstraction, data structures, and the social impact of information technology. It shows how computer science emerged as a distinct discipline, bridging theory and practice.
Key Themes: ➡️Computing as a fusion of logic, math, and engineering ➡️The shift from machine code to high-level programming ➡️The rise of AI, networks, and cybersecurity concerns ➡️How computer science reshaped society and knowledge itself
This book is perfect for anyone seeking to understand the foundations and future of a field that defines the modern world.
Good book that's pretty well organized. I took a cpsc course that covered much more of the technical side, but this book does a good job of covering the political, social, and economic reasons for many of the changes along with some technical detail. Overall I think the book is quite accessible to most, but keeping all the acronyms in mind is quite difficult (good thing there's a glossary).
The author does a good job of keeping the content interesting, although at the end of the day some of it is just somewhat dry. Paul also did a good job of including some important background in the introduction to put the whole book in perspective nicely. The only real improvement I could see is a preface or appendix with a timeline to keep everything in order, as there is a bit of jumping between chapters.
The title is computing, but the content is mostly networking. Very dry reading and not concise. Foggy. Snippets from research abound. Leaves out the dynamic minicomputer and workstation era, with very little reported about significant companies like Sun and DEC.
Starts with tube computers and IBM and then jumps to the internet. A mishmash of information, bits and pieces thrown in without continuity, and not well written. This is not a book that covers the true history of computers.
Go read something interesting like "Accidental Empires" by Robert X. Cringely or Canion's "Open" about PCs for a good story.
I worked in the computer industry from 1962 to the present day and this book does not do that timeframe justice.
Ceruzzi seems to apologize right from the beginning for the lack of continuity that will be displayed throughout this small volume. He constantly doubles back to explain another line of development that will become important later on, thoroughly confusing the timeline being built up in the reader’s mind. While one can empathize with the unwieldy strands that need to be woven into one story, that is the unsurprising lot of the writer of history. Rather than attempt to cover (poorly) the multiple lines of development that have influenced computing, Ceruzzi would have done better to prioritize a clearer history of a main line of development.
A great read on computing history and the important engineers behind it. Reading this helped me appreciate our modern technology more, and gave me a broader perspective on how it may develop in the future. Who was involved in creating computers? What were their capabilities and limitations? How did personal computing become popular? I'd recommend this book to anyone curious about those sorts of questions.
This book was great! It gives you a well-paced and extensive set of access points to computer history. Some of it I didn't understand, but that is mainly a gap in my own knowledge about tech and networks, not the book's fault. For a more or less purely informational book, I flew through this! It does use the terms "ad hoc" and "dovetail" about 4 times each throughout, but that's the biggest "complaint" I got xD
Simply stated, this is a must-read for any student of computer history. Its title is accurate in stating that it is a “Concise History” of computing. The book details computing from its earliest days up through its current state. It is detailed and provides lots of information critical to the development of computers and the sharing of data. The author utilized many key resources in telling this tale, which makes the topic even more interesting.
It's unfair to hold this book's concision against it, given the title, but this isn't the book to really drill down one level deeper than what you might pick up with a bit of intentional reading of news about computing developments. Perfectly serviceable, however, and some of the highlighting of concepts in computing history - convergence, digital vs analog, etc. - was helpful in crystallizing vague notions into useful insight.
For me, this book is what one can expect from a history book: a lot of names, dates, and technical concepts. The way Ceruzzi presents the events is a good approach, going from the start of the digital-versus-analog concept to the World Wide Web and social media.
For being a small book, I think it is a pretty good summary, but the reader should take notes to avoid getting lost in everything that is mentioned throughout the chapters.
This is a compact, insightful dive into how computing evolved, from punch cards to smartphones. Paul Ceruzzi does a great job of hitting the major milestones without overwhelming you with too much technical detail. It’s perfect if you want a big-picture view of how we got from early machines to the digital world we live in today. The writing is clear and to the point, and while it’s more informative than entertaining, it’s a solid intro for anyone curious about the history of tech.
Exactly as advertised, concise! A lot of prominent folks are given just a few lines of real estate in this book, which does a great job of avoiding too many details in favor of giving the reader a quick overview of computing history. The author has provided a "Further Readings" section at the end with more in-depth books on topics or people that you may want to further explore.
The book does the job. Just as promised, it gives a concise but solid overview of computing history, highlighting key events and explaining foundational terms. Depending on the reader's background, it is either a good intro or a good refresher. Worth a read for those who like to understand things more deeply than is essential for day-to-day operations.
As the title clearly states, it was a concise and clear journey of how we got here (as of a few years ago). It left me wanting more details and facets. Overall, it was a nice exposition of the rapid progress we experienced in the second half of the last century. A nice perspective for today's rapid progression :)
A decent high-level description of the history of computers. The author touched on a lot of the reasons why the development of computers has accelerated over the last couple of decades but neglected to bring up the topic of pornography, which was a major driver.