The ultimate non-technical guide to the fast-developing world of quantum computing
Computer technology has improved exponentially over the last 50 years. But the headroom for bigger and better electronic solutions is running out. Our best hope is to engage the power of quantum physics.
‘Quantum algorithms’ had already been written long before the hardware was built. These would enable, for example, a quantum computer to dramatically speed up an information search, or to crack the mathematical trick behind internet security. However, making a quantum computer is incredibly difficult. Despite hundreds of laboratories around the world working on them, we are only just seeing them come close to ‘supremacy’, where they can outperform a traditional computer.
In this approachable introduction, Brian Clegg explains algorithms and their quantum counterparts, explores the physical building blocks and quantum weirdness necessary to make a quantum computer, and uncovers the capabilities of the current generation of machines.
Brian's latest books, Ten Billion Tomorrows and How Many Moons Does the Earth Have?, are now available to pre-order. He has written a range of other science titles, including the bestselling Inflight Science, The God Effect, Before the Big Bang, A Brief History of Infinity, Build Your Own Time Machine and Dice World.
Along with appearances at the Royal Institution in London, he has spoken at venues from Oxford and Cambridge Universities to the Cheltenham Festival of Science, has contributed to radio and TV programmes, and is a popular speaker at schools. Brian is also editor of the successful www.popularscience.co.uk book review site and is a Fellow of the Royal Society of Arts.
Brian has Master's degrees from Cambridge University in Natural Sciences and from Lancaster University in Operational Research, a discipline originally developed during the Second World War to apply the power of mathematics to warfare. It has since been widely applied to problem solving and decision making in business.
Brian has also written regular columns, features and reviews for numerous publications, including Nature, The Guardian, PC Week, Computer Weekly, Personal Computer World, The Observer, Innovative Leader, Professional Manager, BBC History, Good Housekeeping and House Beautiful. His books have been translated into many languages, including German, Spanish, Portuguese, Chinese, Japanese, Polish, Turkish, Norwegian, Thai and even Indonesian.
Maybe another decade of development is required before mass adoption, and a full-on quantum computer that is vastly superior to conventional computers in everything might never arrive, but it is interesting to learn about a new technology. There is plenty more to come.
I expected a bit more on the actual technology itself, but overall an interesting and down-to-earth read on a topic I only knew the basics about. Bell Labs developing the programming language C, the first transistor: Brian Clegg starts us off from just after the Second World War (with a detour to Ada Lovelace, Babbage and the Difference Engine, and the observation that, without computers, the 1880 US Census took eight years to tabulate, providing the impetus for punched cards and, eventually, IBM) before we get to the main topic of Quantum Computing: The Transformative Technology of the Qubit Revolution.
Google's index is around 100,000 terabytes, roughly 10^17 bytes (a 1 with 17 zeroes), a bit being a binary digit. Quanta, the smallest parts into which something can be divided, form the new frontier now that features on microchips are approaching the size of atoms. This brings the strangeness of the quantum realm more and more into view: the more accurately you know the position of a quantum particle, the more uncertain its momentum becomes, and very small things don't have definite locations, only probabilities of where they are.
Superposition: only through observation does a particle take on a definite state. This offers the opportunity of having not binary bits but quantum bits that can be in both states at once until collapsing into an outcome. This property would be immensely valuable for data that is probabilistic in nature, with uncertainty inherent to the outcomes. A quantum algorithm will usually need to be run many times for us to have confidence in its outcome. Estimates are that around 90% of current quantum computing capacity is dedicated to error correction, countering the decoherence that occurs when qubits interact with other particles and lose their superposition.
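To make that concrete, here is a minimal sketch (my own illustration, not from the book) of why the output is probabilistic and why repeated runs are needed: a qubit in an equal superposition collapses to 0 or 1 at random on each measurement, so only many "shots" reveal the underlying probabilities.

```python
import random
from collections import Counter

def measure_equal_superposition() -> int:
    """Simulate one measurement of the state (|0> + |1>)/sqrt(2):
    each outcome occurs with probability 1/2."""
    return 0 if random.random() < 0.5 else 1

# Repeating the "run" many times is how we gain confidence
# in a probabilistic result.
shots = 1000
counts = Counter(measure_equal_superposition() for _ in range(shots))
print(counts)  # e.g. Counter({0: 503, 1: 497}) -- close to 50/50
```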
Quantum computers will be specialised, and vastly superior, in very narrow fields of application compared with present-day computers. The robustness of a conventional computer can hardly be achieved, and the hardware needs to be cooled to near zero kelvin, which makes cloud delivery of quantum computing power, from server farms chilled to almost absolute zero, far more likely than a PC delivery model.
Rapidly scaling up the production of entangled ion pairs is now the holy grail, and there is plenty more to come. Maybe another decade of development is required before mass adoption, and a full-on quantum computer that is vastly superior to conventional computers in everything might never arrive, but this book definitely helps in understanding the hype and the basic mechanics driving the field.
This is a book for those who want to get a taste of quantum computing without diving into the details using linear algebra, which is the language of quantum computing. The author starts out with a third of the book covering conventional computing, then proceeds to cover the strange behavior of quantum particles ... superposition and entanglement. The last half of the book is the key to how quantum computing can be deployed ... quantum algorithms and quantum hardware.
For quantum algorithms, Shor's algorithm (cryptanalysis) and Grover's algorithm (searching data) are used as examples ... but without going into the details of the quantum gates and circuits behind them. For quantum hardware, the considerations covered include qubits, quantum gates (e.g. the Hadamard gate), error correction and quantum teleportation.
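For readers wondering what a quantum gate looks like numerically, here is a small sketch (my own, using the standard textbook definition rather than anything taken from the book) of the Hadamard gate: a 2x2 matrix that turns |0> into an equal superposition and undoes itself when applied twice.

```python
import numpy as np

# Hadamard gate as a 2x2 matrix (standard definition).
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the state |0>

print(H @ ket0)      # [0.7071, 0.7071] -- equal superposition of |0> and |1>
print(H @ H @ ket0)  # ~[1, 0] -- applying H twice returns the original state
```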
The last chapter reminds us that quantum computing is still at an infant stage and may take a decade or two to mature ... similar to artificial intelligence, which was initially much hyped in the 1970s, then lay dormant during the AI winter, and was only revived in the 1990s.
A well written book that brings the reader through the basics of quantum physics and computing, eventually marries the two concepts and then goes on to talk about what quantum computing looks like today and might look like going forward.
I was able to follow most of the theoretical math and computing parts of the book even though I have a very patchy and elementary education in these areas.
I would recommend this book to anyone who is looking to get up to date with the state of quantum computing in today's age and who already has a basic knowledge of quantum physics and computing theory concepts.
Excellent. One of those popular science books I went into knowing little and emerging with almost a sense of revelation! I’ve read a few popular articles on quantum computing in the past, mostly magazine/newspaper articles and never really got the point, despite having a science background. Now, for the first time thanks to this book, I think I understand the potential power of the Qubit, the basic unit of computing with a quantum computer, and why researchers keenly pursue the dream of a working device which uses them.
Maybe less than half of the book, the latter half, is about quantum computers as such, their development and likely architectures. The early sections concentrate on revisiting basic computing concepts (logic gates, the basic designs of conventional computers) and some relevant quantum physics concepts (entanglement in particular). In this instance, I didn't think this was ‘padding’ at all, but very worthwhile: partly so that buzzwords like entanglement and related quantum concepts don't surprise you too much when we get into the core of the book, and partly so that you can understand the similarities and differences between conventional and quantum computers, e.g. logic gates are critical to both. Maybe starting the review of conventional computing with 19th-century machines, Babbage and Lovelace, was a bit unnecessarily comprehensive.
I learnt that, despite the effort being put into quantum computer development, their successful use in solving problems beyond conventional computers isn't a ‘done deal’. There are many potential ways qubits can be constructed and, critically, made stable at least to some usable, transient degree. Quantum computers are never likely to replace our laptops or desktops, being more likely a ‘cloud’ option. Quantum computers usually have to be designed around the class of problems requiring a solution. The output is likely to be probabilistic, requiring multiple runs to use. And there are plenty of other caveats, ‘ifs’ and ‘buts’ associated with them. The book also gives a sense that progress is being made, looks at current working models, and suggests unexplored avenues may be uncovered that allow working devices to become easier to make and use.
Entry requirements for the reader? I come from a science and engineering career, and I've written computer code for solving practical problems, so nothing in the introductory background was new to me, though once one gets onto how qubits are generated and used, about halfway through, I needed to concentrate - it's new territory. I suspect anyone who has dabbled in computing, come across a little quantum theory, or is at least not intimidated by modern physics, or is an occasional reader of the physical popular sciences, will find this a straightforward, informative read. Less than 200 pages, and that includes a modest-sized reference section.
One of those rare popular science books where I came away satisfyingly informed about something I started with very little knowledge of.
I have to admit it: I started reading about quantum computing out of spite for Michio Kaku; I wanted to learn more about it so I could diss his book, lol. Okay, so overall I think this book is good for laypeople or people who are interested in the idea of quantum computing but have no professional knowledge about it whatsoever. This book is easy to read and Clegg successfully delivers the gist of what quantum computing is. However, this is a relatively thin book with accessible vocabulary for the layperson, so I don't really think it's suitable if you really want to learn about it like I do. I did get one important piece of information about QC through this book, though: QC is not just a superior computer; it also needs specific algorithms. This book grounded QC as an actual milestone that humans achieved instead of some high-up-there technology. If you're interested, give it a go, but I would try to find a better one than this.
Having previously read several titles on the subject, I believe this is perhaps the simplest one that a lay person should start with when in need of an introduction to the area of quantum computing. The book is well written, as are all other titles by this author, and gradually covers all areas of importance in the field. The author touches on key aspects of the sectors that need improvement and what the existing challenges are. Overall I believe it is a great read and I would definitely recommend it to all readers beginning their journey in this area.
Spent way too much time going over stuff I've been learning at school since year 7. Definitely gets better, but if I wasn't trying to finish it for the 2024 reading challenge I wouldn't have gotten past chapter 3 (out of 7).
Gives a good overview of what Quantum Computing is all about. I found out that Quantum Computing is not nearly as straightforward as I had been led to believe. Working in the field still requires PhD level physics and PhD level math. I got the impression that research in the area is very expensive, limiting the work to just a few very large corporations. And although it has the promise of solving certain math problems that are currently unsolvable, it's not at all guaranteed that it will do so at more than a tiny tiny fraction of the speed of current computers. Possibly speed and cost will be such major problems that quantum computing will forever be used to decrypt only a very small amount of very important communication traffic. And the tool might never be readily available to hackers.
It's true that several groups are working hard in the area, and are making notable progress so that the field will be quite different in a year. And it's true that a method is already known of using quantum computing (when it becomes practically available) to break the encryptions that are currently used. But beyond those two hard facts, most of the buzz is just over-eager (and perhaps even irresponsible) journalists.
I obtained this book and read it with a specific question in mind: exactly how will the encryption-cracking capability work? I found that question is NOT answered in this book ...and in fact is not answered in the vast majority of books about quantum computing, not even most of the books that are used by those inside the field. This is apparently because the method is so arcane that it's difficult to even explain it to anyone other than those that work in the field and so have a whole lot of background knowledge and are fluent with some obscure mathematical notation.
So, disappointed with the book, I turned to the internet with the same question. It turns out a whole slew of explanations, some of them quite good, were written and published about a decade ago, and some of them still exist. I should have turned to the internet in the first place! The actual algorithm was devised by the mathematician Peter Shor. A couple of sessions by him are available on YouTube. They add to the impression that quantum computing and/or code decryption will never be something that Joe Sixpack's child can do in the basement.
One of the things these explanations make clear is that the "conventional wisdom" explanation used by most journalists is flat-out hogwash. The qubits do NOT form any sort of superposition of all possible numbers, and do NOT somehow magically collapse to a state that represents the desired prime factor. The actual algorithm is much more complex, slow, and difficult than that.
Another thing that both the book and the explanations on the internet make clear is that conventional computers will continue to do a better job of MOST parts of the algorithm, and only a small bit will be subcontracted to a quantum computer. For example a quantum computer may produce a very short list of possible factors, then a conventional computer will actually test-try each one in order to zero in on the answer.
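To illustrate that division of labour, here is a minimal sketch (my own illustration with a hypothetical helper name, not code from the book) of the purely classical post-processing step in Shor's algorithm: given a candidate period r handed back by the quantum part, a conventional computer checks whether it yields a nontrivial factor.

```python
from math import gcd

def factor_from_period(N: int, a: int, r: int):
    """Classical step: try to turn a candidate period r of a^x mod N
    into a nontrivial factor of N.  Returns None if this r doesn't work."""
    if r % 2 != 0:
        return None                  # odd period: pick another a and retry
    x = pow(a, r // 2, N)
    for candidate in (gcd(x - 1, N), gcd(x + 1, N)):
        if 1 < candidate < N:
            return candidate
    return None

# Toy example: for N = 15 and a = 7, the period is r = 4,
# which yields the factor 3 (or 5).
print(factor_from_period(15, 7, 4))
```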
Um... I felt very bored by this book at times and at other times super out of my depth or pay grade. I was interested in learning more after quantum computing was discussed in another book; however, my questions weren't answered. There was also a lot of rehashing of computer history I knew from school. The author made sweeping statements about how quantum computing will never be used in private homes, which is possibly likely, but tech people should know better than to say something will never happen; that is what was said about personal computers back in the day.
Quantum Computing: The Transformative Technology of the Qubit Revolution (2021) by Brian Clegg contains seven chapters, the first three of which summarize "conventional" computing and its history. That material, of course, has already been covered in countless books. Given the rather short length of this book and the complexity of quantum computing itself, that seems like a lot of stock material that could have been delegated to other books, but I guess Clegg wanted to make this book as self-contained as possible.
Clegg's account is considerably more sober than Michio Kaku's in Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything. Neither book gives the reader a complete idea of what quantum computers are actually like, let alone how you might program one, but at least Clegg doesn't imply (as Kaku pretty much does) that quantum computers are about to render all "conventional" computers obsolete at a stroke. Clegg brings the reader back to Earth by focusing on the problems and limitations of quantum computing, likening quantum computers to something more akin to coprocessors in conventional computing. I.e., at least the first generation of practical quantum computers won't be much like your smartphone or laptop, able to download and run many thousands of programs. Rather, quantum computers are likely to be specialized devices only able to solve limited classes of problems, or particular steps in problems, albeit potentially much faster than conventional computers can solve them. The need for many or most types of quantum computing devices to operate at cryogenic temperatures further restricts who might actually own one, but that's not much of a practical limitation on who can use one, thanks to the cloud. Maybe someone will figure out how to build a quantum computer as robust and consumer-friendly as the conventional computers we all use, but unless you are very young, don't expect you'll live to see it.
Unlike Kaku, Clegg doesn't mention climate change at all, which is a stunning omission in a book about the future. If you're writing about technologies that might require 20, 30, or more years to develop, how can you not mention that we might not have a functioning civilization in which to develop them? Continued technological progress requires things to stay hunky-dory enough for large numbers of people to focus on something other than surviving the next day. With humans continuing to burn fossil fuels at an exponentially increasing rate, the wheels could start coming off civilization at about the time that progress in quantum computing needs them to remain on. For a needed corrective I recommend Our Final Warning: Six Degrees of Climate Emergency (2020) by Mark Lynas among many others.
Clegg mentions Artificial Intelligence as a cautionary tale, in particular the notion of AI winter - the period of reduced funding and interest in AI following an initial burst of optimism and hype. He also mentions controlled nuclear fusion, a technology that has been hyped since the 1950s yet has always seemed 50 years away from becoming commercially viable.
He doesn't mention potential synergies between AI and quantum computing - maybe quantum computing can improve AI, and maybe AI will help with the search for quantum algorithms. Even if quantum computers were viable right now, it seems far from straightforward for people to program them. The number of people alive right now who understand the necessary physics and mathematics cannot be large, and most of those people are probably already busy doing other things. But maybe AI will be able to help with that, by starting with problem descriptions, or perhaps with conventional algorithms, and mapping them onto quantum algorithms.
Like a metaphor for quantum superposition, I find this book to be both too technical and too lacking in technical detail at once. My own deficiency in understanding quantum physics is probably why I feel that way. As someone who has never studied quantum theory, I probably should have started with Mr. Clegg's other books on the subject, such as "A Crash Course: Quantum Theory." This book tries to fill in some of the knowledge gaps, such as going into the working details of logic gates; differences between NAND and XOR gates, that sort of thing. My engineering background really appreciated the smooth takeoff on this complex subject. It's like giving my brain a ride on a Federation starship moving at impulse speed. I can handle quantum computing, my brain remarks.
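As a concrete reminder of the gate-level material the review refers to, here is a tiny sketch (my own illustration, not from the book) of the NAND and XOR truth tables; NAND is universal, meaning every other classical gate can be built from it, while XOR is not.

```python
def NAND(a: int, b: int) -> int:
    """Output 0 only when both inputs are 1."""
    return int(not (a and b))

def XOR(a: int, b: int) -> int:
    """Output 1 when exactly one input is 1."""
    return a ^ b

# Print both truth tables side by side.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} | NAND={NAND(a, b)} XOR={XOR(a, b)}")
```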
Then the warp drive on quantum theory kicks in, and everything's blurry. There are descriptions of the ideas behind the quantum theories, but they are light on examples or explanations of why or how such ideas actually work. For example, why is quantum coherence an actual state of quantum particles, as opposed to simply being the state probabilities of those particles? And why does knowing the exact state of a property of our particle necessarily mean decoherence? It's a bit like trying to understand Newtonian mechanics with just the theories and words describing relationships, without working through any physics problems.
Without equations or enough examples for the basic quantum concepts, it was tough for me to understand and appreciate the details of qubits and how quantum computers actually work. It was good of Mr. Clegg to mention D-Wave Systems, and I'm definitely going to look them up to find out how they make quantum computing available for production use even today. After I read that other book on quantum theory.
A good overview of quantum computing without getting too in depth. I became interested in this while reading the book on graphene by the same author. I'm rather skeptical of making use of quantum physics in computers. My big confusion surrounds qubits. If the suggestion is that quantum spin enables data to be held in percentages that can all carry value, I don't understand maintaining binary gates. It's like we'd need to recreate logic gates specifically to work with the variety of quantum spin. I also don't understand how we can use quantum superposition to store data. The other side of this is also the presupposition that the internet will remain as it is, with massive server banks, and that blockchain technologies won't have a significant impact on the way we do the internet. It will certainly be interesting to see how technology continues to advance and change over time. It seems like we won't even recognize the world we're living in in just a couple of decades.
Well, I do enjoy quantum theory and stuff and like computing, so this is like obviously a perfect fit. Like I literally had to skip over the part where he explains quantum theory because I am just so smart and intelligent!! On a serious note, how is this like the 2nd or 1st book that I have read this academic year?? Like this is a very poor performance from somebody who literally has a longer commute every day. Like I am mad at myself rn. But like I will read more books this year. Like will ultimately make me
I think the limitations of this book are really the limitations of the reader, i.e., me!
I found the description of quantum weirdness to be fascinating - it really is possible to receive a message before it is sent... I think I understand more about how conventional computers work from reading this book, but I struggled to keep up with the possibilities of the quantum computer. It does seem unlikely that I will have a quantum computer on my desk anytime soon, but one never knows.
The book is largely taken up with mathematics and traditional computers, and only has a minor section at the end on quantum computing.
There are two errors in the quantum computing section: i) today's SSDs are not quantum devices (they are traditional PNP transistors); ii) quantum calculations will not give a different answer every time, and will not need to be repeated many times to confirm the answer is right.
A simple explanation of how a traditional computer works and what the difference is with a quantum computer. Without maths or complex explanations, you can truly understand how future quantum computers can work. I would have liked a bit more information about the new algorithms designed for those computers, or more examples of tasks where quantum computers could be a true revolution. But I highly recommend this book.
Wasn't sure what to expect, only that I assumed I'd get a lot of headaches...
However this ended up not being the case, as this book is written at a level and in language that I genuinely understood (although I hope no-one asks me to explain a Josephson junction anytime soon).
The writing style was great, laced with really interesting history and despite the author's expertise he never assumed knowledge - this is an amazing first introduction to the subject.
Recommended to me by a work friend. A lot of generic quantum review, with some solid "corrections" to Ada and Feynman myths, which were appreciated. Very good overview of the current state and projected timelines (thanks, Caltech), but overall I found this book too short and also too superficial at some level: I would have liked code samples, more details, and maybe more biographical discussion of the sources quoted (particularly some of the more modern researchers). Glad I read it, but not for everyone.
The chapters that set out to explain the premise of the book are delightful but few and far between.
Half of the book focuses on traditional computing, another quarter describes the quantum side in a "handwavy" fashion, and only the final quarter presents genuinely descriptive information on quantum computing.
I would recommend this to a friend outside of tech entirely who is curious about general computing as an introduction.
There are not many books on the subject that manage to explain the field to a lay audience, and in this the book is unique. However, I found the content a bit unsatisfying because it did not dedicate nearly as much effort to explaining the core of QC, instead offering interesting context.
The qubit is introduced in the last chapter, where I was expecting it might be part of the first!
An interesting perspective on what quantum computing could possibly be. It presents a great perspective on computing in general, decomposing it into various components. However, the plot seems to get lost when the "quantum hardware" starts to come in, along with the algorithms. Possibly it is intended for a very lay reader!
This was an interesting dive into the theory behind quantum computing, but I was left feeling kind of flat about the whole thing. It got extremely technical in the middle, laying out the groundwork for the qubit. It may also have been less than satisfying in that the whole field is still in its infancy, so there is nothing super transformational in practical terms at the moment.
Well written and brief (the audiobook is 4.75 hours), this book explains the terms well enough for anyone to get in on the ground floor and keep up, yet gets into the advanced topics enough for a strong overview of the state of the art with an eye towards the frontiers of the science. I thoroughly enjoyed the later chapters and will consider more books by this author in the future.
I expect a lay-person-targeted book to be reasonably shallow, but this book spends more time talking about standard computer architectures and math than it does actually talking about quantum computers. If you're looking for any insight into how quantum operations/algorithms work, this book is not a good use of your time. I kept waiting to get to "the good stuff" and then ran out of book. :(
Do not waste your time on this book. It’s generally off the mark quite a bit, and outright wrong in some cases. It comes in light at about 150 pages, because the author clearly does not have any real story or information to offer. Of those 150 pages, over half is filled with irrelevant historical background of classical computing, which doesn’t actually add anything.
This was a work-related listen; the introduction and first few chapters were helpful, but beyond that…oof. The book seemed to focus more on general computing and mathematics, with only a small section targeted on real quantum computing at the end.
Take this with a grain of salt, as I’m just a filthy liberal arts major. 😂
Decent overview of a relatively young technology. Don't expect it to go very in depth. If you're looking for more than "Shor's algorithm factors numbers," this book will be a disappointment. As a pop-sci book targeted to the public it's acceptable, if rather hand-wavy when it comes to the details.