Explore the principles and practicalities of quantum computing

Quantum computing is changing the way we think about computers. Quantum bits, a.k.a. qubits, make it possible to solve problems that would be intractable with current computing technology.

Dancing with Qubits is a quantum computing textbook that starts with an overview of why quantum computing is so different from classical computing and describes several industry use cases where it can have a major impact. From there it moves on to a fuller description of classical computing and the mathematical underpinnings necessary to understand such concepts as superposition, entanglement, and interference. Next come circuits and algorithms, basic and sophisticated alike. The book then surveys the physics and engineering ideas behind how quantum computing hardware is built. Finally, it looks to the future and offers guidance on how further developments may affect you.

Really understanding quantum computing requires a good deal of math, and this book doesn't shy away from the concepts you'll need. Each topic is introduced and explained thoroughly, in clear English with helpful examples. Dancing with Qubits is for those who want to explore the inner workings of quantum computing in depth. That entails some sophisticated mathematical exposition, so the book is best suited to readers with a healthy interest in mathematics, physics, engineering, and computer science.
This book is a very good, in-depth introduction to quantum computing.
Half of the book is spent building up all of the necessary mathematics for quantum computing, including complex numbers, linear algebra, and tensor products. The other half is spent on quantum physics, qubits, gates, and algorithms. The mathematics and physics are very well explained, but you need to take your time working through them to understand them properly. The well-placed questions in each section help with this.
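To give a flavor of where the tensor-product material leads (my own illustration in Python/NumPy, not code from the book): the tensor product is what combines single-qubit states into multi-qubit states.

```python
import numpy as np

zero = np.array([1, 0])  # the |0> basis state
one = np.array([0, 1])   # the |1> basis state

# The tensor (Kronecker) product combines single-qubit states
# into a multi-qubit state: |0> (x) |1> = |01>.
state = np.kron(zero, one)
print(state)  # [0 1 0 0] -> all amplitude on the |01> basis state
```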
I have to deduct a star because in a few places the book feels more like a late draft than a finished book. There are several small mathematical mistakes, like missing symbols or typos, which can leave you wondering whether you understood the section correctly.
I liked this book a lot, but I recommend starting with "Quantum Computing for Everyone", which is much less in-depth, and continuing on to this one if you like it.
Quantum computing has become a hobby interest of mine. The idea of quantum computers was first introduced decades ago, but until recently such machines were entirely theoretical. Today, the public's understanding of quantum computers largely reduces to "magic," which is how quantum computers are typically treated in science fiction. It's assumed that something "quantum" happens at the subatomic level and presto, quantum computers can perform massive computations faster than any classical computer, communicate faster than light, and do all sorts of other ridiculous things.
Understanding how and why this is not the case, and what quantum computers actually can do, requires a lot of study. It also requires a lot of math.
I picked up Dancing with Qubits because it looked like a fairly heavy but approachable layman's introduction to quantum computing. It is that, but it's very heavy, and dives deeply into the math, with many chapters on sets and fields and rings, number theory, logic, vector spaces, and matrix operations before you even get into the quantum. This book is clearly intended to be used as a textbook for a class and is probably a lot more digestible in that environment. Reading it entirely as independent study, as I did, was a pretty hard slog.
My prior experience was playing around a little bit with the open source Qiskit API, and with IBM's Cloud Quantum Computing platform, which allows people to build quantum circuits and run them (for free!) on one of IBM's actual quantum computers.
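For the curious, here's roughly what building and running a circuit there looks like - a minimal sketch assuming Qiskit 1.x with the qiskit-aer simulator package installed; submitting to real IBM hardware uses a slightly different runtime API:

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# A two-qubit "Bell state" circuit: Hadamard, then CNOT, then measure.
qc = QuantumCircuit(2, 2)
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1024).result().get_counts()
print(counts)  # roughly half '00' and half '11', never '01' or '10'
```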
My background is in software engineering, with just the minimal amount of math necessary for a CS degree: a few semesters of calculus and linear algebra, and some basic statistics and probability theory. This is barely enough to do a deep dive into the math behind quantum computing, and I found myself buying refresher linear algebra and calculus workbooks to keep up.
So I will freely admit that I did not actually work through all the problems, and at some points I skimmed through the math and the algorithmic proofs.
Past the math, this is a thorough introduction to quantum computing: eventually you start with simple one-qubit operations, work your way up to Hadamard and Toffoli gates, and then ever-more-complex circuits, until you get to the chapter that explains Shor's Algorithm in detail.
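To illustrate that progression with a concrete sketch (mine, not the book's - it assumes Qiskit's quantum_info module): a Hadamard puts a qubit into an equal superposition, and a Toffoli flips a target qubit only when both control qubits are 1.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(3)
qc.h(0)          # qubit 0: equal superposition of 0 and 1
qc.h(1)          # qubit 1: likewise
qc.ccx(0, 1, 2)  # Toffoli: flip qubit 2 only if qubits 0 AND 1 are 1

# Qiskit labels basis states right-to-left (q2 q1 q0).
print(Statevector.from_instruction(qc).probabilities_dict())
# {'000': 0.25, '001': 0.25, '010': 0.25, '111': 0.25}
```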
Shor's Algorithm, if you know anything about quantum computing, is the famous algorithm that will someday break public-key cryptography and thereby much of our information security infrastructure. To simplify greatly, a lot of modern cryptography (including the "https" connections you rely on to do secure banking with your phone app) relies on the computational difficulty of factoring very, very large numbers that are the product of two large primes. Even supercomputers can't do it in less than centuries, for sufficiently large numbers. Shor's Algorithm is a quantum algorithm that has been mathematically proven capable of factoring such numbers much, much faster than any classical computer can. So in theory, someday a quantum computer will be able to hack all banks everywhere and take over the Internet.
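The trick the book works through is that factoring N reduces to finding the period r of a^x mod N, and that period finding is the one step a quantum computer (via the quantum Fourier transform) does exponentially faster. Here's a toy sketch of the classical scaffolding in plain Python - my own illustration, with the period found by brute force exactly where the quantum computer would take over, and with numbers small enough to check by hand:

```python
from math import gcd

def shor_toy(N, a):
    """Classical skeleton of Shor's algorithm; assumes gcd(a, N) == 1."""
    # Find the period r of a^x mod N by brute force. This loop is the
    # only step a quantum computer actually speeds up (via the QFT).
    r, val = 1, a % N
    while val != 1:
        val = (val * a) % N
        r += 1
    if r % 2:
        return None                   # odd period: retry with another a
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    return (p, q) if p * q == N and p > 1 else None

print(shor_toy(15, 7))  # period r = 4: gcd(48, 15), gcd(50, 15) -> (3, 5)
```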
So why isn't everyone panicking? It's complicated, and fully understanding it requires a book like, well, this one. Which requires kind of understanding the math. But it boils down to this, and these were some of the most useful chapters: what quantum computers can do in theory is still quite a long way from what any real-world quantum computers can do in practice. For example, IBM and Microsoft and China are bragging every day about how they have built a 10-qubit, 20-qubit, or 60-qubit quantum computer, and expect to have a 100-qubit quantum computer by 2024, etc. The catch is that those are physical qubits, and physical qubits are very fragile. They have to be maintained using cryogenics or lasers or nuclear magnetic resonance - not anything that you'll be able to put on your desktop this decade. And right now we need hundreds of physical qubits to reliably produce one usable logical qubit. Performing a useful quantum operation (such as, for example, using Shor's Algorithm to break public-key cryptography) will require hundreds or thousands of logical qubits. Multiply those numbers together and you're talking about machines with hundreds of thousands or millions of physical qubits.
So, it will happen eventually, but it's not going to happen tomorrow. And people are already working on post-quantum cryptography, encryption schemes designed to survive quantum attacks. It's also important to note that the nature of quantum computers is such that they are never going to "replace" classical computers. It would be enormously inefficient to use a quantum computer as a simple calculator, for example. Even though you theoretically could, most computing will always be cheaper and faster to do on classical computers. Quantum computing, when it "arrives," will be a hybrid where classical computers do their thing and occasionally hand off certain operations to quantum computers and process the results.
None of this is quite as exciting as some quantum news sounds, is it? The bottom line is that although many companies and countries are investing billions in quantum research, there does not exist anywhere in the world today (2022) a quantum computer that can actually do anything useful and practical. Investors are starry-eyed about cryptography, pharmacology (quantum algorithms will eventually be able to model and test new drug molecules much faster than we currently can), information retrieval, and certain other specialty fields, but no quantum computer currently on the market actually does anything but toy problems.
That said, don't bet against a field where Microsoft, IBM, and the People's Republic of China are all investing billions of dollars. So quantum computing will be "real" someday, and it will make a major impact. Robert Sutor's book provides a grounding in the math and theory behind it, but not much in the way of doing things. He talks about "quantum volume" as a somewhat ill-defined but more useful measure than raw qubit counts of how powerful a quantum computer actually is. He points you to some popular software packages and quantum simulators, but actually experimenting with quantum circuits is the next step for you to take on your own, and from a software engineering perspective, I've found you can, to a certain extent, do this already without knowing the deep background.
I recommend this book if you really want to know the math and the theory behind quantum computing, along with a good high-level introduction to the technology as it exists today. If you want to get into programming quantum algorithms with more foundational knowledge than just how to use a Python API, it's a great starter. But if you don't want to dive into linear algebra, complex numbers, and matrix operations - or, going in the other direction, if what you're really after is the quantum physics - this is probably not the book you want to start with.
This book is a good and thorough mathematical intro to quantum computing, but it would have benefited from a sharper focus.
Let's start with the good: this book is not amateurish, which is a pitfall many books about quantum computing (and quantum physics) fall into. It's methodical and well organized. It's clear the author was well aware of the complexity of the math behind quantum computing and the challenge it poses, and did his best to help the reader cope with it. But don't get me wrong: this book is not simplistic and doesn't cut corners. Naturally, not all the detailed mathematical developments are provided, but where they are missing it's clearly stated that there is a leap, and the reader can choose whether to delve in or skip it altogether. And most importantly, after finishing the book, the reader is actually equipped with the necessary background to read further or even start coding programs for quantum computers. This is no small thing.
My problem with the book is that it doesn't address a well-defined target audience, and tries to cover too much. It's not clear if this book is intended for the layman or for people with a sound background in physics in general and quantum physics specifically who want to broaden their horizons to the new emerging domain of quantum computing. Fortunately for me, I belong to the second group, but I'm not sure readers from the first one will benefit a lot from the book.
For example, chapter 3 engages with (relatively) trivial concepts like natural and whole numbers. But by mid-chapter 5 (or some 80 pages later) we are already discussing Hilbert spaces. One could argue that a reader capable of coping with Hilbert spaces needs no introduction to whole numbers. On the other hand, if the reader needs this book to learn basic concepts like trigonometry or vector algebra, her chances of following the more complex mathematical developments are slim. The author's effort is commendable, but I'm not sure it works. So in my opinion, it's a matter of setting the right expectations: the book would benefit from assuming a certain level of knowledge and starting from a more advanced level of math, computer science, and physics. That would have allowed the author to develop some concepts more slowly, instead of rushing from zero to quantum circuits in no time.
Overall, it's not a popular science book. If you have a solid background in physics and computer science, say at B.Sc. level; you are not afraid of working out some math yourself; and you want a theoretical grounding in quantum computing before writing your first program, this book will serve you well.
Really great introductory book on quantum computing for newcomers, and also a great book for those well-versed in the space to read along with those just starting out! We used it for our book club and had really interesting discussions, which helped highlight what is often misunderstood or hard to grasp for beginners but is easily explained by this book, along with someone helping explain the concepts. It really gets to the core of how and why quantum computing can impact the world.
This book does an excellent job explaining what quantum computing is and its current technical limitations. However, it is oriented toward understanding very detailed aspects of the logic gates and specialized algorithms needed for quantum computing. I really need a more basic understanding of quantum mechanics and its implications before I can get the most out of this book.
A great introduction to quantum computing. A lot of mathematics – it starts from the basics like vectors and complex numbers and gradually builds up to quantum gates and circuits. I'd like the book to have more physics examples, because without them all that mathematics feels like a theoretical concept from an ideal world.