This book integrates the foundations of quantum computing with a hands-on coding approach to this emerging field; it is the first work to bring these strands together in an up-to-date treatment. The book is suitable for both academic coursework and corporate technical training.
This volume comprises three books under one cover: Part I outlines the necessary foundations of quantum computing and quantum circuits. Part II walks through the canon of quantum computing algorithms and provides code on a range of quantum computing methods in current use. Part III covers the mathematical toolkit required to master quantum computing. Additional resources include a table of operators and circuit elements and a companion GitHub site providing code and updates.
Jack D. Hidary is a research scientist in quantum computing and in AI at Alphabet X, formerly Google X.
"Quantum Computing will change our world in unexpected ways. Everything technology leaders, engineers and graduate students need is in this book including the methods and hands-on code to program on this novel platform."
--Eric Schmidt, PhD, Former Chairman and CEO of Google; Founder, Innovation Endeavors
After all the good reviews here, this book was a disappointment.
It is difficult to get a proper understanding from this book, and it is not because the topic is hard or the book is too mathematical. Indeed, I think the material presented should be accessible to anyone with a master's (or possibly a bachelor's) in any STEM field. The problem is that the explanations are lacking. Take the Bloch sphere, for instance. The author tells you that the state of a qubit can be represented as a point on a sphere, but then goes on without telling you *how* it is represented as a point on a sphere. A quick detour to Wikipedia reveals that it is not very difficult to explain. The author just didn't take the trouble. The book is full of these gaps, and the later parts of the book especially rely on you understanding these details.
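For the curious, the mapping the book omits really is short. Here is a minimal numpy sketch (my own, not from the book) that takes a qubit state and returns its point on the Bloch sphere via the Pauli expectation values:

```python
import numpy as np

def bloch_coordinates(state):
    """Map a normalized qubit state [a, b] to a point (x, y, z) on the Bloch sphere."""
    a, b = state
    # The point is given by the Pauli expectation values: x = <X>, y = <Y>, z = <Z>.
    x = 2 * (np.conj(a) * b).real
    y = 2 * (np.conj(a) * b).imag
    z = abs(a) ** 2 - abs(b) ** 2
    return x, y, z

# |0> sits at the north pole (0, 0, 1); |+> = (|0> + |1>)/sqrt(2) sits on the equator at (1, 0, 0).
print(bloch_coordinates(np.array([1, 0])))
print(bloch_coordinates(np.array([1, 1]) / np.sqrt(2)))
```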
What's worse is that there are downright elementary errors in the book. Take the explanation of oracle functions, for instance, where the question is how many times you have to query a function of N (qu)bits to determine its truth table (actually just whether the function is balanced or not, but read the literature for the details). The book gives the answer "N times" for a classical computer and "once" for a quantum computer. The correct answer for the classical computer should be 2^N (or actually 2^(N-1)+1 just to check whether it's balanced). Are there equally elementary mistakes in the parts I am trying to learn for the first time?
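To make that count concrete, here is a small sketch (mine, not the book's) of the classical strategy under the constant-or-balanced promise. In the worst case it needs 2^(N-1)+1 queries, whereas the Deutsch-Jozsa quantum algorithm needs a single oracle call:

```python
from itertools import product

def decide_constant_or_balanced(f, n):
    """Classically query f on n-bit inputs until the constant-vs-balanced
    promise is resolved; return the verdict and the number of queries used."""
    outputs = set()
    for i, x in enumerate(product([0, 1], repeat=n), start=1):
        outputs.add(f(x))
        if len(outputs) == 2:
            return "balanced", i   # two distinct outputs rule out a constant function
        if i == 2 ** (n - 1) + 1:
            return "constant", i   # a strict majority of inputs agree, so it cannot be balanced

f_constant = lambda x: 0
f_balanced = lambda x: x[0]        # output = first bit: half the inputs give 0, half give 1
print(decide_constant_or_balanced(f_constant, 3))  # ('constant', 5), i.e. 2^(3-1)+1 queries
print(decide_constant_or_balanced(f_balanced, 3))  # ('balanced', 5)
```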
The presentation of the mathematical topics in part three (Hilbert spaces, fields and groups, etc.) offers some interesting and refreshing viewpoints, and the introductory chapters aren't too bad. This, combined with the fact that I've seen worse books, leads me to give this book two stars.
I have switched to another book, "Quantum Computing for Computer Scientists" from 2008, which looks much better. It's about the same size and covers about the same ground. It doesn't include code examples, but from what I've seen so far, it would be easy to pick up just about any quantum programming language once you understand the underlying concepts.
Context for the review: My background is in Computer Science and Machine Learning research. I was looking for a book to help me understand whether now is the right time to invest more energy into Quantum Computing (QC) to accelerate machine learning research in topics such as speech processing and computer vision. Before I started looking for a QC book, I did two courses on QC from https://qiskit.org: 'Intro to Quantum Computing' and 'Intro to Quantum Machine Learning'. So I was looking for something that is not too theoretical but still provides more detail on QC than those two intro courses.
Review: This book is well written, and I can recommend it if you are looking for your first contact with QC. Given that I had done two courses from qiskit.org, though, I was quite disappointed: both Qiskit and this book provide a similar level of detail on QC, with one exception. On the plus side, the book has a very nice appendix on the mathematics used in QC.
Addressing the following points could make this book much better:
1) The chapter about QC hardware presents multiple quantum architectures with too little detail/intuition on how a quantum computer actually works. I would prefer this chapter to focus more on building up good intuition about how a quantum computer works, instead of, or in addition to, serving as a high-level listing of multiple possible quantum computing architectures.
2) This book is about applied QC, yet I did not find answers to some important questions about the practical application of QC. For example, how long does it take to run a quantum circuit on a quantum computer? Improving runtime/space complexity is one thing, but if a quantum computer is not able to execute a large number of circuit operations efficiently, then using QC may be impractical in real-world applications. A different practical question: how many qubits are really needed for a particular use case, e.g. ML in vision or speech processing? How do you encode a 100x100 image or spectrogram on a quantum computer? Do we need 100x100 qubits, or can this be done with fewer (see the sketch after this list)? I would love to see such practical questions discussed in more detail.
3) The chapter on Quantum Machine Learning is too short. I found it hard to build good intuition on what kinds of ML problems can be efficiently solved with QC, and, where they cannot be yet, when it will become possible. What elements of QC have to be improved, e.g. the number of qubits, the reliability of qubits, the speed of executing QC circuits? I would like better intuition on whether now is the right time for scientists to start exploring QC in addition to using GPUs or other high-performance hardware.
4) Quite often, claims are made about quantum computing without providing an explanation or reference points. This is another data point suggesting that the book is simply too short, or that it discusses too many topics at once.
5) There are far too many code examples in this book. Many of them could easily live only on the companion website, with just a few kept to give better intuition on how QC circuits are implemented in practice. Currently, these code examples look like they are here to pad the page count rather than to serve the reader.
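On the encoding question in point 2, here is a rough back-of-the-envelope sketch (my own, not from the book, and glossing over state-preparation cost): with amplitude encoding, n qubits hold a state vector of 2^n amplitudes, so a 100x100 image needs about 14 qubits rather than 10,000:

```python
import math

pixels = 100 * 100                                 # 10,000 pixel values in a 100x100 image
qubits_amplitude = math.ceil(math.log2(pixels))    # amplitude encoding: one amplitude per pixel
qubits_one_per_pixel = pixels                      # naive basis encoding: one qubit per pixel
print(qubits_amplitude, qubits_one_per_pixel)      # 14 vs. 10000
```

The catch, of course, is that preparing an arbitrary 2^n-amplitude state can itself require a number of gates exponential in n, which is exactly the kind of trade-off I wish the book discussed.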
Whew! Heavy. Thin book. Thick with information, however, especially math. Understandable to me in general. I'm not big on math; I'm more intuitive in that regard--I know things are supposed to be balanced and even out. In computer science, that usually means things have to end up at a value of 1 or 0. I'm vaguely familiar with matrix multiplication, etc. The book is very well laid out. Everything is explained and well-defined, even the math sections. Need a review of set theory--it's there. Definitions for the Bloch sphere, vector spaces, etc.

It was interesting to get an idea of where the research, technology, and theory all stand regarding quantum computing at this point in time. Mostly research. The really interesting thing is that even when QCs become feasible and physically possible, you still need a traditional computer to input info into the QC for processing, and then the results of the processing get output to the traditional computer for interpretation. Kind of like using an adding machine to come up with figures to punch into a traditional computer, and then plugging the results of the computer's calculation back into the adding machine. Not the most perfect analogy, but close.

It was also interesting to see that the researchers have mostly agreed to work together and use Python-based quantum frameworks for their projects. The only holdout, of course, is Microsoft, which wants to use Q# ("Q sharp," a variation on C#). They always have to try to force people into using something they design. Life would be so much easier if they would just play along with everyone else to move research and development forward at a faster pace, for the common good of all. Never happens.
This is a straightforward intro book. It explains different levels of problem complexity, such as big-O notation and its relation to P vs. NP-hard (as I read quantum blogs, people talk as if you are expected to know those distinctions).
I wish it went deeper into applications. But given that there aren't that many books on quantum computing, this was well worth the read. The last quarter of the book is dedicated to the mathematical concepts/notation you should know.
I read/skimmed through this quickly, given that it is my second book on quantum computing. There are some physics details here that are not present in the other book, and some code, which operates at the quantum gate level. I don't think it is too useful. For the physics, Shankar and Sakurai are better.
This is the depth of coverage on the subject I’ve been looking for. I have a Comp Eng degree so am comfortable with the underlying physics and maths (even if I’m rusty with applying them). This was much better than most common treatments of QC since it actually explains the workings, and even explores some of the algos in one of the QC frameworks that is used in the field.
Very good book! I definitely didn't understand everything, but it gave me good insight into what quantum computers are. Do not be scared! I found the mathematics very well explained in the supplemental chapters, which furthermore show what it all leads to (what a difference from how I was taught at school!).
Love this book. Although, again, the maths is a little challenging, its reference value is huge. If you want to know how quantum computing functions and what it means in some detail, it's for you. Also, Jack is an approachable author, unlike some.
Good overview. Many of the algorithm implementations were written years ago and are already out of date due to software incompatibilities. The Cirq, Qiskit, pyQuil, and PennyLane tutorials are better sources for actually running and dissecting the code at this point.