A marvelous compendium of mathematical symbols and their fascinating histories
Galileo famously wrote that the book of nature is written in mathematical language. The Language of Mathematics is a wide-ranging and beautifully illustrated collection of short, colorful histories of the most commonly used symbols in mathematics, providing readers with an engaging introduction to the origins, evolution, and conceptual meaning of each one.
In dozens of lively and informative entries, Raúl Rojas shows how today’s mathematics stands on the shoulders of giants, mathematicians from around the world who developed mathematical notation through centuries of collective effort. He tells the stories of such figures as al-Khwarizmi, René Descartes, Joseph-Louis Lagrange, Carl Friedrich Gauss, Augustin-Louis Cauchy, Karl Weierstrass, Sofia Kovalevskaya, David Hilbert, and Kenneth Iverson. Topics range from numbers and variables to sets and functions, constants, and combinatorics. Rojas describes the mathematical problems associated with different symbols and reveals how mathematical notation has sometimes been an accidental process. The entries are self-contained and can be read in any order, each one examining one or two symbols, their history, and the variants they may have had over time.
An essential companion for math enthusiasts, The Language of Mathematics shows how mathematics is a living and evolving entity, forever searching for the best symbolism to express relationships between abstract concepts and to convey meaning.
Interesting anecdotes that together form a roadmap of the development of mathematical thought across Indian, Babylonian, Chinese, Arabic, and European mathematicians (as well as several instances showing how the UK and continental Europe have never quite melded intellectually). History, etymology, and, oh yes, math are used to show how concepts and language flow - glad I got to come along for the ride.
This book does not play well with a Kindle Paperwhite. Since the book is specifically about mathematical symbols and notation, I would expect it to show that notation correctly, but on the Paperwhite it doesn't even display elementary things such as inline fractions properly. 5 over 8 displays as 58, not even as 5/8. Square root radicals likewise disappear. Superscripts and subscripts get mangled too. Anything more complicated? Forget about it. Sigma-notation summations such as the sum of all elements a sub i from i = 0 to n show up as ∑ i=0nai.
An everyday infinite series such as the Leibniz formula π/4 = 1 - 1/3 + 1/5 - 1/7 + ... shows up as the nonsensical 1 - 13 + 15 - 17 + ... = π4!
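For anyone reading on a device that does render the math, the two expressions above are meant to look like the following (my own LaTeX transcription of the standard forms, not taken from the book's files):

\sum_{i=0}^{n} a_i                                        % the sum of the elements a_i from i = 0 to n
\frac{\pi}{4} = 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots    % the Leibniz series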
Kindle on PC displays the notation properly, as does Kindle on Android. If I had been reading the book exclusively on one or the other rather than on my Paperwhite I might have given this review 5 stars, but on Kindle Paperwhite I was tempted to ask for a refund.
One of the biggest developments in the history of maths was moving from describing relationships and functions with words to using symbols. This interesting little book traces the origins of a whole range of symbols, from those familiar to all to the more obscure squiggles used in logic and elsewhere.
On the whole, Raúl Rojas does a good job of filling in some historical detail, if in a generally fairly dry fashion. We get to trace what was often a bumpy path as different symbols were employed (for example, for division and multiplication, where several still remain in use), but usually, gradually, standards were adopted.
This feels better as a reference, to dip into if you want to find out about a specific symbol, rather than as an interesting end-to-end read. Rojas tells us the sections are designed to be read in any order, which means that there is some overlap of text - it feels more like a collection of short essays or blog posts that he couldn't be bothered to edit into a consistent whole.
There are a couple of historical points I would raise an eyebrow at. At one point we are told 'While Europe groped through the darkness of the Middle Ages, the Arabs rescued the scientific legacy of the Greeks.' This is not to minimise the huge Arabic contribution to maths in this period, but there was a significant amount of mathematical activity in medieval Europe, which, running through to the end of the fifteenth century, would include, for instance, the Oxford Calculators, Fibonacci, Oresme and more.
There was also an odd statement that the Romans had no year zero because zero 'simply could not be expressed with Roman notation.' While it's true that it couldn't be expressed, a year zero wouldn't have been meaningful anyway. Years aren't a number line, and it seems perfectly logical to go from the first year before Christ to the first year of his life, if you aren't mentally boxed in by current mathematics.
Because it doesn't really work well as a book to read end to end, I can't give it more than three stars, but there's plenty to catch the attention of someone with an interest in mathematics and a curiosity as to how our weird and wonderful symbols came into use.
Mathematics either fascinates us or leaves us indifferent – there’s no middle ground. But for those who thrill at the insights math provides into the world around us, Raúl Rojas, a professor of mathematics, statistics, and computer science and an expert in neural networks and artificial intelligence, has provided this wonderful small book on the stories behind how the symbols that make math work were created.
Some symbols indicate, in a single pen- or brush-stroke, a concept like addition or subtraction, yet, somewhat surprisingly, took many centuries to evolve. The now universal symbol for ‘equal’ (two parallel horizontal lines) was anything but straightforward to conceptualise, let alone symbolise. Likewise, the stunning insight that basic numbering required a ‘null’ number – zero – took breathtaking genius.
So math is simple and complex, often counter-intuitive, and its history is full of characters.
Alfred North Whitehead and Bertrand Russell’s grand 1910 opus – Principia Mathematica – was an epic quest to find a complete and consistent set of axioms that encompasses the entirety of mathematical logic underpinning our ability to question, quantify, and understand the natural world. It would’ve been one of mathematics’ greatest achievements, but it instead became its grandest failure. It started out well, though ‘the reader has to move at a snail’s pace through a jungle of mathematical notation to find, after hundreds of pages, the proof that 1+1=2’.
So far, so good. Whitehead and Russell had faith that all that could be known could be reduced to provable logical statements. But it can’t.
The young mathematician Kurt Gödel took a long look at the issues and came away with two incompleteness theorems showing that any consistent logical system rich enough to describe the arithmetic of the natural numbers contains true statements it cannot prove within that system. In other words, 1+1 [very probably] = 2.
Either way, this book is a set of small joys for math-happy explorers of mathematical-cultural history. Recommended.