
The Mathematical Theory of Communication by Claude E. Shannon and Warren Weaver

Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory in the Bell System Technical Journal more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.

Unknown Binding

First published January 1, 1949

214 people are currently reading
3923 people want to read

About the author

Claude Shannon

11 books · 66 followers
Claude Elwood Shannon, Ph.D. (Massachusetts Institute of Technology, 1940; M.S. Electrical Engineering, MIT, 1937; B.S., Electrical Engineering & Mathematics, University of Michigan, 1936), was a mathematician, electrical engineer, and cryptographer often referred to as "the father of information theory." His Master's Thesis, which went largely unappreciated for years, demonstrated that electrical applications of Boolean algebra could construct any logical, numerical relationship, and thereby solve any problem solvable by Boolean algebra; this eventually provided the underpinning for all digital computing.

Ratings & Reviews



Community Reviews

5 stars: 380 (55%)
4 stars: 204 (30%)
3 stars: 78 (11%)
2 stars: 14 (2%)
1 star: 4 (<1%)
Jake
211 reviews · 45 followers
June 23, 2016
“It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge.” ~ Pierre-Simon Laplace, Théorie Analytique des Probabilités (1812)

Human communication is a dichotomy between chaos and statistical dependencies. Letters in words are, obviously, in some way dependent upon the previous letters in their sequence. These collections of letters congregate in different combinations to form words, and those words form sentences. Shannon, quite beautifully and to the point, explores in this paper how we might use Markov models to imitate such dependencies. This imitation leads to his conception of entropy in communication, or, simply put, the average amount of disorder in sequences of letters.
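
As a rough sketch of what such a Markov imitation looks like in practice, here is a toy n-th order letter approximation in Python. This is my illustration, not code from the book; the corpus string and function names are invented:

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Map each `order`-letter context to the letters observed after it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, order=2, length=80):
    """Random walk over contexts: an n-th order approximation of the source."""
    context = random.choice(list(model.keys()))
    out = context
    for _ in range(length):
        followers = model.get(context)
        if not followers:  # dead end: restart from a random context
            context = random.choice(list(model.keys()))
            out += " "
            continue
        out += random.choice(followers)  # frequency-weighted via duplicates
        context = out[-order:]
    return out

corpus = "emma woodhouse handsome clever and rich with a comfortable home"
print(generate(build_model(corpus)))
```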

Here is an excerpt from Emma by Jane Austen; let us explore what I just talked about:

Emma Woodh*use, hands*me, clever* and rich,*with a comfortab*e home an* happy di*position,*seemed to*unite som* of the b*st blessings of e*istence;*and had *ived nea*ly twenty-*ne year* in the*world w*th very*little *o distr*ss or vex*her.

*he was the you*gest *f the *wo dau*hters *f a most *ffect*onate* indu*gent *ather* and *ad, i* cons*quence*of h*r si*ter'* mar*iage* bee* mis*ress*of h*s ho*se f*om a ver* ea*ly *eri*d. *er *oth*r h*d d*ed *oo *ong*ago*for*her*to ha*e *or* t*an an in*is*in*t *em*mb*an*e *f *er ca*es*es* a*d h*r p*a*e h*d b*e* *u*p*i*d *y *n e*c*l*e*t w***n a* g*v*****s, w** h** f****n l**t** s***t *f a m****r i* a*******n.

You, the reader, like most humans, can deal with some deletion because you are making probabilistic judgments in your head to fill in the letters that were taken away. As you make your way down the text, disorder is introduced gradually and the deletion becomes untenable for most. This is because more information is being obscured in the system, and with less information to act on, your probabilistic judgments become harder and harder to make.

By the previous reasoning, a string of letters with more uncertainty carries more information than a string with less. The famous unit that Shannon arrives at in this paper is of course the bit, our quantitative measure of information, which in reality is a measure of surprise. Entropy is maximized when all outcomes are equally likely. Each time we move away from equally likely outcomes we introduce predictability into the system, allowing us to ask fewer binary questions to convey the same amount of information.
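
A quick numerical illustration of that last point, using the entropy formula Shannon derives (H = -Σ pᵢ log₂ pᵢ); the example distributions are mine:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform, maximal surprise
print(entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits: predictability lowers entropy
print(entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: certainty carries no information
```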

‘Your act was unwise,’ I exclaimed ‘as you see
by the outcome.’ He solemnly eyed me.
‘When choosing the course of my action,’ said he,
‘I had not the outcome to guide me.’ ~ Ambrose Bierce
Nick Black
Author · 2 books · 879 followers
December 16, 2018
Amazon 2009-06-18. Where it all started, communication networks-wise. Perhaps the last great work of amateur science (I forget where I picked up this conjecture), "amateur" here being defined as anyone without a PhD (as opposed to "gentleman scientists" of a bygone era, men like Darwin, Lavoisier and Porter -- although, as emphasized in astronomy, this era may be returning with the advent of high-powered workstations and diffusion of open source simulation software. Gentleman science is pretty cool). The Mathematical Theory was published in 1948 and set up Shannon -- and the computing industry -- for life.

This work built on his 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits", referred to on Wikipedia as "the most famous master's thesis of all time" and expected to retain that title until I get my shit published in a year or so here (malicious grin).
Emre Sevinç
175 reviews · 430 followers
October 21, 2016
There isn't much to add: this is one of the classics, and if you have any serious interest in the topic, you owe it to yourself to read it at least twice.
Dean
141 reviews
June 19, 2022
Let me start with an anecdote about Claude Shannon taken from A Mind at Play by Jimmy Soni and Rob Goodman:

The thin, white-haired man [Shannon, 40 years after publishing The Mathematical Theory of Communication] had spent hours wandering in and out of meetings at the International Information Theory Symposium in Brighton, England, before the rumors of his identity began to proliferate. At first the autograph seekers came in a trickle, and then they clogged hallways in long lines. At the evening banquet, the symposium's chairman took the microphone to announce that 'one of the greatest scientific minds of our time' was in attendance and would share a few words, but once he arrived on stage, the thin, white-haired man could not make himself heard over the peals of applause.

And then finally, when the noise had died down: 'This is... ridiculous!' Lacking more to say, he removed three balls from his pocket and began to juggle.

After it was over, someone asked the chairman to put into perspective what had just happened. 'It was', he said, 'as if Newton had showed up at a physics conference.'


Now on to the review of The Mathematical Theory of Communication:
It is... brilliant!
2 reviews · 6 followers
November 2, 2008
A humble account by the father of information theory...the first sentence lets you know what you're getting into: "The word communication will be used in a very broad sense to include all of the procedures by which one mind may affect another." I probably wouldn't have read this book if it weren't for the assurance of broadness given from the beginning. It was a quick read, and I was left with the feeling that part of my mind had been tidied up.
Dwayne Roberts
431 reviews · 51 followers
September 3, 2023
Frankly, it was above me. I followed some of the development, but got lost along the way.
Peter
72 reviews · 2 followers
April 11, 2020
Rare example where the founder of a field of study also wrote one of the best books on the subject.
Kevin
34 reviews · 17 followers
July 17, 2018
The Mathematical Theory of Communication is a rigorous explanation of digital communication theory, or how a message generated and transmitted from one entity to another affects the state of the receiving system.

Shannon partitions the essential elements of communication into these primary buckets: sources, source encoders, channel encoders, channels, and the associated channel and source decoders. Shannon also covers concepts such as noise, which probabilistically corrupts the transmitted information as it passes through a channel.

As defined by Shannon, the fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point. As explained by Weaver, the essay addresses three primary problems. First, how accurately can the symbols of communication be transmitted from one system to another (the technical problem)? Second, how precisely do the transmitted symbols carry the desired meaning (the semantic problem)? And third, how effectively does the received message alter conduct in the desired way (the effectiveness problem)?

In the paper, Shannon moves from the simple scenario to the complex, starting with a mathematical formalization of messages transmitted across noiseless, discrete systems, before adding noise and then extending the theory to continuous information.
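
To make the source-encoder-channel-decoder pipeline concrete, here is a minimal sketch of the simplest possible scheme, a repetition code over a noisy binary channel. This is my illustration of the general idea, not a construction from the paper:

```python
import random

def encode(bits, r=3):
    """Channel encoder: repeat each source bit r times (pure redundancy)."""
    return [b for b in bits for _ in range(r)]

def channel(bits, p=0.1):
    """Binary symmetric channel: each bit flips independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits, r=3):
    """Channel decoder: majority vote over each block of r received bits."""
    blocks = [bits[i:i + r] for i in range(0, len(bits), r)]
    return [1 if sum(block) > r // 2 else 0 for block in blocks]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(channel(encode(message)))
print(message, received, "recovered" if received == message else "corrupted")
```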

Having recently studied Hofstadter's "Gödel, Escher, Bach: an Eternal Golden Braid", I was especially interested in how external or tangent formal information systems (such as the set of real numbers, the MU puzzle, etc.) may be generated and affect each other.

I found this paper to be especially valuable in building my understanding of core concepts within mathematical communication theory (specifically concepts like entropy, ergodicity, redundancy of the communicatory set, etc.).

After reading the paper, I find it interesting to look for stochastic processes, Markoff chains, and ergodicity within informatic systems. Ergodic systems have a sort of fractal quality to them, in which statistical trends from any sufficiently large sample may be generalized as attributes of the entire system itself.

I highly recommend this book if you have any interest in the academic understanding of how formal information systems interact and affect each other's state. This book sets the standard for rigorous analysis of binary communication, with a ruthless focus on the statistical efficiency of information transmission (given channel noise, etc.).
William Schram
2,340 reviews · 96 followers
April 19, 2024
I rewrote this review to correct some mistakes.

This book contains Claude E. Shannon's landmark 1948 paper "A Mathematical Theory of Communication". In it are the equations that define channel capacity and other such things. The book is also interesting because the first part contains an expository introduction by Warren Weaver.

The book has some mathematics, as you might be able to tell from the title. There are logarithms in there and some calculus, so it isn't for those who despise math. However, as I said, the initial part by Weaver explains some of the implications of Shannon's findings without going into the mathematics.

Shannon developed the idea of Information Entropy. He found a way to attribute a measurement to information.

I would reread it if given the time. Thanks for reading my review, and see you next time.
Craig
318 reviews · 13 followers
November 18, 2007
The mathematics quickly went over my head, but I like to keep this book around to look at in ignorant awe -- it's that important. His master's thesis -- written in 1937, and arguably the most important scientific or technical paper of the 20th century -- is more my speed: he connects electrical switches using Boolean algebra and invents digital logic circuits. Brilliant! (to quote the Guinness Irishmen) And it can be understood by a liberal arts major. Free to download at MIT's website.
Damon
Author · 43 books · 27 followers
January 16, 2013
Dense and intense, Shannon's book breaks down the technological communication challenges we have today. The beauty, of course, is that this book was written a half-century ago.

The number-driven, engineer-focused theories in the latter half of the book were out of my range, but the first chapter alone blew me away and managed to quantify communication ideas we are struggling with right now.
Roberto Rigolin F Lopes
363 reviews · 107 followers
November 16, 2015
Weaver starts strong by describing the breakthroughs; a pretty exciting intro. Discrete channels are palatable anywhere, but the continuous ones are tricky. Okay, perhaps I had a "noisy" channel while reading Shannon's ideas. Don't try to read it in public places :D
Cody
597 reviews · 49 followers
November 26, 2007
Shannon's use of entropy in his theory is fabulous and an idea that seems as though it could show up in a Pynchon novel.
Olmedo Vogue
15 reviews
June 13, 2018
Mathematics is very important in school, but it is really difficult.
Walt
179 reviews · 2 followers
January 11, 2019
Way over my head, but it sheds light on the basis of cryptographic analysis. The idea of redundancy in language is fascinating. Decrease redundancy and three-dimensional crossword puzzles become possible; increase it enough and even 2D crosswords become impossible. What are the redundancies of other languages? Has this been studied?
Mengsen Zhang
74 reviews · 26 followers
July 13, 2014
Great book! Satisfied my morbid habit of reading classics off a book-like object. Weaver's encapsulating article was a wonderful surprise and nice guidance. And of course Shannon: such a beautiful mind!! Most impressed with the way he unfolded his logic, and also his mastery of the art of using examples and diagrams. For someone who's not very familiar with math, some of the continuous-signal sections created lots of hair-pulling moments, but I eventually pulled through... anyways, it clarified some of my metaperception about communication in terms of perturbation, resonance, and delay.
Alex
586 reviews · 46 followers
May 15, 2016
Very informative and relatively understandable, particularly with Weaver's introduction as an aid. The concepts of ergodicity and information transfer as a probabilistic symbolic state machine come across clearly without needing to understand difficult mathematical proofs. The notion outlined by Weaver of the extension of some fundamental principles across all three "levels" of communication I also found to be extremely interesting, and reading through Shannon's paper made them all the more convincing despite Shannon's focus primarily on the mechanics of signal transfer.
Mitchell
25 reviews
October 31, 2017
This is the theory that created the foundation for how "information" became associated with the aggregation of data, and how the gleaning of information from the randomness of data was a measure of order. Critical for information theorists and those interested in understanding the separation between information as a concept of meaning, and information as "probabilities" that delimit the number of arrangements of data into intelligible construction.
Michal
17 reviews
February 1, 2018
This is truly one of the most important and groundbreaking works of modern science. Shannon dared to introduce the notion of information as a measurable, quantifiable property of signals and developed a powerful and surprisingly simple framework to describe it.

A must-read for anyone seriously interested in science and engineering, and likely compulsory reading for anyone in the information engineering field.
Armin
242 reviews · 10 followers
January 20, 2022
This book probably deserves a smarter reader than me. I picked it up on the recommendation of the CEO of Neuralink, who called it a 'very readable' book in one of their recent announcements, knowing that it might get into maths. I like math, but in the end there were way too many formulas in the book for me, so I stopped reading. I did read the long first part, the foreword by Weaver, which is much more approachable to my mind, and learned some great basic concepts about communication channels and language.
dead
9 reviews · 28 followers
July 17, 2015
It's kind of hard to review this - it's a very good book and more accessible than I expected. Very short and lucid, but kinda profound in its impact? I guess I can see why the title went from "A Mathematical Theory of Communication" to "The Mathematical Theory of Communication".
1 review · 1 follower
March 17, 2008
Nice reading. This publication initiated the theory of communication as it is known today. I bought it at a second-hand bookstore; my copy is a 1964 edition.
32 reviews
March 25, 2017
This is a seminal work in both computer science and the physics of entropy. I cannot delay reading it any longer. Too many other books depend on it.
David
1,154 reviews · 59 followers
October 14, 2013
Shannon's original 1948 paper on information theory. A relevant read, even today.
Alex
105 reviews · 30 followers
May 3, 2018
A bit too technical for my abilities.
Lucas Gelfond
100 reviews · 18 followers
April 15, 2024
TL;DR - amazing, got way more out of this time around. Rounding down to 4 stars because he gets kinda mired in formalism by the second half/as a book, but obviously this is some of the most important mathematics of the last hundred years. Really fascinating to see this stuff in its original form, and the sort of generality / breadth of insight it produced. Weaver’s intro is amazing, Shannon’s first half is quite readable and intuitive, gets a bit clunkier by the time he deals with continuous signals.

I read this for the first time about a year ago, with essentially none of the formal mathematical training, in an English independent study; I’ve since taken most of a math core (stats, linear algebra, an info theory/ML class, etc) and got way, way more out of it this time around. By any measure this is one of the most influential (and interesting) works of math of the last 100 years, and it’s quite incredible to read as it was originally formulated.

For one, Shannon is a pretty great writer. For the first three or four chapters there’s a remarkable lack of dense formal notation, and the simplicity, elegance, and generality of a lot of the methods, I think, underscores how profound some of the insights here are.

The first time I read this, I had a version without Warren Weaver's introduction, a huge mistake; I think his work is essential here in bringing information theory's significance into context. Weaver begins by defining communication broadly—the way one mind might affect another or, even more generally, "procedures by which one mechanism affects another mechanism." He then breaks communication down into three levels: (A) the technical problem, the accuracy of symbols; (B) the semantic problem, how precisely symbols convey meaning (which we might think of, now, as semiotics!); and (C) the effectiveness problem, whether the received meaning affects conduct in a desired way. In this division is an assumption that Weaver spells out: "communication is always attempting to influence the conduct of the receiver"—somewhat obvious, but sort of remarkable in its generality. Perhaps most interestingly for my study, Weaver suggests (and Shannon eventually elaborates) that the study of (A), the focus of the book, yields interesting insights into (B) and (C). This seems almost like a trope now, the hasty overapplication of information theory to other disciplines, but the appeal of such an approach is clear in this description.

Weaver describes all of the communications problems of this first level. A message is encoded into a signal, sent over a communication channel, and decoded by a receiver. There is often noise in the channel. This raises several questions that Shannon goes on to answer—measuring the amount of information, capacity of the communication channel, process of encoding (and how this might be optimal), the characteristics of noise, and how solving this problem might differ in the continuous and discrete case.

I always found Norbert Wiener’s work so seductive because of the breadth (and promise!) of his claims. If I recall correctly (from beyond here), Wiener and Shannon actually had considerable conflict over some of the credit for this work, but they seem to amply acknowledge one another, in Weaver’s description. Regardless, where Wiener’s contributions seem largely sci-fi and to have receded into the background, it’s quite remarkable to see Shannon’s, essentially, living up to the tone of generality and consequence described in the work, in their later use (across tons of disciplines!).

Weaver devotes a lot of time to this throughout, particularly to the ability to apply such techniques to meaningful or meaningless messages alike—semantics are irrelevant to the engineering problem. He then expands on Shannon's general formulation—that information is a measure of one's freedom when selecting a message (the logarithm of the number of available choices) and that probability shapes these concerns. For example, we might see English as a stochastic process, specifically a Markov process, in that the probabilities of one letter occurring depend directly on the preceding letters. Weaver really elegantly breaks down more of the formal structure here—defining entropy as a degree of randomness (and suggesting how fundamental entropy is as a principle), and defining redundancy and the "relative entropy" of a message (the amount of free choice that is not constrained by the statistical structure of the language, i.e. how certain letters cannot follow others).
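
As a toy numerical version of relative entropy and redundancy (my own sketch; the four-symbol alphabet and its frequencies are invented for illustration):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

freqs = [0.5, 0.25, 0.15, 0.10]      # invented first-order symbol frequencies
h = entropy(freqs)
h_max = log2(len(freqs))             # maximum entropy: all symbols equally likely
relative_entropy = h / h_max         # Weaver's "relative entropy"
redundancy = 1 - relative_entropy    # the fraction fixed by the source's structure
print(f"H = {h:.2f} bits, relative entropy = {relative_entropy:.2f}, "
      f"redundancy = {redundancy:.2f}")
```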

He similarly expands on coding, differentiating between C (the capacity of the channel, in bits per second) and H (the entropy of the source, in bits per symbol), noting that it is possible to transmit symbols at an average rate approaching C / H, but no faster. The best (or more nearly ideal) codings will raise the entropy of the transmitted signal toward the capacity of the channel, but more ideal coding incurs longer delays in encoding.
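
In symbols, this is Shannon's noiseless coding theorem (Theorem 9 in the paper): for a source of entropy H bits per symbol and a channel of capacity C bits per second, the average transmission rate R in symbols per second can be pushed arbitrarily close to, but never past, C/H:

```latex
\frac{C}{H} - \epsilon \;\le\; R \;\le\; \frac{C}{H} \qquad \text{for any } \epsilon > 0
```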

He also works through noise, which introduces uncertainty about what message was sent, and "equivocation," the uncertainty that remains about the original message when the received signal is known. Noisy channels, as such, must be compensated for with some redundancy.
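
In the paper's notation, the equivocation H_y(x) is the conditional entropy of the transmitted message x given the received signal y, and the capacity of a noisy channel is the best achievable effective rate over all possible sources:

```latex
R = H(x) - H_y(x), \qquad C = \max\bigl( H(x) - H_y(x) \bigr)
```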

Weaver ends by attempting to extend some of Shannon's methods to other domains, suggesting "semantic noise" when concepts are unclear, or the ability of a speaker to overload the capacity of an audience, like a communications channel. He similarly gestures toward future breakthroughs in linguistics, suggesting that language be handled statistically or as a Markov process. Weaver, interestingly, seems to be addressing an only partially mathematically trained audience in his writing—for example, there are footnotes explaining what a logarithm is, or how entropy is formulated.

The book then enters (sidenote: I almost wrote 'delve' instead of 'enter' here, but find myself avoiding 'delve' because of its known overuse in language models, LOL) into Shannon's portion, which formalizes more of this. Shannon acknowledges his debt to Nyquist and Hartley, and frames the work around the goal of reproducing at one point a message selected at another.

His progression is really logical—first, the simplest case, discrete noiseless systems. He then adds in noise, expands to continuous information, and formalizes further. In discrete noiseless systems, he enters into a pretty fascinating description of the structure of the English language, particularly showing its statistical structure using first-order, second-order, and n-th order approximations. As he writes, suggestively (wow, we really did just need to throw more compute and data at the problem!): "a sufficiently complex stochastic process will give a satisfactory representation of a discrete source." There are pretty interesting graphical representations of Markov processes, and results on entropy, in here.

I was pretty amused when he wrote about how James Joyce's Finnegans Wake is alleged to achieve a "compression of semantic content." The academic part of my brain is inclined to be pretty suspicious of this move across disciplines, or to dismiss it as overreach. However, I'm much more compelled by the idea that this, actually, is incredibly significant; my English professor, I think correctly, argues that Barthes and many others formulating semiotics later on are greatly indebted to information theory. In essence—this seems like a rare example of technical conclusions actually reaching beyond their discipline and having real, foundational impact.

Shannon then extends his early formalizations to a discrete channel with noise. This remains quite intuitive; the real capacity of a channel is reduced by the measure of missing information caused by noise. This is, essentially, the uncertainty about whether a value was mistaken; Shannon formalizes this, in the binary case, from the chance that a 0 was received as a 1 plus the chance that a 1 was received as a 0. Certain amounts of redundancy, chosen to match the type and amount of noise, will eventually allow for error-free reproduction—for example, redundancy in English is desirable.
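
A small sketch of that binary case (mine, though Shannon works a similar 1-in-100-errors example in the paper): the capacity of a binary symmetric channel with crossover probability p is C = 1 - H(p), the raw bit rate minus the equivocation.

```python
from math import log2

def h2(p):
    """Binary entropy function, in bits."""
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

def bsc_capacity(p):
    """Binary symmetric channel capacity: C = 1 - H(p) bits per symbol."""
    return 1 - h2(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.3f} bits/symbol")
# p = 0.5 gives C = 0: the output is independent of the input.
```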

For the continuous case, Shannon notes that one can divide the continuous signal into a number of small regions and solve on a discrete basis. He also admits that his method does not aim for maximal generality or mathematical rigor, which would require measure theory. This is a pretty interesting challenge to a lot of my personal conception of mathematics, the idea that the most compelling explanations would be those that were maximally foundational and rigorous; this still may be the case, but it's an interesting move from Shannon.

Regardless, I am not sure this felt sufficient; it's quite interesting, but, compared with the first two sections, I think Shannon gets mired in formalism and loses a lot of his coherence here. I'm reminded of some of Thomas Kuhn's work detailing how early formulations of a theory are often clunkier and less well phrased, and how the goal of "normal science" is to reformulate observations to be sufficiently general, intuitive, and elegant; this section, unlike the first, feels like the sort of observation that will eventually be refined, and Weaver notes that many of the more rigorous proofs would follow Shannon's work.

Either way, this is the sort of work I (and, I think, anyone in a technical discipline!) should aspire toward. Deeply foundational and paradigm-shifting, intuitive and written without an overabundance of formalism, simple, clear, elegant, and broadly useful across an incredible number of fields. It's sort of impossible to read this without gaining more admiration for Shannon; it's really quite inspiring.

My review from my first read is below (note: I came back to it!):

read for deak, rounding up from 3.5. parts of this were really incredible / really interesting and readable (the first section in particular, plus a lot of the introductions at the beginning of chapters), but I found a lot of the actual mathematical expositions / derivations considerably harder to follow / less thoroughly (or entertainingly) explained and sped through a lot of them, although I admittedly don't have a strong foundation in information theory / some of the discrete probability stuff, so this may have just gone over my head. anyways, still a really interesting project / it makes sense why this is considered so foundational / could very much see myself coming back to this again in the future
Alexa Daskalakis
28 reviews · 1 follower
March 17, 2025
Communication is not merely the transmission of information—it is the structural foundation of order itself. Claude Shannon's The Mathematical Theory of Communication is not just the birth of information theory; it is the moment when the very concepts of knowledge, entropy, and meaning were formalized into an exact mathematical language. This is the book that turned noise into signal, randomness into probability, and uncertainty into a quantifiable variable—changing the trajectory of computation, artificial intelligence, and even human cognition.

Shannon’s brilliance was in distilling the vast complexity of communication into a simple yet profound insight: information is not about meaning but about reduction of uncertainty. A message is not its content—it is a probabilistic resolution of possibilities. In other words, information theory does not care about what is said, only how much is conveyed. This radical departure from conventional thought did more than just lay the groundwork for digital communication—it provided the theoretical framework for everything from quantum mechanics to genomics.

Key Insights:

• Entropy & Information: Shannon’s entropy function redefined knowledge itself. The less predictable a message, the more information it contains. This insight extends beyond engineering—it underpins complexity theory, cryptography, and even neuroscience.
• Error Correction & Redundancy: Noise is inevitable, but Shannon’s coding theorems demonstrate that perfect communication is still possible. This realization is the reason you can stream high-definition video across continents with near-zero error.
• Compression & Computation: The very act of thinking—reducing vast inputs into manageable representations—is, at its core, compression. Shannon’s theory explains why intelligence itself depends on optimal encoding.
• Limits of Communication: No channel, no matter how powerful, can transmit information faster than its fundamental capacity, a law that constrains everything from human perception to artificial general intelligence (see the sketch below).

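That capacity limit for a continuous, noisy channel is the famous formula from the final part of Shannon's paper, C = W log₂(1 + S/N). A minimal sketch (the bandwidth and signal-to-noise figures below are illustrative, not from the book):

```python
from math import log2

def capacity(bandwidth_hz, snr_linear):
    """Continuous-channel capacity: C = W * log2(1 + S/N) bits per second."""
    return bandwidth_hz * log2(1 + snr_linear)

# A telephone-grade channel: about 3 kHz of bandwidth at 30 dB SNR (S/N = 1000).
print(f"{capacity(3000, 1000):,.0f} bits/second")  # roughly 30 kbit/s
```
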
The implications of Shannon’s work are staggering. If the universe is fundamentally governed by probabilistic laws, then reality itself may be best understood as an information-processing system. Physics becomes computation. Perception becomes signal interpretation. Thought becomes entropy minimization.

This book does not concern itself with philosophical speculation, yet its consequences ripple through every domain of science. Without Shannon, there is no internet, no artificial intelligence, no deep learning, no genetics as we understand it. His mathematics defines the limits of knowledge itself.

The Mathematical Theory of Communication is a brutal, unapologetic intellectual gauntlet—minimalist in exposition but maximalist in consequence. To grasp it is to see the hidden architecture of the modern world. To master it is to touch the very structure of reality.
