Science and Inquiry discussion

The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
Book Club 2013 > March 2013 - The Signal and the Noise

message 1: by Betsy, co-mod (new)

Betsy | 2160 comments Mod
Our reading selection for March 2013 is The Signal and the Noise: Why So Many Predictions Fail - But Some Don't by Nate Silver.

Please use this thread to post questions, discussions, and reviews.


Marco Paulo Naoe (mark_of_fauxlaw) | 4 comments I'm struggling to finish this book. Has anyone finished this already?


message 3: by Paul (new)

Paul | 9 comments Yes, I finished it. I thought it was a great book. Silver made a very difficult topic very understandable. Anybody who can predict the outcome of two presidential races with better than 99% accuracy deserves my attention. My suggestion is not to get stopped by the difficult statistical theory; for most of us it's too hard to follow in detail, so accept it as sound and rely on his summaries. What I got out of the book is that the new standard in journalism will be based on honest statistical analysis. This is the data age, and with it comes the supremacy of "the power of huge numbers." Silver is the prophet in this field and, luckily for us, seems to be honest. I'm sure there will be many who take his concepts and turn them into slimy manipulations.


Elizabeth Theiss Smith (dakotaprof) Silver's ideas about the limits of Big Data are important. He illustrates the problems inherent in research using "the fourth paradigm," the idea that we can search for patterns in huge data sets instead of engaging in the hard work of experimental research. Theory, models and logic are still essential.


Angus Mcfarlane | 73 comments The human component Silver discusses - theory, models, and logic - is what has struck me most so far as well. My work is becoming more and more data-focused, with an expectation that big data will be the answer to uncertainty (in geology and mining). While mining data is fascinating, the warnings that it can be used blindly are ringing some bells.


Elizabeth Theiss Smith (dakotaprof) One area in which data mining has been useful is the study of genetic diseases. 23andMe has amassed an enormous DNA database linked to detailed health questionnaires, which has allowed the group to identify two genes associated with Parkinson's disease. Now the real work of identifying the genetic mechanisms involved begins. The lesson for me is that Big Data can play an important role as an enormous arrow telling us, "Look Here!" But it cannot "prove" much in and of itself.


David Rubenstein (davidrubenstein) | 1040 comments Mod
I'm a month late getting a start on reading this book. I am finding it to be excellent. It is very well written. Each chapter is fascinating. Unfortunately, we didn't get much discussion about it.


Elizabeth Theiss Smith (dakotaprof) I'm game for continuing the discussion if anyone else is interested. Aside from the chapters on baseball and poker (about which I know nothing), I found Silver's book compelling. Graduate students often come up with correlations and multiple-regression findings that are "significant," in the sense of having a low probability of being due to chance, but they don't understand that without a logic model their findings have no meaning. I've heard papers by professors that suffer from the same problem. I wish this book were required reading in every PhD epistemology course.
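Elizabeth's point about chance "significance" is easy to demonstrate: correlate a random outcome with enough pure-noise predictors and a handful will clear the p < 0.05 bar anyway. A minimal simulated sketch (all data below are made-up noise; the cutoff |r| > 0.361 is the approximate two-sided 5% critical value for n = 30):

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)
n = 30  # observations per variable
outcome = [random.gauss(0, 1) for _ in range(n)]

# 1000 candidate "predictors" that are pure noise, unrelated to the outcome.
hits = 0
for _ in range(1000):
    predictor = [random.gauss(0, 1) for _ in range(n)]
    if abs(pearson(predictor, outcome)) > 0.361:  # roughly p < 0.05 at n = 30
        hits += 1
print(hits)  # on the order of 50 "significant" findings from noise alone
```

Roughly 5% of the noise predictors come out "significant", which is exactly what the 0.05 threshold promises; without a logic model there is no way to tell these apart from real effects.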


message 9: by David (last edited Apr 18, 2013 07:37PM) (new) - rated it 5 stars

David Rubenstein (davidrubenstein) | 1040 comments Mod
I finally finished the book. I think it's fantastic. The second half of the book emphasizes the usefulness of Bayes Theorem--which I think is quite appropriate. I use Bayes Theorem on a daily basis; while it is mathematically simple, it sometimes seems to work "like magic". Here is my review.
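As David says, Bayes' Theorem is mathematically simple even when its results feel like magic. A minimal sketch with made-up numbers (the sensitivity, false-positive rate, and prevalence below are illustrative assumptions, not figures from the book):

```python
def posterior(prior, likelihood, false_positive_rate):
    """P(hypothesis | evidence) via Bayes' Theorem:
    P(H|E) = P(E|H) * P(H) / P(E)."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# A test that catches 90% of true cases with a 10% false-positive rate,
# applied to a condition with 1% prevalence:
p = posterior(prior=0.01, likelihood=0.90, false_positive_rate=0.10)
print(round(p, 3))  # 0.083 -- a positive result still leaves the odds low
```

This is the classic base-rate illustration: with a low prior, even a seemingly accurate test yields a modest posterior, which is why Silver keeps insisting that the prior matters.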


message 10: by bup (new) - rated it 5 stars

bup | 21 comments I just finished (hey, I've been busy). I thought it was great. The idea that more and more data won't solve everything, and that many people (including professional scientists) blithely overfit models, is maybe the most important thing in the book, IMO.

Of course, the people who need to hear that message are those in the soft sciences, and it's going to be a long road. There's very little accountability in fields like economics, psychology, and sociology, where replicating a result is understandably difficult (because of the vast noise and tiny signal).

I think it's interesting that he didn't count himself as one of the people who, like weather forecasters, tend to understate their predictions. He predicted every state correctly in the last two presidential elections, and given his own stated probabilities, he was incredibly lucky to do so. Unbelievably lucky. So much so that I think we can say his probabilities were understated.
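The overfitting point above is easy to see numerically: a model flexible enough to chase noise always fits the data it was trained on more closely, while tracking the underlying signal worse. A rough sketch (the linear "signal", noise level, and polynomial degrees are arbitrary choices for illustration, not anything from the book):

```python
import numpy as np

def fit_errors(degree, seed=0):
    """Fit a polynomial of the given degree to noisy linear data.
    Returns (error on the fitted points, error against the true signal)."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0, 1, 20)
    y = 2 * x + rng.normal(scale=0.3, size=x.size)  # signal + noise
    coeffs = np.polyfit(x, y, degree)
    x_dense = np.linspace(0, 1, 200)
    fit_err = float(np.mean((np.polyval(coeffs, x) - y) ** 2))
    true_err = float(np.mean((np.polyval(coeffs, x_dense) - 2 * x_dense) ** 2))
    return fit_err, true_err

# A straight line vs. a wiggly degree-9 polynomial on the same 20 points:
for degree in (1, 9):
    fit_err, true_err = fit_errors(degree)
    print(degree, round(fit_err, 4), round(true_err, 4))
# The degree-9 fit always hugs the sample more tightly (lower fit error),
# which says nothing about how well it has captured the real signal.
```

The fitted-sample error can only shrink as the model gets more flexible, so a low in-sample error is never evidence against overfitting; that asymmetry is the trap.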


Elizabeth Theiss Smith (dakotaprof) I don't know that social scientists are worse overstaters of predictions than, say, medical scientists. Multiple regression models produce a lot of silly advice. Don't eat eggs because of their effect on cholesterol. Whoops! Eat eggs; no problem.

I loved Silver's point about the unreliability of most journal articles. One area in which the reliability problem shows up clearly is in studies of the (unproved) deterrent effect of the death penalty. After reading them all, the only thing that is clear is that we have no idea whether the death penalty has a deterrent effect or not, yet these "scientific" articles are used by proponents to "prove" the necessity of retaining the death penalty.


message 12: by Aloha (new) - added it

Aloha | 334 comments Hmmm...I wasn't planning to read this now, since I'm deluged with reads and very busy off-line at the moment. But the discussion is making me want to read the book; maybe I should put aside my current reads and finish it by the month's end.


message 13: by Betsy, co-mod (new)

Betsy | 2160 comments Mod
I agree. I wasn't planning to read this because it's long, I already can't keep up, and, after all ... statistics?! But it's sounding more and more interesting.

And with the Reinhart & Rogoff flap, I'm rethinking my interest in statistical analysis.


message 14: by bup (new) - rated it 5 stars

bup | 21 comments Elizabeth said - "I don't know that social scientists are worse overstaters of predictions than, say, medical scientists. Multiple regression models produce a lot of silly advice. Don't eat eggs because of their effect on cholesterol. Whoops! Eat eggs; no problem."

That's true enough, I guess, but I think wrong biology/chemistry/physics conclusions get reversed in a matter of years, not decades. And there's a lot less noise in studies that don't involve human behavior.


Elizabeth Theiss Smith (dakotaprof) It's tools and technology that drive discovery. Weather, as Silver points out, depends on chaotic systems; it was advances in computing power that enabled more accurate prediction. The social sciences depend on fairly primitive tools--survey data, votes, etc. One of the most interesting advances in political science has been the use of fMRI, eye-tracking, EEG, and physiology labs to learn more about the biological bases of political behavior. But as Silver suggests, it is theory building that allows us to find the signal in the noise. Technology is all noise without theory.

