
How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life

Thomas Gilovich offers a wise and readable guide to the fallacy of the obvious in everyday life.

When can we trust what we believe—that "teams and players have winning streaks," that "flattery works," or that "the more people who agree, the more likely they are to be right"—and when are such beliefs suspect? Illustrating his points with examples, and supporting them with the latest research findings, Gilovich documents the cognitive, social, and motivational processes that distort our thoughts, beliefs, judgments, and decisions. In a rapidly changing world, the biases and stereotypes that help us process an overload of complex information inevitably distort what we would like to believe is reality. Awareness of our propensity to make these systematic errors, Gilovich argues, is the first step to more effective analysis and action.

216 pages, Paperback

First published January 1, 1991

283 people are currently reading
8,024 people want to read

About the author

Thomas Gilovich

13 books · 112 followers
From Wikipedia:

Thomas D. Gilovich (born 1954) is a professor of psychology at Cornell University who has researched decision making and behavioral economics and has written popular books on said subjects. He has collaborated with Daniel Kahneman, Lee Ross and Amos Tversky.

Gilovich earned his B.A. from the University of California, Santa Barbara and his Ph.D. in psychology from Stanford University in 1981.

Ratings & Reviews



Community Reviews

5 stars: 1,133 (35%)
4 stars: 1,087 (34%)
3 stars: 667 (21%)
2 stars: 210 (6%)
1 star: 62 (1%)
Trevor
1,494 reviews · 24.4k followers
September 22, 2009
I thought this was a remarkable book – five stars all the way – up until the last couple of chapters when it really didn’t live up to its initial promise. But I’m giving it 5 stars anyway, because the first two parts are so good they are more than worth whatever effort is necessary to get your hands on this.

It is a bit old now – first printed in 1991, but many of the ideas are still essential if you have any interest in how our judgement and decision making processes can land us in trouble.

The first two parts of this book are the most interesting – they deal with cognitive determinants and biases and then motivational and social biases.

Many of the mistakes we make about the world have to do with the fact that as humans we are not terribly good at working out what a random string of data might look like. And we do like to see meaning in things. This is the man who discovered that the ‘hot hand’ effect in basketball (the belief that if you are scoring well you should be passed the ball more often) is not supported by the data (in fact, the data suggests that you should probably pass to the person who missed their last shot, as they will be more focused on not missing again). Interestingly, when he presented this data in a report, ‘sports experts’ canned him as some academic know-nothing coming onto their turf and making a fool of himself – effectively proving they had been fooled by randomness all their lives and were determined to remain fooled.

So that this review doesn’t go on forever, I might just give a list of the biases discussed in these early parts of the book:

The problem of random data not looking random (he discusses the random falling of V2 bombs on London during the war and how these were felt to be targeted when in fact they were not).

Regression towards the mean. If you do really well or really badly in one attempt you are likely to do worse or better the next time. The Sports Illustrated Jinx is used as an example – people who appear on its cover are supposed to be jinxed immediately after – but you only get on the cover if you have had an incredible run of good performances so regression back toward your average is probably due, so not a jinx, just the way life is.

Self-fulfilling prophecies – where my behaviour on knowing you are an unfriendly bastard means either that I am generally cold towards you (thus confirming my bias) or I avoid you and never find out you are actually a darling – which leads to the not terribly comforting conclusion that while our negative first impressions are long lasting, our positive first impressions are likely to be proven wrong. (How we ever end up with friends at all is an interesting question.) You could call this the Pride and Prejudice bias.

Gambling – this stuff was fascinating. The myth is that gamblers remember their wins and ignore their losses, but in fact, the opposite is the case. Gamblers focus much more on their losses and remember their losses for much longer. The thing is that they also see their losses not as ‘losses’ but as ‘near wins’. ‘If only’ being the key phrase here.

The problem of multiple end-points. A psychic predicts at the start of the year that a famous politician will die. Not only will this be confirmed if the President dies, but also, in some people’s minds, even if he nearly dies. And of course, even in America, there tends to be more than one famous politician.

Excessive scrutiny of disconfirming information – the idea that, like the gamblers above, we put more effort into scrutinising information that opposes our views than into information that confirms them, if only so we can prove why it is wrong.

One-sided events – that we tend to see confirmation more in data that is ‘one-sided’ – e.g. ‘the phone always rings when I’m in the shower’ seems true because when the phone rings while I’m in the shower it is a pain, but when it doesn’t ring it goes unnoticed. The other side to this is, ‘I always know when someone has had a facelift’ – well, except when I can’t tell, but then I don’t know when that is, do I?

Flatter me – most people tend to believe flattering things about themselves. Most people think they are ‘above average’. Interesting discussion of students deciding if they are either introverted or extroverted. The first half of them are told that being introverted is strongly correlated with academic success – then the other half are told that being extroverted is correlated with success. Most students in either group then say they are the one that is related to success and go on to provide ‘evidence’ to show why they are either extroverted or introverted, depending on the group they were randomly assigned to (and can’t we all do that?). Driving is another interesting example of this – everyone is a good driver, but the criteria change. I’m a good driver because of my skill, you are a good driver because of your courtesy, and he is a good driver because of his patience.

Telling a good story – we sharpen and level the facts to a story so as to heighten the point of the narrative, but this can cause biases. Discusses UFOs and how the media tend to report them much more favourably than the ‘facts’ would seem to justify for the good of the story. The effect? Lots of people believing in little green men buzzing crazies on lonely country roads.

For the greater good – we tell ‘lies’ because they are ‘good’ for people. Drugs are bad, but the lies we tell our kids about them make them sound infinitely worse. And what about this little beauty, “Research studies now project that one in five heterosexuals could be dead from AIDS at the end of the next three years. That’s by 1990. One in five. It is no longer just a gay disease. Believe me.” Oprah Winfrey.

Plausibility – Did you know that Bobby McFerrin (aka, the guy that sang Don’t Worry, Be Happy) committed suicide? Well, actually, he didn’t, but it has the ring of truth about it, doesn’t it? In that strangely ironic kind of way.

Social Projection – we believe others are much more likely to believe what we believe than they necessarily do. When college students were asked if they would wear a sign saying Repent!, those who said they would also believed that 60% of other students would agree to wear such a sign. Those who said they would not felt 70% would also refuse to wear one. And that makes a mere 130%...

Victims of Circumstance – if we are studying law, say, we are much more likely to believe we ‘got here almost by accident’ but to then believe that everyone else got here due to their fundamental character. Although not discussed extensively in this book, this is one of my favourite biases. It helps us to blame others and excuse ourselves – worth its weight in gold.

Inadequate feedback – Sometimes there are things about us even our best friends won’t mention, but have you ever considered using a deodorant? Their silence is often construed by us to be support of our views, when it is often anything but.

The first two parts are then followed by three chapters looking at examples of questionable and erroneous beliefs: belief in ineffective ‘alternative’ health practices (well worth reading, even on its own), belief in questionable interpersonal strategies and belief in ESP. I didn’t think the ESP chapter in particular added anything to the book. Let’s face it, the only people capable of believing in ESP are challenged by having to wear lace-up shoes, so the stuff in this chapter was the least interesting in the book.

The fact remains that while this book is discussing the types of errors we are prone to fall into and how to avoid them it is utterly brilliant. It is just a pity that the less great bits come nearly at the end and so one is left with them as the main impression (big mistake, but not fatal). The first two thirds of this book are packed to overflowing with fascinating information that will – quite literally – change the way you think about the world. Now, that can’t be a bad thing, can it?
Tereza Vítková
81 reviews · 3 followers
October 20, 2021
A nice book - half popular science, half academic. Although it was written in the early '90s, most of the concepts are still current and applicable today. Overall it explains how we perceive reality, what errors we make when making decisions, and how to avoid them. At times it felt overstuffed with fantastic facts, which in such quantity lose their shine and, above all, their credibility.

Interesting points:
- the colour black unmistakably evokes aggression and violence –> sports teams with dark jerseys tend to be penalized more for fouls and unsportsmanlike conduct
- Broca's experiments: when he measured different brain weights between Germans and Frenchmen, he explained the difference by external factors (overall body size), but when publishing his famous theory that male brains are larger than female ones, he did not make this adjustment
- Gambling: most people think gamblers are motivated by occasional wins; that isn't the case. Far more often they fixate on their losses, which they perceive as "near wins", and so they bet more
- people see a physical resemblance between adopted children and their adoptive parents
- in medicine, what type of patient has the illness matters more than what type of illness the patient has
Shannon Hedges
138 reviews
August 6, 2014
This book examines cognitive biases. Gilovich describes various dubious beliefs, such as faith healing and other homeopathic nonsense. He investigates the thought processes that affect our ability to make sound judgments. It encouraged me to examine the shortcomings of my own reasoning. Highly recommend.
Kazen
1,475 reviews · 316 followers
September 11, 2020
I was assigned How We Know What Isn't So as part of a critical thinking course my freshman year of university, and it's one of the few books outside of my major that I've kept all these years.

On reread I realize that I was conflating my love of the class with the content of the book. There's nothing wrong with it - it's an academic, clear-eyed look at all the ways our mind can steer us wrong. We may think we're making a decision on complete data when there's actually a chunk that's missing. We overemphasize the importance of some evidence while dismissing other facts. And so on.

I'm glad I studied critical thinking early in my college career - the class, including this book, gave me a strong foundation in evaluating claims that has served me ever since. I'm also glad I reread it, as it refreshed concepts I haven't interacted with in a while. At the same time, I can't recommend it easily. It's almost 30 years old; I'm guessing a more up-to-date (and more engagingly written) book on the same subject has been published by now.

In short - a book with value, but I'm guessing that it has been supplanted in the decades since it was first released.
Ruxandra Tihon
21 reviews · 1 follower
October 13, 2019
Wonderful book, recommend it to anyone looking to improve their critical thinking. Written in the '90s but very much relevant to today's world.

“To truly appreciate the complexities of the world and the intricacies of human experience, it is essential to understand how we can be misled by the apparent evidence of everyday existence. This, in turn, requires that we think clearly about our experiences, question our assumptions, and challenge what we think we know.”
Cell
451 reviews · 31 followers
August 19, 2020
I broadly agree with the book's "conclusion" that human psychology is subject to the various fallacies it describes, but I still have many objections to the book itself.

The author uses statistics on "the probability of making the next shot after making one, two, or three in a row, and after missing one, two, or three in a row" to prove that the hot hand does not exist. However, if these statistics cannot capture influences broader than the hot hand, such as a player's remaining stamina or the opponent's strength (less stamina or a stronger opponent lowers the overall shooting percentage, making streaks of misses more likely; conversely, a higher overall percentage makes streaks of makes more likely), then using data that averages away all these factors to prove that short-term hot hands do not exist is something I cannot accept.

Next, the author uses a map of "V-1 rocket impact points in central London" to argue that residents' belief that the impacts clustered in particular districts was a clustering illusion. Assuming the V-1s did not fall vertically, the probability distribution of the impact points should still be calculable (theoretically the equal-probability contours should be ellipses, with the major axis pointing toward the launch sites on the continent, or perpendicular to that direction near the edge of the rocket's range). Taking data whose theoretical probability is not uniform across locations (and in the map shown, the impacts do cluster along a diagonal...), skipping the argument, and simply declaring the observed pattern a clustering illusion is something I cannot accept.

The early part of the book left me feeling that weak experimental evidence, multiplied by a bias toward the "correct" answer, was used to reach the correct conclusion that people suffer from all sorts of psychological fallacies. Other parts of the book were new to me, which makes me a bit embarrassed to give it two stars, but the wariness it provoked made reading it exhausting.
--
Stopped at 21%
--
One big drawback of Bookwalker is that you cannot copy text from the book, which would let me quote it easily when writing a review.
--
The publisher probably never considered that promoting the book with the word "Cornell", which is not in the original title, would, in my view, come across as "why are you needlessly trashing the university's reputation?"
Nancy
853 reviews · 22 followers
June 4, 2017
Although this was an interesting book, as someone with a psychology degree I didn't find anything groundbreaking in it. It gave a thorough discussion of why people are so prone to falling for erroneous beliefs, and it showed how difficult it is to do otherwise, using evidence from psychological studies coupled with some quite detailed explanations. I think I wanted more examples over and above the theory, but they were confined to the last couple of chapters dealing with belief in alternative medicine and ESP. However, the book was written in the early 1990s, so I think the field of skepticism has come a long way since then.
Juliet
149 reviews · 9 followers
March 11, 2022
Was pretty useful for my theory of knowledge essay
303 reviews · 16 followers
December 29, 2020
In "How We Know What Isn't So," Thomas Gilovich lays out a useful and engaging introduction to the importance of critical thinking, the risks of cognitive bias, and the implications of decision-making when we aren't attentive to these concerns. Compared to many books & articles on the cognitive science of fallacious reasoning, Gilovich's book is relatively engaging and easy to read. (I do have some minor qualms about the typesetting of the 1993 paperback edition, which is a fairly dense wall of text, so consider an alternative and more recent layout if possible.)

Like many such books, Gilovich's first few chapters survey concepts like cognitive bias and Kahneman & Tversky's Bounded Rationality. This writing is more effective than much of the source material, and I'd think seriously about using segments of this book in undergraduate level classes (as Kahneman & Tversky can prove to be a little much directly). The content does suffer some from being almost 30 years old at this point (e.g., it doesn't include more recent advances in cultural cognition, polarization, etc), but such is life.

The final few chapters of the volume take on much more applied topics, ranging from alternative medicine to ESP to overconfidence in social interactions. While the scope and scale of these chapters varies from one to the next, it is nice to see the book apply the lessons learned in the first portion in such a pragmatic way. Again, setting aside a hair of datedness in the writing and examples, these vignettes offer useful case studies of the theories introduced early on.

The book could stand to be a little heavier on frameworks. A lot of psychological theory is introduced (e.g., different kinds of cognitive biases, different elements of bounded rationality), and a more straightforward framework (e.g., the ten kinds of fallacious reasoning to avoid; five places thinking goes awry; etc.) could help to (a) make the big takeaways more distinctive and (b) give consistency of analysis in the case study portion. But Gilovich can be forgiven here: I struggle to do the exact same thing when I teach disaster psychology - it's a tricky challenge to simplify these theories in a way that doesn't flatten out or 'dumb down' some of the key content.

Overall, I was pleased by this book. It's an approachable introduction to the ways our cognition falls apart. It's readable, generally engaging, and full of content that will make you wiser if you understand it.
Sajid
453 reviews · 106 followers
September 1, 2021
How We Know What Isn't So is a smart book written by psychologist Thomas Gilovich. In this book he questions the very way we believe and make judgements in our everyday life. So it deals with our cognitive biases. He points out that cognitive biases are really necessary for our survival, as they are among our strongest survival mechanisms, but the situation gets worse when we take shortcuts and let our cognitive functions get lazy. And that's when we see patterns even in chaos and make sense out of nonsense.

The funny thing is that, unbounded by reality, we’ll believe some crazy things. Without measurement, we can believe we’re the best physician, architect, developer, or whatever career we’re in. Without some specific, tangible, and irrefutable evidence, our ego can make up any story it likes. We’ll emphasize the characteristics we’re good at and ignore the ones we don’t feel like we excel at. We’ll use whatever reference point makes us feel better about ourselves. “At least I’m not as bad as they are” is a common internal monologue.

In the land of beliefs, we fall victim to numerous cognitive biases and errors. The fundamental attribution error causes us to simultaneously judge others more harshly and to explain away our failures based on circumstances.

The truth is that we will believe what we want to believe – about ourselves and others – until there is some sort of inescapable truth that forces us to acknowledge that our beliefs were wrong. We are amazed by the wives who report that they trusted their husbands, only to realize after the fact that their blind trust was misplaced. They had every reason to suspect there was a problem, but they refused to see it – with disastrous consequences.

So it is a book where the writer encourages us to think critically. And most of all it scrutinizes the attitude of our everyday biased thinking.
223 reviews · 4 followers
April 18, 2019
This book is reminiscent of other works I’ve read such as Shermer’s Why People Believe Weird Things and Mackay’s Extraordinary Popular Delusions and the Madness of Crowds, neither of which I found particularly interesting (the latter I didn’t even bother to finish).

As I’ve already read quite a bit of literature on this topic, the book comes across to me as common sense. Humans generalize, tend to accept at face value information that serves their preconceived notions, and will instead scrutinize any information that challenges them, however true or compelling it may be.

I think one quote sums up the whole of the book quite well: “... the lifestyles we lead, the roles we play, and the positions we occupy in a social network deny us access to important classes of information and thus distort our view of the world.”

Otherwise I feel like the information is just repeated in different ways, making the book monotonous to me.

The bottom line is our expectations, preconceptions and prior beliefs influence our interpretation of new info. Perhaps this common sense isn’t so common. This book may be generally useful to those unaware of their own personal biases.

The one part that stood out from other works was the concept of self handicapping to alter or normalize other people’s perceptions of our shortcomings.

I do appreciate the ending which pushes for more self awareness and analysis as well as science education to combat these issues.
Sonny Fertile
62 reviews · 2 followers
June 7, 2025
More ammunition to support the ever-escalating position that none of us really has any idea how much of what we think we know (what we have all along been told by teachers, priests, history books, news media, and even each other in innocent naivety) has any real basis in actual truth. At least anyone with a working brain already knows that social media and Google, not unlike entertainment rags like the Enquirer and professional wrestling, have no relationship with the truth. Yet some humans still read and/or watch wanting to believe it's true, with so sad a desperation that they learn to.
I would recommend this book to anyone. Everyone will get something out of reading it. About the world around them. And, if they have an open mind, even themselves. And for that reason, I most emphatically recommend this book to any literate MAGA cult members. That is, of course, if there are any literate MAGA members.
Drew Flynn
153 reviews · 27 followers
April 23, 2018
Some chapters were more interesting than others, but those interesting ones were at times incredible. The book makes you want to constantly keep reading more, while your brain wants you to chill so it can process it all. A wonderful dilemma.
Sharif Mahmud
7 reviews · 28 followers
April 1, 2017
Eye-opener, mind-bending tour to the everyday experience.
803 reviews
Read
April 8, 2024
I'm not going to give this a rating because I don't feel like I gave it my full attention. It read more like a textbook to me. Not what I enjoy most. Some interesting concepts were discussed.
Jeff
276 reviews · 4 followers
January 27, 2020
In the Freakonomics podcast #382 (http://freakonomics.com/podcast/live-...), Stephen Dubner called this one of his favorite books ever, so I decided to add it to my reading list. I found it to be an interesting read that really challenged me to consider how and why I think about some things that seem crystal clear to me. At a minimum, I will benefit from the foreknowledge that I just *might* not have all the facts and information about some topics if I want to make informed decisions, and perhaps I ought to consider other possible viewpoints before I climb up on my soapbox!
Steve
37 reviews · 18 followers
January 21, 2012
This is a really excellent book, though there have been a lot more engaging reads in the psychology-for-the-general-public genre of late. If there weren't so many better-written ones, and the book itself weren't nearly twenty years old (I am hoping for a second edition), I'd have given it five stars (there's been more research on how people think and decide since). Unfortunately, lately, I've been meeting lots of people endorsing truly ridiculous ideas without thinking critically about them who could really benefit from Gilovich. For example, a colleague recently asked me to read a book on a meta-analysis of ESP that left out large, federally funded studies of the phenomenon... how can I take such studies seriously? If I know of such studies and they're omitted [I'm not an ESP researcher] and the author does not, clearly the author hasn't done a comprehensive job. Or, don't get me started on alternative medicine -- not that I'm against it per se -- just that I'm against it when evidence doesn't support it or shows that it has problems (see the ridiculous Dr. Oz, or those who think vaccine preservatives cause autism -- why should the money-grubbing pharmaceuticals be the only ones to profit off the suffering?). Gilovich has great credentials as well -- he's published with numerous famous psychologists, including Nobel laureate Daniel Kahneman, whose quote from May 27, 2006, always remains above my desk: "One of the signs that you're doing science is when the data forces you to change your mind." I read Gilovich's book after sitting in on a graduate epistemology class for fun near the end of my graduate program in psychology, in which the philosophy professor suggested the text (incidentally, the professor later went on to hold a joint appointment in philosophy and psychology). I think everyone who studied psychology would think better of the field (and pretty much everyone in general would think better about evidence) if they'd read Gilovich's book.
Jacob
879 reviews · 71 followers
January 5, 2016
This is a good solid work about people's irrational beliefs, covering just about all the basic psychological mechanisms. It's not breaking new ground, but that's because it's more than 20 years old. Still, it brings a few things to the table that I haven't seen in most other discussions of this topic:

- The author recognizes that while we see many occasions when people form opinions that are incorrect or at best not supported by (complete & unbiased) evidence, human nature leads us to make good decisions and form correct beliefs _most_ of the time. So human reason is a pretty good mechanism with some flaws, not the opposite.

- The "probabilistic" sciences (ones where data is messy, noisy, and often not conclusive) seem to prepare people better to evaluate their own experience. They can recognize these biases and counteract them in their own thinking better than those educated in the "deterministic" sciences such as chemistry or physics. This is not just talking about psychology as the probabilistic science either, but other disciplines that deal with statistics such as medicine.

- There's a recognition of how challenging it is for people to experience something and leave it unexplained. To my knowledge, this tendency hasn't been studied academically, but I think such study could be very interesting.

The one serious drawback to this book is that it is written like a psychology paper or textbook, so it's kind of dry and that almost never lets up. I would have given it four stars except for that; I LIKE psychology and it's still hard to read things written so much that way.
55 reviews · 18 followers
November 17, 2014
How We Know What Isn't So is a well-researched book on social psychology by Thomas Gilovich, a psychology professor at Cornell. It talks about why our mind seeks out dubious or erroneous information to feed our biases, rather than negating or clarifying them, and supplements its claims with examples of research that has demonstrated this in the past.

Reading this book will help you look with skepticism at commonly pervasive superstitions, medical 'quackery', evidence offered in support of the existence of paranormal activities, or any other implausible information you might come across, even if it comes from an authentic source, questioning, for example, the source of your credible source. It will help you do your research with a more balanced point of view, seeing all sides, rather than simply "search(ing) for evidence that is biased towards confirmation."

The book has nothing new to offer if you already look at the world closely, observing why people believe certain things and not others.

The last chapter tries to explain why social scientists (Gilovich included) are better placed than the "hard" scientists or people from other fields like law to dispel erroneous beliefs prevalent in everyday life.

It was an interesting book (including a chapter on ESP and why people believe in it) with lots of notes and explanations, relatable examples, and suggestions of other books one can read on the subject if interested. An enjoyable read, through and through, but only if you are interested in the subject of everyday life and phenomena like coincidences.
Chris Boutté
Author · 8 books · 273 followers
March 6, 2023
2nd read:
This was one of the first books I read on human irrationality and faulty reasoning, and I remember thinking it was super boring. Not only was I bored with it, but a lot of the concepts went over my head. Recently some other books I’ve read have mentioned this book or Thomas Gilovich’s work, so I decided to give it another read, and oh my God. This book is now one of my favorites on the topic.

I must have been in a weird headspace the first time I read it, or it just didn’t capture my attention because I was new to the topic. But this book is hands down one of the best when it comes to understanding why people believe weird things, and it also helps the reader learn how to become a better thinker.

Even though this book was originally published years ago, everything in it holds up. I especially liked the chapter discussing how we can ask biased questions, which skews the answers we’re getting. I don’t see many books discussing this.

This is a must-read book to understand faulty reasoning, and it’s definitely going on my list of books to revisit at least once a year.


1st read:
I regularly try to read books that make me question my thinking and recognize my unconscious biases, and I think it’s something we should all be doing. This book doesn’t disappoint. I learned of the author through a Jonathan Haidt book, and I’m glad I found him.
Dfordoom
434 reviews · 123 followers
April 4, 2008
Author Thomas Gilovich gives us concrete examples of the ways in which people can come to believe things for which there is no genuine scientific evidence, and the common errors people make when trying to make sense of statistical and probabilistic data. He shows us how people can consciously or unconsciously delude themselves, and how we so often ignore evidence we don’t like and concentrate on evidence that appears to support views we want to believe are true. The book is moderately scholarly while still being comprehensible to the math-challenged (like myself). He deals not only with beliefs about scientific matters but also beliefs about social behaviours and strategies. He offers an interesting suggestion for improving the education system to give people the ability to understand complex issues better – not more emphasis on the “hard” sciences, but more emphasis on the “soft” sciences. His reasoning is that social sciences like psychology teach people to deal with “the messy, probabilistic phenomena that are often encountered in everyday life.” This is a highly entertaining and extremely stimulating book, the best of its kind that I’ve read.
Sue
1 review
November 5, 2011
This book makes you think about how unthinking we are, from believing that infertile couples are more likely to have a biological child once they have adopted one (not too many consequences here) to believing that seal penises are a natural Viagra (40,000 seal penises were found in one raid). We tend to notice only those events which reinforce our own beliefs and prejudices - we ignore evidence which disproves them. Every statement is peppered with entertaining real-life examples, including an explanation for phobias.

Controversially for some readers, there is a sobering chapter on belief in alternative medicines - the author is a cynic in this area and gives persuasive reasons why. Eye-watering amounts of money are spent by desperate people chasing a cure. The section on 'self-handicapping' is revealing, and this reader found herself uncomfortably empathetic with the people described. ESP, religious experiences, gambling and social interactions all come under Gilovich's spotlight. Whether or not you enjoy the book, if you stick with it you will at least know more about yourself than when you started.
Profile Image for Jake Losh.
211 reviews · 24 followers
September 3, 2015
This is an exceptional book. I got about six chapters deep into it several months ago when I decided that I was trying to read it way too fast. At under 200 pages (excluding notes and references) this is an extremely dense and comprehensive treatment of so many aspects of human reasoning. This book is intense and will make you question so many things about the way you see the world. Once you start to read it, you'll start to see the application of its lessons everywhere and you cannot unsee them. I highly recommend this book.

When I look at the 1 and 2 star reviews for this book, they all either have no text or else complain that the book is "dry" or "boring." I'm not sure what they were expecting. The examples given are all interesting, and you've probably seen them before in other books, but this book is the granddaddy of them all. It is a bit dry, but if you take the time to read it, the journey is incredibly rewarding.
6 reviews
February 1, 2008
How We Know What Isn't So is an outstanding read for anybody who tends to be a skeptic or merely wants to be a critical thinker. While the author is an academic, the book is well-written and actually a fairly quick and easy read. The purpose is to explore how we come to understand things, and primarily it focuses on how we come to believe things that are not true. Whether it is ESP or alien abductions or more common myths like strange things happening during full moons, Gilovich documents a widespread belief in these phenomena and then explores why these beliefs are so widely held when they are clearly false.

Gilovich doesn't bring up the subject of religion, but he doesn't really have to. The framework works well for any atheist or agnostic to figure out why a large percentage of human beings believe in a higher power.

Profile Image for Olga.
100 reviews5 followers
January 4, 2015
It's a good complementary read to another book, "Mistakes Were Made (But Not by Me)" by Carol Tavris and Elliot Aronson. Both books attest to human fallibility: our failure to consider our biases when offering explanations for seemingly unexplained phenomena, so that we seize on early, foolish arguments and transform them into beliefs without seeking scientific or statistical justification for everyday experiences, falling into logical fallacies along the way.
Author 6 books · 8 followers
September 1, 2013
I found this a very informative discussion of the many ways our brains can persistently mislead us into erroneous conclusions via otherwise perfectly normal, useful and effective psychological processes.
Profile Image for Cristobal.
724 reviews · 62 followers
October 19, 2016
This is a must-read for those interested in cognitive bias. It will serve as a fantastic companion to "Thinking Fast and Slow," although if I could only read one of the two I'd go with "Thinking Fast," since it is much better structured and easier to understand.
Profile Image for Aurélien Thomas.
Author 10 books · 120 followers
October 20, 2024
I picked this up after reading The Demon-Haunted World: Science as a Candle in the Dark by Carl Sagan, as he refers to it at some point.

Now, don't get me wrong: this is a very good read, doing exactly what it says on the tin, that is, exposing some of our most common fallacies when assessing evidence and so forging opinions. Personally, for instance, I particularly like his insistence that being prejudiced or wrong is not being irrational or stupid but merely abiding by what he calls a 'flawed rationality'. I was also really engrossed by the chapters on the false consensus effect, as they strongly reminded me of an effect detailed in another book I had read recently (The Spiral of Silence: Public Opinion - Our Social Skin, 2nd Edition, by Elisabeth Noelle-Neumann): the spiral of silence, whereby individuals monitor the expression of their own opinions based on what they believe other people think. The fact that Thomas Gilovich dares to expose how erroneous thinking can lead to dangerous groupthink and, beyond that, to catastrophic policies in various fields makes it all the more compelling and relevant.

The thing is, if you are like me, that is, interested in rational thinking, the scientific method, how prejudicial thinking operates, and/or some of the commonest claptraps in assessing data, then there won't be much to learn here. Even the most engaging parts putting the reader to the test (e.g. a card experiment asking which cards you would choose in order to confirm or disconfirm the veracity of a statement) will feel like déjà vu to those used to this type of literature. Is that bad?

All in all, there is no denying that I enjoyed it. When it comes to reasserting the key features of critical thinking, it can feel like barging through an open door with a battering ram, just to state what should be obvious! And yet... And yet, as someone with prejudices of my own (e.g. I dislike the press, and the British press especially), I for one never cease to be amazed by how the media seem utterly incompetent in basic numeracy when assessing statistics or, worse, unable to comprehend the necessity of questioning sources before tossing out opinions passed off as facts. Needless to say, the parts on how the media can dangerously serve 'the fallibility of human reason' were right up my street!

Is it ground-breaking? Absolutely not. Do I recommend it, though? You bet I do! We can all be easily misled. It's good to be reminded of that.
175 reviews
May 30, 2025
Thomas Gilovich takes us through the facts behind spurious reasoning, anecdotal evidence, and incomplete analyses. In the current climate of fad diets, herbal remedies, pseudoscience, and, most recently, "fake news", it remains an important book, laying bare the faulty reasoning that can lead to errors in judgment or to falling for some con artist's story.

The primary focus of the book is an analysis of how the human mind tends to bring order out of randomness. The early chapters deal with random events, and Gilovich points out how ordered random events can appear. For example, when flipping a coin 100 times, you should expect to see 4 or 5 heads (or tails) in a row at some point. This seems obvious, but he then moves to "real life" examples to show how such randomness can lead to a belief in hot streaks when gambling, for example.
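The coin-flip claim is easy to check with a quick simulation. This sketch is not from the book; the function names and parameters are mine, and the probabilities are Monte Carlo estimates rather than exact values:

```python
import random

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

def prob_run_at_least(k, n_flips=100, trials=10_000, seed=0):
    """Estimate the probability that n_flips fair coin flips contain
    a run of at least k identical outcomes (heads or tails)."""
    rng = random.Random(seed)
    hits = sum(
        longest_run([rng.randint(0, 1) for _ in range(n_flips)]) >= k
        for _ in range(trials)
    )
    return hits / trials

print(prob_run_at_least(5))  # a run of 5 is nearly certain in 100 flips (roughly 0.97)
```

The point matches Gilovich's: runs of 4 or 5 look meaningful to the eye, yet a fair coin produces them almost every time in 100 flips, so "streaks" of that length are exactly what pure chance predicts.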

Similarly, he tackles and explains studies that clearly show the power of anecdotal evidence - we'll take information at face value if we hear it from someone we trust, even if we don't know where that person got the info. Likewise, people tend to scrutinise evidence that contradicts their beliefs more closely, in an attempt to find fault with it, while they are likely to "mindlessly" accept anything that seems to support a belief. This is human nature, which means it's human nature not to evaluate evidence objectively, and that is what leads to spurious reasoning and belief in concepts that have no scientific support (the efficacy of many herbal medicines, belief in ESP, etc.)

In this era of increasingly acrimonious elections and electorate polarization, this book seems doubly important. It's tough for the average person to sort through the steady stream of propaganda. Gilovich's book points the way to asking the right questions on the way to finding the truth, which is something everyone needs to be able to do in a democracy.
Profile Image for Stephen.
340 reviews · 10 followers
November 26, 2017
Mostly a book about cognitive and social psychology, by a research psychologist, but with some examples in the form of things "everybody knows" but which "just ain't so" - ESP, New Age "holistic" treatments, and (less supernaturally) common but rarely-successful social behavior. Kudos to the author for sticking pretty much to what he knows (note how the three examples are focused on the mind or social interaction), but it's a relatively light treatment of each.

The preparatory chapters on how people form erroneous beliefs are much more fleshed out, and nice to read all gathered together by someone who actually has expertise in the field (and research credit, to boot). The bibliography is probably the greatest asset for anyone who otherwise has pretty much heard all this before. Also of historical interest is the running focus on AIDS, as this was written in the early Nineties and nobody was sure if it would become a pandemic (i.e. among heterosexuals) or not.

Overall, the cog psych of bias and error is something everyone should read about, even if the back half of the book falls a bit flat. I'll rate it 3.5 stars, rounded to 4 because of the bibliography.
