Michael Shermer's Blog
September 21, 2019
Anterior Cervical Discectomy & Fusion (ACDF) or, My Big Bike Crash and Surgical Adventure
On July 26 I had a nasty cycling crash. I’m fond of telling people that cycling is a low-impact sport, unless you impact the ground, which I did that day. My cycling buddy Bob McGlashan and I were riding the Lake Casitas loop from Summerland, down the coast to Ventura (there’s a nice bike path next to the 101 freeway) and up the road paralleling Highway 33 north. We had a howling tailwind and were flying at over 20 mph on a modest upgrade. At the big left turn to the lake, a car was coming the other way at high speed, so I timed my left turn for after he flew past me, but at the last minute he hit the brakes and turned right. I had to veer right around him a little, which would have been fine, but as his car passed in front of me there was a pothole I had to swerve right to miss, sending me straight into the curb, which I hit at around 20 mph, flipping me right into a dirt-and-rock field. I slammed my head, shoulder and hip really hard, the same side as my total hip replacement from 2013. There’s a dent in my helmet from a rock, so it did its job, but I could barely walk. We were too far from home to call my wife, and there were no Ubers anywhere near us, so I tried riding; there was almost no pain at all, and we rode the two hours back to the car. But by the time I got home I couldn’t walk from my car to the house without assistance, so I went to the Cottage Hospital ER for an X-ray and then a CT scan. There were two small pelvic fractures: one on the inferior ramus of the pelvic bone (not a problem), but the other was on the acetabulum of the hip, where the ball of the joint presses up against the socket (mine is titanium and plastic), so every time I put weight on the leg it hurt like hell.
I also pinched a nerve in my neck, which two days later was unbearably painful. I couldn’t sleep without strong painkillers. Another cycling partner of mine, Dr. Walter Burnham, happens to be a world-class orthopedic surgeon specializing in the spine, so he did an MRI on my neck and found some pretty severe degeneration of C-5, 6, and 7—not from the crash but from “life” (he said, when I asked). So that led to the ACDF surgery, which Walt did on September 6, two days before my 65th birthday.
I would have done it earlier, but Medicare coverage started for me on September 1, so I had to wait in order to have the surgery fully covered. Yes, I have health insurance, with Blue Shield, and a fairly expensive plan at that, but even so my share of what would have been owed without Medicare would have been over $10,000. So it was worth waiting a couple of weeks. I could rant for pages more about our messed-up healthcare system, but I won’t, as at the moment I am grateful for the near miraculous performance of Dr. Walt and his team, along with everyone else who took care of me along the way. (That is, it’s the system that’s messed up, not the people.)
I did notice how careful everyone is now with opioid painkillers, which saved me from being miserable for weeks. When the ER docs wrote a script for a codeine painkiller, unbeknownst to me the law now requires that any opioid prescription include a prescription for naloxone, the nasal spray that saves your life if you overdose. The codeine (Tylenol 3) was $1.57 for 12 pills. The naloxone was $75. My wife Jennifer picked up the order for me. I called the ER to complain about the price differential. The woman explained the law to me. I told her I understood, but asked, “The instructions say to take one every four hours. What moron would take all 12 at once?” She replied: “You’d be surprised.” Alas, that’s the world we live in now.
September 16, 2019
Conspiracies & Conspiracy Theories: What We Should and Shouldn’t Believe—and Why
Audible Inc., the world’s largest producer and provider of downloadable audiobooks and other spoken-word entertainment, in conjunction with The Great Courses, is creating audio-only, non-fiction content for Audible’s millions of listeners. The first three titles include Dr. Michael Shermer’s new and original course, Conspiracies & Conspiracy Theories: What We Should Believe and Why.
Watch Dr. Shermer’s introduction
Brief Course Description
What is the difference between a conspiracy and a conspiracy theory? Who is most likely to believe in conspiracies, and why do so many people believe them? Is there some test of truth we can apply when we hear about a conspiracy that can help us determine if the theory about it is true or false? In this myth-shattering course, world-renowned skeptic and bestselling author Dr. Michael Shermer tackles history’s greatest and most widespread conspiracy theories, carefully deconstructing them on the basis of the available evidence. In the current climate of fake news, alternative facts, and the rise of conspiracy theories to national prominence and political influence, it is time to consider how to distinguish true conspiracies (Lincoln’s assassination, the Pentagon Papers, Watergate) from false conspiracy theories (Sandy Hook, 9/11, the fake moon landing). You’ll learn how conspiracies arise, what evidence is used to support them, how they hold up in the harsh light of historical and scientific analysis, and why people believe them. Illuminating and compelling, this course just may give you the detective skills to parse the truth of the next conspiracy claim you hear.
Conspiracies & Conspiracy Theories consists of 12 lectures, 30 minutes each.
PART I: Conspiracies & Why People Believe Them
The Difference Between Conspiracies and Conspiracy Theories
Classifying Conspiracies and Characterizing Believers
Why People Believe in Conspiracy Theories
Cognitive Biases and Conspiracy Theories
Conspiracy Insanity
Constructive Conspiracism
PART II: Conspiracy Theories & How to Think About Them
The Conspiracy Detection Kit
Truthers and Birthers: The 9/11 and Obama Conspiracy Theories
The JFK Assassination: The Mother of All Conspiracy Theories
Real Conspiracies: What if They Really Are Out to Get You?
The Deadliest Conspiracy Theory in History
The Real X-Files: Conspiracy Theories in Myth and Reality
Bonus Lecture: Letters from Conspiracists
Watch Dr. Shermer’s introduction
About Michael Shermer
Dr. Michael Shermer is the Publisher of Skeptic magazine, a Presidential Fellow at Chapman University, the host of the Science Salon podcast, and for 18 years a monthly columnist for Scientific American. He is the author of a number of New York Times bestselling books including: Heavens on Earth, The Moral Arc, The Believing Brain, Why People Believe Weird Things, Why Darwin Matters, The Mind of the Market, How We Believe, and The Science of Good and Evil. His two TED talks, viewed nearly 10 million times, were voted in the top 100 of the more than 2000 TED talks. Dr. Shermer received his B.A. in psychology from Pepperdine University, M.A. in experimental psychology from California State University, Fullerton, and his Ph.D. in the history of science from Claremont Graduate University.
View all titles by Michael Shermer on Audible.com.
You play a vital part in our commitment to promote science and reason. If you enjoy the content Michael Shermer produces, please show your support by making a donation, or by becoming a patron.
January 1, 2019
Stein’s Law and Science’s Mission

This column was first published in the January 2019 issue of Scientific American.
In the April 2001 issue of Scientific American, I began this column with an entry entitled “Colorful Pebbles and Darwin’s Dictum,” inspired by the British naturalist’s remark that “all observation must be for or against some view, if it is to be of any service.” Charles Darwin penned this comment in a letter addressing those critics who accused him of being too theoretical in his 1859 book On the Origin of Species. They insisted that he should just let the facts speak for themselves. Darwin knew that science is an exquisite blend of data and theory. To these I add a third leg to the science stool—communication. If we cannot clearly convey our ideas to others, data and theory lie dormant.
For 214 consecutive months now, I have tried to communicate my own and others’ thoughts about the data and theory of science as clearly as I am able. But in accordance with (Herb) Stein’s Law—that things that can’t go on forever won’t—this column is ending as the magazine redesigns, a necessary strategy in the evolution of this national treasure, going on 174 years of continuous publication. I am honored to have shared a fleeting moment of that long history, grateful to the editors, artists and production talent for every month I was allowed to share my views with you. I will continue doing so elsewhere until my own tenure on this provisional proscenium ends (another instantiation of Stein’s Law)—many years in the future, nature and chance willing— so permit me to reflect on what I think science brings to the human project of which we are all a part.
Modern science arose in the 16th and 17th centuries following the Scientific Revolution and the adoption of scientific naturalism— the belief that the world is governed by natural laws and forces that are knowable, that all phenomena are part of nature and can be explained by natural causes, and that human cognitive, social and moral phenomena are no less a part of that comprehensible world. In the 18th century the application of scientific naturalism to the understanding and solving of human and social problems led to the widespread embrace of Enlightenment humanism, a cosmopolitan worldview that esteems science and reason, eschews magic and the supernatural, rejects dogma and authority, and seeks to understand how the world works. Much follows. Most of it good.
Human progress, which has been breathtaking over the past two centuries in nearly every realm of life, has principally been the result of the application of scientific naturalism to solving problems, from engineering bridges and eradicating diseases to extending life spans and establishing rights. This blending of scientific naturalism and Enlightenment humanism should have a name. Call it “scientific humanism.”
It wasn’t obvious that the earth goes around the sun, that blood circulates throughout the body, that vaccines inoculate against disease. But because these things are true and because Nicolaus Copernicus, William Harvey and Edward Jenner made careful measurements and observations, they could hardly have found something else. So it was inevitable that social scientists would discover that people universally seek freedom. It was also inevitable that political scientists would discover that democracies produce better lives for citizens than autocracies, economists that market economies generate greater wealth than command economies, sociologists that capital punishment does not reduce rates of homicide. And it was inevitable that all of us would discover that life is better than death, health better than illness, satiation better than hunger, happiness better than depression, wealth better than poverty, freedom better than slavery and sovereignty better than suppression.
Where do these values exist to be discovered by science? In nature—human nature. That is, we can build a moral system of scientific humanism through the study of what it is that most conscious creatures want. How far can this worldview take us? Does Stein’s Law apply to science and progress? Will the upward bending arcs of knowledge and well-being reach a fixed upper ceiling?
Remember Davies’s Corollary to Stein’s Law—that things that can’t go on forever can go on much longer than you think. Science and progress are asymptotic curves reaching ever upward but never touching omniscience or omnibenevolence. The goal of scientific humanism is not utopia but protopia—incremental improvements in understanding and beneficence as we move ever further into the open-ended frontiers of knowledge and wisdom. Per aspera ad astra.
December 1, 2018
Kids These Days

This column was first published in the December 2018 issue of Scientific American.
Something is amiss among today’s youth. This observation isn’t the perennial “kids these days” plaint by your middle-aged correspondent. According to San Diego State University psychologist Jean Twenge, as reported in her book iGen (Atria, 2017), to the question “Do you have [a] psychological disorder (depression, etc.)?” the percentage of college students born in 1995 and after (the Internet Generation, or iGen) answering affirmatively in a Higher Education Research Institute study rose between 2012 and 2016. For men, the figure increased from 2.7 to 6.1 percent (a 126 percent increase) and for women from 5.8 to 14.5 percent (a 150 percent rise). The National Survey on Drug Use and Health found that between 2011 and 2016 the percentage of boys who experienced a depressive episode the prior year increased from 4.5 to 6.4 and in girls from 13 to 19.
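The percentage increases quoted above follow from simple arithmetic; as a quick sketch (the function name is mine, for illustration):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent change from old to new: (new - old) / old * 100."""
    return (new - old) / old * 100

# The HERI figures quoted above
print(round(pct_increase(2.7, 6.1)))   # men: 126
print(round(pct_increase(5.8, 14.5)))  # women: 150
```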
iGeners began entering college in 2013. Between 2011 and 2016 there was a 30 percent increase in college students who said they intentionally injured themselves (for example, by cutting), and according to the Fatal Injury Reports of the Centers for Disease Control and Prevention, suicide rates increased 46 percent between 2007 and 2015 among 15- to 19-year-olds. Why are iGeners different from Millennials, Gen Xers and Baby Boomers?
Twenge attributes the malaise primarily to the widespread use of social media and electronic devices, noting a positive correlation between the use of digital media and mental health problems. Revealingly, she also reports an inverse correlation: lower rates of depression go with more time spent on sports and exercise, in-person social interactions, doing homework, attending religious services, and consuming print media, such as books and magazines. Two hours a day on electronic devices seems to be the cutoff, after which mental health declines, particularly for girls who spend more time on social media, where FOMO (“fear of missing out”) and FOBLO (“fear of being left out”) take their toll. “Girls use social media more often, giving them more opportunities to feel left out and lonely when they see their friends or classmates getting together without them,” Twenge adduces after noting that the percentage of girls who reported feeling left out increased from 27 to 40 between 2010 and 2015, compared with an increase from 21 to 27 percent for boys.
In search of a deeper cause of this problem—along with that of the campus controversies of the past several years involving safe spaces, microaggressions and trigger warnings—Greg Lukianoff and Jonathan Haidt argue in their book The Coddling of the American Mind (Penguin, 2018) that iGeners have been influenced by their overprotective “helicoptering” parents and by a broader culture that prioritizes emotional safety above all else. The authors identify three “great untruths”:
The Untruth of Fragility: “What doesn’t kill you makes you weaker.”
The Untruth of Emotional Reasoning: “Always trust your feelings.”
The Untruth of Us versus Them: “Life is a battle between good people and evil people.”
Believing that conflicts will make you weaker, that emotions rather than reason are a reliable guide for responding to environmental stressors, and that when things go wrong it is the fault of evil people, not you, iGeners are now taking those insalubrious attitudes into the workplace and political sphere. “Social media has channeled partisan passions into the creation of a ‘callout culture’; anyone can be publicly shamed for saying something well-intentioned that someone else interprets uncharitably,” the authors explain. “New-media platforms and outlets allow citizens to retreat into self-confirmatory bubbles, where their worst fears about the evils of the other side can be confirmed and amplified by extremists and cyber trolls intent on sowing discord and division.”
Solutions? “Prepare the child for the road, not the road for the child” is the first folk aphorism Lukianoff and Haidt recommend parents and educators adopt. “Your worst enemy cannot harm you as much as your own thoughts, unguarded” is a second, because, as Buddha counseled, “once mastered, no one can help you as much.” Finally, echoing Aleksandr Solzhenitsyn, “the line dividing good and evil cuts through the heart of every human being,” so be charitable to others.
Such prescriptions may sound simplistic, but their effects are measurable in everything from personal well-being to societal harmony. If this and future generations adopt these virtues, the kids are going to be alright.
November 1, 2018
The Fallacy of Excluded Exceptions

This column was first published in the November 2018 issue of Scientific American.
For a documentary on horror movies that seem cursed, I was recently asked to explain the allegedly spooky coincidences associated with some famous films. Months after the release of Poltergeist, for example, its 22-year-old star, Dominique Dunne, was murdered by her abusive ex-boyfriend; Julian Beck, who played the preacher “beast,” succumbed to stomach cancer before Poltergeist II’s release; and 12-year-old Heather O’Rourke died months before the release of what would be her last starring role in Poltergeist III.
The Exorcist star Linda Blair hurt her back when a piece of rigging broke while she was being thrown around on her bed; Ellen Burstyn was injured on set when flung to the ground; and actors Jack MacGowran and Vasiliki Maliaros both died while the film was in postproduction (their characters died in the film).
When Gregory Peck was on his way to London to make The Omen, his plane was struck by lightning, as was producer Mace Neufeld’s plane a few weeks later; Peck avoided aerial disaster again when he canceled another flight at the last moment (that plane crashed, killing everyone onboard); and two weeks after filming, an animal handler who worked on the set was eaten alive by a lion.
During the making of The Crow, star Brandon Lee was accidentally shot to death by a stage gun with blanks; he was the son of Bruce Lee, who also died mysteriously at a young age, possibly from a drug reaction. While filming Twilight Zone: The Movie, star Vic Morrow was killed in a freak helicopter accident.
For some people, such eerie coincidences suggest evil supernatural forces at work. But that conclusion is not warranted. As I explained on camera, picture a 2×2 square with four cells. Cell 1 contains Cursed Horror Movies (Poltergeist, The Exorcist, The Omen, The Crow, Twilight Zone: The Movie). Cell 2 contains Cursed Nonhorror Movies (Superman, The Wizard of Oz, Rebel Without a Cause, Apocalypse Now). Cell 3 contains Noncursed Horror Movies (It, The Ring, The Sixth Sense, The Shining). Cell 4 contains Noncursed, Nonhorror Movies (The Godfather, Star Wars, Casablanca, Citizen Kane). When they are put into this perspective, it is clear that those seeing supernatural intervention are remembering only the horror movies that seemed cursed and forgetting all the other possibilities.
Call it the Fallacy of Excluded Exceptions, or the failure to note instances that do not support the generalization. In cell 1, for example, Halloween is not included, because there are no “curse” stories associated with it; its star, Jamie Lee Curtis, went on to a successful motion picture career, and the film launched a franchise in the horror genre. In cell 2, no one attributes evil forces at work on the California highway where James Dean lost his life after making Rebel Without a Cause. In cell 3, spine-chilling films like The Shining should be loaded with curses, but they aren’t.
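The 2×2 square can be sketched in code (movie lists abbreviated from the text); the fallacy amounts to attending to only one of the four cells:

```python
# The four cells of the 2x2 square, with example films from the text.
cells = {
    ("horror", "cursed"):       ["Poltergeist", "The Exorcist", "The Omen"],
    ("nonhorror", "cursed"):    ["Superman", "The Wizard of Oz", "Rebel Without a Cause"],
    ("horror", "noncursed"):    ["It", "The Ring", "The Shining", "Halloween"],
    ("nonhorror", "noncursed"): ["The Godfather", "Star Wars", "Casablanca"],
}

# The fallacy: generalizing from cell 1 alone, excluding the exceptions.
remembered = cells[("horror", "cursed")]
total = sum(len(films) for films in cells.values())
print(f"{len(remembered)} of {total} cases considered")  # 3 of 13 cases considered
```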
The psychology underlying the Fallacy of Excluded Exceptions is confirmation bias, where once one commits to a belief, the tendency is to look for and find only confirming examples while ignoring those that disconfirm. This is very common with paranormal claims. People grasp at predictions by psychics or astrologers when they come true, but what about all the predictions that did not come true or major events that nobody predicted? In the realm of faith, cancers that go into remission after intercessory prayer are often considered religious miracles, but what about the cancers that disappeared without faith-based intervention or the cancer patients who were prayed for but died? Divine providence is often adduced when a few faithful people survive a disaster, but all the religious folks who died and atheists who lived are expediently ignored.
The problem is rampant not just with paranormal and supernatural claims. Claims of medical cures associated with this or that alternative treatment modality typically exclude cases where treated patients were not cured or were cured but possibly by other means. Crime waves are often linked to economic downturns, but this hypothesis is gainsaid by counterexamples, such as the relatively low crime rates during the 1930s depression and the 2008–2010 recession. Excluded exceptions test the rule. Without them, science reverts to subjective speculation.
October 1, 2018
A Mysterious Change of Mind

This column was first published in the October 2018 issue of Scientific American.
Anthony Bourdain (age 61). Kate Spade (55). Robin Williams (63). Aaron Swartz (26). Junior Seau (43). Alexander McQueen (40). Hunter S. Thompson (67). Kurt Cobain (27). Sylvia Plath (30). Ernest Hemingway (61). Alan Turing (41). Virginia Woolf (59). Vincent van Gogh (37). By the time you finish reading this list of notable people who died by suicide, somewhere in the world another person will have done the same, about one every 40 seconds (around 800,000 a year), making suicide the 10th leading cause of death in the U.S. Why?
According to the prominent psychologist Jesse Bering of the University of Otago in New Zealand, in his authoritative book Suicidal: Why We Kill Ourselves (University of Chicago Press, 2018), “The specific issues leading any given person to become suicidal are as different, of course, as their DNA—involving chains of events that one expert calls ‘dizzying in their variety.’” Indeed, my short list above includes people with a diversity of ages, professions, personalities and genders. Depression is commonly fingered in many suicide cases, yet most people suffering from depression do not kill themselves (only about 5 percent, Bering says), and not all suicide victims were depressed. “Around 43 percent of the variability in suicidal behavior among the general population can be explained by genetics,” Bering reports, “while the remaining 57 percent is attributable to environmental factors.” Having a genetic predisposition for suicidality, coupled with a particular sequence of environmental assaults on one’s will to live, leads some people to try “to make the sh*t stop,” in the words of Winona Ryder’s character in the 1999 film Girl, Interrupted.
In Bering’s case, suicidal ideation first came when he was a closeted gay teenager “in an intolerant small Midwestern town” and later with unemployment at a status apex in his academic career (success can lead to unreasonably high standards for happiness, later crushed by the vicissitudes of life). Yet most oppressed gays and fallen academics don’t want to kill themselves. “In the vast majority of cases, people kill themselves because of other people,” Bering adduces. “Social problems—especially a hypervigilant concern with what others think or will think of us if only they knew what we perceive to be some unpalatable truth—stoke a deadly fire.”
Like most human behavior, suicide is a multicausal act. Teasing out the strongest predictive variables is difficult, particularly because such internal cognitive states may not be accessible even to the person experiencing them. We cannot perceive the neurochemical workings of our brain, so internal processes are typically attributed to external sources. Even those who experience suicidal ideation may not understand why or even if and when ideation might turn into action. This observation is reinforced by Ralph Lewis, a psychiatrist at the University of Toronto, who works with cancer patients and others facing death, whom I interviewed for my Science Salon podcast about his book Finding Purpose in a Godless World (Prometheus Books, 2018). “A lot of people who are clinically depressed will think that the reason they’re feeling that way is because of an existential crisis about the meaning of life or that it’s because of such and such a relational event that happened,” Lewis says. “But that’s people’s own subjective attribution when in fact they may be depressed for reasons they don’t understand.” In his clinical practice, for example, he notes, “I’ve seen many cases where these existential crises practically evaporated under the influence of an antidepressant.”
This attributional error, Lewis says, is common: “At a basic level, we all misattribute the causes of our mental states, for example, attributing our irritability to something someone said, when in fact it’s because we’re hungry, tired.” In consulting suicide attempt survivors, Lewis remarks, “They say, ‘I don’t know what came over me. I don’t know what I was thinking.’ This is why suicide prevention is so important: because people can be very persuasive in arguing why they believe life—their life—is not worth living. And yet the situation looks radically different months later, sometimes because of an antidepressant, sometimes because of a change in circumstances, sometimes just a mysterious change of mind.”
If you have suicidal thoughts, call the National Suicide Prevention Lifeline at 800-273-8255 or phone a family member or friend. And wait it out, knowing that in time you will most likely experience one of these mysterious changes of mind and once again yearn for life.
September 1, 2018
Abortion Facts

In May of this year the pro-life/pro-choice controversy leapt back into headlines when Ireland overwhelmingly approved a referendum to end its constitutional ban on abortion. Around the same time, the Trump administration proposed that Title X federal funding be withheld from abortion clinics as a tactic to reduce the practice, a strategy similar to that of Texas and other states to shut down clinics by burying them in an avalanche of regulations, which the U.S. Supreme Court struck down in 2016 as an undue burden on women for a constitutionally guaranteed right. If the goal is to attenuate abortions, a better strategy is to reduce unwanted pregnancies. Two methods have been proposed: abstinence and birth control.
Abstinence would obviate abortions just as starvation would forestall obesity. There is a reason no one has proposed chastity as a solution to overpopulation. Sexual asceticism doesn’t work, because physical desire is nearly as fundamental as food to our survival and flourishing. A 2008 study published in the Journal of Adolescent Health entitled “Abstinence-Only and Comprehensive Sex Education and the Initiation of Sexual Activity and Teen Pregnancy” found that among American adolescents ages 15 to 19, “abstinence-only education did not reduce the likelihood of engaging in vaginal intercourse” and that “adolescents who received comprehensive sex education had a lower risk of pregnancy than adolescents who received abstinence-only or no sex education.” A 2011 PLOS ONE paper analyzing “Abstinence-Only Education and Teen Pregnancy Rates” in 48 U.S. states concluded that “increasing emphasis on abstinence education is positively correlated with teenage pregnancy and birth rates,” controlling for socioeconomic status, educational attainment and ethnicity.
Most telling, a 2013 paper entitled “Like a Virgin (Mother): Analysis of Data from a Longitudinal, US Population Representative Sample Survey,” published in BMJ, reported that 45 of the 7,870 American women studied between 1995 and 2009 said they became pregnant without sex. Who were these immaculately conceiving parthenogenetic Marys? They were twice as likely as other pregnant women to have signed a chastity pledge, and they were significantly more likely to report that their parents had difficulties discussing sex or birth control with them.
When women are educated and have access to birth-control technologies, pregnancies and, eventually, abortions decrease. A 2003 study on the “Relationships between Contraception and Abortion,” published in International Family Planning Perspectives, concluded that abortion rates declined as contraceptive use increased in seven countries (Kazakhstan, Kyrgyzstan, Uzbekistan, Bulgaria, Turkey, Tunisia and Switzerland). In six other nations (Cuba, Denmark, the Netherlands, Singapore, South Korea and the U.S.), contraceptive use and abortion rates rose simultaneously, but overall levels of fertility were falling during the period studied. After fertility levels stabilized, contraceptive use continued to increase, and abortion rates fell.
Something similar happened in Turkey between 1988 and 1998, when abortion rates declined by almost half when unreliable forms of birth control (for one, the rhythm method) were replaced by more modern technologies (for example, condoms). Public health consultant Pinar Senlet, who conducted the 2001 study published in International Family Planning Perspectives, and her colleagues reported that “marked reductions in the number of abortions have been achieved in Turkey through improved contraceptive use rather than increased use.”
To be fair, the multivariable mesh of correlations in all these studies makes inferring direct causal links difficult for social scientists to untangle. But as I read the research, when women have limited sex education and no access to contraception, they are more likely to get pregnant, which leads to higher abortion rates. When women are educated about and have access to effective contraception, as well as legal and medically safe abortions, they initially use both strategies to control family size, after which contraception alone is often all that is needed and abortion rates decline.
Admittedly, deeply divisive moral issues are involved. Abortion does end a human life, so it should not be done without grave consideration for what is at stake, as we do with capital punishment and war. Likewise, the recognition of equal rights, especially reproductive rights, should be acknowledged by all liberty-loving people. But perhaps progress for all human life could be more readily realized if we were to treat abortion as a problem to be solved rather than a moral issue over which to condemn others. As gratifying as the emotion of moral outrage is, it does little to bend the moral arc toward justice.
August 1, 2018
23 and We

Like a lot of baby boomers, I find myself gravitating to newspaper obits, cross-checking ages and causes of death with my current health parameters, most notably heart disease (which felled my father and grandfather) and cancer (which slew my mother). And then there is Alzheimer’s disease, which a 2015 report by the Alzheimer’s Association projects will destroy the brains of more than 28 million baby boomers. Given the importance of family history and genetics for longevity, I plunked down $199 for a 23andMe Health + Ancestry Service kit, spit into the little plastic vial, opted in for every test available for disease gene variants and anxiously awaited my reports. How’d they do?
First, the company captured my ancestry well at 99.7 percent European, primarily French/German (29.9 percent), British/Irish (21.6 percent), Balkan/Greek (16.4 percent) and Scandinavian/Swedish (5.5 percent). My maternal grandmother is German and grandfather Greek; my paternal great-grandparents were from Sweden and Denmark.
Second, the traits report correctly predicted that I can smell asparagus in my urine, taste bitter and have hazel eyes, ring fingers longer than index fingers, little freckling and straight, light hair. Third, for the disease reports, my eye lit on the phrase “variants not detected” for Parkinson’s, cystic fibrosis, muscular dystrophy, sickle cell anemia, Tay-Sachs and, most concernedly, Alzheimer’s. “Oh joy, oh rapture unforeseen!” (Thank you, Gilbert and Sullivan.)
But wait, 23andMe also says I have no bald spot, no cheek dimples, little upper back hair, a slight unibrow, no widow’s peak and a longer big toe—all wrong. If a genetic test for such comparatively simple physical features can be mistaken, what does that say about its accuracy for more complex diseases? “Our reports do not include all possible genetic variants that could affect these conditions,” 23andMe disclaims. “Other factors can also affect your risk of developing these conditions, including lifestyle, environment, and family history.” Oh, that.
For toe length, for example, 56 percent of research participants with results like mine (15 genetic markers for a longer big toe, 13 for a longer second toe) have a longer big toe, but I’m in the 44 percent. A prediction barely better than a coin flip isn’t terribly useful. For Alzheimer’s, carrying the e4 variant of the APOE (apolipoprotein E) gene increases a man’s risk of developing Alzheimer’s to 1 percent by age 65, 4 to 7 percent by age 75, and 20 to 23 percent by age 85 (the corresponding figures for women are less than 1 percent, 5 to 7 percent, and 27 to 30 percent). Having two copies of the gene (one from each parent) moves the needle up to 4 percent (by age 65), 28 percent (age 75) and 51 percent (age 85) in men (2, 28 and 60 percent in women). But the test “does not include all possible variants or genes associated with late-onset Alzheimer’s disease,” so, for example, though lacking both e4 variants, I still have a 1 to 2 percent risk of Alzheimer’s by age 75 and a 5 to 8 percent chance by age 85.
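The risk figures quoted above are easier to compare side by side. Here is a minimal sketch that simply tabulates the article's numbers (for men, by copies of the e4 variant) and reads out a midpoint; it is an illustration of the quoted data, not a clinical risk model, and the table name and helper are my own invention.

```python
# Risk ranges (in percent) of developing Alzheimer's for men, keyed by the
# number of APOE e4 copies carried, at ages 65, 75 and 85. These are the
# figures quoted in the text above, nothing more.
APOE_E4_RISK_MEN = {
    0: {65: (0, 1), 75: (1, 2), 85: (5, 8)},     # no e4 copies
    1: {65: (1, 1), 75: (4, 7), 85: (20, 23)},   # one e4 copy
    2: {65: (4, 4), 75: (28, 28), 85: (51, 51)}, # two e4 copies
}

def risk_midpoint(copies: int, age: int) -> float:
    """Return the midpoint of the quoted risk range, in percent."""
    lo, hi = APOE_E4_RISK_MEN[copies][age]
    return (lo + hi) / 2

# Example: one e4 copy at age 85 -> midpoint of the 20-23 percent range.
print(risk_midpoint(1, 85))  # -> 21.5
```

Laid out this way, the point of the passage is plain: even two e4 copies leave a man's risk at roughly a coin flip by 85, and carrying no copies still leaves a nonzero baseline.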
For further clarity on this tangle of interactive effects, I contacted Rudy Tanzi, a Harvard Medical School neurologist and head of the Alzheimer’s Genome Project, who co-discovered many of the genes for Alzheimer’s. He admitted that “no one can say with certainty [if] a calculation of the variance of [Alzheimer’s is] due to genetics versus lifestyle,” adding that the e4 variant of the APOE gene “is present in 20 percent of the population and in 50 percent of late-onset cases but does not guarantee disease.”
Moreover, “until we identify all (or most) of the actual disease-causing mutations in these 40 genes, any attempts at putting an actual number at genetic variance is futile. In the meantime…, all we can say responsibly is that no more than 5 percent of gene mutations causing [Alzheimer’s] are guaranteed to do so. This means that in the remaining cases, most if not all almost certainly involve genetic influences (risk-conferring and protective), but in these cases (95 percent), it is an interplay of gene and environment/lifestyle that determines lifelong risk.”
What should we baby boomers do to shield ourselves against Alzheimer’s? “SHIELD” is Tanzi’s acronym for Sleep (uninterrupted seven to eight hours), Handle Stress, Interact (be sociable), Exercise (cardiovascular), Learn (“the more synapses you make, the more you can lose before you lose it,” Tanzi says), and Diet (Mediterranean: high in fruits, vegetables, olive oil, whole grains). As for personal genome service testing, actionable results with measurable outcome differences are still limited. But that is true for most medical knowledge, and yet we absorb everything we can for what ails us, so why not add genetics?
July 1, 2018
The Final Mysterians

In 1967 British biologist and Nobel laureate Sir Peter Medawar famously characterized science as, in book title form, The Art of the Soluble. “Good scientists study the most important problems they think they can solve. It is, after all, their professional business to solve problems, not merely to grapple with them,” he wrote.
For millennia, the greatest minds of our species have grappled to gain purchase on the vertiginous ontological cliffs of three great mysteries—consciousness, free will and God—without ascending anywhere near the thin air of their peaks. Unlike other inscrutable problems, such as the structure of the atom, the molecular basis of replication and the causes of human violence, which have witnessed stunning advancements of enlightenment, these three seem to recede ever further away from understanding, even as we race ever faster to catch them in our scientific nets.
Are these “hard” problems, as philosopher David Chalmers characterized consciousness, or are they truly insoluble “mysterian” problems, as philosopher Owen Flanagan designated them (inspired by the 1960s rock group Question Mark and the Mysterians)? The “old mysterians” were dualists who believed in nonmaterial properties, such as the soul, that cannot be explained by natural processes. The “new mysterians,” Flanagan says, contend that consciousness can never be explained because of the limitations of human cognition. I contend that not only consciousness but also free will and God are mysterian problems—not because we are not yet smart enough to solve them but because they can never be solved, not even in principle, owing to how the concepts are conceived in language. Call those of us in this camp the “final mysterians.”
Consciousness. The hard problem of consciousness is represented by the qualitative experience (qualia) of what it is like to be something. It is the first-person subjective experience of the world through the senses and brain of the organism. It is not possible to know what it is like to be a bat (in philosopher Thomas Nagel’s famous thought experiment), because if you altered your brain and body from humanoid to batoid, you would just be a bat, not a human knowing what it feels like to be a bat. You would not be like the traveling salesman in Franz Kafka’s 1915 novella The Metamorphosis, who awakens to discover he has been transformed into a giant insect but still has human thoughts. You would just be an arthropod. By definition, only I can know my first-person experience of being me, and the same is true for you, bats and bugs.
Free will. Few scientists dispute that we live in a deterministic universe in which all effects have causes (except in quantum mechanics, although this just adds an element of randomness to the system, not freedom). And yet we all act as if we have free will—that we make choices among options and retain certain degrees of freedom within constraining systems. Either we are all delusional, or else the problem is framed to be conceptually impenetrable. We are not inert blobs of matter bandied about the pinball machine of life by the paddles of nature’s laws; we are active agents within the causal net of the universe, both determined by it and helping to determine it through our choices. That is the compatibilist position, whence volition and culpability emerge.
God. If the creator of the universe is supernatural—outside of space and time and nature’s laws—then by definition no natural science can discover God through any measurements made by natural instruments. By definition, this God is an unsolvable mystery. If God is part of the natural world or somehow reaches into our universe from outside of it to stir the particles (to, say, perform miracles like healing the sick), we should be able to quantify such providential acts. This God is scientifically soluble, but so far all claims of such measurements have yet to exceed statistical chance. In any case, God as a natural being who is just a whole lot smarter and more powerful than us is not what most people conceive of as deific.
Although these final mysteries may not be solvable by science, they are compelling concepts nonetheless, well deserving of our scrutiny if for no other reason than it may lead to a deeper understanding of our nature as sentient, volitional, spiritual beings.
June 1, 2018
Soul-Searching

What are the weirdest questions you’ve ever Googled? Mine might be (for my latest book): “How many people have ever lived?” “What do people think about just before death?” and “How many bits would it take to resurrect in a virtual reality everyone who ever lived?” (It’s 10 to the power of 10^123.) Using Google’s autocomplete and Keyword Planner tools, U.K.-based Internet company Digitaloft generated a list of what it considers 20 of the craziest searches, including “Am I pregnant?” “Are aliens real?” “Why do men have nipples?” “Is the world flat?” and “Can a man get pregnant?”
This is all very entertaining, but according to economist Seth Stephens-Davidowitz, who worked at Google as a data scientist (he is now an op-ed writer for the New York Times), such searches may act as a “digital truth serum” for deeper and darker thoughts. As he explains in his book Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are (Dey Street Books, 2017), “In the pre-digital age, people hid their embarrassing thoughts from other people. In the digital age, they still hide them from other people, but not from the internet and in particular sites such as Google and PornHub, which protect their anonymity.” Employing big data research tools “allows us to finally see what people really want and really do, not what they say they want and say they do.”
People may tell pollsters that they are not racist, for example, and polling data do indicate that bigoted attitudes have been in steady decline for decades on such issues as interracial marriage, women’s rights and gay marriage, indicating that conservatives today are more socially liberal than liberals were in the 1950s.
Using the Google Trends tool in analyzing the 2008 U.S. presidential election, however, Stephens-Davidowitz concluded that Barack Obama received fewer votes than expected in Democratic strongholds because of still latent racism. For example, he found that 20 percent of searches that included the N-word (hereafter, “n***”) also included the word “jokes” and that on Obama’s first election night about one in 100 Google searches with “Obama” in them included “kkk” or “n***(s).”
“In some states, there were more searches for ‘[n***] president’ than ‘first black president,’ ” he reports—and the highest number of such searches were not predominantly from Southern Republican bastions as one might predict but included upstate New York, western Pennsylvania, eastern Ohio, industrial Michigan and rural Illinois. This difference between public polls and private thoughts, Stephens-Davidowitz observes, helps to explain Obama’s underperformance in regions with a lot of racist searches and partially illuminates the surprise election of Donald Trump.
But before we conclude that the arc of the moral universe is slouching toward Gomorrah, a Google Trends search for “n*** jokes,” “bitch jokes” and “fag jokes” between 2004 and 2017, conducted by Harvard University psychologist Steven Pinker and reported in his 2018 book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress, shows downward-plummeting lines of frequency of searches. “The curves,” he writes, “suggest that Americans are not just more abashed about confessing to prejudice than they used to be; they privately don’t find it as amusing.” More optimistically, these declines in prejudice may be an underestimate, given that when Google began keeping records of searches in 2004 most Googlers were urban and young, who are known to be less prejudiced and bigoted than rural and older people, who adopted the search technology years later (when the bigoted search lines were in steep decline). Stephens-Davidowitz confirms that such intolerant searches are clustered in regions with older and less educated populations and that compared with national searches, those from retirement neighborhoods are seven times as likely to include “n*** jokes” and 30 times as likely to contain “fag jokes.” Additionally, he found that someone who searches for “n***” is also likely to search for older-generation topics such as “Social Security” and “Frank Sinatra.”
What these data show is that the moral arc may not be bending toward justice as smoothly upward as we would like. But as members of the Silent Generation (born 1925–1945) and Baby Boomers (born 1946–1964) are displaced by Gen Xers (born 1965–1980) and Millennials (born 1981–1996), and as populations continue shifting from rural to urban living, and as postsecondary education levels keep climbing, such prejudices should be on the wane. And the moral sphere will expand toward greater inclusiveness.