The Truth About Truthiness
Longtime readers know that I tend to get my back up when I see journalists and academics opining that our national political divide exists because liberalism is smart and conservatism is dumb. Last week, I posted one of those articles to my Facebook page, with a rather biting note that I felt bad about after someone tweeted it at the author. Several people asked me to write up my thoughts at greater length, so here it is, with more elaboration and less sarcasm.
The article in question is by Slate's Katy Waldman, and most of it isn't actually about conservatives at all. At the end, however, she slides into a lengthy explanation about how conservative ideas are the product of "low-effort thought." Naturally, that ended up as the subject of the article's headline and rubric, which is why I clicked on it in the first place, so well played, whoever is in charge of Slate's click-bait policy.
Here is the section in question, reproduced in full so that there will be no question of my quoting her unfairly or failing to deal with all the points she has raised:
It is no accident that the current understanding of truthiness unfurled from Stephen Colbert's satire of the American right. Psychologists have found that low-effort thought promotes political conservatism. In a study by Scott Eidelman, Christian Crandall, and others, volunteers were placed in situations that, by forcing them to multitask or to answer questions under time pressure, required them to fall back on intellectual shortcuts. They were then polled about issues such as free trade, private property, and social welfare. Time after time, participants were more likely to espouse conservative ideals when they turned off their deliberative mental circuits. In the most wondrous setup, the researchers measured the political leanings of a group of bar patrons against their blood alcohol levels, predicting that as the beer flowed, so too would the Republican talking points. They were correct, it turns out. Drunkenness is a tax on cognitive capacity; when we're taxed too much, we really do veer right.
The researchers pointed to a few cognitive biases that might prompt an overloaded mind to embrace pillars of conservative thought. First, we reflexively attribute people's behavior to their character rather than to their circumstances (although studies suggest this "fundamental attribution error" is more prevalent in the West). This type of judgment invites a focus on personal responsibility. Second, we learn more easily when knowledge is arranged hierarchically, so in a pinch we may be inclined to accept fixed social strata and gender roles. Third, we tend to assume that persisting and long-standing states are good and desirable, which stirs our faith in the status quo absent any kind of deep reflection. Fourth, we usually reset to a mode of self-interest, so it takes some extended and counterintuitive chin-stroking for those not actively suffering to come around to the idea of pooling resources for a social safety net. (Elaborating on his original definition of truthiness, Colbert told interviewers: "It's not only that I feel it to be true, but that I feel it to be true. There's not only an emotional quality, but there's a selfish quality.") Consider the conservative (and dead wrong) maxim: "A rising tide lifts all boats." Clear visual logic, a lilting simplicity, and the complacent sense that seeking my good is good enough -- the saying is truthiness gold.
And conservative ideology sparks something elemental and reflexive in our predator-wary neural wiring. Motivated social cognition theory states that right-wing political beliefs arise from a need to manage uncertainty and threat. A 2014 study from Rice University showed that self-identified Republicans responded more intensely and fearfully than liberals to negative stimuli (pictures of burning houses or maggot-infested wounds); even in neutral settings, brain scans revealed increased activity in the neural corridors that react to danger. Obama is setting up death panels. Immigrants will flow over the borders until the country is unrecognizable. To researchers, such theories are more than AM radio fear-mongering -- they are beliefs steeped in truthiness, made potent by the ease with which they are processed in the receptively anxious echo chambers of Rush Limbaugh's cranium.
Of course, the overlap between GOP tenets and low-hanging cognitive fruit does not mean that conservatives arrive at their beliefs by disconnecting their brains. ("Conservatives are dumb" would be the reductive, unfair, truthy version of this article's argument.) Just as sometimes the simplest answer is also the correct one, sometimes the dimly intuited dangers bedeviling our reptilian nervous systems are real.
So what's wrong with this passage? Let's start with the fact that most of it is based on a single article, which is a pretty thin reed upon which to rest sweeping pronouncements about the nature of conservative ideas. I'm not saying that I always achieve this ideal, but when I'm looking at a single paper making an argument, I generally try to include qualifiers such as "if this result stands, it would suggest ..." no matter how congenial I find its results, because the annals of science are littered with instances of prominent geniuses "proving" things that we now know to be wrong.
Now when I dig into the actual paper, that thin reed looks more and more like a slender blade of lawn grass. The article summarizes four studies performed by the researchers:
- Study No. 1: They stood outside a New England bar, grabbed patrons and asked them to complete a 10-question political survey of rather elderly vintage. (Sample questions: "Production and trade should be free of government interference" and "Ultimately, private property should be abolished"). Then they asked them to blow into a Breathalyzer so that they could measure their blood-alcohol levels. The problems with this should be obvious: How did these people answer before they started drinking? We have no idea! Moreover, here is a word that doesn't appear anywhere in their analysis: "inhibition." Alcohol lowers social inhibition. If you're in an area where conservatism is relatively frowned upon -- like, oh, say, I dunno, New England -- drinking might make you more willing to give honest answers. Or it might make you more willing to mess with the researchers by giving wrong answers. Or this study might be the next best thing to completely useless.
- Study No. 2: Thirty-eight students from the University of Maine's Psychology 101 course were offered extra credit to participate in a study. They were divided into two groups and given a 15-minute question packet about social perception. One group was given an extra task to perform during the test to raise their cognitive loads. The "load" group had a higher mean conservatism than the "no load" group.
- Study No. 3: Similar to Study No. 2, except it involved 36 students, half of whom performed an exercise under time pressure rather than the pressure of a second task.
- Study No. 4: Similar to Studies 2 and 3, except that 34 undergraduates at the University of Arkansas were directed to give statements of either cursory or extensive contemplation.
In the latter studies, we're talking about differences between groups of 18 to 19 students, and again, no mention of whether the issue might be disinhibition -- "I'm too busy to give my professor the 'right' answer, rather than the one I actually believe" -- rather than "low-effort thought."
I am reluctant to make sweeping generalizations about a very large group of people based on a single study. But I am especially reluctant when it turns out those generalizations are based on 85 drunk people and 100 psychology students.
When highlighting studies like this, you should probably describe them with words like "suggestive," "possible" and "may." Waldman writes as if we're dealing with Proven Scientific Fact, rather than something that a small team of researchers found in three small groups.
This is particularly vital when you're dealing with research about conservatives, done by a profession that skews liberal by something like 200 to 1. The unstated assumptions of the group are bound to slip into things such as the questions they ask and how tempted they are to go back and look for a "mistake" when they get an answer suggesting that liberals are close-minded barbarians.
To see what I mean, consider the recent tradition of psychology articles showing that conservatives are authoritarian while liberals are not. Jeremy Frimer, who runs the Moral Psychology Lab at the University of Winnipeg, realized that who you asked those questions about might matter -- did conservatives defer to the military because they were authoritarians or because the military is considered a "conservative" institution? And, lo and behold, when he asked similar questions about, say, environmentalists, the liberals were the authoritarians.
It also matters because social psychology, and social science more generally, has a replication problem, which was recently covered in a very good article at Slate. Take the infamous "paradox of choice" study that found that offering a few kinds of jam samples at a supermarket was more likely to result in a purchase than offering dozens of samples. A team of researchers that tried to replicate this -- and other famous experiments -- completely failed. When they did a survey of the literature, they found that the array of choices generally had no important effect either way. The replication problem is bad enough in one subfield of social psychology that Nobel laureate Daniel Kahneman wrote an open letter to its practitioners, urging them to institute tougher replication protocols before their field implodes. A recent issue of Social Psychology was devoted to trying to replicate famous studies in the discipline; more than a third failed replication.
Let me pause here to say something important: Though I mentioned bias above, I'm not suggesting in any way that the replication problems mostly happen because social scientists are in on a conspiracy against conservatives to do bad research or to make stuff up. The replication problems mostly happen because, as the Slate article notes, journals are biased toward publishing positive and novel results, not "there was no relationship, which is exactly what you'd expect." So readers see the one paper showing that something interesting happened, not the (possibly many more) teams that got muddy data showing no particular effect. If you do enough studies on enough small groups, you will occasionally get an effect just by random chance. But because those are the only studies that get published, it seems like "science has proved ..." whatever those papers are about.
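The arithmetic behind that "random chance" point is easy to see for yourself. Here is a toy simulation, purely illustrative and modeled on nothing but the small sample sizes discussed above: run a thousand small two-group studies in which the true effect is exactly zero, and count how many clear the conventional p < .05 bar anyway.

```python
import random

random.seed(0)

def fake_study(n=19):
    """One 'study': two groups of n people drawn from the SAME
    distribution (i.e., no real effect), compared with a two-sample
    t statistic."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean = lambda xs: sum(xs) / len(xs)
    var = lambda xs: sum((x - mean(xs)) ** 2 for x in xs) / (len(xs) - 1)
    se = ((var(a) + var(b)) / n) ** 0.5
    return (mean(a) - mean(b)) / se

# Run 1,000 small null studies -- no true difference anywhere.
t_stats = [fake_study() for _ in range(1000)]

# |t| > ~2.03 is "significant at p < .05" for about 36 degrees of freedom.
significant = [t for t in t_stats if abs(t) > 2.03]
print(f"{len(significant)} of 1,000 null studies look 'significant'")
```

Roughly 5 percent of them (about 50 studies) will look "significant" despite there being nothing to find. If journals mostly publish those 50 and quietly pass on the other 950, the published record looks like science proving an effect that does not exist.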
This is one reason that John Ioannidis argues that most published research findings are false:
The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.
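Ioannidis's point about study power and the ratio of true to probed relationships can be made concrete with his positive-predictive-value formula, PPV = (power × R) / (power × R + α), ignoring his bias term. The numbers below are hypothetical, chosen only to show the shape of the effect:

```python
# Hypothetical inputs, not taken from any particular paper:
# alpha is the significance threshold, R the prior ratio of true
# to null relationships being probed in a field.
def ppv(power, alpha=0.05, R=0.1):
    """Post-study probability that a claimed finding is true
    (Ioannidis's formula, with his bias term set to zero)."""
    return (power * R) / (power * R + alpha)

# A well-powered study in a field where 1 in 10 probed effects is real:
print(round(ppv(power=0.8), 2))   # 0.62
# An underpowered small study (power 0.2) in the same field:
print(round(ppv(power=0.2), 2))   # 0.29
```

In other words, even before anyone cheats, small underpowered studies of the sort described above deliver claimed findings that are more likely false than true.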
So you want to be cautious about reading too much into a single study, but especially when politics are involved. The fact that most social psychologists, and most journalists, do not have the gut feeling that "something must be wrong" when they see a result showing conservatives are paranoid authoritarians, while I'd bet they do get exactly that feeling when they hear the same thing about nice liberals like themselves and go poking around in the data to discover the problem ... well, I doubt that helps.
To return to that Slate article, consider Waldman's citation of "a rising tide lifts all boats" as an obviously wrong belief, even though it's obviously right over long time scales, which is why it's less miserable to be poor in America than it is in Africa. Or her paragraph on motivated social cognition, which again treats as clearly false beliefs that are entirely debatable when phrased without the buzzwords. In other countries, such as the U.K., national health systems have indeed withheld potentially lifesaving treatments based on cost; it's not crazy to think this might happen here. On immigration, the percentage of immigrants in the U.S. population has been rising pretty close to its historical peaks, net migration is still positive, and previous waves of immigrants, my ancestors among them, did change the country in ways that would make it unrecognizable to, say, a citizen of 1830. I think that those changes were good, having produced, among other things, me, but it's not obviously false to believe that immigrants are streaming over the border, that more will come if we relax our border policy, and that this will change the country in major ways.
Or consider this: "It is no accident that the current understanding of truthiness unfurled from Stephen Colbert's satire of the American right." That's a breathtakingly confident statement to make based mostly on a single journal article, bolstered by the not-entirely-on-point research indicating that conservatives worry more about negative stimuli or references to phenomena like "fundamental attribution error," which can be found among both liberals and conservatives. (Just ask your average liberal -- who is not employed on Wall Street or married to same -- whether it's possible that 2007-vintage investment bankers were basically decent people who responded rationally but disastrously to confusing signals from the mortgage market.)
For that matter ... is it really possible that, in an alternative universe where conservative politicians thought a bit harder about things, one of Hollywood's many well-known conservative television personalities could also have invented "truthiness" while savagely parodying liberals five nights a week? Why be so quick to attribute this to the personality of conservatives, rather than an overwhelmingly liberal Hollywood environment that would probably never have green-lit that show in the first place?
At the end, Waldman dials it back a little bit with the "even a stopped clock is right twice a day" note, but it comes off as unconvincing because no one, conservative or liberal, believes that good ideas mostly come out of failing to think things through. Ultimately, she's making the argument that conservative ideas are the product of lazy thinking, and I don't really think she has enough evidence to support that argument.
I do not have a scientific study to back me up, but I hope that you'll permit me a small observation anyway: We are all of us fond of low-effort thought. Just look at what people share on Facebook and Twitter. We like studies and facts that confirm what we already believe, especially when what we believe is that we are nicer, smarter and more rational than other people. We especially like to hear that when we are engaged in some sort of bruising contest with those wicked troglodytes -- say, for political and cultural control of the country we both inhabit. When we are presented with what seems to be evidence for these propositions, we don't tend to investigate it too closely. The temptation is common to all political persuasions, and it requires a constant mustering of will to resist it.
It's possible that Waldman's thesis that conservative beliefs are the result of "low-effort thought" is 100 percent correct. But I'd be more convinced if she'd grappled with the limitations of the evidence, and explored the reasons it might be wrong, rather than simply presenting as fact something she no doubt found easy to believe. That's something that all of us, left and right, could stand to do a lot more often.
Corrects number of studies after 12th paragraph and adds its findings to 13th and 14th paragraphs in article published Sept. 8.
Here's a good rule for journalists writing about most social psychology papers: If you are tempted to write "studies show that people ..." try replacing that with "studies show that small groups of affluent psychology majors ..." and see if you still want to write the article.
Yes, yes, Mechanical Turk study, but I'm not that convinced that "tiny groups of psych students" is a noticeably better experimental medium for generalizations about political beliefs.
A good rule for readers reading about these sorts of studies: The more interesting the result, the more likely it is to be a product of random chance producing a publication-worthy outlier. The study has now passed through two layers of filters: the journal and the journalists writing about it. Both are more interested if the result is novel and/or has political valence, which means that they are inherently biased toward selecting for statistical outliers.
This column does not necessarily reflect the opinion of Bloomberg View's editorial board or Bloomberg LP, its owners and investors.
To contact the author of this story:
Megan McArdle at email@example.com
To contact the editor of this story:
Brooke Sample at firstname.lastname@example.org