I read your interesting New York Times column, “The Gradual Extinction of Accepted Truths” (online title “The Widening World of Hand-Picked Truths”). In that piece, you worry that the social world has divided into camps or tribes, each its own “self-reinforcing bubble of thought.” You write, “Viewed from afar, the world seems almost on the brink of conceding that there are no truths, only competing ideologies—narratives fighting narratives.” The end result, you argue, is this: “Presenting people with the best available science doesn’t seem to change minds.”
When I read your words, I hear despair. Despair over the state of the world, despair over the place of science in society, despair over where we are headed. I share your despair, especially when it comes to our inability to take climate change science seriously and enact meaningful policies around it, but I do not share your reasons for despair. It is these differences of reason that I wish to discuss.
I saw yesterday that you have another critic, Alex Tsakiris, who attacks you harshly for putting faith in “status quo science,” which, he claims, has proven again and again—often in the very cases you cite, like vaccines, fluoridation, and climate change—to be untrustworthy. I am not a “skeptic” of Tsakiris’s type. I mostly buy into the causes you defend (though I think you neglect the historical circumstances for why people have come to distrust science since the 1960s).
I also am not a “postmodernist” in the sense that word seems to hold for you, that of a thoroughgoing relativism. I believe that there are better and worse ways of coming to know the world and that there is typically some best-available knowledge, though such knowledge is always open to revision. “Lurking out there is some kind of real world,” you write. For sure, but it frequently eludes us. I guess in this schema, I am a plain-old modernist then, as you are, but it is our modernisms that are in disagreement. Put simply, I believe that you do not take the best available social science into account and that your failure to do so is pungently ironic.
Your column begins, in 1966, with the religion editor of Time magazine asking, “Is God Dead?” You write that “it was natural” for the editor “to assume that people would increasingly stop believing things just because they had always believed them.” But now “almost 50 years later that dream seems to be coming apart.” And near the end of the piece, you mention a “widening gyre of beliefs.” These statements, as well as the print and online titles of your piece (which I realize may have been chosen by editors), make it sound as if you are making a historical argument: in the mid-20th century, we were headed towards wide-scale acceptance of science, but it has gone off the rails. But do you have any historical evidence to support these claims? Put another way, who in the mid-20th century held this dream? Was it widely held? Or was it the domain of relatively well-educated elites, like this editor at Time? What evidence do you have of expansion or contraction?
You and I are in perfect agreement that the Internet has increased the formation of belief subcultures, not only groups like anti-vaxxers but also “furries”—people who enjoy dressing up as and pretending to be animals. Yet, since at least the 18th century, individuals have been able to choose media sources, whether newspapers or magazines or cable television, that fit their prejudices, and society didn’t need the Internet to create many subcultures with bizarre beliefs.
These historical questions bring us to the more fundamental issue in your argument: it seems to suggest that we should simply accept the findings of science (because it is rational to do so), but the best available studies of how humans react to and take up information suggest that they have NEVER acted that way.
You might be referring to these studies when you write, “In a kind of psychological immune response,” people “reject ideas they consider harmful.” But it is ambiguous in your article whether you believe this response is simply a moral failing or whether you think it is part and parcel of being human. Social science research increasingly finds it to be the latter. What you call an “immune response,” social scientists refer to as the “backfire” or “boomerang” effects. In a 2010 study, Brendan Nyhan and Jason Reifler presented information to hundreds of participants about tax policy, stem cell research funding, and the presence of weapons of mass destruction in Iraq. The authors found that participants who had false beliefs about these things actually held onto their misperceptions more strongly after being exposed to facts.
P. Sol Hart and Erik C. Nisbet conducted a similar study on beliefs about climate change. As they note, many scientists and journalists adhere to “the deficit model, which assumes that increased communication . . . about scientific issues will move public opinion toward the scientific consensus.” But they find that the exact opposite occurs when conservatives are confronted with science that conflicts with their preexisting worldviews.
These studies fit within a larger literature on “motivated reasoning,” the idea that preconceptions strongly influence how later information is viewed and interpreted. Charles S. Taber and Milton Lodge, two of the leading scholars in this literature, have conducted studies demonstrating that individuals seek out information that confirms beliefs that they already hold and that they put more cognitive resources into denigrating and taking apart arguments that don’t fit their worldview. Moreover, they found that people who are better informed and more sophisticated actually have stronger biases, not weaker ones. As Taber and Lodge write, “Far from the rational calculator portrayed in enlightenment prose . . . homo politicus would seem to be a creature of simple likes and prejudices that are quite resistant to change.”
In a related and well-reasoned essay, the sociologist John Levi Martin argues that individuals’ beliefs are largely a product of where they fit within social networks, that “politics involves the establishment of webs of alliance and opposition, and this in turn is used by political actors to generate opinions.” Furthermore, “the ‘knowledge’ that ideology gives us is that which would justify our side and strip our enemies of their justification.” That is, if the other guys think it, it must be hogwash.
None of these studies are “postmodernist.” They are based on the belief that we should study social reality empirically with the best available methods and ideas at hand. Jacques Derrida would not touch them with a twelve-foot pole.
Furthermore, findings like these aren’t even new. Indeed, they precede the Time essay that you take to be so meaningful. In the 1940s, when the sociologist Paul Lazarsfeld and his colleagues studied the influence of mass media on political elections, they developed a model called the “two-step flow of communication.” They argued that most people did not get news directly from mass media but rather through influential people in their lives, whom Lazarsfeld et al. called “opinion leaders.” While subsequent studies have questioned some parts of Lazarsfeld’s model, they typically uphold the idea that human beings do not learn about or interpret information on their own but rather as part of a social group, and that influential figures play an important role in whether new information is seen to be relevant, how it is understood, and what it is taken to mean for subsequent decisions.
All of these social scientific studies imply real consequences for how individuals encounter new information, including scientific findings. To paraphrase something John Levi Martin once said to me, “If we get exposed to information that cuts against our opinion, we are less likely to understand it. If we understand it, despite this, we’re less likely to believe it. If we believe it, despite this, we’re less likely to remember it. If we remember it, despite this, we’re less likely to think it has strong implications for anything in particular.”
Obviously, people change their minds, and none of the studies above suggest otherwise. What they do suggest, however, is that changing our minds often goes hand in hand with changing whom we choose to affiliate with. I know this from personal experience. I was raised in a conservative household and extended social network, which taught me that humans once lived with dinosaurs and that homosexuals are sinners. I no longer believe either of these things (neither do my parents, by the way), but I was also determined to leave that social network behind. My close-knit group of friends made the same decision. When a high school teacher asked one of my best friends to write a personal essay about his goals, he wrote, “My goal is to get the fuck out of Joliet, Illinois.” I exited that town and joined academia, which is full of atheists, humanists, and people just like you. Put another way, just as recovering alcoholics avoid hanging out with old drinking buddies, the best way to buy into the Big Bang is to exit Young Earth Creationist groups.
These findings have many fascinating implications for science communication. For instance, they suggest that (sadly perhaps) who is speaking is often more important than what he or she is saying. (Surely this idea unsettles proponents of scientific reason.) Let me give an example: Al Gore won the Nobel Peace Prize for An Inconvenient Truth, but when it comes to having a spokesperson for global climate change, it is hard to imagine someone worse. Indeed, An Inconvenient Truth likely damaged the chances for meaningful climate policy in the United States. Why? By the time An Inconvenient Truth was released, conservatives had loathed Gore FOR YEARS. In his 1993 book, See, I Told You So, Rush Limbaugh included a chapter titled “Algore: Technology Czar,” in which he lambasted Gore for being a lackey and a tree-hugger who foisted unfounded “scientific” beliefs on the American public.
I am not a “ditto head,” and I do not agree with Limbaugh about Gore. I believe that the world is a better place because of Gore’s leadership, because of his environmentalism, and, yes, because of the policies he shaped around the Internet. The point is, however, if the goal was to change the minds of non-believers in climate change—who are mostly conservative—An Inconvenient Truth was an abject failure, and yet it is a milestone of clear scientific communication. It’s just that clarity isn’t the point. When conservatives see Al Gore talking about climate change, they see the spotted owl, or they laugh to themselves, “Al Gore invented the Internet,” or they see a white stain on Monica Lewinsky’s blue dress. They do not hear what Gore says, or if they hear, they do not engage it.
I have brought these ideas up to our mutual acquaintance, the science writer John Horgan, my buddy and colleague at the Stevens Institute of Technology. John deals with these social scientific findings in this way: first, he acts confused, as if he does not understand them. Second, he discounts them. (“That sounds too postmodernist to me,” even though the theory and methods undergirding these findings are quite far from academic postmodernism.) Third, he fails to do his own research in these matters, dig deeper into the available studies, or refute them. Fourth, he does not take the findings to mean anything in particular for his own life. In other words, Horgan acts exactly as John Levi Martin suggests that, say, conservative Evangelical Christians act when confronted with evolutionary science. It appears that Horgan is human.
There are good reasons for Horgan to play ostrich-like and stick his head in the sand when it comes to this vein of social science research. If he took it seriously, he would have to change his whole modus operandi, just as you would have to change yours. And Horgan has spent thirty years carefully crafting his identity as a curmudgeonly herald for good science! After all, his Scientific American blog is titled “Cross-Check,” a reference to a violent hockey move, and it promises to take a “puckish, provocative look at breaking science.” If Horgan took the findings of these studies on board, he would have to change the tone of his communication and indeed spend more time traveling the country talking to Evangelical ministers and other influential figures with whom he shares very little but who, if he were to win them over, would do a great deal for his various causes. But do we really want Horgan to sacrifice this puckish image that we all love and admire, even if maintaining this mode of communication means that he risks being a choir-preacher for the remainder of his days?
There are structural and cultural reasons why scientists and journalists avoid thinking through the kinds of social scientific findings discussed above: both scientists and journalists are—in their idealized image—dedicated to the dogged pursuit of truth and to the idea that presenting objective “facts” to the public will improve the world. They do not want to face up to the reality that the second part of this belief system—that humans interact with facts in a rational and unmediated fashion—is based on lousy anthropology and cruddy psychology. And yet, I believe that there is no greater need in science communication than for scientists, journalists, and others to confront exactly this problem. Otherwise, we are lost.
In the end, we are left with a darkly funny tragicomedy, perhaps written by Samuel Beckett’s ghost: in one room, we have a public meeting of some neo-hippies and homeopathic medicine types who cry out that cell phone signals are causing cancer. They reject the best available science on this topic. Next door, we have a room full of irascible and curmudgeonly science journalists waving their fists in the air, lamenting the fact that so many people in our society do not take science seriously. Yet these journalists reject the best available (social) science about how human beings actually behave.
When the curtain falls, there is no light.
PS: I would like to thank my Stevens colleagues Lindsey Cormack and Kristyn Karl for more deeply educating me about motivated reasoning. I'm so happy to have you two aboard our little ship.