Public perceptions of science, tech often filtered through values instead of data

A recurring phrase from candidates in the fall 2014 elections, especially those hoping to deflect questions about climate change, the Keystone pipeline, or the labeling of genetically modified foods, was “I’m not a scientist.”

Few politicians are scientists, of course. Nor are many candidates for elective office doctors, engineers, or accountants.

But that doesn’t disqualify any of them from voting on topics — such as climate change, energy policy, GMOs, health care reform, transportation issues, and budgets — that are part of the public agenda. Most Americans understand that public officials are generalists, elected more for their overall judgment and experience than their professional pedigrees.

So why do elected officials sometimes hide behind the “I’m not a scientist” line, or a close variation, when it comes to approaching major issues? The answer may have less to do with what they know about science and technology than with what they think their constituents know, feel, and believe.

Consider the just-completed fall elections, in which climate change was raised in a number of debates in races for U.S. Senate and governor. Public opinion surveys have shown for years that most people recognize climate change is underway, but candidates who acknowledge that awareness invite a natural follow-up: What’s your solution? For many of those candidates, that question looks risky and premature — especially when their own political parties are far from united on what those solutions might be.

Instead, they preface their remarks with “I’m not a scientist” and punt.

Some of the most polarizing issues in the U.S. are essentially scientific, from climate change to evolution, from stem-cell research to energy policy, and from natural resource management to public health. For many scientists and technologists, the solution is simple: If only the public could learn and understand more about science, those issues would be much less vexing.

Trouble is, most research about public opinions on science and technology — and how those opinions shape politics and policy — doesn’t support that approach. It’s not simply a matter of Americans needing remedial courses in biology or physics, but of how what they “know” is influenced by what they value and believe.

A leading researcher on the interface between science communications and politics is Dietram Scheufele of the UW-Madison’s Department of Life Sciences Communication. In a recent paper for the National Academy of Sciences, Scheufele said the “knowledge deficit model” of science communications misses the boat.

“Unfortunately, science has been slow in adjusting its models for communicating with lay audiences,” he wrote. Connecting with a broader public audience does not rest on disseminating basic scientific facts and assuming everyone will accept them. Rather, it’s more often about taking into account ethical, moral, and religious factors that color how that information is consumed.

“Higher levels of knowledge do not necessarily translate into more positive attitudes toward science,” Scheufele wrote. “In fact, research into motivated reasoning suggests that all of us process information in biased ways based on pre-existing religious views, cultural values or ideologies.”

Those hardwired views can lead people to selectively give more weight to information that supports an initial view — their gut reactions — and may cause them to discount data that doesn’t mesh with their core values.

If so many people are prisoners of their own values and beliefs, is the United States doomed to become a land of scientific policy gridlock? Scheufele doesn’t think so.

Social science research shows that many Americans are willing to pivot if their own networks put them in touch with non-likeminded people. In controlled discussions, participants were exposed either to likeminded individuals, to non-likeminded individuals, or to a mixed group; those in the non-likeminded group were more likely to seek out additional information that tested their beliefs.

“Exposure to non-likeminded viewpoints, or just the anticipation of such encounters, ultimately promotes more rational decision-making,” Scheufele wrote. 

Getting scientific policy right is not just about the scientists. It’s about public engagement, risk assessment, and taking core values into account. Politicians need not hide behind the “I’m not a scientist” line … and neither should ordinary citizens.
