Science AMA Series: We're a social scientist & physical scientist who just launched Evidence Squared, a podcast on the science of why science fails to persuade. Ask Us Anything!


Not long ago, Richard Horton, editor of The Lancet, wrote an op-ed in which he proposed that half of all peer-reviewed, published science is wrong, due to "small sample sizes, tiny effects, invalid exploratory analyses, and flagrant conflicts of interest, together with an obsession for pursuing fashionable trends of dubious importance." Do you agree with Dr. Horton's assessment? If so, how can science re-assert its credibility?

PhelanIainMacPhail

Peer-review is a crucial part of the scientific process - I characterize it as a spam filter on steroids - which is why the most reliable source of information is peer-reviewed science. But it doesn't guarantee that the published science is correct. We can trust scientific results when published results are replicated by subsequent studies - particularly when results are replicated by independent methods.

E.g., our scientific understanding that human activity is causing global warming is confirmed by satellite measurements of the changing structure of the atmosphere, by surface measurements of downward infrared radiation, by satellite measurements of outgoing infrared radiation, and by thermometer measurements of surface warming trends. When we see independent lines of evidence converging on a single, consistent answer, that's when we can be confident that our scientific understanding is robust.

To reassert science's credibility, we need to explain how science is done - explain how it's not a single step but many steps of hypothesis proposing, data collection, statistical analysis, peer-review and replication.

We explore the concept of a "knowledge based consensus" built on a consilience of evidence at https://youtu.be/HUOMbK1x7MI

~ John Cook


Ok, I'll ask you this thing that perplexes me about the advice I've had from pro scicomm folks. I have tried all the new things they tell us work (before they tell us it's not working--is backfire a thing, or is it not a thing?).

I largely stopped slinging links because we are told the deficit model says people don't move with evidence. Ok, fine. But I can't resist a link to scientific consensus reports or lists of scientific societies and agencies that concur at least as a marker.

So I'm told we have to connect with people on their "shared values". I try this all the time, and people just move the goalposts. I can't tell if they really don't value these things, or what their actual values are then. Let me illustrate a sample (the topic I'm immersed in is GMOs).

I try to tell them how great it is that we can reduce pesticides with GMO eggplant for poor farmers in Bangladesh--who lack protective gear. I love the idea of reducing chemical pesticides and benefitting poor farmers--I assume this is a value we share.

They complain about patents. (goalpost moves)

I tell them about off-patent and academic projects. (goalpost moves)

They claim I'm a shill. (false, but where can you go from here?)

What can you do when either they aren't really sharing values they claim to hold, or refuse to acknowledge that the things available meet their claimed values?

mem_somerville

First, on the question of the backfire effect, yes it is a thing. The backfire effect has been replicated in a number of studies under a variety of different contexts.

But it's important to realize that there are different types of backfire effects. One of the earlier identified backfires was the "Familiarity backfire effect" - the idea that a debunking that puts too much emphasis on the myth rather than the fact can make people believe the myth more strongly. Attempts to replicate this effect have failed. So the familiarity backfire effect is on shaky grounds.

However, more solid is the "worldview backfire effect", where debunking a myth that agrees with a person's worldview will cause them to believe the myth more strongly. This has been replicated in cases of loaded issues like climate change, vaccination, WMDs, etc.

Second, the topic of the deficit model. It bothers me how people claim "the deficit model has failed" as if information doesn't matter at all. This is a case of "the deficit model doesn't do everything" being reduced to "the deficit model does nothing". Information does matter. Education matters. Telling people facts matters. These approaches won't always work. They can even backfire in some cases. We need to recognize that communicating evidence is not going to give you a 100% success rate, but it's also not going to give you a 0% success rate.

Similarly, framing the science in ways that are consistent with people's values (e.g., your shared-values approach) will be effective with some people. With other people, with rusted-on beliefs and conspiratorial thinking, it won't work. Nothing will work. You have to recognize that there are some people whose science denial is so strong that no intervention will work.

Science communication is about psychology. And psychology isn't deterministic, because people are complicated. It's probabilistic. That means that adopting tested science communication approaches increases your probability of success but there is no 100% guarantee.

When I encounter a hard-core science denialist, the approach I generally take is I will engage with them on the assumption that they are not going to change their mind. Therefore my science communication isn't for their benefit but for the benefit of all the people watching the exchange. And on a personal level, I will say that it helps your own emotional state and demeanor - you can have a conversation with a frustrating science denialist without getting frustrated when you take that mindset.

~ John Cook


In light of controversies like the Sokal hoax and others like it, what do you say to people who feel justified in their skepticism of professional academia as an overly politicized environment, and consequently too susceptible to groupthink and confirmation bias in its publications?

bokavitch

Hello there!

First of all, I think your second example is a terrible example and the authors should be embarrassed for their misrepresentation. They are basically trying to claim that a field is bullshit because they got a hoax paper accepted in a predatory journal.

That's nonsense. You can (and people do!) get garbage papers published in predatory journals in fields across science, from biology to physics. That tells you a lot about predatory publishing. Nothing about a field. So, shame on them.

Second, I would refer these people to the scientists who have worked on climate change for decades in the private sector, like at Exxon, who came up with the same findings as the scientists working in academia. The conclusions are the conclusions because of the evidence, not because of some academic groupthink.

~ Peter


There is no better (recent) example of this issue than a segment from the Joe Rogan podcast where one of the guys (Eddie Bravo, someone very successful in the martial arts industry) was seriously defending flat earth theory.

It obviously comes down to a lack of science literacy. Eddie, and those like him, do not understand how basic science works. With that in hand, all they need is the wiggle room that science is never 100% on anything. 97% agreement on climate change means a 3% chance that anything else they can think of could be the answer. Then the discussion is treated as though there is only 50% agreement among scientists.

I had an ex who was science illiterate. We had a discussion about vaccines and autism that was difficult to wade through.

My question is: How do we make someone aware of their lack of science literacy during these discussions?

They tried so hard in that segment mentioned above to get through to Eddie, and those are his friends, and Eddie wasn't having it. What is the solution? What discussion techniques work?

ThereIRuinedIt

Hello there!

I am not so sure science literacy is the problem there. When you see someone advocating a position because of genuine lack of information, then increasing knowledge/literacy will probably help.

But there are also cases where you see people who have been presented with all of the information they need, yet they reject that information and cling to their previous position.

Then we're sort of beyond literacy and into motivated reasoning. At that point, we need to understand what's driving the science denial/misconception - what's at the heart of it. For a lot of flat earthers, it really has nothing to do with planetary science whatsoever; it's about a willingness or need to believe in a massive conspiracy. A lot of the time this is driven by someone's feeling that they lack agency in their own lives. Conspiracies give them a feeling that there is control, even if it's the Illuminati that are in control. (Also, thinking you know something everyone else has been duped by is another way to feel like you have some control in your life.)

I suspect that the reason why there is no progress in the clip you mentioned is because they're using the wrong tools to address the wrong problem.

With climate change, denial is usually due to a worldview that is inherently antagonistic to environmental safeguards (i.e. pro-"free markets"). With anti-vaccines, there are a few camps, from the "nature = good, pharma = bad" to the "government shouldn't tell us what to do, FEMA has death camps" side, and this spans the political spectrum.

The commonality is that these things really aren't about science and science literacy per se, they're about worldviews.

Does that help?

~ Peter


Do you think the repetition of increasingly apocalyptic predictions over the last 20 years has made people deaf to the dangers of climate change? Why does fearmongering seem to be the only lasting strategy to convince people of the importance? Do you think there's a better strategy?

howardCK

Hello there!

I think it's important to note that the scientific community is not actually guilty of that sort of fearmongering - quite the opposite, in fact. Scientists and scientific assessments routinely err (and I do actually mean err) on the side of too much caution when it comes to climate change. Looking at the scientific community over time, the tendency has been to underestimate or low-ball the physical science more often than to overestimate or exaggerate it.

Edited to add: See this article about "Erring on the side of least drama" for more.

But from a communications perspective, I would say this. Endless fearmongering is ultimately self-defeating. Fear is a great attention grabber, and it can deliver a state of arousal (non-sexual, obviously), but if there is no solution or positive alternative offered, it is ultimately disempowering and will reduce perceived agency.

I think we should be clear not just about the dire scope of the problem but also about the very hopeful reality that we can decide our own fate. Not only is this the truth, it's effective communication!

~ Peter


In your opinion, what role should Philosophy play in science? I see many people try to completely get rid of it yet, I feel as though we need philosophy and science together to better understand the world.

jasons2121

Hello there!

I think philosophy has a very negative reputation among some groups of physical scientists, and that this is a shame. Philosophy has a lot to share with physical and social science.

For example, physical science informs us about the climate system. Social science can tell us that a scientific consensus exists, and that the perception of expert agreement matters a lot to the public. And philosophy can help us understand how and why a scientific consensus can guard against pitfalls like groupthink.

All of these branches of knowledge working together can benefit society much more than the sum of their individual parts.

~ Peter


In your opinion, what role should Philosophy play in science? I see many people try to completely get rid of it yet, I feel as though we need philosophy and science together to better understand the world.

jasons2121

Let me add to Peter's response. There is another invaluable contribution that philosophy makes to science and that is critical thinking. In order to inoculate people against misinformation, we need to explain the techniques used to distort science. In order to properly understand the techniques of denial, we need to deconstruct misinforming arguments, identifying the premises and conclusion, in order to pinpoint any fallacies or false premises. To do this properly requires the expertise of critical thinking philosophers.

To see a philosopher applying his critical thinking mojo to climate change misinformation, see our interview with UQ academic Peter Ellerton: https://youtu.be/xPOu9gFOC10

~ John Cook


Climate change is a very well known topic which has a large public divide between scientific observation and study and what a significant portion of the general public believes. Do you think that there are other areas that suffer from being miscommunicated to the general public by the media? If so what are they?

Awesome5auce

First, let me give credit where credit is due. I don't think we should pin all or even most of the blame for public misperceptions about science on scientists or science communicators. While they're not perfect, and while there is always room for improvement (hence why we started http://evidencesquared.com), science communicators and scientists have made heroic efforts to make science accessible to the public.

A large contributor to public confusion in a number of specific areas of science is misinformation. Some of it is driven by ideology, such as misinformation about evolution that is created and disseminated by creationists. Some of it is driven by vested interests, such as the tobacco industry spending millions to convince the public that they're completely safe (not to mention look cool and rugged) smoking their product. Some of it is driven by the "unholy alliance" (as Naomi Oreskes puts it) between ideology and vested interests, such as right-wing think-tanks funded by the fossil fuel industry to the tune of hundreds of millions of dollars to publish misinformation arguing that fossil fuel burning isn't causing climate change. Some misinformation is generated by more benign (but still destructive) sources, such as misconceptions and fallacious thinking about vaccination.

I work in the area of climate change so I tend to think about climate misinformation and misperceptions more than anything else. But occasionally when I poke my head out of my bubble, I see that the same patterns of misinformation and misunderstandings are happening all over. Efforts by the sugar industry echo misinformation campaigns of the tobacco and fossil fuel industries. Scientists working in many other disciplines get viciously attacked when certain segments of the population don't like their published results. It reminds me that climate change is not the only discipline experiencing trouble.

So it's crucial that science communicators across all disciplines heed the social science research into how to communicate science better. But also to heed the social science research into the impact of misinformation and how to counter it.

~ John Cook


What are the reasons that make people reject scientifically proven facts?

TheWhisperingOaks

Why do people reject scientific evidence? Humans are incredibly effective motivated reasoners. We can reason ourselves out of anything if sufficiently motivated.

What are typical motivators? There are many. Religious belief - does science conflict with how (or when) we think the universe formed? Political ideology - are there policy implications of scientific evidence that conflict with the policies we believe in? Social identity - do we belong to a social group that collectively disbelieves a particular scientific view?

One or more of these motivators can be at play when we encounter scientific evidence. So if there is a conflict between the science and our beliefs/identity, the science is going to have a rough time. Sometimes it prevails but a number of studies have found that when science conflicts with worldview, worldview wins the majority of the time.

~ John Cook


Can you give some guidelines a scientist has to pay attention to when communicating his research?

AbuDhur

Hello there!

Unlearn everything you learned in grad school about what being a "serious" scientist means. Embrace simplicity, anecdote, narrative, humor/emotion. Look at how effective communicators outside of science communicate. Do what those people are doing.

You will absolutely get pushback from curmudgeons, but way more people will thank you for communicating to them like the human beings we all actually are.

Read a book like "Made to Stick", for example.

~ Peter


Can you give some guidelines a scientist has to pay attention to when communicating his research?

AbuDhur

Geez, Peter, you forgot the most important answer to this question:

Subscribe to the Evidence Squared podcast where on a weekly basis, we outline guidelines for scientists communicating their research! http://evidencesquared.com

~ John Cook


I would definitely like to give this a listen. My question is: where are the first three episodes?

jedinborough

First three eps are at:

http://evidencesquared.com/ep1/
http://evidencesquared.com/ep2/
http://evidencesquared.com/ep3/

Not sure why they're not coming up in iTunes, checking that out...

~ John Cook


Do you think media (including news and films etc) have contributed to the distrust of scientists? How do you combat this? Why do newspapers always say "scientists say" as if they are one entity?

Also I'm working to become a researcher and how do you deal with the frustration of people denying evidence?

AvalonDreamer9

The way the media covers science isn't always helpful, and it's kind of built into the way the media works. For example, they are always looking for the latest news, especially news that is surprising and breaks past paradigms. This leads to "single study bias", where a new study comes out that conflicts with past studies or conventional wisdom and is breathlessly covered by the media. The scientific community views paradigm-shifting research with appropriate skepticism and will examine the methodology or attempt to replicate it. But from the public's point of view, the media covers science as if it's lurching all over the place from finding to finding.

They also tend to simplify science, treating it like a monolith - failing to recognize the nuance that we understand some areas very well while other areas are at the edge of our knowledge. For example, we know with high confidence that humans are causing climate change, but we have a lower level of understanding of areas like how clouds react in a changing climate or how El Niño will behave in a warming world.

Similarly, scientists (and sometimes non-scientists) are presented as experts in a topic even when they have no relevant expertise in that specific topic. Add to this the media norm of presenting both sides of a "debate" even when there is no genuine scientific debate and the public are left confused about the level of scientific agreement among the relevant scientific experts. This does great harm to public perceptions of scientific topics.

How do we combat this? It's not easy. We need to explain how science works, and how science isn't a monolith but a bumpy terrain where some areas are well understood while other areas are still being figured out. We need to explain how scientists have different levels of expertise in specific scientific disciplines, and it's the relevant experts that we need to look to for the most qualified opinion.

How do you deal with the frustration of science denial? Let me tell you how I deal with it. I take science denial and use it as an educational opportunity. It turns out several decades of education research have found that directly refuting misinformation is one of the most effective ways to teach science (and as a bonus, you also get to increase critical thinking skills). And inoculation research that I published just a few weeks ago replicates other research finding that the way to neutralize misinformation is to expose people to just a little bit of misinformation, in order to build immunity to science denial.

So turning denial into an educational opportunity is a robust, evidence-based way to deal with misinformation. But I will say it also feels emotionally satisfying to take attempts to cast doubt on science and use them for positive benefit!

~ John Cook


Do you think this is an issue with science journalists not getting the message across properly? Or is it the way that the scientific community expects many of these articles to be written? Do you think there needs to be more than just putting down facts without bias, or will trying to lead the reader a certain way compromise the validity of the article?

So I guess my question boils down to: if you're trying to persuade someone, doesn't that weaken the validity of that source, because it shows a clear bias instead of just presenting facts and data?

SHavens

Hello there!

I think for something like climate change, for many years (decades really) there was a tendency towards what's called "false balance": the journalistic convention of presenting "both sides" of a story when in fact, for something like climate change, there was an overwhelming scientific consensus rather than a 50-50 split. This was unintentional on the journalists' side, but it was absolutely exploited by the contrarians. I think we saw a lot of progress on that front, but I'm concerned we're already seeing a regression back to it in recent months.

I think how a journalist chooses to write a story is up to her or him, but I also think that journalists should be trained to understand the science of communication. In other words, if a journalist wants to do a "balanced" story, that's their call, but in my perfect world they would be absolutely clear on what the social science shows about how misinforming that balance is. My sense is that if journalists knew what social scientists and physical scientists know about communication and climate change respectively, they would voluntarily choose to write about it differently.

~ Peter

Edited to add: Max Boykoff has done a tremendous amount of work on the idea of "Balance as Bias", look him up!


Wait. You founded "Skeptical Science"?

Do you think it's ironic that you would claim to support critical analysis and truth when considering the various scientists and researchers who publicly stated that you misrepresented their data and research with regard to global warming?

Bowlslaw

Thanks for your question, a good opportunity for critical analysis.

To provide some background, we published a 2013 study (http://sks.to/tcppaper) where we analyzed 21 years of climate papers, identified all the papers stating a position on human-caused global warming and found 97.1% endorsed the consensus that humans are causing global warming.

But because replication is the heart of the scientific method, we also independently measured the consensus by asking the scientists who authored those climate papers to rate their own papers. 1200 scientists responded, with over 2000 papers being rated. Among the papers self-rated as stating a position on human-caused global warming, 97.2% endorsed the consensus.

This was strong vindication that our finding of overwhelming consensus was robust (not to mention 97% consensus has been found in multiple other studies using independent methods).

So a survey with 1200 scientists participating found 97% consensus. How did climate science denialists try to cast doubt on our result? By cherry picking a handful of scientists (I think 7 in total) claiming we had misrepresented what their papers were saying. Cherry picking is the technique of using a small sample (e.g., 7 dissenting scientists) to argue the opposite of what you find when you look at the full picture (e.g., 1200 scientists showing 97.2% consensus).

Let me take a step back for a moment. Other questions in this AMA have asked how to respond to misinformation and science denial. The answer: explain the science then explain the techniques used to distort the science. Use attempts to cast doubt on the science as a way to explain the science plus boost critical thinking, thus inoculating people against misinformation. This reply is an example.

~ John Cook


How can we tell the difference between corporate science and science for the people?

unicraven

Hello there!

Corporate or industry-funded science is a tricky thing. It has been well established in the biomedical field that funding can bias study results, which is one of the reasons why disclosure of conflicts of interest is so important.

But industry science is science. It is part of the scientific corpus. In fact, it's super important to include all of the science being done, from industry to government to academia to NGOs, when looking at what the "consensus" view has to say. When you get agreement across all of these different sources, that really strengthens our confidence that a conclusion is valid, and we see this with climate change.

There's a big campaign called "#ExxonKnew" that accuses Exxon of knowing full well about the reality of climate change for decades while trying to mislead the public about it. I'm not going to adjudicate whether Exxon is guilty of a crime, but interestingly, Exxon's defense is basically "yeah, we knew, but we published our work" - and indeed they did. You can see decades of Exxon-funded work affirming the scientific consensus on climate change. Knowing that studies funded by, say, the Sierra Club, Exxon, the US military, and universities all say humans are changing the climate is reassuring, because it guards against sources of possible bias like finances or groupthink.

~ Peter


There's one big issue when it comes to communicating science to people: conflicts of interest. I think that's the main reason we didn't know about climate change / pollution levels earlier. That's the reason we have such expensive and often dangerous drugs in pharmacies, when there could be better and cheaper alternatives. Do you think it's possible to move past this?

itakmaszraka

We have seen past examples (e.g., the tobacco industry) where vested interests spent millions of dollars to misinform the public, but eventually science "won". Tobacco is an instructive example - how did science win? The turning point was that iconic moment when the tobacco industry was put on trial for misinforming the public: the executives all said they did not believe smoking was addictive, and then it was revealed that their own internal scientists had found that smoking was addictive.

The turning point was when the public realized that they had been deceived by industry distorting the science for profit. So we need to communicate the scientific evidence but a crucial story that we also need to tell is the story of vested interests spending huge amounts of money to misinform the public.

That's why efforts such as Inside Climate News' reporting are so important. They found that companies such as Exxon knew back in the 1980s from their internal scientists that humans were causing global warming. But in the 1990s, they went on to fund climate misinformation to confuse the public about human-caused global warming. These cynical, profit-motivated activities are powerful stories that are easily grasped by the public, and inoculate them to some degree against misinformation from these industries.

~ John Cook


People tend to have a very strong emotional reaction to climate change and strongly deny it. Have you met anyone who will not even try to listen to your arguments? Have you ever been able to get through to them with the vast knowledge you have?

Its_Free_RealEstate

In most cases, when engaging with a person who rejects a scientific consensus, I haven't had much success. Probably the one exception is my father. We had lots of "conversations" (arguments?) about climate change, where my dad argued against human-caused global warming. Then one day he told me he accepted that humans are causing climate change. After I picked myself up off the floor from shock, I asked him what changed his mind. He replied, "I've always believed that."

So unfortunately, because of his denial of his past denial, I wasn't able to directly discern what lever changed his mind. However, I've tried to deconstruct it, and I have a possible explanation. Over the previous year, he had installed solar panels on his roof. He'd always stress to me that he did it for hip-pocket reasons, not environmental reasons. Every three months, he'd call me to tell me how his electricity bill was a check where he received money rather than paid it.

When people's beliefs conflict with their behavior, they experience cognitive dissonance. That makes us uncomfortable, so we try to reduce the dissonance - either by changing our behavior or by changing our beliefs. Changing our beliefs is actually easier and takes less effort. For example, we like smoking, so we're skeptical of the science showing smoking causes cancer. Or our lifestyle is carbon-intensive, so we don't believe that carbon emissions are causing climate change.

But in my father's case, he had actually adopted environmentally friendly behavior. So my guess is that he changed his beliefs about climate change to bring them in line with his low-carbon lifestyle. (disclaimer: I have no scientific evidence for this, it's just speculation :-)

More broadly speaking, I think that when people realize they've been deceived by misinforming techniques, that inoculates them against the influence of misinformation and makes them more open to scientific evidence. This is what I found in my recent paper on inoculation (http://sks.to/inoculation), where I explained a technique of science denial and then found that misinformation using that technique no longer influenced people. Most interesting was that the misinformation was neutralized across the political spectrum. It doesn't matter whether you're liberal or conservative: no one likes to be misled.

But we also see this dynamic anecdotally. For example, a famous recent case of someone who changed his mind about climate change is Jerry Taylor, who founded the conservative organization Niskanen Center (that advocates for climate action). The catalyst that got him investigating the science was when he realized he was being fed misinformation by a climate science denialist.

~ John Cook


Hello there - I run a group that advocates for scientific literacy and improved science communication. Would you be interested in stopping by one of our meetings next time you're in the Boston area?

The_Cantabrigian

Happy to if I'm ever in town. You can reach me at http://evidencesquared.com/contact-us/ to give me your contact details.

~ John Cook


I need to do a persuasive speech on a topic, but I want to do it on a topic that is clearly supported by science, like global warming. Any other suggested topics that people don't believe in but have strong science backing?

Darth_Simba

Hello there!

The age of the Earth and the origins of complex life are other areas where there is overwhelming scientific agreement but prevalent worldview-based misconception among the public.

~ Peter


I need to do a persuasive speech on a topic, but I want to do it on a topic that is clearly supported by science, like global warming. Any other suggested topics that people don't believe in but have strong science backing?

Darth_Simba

Hi there!

Emily Vraga here - John & Peter have invited me to co-host the show several times. Another great issue is genetically modified foods: much of the public thinks they aren't safe, but scientists overwhelmingly think they are.

http://www.pewinternet.org/2015/01/29/public-and-scientists-views-on-science-and-society/

~Emily


License

This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.