Persuasion Research for Learning Professionals and Debunkers
This article was updated on 12 April 2019. Note that it needs more updating, as recent research on debunking suggests that it’s desirable to debunk erroneous information elaborately, whereas some of the earlier research suggested focusing on the correct information. The truth is that this field of inquiry is fairly new, and lots more research is needed before we can settle on a definitive set of guidelines. I’ll have more later…
Our mission in The Debunker Club is to rid the learning field of myths, misinformation, and outright deceit. We have three main mechanisms to do this:
- Persuade purveyors of misinformation to stop spreading misinformation.
- Create and maintain healthy skepticism among learning professionals (to make it less likely that they’ll be taken in by misinformation).
- Share good information — that is, recommendations that are research- and/or evidence-based and proven in real-world practice.
In each of these endeavors, we are trying to persuade.
Given this, we ought to take a look—at least a brief look—at the research on persuasion, so that we can be most persuasive. Indeed, because there is research to suggest that SOME persuasive efforts can actually make people believe misinformation more strongly, we ought to be very careful in how we attempt to persuade.
This brief post will attempt to quickly review some of the relevant research.
We, as Learning Professionals, Should Know the Persuasion Research
As learning professionals, we ought to make an effort to learn the persuasion research ourselves. Part of our job is to persuade our learners. I have made persuasion one of the Decisive Dozen, my list of the twelve most important learning factors, so I’m a big believer, and I will try to persuade you of its importance. I’ve also written a chapter in an unpublished manuscript (my infamous “big book,” Work-Learning) and will draw partly from that in this post.
Every learning intervention requires persuasion. We may need to be persuasive to nudge our learners to engage in the following:
- To enroll in a learning event.
- To engage deeply in learning.
- To believe the learning messages.
- To believe in the importance of the learning messages.
- To believe management will provide sufficient support.
- To believe that coworkers will be supportive.
- To revisit the topic at a later time.
- To apply their learning to their job.
- To persevere in the face of obstacles.
- To share their learning with others.
- To strengthen and enrich their learning periodically.
- To transfer what they learned to related, lateral issues.
But Dr. Thalheimer, isn’t persuasion just common sense? You be the judge! Here are some common persuasion mistakes we make in the learning field:
- Just presenting information.
- Using formal language—not connecting at a human level.
- Not acknowledging uncomfortable truths to the learners.
- Not learning about the learners’ motivations, skepticisms, and perspectives—and thus not being able to design learning messages specifically to be effective.
- Not inoculating our learners to the obstacles they may face.
- Not requiring learners to make a commitment to putting their learning into practice.
- Not utilizing testimonials from previous learners in our learning programs.
- Not asking learners’ managers to require a specific commitment to learning and application.
- Assuming that simply showing our learners their misconceptions will produce attitude and behavior change.
- Not understanding that trying to convince others can actually make their erroneous views stronger.
- Not understanding that clarity in our instructional materials can support learners’ belief.
When you hear the word “persuasion,” I bet you think of a person trying to use sound arguments or great oratory to convince other people. Well, you’d better get over it. That’s not how persuasion really works—at least, that’s not most of it.
Persuasion is partly about argumentation, logic, and evidence. There is some truth to our common understanding. But persuasion involves so much more. If you want to have maximum persuasive impact, you need to connect with people at a human level.
Researchers Susan Fiske, Amy Cuddy, and Peter Glick found that we humans—because we have to filter so many stimuli in our social decision making—tend to simplify our evaluations down to the two dimensions of warmth (trust) and competence.
When we try to persuade, we will similarly be judged. Our learners are sifting through vast swarms of stimuli in their work and in their lives. To respond successfully amid this chaos, they have learned to react quickly to the stimuli they face. Any learning messages we create will be judged in the same way—quickly and often with only shallow processing. They will judge our messages partly by assessing whether we are trustworthy and whether we are competent in relation to the persuasive issue.
Our learners’ cognitive machinery utilizes two channels: (a) an effortful conscious channel that is very good at focusing intentionally on one or a few issues in depth, and (b) an automatic unconscious channel that can respond to many issues with quick heuristic decision making. The automatic channel tends to engage before the conscious channel and is used more often as well. Responding automatically with heuristic processing usually works well, but it can produce some unintended consequences.
This is why training programs that focus only on conveying information are often so inadequate: they utilize only one of the two channels, and they utilize the channel least likely to inspire, energize, and entice learners to engage in the hard work that long-term learning requires.
To maximize persuasion, we need to go beyond the content. We need to utilize double-barreled persuasion, utilizing both the intentional conscious channel and the automatic unconscious channel.
Later, I’ll come back to how specifically we might do this.
Suppose you want to train a select group of citizens of the United States. You want to train some right-wing Republicans that government revenues do not go up when taxes are cut. You want to train some left-wing Democrats that higher oil prices are likely to create significant negative economic repercussions.
Fortunately for you, a vast majority of experts in both domains have views consistent with your training messages. Unfortunately, your target audiences hold views that differ from the experts’. Republicans believe that tax cuts lead to greater revenues because the drop in taxes is thought to stimulate the economy. Democrats believe that higher oil prices are good because they incentivize people to reduce pollution—and that any negative economic consequences are minimal.
Also in your favor, you have two more arrows in your quiver. First, your trainees are captives: they are required to take your training program. They will diligently sit through your full-day workshop. Second, you are a skilled trainer, able to employ great charisma, wit, and good hygiene to your advantage.
How successful do you think you would be in changing people’s minds if you crafted well-designed training relying on the experts’ views? I’ll give you four choices. Pick one before moving forward.
- Training would be significantly persuasive.
- Training would be mildly persuasive.
- Training would have little or no effect.
- Training would strengthen people’s erroneous views.
For the left-wing, right-wing scenario above, the answer may shock you. By simply presenting our learners with good arguments, with expert testimony, and with valid information, not only are we unlikely to change their minds, but we may actually push them to hold their views even more strongly than they held them before we intervened. Research has shown this in some specific contexts.[1] However, based on newer and better research,[2] this so-called backfire effect is likely less prevalent than we once feared.
Still, here are some examples in which the backfire effect did cause people to believe more strongly after they encountered counter-evidence. Campaigns to lower the rates of smoking can actually increase the rates of smoking.[3] Conservatives who were shown evidence that Saddam Hussein—leader of Iraq at the time the United States invaded in 2003—did not have weapons of mass destruction (WMDs) were subsequently more likely to believe he had them.[4] Those who were most wrong in their beliefs about welfare were least likely to be swayed by evidence to the contrary.[5] Consumers who pay attention to calorie information on food labels eat more unhealthy food.[6]
These findings should send chills up your spine. We can’t just present facts and evidence and assume that we will be persuasive. We must do more.
How Should We Debunk?
When we debunk, we are trying to persuade. We’re also hoping to prevent a boomerang or backfire effect that pushes people to believe more strongly in the misinformation they are conveying. But how can we be more likely to persuade?
The following chart summarizes the recommendations in the review article by Lewandowsky, Ecker, Seifert, Schwarz, and Cook (2012).
Because this graphic probably makes more sense if you read the article, let me clarify a few things.
First, it’s important to know that if you repeat the myth or misinformation—even if you remind people that it’s a myth—you’re probably making it more likely they’ll remember and believe the myth! Repetitions have a way of increasing belief, even when we tag the repetition as a falsehood. So, in general, you don’t want to repeat the misinformation. From a practical standpoint, don’t reiterate the falsehood but focus on other relevant facts.
Second, people like to be consistent in their mental models. Even if we rebut their beliefs, if they are left with beliefs that aren’t coherent, they are likely to revert back to their original false beliefs. So, in general, it’s helpful to provide them with a new way to think about a set of beliefs or concepts—to ensure that they won’t slide back to believing in falsehoods. To put this another way, we have to be able to convey a coherent story about what is true and why it is true.
Third, people have a tendency to believe what is conveyed to them, unless they have reason to doubt or be skeptical. If they already hold an erroneous belief, the counter-information we provide—because it isn’t consistent with their beliefs—is likely to raise their skepticism, making it harder for us to be persuasive. This, again, is a reason we can’t simply rely on presenting people with counter-evidence. On the other hand, if we can get folks in the learning field to maintain a generally higher level of skepticism, they are likely to bring more of their critical faculties to evaluating evidence. The Debunker Club may have its greatest impact by raising skepticism in general.
Fourth, it is very difficult to change people’s beliefs once they hold them strongly, especially when the misinformation is consistent with their worldviews or is encapsulated in a causal scenario or coherent story.
Fifth, simple explanations are more persuasive in overcoming misinformation than complex rationales or long lists of counter-arguments.
Sixth, multiple debunking efforts may be required to overcome long-reinforced beliefs.
Seventh, people may be more open to counter-arguments when we first enable them to self-affirm their personal identity by reflecting on values that make them feel good about who they are.
Other research suggests additional techniques that enable people to be more persuasive. Specifically, people are more likely to be persuasive when:
- They remind the persuadee of how they are similar to each other.
- They personalize the persuasive message.
- They are likeable.
- They treat other people with respect.
- They do favors for the persuadee.
- They utilize the foot-in-the-door technique, getting agreement on a smaller request first.
- They are celebrities or can invoke celebrities to make the persuasive pitch.
I’m just touching on the persuasion research, but we should now have enough arrows in our persuasion quivers to be effective — or at least, to be more effective.
Additional Resources
[1] (for example, Nyhan & Reifler, 2010; Schwartz, Parker, Hess, & Frumkin, 2011; for a wonderful review of how people become misinformed, and how such misinformation can be corrected, see Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012)
[2] see my blog post on this here, which refers to several studies, including this key one: Wood, T., & Porter, E. (2018).
[3] (Byrne & Hart, 2009, as cited in Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012)
[4] (Nyhan & Reifler, 2010)