How to change someone’s mind
Everyone knows the narrative trope in which our hero manages to encapsulate everything the tale has taught us. Regardless of the genre, one theme comes through: the truth will have its day.
It’s a wonderful concept but, like all good fantasies, it’s not quite real. It’s the world as we would wish it to be. The belief that the rest of the world is in fact stupid is one thing we all seem to have in common.
For example, a group of study participants had their mathematical literacy measured before performing a series of exercises. As you would predict, mathematical literacy was a strong predictor of the ability to solve mathematical problems. However, when the problems were placed into a political context, requiring the participants to analyse whether crime had risen or fallen in cities that had banned handguns, maths ability no longer predicted success. Instead, the political affiliation of the problem solver decided the outcome. Liberals were good at solving the problem when the data supported a link between handgun bans and falling crime, while conservatives did better when the data suggested the opposite. In other words, ability took a backseat to politics.
That’s a serious knock to any hope of a world united by facts.
We have a bias for supporting our own world view. People are fiercely tribal, and we are often closed to facts that contradict reality as we know it. What would it take to change someone’s mind? How do you get people to think differently in an information-saturated world, surrounded by opposed certainties?
At least since the time of Edward Bernays, we’ve been puzzling over mass desire. With WWI propaganda having been so successful, Bernays wished to replicate the mass media model for businesses, helping to mould the American public’s mind for his clients’ products. (His reputation was likely assisted by his making no secret of his famous uncle, Sigmund Freud.)
Among his notable achievements was his work for the tobacco industry. Noting American society’s disapproval of women smoking, as well as women’s growing demand for greater social freedoms, Bernays orchestrated a tobacco campaign in which attractive women were photographed by local reporters smoking during New York’s Easter Sunday parade. When a local paper asked why they were smoking so publicly, their answer was that the cigarettes were “torches of freedom”.
The reason we are so ready to accept marketing on its own terms is that a brand is unlikely to carry any emotional association before the campaign gives it one. As we know, ideas and emotions sell, but objects rarely do. But what about those things that already carry emotional weight?
As early as the 1940s, the evidence for the media’s power to shape the ideas and attitudes of the public, on everything from marketing to the influence of violence in films, was considered contradictory. In one of my favourite examples of academics throwing shade, the following was used to describe the state of communication research in 1948: “...some kinds of communication on some kinds of issues, brought to the attention of some kinds of people under some kinds of conditions, have some kind of effects.” Inspiring stuff.
Earlier models of media consumption tended to homogenise the public, treating people as atomised, passive participants. An increasingly long list of factors was found to influence what media was consumed, and what effect it would have. The unitary, passive audience - an idea which appears peculiarly simplistic to a modern reader - was abandoned, superseded by a complex list of factors that contributed to the media’s apparent effects, including class, political leanings, the passage of time, the social situation in which the media was consumed, and so on.
The enormous success of wartime propaganda was obviously put down to the conditions of the time. Perhaps fortunately for those who have read Orwell, it could not be replicated. During WWI, counternarratives among the domestic population were absent. Everyone knew who the enemy was, the majority of the public was united in opposition, and there was no appetite for an alternative perspective. This meant that the only job of propaganda was to reinforce the prevailing view of the time.
Reinforcement was something that the media was very good at.
By the 1950s, the notion that communication was associated with reinforcement rather than change was considered axiomatic. For example, a 1952 study by Kelley and Volkart found that when an incoming message ran counter to the norms of a person’s group, the less salient that group membership was to them, the more likely they were to be converted by it.
However, sometimes non-media factors favour a ‘flip’. The consumption of Western media in the Eastern Bloc was understood to follow, rather than precede, an individual’s identification with more free-market ideals. It was not the media itself that was the catalyst for change; rather, it reinforced the new group identification that the individual had already selected.
If your plans for world domination have been scuppered by the ineffectiveness of the media, then let me offer some advice. If you can’t get the answer you want, just ask a different question.
What are you more in favour of: inheritance tax or ‘the death tax’? What about welfare or social security? Defence spending or military spending?
One of the exciting (or troubling) reveals of the human psyche is not how stubborn we are, but how utterly precarious our opinions appear to be. People can change their minds at the slightest provocation. The use of different words, who is asking the question, or simply the time at which it is asked can radically alter someone’s response.
Simply rearranging the order of questions has been found to alter support for a topic by 5 - 10 percentage points. A 1981 study by Schuman and Presser found that 45 percent of Americans would “not allow” a Communist to give a speech, whereas only 20 percent would “forbid” it. Similarly, a study of voting patterns in Arizona in 2000 found that support for increases in school funding rose substantially when the polling booth itself was inside a school.
So variable are reported opinions known to be that, in his 1992 book The Nature and Origins of Mass Opinion, John Zaller described the formation and reporting of opinion statements as drawing on whatever happened to be ‘at the top of the head’. Zaller viewed the formation of opinions as something that could be modelled. While incoming information is filtered through the individual’s predispositions, the answer actually reported is built largely from whatever happens to be salient to the individual at the time.
Bear in mind that, according to Zaller, the messages received are still filtered through a myriad of individualised traits. Once through, your opinion is simply a balancing act between these arguments. Such things as question wording, what you’ve just heard on the news, or what state of mind you’re in at the time either add to or subtract from your mental ‘scale’, tipping it one way or the other.
Framing is another concept which calls on circumstance. Think of framing as walking from your living room to your kitchen, only to find that, when you arrive, you can’t quite remember why you went there in the first place. As you cross the threshold of the doorway, you enter a different mental framework. Whereas living-room-you was looking for the remote, the moment you cross the threshold (literally) you become a slightly different version of you (mentally). Were you making a sandwich? You were probably making a drink, weren’t you? There is a mental compartmentalisation between kitchen-you and living-room-you, so much so that the literal threshold of the doorway has switched you into a version of you that doesn’t retain the reason for being there.
You have slightly different mental programming for different places, different people and different social situations. When someone (or something) frames a particular idea as being an issue of fairness, you access a version of you that makes judgements regarding fairness. It’s still you, but the concept of ‘you’ is multi-faceted anyway.
Here’s another example used by Zaller. An experiment carried out in the US in the 1970s featured two groups of people, who were asked whether they would allow Communist reporters into the United States. Of the first group, 37% said they would allow it. The second group were asked the same thing, but only after they had already answered a question asking whether they believed American reporters should be allowed into Russia. The proportion who would allow Communist journalists into the US jumped to 73%.
The Communist reporter question, when asked alone, might be read as asking whether Communism should be quashed, balanced against the idea of press freedom. For the second group, the question is identical in content, but it has become a question of reciprocity: having just said that American reporters should be allowed in over there, refusing Communist reporters here would feel inconsistent.
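(An aside for the programmatically minded: a toy sketch of the ‘top of the head’ idea might look like the snippet below. This is not Zaller’s actual formalism, and the considerations, weights and function names are invented purely for illustration; the point is simply that changing which considerations are salient at the moment the question is asked can flip the sampled answer.)

```python
import random

# Toy illustration (not Zaller's formal model): an opinion is reported by
# sampling whichever stored considerations happen to be salient at the moment
# of asking, then averaging their direction.

# Hypothetical considerations, valence from -1 (forbid) to +1 (allow).
CONSIDERATIONS = {
    "press freedom matters": +1,
    "communism should be resisted": -1,
    "reciprocity is only fair": +1,
}

def report_opinion(salience, samples=3):
    """Draw a few considerations, weighted by current salience, and average them."""
    names = list(CONSIDERATIONS)
    weights = [salience.get(name, 1.0) for name in names]
    drawn = random.choices(names, weights=weights, k=samples)
    score = sum(CONSIDERATIONS[name] for name in drawn) / samples
    return "allow" if score > 0 else "forbid"

# Asked cold, anti-communist considerations may dominate; asked just after a
# question about American reporters abroad, reciprocity is more salient.
print(report_opinion({"communism should be resisted": 5.0}))   # often 'forbid'
print(report_opinion({"reciprocity is only fair": 5.0}))       # often 'allow'
```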
When you think about it, we do this all the time. Every time we’re in an argument with someone and we make the ‘have you considered it this way?’ play, we are attempting a version of priming - a way of stimulating the brain’s associative talents. By raising ideas that are mentally linked to particular concepts, we increase those concepts’ salience, making them more likely to be recalled. Priming is how we ask someone to ‘see a different side’ or ‘see it from my perspective’. The idea being considered is not new, but a very slightly different person is being asked to consider it.
So, if you want the right answer, make sure the right version of them shows up.
However, although we might answer a question slightly differently when we think about it another way, there are things that simply won’t be altered by a re-phrasing.
We’ve all been party to conversations where habit seems to trump evidence. There appear to be entrenched ideas that we humans cling to. It’s this type of thinking we mean when we say that the world seems increasingly divided. What we’re describing is essentially a rising form of tribalism.
This is because our partisan brains appear to fall apart when required to challenge or disprove our worldview. It does not just happen on a conscious level; there is evidence of this shaping even our perceptions of the world around us. Republicans reported Obama as having darker skin than Democrats did. Democrats were more likely to remember, incorrectly, that George Bush was on holiday during Hurricane Katrina. Our bias appears frighteningly unconscious.
Why can’t we all be a little bit more objective? The problem, unfortunately, is that truth is a currency with less value than we might hope. Group cohesion is hardwired into us because our strength comes from belonging, not necessarily from being correct. And it is for this reason that maintaining belonging often takes priority over accuracy. Regardless of political persuasion, all of us are closed-minded to certain facts.
So instead of asking what it takes to be ‘right’, let’s ask a different hypothetical question. When you challenge my deeply held belief that Donald Trump is great, what is the cost to me of you being correct? Have I lost an argument, or have I lost something more fundamental?
The first thing that happens when you confront me with a persuasive fact is that you throw me into a state of psychological limbo. I’m confronted by two worlds: one in which Donald Trump is an excellent leader, and another in which Donald Trump is incompetent. This lurching state of panic, known as cognitive dissonance, is deeply uncomfortable, and we avoid it wherever possible. But it has to be resolved, one way or the other.
I am now faced with a choice. I can accept the persuasive fact, likely putting me at odds with everyone around me whose opinion I respect, or I can dismiss the idea as ridiculous, misleading, or perhaps even... fake news. The desire to hold a true but divisive fact - one that sets me against my ideological group - often loses. This is because my ability to say the right thing, and therefore to succeed socially, hangs on my continuing to believe the body of knowledge that has just been thrown into disrepute.
But there’s another problem. One of the things that a social group provides is a kind of security. It provides a structure for perceiving the world, satisfying our desire for the resolution of ambiguity. This desire, hardwired into us, is called epistemic closure.
Humans do not cope well with ambiguity. It frightens us. Our desire for epistemic closure is satisfied by our belonging to a social group. It grants us a way to look at the world, make judgements about it, and resolve the chaos. It’s a salve for the immense complexity of the world - complexity that none of us, however clever we are, can possibly measure up to. We all seek epistemic structure in some form, whether it be provided by science, religion, or shorthand political and social judgements.
By challenging my ideology (and therefore my social group and identity), you have thrown me into a state of cognitive dissonance which I have to resolve. Will I come over to your side, thereby undermining my certainty on a number of issues? Or will I remain on mine, where I can dismiss your comment out of hand and likely win the support of my social group, returning to comforting tropes such as labelling anti-Trump messages liberal propaganda? I’m unlikely to resolve one type of uncertainty by creating more of it. So, it’s an easy choice: long live Fox News.
The world, and the world view
This article aims to shake some of the ideas and opinions you have, and ask whether you believe these things because of you, or simply because they are approved by people around you. As mentioned earlier, wherever you are and whatever you believe, the belief that the rest of the world is stupid is one thing we all seem to share.
It is precisely this approach which makes a person less likely to change course.
People seek certainty. Wider resolutions to uncertainty reside within the social group to which they subscribe. If you create doubt in someone’s mind while threatening the wider shorthand they use to process the world around them (a heuristic), then you leave them nowhere to retreat to. We crave the resolution of answers. Giving that up is too great a cost. It’s easier for the psyche to reject your argument than to surrender an entire worldview and be thrown into the chaos of uncertainty.
So, what allows us to overcome these inbuilt biases? Well, we know that partisan affiliation runs deep. Studies show that whether the person proposing a policy is part of ‘your group’ is a better determinant of support than the content of the policy itself. Support can be garnered even when the policy seems to run contrary to ideology. We defend our tribes against outsiders.
However, affirming a listener’s wider worldview might allow them to concede the point. By allowing the individual to hold on to their wider interpretive structure, you make it more likely that they will consider the particular issue in isolation, without undermining their need for epistemic closure and belonging.
Remember that the big enemy we’re trying to face is uncertainty. Generalised statements are what limit the questions we have to ask of ourselves. When we challenge a particular view, we create cognitive dissonance and, by threatening the social group to which that interpretive structure belongs, risk more uncomfortable dissonance thereafter. If you create uncertainty, people are more likely to adopt your view if you fill in the gap for them.
So, creating a new, concrete certainty when undermining the old one provides a new host in which mental security can reside. This won’t work for everything. Creating a very definite pro-choice belief in a fundamentalist Christian is unstable, as dissonance between this view and the wider social identity is inevitable. However, with things that are less attributable to ideology, it is better to resolve the uncertainty than to ‘leave it hanging’. The issue, unfortunately, is that definite answers are pretty thin on the ground in our modern world. It is precisely this problem that leads us to seek interpretive structures in the first place. But if you can find concrete answers, or there is a definite alternative structure that can be provided, you should provide it.
Last of all is the ideal. We crave belonging and a structure for interpreting the world, but there is no reason that this has to be bound up with parties, gender identities, or liberal-versus-conservative ideals. Labels are comforting, but they breed the types of opposition that entrench the fundamental schisms in the world. If you want to foster understanding, the best way to do so is to be kind. Build new social identities, particularly in the small communities that surround you. Make sure that these communities reward truth-seeking behaviour, so that your tribe rewards accuracy rather than just ‘saying the right thing’. Make saying the open thing, the quizzical thing, the right thing to say. Never call anyone stupid. Above all, have fun with people. Be supportive. Understand that arguments are less likely to win someone over than showing them a nicer way to live. We are not rational creatures; we are social creatures. Never risk reinforcing the barriers to cohesion.
Do not seek the approval of your group by lambasting people, and do not reward it when you see it in others. It’s about realising that their grounds for belief, just like your own, have very little to do with the ability to be rational, and far more to do with the situation they find themselves in.
George Dean is a freelance writer and holder of opinions. When not writing articles or slightly disturbing Science Fiction, he can be found reclining in coffee shops looking needlessly wistful. If spotted in the wild please send him back to his desk. Find him @georgedean27.