Most People Believe Disinformation Doesn’t Work

Julie Hotard
14 min read · Aug 9, 2021


Houston, we have a problem. Most people in the U.S. don’t believe in one of the most powerful forces in our culture — disinformation. At least they don’t believe it works on most people. And most don’t believe we can do anything to stop it from working — at least not without violating the First Amendment.

Even the president greatly underestimates the dangers of disinformation, says Biden’s former disinformation czar.

“Unfortunately, the Biden administration hasn’t seen disinformation as the crosscutting threat that it is,” Jankowicz said. “The same mistakes have been repeated over and over.”

You can read this idea all day on social media — the idea that propaganda doesn’t work that well: that it only works on stupid, crazy, racist or irresponsible people, or only on those few people who have a burning need to believe what the propagandist says.

In mainstream media too, you can read about how there’s no reason to believe disinformation works very well. Here is one of numerous examples:

However, research shows that repeating lies typically causes humans to believe them. It’s normal human thinking. It doesn’t mean that the conned person is stupid, crazy, or irresponsible, or that the person needs to believe lies. This is called the Illusory Truth Effect (ITE).

Most people don’t believe the ITE is real. See how easy it is to be mistaken in your beliefs? Disbelief in the ITE is itself an example that illustrates the ITE. People constantly say, on the internet and elsewhere, that people “have agency” and so can choose not to believe constantly repeated lies from a source they trust. Well, even if some people CAN choose not to believe the lies, very few people actually make that choice.

As is often the case, misunderstanding the problem leads to not being able to solve it. If you think people who believe lies are unusually stupid or crazy, maybe you think nothing needs to be done about disinformation, because you think it won’t affect many people. Or you think “you can’t fix stupid.”

But actually, you CAN fix stupid. It’s the reverse process of making people stupid.

Maybe you think people who believe lies are “The Other” — that they are different and are not good people like you and your friends are — that they are unusually irresponsible, racist or otherwise evil. Maybe you rage at them on social media, and then forget about the disinfo problem — as if raging is a solution.

A friend of mine who’s helping me spread the word about disinfo was amazed to see that a recent post about disinformation was quickly followed by literally hundreds of responses saying how “stupid” people are who believe disinformation.

Alternatively, maybe you want to save your brother or niece or friend, but you may think that believing lies is NOT the effect of hearing them repeated — and associated with intense emotions — thousands of times per year by Right Wing TV, radio, newspapers and social media. So you think a few one-on-one conversations with your relative or friend should suffice to change their mind, and you can’t understand why they don’t.

If you are a journalist and you don’t believe repeated lies work, then maybe you “preach to the choir.” Well-meaning journalists at Right Wing Watch, Media Matters for America, and those who work the “disinfo beat” in mainstream media organizations do this. These people have limited impact, because they are debunking “to the choir” — to the people who do NOT live in the Right Wing media bubble and do NOT hear the lies thousands of times per year.

These misunderstandings are why it’s so important to look at disinformation within the context of our own society’s beliefs. If we don’t believe propaganda works on very many people, then we’re not going to care about solving the problem.

Thanks to propaganda researcher Dr. Emma Briant for stimulating my thoughts — and thus this article — through a Twitter thread she wrote this morning that relates to this subject.

She pointed out how disinformation research is not given enough attention, or used as much as it should be, by our society.

Her comments made me think about why. I realize no one is going to pay attention to research findings if they can’t admit the problem being researched is real — if they can’t bear to look at it and can’t bear to hear the terminology used. People also don’t feel like hearing about a problem if it seems impossible to solve.

I am not an academic researcher. I study disinformation from a Big Picture point of view. I have a wider lens through which to view disinformation than most people who study it, because my background and interests are different from those of most propaganda researchers. Due to this wider lens, I notice some aspects of the problem that others don’t.

It’s important to look at the field of disinformation research within the context of one’s own culture’s false beliefs and disinformation.

Dr. Briant notes that when she first joined Twitter in 2012 there were very few accounts registered by people with a specialist interest in propaganda. “Communication Studies scholars still thought we were weirdos and the internet was democratizing enough to make propaganda impossible at that time.”

Impossible. Yes. Let’s let that sink in. Even Communication Studies scholars didn’t believe propaganda worked. They were in denial of reality about a huge problem. They simply ran away from it.

The false belief that disinformation isn’t powerful has been around a long time. For example, the 1928 book Propaganda by Edward Bernays didn’t sell very well to the public. Why? The public hated the word “propaganda,” which brings to mind lies and deception. They were disgusted by the idea that they might be manipulated. They ran away and denied the reality.

People in charge of corporations, however, were very interested in using it to sell their products and services — resulting in Bernays having a thriving career.

So did most people feel willing to admit they were manipulated and try to solve the problem by stopping themselves from being manipulated? No. Did other people, for example those in corporations, want to manipulate others? Yes.

A similar situation occurred in World War II, before the necessity of counterpropaganda operations was finally accepted. At first, the lessons from counterpropaganda operations in World War I were ignored; even the military didn’t want to look again at the fact that propaganda could influence our troops and destroy their motivation to fight the war. This situation is noted in a book by Paul Linebarger.

How severe was this problem? Severe enough to delay the production of the counterpropaganda that was necessary for winning World War II. We’re lucky the delay wasn’t long enough to cause us to lose the war.

Dr. Briant states that Communication Studies scholars at first dismissed propaganda researchers like herself by claiming they had deterministic ideas about media effects. Yes. They had the idea that propaganda works. Which was correct. To me, this seems more important than the specific criticism claiming that propaganda researchers viewed propaganda through a “hypodermic needle” model. The big issue was likely the desire of communications researchers to disbelieve in the power of propaganda.

Denial is very important in disinformation situations. When people believe lies, they also deny the truth — real situations in the real world.

One interesting aspect of being in denial is: We’re often in denial about being in denial. People feel ashamed about shoving something important under the rug, and don’t want to admit doing so or having done so.

That’s why it doesn’t surprise me that the same Communication Studies scholars who dismissed propaganda research because they didn’t believe propaganda could work suddenly admitted disinformation was powerful — but not until they had worked out a neat and tidy solution: inoculation of the public against disinformation. They denied and covered up the fact that they had previously denied reality.

People who are in denial often stay in denial. If they don’t stay in denial, they may flip to facing the problem by taking one small step beyond denial, while at the same time denying the embarrassing reality that they used to be in denial.

The neat and tidy solution — inoculation of the public against disinformation — hasn’t made much impact. A solution obtained from looking at a problem in a very limited way and taking one small step beyond denial — rather than facing the problem thoroughly in all its aspects — usually doesn’t work well.

To recap, our culture has denied the reality of powerful propaganda and also hates the word. “Disinformation” may seem not to be a good word either, if one is thinking only about clarity and specificity. However, the term seems to be less hated than propaganda. So maybe it will have to do until a more appealing term comes along.

However, it’s not only the label “propaganda” that faces obstacles from the public, and even from some academics. It’s the concept — the very idea that propaganda is possible and powerful and works on most people.

Somehow we need to help each other out of denial. We need to face the truth.

People who study propaganda — the spreading of messages that are often lies — are not necessarily having good success at getting people to face the truth. They’re not having good success at spreading truthful messages about their research and how our society can benefit from using it.

I’ve already mentioned that one reason is our cultural disbelief in the power of propaganda. Many Americans see these researchers as solving a problem that doesn’t exist, doesn’t affect many people, or can’t be fixed because “You can’t fix stupid” and because it’s hard to fix someone else’s “irresponsibility.”

Another reason disinformation researchers have trouble getting attention is that we live in a plutocracy. Plutocrats rule partly through the power of expensive disinformation. Cambridge Analytica — which conducted widespread social media campaigns leading up to both Brexit and Trump’s 2016 election — was financed by someone so wealthy that he owed almost seven billion dollars in back taxes at the time he financed that operation.

He had almost limitless resources to finance a search for ways to sell disinformation to people of every personality type and life circumstance.

The Big Lie that the 2020 election was stolen from Trump is also well funded.

By contrast, the average disinformation researcher is not sleeping in a “bed made of money” like many of the people involved in researching or spreading disinformation. Here’s an article about a deliverer of propaganda who admits to lying in such a bed.

By the way, if you go to the YouTube video of this conversation between Malcolm Nance and Ben Shapiro, it’s followed by hundreds or thousands of responses praising Shapiro and bashing Nance, and almost none doing the reverse. This is very common on social media of all kinds. Swarms of trolls cause the Right Wing view to predominate. It seems unlikely that this happens constantly by accident. I’ve written about this situation in the past.

If the average disinformation researcher were sleeping on a bed of money, as the people creating and spreading disinformation are, we would already have found the best ways to break through our culture’s heavy denial of the power of disinformation, and our society would be well on its way to facing and solving the problem.

Here is one more major aspect of our disinformation problem, related to the issue of denial of disinformation’s power. Propagandists study people’s individual and cultural strengths and weaknesses (such as our cultural denial of the reality of disinformation’s power) — to find ways to use them against us.

Maria Konnikova, in her book The Confidence Game, notes how successful con artists know us better than we know ourselves. A propagandist is a type of con artist. For that reason, the advice to “Know thyself” and “Know thy culture” is an essential key to avoiding being conned or fooled by lies. It’s important both for ordinary people and for researchers.

It’s easy to see how propagandists can use people’s weaknesses against them. Propagandists just cater to people’s vices so intensely that people start proudly displaying those vices as if they were virtues — as we saw during the January 6th insurrection and at many other times recently.

Here’s an example of belief in a lie that commonly occurs on the other side of the political spectrum. Liberal social media seem firmly committed to the habit of raging at the supposed racism, stupidity or irresponsibility of people who believe lies. The belief behind this habit of rage-tweeting is that disinformation problems can’t be fixed but can only be vented about.

This keeps the focus off of research that might solve the problem. It also directs attention away from the billionaire plutocrats who finance disinformation. It focuses blame and attention instead on the ordinary people who are being conned — as if that is the only significant part of the problem — as if there isn’t Big Money behind Big Lies.

I don’t know whether this habit of raging — and this belief that disinformation problems can’t be solved — arose organically from liberal Twitter, or whether it’s incited by Right Wing propagandists pretending to be liberals for the purpose of leading liberals astray — or whether it came from a combination of both. Whatever the origin is, that habit and that belief are extremely beneficial for plutocrats and their propaganda efforts.

It’s harder to notice how people’s virtues can be used against them than how their vices are. Below is one way that can be done. In the U.S. we value some very useful and important traits, such as individuality, independence, personal responsibility and freedom.

How can a propagandist use those against us? By encouraging us to view them, pursue them and practice them in unrealistic ways. If we’re unrealistic about these traits, we end up going in the opposite direction and losing those qualities. For example, if we lie to ourselves and believe that being free means refusing to get vaccinated for COVID, because we believe COVID is a Democratic hoax, we may get sick, infect our family and friends and die of COVID.

If we believe we’re independent or free in ways we aren’t, we’ll never get to become independent or free in those ways. You can’t learn things you think you already know.

Here’s a quote about that, one that has been attributed to many people.

As if fate has conspired to illustrate the quote itself, it is attributed to many people, but no one actually knows who said it first. There’s no evidence it was Mark Twain. See how easy it is to believe something that isn’t true? Even the good folks at Goodreads.com, whom I like and appreciate in many ways, get this wrong. Here is an investigation done on the quote.

Another way good values can be used against us is by distracting us from developing the admirable quality and focusing us instead on scorning someone else for supposedly not having the quality. An Us vs. Them dynamic is created. People seen as “The Other” are said to be evil or incompetent — compared to the good “Us.”

Personal responsibility is perhaps the trait most often used in this way. As Isabel Wilkerson notes in her excellent book Caste, in our plutocratic society we reserve most of our empathy for privileged and upper-class people.

However, plutocratic propaganda tells us constantly about the supposed irresponsibility of the poor, minorities, ordinary people who believe Right Wing media lies etc.

Right Wing propagandists have been studying American culture and using it against us for a very long time. Here’s a pamphlet that was sent to Republican candidates running in the 1990 elections, noting the concepts — and the words describing them — that are favored and disfavored in American culture. Ever since then, these words — and the concepts they describe — have been used to glorify Republicans and to bash and demonize Democrats and the groups the Right Wing associates with Democrats (like minorities, journalists, coastal dwellers, city dwellers, scientists, academics).

This is why we need to know what appeals to people in our culture and what disgusts us — in order to “Know thy culture.”

Here are some books I found that can be useful to “Know thy culture.” It’s easier to see a picture clearly if you are not inside of it. That’s why many of the ideas in some of these books came from people, or are written for people, who are — or originally were — outside of American culture.

American Ways: A Cultural Guide to the United States by Gary Althen. I think there’s another book by that name, so be sure to get the one by that author.

Incredibly American: Releasing the Heart of Quality by Marilyn R. Zuckerman and Lewis H. Hatala

The Culture Code: An Ingenious Way to Understand Why People Around the World Live and Buy as They Do by Clotaire Rapaille

If we read books like these, we can understand our culture better.

Here’s one last thing we can do about disinformation. We can help each other to break through our culture’s denial about the power of disinformation. We can all help by spreading the word to friends and relatives — whether in person or through social media — about the Illusory Truth Effect.

Please share the Wikipedia page for it that I linked to earlier in this essay. People need to be aware of the powerful effects disinformation has on the average person — whether stupid or smart, selfish or kind. Then people will be more motivated to find solutions to the problem, because they will know it’s real.

By the way, Americans know how to disable a propaganda machine. We’ve already done it before, in a different context than the present one. We could do it again.
