Commentary

Mis- and disinformation: Threats to Nebraska’s economy, health and safety

January 31, 2022 5:45 am
[Image: Misinformation word cloud (Getty Images Plus)]

As partisan polarization deepens within the United States, Americans are increasingly failing to agree not only on policy, but also on what is true. At the heart of this matter lies the spread of false narratives, which accelerates (and is cultivated by) partisan divides. The circulation of misleading information has sown national discord by warping once-unifying issues into conflicting personal agendas. Worse, debates can quickly escalate into accusations about who is intentionally stirring moral panics for personal gain. Our societal ill is not just partisanship. It is that Americans believe in starkly different realities.

For Nebraskans, false information has stoked confusion about whether their state faces real threats to its agricultural prosperity and public health — cornerstones of Nebraska’s Good Life. Local citizens continue to wrestle with questions along these lines: Is climate change truly a human-made problem? Is COVID-19 as severe for unvaccinated individuals as health experts make it out to be? Even as extreme weather devastates Nebraska farmlands and doctors delay surgeries due to the influx of COVID-19 patients requiring hospital beds (even UNMC, one of the nation’s leading medical research centers, has grappled with capacity), the mixing of truths and untruths has kept Nebraskans divided about the gravity and causes of these issues. 

Fortunately, no matter how heated their disputes, Nebraskans have so far taken a civil approach to their conflicts. Yet the spread of false narratives still poses a threat to the state's future security. Widespread falsehoods can split people into distant ideological camps that dehumanize their opponents and even radicalize ingroup members toward political violence. The move from extreme ideas to violence is not merely a risk for members of local organized groups, but for any otherwise peaceful citizen surrounded by like-minded others. Thus, in addition to climate and public health concerns, the risk of violent extremism makes it critical that we understand how false information spreads, what it might look like and how we can fight it.

The spread of misinformation and disinformation

Misinformation and disinformation both refer to misleading information, but they hold one key distinction: Misinformation unfolds unintentionally, whereas disinformation involves the intentional fabrication and spread of false narratives. On the internet, these two classes of false content can originate domestically or abroad and can shape human behaviors such as vote choice or, at their extremes, culminate in religious or racially/ethnically motivated violence. 

Online mis- and disinformation spreads in two main ways: people, bots and algorithms influence, first, what is shared online and, second, who interacts with whom about that content. Once false narratives are introduced online, they can be quickly promoted, discounted or distorted by others in ways that instill a false sense of consensus (or dissensus) among pockets of users. One wrong click on TikTok, for instance, can catapult a person deep into communities of extremist content. The accelerated formation of echo chambers by bots and web algorithms can heighten emotions and sentiments against dissenters, and the illusion of extraordinary (dis)agreement can hinder productive, fact-based conversations around critical issues.

What makes false ideas ‘stick’?

False ideas persist when they are new and unusual, morally and emotionally laden, use images and extend to offline communities. Although people generally hope to share accurate information, they often also share new content online (regardless of accuracy) to signal that they are "in the know." The issue is that this inclination can lead people to share new and hyperpartisan falsehoods, which often use more moral-emotional words and spread more quickly than true stories. Additionally, collages of mis- and disinformation look more compelling and credible than words alone, and as information crosses platforms (say, a screenshot of an incomplete Twitter thread reposted to Instagram), it becomes decontextualized and tough to refute. These false narratives also last longer when spread in ongoing communities that have both an online and an offline presence, such as political parties or extremist groups.

Combating false information

There is hope, however, for fighting mis- and disinformation. Recent research shows that journalists and social media platforms can help counter false information by fact-checking or by nudging people to think about the accuracy of what they are sharing (like Twitter's read-before-you-retweet prompt). That said, a study by MIT researchers finds that these methods can backfire when online users try to correct and debunk inaccuracies in each other's posts. Namely, replying to false tweets with links to fact-checking websites leads users to subsequently retweet lower-quality information with higher partisan slants and more toxic language. In other words, we should be diligent about communicating accurate content but wary of publicly debunking others online.

The threat of mis- and disinformation continues to grow as more people turn to online news sources and as social media algorithms become more sophisticated. When we fixate on the face-value credibility of people and groups online rather than evidence itself, we lose sight of what is real and actionable. Our ability to talk productively about truths determines our likelihood of forming collective solutions for preserving our farmlands, public health and security. To do so, we must first acknowledge in conversation that the fears and concerns of our opponents are part of their realities. 

This essay reflects the views of the author and does not necessarily represent the views of the University of Nebraska at Omaha.  


Tin Nguyen

Tin Nguyen is a research faculty member in the Management Department of the University of Nebraska at Omaha’s College of Business and at the National Counterterrorism Innovation, Technology and Education (NCITE) Center. A Minneapolis native and Creighton University alum, Tin received his doctoral training from Penn State in Industrial and Organizational (I-O) Psychology before returning to Omaha.
