
News & Views of Phillips Since 1976
Monday May 20th 2024

Disinformation: How It Impacts Elections and What We Can Do About It


Illustration: Jake Ryan

As voters gather information on candidates running for office this election season, they will be inundated with massive amounts of information from a variety of sources. Some of it will be factual, while much will be misinformation or disinformation. What is the difference? Misinformation is false or inaccurate information; disinformation is false information deliberately intended to mislead and harm.

Minnesota Secretary of State Steve Simon calls disinformation “the conscious spread of knowingly false information.” This shouldn’t be confused with political “disagreement.” When weaponized, Simon says, it can incite violence and disruption, posing a threat to the integrity of elections, the backbone of a healthy democracy. He’s referring not just to January 6, but to the ongoing targeting of election workers.
In 1949 the Federal Communications Commission (FCC) created the Fairness Doctrine, which required radio and television broadcasters “to present fair and balanced coverage of controversial issues to their communities, including granting equal airtime to opposing candidates for public office.” This was an extension of the Radio Act of 1927, which mandated that licensees serve the public interest.

Over the years, challengers of the equal-airtime requirement called it an infringement of their First Amendment right to freedom of speech. In 1987 the FCC agreed and repealed the Fairness Doctrine.
In 2010, the Supreme Court’s landmark Citizens United ruling allowed corporations and unions to make unlimited, unregulated independent political expenditures. Together, these legal changes contributed to a deluge of negative attack advertising, opening the door for disinformation in American politics.

Today, the vast majority of the public gets news online, much of it from social media and other completely unregulated sources of information.

The American Psychological Association (APA) warns that the massive reach of unregulated “news” (including cable TV, social media, the internet, and a vast array of partisan actors from both here and abroad) has resulted not only in distrust of the news media but, by extension, in distrust of other institutions, such as government and science.

This rise in rampant disinformation has also led to a decline in interpersonal trust, which ultimately affects our local communities and personal relationships. We retreat to “bubbles” that reinforce our beliefs, whether they are true or false. Simon points out that the amplifiers in language – calling people “vermin” and claiming that elections are “rigged, fixed and stolen” – only exacerbate our sense of doubt and suspicion.

To combat disinformation before it spreads we can adopt proactive strategies. Early identification and debunking of false information, with credible fact checking, greatly reduces its potential to be widely spread and normalized.

Promoting media literacy – the ability to access, analyze, and evaluate media messages – is crucial to understanding when those messages are shaped by corporate media and outside influencers. Media literacy also helps us recognize when we’re being emotionally manipulated.

How do we keep up with the rise of disinformation driven by artificial intelligence (AI), including fake audio and “deepfake” videos? One proposed solution uses AI to spread the truth before the lies take hold, thereby providing inoculation against deceptive content – a concept known as “pre-bunking.”

American think tank Brookings suggests we all need to take responsibility for a positive, truthful, democratic path forward: “Government should promote news literacy and strong professional journalism. The news industry must provide high-quality journalism in order to build public trust and correct fake news and disinformation. Technology companies should invest in tools that identify fake news, reduce financial incentives for those who profit from disinformation, and improve online accountability. Educational institutions should make informing people about news literacy a high priority. Finally, individuals should follow a diversity of news sources, and be skeptical of what they read and watch.”
In the short term, Secretary Simon says there are three things we can do to protect our elections: 1) tell the truth; 2) use empathy when dealing with people who have been duped into believing untruths; 3) be transparent in showing how the system really works – “show, don’t tell.”
For voters to find the information they need to decide whom to vote for, they must have tools to identify the disinformation they encounter and, ultimately, to feel confident in the choices they make.

Want to take action? Read the informative AARP article “11 Ways to Fight Election Misinformation.” For accurate information on voting, go to the Minnesota Secretary of State website.

This is the first article in the League of Women Voters Minneapolis 2024 Democracy Series. The next article will go deeper into our discussion of election security.


Copyright © 2024 Alley Communications - Contact the alley