
Summary of “this season: dis/misinformation”
Visit these links for more resources about dis/misinformation:
- Omidyar Network
- Encode Justice
- “Combatting Information Manipulation: A Playbook for Elections and Beyond”
- Cyber Collective
- ReThink
Welcome to the third installment of “this season,” a storytelling program by March On Foundation (MOF) and Quitman Studios that brings experts together to discuss the most pressing issues of the season.
We hear about “disinformation” more and more these days. It impacts our elections, the quality of our political discourse, and the very safety of the world’s population. Disinformation also contributes to radicalization and homegrown terrorism, and it is now easier than ever to sow seeds of discord.
As panelist Emma Leiken said, “You have a real megaphone and a lack of friction in the system… the lack of speed bumps in the digital realm mean that any kind of information, especially false information, can travel really quickly at scale.”
And as always, the most marginalized communities are the ones who suffer the most from disinformation’s consequences. Young people are particularly vulnerable.
Our panelists this spring shared our desire to clear up the different definitions in the world of disinformation and offer tips to help people avoid spreading it themselves. Esther Pang, executive director of March On Foundation, served as our moderator. Our panelists this season were:
- Emma Leiken, a human rights advocate and technologist with a commitment to belonging, information integrity, safety and inclusion both online & off. Emma currently leads a portfolio focused on youth organizing and responsible technology at Omidyar Network. She holds a B.A. in religion from Oberlin College and an M.A. in international development with a focus on technology from the LSE.
- Sneha Revanur, founder and president of Encode Justice, an international 501(c)(4) organization mobilizing youth for human-centered artificial intelligence. Sneha is currently in her first year at Williams College, pursuing a degree in Political Economy and a certificate in Arabic. She is also an incoming summer intern at the Center for AI and Digital Policy.
Emma and Sneha outlined three types of false or misleading information:
- Misinformation is not mal-intentioned, but still extremely harmful. Misinformation is spread when you unknowingly share untruths with others. This can happen when you share an article without having read it first because it had a headline that was consistent with your worldview. Other examples include cherry-picking data and taking quotes out of context. As Sneha said, “Anyone can be a perpetrator of misinformation.”
- Disinformation, however, is distinctly malicious and manipulative. It consists of untruths intended to cause harm, destabilize societies and communities, and sow false narratives about socio-political situations. The Russian government under Vladimir Putin is notorious for spreading disinformation both at home and abroad, e.g. falsely proclaiming that Ukraine is run by Nazis to justify Russia’s war against Ukraine.
- Midinformation is a newer term in the world of disinformation. Midinformation is knowledge that stands in the middle, while we’re still working our way to the truth. A great example of midinformation is research from the early days of the COVID-19 pandemic, when scientists were still looking for answers and sharing their findings as they came. This led to contradictory requirements for mask usage that contributed to confusion – and in some instances, to future mis/disinformation.
Disinformation takes many forms, and the rise of AI has made it both more sophisticated and much more dangerous.
One area of particular concern for Sneha is deep fakes, or synthetic media that has been manipulated to make one person look like another. Deep fakes are becoming harder and harder to identify, and it’s difficult to train people to spot them. Sneha said she often sees them used maliciously in the form of image-based sexual abuse, also called nonconsensual distribution of intimate images (commonly known as “revenge porn”), or to spread socio-political messages by impersonating people in authority.
It isn’t only individuals who are complicit in spreading dis/misinformation: corporations that are aware of their contribution to disinformation often lack the will to address it, and the governing bodies tasked with this challenge lack the teeth to enforce their recommendations.
Messaging platforms can be a megaphone for disinformation, spreading information exponentially faster than word of mouth. The same goes for social media sites like Facebook, making them breeding grounds for radicalization.
Inaction from policymakers on technology protections only exacerbates the problem; as Sneha pointed out, “Self governance alone is not the solution. We need to have government intervention.”
We cannot prevent people from using modern technology; in fact, it would be unfair and unproductive to try. There are many possible solutions to this problem, but it will take the entire community, from the top down, to make them work.
A solution for policymakers and corporations is to create friction, or “speed bumps,” that make it harder for people to spread misinformation. The messaging platform WhatsApp offers one example: during COVID-19, it capped groups at 256 members, labeled frequently forwarded messages, and limited the number of chats a message could be forwarded to at once. These practices create friction that slows the spread of misinformation. Other ways to create friction include algorithms designed to curb the spread of falsehoods, and clear models of, and transparency around, how algorithms create recommendations for Internet users.
But what about on an individual level? How can we as a society be accountable to one another? Emma shared her personal “digital hygiene routine” as an example:
- Engage in critical thinking. Always verify the sources of your information. According to Emma, “The question is not who do you trust, but what do you trust.”
- Always read the full article before sharing it.
- Check for confirmation bias: don’t share an inflammatory headline just because it shares your worldview.
- Recognize that there is always a risk of narratives that don’t fully capture a situation, and that things can be taken out of context.
- Set high standards of transparency, accountability, honesty, and humility for yourself in what you read and how you share it with others.
- Make a digital hygiene commitment and stick to it.
Most importantly, as Sneha said, “Don’t lower the standard of truth and objectivity.”
This conversation is the third in a series, with our next panel scheduled for the Fall. Sign up here for exciting updates regarding this season!