Digital literacy programs can help us separate fact from fiction
by Toni Denis, photos by Carlos de Gonzales
When a group of students from Prescott College and other area schools organized a peaceful Black Lives Matter protest on the Courthouse Square in September, alarmist rumors circulated on Facebook and other social media apps urging people to “be vigilant” about supposed outsiders coming to riot in Prescott.
The Sept. 4 protest, organized in response to the shooting of Jacob Blake in the back by Wisconsin police, wasn’t the first BLM protest in Prescott. Another peaceful rally organized by students came in early June after the death of George Floyd. The counter-protest response was a gathering of men armed with semi-automatic rifles in front of the Palace Saloon, who stated on social media that they were “protecting Whiskey Row” from rioters they’d heard rumors about. Afterward, some people praised them for being there.
The more recent protest started on a Friday at 3 p.m., as vendors were setting up for a weekend craft show, and drew a considerably larger, more vocal and menacing group of counter-protesters, who yelled at the 100 or so marchers and harassed them as they walked along the sidewalks. Police stood close by in large numbers to prevent altercations. A video of the event shot by a local youth and aired by TMZ shows just how hateful the crowd was toward the protesters against police violence.
5enses contributor Abby Brill, who attended the protest, reported that many of the counter-protesters claimed that the protesters had been bused in from out of town, which was false. They also claimed that the BLM organizers were aligned with Antifa, a fringe anti-fascist identity movement, which was also false.
“What started as a small Black Lives Matter protest spread, largely through social media, to attract perhaps 300 heavily armed right-wing militia, itching for confrontation,” Brill said. “While there may have been some ugly language used by some in both groups, there is no verbal threat equal to an angry man with an AR-15. Many of the armed crowd were shouting ‘Go home!’”
Courtney Osterfelt was heartbroken by the adult counter-protesters. “It’s time everybody embodies the tagline ‘Everybody’s Hometown,’” she said, pointing to the abusive treatment of young people as unacceptable and asking, “What does it mean to be a patriot and maintain civil rights while having civil discourse?”
Whether “itching for confrontation” was the catalyst for promoting a fake protest or not, word spread online that yet another BLM protest was planned for Prescott on Sept. 11, a week later. Despite several online denials that any event was planned, dozens of armed local men showed up anyway, dressed in camouflage as though they planned to battle an invading horde. With the exception of a few people who got their signals crossed and showed up with signs, no BLM protest happened. Police had taped off the front of the courthouse in case a spontaneous protest materialized.
The Arizona Republic and then USAToday.com covered the non-event after the fact, citing it as an example of super-spreading misinformation. The menacing aspect of armed men awaiting a nonexistent rally cannot be overstated, coming just weeks after a 17-year-old white man from Illinois traveled to Kenosha, WI, and shot three protesters at a BLM rally there, killing two.
Like much of America, Prescott is not immune to the flood of online misinformation. Social media in particular is a source not only of misinformed ideas but also of disinformation, content deliberately crafted to deceive people in order to advance a political or marketing agenda. People who spread this information have accepted and believe something that is usually easily disproven by a Google search or by consulting official, legitimate or government sources. Research shows that we are likely to promote stories that are repeated in our social circles or that confirm our biases.
This can lead to dangerous decision-making, from spreading wild conspiracy theories that inspire Rambo wannabes to shoot innocent people to voting for a candidate based on rumor and innuendo about an opponent, spread online by thousands of trolls, many hired by a foreign government. The latest “infodemic,” a term coined by the World Health Organization, is misinformation about Covid that can lead people to risk their own and others’ lives by refusing to wear masks. Some began taking a dangerous drug, hydroxychloroquine, and even drinking bleach because the president suggested these might keep the virus at bay.
Add in artificial intelligence programs that are becoming adept at composing news stories with data they’re fed, as well as creating “deepfake” photos and videos depicting public figures doing and saying things that are not real, and the problem looms even larger. So how can we as citizens protect ourselves from the overt or subtle influences of disinformation, misinformation, “alternative facts,” conspiracy theories and fake news?
Several universities studying this phenomenon have developed programs to attack it, realizing that the “marketplace of ideas” isn’t enough to remedy this bonfire of the truth; rather, it will take the efforts of every American to learn how to be responsible, digitally literate citizens.
One of the newest programs to stem the tide of bad information is at Arizona State University’s News Co/Lab. The research center opened a free, three-week online course in August to teach digital media literacy. Called Mediactive: How to Participate in Our Digital World, it covers how to weigh information sources and be an active questioner of facts instead of a passive consumer. The program is based on a book written by one of the lab’s co-founders, Dan Gillmor, a professor at ASU’s Walter Cronkite School of Journalism and Mass Communication.
The course promises to teach foundational “media literacy principles,” saying students will learn how to spot misinformation, assess credible sources and claims, explain how the professional news media operate, and use media to participate in the community. The course includes website content, videos and interactive activities.
Reviewing the website alone will give the casual media consumer an array of solid guidelines to follow, like “triple-check before you share” on social media. It promotes a technique developed by Washington State University Vancouver professor Mike Caulfield called SIFT — Stop; Investigate the source; Find better coverage; and Trace claims, quotes and media to the original context.
Here's another list of ideas from Mediactive for us to drill down on:
Be skeptical of absolutely everything.
Don’t be equally skeptical of everything; use judgment.
Open your mind; go outside your personal comfort zone.
Challenge your own assumptions.
Understand how media works, and how it’s used to persuade.
Keep asking questions.
Take a breath and consider a slow-news approach.
The website and course evolved from a project called the Global Security Initiative, an ASU program that addresses worldwide challenges like defense and security with tools and technology to solve problems.
Disinformation is a threat to world stability, so the Co/Lab was created to address it. The initiative’s priorities include cybersecurity; human/artificial intelligence/robot teaming; narrative, disinformation and strategic influence; and visualization and analytics. Gillmor says the 2016 election’s disinformation campaign was a wake-up call for social-media companies.
Facebook invested in the research program, which also got funds from the Democracy Fund, the Rita Allen Foundation, Craig Newmark Philanthropies, and the News Integrity Initiative.
Kristy Roschke, managing director of News Co/Lab, says 600 people took the live class in its “soft” launch. The second live class in September attracted 1,700 people, some by word of mouth but also by marketing through the Osher Lifelong Learning Institute in Arizona and nationwide through Arizona AARP, libraries and targeted Facebook marketing. A third live class is planned to start October 5.
“It is really designed for people to do it on their own time,” Roschke says, “however, we are over the course of the election season offering a live course with a couple of sessions each week. We’ve chunked it into a week-to-week time span so you can interact with us and interact with students. (But) we’re happy with people doing it however it works best.”
Someone who wants to learn quickly can read all of the online material in three hours, but the online course includes webinars and the ability to ask questions. That has been mutually beneficial, since Co/Lab has adapted, modified and improved the content based on feedback. Research and classes like Mediactive discuss how people can better interact with digital media so that bad actors trying to manipulate them have a harder time succeeding. Then, Roschke says, people are less likely to spread the misinformation on social media.
An Associated Press/NORC Center for Public Affairs Research poll in 2019 found that nearly half of Americans struggle with determining whether information is true. Respondents said they not only distrust the media, they also doubt information from government, scientists, academics and politicians.
The bombardment of online information paired with a lack of tools to discern truth takes a psychological toll, says Roschke. “Social scientists who do research on this say there are a couple of things that happen,” Roschke explains. “A lack of memory and an attention problem creates the illusory-truth effect. When you repeat a lie so many times all people remember is the lie — it has to make journalists think hard about the news they report. If you’re quoting something that is not factual, like treatments related to Covid-19, then even in a story you write to debunk that information, it can have an opposite effect.”
The tendency toward “doomscrolling” of negative or emotionally triggering political news and memes also has an addictive and psychologically damaging quality that can’t be discounted, she said.
News Co/Lab is not alone in its efforts. The News Literacy Project, founded more than twelve years ago to teach secondary-school students, announced in September that it had made its Checkology platform publicly available. A video describes how the nonpartisan platform works; the platform itself offers interactive games to keep users engaged. The website Newslit.org is a guide for adults, zeroing in on political information.
The tide is beginning to turn in the disinformation wars as social-media companies adapt and tech companies develop platforms and AI tools to identify fake news and mitigate some of the worst deceptions.
Twitter and Facebook recently stepped in to respond to viral disinformation: Facebook typically marks posts as false, while Twitter usually removes them entirely. Even YouTube is monitoring videos and showing informational panels on Covid to stop misinformation. Some of these efforts are AI-driven with some human supervision. Occasionally the AI is overzealous, sparking complaints about censorship, but it is having positive effects.
These social-media companies have purged thousands of accounts operated by bots, automated posting-engines created by special-interest groups and malicious foreign actors. That’s helped, too, but it’s still a game of whack-a-mole against new accounts and fake profiles.
The recent Netflix film The Social Dilemma highlights the dangers of relying on information from social media, which uses algorithms to detect a user’s interests and reinforces them with information biased toward them, creating addictive feedback loops. In interviews former employees of Google, Facebook, Twitter, Instagram and YouTube warn that these algorithms have been wildly successful in generating dollars for the companies, but are destructive when it comes to user psychology, particularly around politics. Ten Arguments for Deleting Your Social Media Accounts Right Now, a book by Jaron Lanier, explores the exploitation of social-media users by the companies in greater depth.
In Arizona, teen trolls recruited by Turning Point USA, a conservative nonprofit, were tasked with promoting on Twitter and Facebook the idea that Covid is a hoax and masks are unnecessary. When The Washington Post exposed the effort, the social-media platforms shut them down.
While it’s dismaying to learn how much is wrong with social media, it’s even more critical to understand why, with democracy under attack through disinformation and voter suppression. With information about the virus, it can literally be a matter of life and death.
Eric Newton, a professor of practice at ASU’s Cronkite School of Journalism, said that identifying the problem may help mobilize the public in the disinformation battle.
“Ninety-nine point nine percent of America doesn’t have a good sense of news literacy,” Newton said. “We have a crisis and we’re not at all prepared. National politics is bad enough, but it’s even worse on the local level, where we have news deserts.”
Newton said it’s “the world’s greatest paradox” that almost everyone has a handheld device giving them access to more information than they could ever digest, but they don’t have the training to find out what apps or knowledge they need to get the right information.
“One of (Gillmor’s) reviewers said (of Mediactive) … machines are being upgraded all the time, but the people haven’t been upgraded at all,” Newton said. “They take in false information, and now the country is so polarized and not hearing people out. It’s harder now for people to sit down and say we need to care about the entire nation or the entire community and how do we do that. Increasingly we have people who listen to only Fox or listen to Chris Matthews, and only that, or only read the local newspaper. It’s tribal …. There’s no place for people to get on the same page because there’s infinite page counts.”
In the case of the BLM counter-protesters, the Yavapai County Sheriff's Department and Prescott police are notified in advance of any protest, so contacting either would have brought accurate information. While someone posted on at least some social-media sites that the sheriff had not confirmed a protest, and the sheriff and Prescott police posted on Facebook that they were not aware of a protest, the armed men apparently decided they’d rather believe the rumor.
Brill said the counter-protesters were motivated by disinformation.
“It is safe to assume that the far-right protesters on the square are influenced by extreme-right, militant websites, which push the threat of the Antifa movement and paint all Black Lives Matter protesters as looters, pushing their socialist agenda to overthrow the rule of law and ruin our way of life. This was heard repeatedly during the protest.”
She noted that nine students organized the BLM protest, one of more than 50 progressive protests over the past four years, none of which has involved violence.
Toni Denis is a frequent contributor to 5enses.
Perspective by Abby Brill