One day during the spring 2020 semester, something changed in instructor Michelle Westervelt’s Reading, Writing, and Inquiry course at Indiana University (IU) Kokomo. Looking back on the class, peer mentor Noah Weathersby, a history and political science major, describes “an obvious shift in the tone of our discussion.” What he witnessed that day was “enlightening, personal, and deep.”
What was going on here?
The catalyst was an activity that called on students to read an article (nothing new there) but also to examine it with an understanding of framing effects. That is, the students were to consider how linguistic features, such as vocabulary, control the way readers interpret information conveyed through the media.
“Students really took a lot from that day as they began to realize the subtle and overt ways that media itself can be framed,” Weathersby explains.
Information, in other words, is not merely an objective package of reality to be transmitted impartially from author to reader but something far more complex and amorphous, subject to manipulation and interpretation.
“One of the most important things students picked up on was the use of pronouns—the ‘us-versus-them’ argument—and that led to a meaningful conversation on how easy it is to be manipulated by simple pronouns,” Westervelt says. The class went on to examine other language in the article—“open borders,” “limitless amnesty,” “feeble,” “self-defeating”—and, Westervelt recalls, “students were especially surprised by how easily they could fall for the framing tactic.”
This aspect of communication is nothing new to composition instructors like Westervelt, but, as her experience with her class demonstrates, it can be enlightening to college students and other news consumers. It is especially timely these days.
The rise of media manipulators—from bots to troll armies—has prompted Americans to worry out loud about the insidious effects of incorrect or biased information on politics, technology, society, and science in a post-truth era. Until recently, the consequences loomed in the future. In our current COVID-19 crisis, though, the effects can be immediate and deadly.
Weathersby and Westervelt, along with other peer mentors and instructors in first-year writing classes at IU Kokomo, are using a curriculum called Mind Over Chatter (MOC): Skills for Navigating the Post-Truth Era. The curriculum helps students recognize not only the subtle ways that media creators shape information but also the biases naturally present in news consumers. As an English professor and administrator, I collaborated with fellow English professor Paul Cook, psychologist Christina Downey, and librarian Polly Boruff-Jones to create this curriculum, which we believe can protect students and others from the dangerous consequences of consuming misleading media in today’s environment.
Funded by the top prize at the Rita Allen Foundation’s 2018 Misinformation Solutions Forum, the MOC curriculum features online modules that students can complete outside of class, as well as a guide that instructors and peer mentors can use in class to walk students through analysis and discussion. Instructors may deploy the modules and activities in a variety of ways. Typically, engaging with the curriculum requires about three hours of class time. Peer mentors like Weathersby undergo training, facilitate MOC learning activities alongside the same instructor over multiple semesters, and are paid for their time.
To illustrate how quickly information proliferates in today's media ecosystem, one need look no further than the pandemic. COVID-19 was mentioned more than 115 million times between January 1 and March 31, 2020, on social media and other online platforms, according to analysis by the nonprofit research institute RTI International. Mentions of the term spiked to 6.3 million on March 12, 2020, the day after the World Health Organization (WHO) declared COVID-19 a pandemic.
It’s easy to see how the current media landscape, coupled with official shelter-in-place guidance and self-imposed isolation, has enabled the spread of incorrect and misleading information about this worldwide event. Americans of all ages are not only sharing more information on the web and social media but also leaning more heavily on these online spaces to re-create a sense of community and belonging. What they are finding on social media, however, often lacks substance, says Jessa Lingel, an associate professor at the University of Pennsylvania who studies digital culture. “What I’ve noticed is that on three of the main platforms—YouTube, Instagram, and TikTok—there’s a lot of messaging, but not a lot of information,” Lingel told the Philadelphia Inquirer in April 2020. “There will be a meme on Instagram on how to best wash your hands, or a ten-second TikTok video on what social distancing really means, but on the negative side, someone will share a TikTok of empty shelves in stores, which creates panic buying and promotes bad behavior.”
The problem goes beyond a lack of substance, however. The dramatic spread of COVID-19 has given rise to widespread misinformation and disinformation. Even seasoned journalists have to work hard to avoid misinformation, a term for, among other things, inadvertent mistakes made in the reporting process. More troubling are the numerous examples of deliberate fabrications, or disinformation. The latter can be difficult to identify with certainty, as it’s not always clear where a claim originated. But any kind of unfounded speculation or conspiracy theory can be problematic if people start to believe or even act on it. In a February 2020 report, the WHO termed the proliferation of COVID-19 information—some valid, some not—an “infodemic.”
Consider, for example, the attempts to spread disinformation and confusion about the effectiveness of complying with the Centers for Disease Control and Prevention’s guidelines on masks, social distancing, and large gatherings. Or what about the spate of arson attacks on 5G wireless towers in the wake of a claim that the towers were facilitating the spread of the virus? Then there’s the misinformation and disinformation surrounding the safety and effectiveness of COVID-19 vaccines.
Many Americans are being “infected” by this information disease. For example, 29 percent of US adults polled by the Pew Research Center in March 2020 believed the virus was created in a lab either intentionally (23 percent) or unintentionally (6 percent). Nearly half of US adults held at least one misconception about COVID-19 in September 2020, according to a Kaiser Family Foundation Health Tracking Poll. This included one in five adults who believed wearing a mask to be harmful, and one in four adults (and half of Republicans) who believed hydroxychloroquine to be an effective treatment for COVID-19.
How can we protect our students and ourselves from the panoply of pollutants running amok in these information ecosystems?
Instead of taking aim at specific examples of misinformation and disinformation (the approach of many other interventions with similar aims), MOC teaches students to recognize the enemies within. Students learn how our minds’ innate and sometimes irrational tendencies in processing information—what psychologists call cognitive biases—can make us all susceptible to misinterpretation and deception. In engaging with MOC’s online modules, students identify and analyze different kinds of cognitive biases in video clips, excerpts from news articles, and hypothetical scenarios.
“These modules have influenced my schoolwork and even my everyday life,” explains Miranda Torres, a student who used the MOC curriculum in IU Kokomo instructor Martha Warner’s class. “I feel I am better informed now than before the modules.”
Prominent psychologist Scott Lilienfeld and colleagues, writing in the July 2009 issue of Perspectives on Psychological Science, identified widespread cognitive bias mitigation as one of the highest possible services that psychological science can offer society. This goal has driven our work on MOC as we face a media universe where reliable fact and sound interpretation vie for attention with misinformation and disinformation.
In one MOC exercise, students explore a cognitive bias known as confirmation bias—the tendency to process information in ways that confirm one’s existing beliefs. Students receive details about an imaginary crime and then play detective to consider a suspect’s possible guilt. By getting feedback on the choices they made at each stage of the exercise, students see how information they gathered early about the suspect can bias their later choices about how to investigate the case further. If students form early beliefs about the suspect’s innocence or guilt, their later choices show a tendency to confirm those beliefs rather than explore all possibilities fairly.
“After completing the confirmation bias scenario,” Westervelt says, “the majority of students commented specifically that when doing college research in the future, they will look for the opposing viewpoint early in the process to keep their own biases in check.”
MOC also teaches students to recognize another cognitive bias known as the mere-exposure effect—the well-documented tendency to automatically accept as true a piece of information that one has encountered many times. Usually, this tendency is highly adaptive; generally speaking, repeated encounters with a phenomenon attest to its reality. However, because anyone can freely publish material on the internet, especially on social media, someone who wishes to convince us of an untruth—about the source of COVID-19, remedies for it, or anything else—can simply flood information channels. Our natural tendency to accept pervasive information then takes over, leading us to mistake that untruth for fact. One of these disinformation purveyors can multiply exposure many times with help from bots (automated social media accounts), amplifiers (coordinated efforts to share specific social media posts), and superspreaders (social media influencers with more than one hundred thousand followers). The module on mere-exposure effect explains the evolutionary principle underlying the effect and provides activities designed to show students how they may be unknowingly experiencing it. A historical example shows how the tobacco industry used mere-exposure effect (among other techniques) to mislead the public about the dangers of its product.
Cameron Lee, a student who studied mere-exposure effect with Warner, wrote in a reflection that this module “really dug into how the overexposure of something can slowly make you believe it.”
In another module, which focuses on framing effects, students watch a video of the German magazine Auto Bild’s 2014 interview with entrepreneur Elon Musk and answer questions about the way Musk frames the conversation with his language. “If you look at, say, people like Bill Gates or Larry Ellison, Steve Jobs, these guys didn’t graduate from college,” he told Auto Bild, “but if you had a chance to hire them, of course, that would be a good idea.” Students are asked to pay attention to the second-person “you” and replace it with the first-person “I,” and then consider whether the message remains as convincing. Making such language choices more visible empowers students to take better control of their role as information consumers.
Another module explores the paradox of authority, the conundrum posed by the fact that we glean most of our knowledge of the world indirectly from authorities (experts, thought leaders, or other trusted figures) rather than from direct experience. This means that we must gain skills in fairly assessing the credibility of authorities, while also maintaining a willingness to place our trust in them when appropriate. Like all the other modules, the module on the paradox of authority offers students guidance on navigating the media landscape. It provides specific strategies, quoted below, that students can use to challenge the information coming from ostensible experts:
→ Hold a clear standard for what you consider “expertise,” and hold it consistently with all parties for different kinds of issues. What convinces you that a person is a true, knowledgeable, honest authority on a topic? Their educational level? Their personal experience? Evidence for their past sound judgment and good personal conduct? Whatever standard you hold for believing an authority, apply it fairly to all authorities.
→ Be a tough customer. You should learn to see your choice to believe others as a form of capital, just like cash, that you refuse to give away recklessly. Understand that you have no obligation to believe others without testing their claims! Instead, see your belief in an authority as a precious gift that can be generously given or vigorously earned—it’s not for free. This is probably your most important tool, which can keep you from falling prey to other people’s wishes or motives for you.
A final module offers concrete strategies for mindful media consumption and a chance for students to reflect on their information diets. In this module, students consider what kinds of information they consume online and how much time they spend engaging with media of all kinds, from watching Netflix, to mindlessly “doomscrolling” through Instagram, to reading a novel for class. By attending to the emotional resonance of online content, particularly on social media, and learning basic mindfulness techniques such as deep breathing and conscious reflection on information consumption, students begin to develop a mindful daily media habit: they reflect not only on where and how they spend their time online but also on the quality of their experiences with the information they encounter. This practice challenges students to think carefully about how they are spending what may be our most precious commodity in the era of digital distraction—our focused attention. (One of our team members studies the practice of developing a daily digital media habit as it pertains to political engagement.)
MOC also provides a zoomed-out view of the structure of digital media and social media. It introduces students to the concepts that shape and animate our experience of the modern web, including author and activist Eli Pariser’s notion of filter bubbles—those invisible algorithmic structures that personalize our content on everything from YouTube to Amazon to a simple Google search—and the concepts of digital distraction and “FOMO” (fear of missing out) that influence our anxious (over)dependence on our digital devices.
MOC is designed to be used as part of a freshman-level writing curriculum, but it would be suitable for any high school or college course where the instructor wishes to increase student awareness of how natural biases are activated in our free-for-all media environment. While we still have a limited empirical understanding of the modules’ short- and long-term effects on cognitive biases among students, instructors who have fully integrated these materials into their courses have reported marked improvements in student critical thinking and argumentation.
“The MOC modules have been effective because they present the multilayered concept of misinformation in a very accessible way,” Westervelt says. “They give students specific scenarios to draw from when making connections to their own exposure, biases, and habits.”
In a reflection on her own experience teaching with the MOC modules, Warner writes that “students who used all of the resources available benefited by increasing critical thinking and self-awareness.”
Available online through IU Expand and in the Canvas Commons, MOC is free to everyone. In the midst of a pandemic, reliable facts and sound interpretation of those facts are more important than ever. Eliminating misinformation from the online universe is not only practically impossible but also difficult to achieve without running afoul of the First Amendment. We can equip our students and ourselves, however, to be more mindful consumers of information by learning to identify our cognitive biases and overcome them.
The members of the Mind Over Chatter team contributed to this article, including Polly Boruff-Jones, dean emerita at Indiana University (IU) Kokomo and dean of university libraries at Oakland University in Rochester, Michigan; Paul Cook, associate professor of English at IU Kokomo, who teaches courses on digital media and has written extensively about the misinformation crisis and epistemological structure of the digital era; and Christina Downey, associate vice chancellor for academic affairs and professor of psychology at IU Kokomo.
Illustrations by Paul Spella