How Best to Study the Cognitive,
Social, and Technological Biases
that Make Us Vulnerable
to Misinformation?
A second panel addressed research challenges in
studying cognitive, social, and technological biases
that create vulnerabilities to misinformation. Stanford University historian Joel Breakstone cited a study showing that students struggle to assess the reliability of news stories (McGrew et al. 2017). In that study, students judged the veracity of the content they were shown primarily by an article's appearance rather than by considering its source. The
same study also found that students were unable to
distinguish native advertising from real news stories
80 percent of the time. Breakstone concluded that
the fight against misinformation is one in which we
all must take part, not just big tech companies.
Communications scholar R. Kelly Garrett (Ohio
State University) sought to clarify several points in
the discussion. First, he distinguished between hold-
ing a belief and being ignorant of the evidence
against it, citing statistics on the number of people
who know the scientific consensus on global warm-
ing but reject it (Funk and Kennedy 2016). However,
he noted that online partisan news leads people to
reject evidence, due to these outlets’ emotional pull
(Garrett, Weeks, and Neo 2016). While social media
increase the profile of misinformation, it remains
unclear how much this actually shifts public opinion.
Physicist Kristina Lerman (University of Southern
California) emphasized that humans have limited
information-processing capabilities, which makes it
impossible to keep up with the growing volume of
information, resulting in reliance on simple cognitive heuristics. These heuristics may in turn amplify
certain cognitive biases. Lerman reported on her studies of popularity, engagement, and position bias (Kang and Lerman 2015; Lerman and Hogg 2014). Position bias is the tendency to pay more attention to items at the top of a list or screen.
Experimental trials show that news stories at the top
of a list are four to five times more likely to be shared.
Lerman also discussed how social media reinforce
network biases, creating echo chambers that distort
our perceptions.
Table 1. Top Priorities for Each Topic.
After each panel, workshop participants proposed and ranked top priorities for the topic using a dynamic online head-to-head ranking system. This table records the top five results for each panel.
Survey of Participant Priorities by Panel
Panel 1: Defining and Detecting Misinformation
1. Identifying and promoting reliable information instead of focusing on disinformation
2. Tracking variants of debunked claims
3. Developing reputation scores for publishers
4. Creating an automated trustmark to promote journalistic integrity
5. Collecting reliable crowdsourcing signals
Panel 2: Cognitive, Social, and Technological Biases Creating Vulnerabilities to Misinformation
1. Investigating the use of language, images, and design in misinformation persuasiveness
2. Validating model predictions via field experiments
3. Studying the roles of algorithmic mechanisms in the spread of misinformation
4. Translating research findings into policy recommendations
5. Accessing behavioral data from social media platforms
Panel 3: Countermeasure Feasibility, Effectiveness, and Responsibility
1. Supporting and scaffolding critical thinking
2. Increasing the prominence and availability of fact-checking information
3. Designing trust and reputation standards for news sources and social media users
4. Building tools to track the provenance of digital content
5. Developing computational tools to support fact-checking
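The report does not specify how the head-to-head ranking system aggregated participants' pairwise choices into an overall ordering. As a minimal sketch only, the snippet below (with hypothetical item names and vote tallies) shows one common approach: a Copeland-style count of how many pairwise matchups each item wins.

```python
from collections import Counter

def rank_head_to_head(votes):
    """Rank items by the number of pairwise matchups won (Copeland-style).

    votes: dict mapping an unordered pair frozenset({a, b}) to a Counter
    recording how many participants preferred each item in that matchup.
    """
    wins = Counter()
    items = set()
    for pair, tallies in votes.items():
        a, b = tuple(pair)
        items.update(pair)
        if tallies[a] > tallies[b]:
            wins[a] += 1
        elif tallies[b] > tallies[a]:
            wins[b] += 1
        # a tied matchup awards no win to either item
    return sorted(items, key=lambda item: wins[item], reverse=True)

# Hypothetical tallies for three candidate priorities:
votes = {
    frozenset({"promote reliable info", "reputation scores"}):
        Counter({"promote reliable info": 9, "reputation scores": 4}),
    frozenset({"promote reliable info", "track debunked claims"}):
        Counter({"promote reliable info": 8, "track debunked claims": 5}),
    frozenset({"track debunked claims", "reputation scores"}):
        Counter({"track debunked claims": 7, "reputation scores": 6}),
}
print(rank_head_to_head(votes))
```

Here "promote reliable info" wins both of its matchups and so ranks first; many pairwise ranking tools use a similar tournament-style aggregation, though the workshop's actual system may have differed.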