Trends and Issues in Higher Ed

May 29, 2018

Understanding fake news and misinformation

A spring mini lecture series offered students, faculty, staff and the community new ways of understanding and combating fake news and misinformation.

It’s hard enough to understand the problems of fake news and misinformation, and the complex factors that have made them primary issues facing our society, let alone how to combat them.

UW students at Jevin West’s lecture, “Cleaning Up Our Polluted Information Environments.” Photo credit: Shantelle Xiu

“As a community, UW thrives in our access to all types of information — we are constantly producing and consuming information. This lecture series reminds us as a community to protect our integrity as information producers, and as ethical information consumers.”

– Tim Tiasevanakul, class of 2020, Law, Societies and Justice major

Enter the experts: Kate Starbird, UW assistant professor in Human-Centered Design and Engineering; Jevin West, UW assistant professor in the Information School; and Berit Anderson, CEO and Editor-in-Chief of the media company Scout.ai. All three have extensive experience working in the information trenches, in a variety of ways. Starbird and her team of UW students have been researching social media platforms for years to better understand how rumors, “alternate narratives” and conspiracy theories spread after crisis events. West is a data scientist who co-developed the wildly successful UW course, “Calling Bullshit: Data Reasoning in a Digital World.” And Anderson, a technology journalist and former editor of Crosscut Public Media, started her company with the goal to “democratize technology,” even as fake news undermines democracy.

The series, which was sponsored by the Office of the Provost and ran through April 2018 on the Seattle campus, attracted students, faculty and staff from across the UW, as well as members of the local community. It demonstrated how hungry we are, as a community and as global citizens, for ways of understanding and tackling these problems — and offered us knowledge and strategies from three engaging, expert perspectives. All three lectures are now available to watch online in full.

Kate Starbird: Understanding the “muddied waters”

Kate Starbird, UW assistant professor in Human-Centered Design and Engineering

As Kate Starbird told us on April 18, “We are all targets of disinformation, meant to erode our trust in democracy and divide us.” This doesn’t mean we have to be completely vulnerable — but for Starbird, it’s crucial that we understand why we’re as vulnerable as we are.

Starbird’s lecture, “Muddied Waters: Online Disinformation During Crisis Events,” opened with a discussion of what makes human beings susceptible to disinformation and political propaganda. In general, says Starbird, our unconscious cognitive biases lead us to accept stories, real or fake, as “true” when they confirm our pre-existing beliefs. And technology companies can target those biases: social media algorithms, for example, create “filter bubbles” by showing us, and getting us to click on, what we already want to see. On each side of the political divide, it has become increasingly possible to encounter only the information we want or expect to see, making the divide ever greater.

“If it makes you feel outraged against the other side, probably someone is manipulating you.”

– Kate Starbird

Starbird and her students have been analyzing the marked increase in the spread of disinformation on social media: information disseminated with the intent to confuse, on both the left and the right of the political spectrum. They study Twitter, for example, to see how disinformation follows crisis events such as school shootings, and how those rumors “muddy the waters,” casting doubt on the credibility of those who experienced the crisis, and even on the crisis itself. “Disinformation is very effective,” she says, at muddling our thinking. And when we, as consumers of information, feel muddled, we give up trying to understand and, worse, are confused into inaction.

Jevin West: Combating information “polluters”

Jevin West, UW assistant professor in the Information School

Jevin West, in his April 24 lecture “Cleaning Up Our Polluted Information Environments,” discussed the problem as largely one of “untrained editors.” Where we once relied on professional journalists, editors and fact-checkers to serve as information “gatekeepers,” we can all now share information widely, but without clear ways to evaluate credibility. In addition, West said, studies have shown that falsehoods travel faster than truths, and it takes far less energy to create bad information than it does to refute it after the fact.

West displayed examples of fake news in its everyday forms, from the more benign (flat-earth conspiracy theories) to the more dangerous (false medical information), and explained how new technologies will make it increasingly hard to tell what’s real from what’s fake. He also showed the audience how easy it is to misrepresent information through statistics, and to create graphs that can seem to say anything. We’re especially vulnerable when it comes to numbers, he says, because “numbers carry authority,” and it can be very hard to tell when they are presented out of context.

For West, fake news and misinformation pose an “existential threat.” “I think it’s the most serious issue we’re dealing with in society right now,” he says.

Berit Anderson: Tracing the rise of “AI Propaganda”

Berit Anderson, CEO and Editor-in-Chief of the media company Scout.ai

Following the 2016 U.S. presidential election, Berit Anderson and her colleagues at Scout began investigating the causes of what seemed like surprising results in key swing states. What they found, she says, is that “there’s a new global electioneering platform built on the backs of the tech industry.” In early 2017, they published an article entitled “The Rise of the Weaponized AI Propaganda Machine,” detailing how automated propaganda networks can influence elections around the world. The article garnered attention and support from some of the biggest players in the industry, including Google.

Anderson’s lecture on April 30, “The New Global Politics of Weaponized AI Propaganda,” outlined the steps that have allowed Russia, in particular, to target elections in the U.S. and Europe. One is to “Turn voters into disinformation agents” via companies such as Facebook, which allowed advertisers to target voters in swing states by zip code and income. Another is to “Deploy the bot armies”: that is, to create vast networks of fake social profiles that spew politically divisive, often faked “news.” In some cases, the same bot accounts have been deployed in different parts of the world in support of different agendas, from the Arab Spring to Brexit to the 2016 U.S. presidential election.

Looking ahead to solutions

The series as a whole provided powerful strategies for combating these problems; many of these strategies overlapped across the three talks.

“A big take-away from all three lectures was that I have a responsibility as an individual to be more self-aware and a better consumer of information. I walked away feeling empowered.”

– Katie Harper, graduate student in Library and Information Science

All three discussed how new policies and better regulation of tech companies (or “pollution facilitators,” in West’s terms) can help. For example, the new E.U. General Data Protection Regulation (GDPR) will hold Facebook and other companies accountable for clearing users’ histories when requested, making it more difficult for “polluters” to exploit personal information. Anderson emphasized how new policies can help create change internally at tech companies as well. Facebook has taken a “big step,” she says, by committing to require advertisers to label political ads as such and to provide “paid for by” information. Somehow, we need to “restore trust in our information systems,” says Starbird, and that means that tech companies need to be more transparent and trustworthy.

At the same time, all three emphasized the ways in which individuals can help — by becoming more conscious information consumers and producers. A good rule of thumb, says West, is to “think more, share less”: to become more thoughtful and selective about the information we share, thus becoming better “editors.” We should also pay attention to how information affects us emotionally and engages our biases, says Starbird. “If it makes you feel outraged against the other side,” she says, “probably someone is manipulating you.” “Don’t become a cog in the outrage machine,” echoes Anderson. “The most important thing you can do as an individual is not to let yourself become angry at people with different political views,” she says.

“It was so interesting to learn about the fake news and misinformation that we might encounter every day through popular media. This series as a whole is valuable to the UW community because it spreads awareness about misinformation, and increases our knowledge of ‘fake news’ and credible sources in media and otherwise.”

– Selah Lile, class of 2021, pursuing double major in Psychology and Spanish

For West, the most powerful solution lies in education, especially for younger generations as they learn information literacy. “The biggest thing that we can do is arm the consumer,” he says, with education. Requiring media literacy as part of grade school curricula, for example, could go a long way toward creating a more critical and savvy consumer public. (In 2017, Washington state enacted a law to do just that.) Starbird and Anderson agree. We have to understand how online media works: “how it affects our lives, our economy and our global politics,” says Anderson.

The speakers suggested other ways to address the problems as well, including West’s call to make better use of existing resources such as reliable fact-checking organizations, and Anderson’s suggestion to reach out to individual developers we might know at tech companies here in Seattle, to work toward change together.

Anderson says she’s optimistic: “I’ve spoken to politicians in Europe and also here in the U.S., and I’ve found that people are very motivated to find a fix for this,” she says. “It’s a time to be bold and stand up for the things we believe in.”