Which of these statements seems more trustworthy to you?
1) Americans are drowning in a tsunami of ignorance! There is a conspiracy at the highest levels to replace all knowledge with propaganda and disinformation.
2) A recent Stanford University report found that more than 80 percent of middle schoolers didn't understand that the phrase "sponsored content" meant "advertising."
For most of the NPR audience, this shouldn't be a tough question. The first statement is a florid piece of opinion dressed up as fact, resting on an unverifiable, overgeneralized, ideological claim ("conspiracy at the highest levels").
The second is more measured in tone and limited in scope. And it links straight to the original source: a press release from a reputable university.
But these days, statements of all stripes are bombarding us via broadcast and social media. The trick is classifying them correctly before we swallow them ourselves, much less before we hit "Like," "Share" or "Retweet."
And that is the goal of an educational initiative that will be adopted by 10 universities across the country this spring.
Thinking like fact-checkers
This new approach seeks to get students thinking like, and doing the work of, fact-checkers.
"We have approached media literacy and news literacy in the past sort of like rhetoricians," says Mike Caulfield, director of blended and networked learning at Washington State University in Vancouver. (Can that be right? A public university based in the United States with a campus in Canada? No, it's Vancouver, Wash.)
In other words, he explains, we teach students close reading and analysis of elements, like tone. "Fact-checkers," on the other hand, "get to the truth of an issue in 60 to 90 seconds."
He says fact-checkers read laterally — moving quickly away from the original text, opening up a series of tabs in a browser to judge the credibility of its author and the sources it cites.
A new working paper, by the same Stanford researchers cited earlier, provides support for this proposition.
They pitted professional fact-checkers against historians and undergraduates. When evaluating websites and searching for information online, the researchers said, "fact-checkers arrived at more warranted conclusions in a fraction of the time."
Four moves and a habit
Caulfield has distilled this approach into what he calls "Four moves and a habit," in a free online textbook that he has published. It's aimed at college students, but frankly it's relevant to everyone.
The moves are:
- Check for previous work: Look around to see whether someone else has already fact-checked the claim or provided a synthesis of research. [Some places to look: Wikipedia, Snopes, PolitiFact and NPR's own Fact Check website.]
- Go upstream to the source: Most Web content is not original. Get to the original source to understand the trustworthiness of the information. Is it a reputable scientific journal? Is there an original news media account from a well-known outlet? If that is not immediately apparent, move on to the next step and read laterally.
- Read laterally: Once you get to the source of a claim, read what other people say about the source (publication, author, etc.). The truth is in the network.
- Circle back: If you get lost or hit dead ends or find yourself going down a rabbit hole, back up and start over.
Finally, Caulfield argues in his book that one of the most important weapons of fact-checking comes from inside the reader: "When you feel strong emotion — happiness, anger, pride, vindication — and that emotion pushes you to share a 'fact' with others, STOP."
His reasoning: Anything that appeals directly to the "lizard brain" is designed to short-circuit our critical thinking. And these kinds of appeals are very often created by active agents of deception.
"We try to convince students to use strong emotions as the mental trigger" for the fact-checking habit, he says.
Caulfield is also the director of the Digital Polarization Initiative of the American Association of State Colleges and Universities' American Democracy Project. Starting this spring, the initiative will bring at least 10 universities together to promote web literacy. Each will adopt Caulfield's e-book as a text across several courses in different disciplines, from history to science to journalism. Students will fact-check, annotate and provide context to news stories that show up in social media feeds.
Their efforts will be published, in the hope of helping others get to the truth a little faster. Some of their initial work is already available online.
For example, last spring, students at Western Kentucky University took up the question "Are the protesters against Trump being paid to protest?" The students traced the claim back to a tweet by an Austin, Texas, resident, who later retracted it. But the retraction, they found, received much less notice than the original, baseless statement.
The speed of a lie
I asked Caulfield whether the protester example points to a problem with relying on education to combat hoaxes in the first place. He's asking students to take "90 seconds to two minutes" to confirm something before passing it along.
That is certainly faster than other media literacy methods. Half-truths and misstatements, meanwhile, spread entirely unchecked.
That's the meaning behind the maxim "A lie can travel halfway around the world before the truth can get its boots on." (Who said this? Probably not Mark Twain. Jonathan Swift said something similar, according to one quote-tracking site.)
Caulfield argues education does have a role, alongside changes like better algorithms on social networks and tighter regulations, in improving the information landscape. Plus, for him, this kind of work answers a philosophical question: "What do we want education for citizens to look like in a networked world?" He uses an ecological metaphor to explain what he means:
"If you have a couple of people in a large group who identify as scientists, humanists, activists, historians, you build up a herd immunity" to falsehoods, he says. For example, a study by Facebook showed that when someone references Snopes in the comments of a Facebook share, the original sharer is 4.4 times more likely to delete his or her post. And that is the truth ... as far as I know.
Copyright 2017 NPR. To see more, visit http://www.npr.org/.