When Facebook CEO Mark Zuckerberg testified before a joint Senate committee hearing on Wednesday, he led off with a mea culpa. Just a few paragraphs into his opening statement, he took personal responsibility for the disinformation:
"[I]t's clear now that we didn't do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I'm sorry. I started Facebook, I run it, and I'm responsible for what happens here."
After the 2016 election, many feared that fake news articles spread on Facebook swayed the results of the election. It's a broad but reasonable leap to make: many purveyors of fake news aimed to help Trump win, and lo and behold, Trump won.
But among people who study fake news, it's not at all clear how much — if at all — those articles swayed the election.
With that in mind, here's a look at several facts we do know about the role fake news played in the 2016 election. It's by no means an exhaustive review of all the studies done on fake news since the election, but it is a start at digging into the complicated factors at play here.
1) Social media heavily drove fake news
Social media plays a bigger role in bringing people to fake news sites than it plays in bringing them to real news sites. More than 40 percent of visits to 65 fake news sites come from social media, compared to around 10 percent of visits to 690 top US news sites, according to a 2017 study by researchers from NYU and Stanford.
And another study suggests Facebook was a major conduit for this news. The more people used Facebook, the more fake news they consumed, as Princeton's Andrew Guess, Dartmouth College's Brendan Nyhan, and the University of Exeter's Jason Reifler found.
That study also found that Facebook was "among the three previous sites visited by respondents in the prior 30 seconds for 22.1 percent of the articles from fake news websites we observe in our web data." But it was only in the prior sites visited for around 6 percent of real news articles.
2) Fake news had a wide reach
More than one-quarter of voting-age adults visited a fake news website supporting either Clinton or Trump in the final weeks of the 2016 campaign, according to estimates from Guess and his co-authors. That information was gleaned from a sample of more than 2,500 Americans' web traffic data, collected (with consent) during October and November 2016.
Some posts in particular spread especially far: In the months leading up to the election, the top 20 fake news stories had more shares, reactions, and comments on Facebook (8.7 million engagements) than the top 20 hard news stories (7.3 million engagements), according to a BuzzFeed analysis.
Importantly, this doesn't mean that fake news itself had a broader reach than hard news. Indeed, in either category, 20 stories are just a tiny slice of a gigantic universe of news stories.
"There is a long tail of stories on Facebook," a Facebook spokesman told BuzzFeed. "It may seem like the top stories get a lot of traction, but they represent a tiny fraction of the total."
It does, however, show on a basic level that millions of people interacted with these kinds of stories.
3) ...but it appears a small share of people read a large share of the fake news
Only an estimated 10 percent of Americans account for nearly 60 percent of visits to fake news sites, according to that study from Princeton's Guess and his co-authors. Not only that, but that 10 percent is the 10 percent of people with the "most conservative information diets."
That suggests that, at least as far as reading fake news articles goes, fake news may have served largely to influence already-decided voters. One could reasonably assume that those one-in-ten uber-conservative people who read the most fake news stories were unlikely to ever vote for Clinton.
Perhaps relatedly, Guess and his co-authors also found that fake news articles were heavily pro-Trump: People saw an average of 5.45 fake news articles during the month-and-a-half-long study, and 5.00 of those articles were pro-Trump. (But once again, averages can obscure extremes; a small share of heavy fake-news readers drove that average up.)
4) People are bad at remembering fake news (or, more precisely, they're good at misremembering it)
Sure, maybe one-quarter of Americans saw a fake news story...but did it stick? One early-2017 study cast doubt on this. In it, researchers from NYU and Stanford presented people with a series of fake news headlines, as well as a series of fake-fake news headlines (that is, headlines the researchers made up).
Fifteen percent of respondents said they recalled seeing the "real" fake news headlines, and 8 percent said they believed the headlines. But then, 14 percent said they remembered the fake fake news headlines, and 8 percent likewise said they believed those headlines.
That result could mean that a sizable chunk of Americans are so set in their beliefs that they are easily convinced of falsehoods, as the New York Times's Neil Irwin wrote:
"That's a strong indication about what is going on with consumers of fake news. It may be less that false information from dubious news sources is shaping their view of the world. Rather, some people (about 8 percent of the adult population, if we take the survey data at face value) are willing to believe anything that sounds plausible and fits their preconceptions about the heroes and villains in politics."
Indeed, the authors found that "Democrats and Republicans, respectively, are 17.2 and 14.7 percentage points more likely to believe ideologically aligned articles than they are to believe nonaligned articles."
That doesn't mean fake news swung the election; the authors are careful to say that their study doesn't show that. But it is further evidence of how susceptible people are to believing ideas they already want to believe.
5) Fake news studies have important caveats
OK, pretty much every study has an important caveat, and fake news studies are no different. Any time there's a headline saying that a study shows fake news did or did not sway the election, there's probably some sort of mitigating information to consider.
For example: That study from Guess and his co-authors was taken by many to have meant that fake news had "little impact." But it's more complicated than that.
The study found that a small number of people clicked on a lot of fake news stories, and that fake news stories are also a small fraction of most people's information diets.
But people encounter fake news in other ways. As Slate's Morten Bay pointed out, "The study had an important limitation: It looked only at Facebook users who actually clicked on one of the fake news links littering their news feeds during the election."
In other words, headlines whizzing past you on Facebook — not just the articles you end up clicking on — may well be affecting how you think about politics.
(And while some news coverage may have overstated the findings of the study, the authors themselves told Slate that they "did not measure how much fake news affected an individual's opinions about the election or whether fake news affected the outcome of the election.")
Likewise, in a recent study from Ohio State University, the authors say that their data "strongly suggest...that exposure to fake news did have a significant impact on voting decisions."
That study looked at survey responses from 585 people who said they had voted for Obama in 2012. The survey was conducted in December 2016 and January 2017.
Among other survey questions, the authors included three fake news statements that had been widely circulated during the 2016 campaign — two negative statements about Clinton and one positive statement about Trump.
They found that "belief in these fake news stories is very strongly linked to defection from the Democratic ticket by 2012 Obama voters."
The authors control for as many variables as possible (ideology, education, attitudes toward Trump and Clinton, social media usage), but importantly, they do not have direct evidence that respondents were exposed to these fake news stories before they voted.
And other researchers are skeptical as well. Dartmouth College political science professor Brendan Nyhan (also one of Guess's co-authors) noted that asking people about their beliefs after the election presents its own problems: "[C]orrelations with post-hoc self-reported beliefs [do not equal] evidence of causal effects for vote choice or turnout," he tweeted.
The study suggests that self-reported Obama-to-Trump voters could have been more susceptible to believing fake news stories. However, as the authors themselves write, there's no way, using this data, to prove that fake news caused some voters to swing from Obama to Trump.
6) There are many potential impacts of fake news that go well beyond determining the results of the 2016 election
Even if it's true that fake news didn't swing the 2016 election (and that's a big "if"), that doesn't mean fake news isn't still worrisome.
"It can confuse people, it can turn people off from politics — it can have a lot of negative effects that we're only beginning to understand," Guess said in an interview.
For example, it's still troubling if fake news convinces people at the extreme liberal or conservative end of the spectrum of things that aren't true — even if it doesn't change their votes.
And there is evidence that fake news is effective at changing beliefs. One 2017 study from researchers at Yale University found that the more people were exposed to a given fake news statement, the more they believed it.
That's good news for fake news writers and the creators of Russian bots and hypothetical 400-lb. hackers in New Jersey. If it's true that showing people the same headline multiple times makes them believe it, all fake news purveyors need to do is be persistent — and hope that they continue to have platforms like Facebook for posting the things they make up.
Copyright 2018 NPR. To see more, visit http://www.npr.org/.