In the hours just after the massacre in Las Vegas, fake news started showing up on Google and Facebook. A man was falsely accused of being the shooter. His name bubbled up on Facebook emergency pages, and when you searched his name on Google, links to sites connecting him with the shooting topped the first page.
It appears to be another case of automation working so fast that humans can't keep pace. Unfortunately, these powerful tech companies continue to be a main destination for news and it's not clear how they can solve the problem.
In this particular case, the man's name first appeared on a message board on a site called 4chan. It is known as a gathering spot for underground hackers and the alt-right. Everyone who posts is anonymous. And we're not publishing the man's name because he's been through enough.
Shortly after the shooting, police announced that a woman named Marilou Danley was a person of interest. She had been living with the shooter in his Nevada home.
On a message board called /pol/ (Politically Incorrect), someone said her ex-husband was the shooter. His Facebook page indicated he was a liberal, and the far-right trolls on /pol/ went to work to spread the word.
Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official "safety check" page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, top searches linked to sites that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story.
So, why did parts of these hugely powerful companies continue to point to an innocent man?
Bill Hartzer, an expert on search, says Google is constantly crawling the Web and picking up new information as it appears. The innocent man went from having hardly anything online to having a whole bunch of stuff.
"Google has not had the time to really vet the search results yet," Hartzer says. "So what they'll do is they will show what they know about this particular name or this particular keyword."
In a statement, Google said the results should not have appeared, but the company will "continue to make algorithmic improvements to prevent this from happening in the future."
One improvement that Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. "In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that," says Sterling, a contributing editor at Search Engine Land. Similarly, he says, "if news sites ... were given some sort of preference in this context you might not have seen that."
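To make the idea concrete, here is a minimal, hypothetical sketch of the kind of source-weighted re-ranking Sterling describes. It does not reflect how Google's ranking actually works; the domains, scores and weights are invented for illustration only.

```python
# Hypothetical sketch: scale a page's raw relevance score by a
# per-domain credibility weight before ranking. All values are made up.

SOURCE_WEIGHT = {
    "apnews.com": 1.0,          # established news organization
    "local-paper.example": 0.9,
    "4chan.org": 0.1,           # anonymous message board, weighted down
}
DEFAULT_WEIGHT = 0.5            # unknown domains get a neutral weight

def rerank(results):
    """Order (domain, relevance_score) pairs by relevance times credibility.

    relevance_score stands in for whatever freshness/keyword signal
    surfaced the page in the first place.
    """
    def score(item):
        domain, relevance = item
        return relevance * SOURCE_WEIGHT.get(domain, DEFAULT_WEIGHT)
    return sorted(results, key=score, reverse=True)

# Right after a breaking event, a fresh 4chan thread may have the highest
# raw relevance, but the credibility weight pushes it below news coverage.
print(rerank([
    ("4chan.org", 0.95),
    ("apnews.com", 0.80),
    ("local-paper.example", 0.60),
]))
```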
Unfortunately, it seemed like Facebook was giving those same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. (Disclosure: Facebook pays NPR and other leading news organizations to produce live video streams that run on the site.)
But Sterling says part of the issue with having these companies determine what's news is that they're run by engineers. "For the most part the engineers and the people who are running Google search don't think like journalists," he says. "They think like engineers running a product that's very important."
And then there is the scale of what Google and Facebook do. They are huge, and that's only possible because computers do a lot of the work. Yochai Benkler, a law professor at Harvard, says that at such massive scale, even if humans were helping out, there would be mistakes.
Benkler says that even if Facebook and Google blocked sites like 4chan, it wouldn't solve the problem. "Tomorrow in another situation like this someone will find some other workaround," Benkler says. "It's not realistic to imagine perfect filtering in real time in moments of such crisis."
But for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort to the next person who lands in the crosshairs of fake news.