Zuckerberg Looks To Clear Up Stance On Facebook, Fake News And The Holocaust

Facebook founder and CEO Mark Zuckerberg says, "I personally find Holocaust denial deeply offensive, and I absolutely didn't intend to defend the intent of people who deny that."
David Paul Morris/Bloomberg via Getty Images

Facebook CEO Mark Zuckerberg is clarifying remarks he made about whether his platform should remove content posted by Holocaust deniers, saying he wasn't defending them when he commented that it was hard to know their intentions. His initial comments set off intense criticism earlier this week.

"I personally find Holocaust denial deeply offensive, and I absolutely didn't intend to defend the intent of people who deny that," Zuckerberg wrote to Kara Swisher of Recode on Wednesday, the day after the site published a lengthy interview with the billionaire.

In the original interview, Swisher asked Zuckerberg about Facebook's policy of taking down fake news — something it has combated using a variety of approaches after fake news sources were found to have been used to manipulate voters in the 2016 presidential election.

Swisher used the Sandy Hook school massacre as an example, asking Zuckerberg why Facebook would allow an organization to post a conspiracy theory that claims the killings were staged.

Zuckerberg implied that Facebook would be quicker to take down harassment directed at a Sandy Hook victim than to remove a fake news story promoting the conspiracy theory. Then he offered up the example of the Holocaust.

"I'm Jewish, and there's a set of people who deny that the Holocaust happened," he said.

"I find that deeply offensive," Zuckerberg continued. "But at the end of the day, I don't believe that our platform should take that down because I think there are things that different people get wrong. I don't think that they're intentionally getting it wrong, but I think ..."

Swisher interrupted to say, "In the case of the Holocaust deniers, they might be, but go ahead."

Seeming to view the question as primarily one of free speech, the Facebook founder said, "I just don't think that it is the right thing to say, 'We're going to take someone off the platform if they get things wrong, even multiple times.' "

He noted that some people speaking in a public forum can simply get things wrong. But his critics said denying the Holocaust was far more dangerous and problematic — and that Zuckerberg's suggestion that some of those denials weren't made to mislead people was astounding.

"This is bonkers!" wrote Cale Weissman of Fast Company, after using profanity ("holy s***").

"This position is so bizarre, it's hard to know where to begin," writes Yair Rosenberg in The Atlantic.

The Anti-Defamation League said Facebook has "an obligation not to publish" falsehoods about the Holocaust.

In the Recode interview, Zuckerberg said that rather than taking down a fake news or conspiracy post or barring the user, the company would seek to minimize its distribution.

"Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services," Zuckerberg said. "If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in news feed."

Facebook would take a harder line, Zuckerberg said, if anyone published calls to violence or tried to organize any type of attacks.

In his clarifying email to Swisher, he concluded, "I believe that often the best way to fight offensive bad speech is with good speech."

The heated discussion over Zuckerberg's remarks is the latest in a string of debates over whether Facebook is simply a technological platform, or if it should best be seen — and see itself — as a media outlet. That question has grown more complicated as the tech giant spends more to attract and create programming.

In January, Facebook said it was changing how its influential News Feed works, giving prominence to news articles from "high quality" sources, and pushing down others, as NPR's Aarti Shahani reported.

In recent months, Facebook has reached deals with journalists such as CNN's Anderson Cooper and Fox News' Shepard Smith, who are creating shows specifically for the platform. Other news organizations are also involved in similar deals, which are meant to draw viewers to Facebook's Watch video section.

Over the past week, Facebook has also faced questions about how it handles news organizations, including their use of ads to boost their most high-profile projects.

People at at least two news outlets, KPBS in California and The Texas Tribune, have complained that Facebook nixed ads because they were deemed to be political. In the Tribune case, an ad asked young people what issues they care the most about, ahead of the November election. At KPBS, an attempt to promote an investigative story about migrant children appearing in court without their parents was barred.

Replying to complaints about the KPBS decision, Facebook product director Rob Leathern said the issue "applies only to running ads on the platform, and is fixable by completing the authorization process."

"If you are running ads in the U.S. about electoral or political issues, you will need to go through the authorizations process," he added. "This includes news organizations — who are designated separately in the ad archive where these ads are retained publicly."

The same standards apply to all advertisers, Leathern said in another tweet. That and other tweets were in response to a posting by Jean Guerrero, who wrote the KPBS investigative piece. The story, she said, was effectively being censored by Facebook.

In a separate thread centering on the Texas Tribune's attempt to place an ad, the Tribune's chief audience officer, Amanda Zamora, questioned why the ad was scrutinized so closely when "materially false and harmful" information passes through Facebook's news feed.

"Too bad efforts to engage readers in our journalism needs to be verified, when so much garbage fills my feed," Zamora wrote.

Days after Facebook announced its push to promote reliable news back in January, the company acknowledged the possibility that social media can have negative ramifications for democracy.

Samidh Chakrabarti, Facebook's civic engagement product manager, said the service was being "used in unforeseen ways with social repercussions that were never anticipated."

Copyright 2018 NPR. To see more, visit http://www.npr.org/.