As Election Day nears, Facebook and Twitter find themselves in a Catch-22, caught between conservatives who accuse them of stifling free speech when they block or put warning labels on questionable posts, and others who say the companies need to do more to stamp out online misinformation and conspiracy theories.
"Nineteen days before voting closes in the 2020 election, I believe we are more vulnerable to online disinformation from both foreign and domestic sources than ever before," Nina Jankowicz, a disinformation expert at the Wilson Center, told members of the House Permanent Select Committee on Intelligence during a virtual hearing on Thursday.
Pointing to the group that plotted to kidnap Michigan Gov. Gretchen Whitmer, Jankowicz cautioned lawmakers that what happens online can have real-world consequences, including violence.
"The social media platforms played a huge role in allowing that group to organize. It seeded the information that led them to organize," she said. Other false messages, she said, were often aimed at women or minorities, with the goal of keeping them from participating in democracy and even from public life.
"As a woman who has been getting harassed a lot lately with sexual and gender disinformation, I am very acutely aware of how those threats that are online can transfer to real world violence," Jankowicz added.
Republicans chose not to attend the hearing. Democrats largely agreed with the witnesses that misinformation, especially around the election and the coronavirus pandemic, poses a real danger. Yet Rep. Jim Himes, D-Conn., said he was deeply uncomfortable with the idea that either the government or private companies should "fight disinformation."
"How do we get in the business of figuring out who should define what 'misinformation' or 'disinformation' is?" he asked. "That strikes me as hard."
Melanie Smith, head of analysis at Graphika, Inc., a social network mapping and analysis company, said social media platforms could tweak their algorithms to curb the spread of misinformation.
This month, Facebook, Twitter and YouTube announced a steady stream of policies to block or limit the spread of conspiracy theories and other potentially harmful misinformation. Just this week, Facebook rolled out a ban on messages that deny the Holocaust happened, and on ads that discourage vaccinations. On Wednesday, YouTube banned videos that promote the unfounded QAnon conspiracy theory and other hoaxes that target individuals and could lead to violence. Facebook and Twitter had already established similar policies.
With each move, however, the social media platforms have raised the ire of conservatives who have long accused the companies of treating them unfairly. The anger reached a new peak this week, with President Trump and his allies lashing out about Twitter and Facebook's decisions to block or limit the spread of an unverified story about Hunter Biden, the son of the Democratic presidential nominee.
"We've seen Twitter and Facebook actively interfering in this election in a way that has no precedent in the history of our country," Sen. Ted Cruz, R-Texas, said Thursday.
Republicans have renewed calls to overhaul Section 230, the law that protects tech companies from liability for what users post. The companies, they argue, are acting more like publishers than impartial platforms. A hearing, at which the CEOs of Twitter, Facebook and Google are set to appear, is scheduled for just days before Election Day.
At the House hearing, however, the expert witnesses cautioned against repealing Section 230 without a clear replacement waiting in the wings.
"Social media provides an enormous public good in terms of the ways in which people communicate with each other," said Joan Donovan, research director at the Harvard Kennedy School's Shorenstein Center. "It's the features that are becoming the problem: the way in which information is sorted and, by and large, the way in which people can 'pay to play'; they can pay to push their information or 'news' across these platforms."
Regulations, she said, should require social media platforms to reveal more information to the public about the messages posted, including who is behind them.
Adedayo Akala is an intern on the NPR Business Desk. NPR's Shannon Bond contributed to this report.
Copyright 2020 NPR. To see more, visit https://www.npr.org.