
2022 will be a tense year for Facebook and social apps. Here are 4 reasons why

Social media companies face scrutiny from regulators, lawmakers and users over everything from misinformation to teen mental health to election security.

For social media companies, the tumult began in early January 2021. The attack on the U.S. Capitol led Twitter, Facebook and YouTube to kick off then-President Donald Trump. Throughout the year, they were challenged to stop the spread of baseless claims about the 2020 presidential election, as well as harmful vaccine misinformation.

Facebook had to respond to a whistleblower's revelations, just when it wanted to turn everyone's attention to the "metaverse." Twitter's eccentric CEO abruptly left, handing the company, as well as its ambitions to create a new version of social media, over to a little-known deputy. The Trump administration's attempt to ban TikTok over national security concerns fizzled, allowing the Chinese-owned app to cement its hold as the defining driver of youth culture.

It's fair to say social media apps were at the center of politics and society in 2021, and not always for the better. And yet, many thrived financially, reporting record profits.

So what will 2022 bring? Here are four areas to watch this year.

Lawmakers say they want to regulate Silicon Valley. Can they agree on what that means?

If members of Congress agree on one thing, it's that the tech giants are too big and too powerful. (On Monday, Apple became the first publicly traded company to be worth $3 trillion.)

But the agreement stops there. Democrats want laws that force tech companies to take down more harmful content. Republicans say the platforms censor conservative views, despite evidence showing that right-wing content and figures thrive on social media.

The best chance for bipartisanship may come from a Senate Commerce subcommittee led by Sens. Richard Blumenthal, D-Conn., and Marsha Blackburn, R-Tenn. They say they want to work together, particularly when it comes to protecting kids and teens online. (Both have decried what Instagram shows to younger users after having their staffs make fake accounts on the photo-sharing platform.)

Lawmakers have introduced a slew of bills targeting tech giants, from holding social media platforms responsible for health misinformation to requiring companies to open up more data to outside researchers to updating the two-decade-old children's privacy law. They also want to beef up competition law and give more firepower to the Federal Trade Commission and Justice Department, which regulate big tech.

The question is, will any of these bills become law in 2022?

While Washington stalls, Europe is moving quickly to counter Big Tech

European regulators have been more willing to confront tech giants, perhaps because they have little interest in protecting U.S. supremacy in the sector and because many Europeans are more comfortable with government intervention to protect citizens.

The European Union is writing strict new rules that would prevent big tech companies from giving preference to their own products and services, such as Amazon pushing people toward items it sells over those from third-party vendors. The rules would also force companies to crack down more on harmful content, such as child sex abuse and terrorism, and give users more control over how their data is used to target ads.

The U.K. recently set new standards for how apps should be designed for kids, including providing parental controls, turning off location tracking and limiting what data they collect. Companies including Instagram, TikTok and YouTube are already making changes to comply. For these global companies, it's often easier to implement new rules universally than try to enforce a hodgepodge of different policies for users in different countries. Regulations passed in Europe could affect users far beyond the continent.

Companies will struggle with how their apps affect kids' mental health and safety

Reports that Instagram was building a version of its photo-sharing app for kids under 13 drew criticism from parents and regulators and bipartisan outrage in Congress. The outcry was magnified by internal research leaked by Facebook whistleblower Frances Haugen revealing that Instagram knows its platform is toxic for some teen girls.

Under pressure, Instagram paused work on the kids' app in September, but Instagram head Adam Mosseri made clear to Congress in December that the company still plans to pursue the project. He says kids are already online, so it would be better if they used a version of Instagram with parental controls.

All social media platforms, from Instagram to TikTok and Snapchat, will wrestle with this in 2022. Kids and teens are a critical demographic, one that is essential to the companies' growth.

Midterm elections are coming. What will Facebook, TikTok and the rest do about misinformation?

The companies say they've learned a lot from dealing with adversaries ranging from Russian trolls and Chinese influence operations to elected officials spreading disinformation and companies selling spying as a service.

But the challenges to their platforms keep evolving too. Using social media to sow discord, undermine authoritative information and circulate rumors and lies is now a tactic used by anti-vaccine activists, far-right extremists and climate change deniers. So in 2022, you can expect elected officials and candidates to continue to spread misinformation online.

Pressure is already on social media companies to ramp up resources before the campaigning truly gets underway. Some lawmakers are eager to pass laws pressuring the companies to do more to stop the spread of harmful or false content, but those dictates could run up against the tech platforms' First Amendment rights.

Meanwhile, executives like Meta (formerly Facebook) CEO Mark Zuckerberg have made clear they do not want to be the arbiters of what people can say online. And do people really want to give these unelected corporate leaders that power? We'll be watching for the answer to that question in 2022.

Editor's note: Amazon, Apple and Google are among NPR's financial supporters. Meta pays NPR to license NPR content.

Copyright 2022 NPR. To see more, visit https://www.npr.org.