A new feature on the social media platform X reveals some influential political accounts are actually foreign actors. The revelation is more evidence of outside efforts to fan division and spread disinformation in the United States.
KPBS’ Amita Sharma recently spoke to Imran Ahmed, the founder and CEO of the Center for Countering Digital Hate, about the role of social media in America's polarization and what each of us can do about it.
Below is the full interview, edited for clarity.
Some of the world’s largest tech companies are American-owned. Many people blame these companies’ algorithms for intensifying political and cultural polarization. But policy experts tell me they don’t foresee regulation to address this problem anytime soon. What are you doing to tamp down the spread of disinformation and hate in the online space?
Ahmed: The first is that, clearly, there is free speech, and people have the right to hold opinions no matter how disgusting, and to express those opinions on social media. The problem that we have is that social media distorts the lens through which we see the world. A very good example of that is around the assassination of Charlie Kirk last year, when we found that only 1.6% — so less than 1 in 50 comments — were people either calling for retaliatory violence or celebrating his death. Which means that 98% — the vast, vast majority of comments — were people just saying, "This is horrible. I don't want this to happen to anyone," a fairly normal opinion. But on social media, it felt at the time like everyone who was being amplified was saying something horrific. So, the first thing that we do is, we want mandatory transparency of these platforms. We want to have enhanced access to data so that we can actually tell people facts like that, which help to inoculate people against the distortive lens of social media. So, transparency, first of all. And second of all, accountability where they lead to real harm. Sometimes these things are innocuous, or they are in the realm of free speech. But where they lead to, say, violence, or to misinformation about health care, which can mean people take actions that harm their lives, may even kill them, we want people to be able to hold the platforms accountable if they're negligent in spreading lies that actually cause real-world human harm.
How successful has your center been in countering digital hate?
Ahmed: We've managed to persuade lawmakers in the UK and the EU to actually put in place statutory transparency for these platforms, to create mechanisms for holding them accountable if they harm kids or society more generally, if they lead to riots, as they did in the UK, when X played a significant role in race riots there by spreading an absolute lie that the person who stabbed three young girls in Southport, near my hometown of Manchester, was an asylum seeker, when he was actually a homegrown man who was very, very disturbed. So we've managed to do that. And in the U.S., we've been working with lawmakers across the political spectrum who care about the impact on kids in particular, but also want to have greater transparency of how these platforms work. Right now, for the most part, they are a black box. We don't really know how they are distorting the lens and whose agenda they are serving when they do that. One of the things that we're focusing on here is putting into place those real mechanisms for transparency, but also for reform of the laws in the U.S.
I think most people would be staggered to understand this, but social media companies have a special get-out-of-jail free card when it comes to causing harm to other people. Any business that harms a consumer, that consumer can take legal action against them. You can't sue a social media company if they harm you or your kids. And we're trying to make sure that the get-out-of-jail-free card is taken away from these companies who, frankly, without checks and balances, have become a little bit tyrannical, very arrogant, and completely dismissive of their impact on our society.
You reported last year that Instagram, in the majority of cases, took no action against people who posted toxic content against women politicians, even when those posts violated the platform's own rules. Do you have any updates on that?
Ahmed: It was a really disturbing study to show that women in politics were receiving enormous numbers of hateful comments. And this is not a partisan issue. Actually, the worst affected, the person who received the most abuse, was Republican Congresswoman Marjorie Taylor Greene, who recently announced that she's stepping down, in particular as a result of the abuse she receives online. And we found that most of the time, in fact over 90% of the time, when we reported these comments to the platform Instagram, they took absolutely no action. Since then, the truth is that Instagram has made no real progress in enhancing its ability to both detect and deal with real hate: threats of rape, threats of murder, threats to harm people's families, harassment and stalking. The truth is that what's happened since that time is that Mark Zuckerberg announced that he was actually getting rid of a lot of the enforcement that they had for their rules, for their community standards. And we found that things have actually degraded on that platform since our report came out.
Imran, the social media platform X now shows an account's location, the number of times its username has changed and when it joined the platform. What is your take on this new tool?
Ahmed: Look, these people are publishers. They decide whose comments and whose posts get the most visibility and whose don't. They get to define, therefore, reality for millions of people. And it is wrong that they should be able to make those publishing decisions, those editorial decisions, even if they're done by an (artificial intelligence) algorithm without any scrutiny whatsoever, without us even knowing how they are seeking to shape the agenda to their own commercial interests. And so what we're looking for is real transparency. You don't get to decide, first of all, the syllabus and then get to grade your own homework. You need someone independent to do that. And really, the people that should be doing that are Congress because they are the representatives of the will of the people.
Just to be clear, you're saying it is a simulation of transparency because X itself is publishing this information and not some objective third party?
Ahmed: First of all, they've decided what they're going to make transparent, and there's no way of auditing the effectiveness of that transparency. In the very short time that this transparency drive has been there, they've already come back and said, "Oops, it doesn't work very well, so don't take it too seriously." That's not real transparency. That is a simulation of transparency, a desperate attempt to make themselves part of the conversation around transparency, when in reality, what we want to understand is how their algorithms work, how their content enforcement decisions work, how the advertising reshapes what we see on those platforms. And that's the minimum that consumers would demand of them. What's less interesting is where you think someone might be based. It's a cute little trick, but it's not real transparency.
Even so, were there any significant revelations from this new feature for you?
Ahmed: I mean, is it a revelation that foreign bad actors seek to influence our politics and that social media platforms have played an integral role in giving them access to our body politic, to our political discourse, allowing them to reshape it, to pervert it, to corrupt it, for their own malignant endeavors? No, that's not real transparency. We all knew that. Now, Elon Musk, when he bought the platform X, said that he would get rid of the bots. The only thing that the feature has made transparent is that he's utterly failed in that task.
What role did foreign governments play in disseminating false information after Charlie Kirk's murder?
Ahmed: The truth is, we simply don't know, because the only people who can tell us the IP addresses of those who are repeatedly posting, or where they suspect multiple accounts are being run by the same government propaganda agency, are the platforms themselves. And so it's crucial that they give access. They've already said that their transparency may not work that effectively. It's time to ensure that independent academic researchers and civil society bodies like my own are able to take the raw data and study it in-depth, because we are the ones who really care about our societies. And the way these platforms have behaved has made it really clear to all of America, all of the world, that they don't really care about us, the normal people.
Given that regulation of social media's algorithms may not be in our near future, and the limitations of your work and other people's work without that raw data, what can people, regular citizens, do to constrain the divisive effect of social media?
Ahmed: The best way to make sure that social media algorithms aren't reprogramming your understanding of what people are saying and what is normal in society is to put down your phone and go and talk to your fellow citizens, talk to other people, because the truth is that social media is inherently, by design, distortive. It seeks to present a world that is completely different to the real world that you live in for only one purpose, which is to keep you addicted, to keep you scared, to keep you emotionally affected so you stay on that platform. They are incredibly unhealthy places to find information and to understand the world around you. And it's better to actually go out and have a conversation. The kids, Gen Z, call it touching grass. I think it's time that we all put down our phones and touch a bit of grass.