
Infinite scrolling on trial: Social media and kids

 April 1, 2026 at 3:07 PM PDT

S1: Hey there, San Diego, it's Andrew Bracken in for Jade Hindman. On today's show, we take a look into the ways social media keeps us scrolling and scrolling and scrolling. This is KPBS Midday Edition. Connecting our communities through conversation. A jury last week found Silicon Valley giants Meta and Google caused harm to a woman who'd been using social media starting at the age of six. Both of those companies said they're weighing legal options to this multimillion-dollar judgment. It was a very closely watched trial. Many similar suits are making their way through the courts, but the judgment highlighted how social media platforms rely on certain core features that many say are addictive and harmful for young minds. Kristen Vaccaro joins me now. She's an associate professor of computer science and engineering at UC San Diego. Kristen, welcome to Midday Edition.

S2: Thanks for having me.

S1: So good to have you here. I want to talk more about this case, and really dig into a little bit of what's under the hood of these social media platforms. But I'm just wondering what your initial reaction was to the case and what you've learned about it.

S2: Frankly, I was a little surprised that they were able to make this case. Section 230 has historically provided a lot of protections for social media platforms in terms of content. But this case was framed a little bit differently.

S1: Right. And that's what I want to dig into: these design features we hear about. At the core of it is this idea we hear termed problematic use of social media, also dissociative use of social media. Can you explain what those things are? Because they're at the core of this and many other cases as well.

S2: Sure. So you'll often hear people talk about this experience where they sort of wake up and realize they're an hour into watching TikTok short-form videos, and they didn't feel like they were making a conscious choice about that browsing or consumption. That experience is often termed dissociative use, because you're not really aware of, or making conscious choices about, your consumption.

S1: So you're just like on autopilot, basically.

S2: Right, exactly. Maybe it's out of habit, or, you know, they're very good at populating your newsfeed with things that you're likely to enjoy and watch. And so you're kind of looking for that little bit of gold.

S1: Are there certain, I don't know, types of social media, or maybe specific features, that lead to that more dissociative use?

S2: So people talk about this a lot with short-form videos. So things like TikTok videos, YouTube Shorts, Instagram Reels: those have a reputation that people will get on and then lose track of time. It's also something people have shown in the past, that people might log on to a social media platform because it offers a lot of features. So you might log on initially intending to do something like send a direct message to your friends or family, but it's right there, the next tab over, and you're suddenly in the Explore tab or Instagram Reels, and you've lost track of time.

S1: Yeah. And you point to research that shows we often use social media in ways we may not intend, right?

S2: Right. So there's a really interesting project that came out recently from the University of Washington, where they had basically gotten people to download spyware that was taking screenshots of their mobile phone use all day long. And what they found is that people expressed the most regret, they were most unhappy with what they ended up doing, when they intended to do some kind of direct communication and then ended up in these newsfeeds.

S1: Mhm. So that's exactly what you're talking about. Like I go to Instagram maybe just to send a message to somebody, and the next thing you know I lose an hour.

S2: Exactly. Scrolling.

S1: Scrolling. Okay, so we're talking about scrolling, we're talking about these features. At the heart of a lot of this is infinite scrolling, endless scrolling. I'm just wondering if you can talk a little bit about that, and the role it plays in keeping us coming back to social media and losing track of time, like you've been describing.

S2: Sure. I mean, the social media platforms collect a lot of information about you through simple things like which videos you spend the most time watching, or which ones you like or comment on. All of that information is used to make predictions about what kinds of content you're going to enjoy the most in the future. And TikTok in particular has a reputation for being able to very accurately predict niche and unexpected interests that people end up enjoying. Sometimes that can be very constructive, but other times you lose track of time.
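The feedback loop Vaccaro describes, where engagement signals feed predictions of what you'll enjoy next, can be sketched in a few lines. This is only a toy illustration: the signal names and weights below are made up, and real platforms use learned models over far richer features.

```python
# Toy sketch of engagement-based ranking: each interaction signal
# (watch time, likes, comments) nudges a per-topic interest score,
# and candidate videos are then ordered by that predicted interest.
# Signal weights here are invented for illustration only.
from collections import defaultdict

SIGNAL_WEIGHTS = {"watch_seconds": 0.01, "like": 1.0, "comment": 2.0}

def update_interests(interests, topic, signals):
    """Accumulate weighted engagement signals into a topic score."""
    for signal, amount in signals.items():
        interests[topic] += SIGNAL_WEIGHTS.get(signal, 0.0) * amount
    return interests

def rank_candidates(interests, candidates):
    """Order candidate videos by the viewer's predicted interest."""
    return sorted(candidates, key=lambda c: interests[c["topic"]], reverse=True)

interests = defaultdict(float)
update_interests(interests, "cooking", {"watch_seconds": 120, "like": 1})
update_interests(interests, "news", {"watch_seconds": 15})

feed = rank_candidates(interests, [
    {"id": 1, "topic": "news"},
    {"id": 2, "topic": "cooking"},
])
print([v["id"] for v in feed])  # [2, 1] — the cooking video ranks first
```

The point of the sketch is that the viewer never chooses a ranking; every second of watch time silently updates the scores that decide what appears next.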

S1: Right. And it's giving us what we want, even when we're not actively thinking about it. We hear a lot about the algorithm, discussions about algorithms. I'm curious how you see people talk about the algorithm, and whether you can open us up to new ways of looking at it, because you have an interesting way of describing it.

S2: Yeah. So one thing that you definitely want to keep in mind is that there is not a single algorithm controlling any one of these platforms, right? Every platform is different, and every platform is actually using thousands or maybe even tens of thousands of different small pieces of code that make predictions for different aspects of your consumption. So you really have to think of it as a suite of things that have all been put together, which also makes it very hard to test how they're working, especially if you're on the outside. How can you check what the impacts of a system like this are? For example, is it pushing people towards becoming radicalized? That's a concern people have, or any kind of problematic content that you might not want to encourage. It becomes very difficult because these systems are so complex and involve so many different pieces.
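Vaccaro's point that there is no single "algorithm" can be made concrete with a small sketch: the final ranking score is a blend of many independent sub-predictors, so no one piece of code is "the algorithm." The sub-models and blend weights below are hypothetical, chosen only to show the structure.

```python
# Minimal sketch of why feed ranking is hard to audit from outside:
# the score that decides what you see is a weighted blend of many
# small predictors, each estimating one aspect of your behavior.
# These predictors and weights are invented for illustration.
def p_click(item):      # hypothetical: chance the viewer taps this item
    return 0.9 if item["topic"] == "drama" else 0.3

def p_watch_full(item): # hypothetical: chance the viewer watches it all
    return 0.5

def p_report(item):     # hypothetical: chance the viewer reports it
    return 0.2 if item["topic"] == "drama" else 0.01

# Negative weight on reporting pushes risky content down, but the
# blend still depends on every sub-model at once.
BLEND = [(p_click, 1.0), (p_watch_full, 0.8), (p_report, -2.0)]

def final_score(item):
    """Blend many small predictions into one ranking score."""
    return sum(weight * model(item) for model, weight in BLEND)

items = [{"id": "a", "topic": "drama"}, {"id": "b", "topic": "cooking"}]
ranked = sorted(items, key=final_score, reverse=True)
```

Even in this three-predictor toy, changing any one weight reshuffles the feed; with tens of thousands of interacting sub-models, an outside auditor cannot point to a single routine to inspect.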

S1: A lot of these platforms have settings which theoretically should help us navigate this. Another feature we haven't talked about is autoplay, right? Beyond infinite scrolling, there are a lot of features where, whether you like it or not, you see the next video, the next piece of content starting up before you even choose it. I'm just wondering if you can talk a little bit about the state of the controls we have over how these platforms operate in our lives.

S2: Sure. There definitely are control settings that people might find very useful if they're trying to reduce their use of social media. So you mentioned turning off autoplay; that's a great idea. You could turn off notifications if you don't want to get pulled back in. However, the downside of control settings is that often people aren't aware of the settings that exist, or they can't find the ones that actually do the thing they're looking for. I've done some research with collaborators, and we've found that most people, if you give them a specific task, at least some of the time won't be able to complete it. They won't be able to find the control setting that does that thing.

S1: Right. You know, we're talking about this problematic use of social media and the way some of these platforms hook us and keep us engaged. I'm wondering if you could point to any examples that take a different approach, or other ideas you'd like to see embedded in these platforms. Because the trial we've been centering this on is at least highlighting some of these features that we take for granted in a lot of the tools we use.

S2: Yeah, absolutely. I think a really interesting platform to contrast with Instagram would be BeReal. It's also image-based sharing, but instead of working like Instagram, it sends a notification at random sometime during the day, and you have two minutes to upload a photo. The reason they've designed their platform this way is that they're trying to encourage people to be more authentic with what they post. So instead of such a curated, polished view of someone's life, you see something much more realistic. And the goal is that when people are browsing, instead of comparing themselves to everyone else's highlight reels, it feels more like you're seeing people's actual lives, and you have less of that sense of comparing yourself and feeling really sad.

S1: Well, the young woman at the center of this case, who had reported experiencing mental health struggles, said she did use beauty filters on Instagram. And I know that's something you've also been involved with, some work looking into those impacts, right?

S2: Yeah. Beauty filters, I think, can be really tricky. On the one hand, everyone likes to look good when they're sharing content about themselves. But on the flip side, they can deeply change people's expectations of what they and others should look like. In particular, I've been working with a graduate student who's been running a study where she built what we call a magic mirror. You see a reflection of yourself in the mirror, and then how you look changes as you put on these filters. And we'll run events with young people or students in the hope of understanding how people think about the effects of these AI changes to how you look. In many cases, people get really upset, either because they don't like the changes being made. Often these filters will make you look more feminine, and they often embed standards aligned with Western beauty ideals that not everyone has the body for, or agrees with fundamentally. But the other thing is, these kinds of vision systems can be used to make predictions about you, and those are often incorrect in ways that really hurt people. If I get predicted as being the wrong gender or the wrong age or the wrong race or ethnicity, that might really conflict with my self-image in a way that's pretty hurtful.

S1: Yeah, a lot of factors here. You know, I want to bring it back to this case, which involved someone who started using these platforms quite young. Parents are obviously in a difficult position here, trying to navigate how to keep their kids safe online while also living in a world where these platforms play a big role in young people's lives.

S2: Yeah. I mean, number one, I hate to give advice to parents. Parents already get yelled at a lot, right? So I guess I just want to say that.

S1: I appreciate that.

S2: I really want to say that up front. But, I mean, hopefully you're having conversations with your kids early and often. It sounds like in this case there were a lot of other factors; this young woman was having a really tough go of it. And so make sure you're talking to your kids about all of these aspects. Like, are you having a tough time with bullies at school? Is that going to then play out on social media? Absolutely, that can happen. But I think the core thing is talking to your kids. In terms of the social media aspect of it, I freely admit I do not use social media myself. I think there's a lot of risk involved with that consumption, and a lot of potential negative impacts on people's days.

S1: Well, with that, as someone who studies this, who looks under the hood and sees what's there, what would you like to see from social media that would make you change your mind and want to use it? I think it's a really interesting point that you yourself don't.

S2: Yeah, I guess I should say there are some forms of social media that I think are great, and I do use. Like direct messaging, an application like WhatsApp, where you're communicating with people you know, that you're already connected with, and you don't have to worry about this imagined, much larger potential audience. That's great, right? I think it's much harder for that to go awry. I would also love to see social media platforms make themselves more available to audits or other kinds of testing from external parties. Right now I have a number of projects where we're testing YouTube's recommendation system, we're testing these beauty filters, but it can be very difficult to get the recommendations without possibly violating terms of service. You have these workarounds where you're trying to get access in a way that's not officially supported.

S1: Yeah. I mean, it seems like a common concern is: how much of these technologies do we need to understand before we can set rules or laws around them? And I imagine it can be difficult to really get to understand some of these algorithms, too.

S2: Mhm.

S1: Any other recommendations you might have for people on how to approach social media to have a better experience? And we just have about 20 seconds left.

S2: The main thing I would say is, if you're having a good time on social media, if you feel good at the end of the session, that's great. Keep doing what you're doing. If you're someone who's starting to experience regret or these other negative emotions, then I would recommend that, either with external apps or maybe your own strong will, you try to reduce your usage.

S1: Yeah, it's really hard to navigate all that comes with it. But I appreciate you talking more about it and explaining some of those things for us. I've been speaking with Kristen Vaccaro. She's an associate professor of computer science and engineering at UC San Diego. Kristen, thanks.

S2: Thank you.

S3: That's our show for today. I'm your host, Jade Hindman. Thanks for tuning in to Midday Edition. Be sure to have a great day on purpose, everyone.


A jury last week found Meta and Google designed their social media platforms to hook young users without concern for their well-being.

The case was brought by a woman who had been using social media since the age of 6. She testified that she became addicted to social media as a child and that the addiction worsened her mental health struggles.

On Midday Edition Wednesday, we talk about the case and how social media features like infinite scrolling and autoplay can lead to problematic social media use.

Guest:

Kristen Vaccaro, associate professor of computer science and engineering, UC San Diego