KPBS Midday Edition Segments

Measuring Media Bias

November 3, 2020 at 12:13 PM PST

Speaker 1: 00:00 In this information age, we're all struggling to keep up with the flood of data that reaches us through the media. We use it to make important decisions every day, not just during election time, but more than ever before we need to assess whether the information that reaches us is accurate, biased, or flat-out lies. Our next guest has founded a company offering a technology to help detect bias in information. She's the speaker at tomorrow's monthly meeting of the San Diego-based Center for Ethics in Science and Technology. Tamara [inaudible] is CEO and co-founder of VeriCrypt, which uses artificial intelligence to score bias in news. Tamara, welcome to Midday.

Speaker 2: 00:39 Thank you so much. It's a pleasure to be here.

Speaker 1: 00:41 You talk about how the information ecosystem we're living in has changed, and I find thinking of media as an ecosystem very helpful. How would you describe the state of our information ecosystem right now?

Speaker 2: 00:54 I think that right now a good word to describe the state of our ecosystem of information is weaponized, meaning that certain actors have taken advantage of the freedoms and opportunities that our information ecosystem affords. Here in the States, we have freedom of speech, so we can share freely on the internet. Our press can share freely about their perspectives and present that to the public. But with social media, really anybody can say anything they want and share it publicly. So this gives an opportunity for illegitimate actors to take advantage of the audience that social media offers, especially with divisive content, unfortunately.

Speaker 1: 01:37 You say "take advantage of the audience." How do companies manipulate our ecosystem, and therefore us? Give us examples of how we're being manipulated.

Speaker 2: 01:48 I feel like ad technology is the driver of why this is all happening. For example, if you're a news provider and you have a story that's breaking, when you distribute that story, you want to make sure not only that your readers get that information but also that you can make some money. The way these providers and other companies make money is by using ads. The way ads work is you set a target audience, certain demographic information about a person, maybe their gender or their age, and that can be used to find people in that category and present the ads to them. If you click on the ad and follow the instructions, you can buy the thing. But even if you don't click on the ad, even if you just hover over it, that gives data to these ad companies, and then they know how to better target you in the future. So in a lot of ways, it all comes down to who's paying for the information that you're seeing on your screen.
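To make the targeting loop she describes concrete, here is a minimal sketch in Python. Every name and weight in it is hypothetical, not any real ad platform's API; it only illustrates the two mechanisms mentioned above: a demographic audience filter, and passive signals such as hovers feeding future targeting.

```python
# Hypothetical sketch of ad targeting as described in the interview:
# advertisers select a demographic audience, and even a hover (no click)
# updates the user's profile for the next round of targeting.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    age: int
    gender: str
    interest_scores: dict = field(default_factory=dict)  # topic -> affinity

def match_audience(users, min_age, max_age, gender):
    """Select users who fall inside the advertiser's target demographic."""
    return [u for u in users
            if min_age <= u.age <= max_age and u.gender == gender]

def record_hover(profile, ad_topic, weight=0.1):
    """A hover alone, without a click, nudges the profile toward the ad's
    topic, so future ads can target this user more precisely."""
    profile.interest_scores[ad_topic] = (
        profile.interest_scores.get(ad_topic, 0.0) + weight)

users = [UserProfile("u1", 34, "f"), UserProfile("u2", 61, "m")]
audience = match_audience(users, min_age=25, max_age=45, gender="f")
record_hover(audience[0], "politics")   # hovered over a political ad
print(audience[0].interest_scores)      # {'politics': 0.1}
```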
Speaker 1: 02:48 Now, you've co-founded a company called VeriCrypt to detect bias in information, and the analogy that you use makes sense to me: you talk about it like the nutrition label that manufacturers put on food, so that your technology would provide a label that we would put on the information we're consuming. How do you do that?

Speaker 2: 03:09 Essentially, there are a lot of different algorithms that we use to provide different scores, different nutritional facts, if you will, to news readers. At the very basis of that is the bias metric. Essentially, we define bias relative to what is not biased, and we can look to the dictionary and to the encyclopedias, for example the Encyclopaedia Britannica, to glean what our culture today, in English-speaking America and Great Britain, deems to be quote-unquote unbiased. It's this dictionary and encyclopedia language that we use to convey knowledge down to the next generation or within our community. What we've done is essentially model the way those texts are written: not just the words they use, but the actual sentence structure and the fragment structure and all of that.

Speaker 2: 04:05 Then when we get incoming text, we essentially compare and see how close it is to our model of what is unbiased, based on this culturally accepted definition from the dictionary and the encyclopedia. But beyond that, we offer some additional metrics, such as how sensational the content is: is it emotionally charged, and if so, in what direction and to what degree? And also, was this written by a human or a bot? Because oftentimes even stories that are diligently written by humans are reproduced on the interwebs by bots, and that's actually how the ads are proliferating.
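VeriCrypt has not published its model, so the following is only a toy illustration of the idea described above: score a text by its distance from reference "unbiased" language, and separately count emotionally charged words. The reference sentence and word list are invented for the example, and a simple bag-of-words comparison stands in for the sentence-structure modeling the guest mentions.

```python
# Toy illustration (not VeriCrypt's actual method) of scoring bias as
# distance from a model of "unbiased" reference language, plus a crude
# sensationalism score. Reference text and lexicon are made up.
import math
from collections import Counter

REFERENCE = (
    "the encyclopedia describes events in neutral declarative sentences "
    "citing dates figures and sources without evaluative language"
)
CHARGED_WORDS = {"outrage", "disaster", "shocking", "destroy", "corrupt"}

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def bias_score(text):
    """1.0 = far from the unbiased reference, 0.0 = identical to it."""
    return 1.0 - cosine(vectorize(text), vectorize(REFERENCE))

def sensationalism(text):
    """Fraction of words drawn from an emotionally charged lexicon."""
    words = text.lower().split()
    return sum(w in CHARGED_WORDS for w in words) / max(len(words), 1)

story = "shocking corrupt officials destroy the city budget"
print(round(bias_score(story), 2), round(sensationalism(story), 2))
```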
Speaker 1: 04:41 What kind of people want this technology? What kind of clients are you getting?

Speaker 2: 04:45 There are a lot of people who make money off of our technology not existing; our clients are not those people. Right now, our clients are people who are in data intelligence, say, trying to make predictions like geospatial risk analysis or other types of risk analysis. In addition, say you're an investment firm: right now, large finance organizations actually rely on AI to trade. What all of our clients have in common is that they take in news as data. And of course, our mission is to help people in the news industry itself: the people trying to bring the truth to the public, who are less motivated by ads and who don't get their primary income through ad tech.

Speaker 1: 05:31 Finally, Tamara, do you have any words of advice for those of us who are swimming in this information ecosystem about how we can determine bias?

Speaker 2: 05:39 Just remember that your brain is designed to take shortcuts and to act from a biased perspective, and sometimes that prevents us from communicating with people who have different opinions than us. So I challenge you all to broaden your perspective and break down barriers, because if we can't communicate, then we'll never be able to fix this problem.

Speaker 1: 06:01 We've been speaking with Tamara [inaudible], CEO and co-founder of VeriCrypt. Tamara's talk, "Autonomous Bias Detection in a World of Sensational Headlines," is tomorrow night, Wednesday, at 5:30. You can find it on the website of the Center for Ethics in Science and Technology. Tamara, thank you so much.

Speaker 2: Thank you, Allison.
An event hosted by the Center for Ethics in Science and Technology will explore the new media ecosystem we are living in and technology designed to assess bias in news stories.