Using 'Darkness' To Shed Light On Racial Profiling Claims

Maria Morales is pictured at a trolley station in Encanto on June 9, 2015.
Katie Schoolov
San Diego State researchers are using a "Veil of Darkness" to examine claims of police racial profiling.

Speak City Heights is a media collaborative aimed at amplifying the voices of residents in one of San Diego’s most diverse neighborhoods.

For more than a year, residents in some of San Diego's most heavily policed neighborhoods have been telling City Hall there is racial profiling by officers.

Maria Morales, 38, is one of those residents. She said she has no doubt that what she and her boyfriend experienced a while back at a trolley stop was racial profiling.

"We have tattoos and we're colored people. And there was a white couple. (The officers) totally bypassed the white couple and came up to us," Morales said. "I already knew what to expect."

Morales said the officers asked whether she was on parole.


"Just because I have tattoos and because I have long, dark hair and colored skin and a certain look, you know nothing about me. I've never been to prison in my life," Morales said. "For you to automatically assume that, it's a shaming feeling."

But this and hundreds of similar anecdotes haven't convinced elected officials or police brass that racial profiling is a reality in the San Diego Police Department.

Since 2001, department leaders have said the data they have on race and policing is inconclusive. And they're not alone.

At the public's urging, law enforcement agencies across the nation now have an unprecedented amount of information on their officers – cell phone videos, body camera footage and traffic stop cards. But most haven't drawn any wide-ranging conclusions with that data. The public dialogue on racial profiling still plays out like a debate, and often circles around the idea that the real problem is mutual misunderstandings about culture and police practices.

Joshua Chanin, a public affairs professor at San Diego State University, said departments are right to take their data with a grain of salt.

Earlier this year, Councilwoman Marti Emerald tapped Chanin, along with SDSU professors Stuart Henry and Dana Nurge, to take a look at the department's data to gauge whether people of color are getting a disproportionate share of traffic stops.

The quick-and-dirty method used by most departments, advocacy groups and media outlets has been to compare the rate at which certain racial groups are stopped with Census population figures. For example, if the share of black drivers stopped by police exceeds the share of black residents, many allege racial profiling.

"But this isn't the most accurate assessment of those people that are driving. There's a distinct difference between the driving population and the residency," Chanin said.
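The census-benchmark method Chanin critiques amounts to a ratio calculation. A minimal sketch, using made-up numbers rather than actual San Diego data:

```python
# Sketch of the census-benchmark comparison described above.
# All figures are hypothetical, not actual San Diego data.

stops_by_group = {"black": 1800, "white": 5200, "hispanic": 3000}    # recorded stops
population_share = {"black": 0.06, "white": 0.45, "hispanic": 0.30}  # Census share

total_stops = sum(stops_by_group.values())
for group, n_stops in stops_by_group.items():
    stop_share = n_stops / total_stops
    ratio = stop_share / population_share[group]
    print(f"{group}: {stop_share:.1%} of stops vs "
          f"{population_share[group]:.0%} of residents (ratio {ratio:.2f})")
```

A ratio well above 1 is what gets cited as evidence of profiling. The denominator issue is that the Census share in the denominator may look nothing like the population actually behind the wheel.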

Researchers call this "The Denominator Issue," and Chanin will spend the next several months trying to get around it.

The Denominator Issue

Researchers need a reliable baseline to find disparities, and Census population figures don't provide that. Not everyone in a given area is old enough to drive or has a car. And San Diego's proximity to the border and its tourism draw mean the driving population is constantly in flux.

So Chanin plans to use something called "The Veil of Darkness."

"What this technique does," Chanin said, "is it uses natural changes in light to isolate the effect of race on the likelihood that a driver will be stopped."

The Veil Of Darkness

The Veil of Darkness rests on two assumptions. First, that it's more or less the same people driving on a given street between 5:30 and 9 p.m.: people coming home from a nine-to-five or heading out for the night shift. Second, that an officer can better observe a driver's skin color when the sun is up.

So researchers can compare traffic stops made between 5:30 and 9 p.m. in July, when it's light out, with stops made in the same window in January, when it's dark. If more people of color are pulled over in that area during the summer, one can infer that race is a factor.
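Under those two assumptions, the comparison reduces to contrasting the minority share of stops made in daylight with the share made after dark, within the same clock window. A minimal sketch with hypothetical stop records (a real analysis would use logged stop times, sunset tables and a regression with controls):

```python
# Veil of Darkness sketch: same 5:30-9 p.m. window, daylight vs. darkness.
# Each record: (minutes after 5:30 p.m., stop made in daylight?, minority driver?)
# All records are hypothetical.
stops = [
    (15, True, True), (40, True, True), (70, True, False),
    (95, True, True), (120, True, False), (150, True, True),      # e.g. July: light out
    (20, False, False), (55, False, True), (85, False, False),
    (110, False, False), (140, False, True), (170, False, False), # e.g. January: dark
]

def minority_share(records, daylight):
    """Fraction of stops involving minority drivers, split by daylight."""
    subset = [r for r in records if r[1] == daylight]
    return sum(1 for r in subset if r[2]) / len(subset)

print(f"minority share of daylight stops: {minority_share(stops, True):.0%}")    # 67%
print(f"minority share of after-dark stops: {minority_share(stops, False):.0%}") # 33%
```

A daylight share well above the after-dark share suggests the ability to see the driver is influencing who gets stopped.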

The method isn't perfect.

"The extent to which a driver's physical characteristics predict the likelihood of being stopped or searched is one of a host of variables that may shape or may influence a police officer's decision," Chanin said. He'll be able to control for the area's crime rate and police presence in his statistical model, but whether the driver was making furtive movements or otherwise tipping off the officer can't be accounted for.

Statistician Greg Ridgeway created the Veil of Darkness with his colleague, Jeffrey Grogger, and said it's the best option out there. They came up with it after stumbling upon an article in which an officer said it was impossible for him to target certain racial groups because he couldn't see skin color while patrolling at night.

The strategy has since been used in five jurisdictions: Oakland, Cincinnati, Minneapolis, Syracuse and the state of Connecticut.

"It really gets at simply the direct question, 'Does the ability to see the driver in advance influence which drivers the officer is going to stop?' And that really gets a lot closer to the key question of racial profiling," Ridgeway said.

To paint a fuller picture, Chanin's group will also study whether people of color are more likely to get a ticket or be searched after they're pulled over. And he'll include interviews with community members and police officers.

If the final picture is one of systemic racial profiling, Emerald said the city will act.

"It's a big if," she said. "Maybe the study will show some improvements are needed. If so, I think our police chief is ready, willing and able to do something about it."

To date, most researchers have not found systemic racial profiling using the Veil of Darkness. But Ridgeway said that doesn't mean there shouldn't be a call to action when the study comes out.

"You can get past the question of whether the department is race-neutral to get to, 'Well, let's solve these individual cases of individual officers and individual incidents and try to minimize the risk of those,'" Ridgeway said.

In Cincinnati, the result was special software that would more accurately scan the department for problem officers. Most systems only track complaints and disciplinary problems. Some officers have slipped through the cracks in San Diego.

Ridgeway's system customizes a benchmark for each officer and looks for abnormal enforcement patterns.

"It's basically comparing that officer to peers that would be exposed to the same kinds of suspicious characters, the same kinds of interactions," Ridgeway said.
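That peer comparison can be illustrated with a simple leave-one-out benchmark: measure how far each officer's stop pattern sits from the average of peers working the same beat and shift. This is a minimal sketch; the officer names, rates and threshold are invented, and Ridgeway's actual system is far more sophisticated:

```python
from statistics import mean

# Hypothetical minority-stop fractions for officers on the same beat and shift.
peer_rates = {"officer_a": 0.32, "officer_b": 0.29, "officer_c": 0.35,
              "officer_d": 0.31, "officer_e": 0.58}

def flag_outliers(rates, margin=0.15):
    """Flag officers whose rate exceeds the average of their peers by `margin`."""
    flagged = []
    for name, rate in rates.items():
        peers = [v for other, v in rates.items() if other != name]
        if rate - mean(peers) > margin:
            flagged.append(name)
    return flagged

print(flag_outliers(peer_rates))  # ['officer_e']
```

The point of benchmarking against peers rather than a citywide average is that officers assigned to different neighborhoods face very different populations, so a raw comparison would flag assignment patterns, not officer behavior.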

Back at the trolley stop, Morales said she wishes her voice were proof enough for city leaders. But she said she welcomes the study if it means greater accountability for officers.

"I think we're just kind of desperate," Morales said. "Just give us something that will support what we're already saying."

The study results will come in three phases. The first is expected in late October.
