
Military

Can AI help predict military suicides? The VA is funding a project to find out

Dustin Millado, a veteran who worked in the Army Criminal Investigation Division, is now a digital forensic examiner for Stop Soldier Suicide’s Black Box Project.
Jay Price / American Homefront

Like many of us, Army Captain Jim Gallagher, a West Point graduate who served in Iraq, could scarcely be separated from his smartphone.

“He and I were constantly texting throughout the day,” said his wife, Amanda Gallagher, who lives near Fayetteville, N.C. with their three young daughters. “And he would be on Twitter all the time.

"He was just always using his phone," she said. "To the point where we could get into arguments about like, you know, 'you need to get off phone and pay attention to what's going on here.'"


Then one Saturday night they were watching TV, and he got up and told her he was going to the bathroom. Instead he went into the garage to kill himself.

Gallagher left his iPhone on the kitchen counter and wasn’t wearing the Apple Watch that had become a constant presence on his wrist. His wife was surprised to find the watch later in the house.

Now his two laptops, tablet, and that phone – likely containing data about his heart function and sleep patterns uploaded from his watch – are in the hands of the Black Box Project. The initiative, which won a $3 million VA grant, will use artificial intelligence to look for patterns in data from the devices of people who die by suicide.

“In today's day and age, our digital devices, especially smartphones, know more about us than things we've shared with even those we’re closest to,” said Chris Ford, who leads the Durham, N.C.-based non-profit group Stop Soldier Suicide, which runs the project.

The group's main work is suicide-specific telehealth therapy with clients who voluntarily seek help.


Suicide research is notoriously hard, and a key reason is that the people scientists most need to study are, of course, no longer alive. Now, though, the devices that have become extensions of us offer a new window.

“The inner thoughts, feelings and behaviors, the things that I'm doing at two in the morning when I can't sleep,” Ford said. “Those don't get shared with our intimate partners, those don't get shared with our parents or our friends in most circumstances. So, when we came up with this approach, we felt it was the last missing piece in the advancement of understanding suicide risk.”

They also thought it could be an unusually reliable source.

“Psychological autopsies have been done for decades,” he said, “but it's not the whole picture. It's third-party data around the person who died… you're asking friends, co-workers, and loved ones their perspectives about the person who died. And so it's biased by the person who's being interviewed and their thoughts and feelings and memories, and it's biased by the interviewer, because if they think suicide is driven mostly by mental health, they're going to probe deeper into mental health. If they think it's more driven by relationships or isolation, they’re going to drill into those things.

“So we really believe the data that we're getting from these devices are the most objective data available to really help us understand the thoughts, attitudes, and behaviors in that last year of life,” Ford said.

Some information the devices contain might be obviously useful, like texts, emails, and browser and location histories. But the artificial intelligence will also be looking for patterns in less obvious data that might correlate with suicide risk, things researchers may not have even thought of yet.

Amanda Gallagher holds a photo of her husband Jim, who died by suicide in 2018.
Jay Price / WUNC

Ford said the project is focusing on three questions: who's most at risk for suicide, whether there was a moment when the person was actively seeking or was receptive to help, and how they expressed their desire for help.

The project is just getting up to speed, but it may already be changing some long-held beliefs among suicide researchers, such as the rarity of suicide notes.

“Most research studies in modern times indicate suicide notes happen 15 or 20 percent of the time,” Ford said. “In just looking at our test set, we're finding drafted and deleted suicide notes in at least half the devices.”

The digital forensics examiner for the project, Dustin Millado, said these are often in a phone’s notes app or sometimes take the form of an audio or video recording.

“It almost seems like they want to tell somebody what's going on, but they don't really want to tell a person," said Millado. “So that's why they document these things in these notes."

Experts say the project's results could apply to the non-military population, too, with caveats about the differences between the two groups. But they caution against expecting too much from technology.

“Suicide has so many different possible combinations of variables and factors that there are an infinite number of pathways to get there," said Craig Bryan, an Ohio State University professor and a suicide prevention researcher.

He said predictive-analytics approaches built on large datasets probably will never be able to predict when a specific person is going to attempt suicide. But he said the technology might allow for an approach that he compares to weather forecasting.

“When there's a tornado warning, we don't know if a tornado is going to hit anyone's house in particular at a particular time," he said, "but we know that, hey, the conditions are right in this county or this city.”

So far, the project has received more than 100 loaned devices that were used by people who died by suicide. Millado, the forensics examiner, handles them with gloves and keeps them in meticulously tracked evidence bags. After he copies the data, he ships the devices back with their contents still intact.

There’s a reason he’s so careful: To people like Amanda Gallagher, these aren’t just phones and computers.

“So all of his possessions are really important to me, and they seem like such a limited resource,” Gallagher said. “Like I have to repost the same pictures every year on his birthday, because there aren't any new pictures. If I want to see him with his daughters now, I have to lay one picture on top of the other."

She thinks he’d have been happy to be a part of the project.

“Because it would be exactly the kind of work that he would want to do, because he had a real heart for the soldiers that he worked for and a real passion for data analysis,” she said.

Even though she made copies of the data, she said it was stressful to send the devices away to the Black Box Project. But she said if it helps prevent even one suicide, it's worth it.

“I knew that if there was a chance that another woman didn't have to sit in her child's bedroom and explain to them that their dad wasn't coming home because I was willing to let the phone go, then I should let the phone go,” she said.

The National Suicide Prevention Lifeline is a hotline for individuals in crisis or for those looking to help someone else. To speak with a certified listener, call 988.

Crisis Text Line is a texting service for emotional crisis support. To speak with a trained listener, text HELLO to 741741. It is free, available 24/7, and confidential.

This story was produced by the American Homefront Project, a public media collaboration that reports on American military life and veterans.

Copyright 2023 North Carolina Public Radio – WUNC.