Can an Algorithm Prevent Suicide?

At a recent visit to the Veterans Affairs clinic in the Bronx, Barry, a decorated Vietnam veteran, learned that he belonged to a very exclusive club. According to a new A.I.-assisted algorithm, he was one of several hundred V.A. patients nationwide, of six million total, deemed at imminent risk of suicide.

The news did not catch him entirely off guard. Barry, 69, who was badly wounded in the 1968 Tet offensive, had already made two previous attempts on his life. “I don’t like this idea of a list, to tell you the truth — a computer telling me something like this,” Barry, a retired postal worker, said in a phone interview. He asked that his surname be omitted for privacy.

“But I thought about it,” Barry said. “I decided, you know, OK — if it’s going to get me more support that I need, then I’m OK with it.”

For more than a decade, health officials have watched in vain as suicide rates climbed steadily — by 30 percent nationally since 2000 — and rates in the V.A. system have been higher than in the general population. The trends have defied easy explanation and driven investment in blind analysis: machine learning, or A.I.-assisted algorithms that search medical and other records for patterns historically associated with suicides or attempts in large clinical populations.

Doctors have traditionally gauged patients’ risks by looking at past mental health diagnoses and incidents of substance abuse, and by drawing on experience and medical instinct. But these evaluations fall well short of being predictive, and the artificially intelligent programs explore many more factors, like employment and marital status, physical ailments, prescription history and hospital visits. These algorithms are black boxes: They flag a person as at high risk of suicide, without providing any rationale.

But human intelligence isn’t necessarily better at the task. “The fact is, we can’t rely on trained medical experts to identify people who are truly at high risk,” said Dr. Marianne S. Goodman, a psychiatrist at the Veterans Integrated Service Network in the Bronx, and a clinical professor of medicine at the Icahn School of Medicine at Mount Sinai. “We’re no good at it.”

Deploying A.I. in this way is not new; researchers have been gathering data on suicides through the National Health Service in Britain since 1996. The U.S. Army, Kaiser Permanente and Massachusetts General Hospital have each separately developed an algorithm intended to predict suicide risk. But the V.A.’s program, called Reach Vet, which identified Barry as at high risk, is the first of the new U.S. systems to be used in daily clinical practice, and it is being watched closely. How these systems perform — whether they save lives and at what cost, socially and financially — will help determine if digital medicine can deliver on its promise.

“It is a critical test for these big-data systems,” said Alex John London, the director of the Center for Ethics and Policy at Carnegie Mellon University in Pittsburgh. “If these things have a high rate of false positives, for instance, that marks a lot of people as high risk who are not — and the stigma associated with that could be harmful indeed downstream. We need to be sure these risk flags lead to people getting better or more help, not somehow being punished.”

The V.A.’s algorithm updates continually, generating a new list of high-risk veterans each month. Some names stay on the list for months, others fall off. When a person is flagged, his or her name shows up on the computer dashboard of the local clinic’s Reach Vet coordinator, who calls to arrange an appointment. The veteran’s doctor explains what the high-risk designation means — it is a warning sign, not a prognosis — and makes sure the person has a suicide safety plan: that any guns and ammunition are stored separately; that photos of loved ones are visible; and that phone numbers of friends, social workers and suicide hotlines are on hand.

Doctors who have worked with Reach Vet say that the system produces unexpected results, both in whom it flags and whom it does not.

To some of his therapists, Chris, 36, who deployed to Iraq and Afghanistan, looked very much like someone who should be on the radar. He had been a Marine rifleman and saw combat in three of his four tours, taking and returning heavy fire in multiple skirmishes. In 2008, a roadside bomb injured several of his friends but left him unscathed. After the attack he had persistent nightmares about it and received a diagnosis of post-traumatic stress. In 2016, he had a suicidal episode; he asked that his last name be omitted to protect his privacy.

“I remember going to the shower, coming out and grabbing my gun,” he said in an interview at his home near New York City. “I had a Glock 9-millimeter. For me, I love guns, they’re like a safety blanket. Next thing I know, I’m waking up in cold water, sitting in the tub, the gun is sitting right there, out of the holster. I blacked out. I mean, I have no idea what happened. There were no bullets in the gun, it turned out.”


The strongest risk factor for suicide is a previous attempt, especially one with a gun. Yet Chris’s name has not turned up on the high-risk list compiled by A.I., and he does not think it ever will.

“At the time, in 2016, I was going to school for a master’s, working full time,” he said. “Our two kids were toddlers; I was sleeping no more than a few hours a night, if that. It was too much. I was sleep-deprived all the time. I had never been suicidal, never had suicidal thoughts; it was a totally impulsive thing.”

The A.I. behind Reach Vet seems to home in on other risk factors, Dr. Goodman said: “The things this program picks up wouldn’t necessarily be the ones I thought about. The analytics are beginning to change our understanding of who’s at greatest risk.”

The algorithm is built on an analysis of thousands of previous suicides in the V.A.’s database, dating to 2008. The computer mixes and shuffles scores of facts from the medical records — age, marital status, diagnoses, prescriptions — and settles on the factors that together are most strongly associated with suicide risk. The V.A. model integrates 61 factors in all, including some that are not obvious, like arthritis and statin use, and produces a composite score for each person. Those who score at the very top of the range — the top 0.1 percent — are flagged as high risk.
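The cutoff step described above — rank everyone by composite score and flag the top 0.1 percent — can be sketched in a few lines. This is only an illustration with invented patient IDs and random scores; the real Reach Vet model derives each score from 61 factors in the medical record.

```python
import random

def flag_top_fraction(scored_patients, fraction=0.001):
    """Return the IDs of patients whose composite scores fall in the
    top `fraction` (0.001 = the top 0.1 percent) of the population."""
    n_flagged = max(1, int(len(scored_patients) * fraction))
    # Rank by score, highest first, then keep the top slice.
    ranked = sorted(scored_patients, key=lambda item: item[1], reverse=True)
    return [patient_id for patient_id, _score in ranked[:n_flagged]]

# A synthetic population of 10,000 patients with arbitrary scores.
random.seed(0)
patients = [(f"patient-{i}", random.random()) for i in range(10_000)]

flagged = flag_top_fraction(patients)
print(len(flagged))  # 10 — that is, 0.1 percent of 10,000
```

Because the cutoff is a fixed fraction rather than a fixed score, the list regenerates as scores shift — which is why, as the article notes, names appear on and fall off the monthly list.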

“The risk concentration for people in the top 0.1 percent on this score was about 40 times,” said John McCarthy, the director of data and surveillance for suicide prevention in the V.A. Office of Mental Health and Suicide Prevention. “That is, they were 40 times more likely to die of suicide” than the average person.
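The “40 times” figure is a relative risk: the suicide rate inside the flagged group divided by the rate in the population as a whole. A minimal sketch of that arithmetic, using invented counts (not V.A. data) chosen so the ratio comes out near 40:

```python
def relative_risk(events_flagged, n_flagged, events_total, n_total):
    """Ratio of the event rate in the flagged group to the overall rate."""
    rate_flagged = events_flagged / n_flagged
    rate_overall = events_total / n_total
    return rate_flagged / rate_overall

# Hypothetical: 24 deaths among 6,000 flagged patients,
# versus 600 deaths among 6,000,000 patients overall.
print(relative_risk(24, 6_000, 600, 6_000_000))  # → about 40
```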

Bridget Matarazzo, the director of clinical services at the Rocky Mountain Mental Illness Research, Education and Clinical Center for Veteran Suicide Prevention, said of Reach Vet: “My impression is that it’s identifying some folks who were previously on providers’ radar, but also others who were not.”

Late in 2018, a V.A. team led by Dr. McCarthy presented the first results of the Reach Vet system. Over a six-month period, with Reach Vet in place, high-risk veterans more than doubled their use of V.A. services. By contrast, in a comparison group tracked for six months before Reach Vet was installed, the use of V.A. services stayed roughly the same.

The Reach Vet group also had a lower mortality rate over that time — although it was an overall rate, including any cause of death. The analysis did not detect a difference in suicides, at least up to that stage. “It’s encouraging, but we’ve got much more to do to see if we’re having the impact we want,” Dr. McCarthy said.

Ronald Kessler, a professor of health care and policy at Harvard Medical School, said: “Right now, this and other models predict who’s at highest risk. What they don’t tell you is who is most likely to profit from an intervention. If you don’t know that, you don’t know where to put your resources.”

For doctors using the system, however, it has already prompted some rethinking of how to assess risk. “You end up with a lot of older men who are really struggling with medical problems,” Dr. Goodman said. “They’re quietly miserable, in pain, often alone, with financial problems, and you don’t see them because they’re not coming in.”

And for those whose names have popped up on Reach Vet’s list, the experience of being identified and contacted is not something they can easily forget.

Barry, the Vietnam veteran, said that he was in a relatively good place, for now. He is close to his two grown children, and he receives regular care at the Bronx V.A., including both individual and group therapy, and medication for recurrent psychotic episodes. But he is also aware of how quickly things can turn dark.

“Look, I know I sometimes talk to myself at night, and I hear voices,” he said. “The meds work fine, but once in a while they don’t, and that angers them, the voices. And that is not good for me.”

