The JAIC’s Warfighter Health Team is Working to Develop an AI-enabled Suicide Prevention and Intervention Tool

  • By: JAIC Public Affairs

Over the past decade, suicide has become the second leading cause of death in the U.S. military. Yet when a servicemember takes their own life, it almost always comes as a shock. Family, friends, and colleagues often say there were few if any signs that the servicemember was in that state of mind, and that any signs that did exist were not picked up by anyone in a position to help.

The Warfighter Health team at the Joint Artificial Intelligence Center aims to strengthen suicide prevention through AI-enabled capabilities. Team members are now working to develop the AI-enabled Suicide Intervention and Prevention tool, which will proactively identify risk factors and alert commanders, medical providers, and others when a servicemember may be at high risk for suicidal ideation.

“There is a desperate need for what we’re doing, which is to find a way to mitigate risk and intervene much earlier when a servicemember is heading down a destructive path,” explains Col Caesar Junker, USAF, who leads the Human Performance Portfolio for the Warfighter Health team.

There are known indicators for when stress is starting to take a toll on a servicemember. These can include family and financial challenges such as divorce and debt; physical problems such as sleep deprivation and chronic pain; and destructive coping mechanisms such as aggression, alcoholism, and increased drug use. However, determining when those indicators rise to the level of risk for suicidal behavior is difficult.

For this reason, the Warfighter Health team is working with a multidisciplinary group of experts from academia and private technology and data science firms to develop an ontology framework that draws on many data sources.

“We are looking at information that provides a comprehensive evaluation and indicators to the motivation behind a suicide,” Junker explains. “Leveraging these detailed data sources, paired with medical experts and AI, we are using human/machine teaming to better understand suicide indicators and develop methods for continuous assessment of trends.”

Junker adds, “Now we are analyzing these information sources and dissecting the data alongside medical experts and building an ontology and a program to extrapolate information and then build a model for improved understanding of suicide patterns with an evidence base that supports predictive analytics.”
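
The article does not describe how the team's ontology is actually structured. As a rough illustration of the idea, the Python sketch below (with invented category and indicator names drawn from the examples above) shows how indicators from different data sources might be organized under shared categories so that models and medical experts work from a common vocabulary.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """A single observable risk indicator and the data source that supplies it."""
    name: str          # e.g., "sleep deprivation"
    data_source: str   # e.g., "medical record", "financial record"

@dataclass
class IndicatorCategory:
    """A grouping of related indicators within the ontology."""
    name: str
    indicators: list = field(default_factory=list)

# Hypothetical slice of an indicator ontology, loosely mirroring the
# example stressors named earlier in the article.
ONTOLOGY = [
    IndicatorCategory("family and financial stressors", [
        Indicator("divorce proceedings", "personnel record"),
        Indicator("significant debt", "financial record"),
    ]),
    IndicatorCategory("physical stressors", [
        Indicator("chronic pain", "medical record"),
        Indicator("sleep deprivation", "medical record"),
    ]),
    IndicatorCategory("destructive coping behaviors", [
        Indicator("alcohol-related incident", "behavioral report"),
        Indicator("increased aggression", "behavioral report"),
    ]),
]

def indicators_by_source(ontology, source):
    """List the indicator names a given data source is expected to contribute."""
    return [i.name for cat in ontology for i in cat.indicators if i.data_source == source]

# Example: which indicators should the medical record system feed the model?
print(indicators_by_source(ONTOLOGY, "medical record"))
```

Organized this way, a call like indicators_by_source(ONTOLOGY, "medical record") tells both the data pipeline and the human reviewers exactly which indicators each record system is expected to contribute.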

The resulting model can then be used in conjunction with medical records, financial records, behavioral reports, and other information to help identify rising risk levels among current servicemembers.

“The goal is to bring accuracy, sensitivity, and specificity to the algorithm so you don’t have false positives,” Junker said. “Again, this is intended to be a tool that is highly precise and successful.”
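
Sensitivity, specificity, and precision are standard measures for a binary classifier. The short sketch below (generic Python, not the team's evaluation code) shows how they fall out of confusion-matrix counts and why false positives are the central concern when the at-risk population is a small fraction of those screened. The numbers are invented for illustration only.

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a binary risk classifier.

    tp: at-risk members correctly flagged      fp: false alarms
    tn: not-at-risk members left unflagged     fn: at-risk members missed
    """
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0  # share of at-risk cases caught
    specificity = tn / (tn + fp) if (tn + fp) else 0.0  # share of not-at-risk cases left alone
    precision = tp / (tp + fp) if (tp + fp) else 0.0    # share of alerts that are true positives
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "accuracy": accuracy,
    }

# Illustrative numbers only: 10,000 members screened, 50 truly at risk,
# 40 of them flagged by the model, plus 60 false alarms.
print(classification_metrics(tp=40, fp=60, tn=9890, fn=10))
```

In this made-up example the model looks excellent by accuracy and specificity, yet only 40 percent of its alerts are true positives, which helps explain the emphasis on false positives: when the base rate is low, even a small false-positive rate can swamp the useful signal.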

Coming Together Around a Common Goal

Each of the military services has implemented its own programs to tackle the issue of suicide and other destructive behaviors in an effort to increase what’s known as resilience among servicemembers. The Army, for example, has the Army Study to Assess Risk and Resilience in Servicemembers (STARRS), a massive longitudinal research study into suicide prevention.

In fact, for over 20 years, military entities have implemented a variety of projects to try to prevent suicides, but “no one has been able to apply Machine Learning or AI to it, like we are doing now,” Junker states.

The JAIC’s role is to bring together experts from within the DoD and partner organizations in the private sector and academia to figure out how to leverage AI/ML to be more successful in this area. “The more information we have, the better off we’re going to be,” said Junker.

Making Progress

Junker notes that the effort to build the AI-enabled Suicide Intervention and Prevention tool is still in its early stages, but that having it within the purview of the JAIC will provide the authority, cross-functional cooperation, and expertise needed to advance the project more rapidly and reach critical mass.

In the near future, the Warfighter Health team expects to build AI products for the following capabilities:

  • Commanders’ Risk Mitigation Dashboard (CRMD). A Suicide Intervention and Prevention risk assessment model will be part of a data-driven composite display of behavioral risks. This effort is being led by the Navy’s 21st Century Sailor Office (OPNAV N17). The CRMD will improve a leader’s situational awareness of when their crew was, is, and will be at risk for a variety of destructive behaviors, including suicide and suicidal ideation, and allow them to make more informed decisions about Sailor readiness and resilience. “Instead of inundating them with a lot of information that they would have to sift through, we are making the information highly useful and immediate, so they are alerted when one of their servicemembers is at risk, which will give them the information they need to make decisions on the most effective ways to intervene,” Junker explains.
  • Defense Health Agency medical records. In this case, an algorithm would sit on the platform of an electronic medical record and alert military medical providers when one of their patients displays evidence of being at higher risk for suicide (a hypothetical sketch of this kind of alerting check follows this list).
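
The article does not describe either product's scoring pipeline or interface. Purely as a hypothetical illustration, the sketch below maps a model's risk score to a coarse alert level and formats a notification for one of the two consumers described above; every name, threshold, and channel here is invented.

```python
from enum import Enum
from typing import Optional

class AlertLevel(Enum):
    NONE = 0
    ELEVATED = 1
    HIGH = 2

# Hypothetical thresholds; a real tool would need clinically validated cutoffs.
ELEVATED_THRESHOLD = 0.5
HIGH_THRESHOLD = 0.8

def alert_level(risk_score: float) -> AlertLevel:
    """Map a model's risk score in [0, 1] to a coarse alert level."""
    if risk_score >= HIGH_THRESHOLD:
        return AlertLevel.HIGH
    if risk_score >= ELEVATED_THRESHOLD:
        return AlertLevel.ELEVATED
    return AlertLevel.NONE

def route_alert(member_id: str, risk_score: float, channel: str) -> Optional[str]:
    """Build a notification for one of two hypothetical consumers:
    'commander_dashboard' (a CRMD-style display) or 'ehr_provider' (a medical
    record alert). Returns None when no alert is warranted."""
    level = alert_level(risk_score)
    if level is AlertLevel.NONE:
        return None
    return f"[{channel}] member {member_id}: {level.name} risk (score {risk_score:.2f})"

# Example: a high score surfaces on the commander's dashboard.
print(route_alert("A123", 0.83, "commander_dashboard"))
```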

Junker is quick to emphasize that his team understands the sensitivity of suicide and of monitoring for suicide risk, and that their efforts always take that into full account.

“What we’re providing is a tool to help commanders or providers do their jobs better in recognizing what is going on with a servicemember and how to effectively respond to help our people. These are automated tools for risk mitigation, teamed with human experts, and that’s exactly what the DoD needs right now to make the most of the information they have.”

Junker said the work the Warfighter Health team is doing could not only reduce the number of suicides but also enable servicemembers to get help earlier, while fostering overall resilience within the military services and other DoD organizations.

“There is great utility in helping make a more resilient servicemember by providing resources to develop better coping mechanisms for stress,” said Junker. “That’s good for the servicemembers, of course, but long-term, the DoD will be stronger and more successful at achieving its mission.”