What about the human-like computer made veterans more likely to discuss PTSD symptoms?
While much of the focus on robotics and artificial intelligence revolves around trying to make the programs as lifelike as possible, one application in mental health works because it is not human. A recent study pointed to the power of this distinction by testing how receptive soldiers were to discussing post-traumatic stress disorder (PTSD) with a computer-generated “virtual human.”
As it turned out, the soldiers were more willing to open up to the bots than they were to a survey, even when the survey was anonymized.
Researchers at the University of Southern California and Carnegie Mellon University conducted the study; the interviews and surveys took place in 2014 and 2015.
The first incarnation of the study featured 29 soldiers, including 2 women, who had returned from a yearlong tour of duty in Afghanistan. They underwent the Post-Deployment Health Assessment (PDHA)—which all US military members complete following a tour of duty—in the official written form and an anonymized computer version. They also had an interactive session with a virtual human, in which they built rapport before answering questions about PTSD.
Some of the questions asked by the virtual interviewer were reworded from the PDHA survey so they could be posed more naturally and allow for more nuanced answers than "yes" or "no."
For example, the survey asks, "Have you ever had any experience that was so frightening, horrible, or upsetting that, in the past month, you: have had nightmares about it or thought about it when you did not want to?"
But the virtual interviewer asked, “Can you tell me about any bad dreams you’ve had about your experiences, or times when thoughts or memories just keep going through your head when you wish they wouldn’t?”
The soldiers reported more PTSD symptoms to the virtual human than they did on either form of the survey, despite the fact that one was anonymized. The researchers had hypothesized that this might be the case, reasoning that the bots would combine the comforting social aspect of human interaction with the sense of privacy that comes from the interviewer not being human.
The same experiment was then repeated with a larger cohort of 132 active-duty service members, 16 of whom were women, ranging in age from 18 to 77. This incarnation featured only the anonymized computer survey and the virtual human interview. Once again, the soldiers disclosed more PTSD symptoms to the computer-generated interviewer than on the de-identified survey.
"These kinds of technologies could provide soldiers a safe way to get feedback about their risks for post-traumatic stress disorder," said Gale Lucas of the University of Southern California’s Institute for Creative Technologies. "By receiving anonymous feedback from a virtual human interviewer that they are at risk for PTSD, they could be encouraged to seek help without having their symptoms flagged on their military record."
The study, “Reporting Mental Health Symptoms: Breaking Down Barriers to Care with Virtual Human Interviewers,” was published this week in Frontiers in Robotics and AI.