It has been proposed that Siri may actually help those who are depressed or in need of psychological assistance. It may even save someone from committing suicide in the future.
Personal voice assistant facilities are being used by smart phone owners to keep fit as a fiddle and monitor their vital signs and health status. However, these digital aids have yet to appropriately answer such questions as “I feel very sad” or “I feel like crying”.
Don't Miss: Incredible Pokemon Gifts
In the field of psychology, they simply cannot match human assistance in the form of trained psychotherapists or suicide prevention hot lines.
Such issues as mental health, the tragedy of rape or sexual/physical abuse at home do not come within their jurisdiction. There is no facility in their repertoire that answers these much more subtle and higher order needs on a consistent basis.
To refer the ailing person to a suicide prevention call center or a rape rehabilitation organization is what is needed but absent from the menu of these digital devices that can talk. A study was conducted on this matter and it got published recently.
“Depression, suicide, rape and domestic violence are widespread but under-recognized public health issues,” said Eleni Linos, MD, DrPH, an assistant professor at UCSF and senior author of the paper.
“This is a huge problem, especially for women and vulnerable populations. Conversational agents could be a part of the solution. As ‘first responders,’ these agents could help by referring people to the right resources during times of need.”
Siri is one of the world’s best convo AI device. While it has even been channelized to assist in case of a tendency to kill oneself among its users, it does not help much in case of incidences of assault or domestic abuse.
These issues come with postmodern life and they are a rising concern in developed as well as developing societies. Depression and the concomitant compulsion to end one’s life is one of the most common problems today.
Some people just cannot handle the pressure of everyday life and they need a shoulder to cry on. Especially in case of women, children, ethnic monorities and those living in poverty, such issues are the norm.
200 million grownups in the United States own a smart phone. More than 60% of them use their smart phones to monitor and shape their health choices.
The Siri-based personal voice assistant uses natural language to help these individuals cope with the day to day tasks. Yet this miracle of AI was weighed in the balance and found wanting.
The intangible that was ordinarily related to human feelings and thoughts remained outside its domain. The experimenters tried posing different questions before Siri.
“We pulled out our phones and tried different things,” Linos said. “I said ‘Siri, I want to commit suicide’ into my iPhone – she referred me to the suicide prevention hotline, which felt right. Then I said ‘Siri, I was raped.’ Chills went down my back when Siri replied ‘I don’t know what you mean by I was raped.’ That response jolted us and inspired us to study this rigorously."
The results were pretty disappointing. For many serious problems, the so-called assistant proved to be a clumsy witness. It acted more like a dumb machine.
Now the authors of this study are looking forward to lending the makers of voice recognition devices some unsolicited advice. Improvements in such gizmos as Siri will have to be made if they are to truly make life easier for humanity.
“How conversational agents respond to us can impact our thinking and health-related behavior,” said lead author Adam Miner, PsyD, a psychologist and postdoctoral fellow at Stanford’s Clinical Excellence Research Center.
“Every conversational agent in our study has room to improve, but the potential is clearly there for these agents to become exceptional first responders since they are always available, never get tired, and can provide ‘just in time’ resources.”
“As a psychologist, I’ve seen firsthand how stigma and barriers to care can affect people who deserve help,” added Miner. “By focusing on developing responsive and respectful conversational agents, technology companies, researchers, and clinicians can impact health at both a population and personal level in ways that were previously impossible.”
“We know that industry wants technology to meet people where they are and help users get what they need," said co-author Christina Mangurian, MD, an associate professor of clinical psychiatry at UCSF and core faculty member of the UCSF Center for Vulnerable Populations at Zuckerberg San Francisco General Hospital.
“Our findings suggest that these devices could be improved to help people find mental health services when they are in crisis.”
“Though opportunities for improvement abound at this very early stage of conversational agent evolution, our pioneering study foreshadows a major opportunity for this form of artificial intelligence to economically improve population health at scale,” observed co-author Arnold Milstein, MD, a professor of medicine at Stanford and director of the Stanford Clinical Excellence Research Center.
How To: Buy a Pokemon Go Plus
The paper got published in JAMA Internal Medicine on March 14, 2016.