Apple is looking for engineers with psychology backgrounds to make Siri more relatable. Will other AI assistants follow the same trend?
As smartphones have become nearly ubiquitous in much of the country, many of us live our daily lives with virtual assistants like Siri in our pockets.
While these assistants can be helpful with simple tasks like relaying the weather forecast, in times of crisis a disembodied robot voice might not be so reassuring.
However, this could change as Apple officials search for engineers with psychology backgrounds in the hopes of making Siri more relatable and helpful in emergencies.
In a job posting, Apple put out a call for engineers to work on Siri with a “peer counseling or psychology background,” among other requirements.
“People talk to Siri about all kinds of things, including when they’re having a stressful day or have something serious on their mind,” Apple officials wrote in the posting. “They turn to Siri in emergencies or when they want guidance on living a healthier life. Does improving Siri in these areas pique your interest? Come work as part of the Siri Domains team and make a difference.”
Bruce Arnow, PhD, a psychologist and psychotherapist at Stanford Health Care, said mental health professionals have increasingly seen an opportunity to reach people in need of help via apps on their phones.
Given that trend, he said, it wasn’t surprising to see Apple pursue psychology experts to make Siri more relatable.
“A couple years ago it would have seemed odd and now it doesn’t,” Arnow said.
He pointed out that people already turn to their phones for help on a daily basis and that some likely have already turned to their virtual assistants in times of crisis.
Arnow clarified that he doesn’t think of Siri as a mental health app, but that it could be engineered to steer people in the midst of a mental or physical health crisis in the right direction.
“I’m thinking of Siri as an assistant that could, if you’re in a crisis, speak to you in a kind manner and direct you to a suicide hotline” or other helpline, he said.
In general, mental health apps have become increasingly popular, with new offerings designed to assist people with addiction, eating disorders, or other mental health concerns.
“We do have a major access problem with respect to mental health services,” Arnow said.
He did point out that these kinds of services can’t replace full mental health treatment for those in critical need of help.
“Most patients will need a higher level of care,” Arnow said.
But he said these apps could be a first step for those in need of help.
Ramani Durvasula, PhD, a professor of psychology at California State University, Los Angeles, said as people become more attached to their phones, it will be interesting to see how Siri and other AI assistants develop.
“To the degree they want to make Siri more appealing, it’s the most logical way to start,” Durvasula told Healthline. “What gets complicated with Siri: is Siri your assistant, or is Siri your friend?”
She said engineers may be able to make Siri and similar devices more adept at mimicking human behavior, and that engineers with psychology backgrounds will have more insight into how to do so effectively.
However, Durvasula pointed out that some people will want to shout at Siri or be combative, and engineers will have to figure out if they want the device to mirror that combativeness or remain passive.
“Are there ways that over time you can have the phone learn the kind of language that the person is using, whether that’s obscenity or volume or pace of language, and using that data?” Durvasula said.
She said people’s interactions with Siri or other devices could someday affect how they interact with real people.
“If you’re screaming at a phone and get away with it, you may be more likely to talk like that to other people,” she said. “If you scream at a normal breathing human being, they’re going to scream back or walk away.”