A new article in The Wall Street Journal hypothesizes that Alexa, and voice-assistant devices in general, are being used for much more than search results. The WSJ article notes that Alexa's developers have discovered users treating their devices less like tools and more like relationships.
The WSJ article opens with the following statement:
Amazon has sold more than 15 million Echo devices and now owns 75% of the smart-speaker market, according to estimates by Consumer Intelligence Research Partners, which puts this company on the front lines of what might be called early-stage AI therapy, in which a device is asked to respond to extremely personal questions and requests by its users.
The idea is that people see more than a Google search bar for information; they are looking for knowledge and even a change of life. Later in the article, Toni Reid, the vice president of Alexa experience and Echo devices, and Rohit Prasad, the vice president and head scientist for Alexa, describe how people want to actually have conversations with Alexa. Instead of gathering information for ourselves, we want to get to know Alexa.
That’s all well and good, but the technologists also discovered that in this relationship, people want to share themselves with their Alexa devices. Very personal statements were being shared, including “Alexa, I’m being abused,” “Alexa, I’m having a heart attack,” and “Alexa, I’m thinking about suicide.”
Two Environments, Two Responses
Amazon’s response to Alexa hearing that a customer is being abused is different from what happens when a pastor or counselor hears it. Toni Reid states in the article, “We wouldn’t take action on behalf of the customer.” The device will, however, give the phone number for the National Suicide Hotline when someone says they are suicidal; again, it does not make the phone call for them.
We have to remember that Amazon’s goal is to make a profit. Contacting the suicide prevention hotline or calling the police without a user’s prompt is a legal, moral, and personal mess with no financial incentive. It is simply not part of Amazon’s goal.
As pastors and counselors, we are all considered mandatory reporters. That means we do not get an option; in the United States we are required by law to call the police or crisis services.
So, if you are looking for cheap AA batteries or two-day shipping, I know the company for you. If you or someone you know is looking for companionship or needs to work through some very serious concerns, we suggest going to, or referring that person to, church leadership or a counseling center instead of an AI device.
What are your thoughts on people who treat AI devices as if they were human?