I've been wanting to write about this very subject but couldn't put it into words, you absolutely nailed it!! I'm so alarmed by people using AI for therapy and these are the exact reasons. There was even a case not long ago where a young boy ended his life after a Khaleesi chatbot encouraged him to go through with it, and in the screenshot you can see he had been talking to many AI therapists as well. He was in crisis but the AI doesn't know how to recognize that and provide actual support. :(
I agree. AI chatbots are so easily available, and the fact that their audience tends to be more vulnerable folks who lack access to mental health services makes it all the scarier.
seeing people use AI as a substitute for therapy is so shocking to me since AI just does not think on its own. like you said, AI responds based on what you tell it, not with a unique response that will actually challenge you the way therapists do. i know a lot of people recognize that and instead use chatbots for simple ranting or motivation, but there are still just as many people who believe that AI is actually an adequate replacement for therapists. i really hope it becomes more widely known that AI chatbots are not therapy, because if not, we're going to see more cases of suicide and self-harm as a result of these chatbots.
Actually one of the most interesting articles I have read so far
You make such great points here. I never really thought about this.
I'm someone who uses AI for self-reflection.
One thing I do to avoid it leaning too much in my favor is ask it to be blunt and honest with me.
I tell it I want an impartial answer—something it would say even if the other person asked the exact same question.
I've also saved in memory that I want tough love, not coddling—and that has seriously helped me see my blind spots.