Is this normal? Did I do the right thing? Am I okay? About two years ago, Kate, who asked to be identified only by her first name for privacy, began typing questions like these into ChatGPT.
“There’s no guidebook for being a human,” she said. “I think I’m searching for that authoritative source. These aren’t questions anyone can answer, so they never really get answered,” she added.
Kate knew she could not get the certainty she wanted, but she would sometimes spend up to 14 hours a day asking these kinds of questions. “You want to reaffirm it and add weight to it,” she said. “If you’re 99 percent sure, you want to get to 100, but you can’t, because that’s not how it works.”
This urge to ask for reassurance over and over — compulsive reassurance-seeking — is common in people with anxiety and obsessive-compulsive disorder. We all need a little affirmation, but compulsive reassurance-seeking is different: according to Andrea Kulberg, a psychologist who has been licensed for 25 years, it is an attempt to reach a certainty that does not exist.
“People do it because it offers the fantasy of certainty,” Kulberg said. By researching online or putting questions to a chatbot, she explains, they are trying to convince themselves that the bad thing will not happen. The reassurance provides temporary relief, but over time it builds a dependence on needing that reassurance — and that, Kulberg says, can increase anxiety.
There are many channels people use to seek reassurance: books, forums, Google, friends and family. But unlike AI chatbots, those other resources do not prompt the user to keep going. This is one of the features that can make AI chatbots a perfect storm for people with OCD and anxiety disorders. “It never gives you a complete answer,” Kate says. “It always says, ‘Do you want me to do this next?’ It never lets you feel finished.”
“It’s a huge rabbit hole for me,” said Shannon, who can spend more than 10 hours a day seeking reassurance from AI chatbots. (Shannon also asked to be identified only by her first name.) She keeps several chats active at once. “I know it’s not healthy. I try to avoid it, but I still get sucked in,” she said. “I’ll just think of something, and I’ll go ask the AI about it.”
Sometimes, when Kate has been questioning ChatGPT about a single topic for hours, the chatbot tells her there is nothing more to say on the matter. “I don’t think most people ever get to that point,” she said. Short of her phone battery dying, there is little that gives her a rest.