
A patient in therapy has been urged to report their therapist after he accidentally revealed that he had been “cherry-picking” responses from ChatGPT.
The patient and original poster (OP), user TomorrowFutureFate, recounted the incident on Reddit, explaining that during a recent video-call session the connection was so bad that they suggested switching to audio-only.
Instead of turning the video off, however, the therapist accidentally shared his screen with the OP, revealing his web browser.

The OP wrote, “He had several tabs open with Google searches related to what I had previously mentioned, which I think is fine (e.g., I was mentioning I had seen a movie over the weekend, and he had googled that movie).
“However, he also had ChatGPT open, and to varying degrees was inputting what I was saying into ChatGPT using first person and then summarizing and cherry-picking things from its response.
“This led to a very surreal session in which, out of sheer shock, I also ended up basically cribbing from ChatGPT in my responses.”
‘I don’t think this is an isolated incident’
They continued, “For example, I’d say something, he would type it into ChatGPT, it would return a result, like a summary of ‘Cognitive Flexibility’, and then because I could see his screen, I would say something like ‘I guess I could be more flexible…’ and he’d say, ‘Yes! Exactly!’
“I don’t think this is an isolated incident, either, because I could see some of his ChatGPT history at the start of the session and could see that he had been asking it questions about depersonalization, which I can only assume would have to do with the patient before me.”
In a message to Newsweek, the OP said they were “reluctant” to bring the situation up with the therapist, since they feel like they would be challenging not only an expert, but also a person with whom they have had a meaningful relationship over the past year.
“I think more than anything else, I found it surreal, like something out of an episode of Black Mirror,” they said. “But people I’ve told this story to in real life have reacted with a mixture of disbelief and horror, so I suspect I am underreacting.”
The OP added that they are “leaning” toward parting ways with the therapist, and that it might be hard to trust the advice of a therapist for a while.
“One of the things that I can’t get over is that he was using the free version of ChatGPT,” they wrote. “I pay $50 a session.
“If he’s really going to try to outsource my therapy to AI, certainly, some of that funding could go towards the $20/month ChatGPT Plus subscription so I can at least get a cutting-edge AI therapist.”
Newsweek has reached out to online therapy services to ask about the use of AI.