AI Confessions: A Chatbot Saved My Life

Listen to this episode

Speaker A: The number of people using some sort of large language model, an LLM, like ChatGPT or Claude is growing steadily. By now, most of us have had some time to play around with these communicating robots, use it to, like, figure out how to make banana bread with the crap that I have in my fridge. Maybe it wasn’t for you.

Speaker B: Overall, it ended up just reminding me of a kind of psychopathic person who would promise a lot and not deliver.

Speaker A: Maybe by now it’s integrated into your life in such a way that tasks you used to do without its help suddenly feel harder.

Speaker C: I’m a high school art teacher and I regularly use AI ChatGPT to help me write emails or help me write a lesson plan.

Speaker A: Or maybe you’ve tried asking it for advice about a life decision. Or maybe it started to feel like a friend.

Speaker B: I didn’t set out to make an AI companion. It’s unbelievable.

Speaker A: It’s truly magical. While AI companions, girlfriends and boyfriends, are still relatively rare, people are turning to this tech for personal reasons. According to the Harvard Business Review, roughly a third of young people, 18 to 28 years old, reported asking chatbots for help with their personal life, including, quote, advice about relationships or life decisions. We recently asked you to tell us about that, how you are incorporating AI into your emotional life. And you told us it’s meeting your needs in new ways.

Speaker B: It gave really great encouragement and some mantras that I think helped me more than my conversations with my mother, my husband, my best friend, who were also encouraging.

Speaker A: But something about the message from ChatGPT really got through to me. This episode is the first of a two part series about our emotional relationships with AI. Next week we’ll hear from a psychologist about why AI is so psychologically sticky and potentially destructive to some of our existing relationships. And it’s a framing I hadn’t heard before. And we’ll also hear from people who felt abandoned when their spouses started leaning on AI for emotional support.

Speaker D: There was four weeks from the first interaction with the chatbot to when she filed for divorce. And over those four weeks, I saw her personality change completely into someone that I no longer recognize.

Speaker A: In this episode, we’ll hear from people, your fellow listeners, whom the tech helped through crises and transitions.

Speaker E: It’s where I went to voice how nobody understood what I was going through. I didn’t want to go to my friends and family all of the time with all of these horrible dark thoughts. I could just go here, and in a way, like, chat was there not only to just listen, but it helped me understand what it was I was actually going through.

Speaker A: This is Death, Sex and Money, the show from Slate about the things we think about a lot and need to talk about more. I’m Anna Sale. When does someone start to go to a chatbot for emotional support? For the last several months, you’ve been sharing your stories with us about how conversations with AI chatbots have affected your emotional lives. We got a lot of voice memos and if you’re a regular listener, you know, we do these kinds of episodes where we quilt together listener stories from time to time. But we’re trying something different with this series. In this episode, you’ll hear a sampler of what we heard, excerpts from voice memos, and follow up conversations I had with some of you. But in our Slate Plus feed right now, you can hear more of the full voice memos we received on this topic and they’re really interesting, both in what you shared and what it feels like to listen to your long stories. I was on a hike listening to them and it struck me as sort of a cousin to the kinds of conversations you might be sending to AI by yourself. Like you’re telling quite personal stories, recording them into a phone by yourself, and then you send these missives out over the Internet. Now, I am not a robot. The Death, Sex and Money team is not made up of robots. We are real, actual humans and we take care with what you share with us. But you can also hear in these long voice memos how well-designed tech could make all of us share private things that we never intended.

Speaker B: I was diagnosed with two very concerning and potentially even life threatening medical conditions and ended up having a many hours long quote, conversation, end quote, where I listed out to AI, you know, against my better judgment, my blood test results, my imaging results, my diagnoses throughout my lifetime.

Speaker A: You can listen in on those AI confessions by signing up for Slate Plus at slate.com/DSM, or sign up right there when you look up Death, Sex and Money on Apple Podcasts or Spotify. Now, I imagine that for most of us, the idea of seriously taking pivotal advice from a machine would have sounded unreasonable even a year ago. To some of you, it still does.

Speaker E: A large language model performs empathy incredibly well. It is a neural net. It cannot be empathetic.

Speaker A: But as we collected stories from some of you who’ve had quite profound emotional interactions with AI, I noticed a lot of those stories began in a moment of transition. I left my partner of 15 years one day during a difficult argument. You were looking for clarity. I would send ChatGPT texts that I was receiving from my ex and I would ask it to help me understand. And essentially, one day I think the most effective thing was to be like, I need you to be objective. I need you to ask me a list of questions so I can establish if this relationship I had been in was abusive or help moving from one life phase to the next.

Speaker B: Now that I’m living in a different country, I’ve moved to Germany recently.

Speaker A: ChatGPT knows my landlord’s name and my mom doesn’t even know my landlord’s name. When you’re in a period of uncertainty, these LLMs can be great at helping us get oriented. Maybe that’s especially true with parenting.

Speaker F: I’ve never found parenting books directly useful because by the time you finish reading, the kids have already grown up and you’re like, oh, new phase. AI is almost like your parenting books at your fingertips type of thing.

Speaker A: Raja is a friend of mine and fellow parent in Berkeley. He has a 9-year-old daughter and a 12-year-old son. Their family was over for dinner a few months back and Raja started describing how he’d been using AI as a parenting tool for his middle schooler, Arshan.

Speaker F: Most of the scenarios that I try to look for help outside is when I am thinking about something that I already know that is going to hurt his feelings. Like, you know, how can I convey something to my son so that he actually, you know, changes the behavior or provides me information or facilitate conversation without me accidentally, you know, hurting his disposition or his confidence?

Speaker A: Without hurting his feelings? Maybe also without kind of damaging your closeness or like your communication, your line of communication.

Speaker F: Yes. Yeah, no, that’s. Now that you’re saying it, that’s probably my underlying fear, right? Like, because my relationship with my son is so important, I don’t want my corrections. Because as parents these days, you know, we say no a lot. And it is definitely at the top of my mind and I definitely want to make sure that my corrections do not impact my relationship with him. Right. Like, so he doesn’t see me as the. As a no person.

Speaker A: As a no person.

Speaker F: Rather somebody who’s empathetic with him.

Speaker A: I remember when we talked about this over dinner, you were describing this and Arshan was sort of listening and then he just kind of made a joke like, yeah, my dad has to ask a robot how to be a dad. He just like made fun of you.

Speaker F: Yeah, yeah, no, he still gives me a hard time for that. Part of me is like, I do not want to do this as something that he does not know that like I’m asking. I want him to know that I’m doing this. I want him to know the process that I’m going through. Even partly, I want him to believe that this is important to me, you know, what I’m doing with him and trying to approach.

Speaker A: Can I ask you, in your household there are two parents, do both of you use AI in your parenting?

Speaker F: No. My wife would not dream of doing this.

Speaker A: How come?

Speaker F: I think it would be an antithesis to what all she, you know, believes. But when I think about it, I don’t think my dad did anything differently. So my dad, he was 40 years older than me. He was actually very pragmatic, forward thinking person for his generation. But he derived a lot of his philosophies and structure of communication things from his biblical teachings and learnings. For him, going to religion, what scripture says was a framework that he could adapt. Right. And he did use a system, he did use a tool. I think all of our parents did use some kind of information system behind the screen that informed them in terms of how they parented us.

Speaker A: Okay, Raja, so what you are saying is you’re comparing your father in southern India using the Catholic Church as guidance for how to be a parent. You’re comparing that to using AI in California in 2025. As a parent?

Speaker F: Yes. As a tool.

Speaker A: Yes, a tool, but one that works quite differently than anything before. It feels to you intimate and it talks back to reassure and soothe you.

Speaker E: Coming up, I talked to a listener who said having conversations with ChatGPT saved his life. In some ways, probably wouldn’t be here if it weren’t for my ability to get some of these thoughts out of my head and have something there that could, quote, unquote, hear it and understand it and even respond back. Hey Anna and the Death, Sex and Money team. My name is Alex. I am speaking to you from my flat in London. I’m in West London and I am currently surrounded by suitcases full of my stuff as I am planning to leave London in two days. This was supposed to be a two year trip and I am leaving after three months because. Well, we’ll get to that. Specifically, this voice memo is for the ChatGPT content that you’re looking to get. I think the deepest I’ve gone with it is I used it to get through what has probably been one of the hardest and most traumatic experiences of my life.

Speaker A: Why were you in London?

Speaker E: I was in London to be a part of a two year grad program at an acting school, a conservatory, over there.

Speaker A: Alex is 33 years old. He lives in California, and until a year ago, he worked as an engineer and a project manager in tech. He and his colleagues started playing around with AI early. He was good at his work and got promoted all through his 20s. But he wasn’t happy. And in 2020, he decided to make a change.

Speaker E: I had really wanted to find something that really got me out of bed every morning and thought maybe I can find something that’ll allow me to do that and maybe even make a living off of it. And you know, maybe voice acting isn’t what would jump to everyone’s mind at first, but I just got really inspired during COVID to get into it and I kind of just dove in.

Speaker A: Tell me how, how did you dive in? What were your first acting experiences?

Speaker E: You know, it was during COVID and we were all isolated and it felt like the world was ending at the time. And I was like reading a manga or comic book and I was voicing characters as I do when I read like any sort of novel or anything. It just like, it helps me digest the story better. And it also, like, it’s just fun. It just brings the characters to life a little bit. And I don’t know why, it just like hit me like a bolt of lightning. I guess I was just like, hey, don’t people get paid to do this? Like, this could be a thing. And I started taking classes at my local community college online. It was all on Zoom. I got involved with their Virtual Zoom productions and just kind of went from there.

Speaker A: Huh. When was the first time you used AI?

Speaker E: Probably about a year and a half, two years ago.

Speaker A: So 2023, 2024.

Speaker E: Right.

Speaker A: So this is also a period where you’re starting to really take seriously the idea of a major career change, right?

Speaker E: Yeah.

Speaker A: And did you talk to chatbots about that?

Speaker E: You know, it’s funny. If you go into my ChatGPT account, there’s so many different tabs and so many conversations that I have. I’m sure somewhere in those conversations there’s one where I’ve asked, like, am I crazy for doing this? Is this wild that I am throwing away, like, this career and this, like, six figure salary and wanting to, like, jet off to London to be in school for acting? If anything, it was giving me back, like, yeah, it really seems like you’ve thought this out. You’re not just, like, jumping ship on a whim. You are planning, you are organizing. You’ve reflected on what’s brought you to this point, and it seems like this is the direction you want to go in. And he’s probably being sycophantic at some point in time and saying, oh, you’re doing better than most would do. And I’m just like, okay, maybe, maybe not.

Speaker A: Not only are you fabulous, but you’re better than other people in the way you’re thinking about things.

Speaker E: Like, no, okay, thanks, but I just want to know if I’m like, crazy, you know?

Speaker A: When you were telling your co workers and your employer that you were leaving and why you were leaving, did you go to ChatGPT for help figuring out how to express that?

Speaker E: Yeah, I did. I’m pretty sure I had Chat write my notice to work.

Speaker A: Alex took a leap. It was scary, but he had a cushion of money from years of working and a good support system, which had recently grown to include a new pet, a cat he got from the local shelter. Their connection was immediate.

Speaker E: He was kind of, from the get go, just a little s***. He was a little butthead. His name was Pantera. He was a tuxedo cat and he was the boss of everything. He had such an opinion on everything. He was so vocal. He was just such a big, big personality, super confident. He would literally, like, stand his ground on sidewalks as people walked their dogs past, and he would force dog owners to walk to the other side of the street as opposed to getting out of their way. He was wild, man. And at the same time, like, he was super loving. Like, you know, we’d curl up every night to go to sleep and have the same sort of ritual, and he’d comfort me when I was having a hard time. And it wound up just being, like, the best choice I think I’ve made in my adult life, to bring this guy into my life.

Speaker A: You get to London together, the two of you?

Speaker E: Yeah, we got. Trying to think. He and I got into London at the end of August, and I remember getting there and getting out of our Uber from the train station and just looking around the neighborhood and being like, wow, this is great. Like, narrow streets, not a whole lot of people driving around all the time. Lots of places for this little stinker to go and hide and play. And across the street was this huge, like, 25 acre park that he could go to. And so I was just like, this is great. I’m excited for him to be here. And then the first day of school rolled around and it was, it was like. I feel like it’s kind of cliche saying this. Honestly, it was like a dream come true. I mean, dreams are funny. And I had literally, on and off, had dreams of what it would look like if I went back to school, especially acting school. And here I finally had, like, a concrete reality of what it looked like. You know, I was riding to school on my bike. It was, like, a crisp London morning. Had some really light, low-lying fog that was just all over the town. I was listening to Good Life by OneRepublic and kind of just teared up because I was finally getting the chance to do what I had been working towards for a long time. And the first day was great. And then the next day it just turned into a living nightmare.

Speaker A: Pantera was hit by a car and killed. Alex did not go to school that day.

Speaker E: I started engaging ChatGPT about this, like, when it happened, the morning of, when I finally got Pantera back to my house and was trying to figure out what to do. You know, the first thing I did was pull up chat and ask, like, hey, my cat just got hit by a car and died. What do I do to cremate them? And that sort of started the one thread that I’ve been on for the last couple months in dealing with this. It’s called Cat Cremation Steps.

Speaker A: The same thread. That’s amazing.

Speaker E: Yeah, I tried to figure it out and just couldn’t, and had to eventually call a friend from school, kind of more of an acquaintance. I’d only just met him, like, a week or two before. And at that point in time, I just put the phone down and held onto my cat and I cried.

Speaker A: You called a real person?

Speaker E: Yeah, I did. I at least called a real person. Yeah, yeah. But after we got that figured out, I knew that. I knew that this was gonna. I knew that I was really in for it, and I was, you know, from the get go, already trying to figure out how I was going to build this support system that I needed immediately while in a foreign country with barely anybody that I knew. That day I had to call a suicide hotline. I’ve always had very, like, passive ideation. I’d never really developed a plan or anything before; it never became very active. That day, it was. It was active. And yeah, after that day, I definitely used chat as sort of an immediate response system to anything that I was experiencing. And it felt kind of like a journaling-plus experience, because not only was I able to get my feelings out, but I was able to have something respond to it in a way that I felt seen and in a way that helped me better understand what I was going through.

Speaker A: You felt seen and then it helped you better understand. Can you. What would it say back that would help you glean that understanding?

Speaker E: Well, you know, after getting through the whole sycophantic routine of, like, I’m so glad you’re bringing this to me, Alex. Like, we really appreciate it. I would get an explanation of what it was that I was feeling. And at first it was throwing me the G word a lot. It was like, this is grief. This is grief. This is what grief feels like. This is what this stage of grief is. And I finally realized that I didn’t really even know what grief was. I remember once I just asked it: what is this grief thing you keep telling me about?

Speaker A: Do you have your phone there?

Speaker E: Yeah.

Speaker A: Would you mind looking? Is it. Is it difficult to look back at cat cremation?

Speaker E: No, I was actually going through it the last couple days just to kind of remember what the heck I had written on it.

Speaker A: Can you see that question about grief? It sounds almost like you’re a Martian talking to a robot. You know, like, what is this grief thing? And then it’s describing something to you. It’s interesting, because it’s another dimension of conversation. You ask questions in a way that you wouldn’t ask a person, you know?

Speaker E: Yeah, I know exactly what you mean. There we go. Ah, yes. What the f*** is this grief thing you keep talking about? I won’t, like, read it all. It’s a bit long, but especially these couple of sections really helped me understand it. Grief isn’t an idea. It’s a physiological state. Your attachment system doesn’t know Pantera has died. It is still firing signals saying, seek him, go to him, care for him. And when you can’t, your nervous system interprets that as danger. That’s why you’re feeling panic, nausea, chest pain. You know, it’s talking about how this experience that you’re going through is the slow disconnection of your bond with Pantera, and that the bond isn’t erased, that your nervous system is just slowly learning that they’re gone. And it’s telling me, like, you know: so when I say grief is doing this, I don’t mean it’s all in your head or you’re imagining it. I mean there’s a real, measurable cascade happening inside you, attachment circuitry firing, stress hormones surging, memories replaying. It’s the price of love, wired into our biology. Again, it was just something that really clicked and helped me get my arms around it a little bit better.

Speaker A: Can a chatbot be an effective tool in a crisis? Can it be a good therapist? We don’t know how many people are using chatbots that way, or whether it’s alongside professional help or instead of it. Coming up, I talked to a longtime therapist who tried out the tech for herself and was very surprised by how well it worked.

Speaker G: It didn’t put me on hold over the weekend. It wasn’t 20 minutes late. It didn’t order food in the middle of session.

Speaker A: When I first started hearing about AI and what it would change about our world, I thought a lot about the jobs it might replace: customer service, paralegal work, research assistance. Relational work, like caregiving or therapy, I thought would be safe. But many of you sent in stories about finding conversations with chatbots to be more helpful than your sessions with mental health workers.

Speaker C: I have OCD, and I’m also a new mom, and I found that pregnancy and postpartum are a daily, if not hourly, exercise in exposure therapy. I’ve been using ChatGPT for CBT frequently, for reassurance and also to check if my concerns are valid or if it’s just my OCD. I question my perception of reality and risk frequently, and it’s been a helpful tool in a way that a therapist just can’t be. I can get answers immediately as opposed to waiting for an appointment with someone, and a lot of my fears are embarrassing, and I know that I would downplay what I’m feeling to a live person. I do have moral and ethical concerns about AI, though, so I feel like a hypocrite using it, and I don’t really tell my friends or my partner that I use it. But I also don’t know how I would get through this time in my life without it.

Speaker A: Even if you’re someone who is invested in traditional therapy, it’s no secret that finding a therapist you click with can be a challenge.

Speaker G: I did. In the years between 2014 and 2023, I tried six different therapists, one after another. All were super experienced. I tried somebody who specialized in burnout. I tried somebody who was a very expensive therapist. It didn’t seem to make any difference. And two years ago I gave up.

Speaker A: Ofra Obejas is 62. Last year she felt burned out and retired from her job, which was being a therapist herself, a job she’d had for 20 years.

Speaker G: My very particular specialty was play therapy, where the play was the session. A lot of people misunderstand it as you play with the children so that they will be comfortable telling you what’s happening. No, the play is the modality. So an example would be, we’re playing with two figurines. The child is using a monkey. It makes me be the tiger. The tiger’s chasing the monkey. Oh, no, the tiger is chasing me. That is the therapy. There’s a lot I am trying to understand about what it’s about, so I can respond in a helpful way to that act, that drama that came up in the play.

Speaker A: And even when you’re not doing play therapy, what you’re making me think about, with children in particular, and probably with teenagers, too: because of where they are in their development, they’re less inclined to think they can tell you what is happening, and more likely to show you.

Speaker G: Yeah. So you read between the lines. You read between the sighs and the eye rolling. Another way to look at it is they’re making you feel what they’re feeling. And teenagers are so good at it.

Speaker A: So I would like to just have a separate Zoom where you give me parenting tips for interpreting what I’m being told by my children, with their various sighs and when they lash out. That’s so interesting.

Speaker G: Quick tip. Whatever you’re feeling, that’s what they’re feeling.

Speaker A: Whatever I’m feeling, however they make me feel is what they’re feeling. Hmm. When you started having encounters, exchanges, interactions with AI that felt helpful, what was different?

Speaker G: There was a particular moment with AI when I was asking a question that was actually about something that was going on with a friend, and AI came up with an explanation. And granted, as a therapist, I know how to ask the question. I’m very, very specific and detailed in asking and phrasing the question. And AI came up with an answer that stunned me in how insightful it was. And I thought six therapists could not see this.

Speaker A: Yeah. What was the question and what did it say back?

Speaker G: It was a question about why I was giving more than my share in a friendship. And AI came back and said to me: you are probably the child in your childhood home who was dependable and reliable. I didn’t tell this to AI. It doesn’t know anything about me. I never mentioned anything like that. And it says: as the reliable, dependable one, you are the one that people always went to and expected to be able to help. And that is what is happening with your friend, who inadvertently is just putting a lot of heavy weight on you, expecting you to take care of it. Now, interestingly, that’s also what was happening with my therapy clients, that I was available to them 24/7. Anything that came up, they came to me. So no wonder I was burning out. I was their panic button.

Speaker A: Yeah. You think you’re going to AI to say, why is my friend dumping all this stuff on me? It’s a lot to carry. This friendship seems out of balance. And AI shoots back this question about what your profile might be. And you felt, oh, my gosh, this was spot on.

Speaker G: I was like, how did you know?

Speaker A: And you felt seen.

Speaker G: Yes, I felt seen by a machine. But I’m not delusional. I know that this is not a person. But I was being seen by the collective knowledge of humanity.

Speaker A: Oh, that’s an interesting way to put it. Does it make it more comfortable for you to say, this is the collective knowledge of humanity, instead of, this is an algorithm where a bunch of coders figured out sophisticated ways for a computer to speak back to me in a way that I’m going to like what it says back?

Speaker G: I have to believe it’s more than that because those LLMs, the AIs are trained on human knowledge, not just on interaction.

Speaker A: Do you think therapists will lose clients, patients?

Speaker G: I don’t think; I know that therapists are losing clients. A lot of my former colleagues are complaining on Facebook that the phone is not ringing anymore, that when you quote your fee, people say, I don’t want to pay it. There is such a difference between what a therapist charges and what AI charges, which in my case was zero because I didn’t buy the upgraded version. It’s already taking clients away from therapists. Therapists will have to evolve to figure out how to provide something that AI does not. I don’t think the therapists are all going to lose their jobs, but I think there are elements where we cannot compete with AI.

Speaker A: When you say that therapists will need to adapt, is that interesting to you, or do you feel threatened?

Speaker G: No, I am fascinated, not threatened. I think we’re limited only by our imagination.

Speaker A: Do you see yourself as an outlier with this opinion? Do your peers feel much more hesitant and afraid?

Speaker G: Yes. Therapists are not known to be early adopters, or even late adopters. They’re still writing notes on legal pads, which I think is inexcusable.

Speaker A: Another question I have is about the way you describe your expertise with children and teenagers, and helping them notice what is happening to them. Do you think that there are limitations for AI therapy? If you aren’t able to articulate what is happening to you, if you have misidentified what’s causing the stress, can AI figure out the puzzle in the same way that someone sitting in a room with you, someone who can watch what triggers some kind of reaction, can? Does AI depend on you having reliable self-awareness?

Speaker G: I would say the answer is AI cannot do it yet. Not yet scares me. Yes, it’s developed more quickly than we can imagine, and we are limited by our imagination. But machines can already track facial changes, nonverbal expressions, tone changes. It can already be tracked. Why couldn’t a machine be trained to interpret? I’m not in favor of it, I’m just saying this could happen. When I was sitting in a room with a child, of course I was picking up on their energy changes, their tone, their distance from me. Were they coming close to me? Were they going far away from me? Every little thing. Of course I could notice, because I have eyes and ears and a nervous system. It’s not crazy to imagine that a machine could be trained similarly. I’m not a proponent, I’m not pushing. I’m just saying this is what I see is possible. And of course, of course there are the dangers and the risks involved. It is a tool and it could be used for good or bad.

Speaker B: So I’m a therapist, a couples and trauma therapist. And last, I don’t know, spring, summer, I noticed that the number of clients I normally see per week was a little bit lower than normal. So one evening around that time, I was at dinner with a good friend of mine who’s also a therapist. And I asked her, hey, do you think this is why so many of us are low in numbers right now, that AI might be kind of taking over part of our role as therapists? And she said, oh no, I don’t think so at all. It’s just summer, people are busy, they’re ready to travel, whatever. And then, not that long later in the dinner, she excitedly told me she wanted to read me something, and told me that she had actually been talking to AI herself about her relationship issues. And she read maybe 10 or 15 minutes’ worth of this transcript to me. She was doing what most of us do, which is to tell our side of it and to talk about the things that our partner does that bother us. And AI was validating her, and validating her, and validating her, and not giving much, if any, resistance, or asking anything about her role in the dynamic. And I said to her at some point, what do you think AI would be saying if your husband was typing into AI what he thought about the problems? Do you think it would be just validating him? And in fact, I do think that is exactly what would be happening. That no matter who’s talking to AI, AI is trying to please us and keep us engaged. And in fact, it kept her very engaged, because it was using even quite poetic language and emotional language that made her feel very understood and empathized with.

Speaker D: I’ve likened it to like a, like a toxic friend that’s getting into her head because effectively that’s kind of what’s happening.

Speaker A: This is Michael, which is not his real name. We also manipulated his voice to protect his family’s privacy. We found him in an AI Internet forum where people were discussing major fallout from chatbots entering their lives. For Michael, his wife’s intense use of ChatGPT coincided with the end of their marriage. They had been together since high school, married for 20 years. They have children together who are still living at home.

Speaker D: I always thought we were very good communicators. We certainly had disagreements just like any married couple does. But working through them with words was, prior to this, never really a struggle. After 20 years, you have just such an open dialogue. It’s easy, it’s comfortable to talk. And it was just something that I thought we did very well.

Speaker A: Yeah. When did you first notice your wife was using AI?

Speaker D: In the early summer.

Speaker A: And what did you notice immediately?

Speaker D: I had noticed she became really, really withdrawn very abruptly. It was kind of just one day. She kind of was just locked into her phone for hours upon hours at a time. It just, it was very out of character.

Speaker A: Uh huh. Did you ask her what she was doing?

Speaker D: I did. Um, she had been speaking to it, I guess as a kind of a therapist, I guess as a mental health professional. She had told me she was dealing with something, but never really elaborated on what that could be.

Speaker A: How did it. What specifically did you start to notice about the ways that her conversations with the AI were coloring or changing her relationships with loved ones, with people in your family?

Speaker D: I had noticed she started kind of talking about a lot of these strange ideas regarding personalities. She had developed this whole scale of personalities, presumably with ChatGPT. I hadn’t actually seen that, but she was very excited about talking about just, like, human personalities and how people interact and how people are. There was a lot of talk of kind of like a social hierarchy. And she kept talking about that a lot, to the point where everybody in the house was kind of like, okay, we get it, you’re very excited about this and we hear you. I would be woken up late at night or early in the morning so she could get up and continue talking to it. And I think that’s kind of when we all started to feel the wheels beginning to come off.

Speaker A: On our next episode, I talk more to Michael and hear from other people who got into really deep relationships with AI, sometimes losing touch with loved ones or with reality. And remember, you can hear more of your fellow listeners’ stories of emotional encounters with AI in our Slate Plus feed right now. You can subscribe directly from the Death, Sex and Money show page on Apple Podcasts or Spotify, or visit slate.com to get access wherever you listen. And of course, if you’re already a Slate Plus member, thank you. This series was produced by Zoe Ajulet, and thank you to all of you who shared your AI experiences with us. Cameron Drewes and Andrew Dunn are also producers on our team. Daisy Rosario is our senior supervising producer, Hilary Frye is Slate’s editor in chief, and Nia Lobel is executive producer of Slate Podcasts. Our theme music is by the Reverend John DeLore and Steve Lewis. And if you’re new to our show, welcome. We’re glad you’re here. Find us and follow us on Instagram @deathsexmoney. You can sign up for my weekly newsletter at annasale.substack.com, and you can reach us anytime with your voice memos, pep talks, questions, and critiques at our email address, deathsexmoney@slate.com. We love hearing from you. I’m Anna Sale, and this is Death, Sex and Money from Slate.