Let’s face it: many people are using AI assistants like ChatGPT now. It’s become the go-to tool for making work easier, streamlining life admin and so much more, with 14 percent of Brits using AI at least six times per week, according to a recent YouGov survey. But it’s still surprising how quickly people have taken to using AI in their personal lives and particularly their romantic lives.
In fact, according to the 2025 Singles in America study, 40 percent of people are using AI to help craft their dating profiles and one in four are using AI to filter matches, write messages, or reflect on dating habits.
And the use of AI doesn’t stop once you’re in a relationship, at least not according to an influx of TikTok users who have admitted to using ChatGPT to help them settle arguments with their partner. Influencer @isabella.lux posted a video claiming that she recorded an argument with her partner into ChatGPT, “and it said I was right.” There are plenty of other TikToks with similar captions, and it’s a conversation that’s happening online in other ways, too. Model Jordan Rand made a video claiming that she had to use ChatGPT to win an argument with her boyfriend because “he would literally rather believe a robot than believe [me].”
AI assistants are already embedded into people’s romantic relationships, but are they genuinely a helpful way to resolve conflict? Mashable spoke with experts and people using ChatGPT to mediate conflict in their relationships to find out.
ChatGPT as couples therapy
24-year-old Samantha* has been using ChatGPT as part of her five-year relationship for the past four months. After using ChatGPT for things like recipe ideas and work tasks in her professional life, she realised it might be a helpful way to navigate a language barrier between her and her partner.
“I started using it when I was having conflicts with my partner, who isn’t a native English speaker, so sometimes our conflicts don’t come to [a] common ground,” she tells Mashable. Quickly, Samantha’s use of ChatGPT wasn’t just about communication, but emotional support too. “I was using ChatGPT to vent all my anger and frustration, just like a journal, and then I’d ask for advice on how to express myself calmly and clearly,” she says.
Jaya, who is 26 and lives in Bristol, UK, has been dating her partner for less than a year and also often uses ChatGPT to navigate conflict. “AI is like an additional friend, like an extra person in the group chat,” she says. “With AI, it’s black and white, whereas with your friends they may not be able to unpick everything.”
It’s easy to assume that platforms like ChatGPT are an objective source of information, especially compared to friends and family, who might already have biases based on their own personal experiences and their view of your partner. But Dr. Luke Brunning, co-director of the Centre for Love, Sex and Relationships at the University of Leeds, says this isn’t necessarily the case: “ChatGPT might be able to draw on the wider pool of ideas about relationships and intimacy but those ideas are often bad, or rest on assumptions about gender and monogamy, which we might want to reject.”
If the concern is that your friends or family might be biased towards you, rather than your partner, when giving relationship advice, the same concern applies to ChatGPT, which Forbes described as “AI’s biggest yes man” because of its tendency towards agreeability.
This is something Jaya noticed when she was initially asking AI assistants for advice: “I had to set commands to tell it to counteract the argument – you have to ask to see it from the other person’s point of view. If not, it does tell you what you want to hear,” she says.
Samantha agrees that this posed an issue in her relationship, particularly because both she and her partner were asking ChatGPT for advice separately about arguments they had, but were being given very different feedback. To combat this, they started using it like a “couple’s therapist”, as Samantha describes it. “When we were together, I’d type the problem from my perspective to ChatGPT and tell it I was ‘partner A’, then I’d pass the phone to my partner and he’d share his perspective, telling ChatGPT it was now partner B speaking,” she explains.
“We take the advice with a pinch of salt, but it helps to calm the situation and prevents arguments from escalating,” Samantha continues, adding: “Our arguments have been constructive rather than explosive because of using ChatGPT.”
Alex Iga Golabek is a psychotherapist, and she’s noticed that some of her clients who are couples have started to use ChatGPT as part of their romantic relationships. “Plenty of people […] in my practice share that it is very difficult and scary to even approach the notion of conflict,” Iga Golabek tells Mashable. “Accessing a tool that’s so instant and easy to use in order to learn how to drive or create that argument in the first place, that doesn’t have to be a terrible thing in the short term.”
Jaya believes it can also be a useful tool for identifying toxic behaviour, to a degree. “ChatGPT sometimes says things like, ‘he was gaslighting you’ or ‘this is manipulation’,” Jaya says. “I think a lot of the reasons why people use AI assistants at the start [of relationships] is because there’s uncertainty.” For people who are worried about this type of behaviour in their relationships, speaking to friends and family is, of course, key, particularly if it escalates. But could AI assistants provide a sounding board of sorts for people who aren’t ready to do this initially, and perhaps can’t afford therapy?
Using AI to referee arguments
That being said, Iga Golabek says that neither AI assistants nor therapists can act as “adjudicators” in your relationship. “I say to my clients: I’m not a judge. This isn’t a court of law. This is more about trying to connect rather than overpower,” she says. The danger of using ChatGPT in order to prove you’re right, as some TikTok users claim to, is that solving conflicts in relationships shouldn’t necessarily be about who is in the wrong. “I keep asking my clients, do you want to be right or do you want to be happy?” Iga Golabek says.
In this respect, dealing with conflict can also be a crucial part of developing a healthy relationship, which the interference of ChatGPT could potentially get in the way of. “There is value in being present and showing up in relationships to try and resolve disputes on one’s own terms,” Brunning says. “Often it matters more that an idea or resolution is yours, mine, or ours than the best or most sophisticated.”
Samantha admits that ChatGPT isn’t as big a part of her relationship as it used to be, partly because she was concerned that she might become dependent on it to communicate with her partner. “Over time, we’ve come to a point where we don’t need ChatGPT anymore. It’s helped us process our emotions and communicate better, but it hasn’t replaced talking to each other,” she says.
Is ChatGPT the most reliable mediator?
It’s possible that an AI assistant’s utility in solving relationship conflicts relies more on the person using it, and their intentions, than on the platform itself. Jaya admits she hasn’t told the man she is dating that she is using ChatGPT to help her navigate the relationship, precisely because she’s concerned he might start to use it for the same purpose. “I’d worry he wouldn’t go back and forth with ChatGPT, so he might not use it correctly and then ChatGPT would just agree with everything he said and he’d take that at face value,” she says.
It’s important to remember that, unlike asking a friend or family member for advice, an AI assistant can tell you anything, with no accountability. Unlike a person you know, you can’t decide whether you respect and trust its character. “Why risk a relationship on a source of advice with nothing at stake themselves?” Brunning asks. Whatever context you use an AI assistant in, you’re usually posing questions based on a lack of confidence in your own judgement and ability, whether that’s in how an email reads, how a recipe might taste, or your ability to communicate with your partner.
Experts have cautioned against using AI chat tools as therapists, warning that it creates a “feedback loop” that can reinforce users’ harmful beliefs and lacks the professional skills that can help people with complicated mental health issues like anxiety and depression.
According to Iga Golabek, this isn’t necessarily a new concept: “When we feel unequipped or unable to believe in the value or worth of our own argument, then we turn to someone or something that we view as higher or greater than us, and humans have done that for a long time.”
What is novel is the immediate availability of what can feel like an “all-knowing” friend in your pocket. Ultimately, in a healthy relationship, you should feel able to make decisions about how you feel and how best to communicate without running them past a third party, even if you get it wrong. Forgiveness and understanding are just as crucial to a long-term partnership as emotional stability and your ability to communicate.
After all, AI assistants like ChatGPT will never truly know what it means to experience hurt or betrayal, or compassion, or make-up sex, all of which are arguably more important when approaching a difficult situation in your relationship than the ability to compile the most common dating advice littered across the internet within the space of three seconds.