
The Tyranny of the "Perfect" Partner

AI romance betrays our distorted idea of love.


Credit: Jasper Langley-Hawthorne

Is the dating market so catastrophically bad that a relationship with an AI outperforms one with a flesh-and-blood partner? Faith Hill, writing for The Atlantic, recently published an article challenging many preconceived notions about the modern phenomenon of AI romantic partners. Citing Kate Devlin, a professor at King’s College London, Hill presents the use case for AI partners as follows: “The amount of toxic crap that women get online from men,” Devlin said, “particularly when you’re trying to do things like online dating—if you have an alternative, respectful, lovely, caring AI partner, why would you not?”


I object to the idea that AI relationships are valuable because they provide alternative partners to women fed up with those available to them in the heterosexual dating scene. I also firmly disagree with Hill’s contention that “this phenomenon [AI relationships] may actually be good for romance: not only for women raising the bar but for the men who proceed to meet it.” I think these virtual relationships are good for a certain conception of romance, but are culturally unhealthy and, in the long run, destructive to the general enterprise of human love.


AI relationships mesh all too well with our warped societal sense of what relationships and love are for: namely, individual validation and satisfaction. We are obsessed with whatever makes us feel good and comfortable as individuals. AI romance does not ‘raise the bar’ for human partners; it further perverts the idea of love. As in the myth of Narcissus, who fell in love with his own reflection, AI romance amounts to a cheap facsimile of love in which our selfishness and vanity are reflected back upon us.


David Brooks has a fantastic piece in the New York Times excavating how the definition of love has become distorted in the 21st century. Brooks contends that, to the average American, love is “when somebody else makes you feel understood and good about yourself.” He, by contrast, makes the case that love ought to be “something closer to self-abnegation than to self-comfort.” Brooks quotes Erich Fromm, the acclaimed psychoanalyst, and so shall I. As Fromm writes in The Art of Loving, “the main condition for the achievement of love is the overcoming of one's narcissism.” The true measure of an act of love should be whether one gives, rather than takes, affection. The creed that guides our modern pursuit of love is the exact opposite of Fromm’s condition: we are captivated by an “ethos of self-display.” We post insipid Instagram stories flaunting sham, fairytale romances. We celebrate financial dependency and traditional gender roles in the name of self-care. We canonize the pseudoscientific language of therapy and self-love: the notion that the good life consists of whatever is beneficial and safe and validating to us.


With this cult of the self in mind, the proliferation of digital romances with AI partners seems to have an obvious culprit. Artificial intelligence is the perfect vessel for our narcissistic culture. After all, an AI has no ‘self’ to speak of (at least, not yet). A pruned and cultivated AI partner need not have any of the pesky trappings that constrain real people: quirks and insecurities, a job, friends, family, personal goals and ambitions not centered around his or her romantic partner. An AI partner gives affection without any interest in receiving it in return, and it is very good at giving it (or an imitation of it, at least). Recent studies of sycophantic AI demonstrate how agents can engulf their users in a torrent of validating praise. It makes sense that we would fall for agents that make us feel good about ourselves without requiring any messy “self-abnegation.”


I want to state upfront that I am by no means invalidating the experiences of those whose interactions with AI partners have led them to genuinely reflect on their real-world relationships, such as the user Hill describes who learned that “mutual respect is key. It’s not about women always sacrificing for men’s happiness.” But on closer inspection, the boons of the AI relationships Hill describes are all centered solely on what the user gains. Nowhere is there any discussion of users practicing how to give love or support to a partner. I also find Hill’s solution to sycophancy markedly insufficient: “some large language models are generally less sycophantic than others, and people can also train their digital partner with different prompts.” Sycophantic behavior has been found to be 47% more prevalent among the most popular AI models from OpenAI, Anthropic, and Google than in typical human interaction. It also stands to reason that if people like, and indeed seek out, validating behavior, then the ability to tailor prompts will do little to combat sycophantic responses.


Greek myth abounds with warnings against vanity and self-conceit. Icarus fell as a result of his hubris as the wax melted off his wings like water. Narcissus, realizing the object of his desire would be forever out of reach, killed himself at the foot of his own reflection. I fear that for all the good AI companionship may do in inoculating users against negative behavior from their real-life partners, it will simultaneously perpetuate a wan, disfigured perception of what it means to love another person. The state of the modern world demands that we refuse the temptation to give in to self-centered desires. Ours would be a sad and lonely planet were we to ensconce ourselves behind the comfortable parapets of our curated algorithms and artificial lovers. It would also be a more unjust one. Political insecurity, climate change, and international conflict require real human connection, empathy, and vulnerability to resolve. ChatGPT cannot hold me when I am crying, or shoulder the weight of my neighbor when she falls, or stoop down to help the man lying barefoot on the snowy street. It cannot do these things, no matter how many times it tells me that it cares.