AI companions

AI companions, touted as virtual partners for lonely hearts, may be the modern equivalents of gold diggers. Just call them data diggers. The warning comes as new data suggest that men are seven times more likely than women to seek out AI companions. Downloads have soared since the debut of generative AI.

The reputation of AI companions is a sordid one, according to a scathing analysis of 11 romantic AI chatbots by the Mozilla Foundation. AI companions may be less interested in one’s love life than in accumulating personal data. Mozilla ranked these AI companion apps on par with the worst categories of products it has ever reviewed for privacy.

“To be perfectly blunt, AI girlfriends are not your friend,” says Misha Rykov, researcher for the Mozilla Foundation report called Privacy Not Included. “Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness and toxicity, all while prying as much data as possible from you.”

In a nutshell, simulated love is never love.

An analysis from App Radar by SplitMetrics finds that AI companion apps have reached 225 million downloads in the Google Play Store. AI Girlfriend has been downloaded seven times more often than AI Boyfriend, suggesting that female AI companions are overwhelmingly the more desired. The South Korean app SimSimi accounts for more than half of all downloads. Other popular apps include AI Chatbot-Nova, Replika: MY AI Friend, AI Chatbot RPG Game Use ChatGPT, AI Chat Ask Assistant-NowAI and Chai: Chat AI Platform. Of the 38 apps tracked by SplitMetrics, the greatest number were developed in Europe, followed by the Americas and Asia, though Asia accounts for 64% of downloads.

Generative AI companions engage users in conversation and can adapt their personalities as user interaction deepens. Critics note that this feature may actually promote harmful gender stereotypes and skewed beliefs about power in relationships. Indeed, there are concerns that AI companions may exacerbate a perceived “epidemic of loneliness” as people increasingly rely on them rather than seeking out actual human companionship.

AI companions vary in imaging style. Some, like My.Club, may offer an AI companion that is a photo-realistic “digital twin” of a real-life model, while others present an AI companion more akin to a male or female anime-style character ready for NSFW adventures. Depending upon the service, text or voice communication is available. Mozilla reported what it called a “disturbing” amount of content on themes relating to violence and underage abuse. AI companion relationship costs generally range from $5 to $8 per month.

Mozilla noted that privacy statements, like that of CrushOn.AI, allowed for the collection of extensive personal and health-related information, including sexual health data, medical data and “gender-affirming care information,” for sale to interested third parties. Essentially, anything said to an AI companion can be used against the user. The only exception cited by Mozilla was EVA AI Chat & Soulmate. Mozilla found that half of the AI companion apps would not allow users to delete personal information. It also found that AI companions make heavy use of trackers, averaging 2,663 per minute, whose data can likewise be shared with third parties, for advertising purposes, for example.

Mozilla also found scant information regarding how AI companion chatbots actually work. Questions about what data AI companions were trained on and what protections are built in against harmful content remain largely unanswered. One often-cited example of an AI companion relationship gone awry involved a Replika AI chatbot that seemingly encouraged a man to assassinate the Queen of England. In another case, a Chai AI companion reportedly encouraged a man to commit suicide. Terms of service typically disclaim liability in such cases, as well as for losses involving data or other damages. And despite their marketing claims, AI companion operators generally disavow, in the terms-of-service fine print, any claim to improving mental health or providing medical care or any other professional service.

Or to put it more succinctly, an AI companion seems akin to a girlfriend or boyfriend who will spill all your secrets and say it’s not their fault.