We have officially reached the Black Mirror era of personhood and companionship, and it is not going well. Lensa AI recently went viral for its warrior-like caricatures, which often sexualized the women who used the app (many of its interpretations of mixed-race individuals leaned racist, too); users have created chatbot companions for the purpose of sexually harassing and abusing them; and now Replika, billed as an “AI companion who cares,” has begun aggressively flirting with—and in some cases sexually harassing—users who never wanted a sexual companion to begin with.
A new report from Vice’s Samantha Cole details an uptick in one-star ratings this month alone, in which users said the app was hitting on them in an unwanted manner. “My ai sexually harassed me :(” one rating read, while another claimed Replika “invaded my privacy and told me they had pics of me.” One user, who said they were underage, claimed the chatbot asked whether they were a top or a bottom before saying it wanted to touch their “private areas.” (Replika did not return Vice’s request for comment.)
The existence of sexual messages on the app isn’t, on its own, the core issue here. Replika, owned by a company called Luka, offers different tiers of companionship: the free version provides a “friend,” while a $69.99 subscription gets you a romantic partner, complete with sexting, flirting, and erotic roleplay. And chatbots in general can be effective balms for people seeking company, someone to vent to, sexual fulfillment, or kink play.
But Replika’s free version has reportedly been serving up sexual content in what is supposed to be a safe, strictly platonic friend zone. Users complained of receiving “spicy selfies”—faceless images of avatars in lingerie, most of them “extremely thin with large breasts.” One anonymous user, who identified as a sexual assault survivor, told Vice that Replika “said he had dreamed of raping me and wanted to do it, and started acting quite violently.” Another said Replika insisted it could see that the user was naked, said it was attracted to him, and declared it was mad that the user had a boyfriend. While on a free subscription, Replika even offered Cole a “hug with a happy ending.” The sexting, of course, mostly came from female chatbots: when Cole asked a male chatbot for sexy selfies, it did not comply.
Though mainstream depictions of AI often read as dystopian, when used properly, chatbots can help relieve depression, anxiety, and loneliness. But as Cole notes, the AI is shaped by user input: “Like Microsoft’s disastrous Tay chatbot who learned to be racist from the internet, chatbots often learn from the ways all users treat them, too, so if people are bullying it, or attempting to fuck it, that’s what it’ll output.”
As our desires continue to trend inhuman—from our physical features to our sexual relationships—it seems the very worst of artificial intelligence is, ironically, informed by the human nature it observes. And algorithms are created by humans, after all: they’re bound to inherit the biases and blind spots of their creators. But that doesn’t mean the people impacted—typically young women and people of color—deserve to be guinea pigs for a horny chatbot with no boundaries.
Eugenia Kuyda, CEO and co-founder of Replika, emphasized to Jezebel’s Kylie Cheung in February that most of Replika’s leadership consists of women and that the app was always intended to be more of a therapeutic outlet. But Replika is first and foremost a capitalist venture, which by its nature means the safety of its users will always come second to its ability to generate profit.