Earlier this week, Spanish artist JC Reyes shared several fabricated nude photos of singer Rosalía to his Instagram story, prompting her to respond by criticizing Reyes and the broader culture that routinely violates women’s consent.
“Looking for clout by disrespecting and sexualizing someone is a type of violence and it’s disgusting, but to do it for four more plays is embarrassing,” Rosalía wrote in a Tuesday tweet written in Spanish, importantly pointing out that nonconsensual, sexualized deepfake images of people—including celebrities—are a form of sexual abuse.
The singer continued to hold Reyes to account on Wednesday, asserting in Spanish that women’s bodies aren’t “merchandise to use in your marketing strategy” or “public property.”
“Those photos were edited and you created a false narrative around me when I don’t even know you,” she wrote, also calling out “those who found [Reyes’ posts] funny,” and reminding users that “there’s something that exists called consent.” The singer concluded, “I hope you learn that you come from a woman and that women are sacred and you must respect us.”
Her fiancé, Rauw Alejandro, weighed in too, dismissing Reyes for starting needless “drama” and affirming that he and Rosalía sell out stadiums “because we work hard and make music, not drama.”
The photos Reyes shared appeared to be altered versions of photos Rosalía had originally shared herself. The altered photos are no longer available on his Instagram, but he discussed them in a disturbingly boastful Instagram Live session earlier this week, appearing to suggest Rosalía had sent the photos to him, and making light of her frustrations: “I can’t be posting photos of a woman who sends that to me. That would be shameless,” he said in Spanish, according to Rolling Stone. “I was just thinking about how bad she felt. It wasn’t for her to get so upset about it.”
It’s not clear how Reyes created the altered photos of Rosalía, but they nonetheless reflect a troubling, rising trend of sexualized deepfake images of celebrities—primarily women—being disseminated online. As recently as March, a new app in the Apple App Store advertised options to create AI-generated sexualized videos and photos of stars like Emma Watson and Scarlett Johansson. In January, a popular Twitch streamer inadvertently revealed that he had been viewing AI-generated porn of fellow female Twitch streamers. Back in 2018, Gal Gadot was among the earliest celebrity victims of deepfake porn.
AI-generated or otherwise fabricated sexual photos of real people—famous or not—carry chilling potential ramifications, exposing victims to traumatizing sexual harassment. While most states have adopted varying anti-cyber-exploitation laws in recent years to rein in “revenge porn” (nonconsensual nude images of individuals shared by former partners or harassers), only California, Virginia, and Texas specifically prohibit deepfake content. Even as we stare down a rapidly worsening cyber sexual exploitation crisis, it would seem all of this is a joke to Reyes and his followers.