The ‘Mommy Goddess’ and the Mass Reporting of Sex Workers on Instagram

Anti-sex activists are picking off sex workers on Instagram—and Instagram is letting them.

Near the end of the 2023 Netflix documentary Money Shot, Mike Stabile of the Free Speech Coalition explains, “What people don’t realize is porn is traditionally the canary in the coal mine of free speech.”

Calling sex workers “the canary in the coal mine” is almost a cliché at this point. I myself have used the metaphor in my own writing to discuss algorithmic surveillance of sex workers. But here is where the metaphor falls apart: The canary is only effective if the miner listens to it. Put plainly, due to societal stigma against sex workers, the general population is more likely to perceive our silence as making their world safer, not less so. This is precisely why our accounts are such an ideal testing ground for Big Tech’s encroachment into everyone’s private lives.

What might be a more accurate metaphor is that of Cassandra, the Trojan priestess in Greek mythology, blessed by the god Apollo with the gift of prophecy but, as punishment for refusing his sexual advances, cursed never to be believed. Then again, this particular case has a mythology of its own. The prophetess in our story is the “Mommy Goddess” Gwen Adora.

The psychological whiplash of suspension

Money Shot: The Pornhub Story premiered on Netflix on March 15 after weeks of optimistic anticipation from sex workers. The documentary narrates the ongoing battle between sex workers and anti-sex activists, who tend to focus their efforts on trying to get pornography and adult content eradicated from the internet. Though they claim to fight for sexual abuse victims, their go-to messaging incorrectly conflates sex work with human trafficking and argues that sex workers are handmaidens of cisheteropatriarchal violence. “It’s pushing this far-right Christian mandate under this guise of liberal, save-the-women/save-the-children language, and it’s been very effective,” porn industry historian Noelle Perdue says in Money Shot.

For many anti-sex activists, Pornhub is Enemy #1. As such, Money Shot zeroes in on MindGeek, Pornhub’s parent company, as its opponents craft a flurry of campaigns to pressure media, state officials, and banks to restrict online pornography into obscurity. This anti-sex lobby largely operates through a handful of religious-right nonprofits with names ranging from biblical (Exodus Cry), to semi-official (Morality in Media, rebranded as the National Center on Sexual Exploitation, or NCOSE), to quasi-clever (Traffickinghub). Their most recent success was last summer, when Visa and Mastercard suspended payments to MindGeek’s advertising arm after years of aggressive lobbying, which further restricted what remained of sex workers’ incomes after Visa and Mastercard prevented people from using their cards at Pornhub in 2020.

In step with the growing sex panic in the United States, mainstream media outlets like the New York Times are platforming the anti-sex lobby with increasing regularity. But sex workers are rarely included in mainstream discussions of sex work, which is one reason Money Shot director Suzanne Hillinger told me in a Zoom interview that she “set out to make a film that gave agency to sex workers” to narrate our own stories. In a sparkling line-up of porn stars and content creators, Gwen Adora is the first performer we see at work: putting on her makeup, setting up her tripod, filming.

Gwen then details some of the challenges that fat women specifically face in the sex industry: exclusion from mainstream porn and incredulity from civilians that fat women even can work in porn, with amateur adult-content creation being a kind of equalizer for performers with marginalized and under-represented bodies. Pornhub enabled Gwen to attract a mainstream audience as an amateur performer, and between Twitter and Instagram, she’d managed to build a large following on social media.

But when Gwen woke up on March 15, a dozen or so hours after Money Shot premiered in Europe, she discovered that her Instagram account had been suspended. The alert provided a less-than-helpful Q&A: “What does this mean?” “⚠️ Your account, or activity on it, doesn’t follow our Community Guidelines on adult sexual solicitation.”

“Sex worker” functions as a kind of fixed identity; even if you’re not currently “doing sex work,” society tends to perceive your presence as “a sex worker” anyway. (Consider the swaths of people, from nurses to mechanics, who lost their day jobs after coworkers found their OnlyFans pages.) While sex workers can’t use Instagram to advertise, sell, or link to their adult content, the platform is very effective in helping us generate a close-to-mainstream following alongside other artists and influencers. We are intimately familiar with various social platforms’ community guidelines, because our livelihoods depend on our adherence to them.

But to accuse someone, like Gwen was accused, of “sexual solicitation” is to accuse them of a crime—specifically of full-service sex work, more commonly understood by civilians as prostitution. “Sex work” broadly includes a range of both legalized and criminalized sexual services; the term was coined by Carol Leigh in 1978 as an act of solidarity between “legal” sex workers like Gwen and other porn performers at the top of the “whorearchy,” and heavily criminalized full-service sex workers at the bottom. (Other kinds of sex workers—strippers, sugar babies, dominatrixes—occupy the middle.) According to community guidelines for Instagram’s parent company Meta, however, its version of “sexual solicitation” includes a broad range of content that may not be related to sex work at all. Beyond prohibiting in-person services that, regardless of legality, are widely considered part of the illicit sex industry (“escort service and paid sexual fetish or domination services”), Meta’s policy extends up the whorearchy to “legal” sex work like “strip club shows,” “erotic dances,” “tantric massages,” and adult content. It also restricts a wide range of sexual expression, like “sex chat or conversation,” “audio of sexual activity,” or “commonly sexual emojis.” Content in violation of Meta’s policy must meet both criteria: offering or asking for sexual solicitation and including “suggestive elements.” Nevertheless, Gwen’s account was suspended despite doing neither.

In other words, anyone who exhibits what Meta considers to be non-normative sexual behavior may be subject to suspension—and if they don’t successfully appeal the suspension in 180 days, the account is gone for good.

To accuse someone, like Gwen was accused, of “sexual solicitation” is to accuse them of a crime.

This wasn’t the first time Instagram accused Gwen of solicitation; she told me in a phone interview that she’s never worked as an in-person sex worker and certainly hasn’t advertised as such on Instagram. Under a different name in 2018, she ran a sex blog; Instagram suspended her account for “solicitation” when she posted a (clothed) picture of herself. When Gwen started doing sex work in 2019, Instagram banned her account within a month. The first few suspensions stung but quickly became mundane: She’s since faced regular suspensions for “solicitation,” as well as “impersonation” reports from fake accounts that steal her content and then try to deplatform her real profile. Because of this, Gwen regularly checks on her account status, which until her most recent suspension had been good.

This is all par for the course as a fat adult-content creator: Sex workers disproportionately experience algorithmic bias on social media, the research collective Hacking//Hustling found in its 2020 report “Posting into the Void,” and Instagram’s content-moderation algorithms are also notoriously biased against fat people (as well as women of color and queer folks). This trigger-happy system is why Gwen had created a backup account, which Instagram also suspended that weekend, just days after she lost her primary account.

The first account’s suspension could have simply been the result of a spectacularly well-timed algorithmic sweep. But as uncannily close to Money Shot’s release date as it was, another possibility arose: that the documentary attracted attention to Gwen’s account from critics, who reported it en masse for allegedly violating Meta’s “sexual solicitation” rules until they could tip the scale against her. This kind of targeted mass reporting, also called report-bombing and brigading, would be similar to the ways NCOSE, Exodus Cry, and Traffickinghub founder Laila Mickelwait relentlessly swarm platforms to try to eliminate Pornhub from the internet, as Money Shot ironically describes. With the suspension of Gwen’s backup account soon after, this theory seemed all the more likely.

And then Money Shot director Hillinger saw her account vanish.

There’s little that can prepare you for the psychological whiplash that comes from being mass reported. When my Twitter account was mass reported and briefly suspended in 2020, I was stunned by how deeply the deplatforming cut me. Seeing strangers circle your profile like sharks and beseech others to report it into oblivion is unnerving, feeling your own existence so extinguished from your digital community is traumatizing, and rebuilding a following after a ban is both dehumanizing and difficult (which is a reason you’ll often see “DELETED AT 53K” or similar notices on our profiles). But even then, our personal and private accounts are usually spared.

When Meta told Hillinger her private Instagram had been suspended for “adult sexual solicitation”—the same reason it gave Gwen—Hillinger didn’t suspect algorithmic flagging was to blame. Her posts promoting Money Shot were relatively tame, and aside from being a queer woman, there was nothing in her personal life that might have gotten caught up by algorithmic bias. Hillinger didn’t consider mass reporting as the culprit, either—not until she posted about the suspension on Twitter and at least one sex worker chimed in to suggest it.

“When Mike said that sex workers were the canaries in the coal mine,” Hillinger told me, “I didn’t realize he was talking about me.”

The weaponization of mass reporting against sex workers

As with many machine-learning algorithms, Meta’s is largely black-boxed, meaning the mechanics behind each content-moderation decision can only be surmised by observing input and output. “What’s crucial to understand here is that unfortunately, because we’ve had so little information from platforms, a lot of what we know right now is from users’ stories, from their own gossip amongst each other,” says Dr. Carolina Are, a platform governance researcher at Northumbria University’s Centre for Digital Citizens who is leading a project on malicious flagging and deplatforming.

Researchers at the crossroads of digital policy and sex work, including Are, believe mass reporting is one of many metrics, alongside algorithmic flagging, that go into content-moderation decisions, and that each of these metrics influences the others. Scholars have found that machine-learning algorithms across the tech industry tend to learn and replicate users’ existing biases. So if users disproportionately report a certain kind of content they find objectionable, then those reports would train the algorithms to moderate similar content more severely. In a perfect world, this could be an effective content-moderation strategy, but according to Are, “the issue is that a variety of people use it as a form of feed curation because of something that they don’t like that doesn’t necessarily go against community guidelines,” like sex education or abortion-related information. Even then, the issue isn’t user reporting per se: “It’s more that this could be gamed and misused in connection with community guidelines, which are already broad and unequal.”

People are harnessing a tool…to actually do harm to others.

Are, who is also a pole-dance instructor with over 27,000 followers on Instagram, found herself in a predicament similar to Gwen’s in 2021, when TikTok abruptly suspended her account. (Her TikTok following fluctuates as a result; she’s been deplatformed four times.) Are suspects that, as with my Twitter account, her TikTok accounts have been victims of mass reporting by other users. She told me that user comments claiming her content is “inappropriate for kids” are often a warning shot before a suspension, as are comments from users threatening to get their friends to report her account.

And because of conservative community guidelines and biased algorithms, sex workers’ accounts are already vulnerable when a coordinated swarm homes in on us. Sex workers and researchers believe it is a tactic often deployed by the anti-sex lobby to deplatform sex workers, activists, and allies. In fact, Mickelwait celebrated the successful reporting of Pornhub’s Instagram account last year and, when the account was later restored, asked her followers to report it until it was suspended again. So in practice, Are told me, “people are harnessing a tool…to actually do harm to others.” At the time of publication, neither Exodus Cry, NCOSE, nor Traffickinghub has responded to requests for comment.

Our theories have been largely ignored by Meta, which claims it does “allow for the discussion of sex worker rights advocacy and sex work regulation.” But a former Meta employee, who recently left the company and only agreed to be interviewed on the condition of anonymity, confirmed that mass reporting can be weaponized by well-coordinated users to target and deplatform specific accounts, especially when algorithms are already biased against their content—which made Gwen an excellent target. They told me, “I wouldn’t put it past the [anti-sex activists] who noticed her for the first time because of the documentary to organize.” And even if mass reports made against sex workers are baseless, they explained, the sheer quantity of them further crystallizes anti-sex biases into the algorithms.

In other words, even if the intent is only to deplatform a single account, the impact of mass reporting ripples across the platform. According to the former employee, Meta knows that its algorithms—including those influenced by mass reporting—are silencing sex workers, and it has no interest in changing that. Meta has not responded to requests for comment.

Anti-sex biases spread

Sex workers aren’t at the complete mercy of algorithmic content moderation; large social media platforms also utilize humans to review reported violations. Paradoxically, social media users tend to perceive human content moderators—many of whom are reportedly tasked with reviewing violations ranging from child sexual exploitation material to graphic violence for laughably low wages under terrible conditions—as more logical than algorithms, but they have biases, too. And while they might understand the nuances between acceptable content and violations, they’re typically afforded no more than a few seconds to make a moderation decision. When they see the word “pornhub”? A violation probably seems more likely than not.

Still, general wisdom dictates that if bad actors mass report an account, an algorithm misfires, or a moderator makes a mistake, suspensions have to be based on something to stick, and any errors should be easy enough to appeal. In the case of appealing a suspension, users do tend to fare better with human moderators—if they manage to contact one. As Are told me, “A lot of people who would get deplatformed on the back of either mass reporting or algorithmic content removals would not be able to get their appeals reviewed unless they had friends in high places in platforms or unless journalists wrote articles about them,” which is how she had her TikTok account restored. I’ve also experienced first-hand how human connections can expedite the appeals process, which is how I got Twitter to restore my account in only a few hours.

A similar infrastructure for correcting content-moderation mistakes exists at Meta. “One of the little hacks that I exploited is that they have a workflow called ‘Oops,’” the former employee said, “where if you, a friend, or a family member is having problems with your account, and it’s causing you stress as an employee, you can report an ‘oops!’ where basically what you’re saying is, ‘I think that the algorithm got it wrong, so can you please have a human take a look at this enforcement action?’” Of course, the existence of Oops, which was first reported by CNBC in 2019, hasn’t helped all those who’ve had to wait weeks or sometimes months for a response about their account suspensions. And its exclusivity indicates that Meta doesn’t think of social media as a necessity alongside other forms of communication, like telephones or snail mail.

The former Meta employee referred to it as “a benefit that the general public ought to have: to be treated like a human. When you trap people into a system without any human intervention, it makes people really angry and upset at the powerlessness.”

Meta certainly has the power to protect sex workers: Celebrities regularly post nudity without consequence, but Instagram is quick to flag a sex worker in a bikini, to which Pornhub alluded in a letter to Meta after losing its Instagram account last year. And while Meta does not consult sex workers on adult-content policies, NCOSE has boasted of “several” meetings with Instagram in fundraising materials, XBIZ reported.

There are allies among individual employees, the former Meta employee told me, but the company culture reflects “a whole different, much harsher viewpoint” on sexual expression. This lines up with Instagram’s guidelines, which were updated in 2018 to ban “sexual solicitation”; posting sexual content directly to Instagram was already prohibited, but this policy cracked down on speech related to sex work, too. Many sex workers interpreted it as a targeted move, which Instagram said “isn’t true.” In addition to pressure from the anti-sex lobby to censor sex, Meta is also concerned about being implicated in the sex trade, a threat exponentially amplified by FOSTA/SESTA, a pair of 2018 bills that make any platform found to “promote or facilitate prostitution” criminally liable for human trafficking.

But by tightening its community guidelines and allowing its algorithms to continue steamrolling sex workers’ accounts while criticism of the industry remains on its platform, Meta’s content-moderation practices effectively manufacture a political leaning against sex workers. (Some sex workers’ Instagram accounts function undisturbed, but this is anecdotally an anomaly.) Meta’s decision not to step in should not be underestimated as an oversight. “Harm reduction [for sex workers] is just not a priority to them,” the former Meta employee said.

Ultimately, Meta’s increasingly puritanical policies can’t be understood in a vacuum. After a decade of relative progress—the repeal of Don’t Ask, Don’t Tell; the birth control mandate in the Affordable Care Act; the Obergefell v. Hodges decision—the backlash against sex and sexuality has been swift and vicious. At the same time, social media has played a major role in molding public conversations on political topics, including sex. Some effects of this pendulum swing are more obvious than others: Trump’s election, covid conspiracies, election misinformation. Some, like Florida’s “Don’t Say Gay” bill, several states’ attempts to gut trans rights, or the use of the slur “groomer” to attack people whose sexuality might be deemed deviant, are more insidious.

And while you may sympathize with users who don’t want to see adult content, or with platforms that don’t want to be liable for it, censorship will never end with sex workers, or sex educators and pole dancers. As Are noted, users vulnerable to deplatforming are disproportionately sex workers, but also LGBTQ folks, women of color, activists, people with marginalized bodies, and anyone who shares nudity or sexual expression. Just this month, a wave of bans swept up dozens of kink and sex-positive accounts on Instagram, and suspensions were only reversed by what Are called a “joint effort” between herself and journalists. A Meta spokesperson told Wired the suspended accounts, many of which highlighted LGBTQ+ rights, fell victim to “rules in place around nudity and sexual solicitation to ensure content is appropriate for everyone, particularly young people.”

“It’s a bit soul-destroying,” a creator whose sex-positive account @slut.social was temporarily suspended told Dazed.

Sex workers lose spaces to speak

In the grand scheme of things, losing a social media account seems relatively harmless. Although platforms often prohibit “ban evasion,” in most cases, you can simply create another one. But rebuilding your platform is not so simple, and a suspension often means the sudden loss of years’ worth of memories. When users rely on their social media accounts for branding, a suspension can also mean an abrupt loss of revenue and the kind of disruption that may push an adult-content creator to pursue more dangerous income streams, like street-based sex work. Deplatforming is not simply an inconvenience: It’s violence.

Even after restoration, suspension leaves an indelible mark on Instagram accounts, making them more likely to face suspension again, the former Meta employee said. And this scarlet letter doesn’t only affect new content; trolls can comb through older content that was in compliance with community guidelines when it was initially published and report it now. The former employee suggested sex workers regularly review their old posts and captions, if not remove them entirely. Deletion may seem excessive, but it’s no worse than suspension, and you get to keep your following.

Deplatforming is not simply an inconvenience: It’s violence.

Many of us, myself included, rarely if ever use Instagram anymore due to its policies. It simply isn’t worth it. Considering the ease with which bad actors manage to exploit content-moderation systems, is there any real reason for us to remain? According to the former employee, the answer is no: “If they can at any time go back and look at anything you’ve ever posted and decide that thing you posted seven years ago is now not okay, you shouldn’t have a history. And if Instagram is no longer a repository for my memories, then why am I on there at all?”

Hillinger remained in Instagram limbo for weeks, despite having a contact at Meta to escalate her appeal. First suspended on March 23, her account was restored the next day—before getting suspended again the day after that. It was reinstated for the second time on March 27, but was suspended for the third time on the 30th. Restored on April 3, she was suspended a fourth time just a day later, on April 4. Her account remains active after its fourth restoration on April 7—for now, at least.

After more than three weeks, our prophetess Gwen’s backup account, @gwenisadorable, was restored. At the time of publication, her main account, @gwenadoraxo, remains suspended. And just last week, another performer featured in Money Shot, Siri Dahl, was suspended from Instagram for “adult sexual solicitation.” She was able to restore her account but is clear-eyed about the likelihood that it will soon be suspended again. Even if you know to listen to sex workers, how can you hear us when we no longer have anywhere to speak?

Olivia Snow is a writer, professor, and dominatrix. She’s currently a research fellow at UCLA’s Center for Critical Internet Inquiry, where she studies sex work, technology, and policy.
