
The Guardian Will Analyze Abuse on Its Own Articles

On Monday, the Guardian turned inward to launch “The Web We Want,” a new series focused on internet abuse and harassment.

“For the great bulk of our readers, and—yes—to respect the wellbeing of our staff too, we need to take a more proactive stance on what kind of material appears on the Guardian,” an editorial reads.

On Friday, the publication’s executive editor for audience, Mary Hamilton, outlined exactly how it currently monitors its comment section, which receives over 50,000 posts every day, noting that its editors were committed to encouraging readers to participate in discussions. She clarified, however, that different articles demand different kinds of moderation: a crossword puzzle, for example, doesn’t need the same vigilance as an article about rape.

Hamilton continued:

We are going to be implementing policies and procedures to protect our staff from the impact of abuse and harassment online, as well as from the impact of repeatedly being exposed to traumatic images. We are also changing the process for new commenters, so that they see our community guidelines and are welcomed to the Guardian’s commenting community. On that point, we are reviewing those community standards to make them much clearer and simpler to understand, and to place a greater emphasis on respect.

The Guardian is also examining its current moderation protocol and is testing various ways for writers to be involved in conversations with readers. It will reportedly begin publishing results from an analysis of its own comment section sometime this week, but for now it has put out a call for readers to chime in about what they look for in comment sections and what they find off-putting.

“We are not like the 4chan message boards, where anyone can say almost anything without consequences,” Hamilton continued. “Just as Facebook, Twitter, Metafilter and many others provide spaces for different kinds of communities to gather, we want to create spaces on the Guardian for particular conversations and particular groups to speak—with each other and with us.”

Image via The Guardian.

DISCUSSION

Meyer Lansky Sqarrs (sqarr)

Jezebel has a troll problem, and users have mostly been left to their own devices in dealing with them.

It can take days or weeks for the Kinja support team to respond. Jez authors may or may not ever respond to flagged posts or emails asking for intervention.

Dismissing is obviously inadequate.

The ease with which assholes can make unlimited burners to get around dismissals and bans shelters trolls from consequences for their horrific behaviour and puts enormous strain on regular posters.

In the absence of a dedicated moderation team, more than just dismissing needs to be put in the hands of individual users.

This has been discussed. It’s quite obvious the Kinja devs either don’t give a fuck or aren’t given the resources to develop new tools.

What is the “Mod Squad”? Why is it not something anyone knows anything about? How can it possibly be useful?