YouTube's Recommendation Algorithm Reportedly Leads Pedophiles to Home Videos of Kids

YouTube has already faced problems for allowing disturbing, violent bootleg content aimed at children to proliferate on the platform, as well as popular creators and vloggers who potentially endanger their children in videos. But an investigation from researchers at Harvard’s Berkman Klein Center for Internet and Society finds that YouTube also endangers children in subtler ways, through its algorithms, which can, with a few clicks, lead viewers seeking out sexual content directly to videos of young children.

YouTube’s recommendation system, which the company says is powered by artificial intelligence, has long been criticized for creating a “rabbit hole effect,” leading viewers to increasingly extreme content as they watch. For example, reporters have found that the algorithm often rewards far-right conspiracy videos, which proliferate on the platform, for their hyperbolic outrage and extremism. A search that begins with something relatively benign (“Donald Trump rallies”) can eventually lead to fringe and extreme conspiracy theories.

The researchers, the New York Times reports, find that the same dynamics that reward extremism also play out with sexual content on YouTube:

A user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.

And thus, a harmless video of children playing in a pool—one that is perfectly in keeping with YouTube’s moderation guidelines—suddenly gets 400,000 views by becoming ensnared in this recommendation pattern.

After the Times alerted YouTube to the fact that its recommendation system was steering viewers from adult, erotic videos to home movies of children, YouTube quickly changed its algorithm, chalking the shift up to “routine tweaks to its algorithms.” The director of the company’s trust and safety initiative told the Times that there is no rabbit hole effect, saying “it’s not clear to us that necessarily our recommendation engine takes you in one direction or another.”

But the researchers say the only thing that will keep this from happening is for YouTube to turn off recommendations on videos of children, a step the company believes would “hurt creators” who rely on those recommendation clicks. It would also, of course, be bad for YouTube, which relies on kids’ content as a popular (and growing) revenue stream. YouTube did say it would curb recommendations for videos it thinks are “putting children at risk,” but given the platform’s track record with damaging content, it’s unclear what that will look like in practice, and which videos would make the cut.

At the end of the day, YouTube’s algorithm is designed to give the company what it wants, which is for users to stay on the site as long as possible. If the algorithm keeps replicating whatever keeps users engaged, these dangerous patterns will persist without serious intervention.
