YouTube tries to brush away criticisms of pedophilic content promotion

A recent piece from The New York Times backs YouTube into a corner.
Published on June 4, 2019

The YouTube logo as of 2019.
TL;DR
  • YouTube’s suggestion algorithms are called out by The New York Times in a new article.
  • According to research conducted by The Times, YouTube’s algorithms will suggest borderline content to those seeking pedophilic videos.
  • What’s more, the algorithms can also push borderline content to those who aren’t necessarily seeking it out.

Yesterday, The New York Times published an alarming article on how YouTube’s suggestion algorithms may unintentionally assist pedophiles. Research conducted for the piece suggests the algorithms can even push pedophilic content to viewers who aren’t looking for it.

Shortly after the article landed, YouTube published a blog post that indirectly addresses it. In the post, the company details the actions it has taken over the past year to combat pedophilia on the platform and keep children safe.

However, the post doesn’t offer any new strategies; it merely summarizes YouTube’s current efforts as reassurance.

In the article in The New York Times, the writers tell the story of a mother who happily agreed to let her 10-year-old daughter post a YouTube video of herself and a friend playing in their backyard pool. The mother was shocked to discover later that this video, which simply shows two young girls swimming and lounging around the pool in two-piece swimsuits, had racked up over 400,000 views.

The video became so popular in part due to YouTube’s suggestion algorithms. These algorithms are responsible for suggesting content to viewers based on other content they’ve watched. In a sense, YouTube funneled pedophiles to this video of the children playing in the pool.

Pedophiles don't have to search for borderline content on YouTube because YouTube will simply suggest it to them.

What’s more, a Brazilian research team found that YouTube’s algorithms tend to push ever more extreme content with their suggestions. For example, a person could start out watching a video on how to repair a bicycle, then watch a video of people actually cycling. From there, YouTube could suggest a video of cyclists crashing, and then of other vehicles crashing. Before viewers know it, they are getting suggestions for violent and disturbing content, all stemming from a video on how to repair a bicycle.

See Also: YouTube Originals will be free for all to watch, just with ads

Similarly, the swimming pool video can sit at the end of a chain of suggestions that begins with suggestive adult content. A person could set out to watch erotic-but-legal content on YouTube, such as a bikini model’s reel. That could lead to suggestions for other erotic bathing suit videos, which gradually give way to videos featuring younger and younger models. Eventually, the chain ends at nearly half a million views of two 10-year-olds in a pool.

YouTube has a serious problem on its hands, one that will require the company to put ethics ahead of profits.

When The Times informed YouTube of the issue with the swimming pool video, the company removed several related videos but left many others up, including some that appear to come from fake accounts. It also tweaked its suggestions so that other borderline content no longer appeared. However, YouTube denied making these changes deliberately, claiming instead that its algorithm is simply continuing to learn.

Now, in its response blog post, YouTube is highlighting its recent efforts to keep kids safe on the platform, including removing comments from nearly all videos featuring minors, restricting minors from live streaming without an adult present in the video, and limiting recommendations like the ones described by The Times.

However, YouTube has not made the one sweeping change that would address the problem at its source: turning off suggestions for videos featuring minors. Although the company could do this easily, it hasn’t, likely because suggestions drive around 70 percent of views on the platform. That’s a lot of revenue at stake.

After the unexpected popularity of her daughter’s video, the mother who approved it said she would no longer allow her daughter to post anything on the platform.

NEXT: Here’s why you couldn’t access YouTube on Sunday