YouTube pulls ads on almost 2 million inappropriate videos aimed at kids
YouTube is caught up in another scandal. Earlier this year, scores of advertisers pulled their ads from the platform after it was discovered that some ads were running on videos with hate speech or other offensive material. Now, after accusations over inappropriate videos and comments aimed at children, more advertisers are suspending their activity on the video streaming service. These advertisers include big names like Adidas, Mars, and Hewlett-Packard.
In a statement to Vice, YouTube said it has terminated over 270 accounts and removed over 150,000 videos deemed to violate its terms of service. It has also disabled the comments section on over 625,000 videos targeted by child predators.
Finally, over the past week we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content. Content that endangers children is abhorrent and unacceptable to us.
YouTube is also battling autofill search results that suggest pedophiliac themes. According to BuzzFeed, when a user typed “how to” into the search bar, the site would suggest completions like “have s*x kids” and “have s*x with your kids.” YouTube appears to be making progress on this front: those searches no longer return the inappropriate autocomplete suggestions.
Earlier this year, YouTube announced that it would restrict content creators that use family-friendly characters inappropriately. These videos have well-known characters like Elsa from Frozen and Spider-Man in ridiculous situations that can stray into violent or sexual themes. They’re slapped together by content mills with titles intended to game YouTube’s auto-play algorithm to keep kids watching.
In addition to those changes, YouTube says it is also in the process of implementing a new content filtering policy. When a video is flagged as inappropriate, it will be age restricted in the main YouTube app. Age-restricted videos are not allowed in YouTube Kids, so this should cut down on how many kids can view them.
YouTube is stepping up enforcement, but unfortunately, it’s not enough. New channels are created every day, and according to the BBC, the tools meant to screen predatory comments haven’t been working correctly for over a year. This has allowed between 50,000 and 100,000 predatory accounts to remain on YouTube. It’s also debatable whether the new filtering system will make a meaningful impact, since it’s mostly kids watching this content and they are less likely to report inappropriate videos.
Do you think YouTube is doing enough to protect kids? Will its new filtering program make a difference? Let us know down in the comments.