There are plenty of awful, false, and patently absurd conspiracy theory videos on YouTube. This has been a big problem for the platform and society at large, as the proliferation of fake news continues to do us all harm.
However, the biggest issue with YouTube conspiracy videos is not that they exist, but rather that they are suggested to users who might mistakenly believe they are legitimate. YouTube is now making efforts to curb that problem by changing the way it recommends videos to users (via The New York Times).
Going forward, the platform will no longer suggest videos with “borderline content” or those that “misinform users in a harmful way,” even if the content of the videos doesn’t violate any of its community guidelines.
This change should affect less than one percent of the videos currently on YouTube. However, even a fraction of a percent of the platform's billions of videos is still an enormous amount of content.
YouTube provided these three examples of content it would cease to recommend:
- Videos promoting a phony miracle cure for a serious illness.
- Flat-Earth conspiracy videos.
- Blatantly false videos about historic events, such as the attacks on September 11, 2001.
While videos like these are surely a problem, they are not the only “borderline content” the platform will avoid recommending to users. However, YouTube declined to elaborate further on any other examples.
To be clear, these videos will not be removed from the platform — they simply will not be recommended or featured in any way.
YouTube said it will identify these videos through a combination of machine learning algorithms and human flaggers.
The updated system is rolling out now for a small set of videos in the U.S., and will eventually expand globally as the system is refined.