YouTube is updating the way it promotes videos in an effort to curtail the spread of misinformation.
The change targets how YouTube curates recommended videos for each user, including the more than 200 million videos promoted on the website's homepage each day and YouTube's autoplay feature, which queues a new video as soon as the one a user is watching finishes.
YouTube said it's working with human evaluators from across the U.S. to train its recommendation algorithms to avoid what it calls "borderline content," or content that "comes close to — but doesn't quite cross the line of — violating our community guidelines."
"When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator or learn that great paella recipe," YouTube said in a blog post Jan. 25. "To that end, we'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat or making blatantly false claims about historic events like 9/11."
This update will affect less than 1 percent of the videos on YouTube, according to the blog post. All videos that comply with the website's community guidelines will remain available on the platform; however, those designated "borderline" will be filtered out of its recommendations.
YouTube's announcement follows recent criticism that the video-sharing website, which is owned by Google, has unwittingly played a major role in distributing misinformation online, The Verge reports.
Last year, Zeynep Tufekci, PhD, an associate professor at the University of North Carolina's school of information and library science in Chapel Hill, penned a widely read New York Times op-ed on the topic, in which she called YouTube "the great radicalizer."
The changes to YouTube's recommendation algorithms will initially take effect in the U.S. Over time, as the system becomes more accurate, the company plans to roll out the update in other countries.
"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," the blog post reads. "We believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community."