YouTube promises to stop recommending flat Earth and 9/11 truther videos

Mashable

Even without Alex Jones, harmful conspiracy theory videos were running rampant on YouTube. Now, the company says it's going to take action. In a blog post published on Friday, YouTube said it would be making changes to its recommendation algorithm to explicitly deal with conspiracy theory videos. The company says the update will reduce recommendations of "borderline content and content that could misinform users in harmful ways." YouTube clarified what kinds of videos fit that description by providing three examples: "videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."


Searching for news on RBG? YouTube offered conspiracy theories about the Supreme Court justice instead.

Washington Post - Technology News

Conspiracy theories about the health of Supreme Court Justice Ruth Bader Ginsburg have dominated YouTube this week, illustrating how the world's most popular video site is failing to prevent its algorithm from helping popularize viral hoaxes and misinformation. More than half of the top 20 search results for her initials, "RBG," on Wednesday pointed to false far-right videos, some claiming doctors are using mysterious illegal drugs to keep her alive, according to a review by The Washington Post. Ginsburg has been absent from oral arguments at the Supreme Court this week as she recuperates from recent surgery to remove cancer from her lungs. Tests revealed Friday that she will need no further treatment and that her recovery is on track. The falsehoods, most of which originated with the fringe movement QAnon, dramatically outnumbered results from credible news sources.


YouTube Will Crack Down on Toxic Videos, But It Won't Be Easy

WIRED

YouTube is trying to reduce the spread of toxic videos on the platform by limiting how often they appear in users' recommendations. The company announced the shift in a blog post on Friday, writing that it would begin cracking down on so-called "borderline content" that comes close to violating its community standards without quite crossing the line. "We'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways, such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11," the company wrote. These are just a few examples of the broad array of videos that might be targeted by the new policy. According to the post, the shift should affect less than one percent of all videos on the platform.


YouTube tweaks algorithms to stop recommending conspiracy videos

FOX News

Google-owned YouTube is reworking the recommendation algorithm that suggests which videos users should view next, in a new bid to stem the flow of conspiracy theories and false information on the massive video platform. The company, which has been criticized by lawmakers and called out in studies for pushing viewers toward fraudulent content and conspiracy theories, said in a Friday blog post that it is taking a "closer look" at how to reduce the spread of content that does not quite violate YouTube's Community Guidelines but comes close to doing so. "To that end, we'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways, such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11," the company said in its blog post. YouTube's recommendation algorithm, a secretive formula that determines which clips are promoted in the "Up Next" column beside the video player, drives a large percentage of traffic on the video platform, where more than a billion hours of footage are watched each day.