
YouTube's recommender AI still a horrorshow, finds major crowdsourced study – TechCrunch

#artificialintelligence

Most likely it's a clumsy attempt to throw disinformation shade at rivals. Returning to the regulation point, an EU proposal -- the Digital Services Act -- is set to introduce some transparency requirements on large digital platforms, as part of a wider package of accountability measures. Asked about this, Geurkink described the DSA as "a promising avenue for greater transparency", but she suggested the legislation needs to go further to tackle recommender systems like the YouTube AI. "I think that transparency around recommender systems specifically and also people having control over the input of their own data and then the output of recommendations is really important -- and is a place where the DSA is currently a bit sparse, so I think that's where we really need to dig in," she told us. One idea she voiced support for is having a "data access framework" baked into the law -- to enable vetted researchers to get more of the information they need to study powerful AI technologies -- rather than the law trying to come up with "a laundry list of all of the different pieces of transparency and information that should be applicable", as she put it.


YouTube's algorithm is still recommending videos that you wish you hadn't seen, say researchers

ZDNet

YouTube's algorithm is recommending videos that viewers afterwards wish they hadn't seen, according to research carried out by Mozilla. At times, the report found, the algorithm even encourages users to watch videos that were later found to have violated the website's content policies. Last year, Mozilla launched RegretsReporter, an open-source browser extension that lets users report videos that they were recommended and which they wish they hadn't ended up watching. When filing a report, users are asked to provide the video's title, description, view count and entry point (whether it was reached by direct search or through recommended content); they can also provide Mozilla with a "trail" of how they arrived at the reported video.
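For illustration only, a report like the one described above could be modeled roughly as the following TypeScript sketch. The interface and field names are hypothetical, not taken from the RegretsReporter source code; they simply mirror the fields mentioned in the article (title, description, view count, entry point, and an optional recommendation trail).

// Hypothetical shape of a single RegretsReporter-style report.
interface RegretReport {
  videoTitle: string;
  videoDescription: string;
  viewCount: number;
  // How the viewer reached the video: a direct search or a recommendation.
  entryPoint: "direct_search" | "recommendation";
  // Optional chain of video IDs showing how the viewer arrived at the video.
  recommendationTrail?: string[];
}

// Example of a report a user might file after regretting a video.
const exampleReport: RegretReport = {
  videoTitle: "Example regretted video",
  videoDescription: "A video the viewer wishes they had not watched.",
  viewCount: 120000,
  entryPoint: "recommendation",
  recommendationTrail: ["abc123", "def456", "ghi789"],
};

console.log(JSON.stringify(exampleReport, null, 2));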


Mozilla wants to understand your weird YouTube recommendations

ZDNet

From cute cat videos to sourdough bread recipes: sometimes, it feels like the algorithm behind YouTube's "Up Next" section knows the user better than the user knows themselves. Often, that same algorithm leads the viewer down a rabbit hole. How many times have you found yourself hours deep in suggested videos, each time promising yourself that this one would be the last? The scenario gets thorny when the system somehow steers the user towards conspiracy theory videos and other forms of extreme content, as some have complained. To get an idea of how often this happens and how, the non-profit Mozilla Foundation has launched a new browser extension that lets users take action when they are recommended videos on YouTube that they then wish they hadn't ended up watching.


YouTube's algorithm recommends videos that violate its own policies

New Scientist

YouTube's algorithm recommends videos that violate the company's own policies on inappropriate content, according to a crowdsourced study. Not-for-profit organisation Mozilla asked users of its Firefox web browser to install a browser extension called RegretsReporter, which tracked the YouTube videos they watched and asked them whether they regretted watching each video. Between July 2020 and May 2021, 37,380 users flagged 3,362 videos they viewed as regrettable – a fraction of 1 per cent of all those they watched. Reports of these videos were highest in Brazil, with about 22 videos out of every 10,000 viewed being logged as regrettable. The Mozilla researchers then watched the reported videos and checked them against YouTube's content guidelines; they found that 12.2 per cent of the reported videos either shouldn't be on YouTube or shouldn't be recommended by its algorithm.
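As a rough illustration of how a "per 10,000 videos viewed" rate relates to a percentage, here is a small TypeScript sketch; the helper functions and inputs are placeholders, not Mozilla's raw data.

// Placeholder arithmetic, not Mozilla's dataset: shows how a
// "regrets per 10,000 videos viewed" figure maps to a percentage.
function regretRatePerTenThousand(regretted: number, viewed: number): number {
  return (regretted / viewed) * 10000;
}

function regretPercentage(regretted: number, viewed: number): number {
  return (regretted / viewed) * 100;
}

// 22 regretted videos per 10,000 viewed corresponds to 0.22 per cent.
console.log(regretRatePerTenThousand(22, 10000)); // 22
console.log(regretPercentage(22, 10000));         // 0.22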


Mozilla project exposes YouTube's recommendation 'bubbles'

Engadget

We've all seen social media posts from our climate change-denying cousin or ultra-liberal college friend, and have wondered how they came to those conclusions. Mozilla's new project, "TheirTube," offers a glimpse at theoretical YouTube homepages for users in six different categories -- fruitarian, doomsday prepper, liberal, conservative, conspiracist and climate denier. Through these personas, Mozilla hopes to demonstrate how YouTube's recommendation algorithm could confirm certain biases. The six personas were created after Mozilla conducted interviews with real YouTube users who had experienced similar recommendation bubbles. An account was created for each persona and subscribed to the channels that the interviewees followed.