YouTube under fire for recommending videos of kids with inappropriate comments

More than a year on from a child safety content moderation scandal on YouTube, it takes just a few clicks for the platform's recommendation algorithms to redirect a search for "bikini haul" videos of adult women towards clips of scantily clad minors engaged in body-contorting gymnastics, taking an ice bath, or doing an ice lolly sucking "challenge."

A YouTube creator called Matt Watson flagged the issue in a critical Reddit post, saying he found scores of videos of kids beneath which YouTube users were trading inappropriate comments and timestamps, and denouncing the company for failing to prevent what he describes as a "soft-core pedophilia ring" from operating in plain sight on its platform. He has also posted a YouTube video demonstrating how the platform's recommendation algorithm pushes users into what he dubs a pedophilia "wormhole," accusing the company of facilitating and monetizing the sexual exploitation of children.

We were easily able to replicate the behavior Watson describes in a history-cleared private browser session. After we clicked on two videos of adult women in bikinis, YouTube suggested a video called "sweet sixteen pool party." Clicking on that led YouTube's sidebar to serve up multiple videos of prepubescent girls in its "up next" section, where the algorithm tees up related content to encourage users to keep clicking.
