
YouTube is more likely to serve problematic videos than useful ones, study (and common sense) finds


Here's a study backed up by the reality many of us already experience on YouTube. The streaming video company's recommendation algorithm can send you on an hours-long video binge so captivating that you never notice the time passing. But according to a study from the software nonprofit Mozilla Foundation, trusting the algorithm means you're actually more likely to see videos featuring sexualized content and false claims than content tailored to your personal interests. In a study with more than 37,000 volunteers, Mozilla found that 71 percent of the videos participants flagged as objectionable had been recommended to them by YouTube. The volunteers used a browser extension to track their YouTube usage over 10 months; when they flagged a video as problematic, the extension recorded whether they came across it via YouTube's recommendations or on their own.
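The study's headline number boils down to a simple tally: of the videos volunteers flagged, what share arrived via a recommendation? A minimal sketch of that computation (this is illustrative only, not Mozilla's actual RegretsReporter extension code; the `Report` type and field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Report:
    """One flagged video, as a hypothetical extension might record it."""
    video_id: str
    via_recommendation: bool  # True if the viewer reached it via YouTube's suggestions

def recommended_share(reports: list[Report]) -> float:
    """Fraction of flagged videos that arrived via a recommendation."""
    if not reports:
        return 0.0
    recommended = sum(r.via_recommendation for r in reports)
    return recommended / len(reports)

# Toy data: 3 of 4 flagged videos were recommended -> 0.75
reports = [
    Report("a", True),
    Report("b", True),
    Report("c", False),
    Report("d", True),
]
print(recommended_share(reports))
```

With Mozilla's real data, this ratio came out to roughly 0.71 across the volunteers' reports.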

Strike or No Strike, Pensions Problematic for LA Schools

U.S. News

Strike or no strike, after a deal is ultimately reached on a contract for Los Angeles teachers, the school district will still be on a collision course with deficit spending because of pensions and other financial obligations.

Audits attempt to straighten out the "wild, wild west" of algorithms


AI algorithms employed in everything from hiring to lending to criminal justice have a persistent and often invisible problem with bias.

The big picture: One solution could be audits that aim to determine whether an algorithm is working as intended, whether it disproportionately affects different groups of people and, if there are problems, how they can be fixed.

How it works: Algorithmic audits -- usually conducted by outside companies -- involve examining an algorithm's code and the data used to train it, and assessing its potential impact through interviews with stakeholders and those who might be affected by it.

Between the lines: Financial audits exist in part to open up the black box of a company's internal operations to outside investors and to ensure the company remains in compliance with financial laws and regulations; algorithmic audits aim to play a similar role for AI systems.

Details: Algorithmic audits can help companies screen their AI products for flaws that may not be apparent at first glance.
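One concrete check an auditor might run on an algorithm's decisions is comparing selection rates across groups. The sketch below uses the "four-fifths rule" heuristic from U.S. employment-discrimination guidance (a disparity is flagged if any group's selection rate falls below 80 percent of the highest group's rate). This is a hypothetical illustration, not any audit firm's actual tooling; the function names and data shape are assumptions:

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, selected) pairs -> {group: selection rate}."""
    counts = defaultdict(lambda: [0, 0])  # group -> [num_selected, num_total]
    for group, selected in decisions:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: s / t for g, (s, t) in counts.items()}

def four_fifths_violations(decisions, threshold=0.8):
    """Return groups whose selection rate is below threshold * the best rate."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: r for g, r in rates.items() if r < threshold * best}

# Toy data: group A selected 8/10 (rate 0.8), group B selected 4/10 (rate 0.4).
# B's rate is below 0.8 * 0.8 = 0.64, so B is flagged.
decisions = (
    [("A", True)] * 8 + [("A", False)] * 2 +
    [("B", True)] * 4 + [("B", False)] * 6
)
print(four_fifths_violations(decisions))
```

A check like this only surfaces a statistical disparity; as the next item notes, interpreting and fixing it is the hard part.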

Auditors are testing hiring algorithms for bias, but there's no easy fix

MIT Technology Review

For all the attention that AI audits have received, though, their ability to actually detect and protect against bias remains unproven. The term "AI audit" can mean many different things, which makes it hard to trust the results of audits in general. The most rigorous audits can still be limited in scope. And even with unfettered access to the innards of an algorithm, it can be surprisingly tough to say with certainty whether it treats applicants fairly. At best, audits give an incomplete picture, and at worst, they could help companies hide problematic or controversial practices behind an auditor's stamp of approval.

County to Keep Funding Problematic Shelter Despite Protest

U.S. News

The Washington Post reports Prince George's County will continue to fund the nonprofit Family Crisis Center's shelter despite calls to close its doors. Maryland law requires that marriage license fees collected by the county go toward the nonprofit. Those fees totaled $385,000 last year.