Helen Oyeyemi's Novel of Cognitive Dissonance

The New Yorker

Few fantasies are harder to wipe away than the romance of a clean slate. Every January, when we're twitchy with regret and self-loathing, advertisers blare, "New Year, new you," urging us to jettison our failures and start fresh. In fiction, self-reinvention is a perennial theme, often shadowed by the suspicion that it can't be done. Lately, novelists have put a political spin on the idea, counterposing hopeful acts of individual self-fashioning to the immovable weight of circumstance. Halle Butler's "The New Me" (2019), a millennial office satire, finds its temp heroine, Millie, trying to life-hack her way out of loneliness and professional drift--buy a plant, whiten her teeth, make friends, think positive.


Inside the First Major U.S. Bill Tackling AI Harms--and Deepfake Abuse

TIME - Tech

Here's what the bill aims to achieve, and how it crossed many hurdles en route to becoming law. The Take It Down Act was born out of the suffering--and then activism--of a handful of teenagers. In October 2023, 14-year-old Elliston Berry of Texas and 15-year-old Francesca Mani of New Jersey each learned that classmates had used AI software to fabricate nude images of them and female classmates. The tools that had been used to humiliate them were relatively new: products of the generative AI boom, in which virtually any image could be created with the click of a button. Pornographic and sometimes violent deepfake images of Taylor Swift and others soon spread across the internet.


Teens are now using AI chatbots to create and spread nude images of classmates, alarming education experts

FOX News

A troubling trend has emerged in schools across the United States, with young students falling victim to the increasing use of artificial intelligence (AI)-powered "nudify" apps that have the power to create fake pornography of classmates. "Nudify" is an umbrella term referring to a plethora of widely available apps and websites that allow users to alter photos of fully dressed individuals and virtually undress them. Some apps can create nude images with just a headshot of the victim. Don Austin, the superintendent of the Palo Alto Unified School District, told Fox News Digital that this type of online harassment can be more relentless than traditional in-person bullying. "It used to be that a bully had to come over and push you. Palo Alto is not a community where people are going to come push anybody into a locker. But it's not immune from online bullying," Austin said.


"You Didn't Hear This from Me: (Mostly) True Notes on Gossip," Reviewed

The New Yorker

In August, 1918, Virginia Woolf spent a quiet stretch at Asheham, the country house that she and her husband, Leonard, rented in rural Sussex. "We've been practically alone, which has a very spiritual effect upon the mind," Woolf wrote to a friend, the socialite Lady Ottoline Morrell. After six months spent in such isolation, Woolf quipped, "I should be a kind of Saint, and Leonard an undoubted prophet. We should shed virtue on people as we walked along the roads." Alas, any pretensions to holiness had been dispelled by the arrival of house guests the previous evening: "I had such a bath of the flesh that I am far from unspotted this morning."


Teen deepfake pornography victim warns future generation is 'at risk' if AI crime bill fails

FOX News

High school student Elliston Berry discusses the Take It Down Act. Senate lawmakers unanimously passed the bipartisan-led measure, which would force social media companies to speedily remove sexually explicit deepfakes, prevent them from being posted and criminalize the act. For deepfake pornography victims like 15-year-old Elliston Berry, the measure is long overdue. The Texas high school student is working with lawmakers to get the bill passed to protect victims like herself. She is motivated by her own experience last year, when she discovered deepfake nude images of herself circulating across social media in a sinister cyber scheme that turned her life upside down.


More than 1 in 10 students say they know of peers who created deepfake nudes, report says

Los Angeles Times

When news broke that AI-generated nude pictures of students were popping up at a Beverly Hills middle school in February, many district officials and parents were horrified. But others said no one should have been blindsided by the spread of AI-powered "undressing" programs. "The only thing shocking about this story," one Carlsbad parent said his 14-year-old told him, "is that people are shocked." Now, a newly released report by Thorn, a tech company that works to stop the spread of child sexual abuse material, shows how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap "undressing" apps and other easy-to-use, AI-powered programs for creating deepfake nudes.


A Generation of AI Guinea Pigs

The Atlantic - Technology

This spring, the Los Angeles Unified School District--the second-largest public school district in the United States--introduced students and parents to a new "educational friend" named Ed. A learning platform that includes a chatbot represented by a small illustration of a smiling sun, Ed is being tested in 100 schools within the district and is accessible at all hours through a website. It can answer questions about a child's courses, grades, and attendance, and point users to optional activities. As Superintendent Alberto M. Carvalho put it to me, "AI is here to stay. If you don't master it, it will master you." Carvalho says he wants to empower teachers and students to learn to use AI safely.


L.A. school district probes inappropriate images shared at Fairfax High. More AI abuse?

Los Angeles Times

Los Angeles school officials are investigating allegations that inappropriate photos were "created and disseminated within the Fairfax High School community," in what appears to be the latest alleged misuse of technology by students, a district statement said. Last week, Laguna Beach High School administrators announced that they had launched an investigation after a student allegedly created and circulated "inappropriate images" of classmates through the use of artificial intelligence. In January, five Beverly Hills eighth-graders were expelled for their involvement in the creation and sharing of fake nude pictures of classmates. The students superimposed pictures of classmates' faces onto nude bodies generated by artificial intelligence. In total, 16 eighth-grade students were targeted by the pictures, which were shared through messaging apps, according to the district.


ChatGPT Role-play Dataset: Analysis of User Motives and Model Naturalness

Tao, Yufei, Agrawal, Ameeta, Dombi, Judit, Sydorenko, Tetyana, Lee, Jung In

arXiv.org Artificial Intelligence

Recent advances in interactive large language models like ChatGPT have revolutionized various domains; however, their behavior in natural and role-play conversation settings remains underexplored. In our study, we address this gap by investigating how ChatGPT behaves in different conversational settings, analyzing its interactions in both a normal and a role-play setting. We introduce a novel dataset covering a broad range of human-AI conversations, annotated with user motives and model naturalness, to examine (i) how humans engage with the conversational AI model, and (ii) how natural AI model responses are. Our study highlights the diversity of user motives when interacting with ChatGPT and the variability of AI naturalness, not only revealing the nuanced dynamics of natural conversations between humans and AI, but also opening new avenues for improving the effectiveness of human-AI communication.


A Deepfake Nude Generator Reveals a Chilling Look at Its Victims

WIRED

As AI-powered image generators have become more accessible, so have websites that digitally remove the clothes of people in photos. One of these sites has an unsettling feature that provides a glimpse of how these apps are used: two feeds of what appear to be photos uploaded by users who want to "nudify" the subjects. The feeds of images are a shocking display of intended victims. WIRED saw some images of girls who were clearly children. Other photos showed adults and had captions indicating that they were female friends or female strangers.