'AI Girlfriends' Are a Privacy Nightmare

WIRED

You shouldn't trust any answers a chatbot sends you. An analysis of 11 so-called romance and companion chatbots, published on Wednesday by the Mozilla Foundation, has found a litany of security and privacy concerns with the bots. Collectively, the apps, which have been downloaded more than 100 million times on Android devices, gather huge amounts of people's data; use trackers that send information to Google, Facebook, and companies in Russia and China; permit weak passwords; and lack transparency about their ownership and the AI models that power them. Since OpenAI unleashed ChatGPT on the world in November 2022, developers have raced to deploy large language models and create chatbots that people can interact with and pay to subscribe to. The Mozilla research provides a glimpse into how this gold rush may have neglected people's privacy, and into the tensions between emerging technologies and how they gather and use data. It also indicates how people's chat messages could be abused by hackers.


Is Your Secret Santa App on the Privacy Naughty List?

Slate

Sign up to receive the Future Tense newsletter every other Saturday. It's a busy time to run a Secret Santa site. For Elfster, one such site, the normal staff of around 20 elves (yes, they call themselves elves) balloons to a holiday team of about 55. "We need a million servers, we need tons of support people; it's just off the hook for the holidays," Peter Imburg, the company's CEO, told me recently on Zoom. More than 21 million people have used Elfster for gift exchanges, and Imburg said the biggest group he's seen participate in a single exchange on the platform was about 5,000 people. Imburg started Elfster in 2004 (the era of Friendster and Napster, hence the name) to solve a personal problem: how to coordinate a family gift exchange when not everyone is in the same place, and with certain conditions, like not selecting yourself or your spouse.
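Elfster's actual matching logic isn't described in the article, but the constraint Imburg mentions (no drawing yourself, no drawing your spouse) can be sketched as a simple rejection-sampling draw. The function and names below are hypothetical, purely for illustration:

```python
import random

def draw_names(people, excluded):
    """Randomly assign each giver a recipient, retrying any draw that
    violates a constraint: no one gifts themselves, and no one gifts
    a person on their exclusion list (e.g. a spouse)."""
    while True:
        recipients = people[:]
        random.shuffle(recipients)
        if all(
            giver != recipient and recipient not in excluded.get(giver, set())
            for giver, recipient in zip(people, recipients)
        ):
            return dict(zip(people, recipients))

# Example: a four-person family where Ann and Bob are spouses.
pairs = draw_names(
    ["Ann", "Bob", "Cara", "Dan"],
    excluded={"Ann": {"Bob"}, "Bob": {"Ann"}},
)
```

Rejection sampling is wasteful for huge groups (a real service would likely build the assignment constructively), but for family-sized exchanges a valid permutation is found almost immediately.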