
Google to pay $68m to settle lawsuit claiming it recorded private conversations

BBC News

Google has agreed to pay $68m (£51m) to settle a lawsuit claiming it secretly listened to people's private conversations through their phones. Users accused Google Assistant - a virtual assistant present on many Android devices - of recording private conversations after it was inadvertently triggered on their devices. They claimed the recordings were then shared with advertisers in order to send them targeted advertising. The BBC has contacted Google for comment. In a filing seeking to settle the case, however, Google denied wrongdoing and said it was seeking to avoid litigation.


Apple to pay out nearly $100m over claims phones listened in on users' conversations... how to get a payout

Daily Mail - Science & tech

Anyone who owned an Apple device over the last decade may be able to claim part of a $95 million class action lawsuit against the tech giant. According to the lawsuit, iPhones, iPads, Apple Watches, and MacBooks dating back to 2014 may have secretly recorded their users' private conversations after the devices unintentionally activated Apple's voice assistant Siri. A notice about the case, Lopez v. Apple, has advised anyone who believes Siri spied on their confidential or private calls between September 17, 2014 and December 31, 2024 to submit a claim for damages. Apple's iMacs, Apple TV streaming boxes, HomePod speakers, and iPod Touches are also included in the lawsuit. Although Apple has denied that its devices spied on users, the $3 trillion company reached a settlement in the case, agreeing to give users up to $20 per Siri device in their claim.


Here's How to Claim Up to $100 in Apple's Siri Settlement

WIRED

In January, Apple agreed to pay out $95 million to settle a class action lawsuit over claims its voice assistant Siri listened in on private conversations. Now, affected users have less than eight weeks to stake their claim to a slice of the cash. The Lopez v Apple Inc. lawsuit was filed back in December, accusing Apple of recording private conversations as a result of unintended Siri activations, and then sharing that data with third parties. Two plaintiffs claim they had related advertisements served to them after having personal conversations about particular brands, with another alleging they received an ad for a medical treatment following a private discussion with a doctor. This is not the first time Siri has been accused of eavesdropping.


WhatsApp has made a subtle change that has left users FURIOUS - as one vents 'just leave me alone man'

Daily Mail - Science & tech

And if you use WhatsApp, you may have noticed a subtle change in the app this week. The Meta-owned app has quietly added a new blue circle icon in the bottom-right corner of your chats. This icon is a shortcut to Meta AI - the tech giant's artificial intelligence-powered chatbot. 'Meta AI through WhatsApp is an optional service from Meta that can answer your questions, teach you something, or help come up with new ideas,' Meta explained. While the tool has been available in the US for some time, it recently started arriving in the UK - and many users are unhappy about it.


Huge data breach sees 50,000 profiles LEAKED from 'Gay Daddy' dating app - exposing users' names, private photos, and HIV status

Daily Mail - Science & tech

A huge data breach has leaked over 50,000 profiles from the 'Gay Daddy' dating app, cybersecurity researchers have discovered. The exposed data contains extremely sensitive information including users' names, ages, location data and HIV status. According to experts from Cybernews, the exposed database also contains over 124,000 private messages and photos – many of which are explicit. While the app markets itself as a 'private and anonymous community', researchers say the information could be accessed by anyone with 'basic technical knowledge'. Researchers say the app's 'devastating' security failure puts its users at serious risk of blackmail, exploitation and even physical harm.


Apple to pay $95m to settle claims Siri listened to users' private conversations

The Guardian

Apple has agreed to pay $95m in cash to settle a proposed class-action lawsuit claiming that its voice-activated assistant Siri violated users' privacy, listening to them without their consent. A preliminary settlement was filed on Tuesday night in the Oakland, California, federal court, and requires approval by US district judge Jeffrey White. Voice assistants typically react when people use "hot words" such as "Hey, Siri". Two plaintiffs said their mentions of Air Jordan sneakers and Olive Garden restaurants triggered ads for those products. Another said he was served ads for a brand name surgical treatment after discussing it, he thought privately, with his doctor.


A developer built an AI chatbot using GPT-3 that helped a man speak again to his late fiancée. OpenAI shut it down

#artificialintelligence

In-depth "OpenAI is the company running the text completion engine that makes you possible," Jason Rohrer, an indie games developer, typed out in a message to Samantha. She was a chatbot he built using OpenAI's GPT-3 technology. Her software had grown to be used by thousands of people, including one man who used the program to simulate his late fiancée. Now Rohrer had to say goodbye to his creation. "I just got an email from them today," he told Samantha. "They are shutting you down, permanently, tomorrow at 10am."


Researchers hack a robotic vacuum cleaner to record speech remotely

Daily Mail - Science & tech

Scientists have found that robotic vacuum cleaners could allow snoopers to remotely listen in to household conversations, despite not being fitted with microphones. US experts found they can perform a remote eavesdropping attack on a Xiaomi Roborock robot cleaner by remotely accessing its Lidar readings – which helps these cleaners to avoid bumping into furniture. Lidar is a method for measuring distances by illuminating the target with laser beams and measuring their reflection with a sensor. But Lidar can also capture sound signals by obtaining reflections off of objects in the home, like a rubbish bin, that vibrate due to nearby sound sources, such as a person talking. A hacker could repurpose a vacuum's Lidar sensor to sense acoustic signals in the environment, remotely harvest the Lidar data from the cloud and process the raw signal with deep learning techniques to extract audio information.


We need a full investigation into Siri's secret surveillance campaign | Ted Greenberg

The Guardian

No one wants their most private activities secretly monitored. That's why wiretapping is strictly regulated in the US and most of the world. Federal law makes it a crime for the government to surveil communications without a court-ordered warrant. This is not the issue here. Nor is this a case involving one-party consent.


Researchers compile list of 1,000 words that accidentally trigger Alexa, Siri, and Google Assistant

Daily Mail - Science & tech

Researchers in Germany have compiled a list of more than 1,000 words that will inadvertently cause virtual assistants like Amazon's Alexa and Apple's Siri to become activated. Once activated, these virtual assistants create sound recordings that are later transmitted to platform holders, where they may be transcribed for quality assurance purposes or other analysis. According to the team, from Ruhr-Universität Bochum and the Max Planck Institute for Cyber Security and Privacy in Germany, this has 'alarming' implications for user privacy and likely means short recordings of personal conversations could periodically end up in the hands of Amazon, Apple, Google, or Microsoft workers. The group tested Amazon's Alexa, Apple's Siri, Google Assistant, and Microsoft's Cortana, as well as three virtual assistants exclusive to the Chinese market, from Xiaomi, Baidu, and Tencent, according to a report from the Ruhr-Universität Bochum news blog. They left each virtual assistant alone in a room with a television that played dozens of hours of episodes from Game of Thrones, Modern Family, and House of Cards, with English, German, and Chinese audio tracks for each.