In a plain factory building in the San Marcos hills, north of San Diego in California, a technological revolution is under way. There, a team of AI experts are developing a new brand of woman that can smile, flutter her eyelids, make small-talk and remember the names of your siblings. Harmony – for that is her name – is a cut above your average sex doll. More than merely a masturbatory aid, she is a friend, lover and potential life partner. In Sex Robots & Vegan Meat, Jenny Kleeman examines the innovations that promise to change the way we love, eat, reproduce and die in the future. "What you are about to read is not science fiction," she warns in her preface.
Automated facial recognition technology that searches for people in public places breaches privacy rights and will "radically" alter the way Britain is policed, the court of appeal has been told. At the opening of a legal challenge against the use by South Wales police of the mass surveillance system, lawyers for the civil rights organisation Liberty argued that it is also racially discriminatory and contrary to data protection laws. In written submissions to the court, Dan Squires QC, who is acting for Liberty and Ed Bridges, a Cardiff resident, said that the South Wales force had already captured the biometrics of 500,000 faces, the overwhelming majority of whom are not suspected of any wrongdoing. Bridges, 37, whose face was scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city's Motorpoint Arena in 2018, says the use of automatic facial recognition (AFR) by South Wales police caused him "distress". The case has been brought after South Wales police and the Home Office won a high court case last year that effectively gave the green light for national deployment of the technology.
"It's much easier to build an AI system that can detect a nipple than it is to determine what is linguistically hate speech." The Facebook founder Mark Zuckerberg made that comment in 2018 when he was discussing how the company tackles content that is deemed inappropriate or, in Facebook terms, judged to be violating community standards. Facebook's artificial intelligence technology for identifying nudity gets it right more often than not. Between January and March this year, Facebook removed 39.5m pieces of content for adult nudity or sexual activity, and 99.2% of it was removed automatically, without a user reporting it. There were 2.5m appeals against removal and 613,000 pieces of content were restored. But it doesn't work every time, and the AI has problems with historical photos and paintings.
Services such as Amazon's Alexa could be regulated to allow rival digital assistants to operate on smart speakers and stop the tech giants building a monopoly "in people's kitchens and living rooms", the head of the BBC's radio operation has said. James Purnell, the director of radio and education at the BBC, made the comments weeks after the BBC launched its own voice-activated digital assistant, named Beeb, which offers information such as news, weather and programmes. The BBC is already struggling to keep young audiences tuning into its TV programming in the Netflix era, and Purnell raised the spectre of the Silicon Valley giants extending that dominance to control of audio access as smart speakers become commonplace. "We now have smart speakers in so many homes, and they are going to be in far more homes," he said, speaking to MPs on the digital, culture, media and sport select committee. "There is a question about whether we are happy about the biggest organisations in the world, big tech companies with their executives essentially [based] in the [United] States, combining a monopoly in people's kitchens and in living rooms. "I do think it is worth thinking about whether there should be some regulation of those smart speakers so there is a choice of assistants for people."
OpenAI, the machine learning nonprofit co-founded by Elon Musk, has released its first commercial product: a rentable version of a text generation tool the organisation once deemed too dangerous to release. Dubbed simply "the API", the new service lets businesses directly access the most powerful version of GPT-3, OpenAI's general purpose text generation AI. The tool is already a more than capable writer. Fed the opening line of George Orwell's Nineteen Eighty-Four – "It was a bright cold day in April, and the clocks were striking thirteen" – an earlier version of the system recognises the vaguely futuristic tone and the novelistic style, and continues with: "I was in my car on my way to a new job in Seattle. I put the gas in, put the key in, and then I let it run. I just imagined what the day would be like. In 2045, I was a teacher in some school in a poor part of rural China. I started with Chinese history and history of science."
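"The API" described above is, in essence, a paid web endpoint for text completion: a business sends a prompt and receives a continuation. As a rough illustration only – the field names and defaults below are assumptions for the sketch, not taken from the article or from OpenAI's documentation – a client request body might be assembled like this:

```python
import json

def build_completion_request(prompt, max_tokens=64, temperature=0.7):
    """Assemble the JSON body for a hypothetical text-completion request.

    All field names here are illustrative assumptions; a real client
    would follow the provider's own API documentation.
    """
    return json.dumps({
        "prompt": prompt,           # the text the model should continue
        "max_tokens": max_tokens,   # cap on the length of the continuation
        "temperature": temperature  # higher values give more varied output
    })

# The Orwell opening line quoted above, used as the prompt.
body = build_completion_request(
    "It was a bright cold day in April, and the clocks were striking thirteen"
)
print(body)
```

The body would then be POSTed to the service with an API key; billing per request is what makes the tool "rentable" rather than released outright.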
Amazon is implementing a one-year moratorium on police use of its artificial intelligence software Rekognition amid a growing backlash over the tech company's ties to law enforcement. The company has recently stated its support for the Black Lives Matter movement, which advocates for police reform – using Twitter to call for an end to "the inequitable and brutal treatment of black people" in the US and putting a "Black lives matter" banner at the top of its home page. But the company has been criticized as hypocritical because it sells its facial recognition software to police forces. Amazon has not said how many police forces use the technology, or how it is used, but marketing materials have promoted the use of Rekognition in conjunction with police body cameras in real time. When it was first released, Amazon's Rekognition software was criticized by human rights groups as "a powerful surveillance system" that is available to "violate rights and target communities of color".
Microsoft's decision to replace human journalists with robots has backfired, after the tech company's artificial intelligence software illustrated a news story about racism with a photo of the wrong mixed-race member of the band Little Mix. A week after the Guardian revealed plans to fire the human editors who run MSN.com and replace them with Microsoft's artificial intelligence code, an early rollout of the software resulted in a story about the singer Jade Thirlwall's personal reflections on racism being illustrated with a picture of her fellow band member Leigh-Anne Pinnock. Thirlwall, who attended a recent Black Lives Matter protest in London, criticised MSN on Friday, saying she was sick of "ignorant" media making such mistakes. She posted on Instagram: "@MSN If you're going to copy and paste articles from other accurate media outlets, you might want to make sure you're using an image of the correct mixed race member of the group." "This shit happens to @leighannepinnock and I ALL THE TIME that it's become a running joke," she said.
IBM is pulling out of the facial recognition market and is calling for "a national dialogue" on the technology's use in law enforcement. The abrupt about-face comes as technology companies are facing increased scrutiny over their contracts with police amid violent crackdowns on peaceful protest across America. In a public letter to Congress, IBM chief executive, Arvind Krishna, explained the company's decision to back out of the business, and declared an intention "to work with Congress in pursuit of justice and racial equity, focused initially in three key policy areas: police reform, responsible use of technology, and broadening skills and educational opportunities." The company, Krishna said, "no longer offers general purpose IBM facial recognition or analysis software. "IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and ...
Grindr is removing an "ethnicity filter" from its dating app as part of its support for the Black Lives Matter movement, the company announced on Monday. The controversial feature, limited to those who stump up £12.99 a month for the premium version of the app, allows users to sort search results by reported ethnicity, height, weight and other characteristics. In a statement posted to Instagram, the company said: "We stand in solidarity with the #BlackLivesMatter movement and the hundreds of thousands of queer people of color who log in to our app every day. "We will continue to fight racism on Grindr, both through dialogue with our community and a zero-tolerance policy for racism and hate speech on our platform. As part of this commitment, and based on your feedback, we have decided to remove the ethnicity filter from our next release."