Tech companies should stop behaving as though everything that is not illegal is acceptable, says Microsoft's second-in-command. Instead, they should focus on defining – and living by – the standards that they would like to see in regulation, before it gets forced on them anyway. For some of the most potentially dangerous new technologies, such as facial recognition, that could mean voluntarily refusing to sell them to certain countries, for certain uses, or even agreeing to a moratorium altogether, said Brad Smith, the president and chief legal officer of the world's most valuable publicly traded company. Speaking to the Guardian before the launch of his new book, Tools and Weapons, Smith said that if technology firms wanted to be proud of how they changed the world for the better, they must take more responsibility for the ways they have made it worse. "When you think about all of the issues that people worry about in the world today and what they spend their time arguing about, it's often issues like trade, immigration, nationalism, globalisation," Smith said.
Karaoke complexes might be relatively common now, but back in 2004 singing into a PlayStation was the closest most of us could get. SingStar's discs of party classics formed the caterwauling soundtrack to millions of student gatherings, hen parties and five-pint Fridays all over Europe for more than a decade. Like Just Dance, it harnesses the infectious joy of pop music in a way that anyone can play.

Katamari Damacy is a gleeful absurdist masterpiece in which you start by rolling up pencils and apple peel and end up absorbing buildings, trees and, eventually, most of the planet in your big sticky ball, because why not?

Journey is a short and moving shared experience whose music, evocative colour palette and simple play come together as they only can in games, for a powerful emotional effect. It's often picked as an ur-example of games as art – including by curators at the V&A, where it was front and centre at a recent exhibition.

Resident Evil meets Alien seems like such an obvious game pitch that it is incredible it wasn't realised until 2008. In Dead Space, the player becomes lowly engineer Isaac Clarke, who finds himself investigating the "planet-cracking" ship Ishimura after radio contact with the vessel is lost.
Laura Nolan is a modern hero. A former Google software engineer, Nolan resigned from her job last year after being asked to dramatically enhance the artificial intelligence used in US military drones. She has since joined the Campaign to Stop Killer Robots and is calling for a ban on all forms of autonomous weapons, on the basis that they might accidentally initiate a catastrophic global war. Now, listen, sometimes I'm able to kid myself about the goodness of people.
Just 10 months after launching its first voice-controlled, video-calling smart displays in the US, Facebook is trying again with the new Portal, Portal Mini and Portal TV – and this time they are heading for the UK and Europe. The basic premise is very similar to the smart displays sold by Amazon, Google and others. Portal and Portal Mini look like digital photo frames, complete with an actual black or white frame around the outside. They display your photos and calendar events, play videos and generally entertain, including streaming Spotify and Amazon Prime Video. They listen out for two hotwords, depending on your settings.
How are you supposed to react when a robot calls you a "gook"? At first glance, ImageNet Roulette seems like just another viral selfie app – those irresistible 21st-century magic mirrors that offer a simulacrum of insight in exchange for a photograph of your face. Want to know what you will look like in 30 years? If you were a dog, what breed would you be? That one went viral in 2016.
Our built environment is becoming one big computer. "Smartness" is coming to saturate our stores, workplaces, homes, cities. As we go about our daily lives, data is made, stored, analyzed and used to make algorithmic inferences about us that in turn structure our experience of the world. Computation encircles us as a layer, dense and interconnected. If our parents and our grandparents lived with computers, we live inside of them.
Artificial intelligence could be used to help catch paedophiles operating on the dark web, the Home Office has announced. The government has pledged to spend more money on the child abuse image database, which since 2014 has allowed police and other law enforcement agencies to quickly search seized computers and other devices for indecent images of children, matching them against a record of 14m images to help identify victims. The investment will be used to trial aspects of AI, including voice analysis and age estimation, to see whether they would help track down child abusers. Earlier this month, the chancellor, Sajid Javid, announced that £30m would be set aside to tackle online child sexual exploitation, with the Home Office releasing more information on Tuesday about how this would be spent. There has been debate over the use of machine learning algorithms, part of the broad field of AI, with the government's Centre for Data Ethics and Innovation developing a code of practice for the trialling of predictive analytical technology in policing.
A new generation of autonomous weapons or "killer robots" could accidentally start a war or cause mass atrocities, a former top Google software engineer has warned. Laura Nolan, who resigned from Google last year in protest at being sent to work on a project to dramatically enhance US military drone technology, has called for all AI killing machines not operated by humans to be banned. Nolan said killer robots not guided by human remote control should be outlawed by the same type of international treaty that bans chemical weapons. Unlike drones, which are controlled by military teams often thousands of miles away from where the flying weapon is being deployed, killer robots, Nolan said, have the potential to do "calamitous things that they were not originally programmed for". Nolan, who has joined the Campaign to Stop Killer Robots and has briefed UN diplomats in New York and Geneva on the dangers posed by autonomous weapons, said: "The likelihood of a disaster is in proportion to how many of these machines will be in a particular area at once. What you are looking at are possible atrocities and unlawful killings even under laws of warfare, especially if hundreds or thousands of these machines are deployed. There could be large-scale accidents because these things will start to behave in unexpected ways."
Robots might do repetitive housework, such as cleaning, scrubbing, and washing-up, but it was unlikely they would do creative work such as cooking, Professor Meredith W. Thring, professor of mechanical engineering, Queen Mary College, London, said yesterday. He told the International Congress of Industrial Design in London: "I do not believe that any computer or robot can ever be built which has emotions in it and, therefore, which can do anything original or anything which is more sophisticated than it has been programmed to do by a human being. I do not believe it will ever be able to do creative work." It was for that reason robots could be used as slaves, to do the things human beings did not want to do. In the garden, robots might be used as slaves to mow the lawn, but the human would probably do the work on the flower beds.
Apple has launched its latest iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max featuring new cameras, improved screens, faster processors and longer battery life. Announced at Apple's headquarters in Cupertino, California on Tuesday, the iPhone 11 series of smartphones carries on where the iPhone XS, XS Max and XR left off last year. The new iPhones feature similar designs and screen sizes with slim bezels and the Face ID notch at the top, which was first introduced in 2017 with the iPhone X. A variety of new colours will also be available, with a new matt finish on the back of the iPhone 11 Pro. The iPhone 11 Pro will be available in two sizes, with either a 5.8-inch (147mm) or 6.5-inch (165mm) screen.