Artificial intelligence can now predict El Niño 18 months in advance

New Scientist

Artificial intelligence is learning how to predict El Niño climate cycles. The hope is that the technology could be used to improve climate predictions and give policy-makers more time to prepare. El Niño can cause severe weather and devastating damage. A phase of the El Niño-Southern Oscillation, it occurs when water warms over the tropical Pacific Ocean, shifting east and increasing rainfall and cyclones over the Americas while pulling rain away from Indonesia and Australia. Strong El Niño events are associated with intense storms and flooding in some areas, and drought and fires in others.
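The summary does not describe the model itself, but deep-learning forecasts of the El Niño-Southern Oscillation typically feed maps of sea surface temperature into a convolutional network that outputs a future ocean-temperature index. The sketch below is a hypothetical illustration of that setup in PyTorch with random stand-in data; the ENSOForecaster class and all layer sizes are assumptions, not the researchers' architecture.

```python
# Hypothetical sketch: predicting an ENSO index months ahead from sea surface
# temperature (SST) anomaly maps with a small convolutional network.
# This only illustrates the general idea, not the published model.
import torch
import torch.nn as nn

class ENSOForecaster(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3 recent monthly SST anomaly maps
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 1),  # predicted ocean-temperature index many months ahead
        )

    def forward(self, sst_maps):
        return self.net(sst_maps)

# Toy usage with random stand-in data (real work would use gridded ocean reanalysis).
model = ENSOForecaster()
fake_sst = torch.randn(8, 3, 24, 72)   # batch of 8 samples, 3 months, 24x72 lat-lon grid
prediction = model(fake_sst)
print(prediction.shape)  # torch.Size([8, 1])
```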


Fast swimming fish robot could perform underwater surveillance

New Scientist

A tuna-inspired robot can wriggle just as fast as real fish and swim faster than most other robots of its type. This "Tunabot" could help us learn how fish use their fins and may someday be used for underwater surveillance. Hilary Bart-Smith at the University of Virginia and her colleagues built Tunabot from 3D-printed steel and resin, covered in stretchy plastic skin. It is designed to mimic an adolescent tuna, but without any fins other than the tail, and is about 25 centimetres long. The team chose to model the robot after a tuna because the fish can swim extremely fast with high energy efficiency.


AI learns to defy the laws of physics to win at hide-and-seek

New Scientist

Never play games with a bot – it will find a way to cheat if it can. A team from OpenAI, an artificial intelligence lab in San Francisco co-founded by Elon Musk, has developed artificially intelligent bots that taught themselves to cooperate by playing hide-and-seek. The bots also learned how to use basic tools and that defying the laws of physics can help you win. In April, a team of bots known as the OpenAI Five beat the human world champions at the team-based video game DOTA 2. The hide-and-seek bots use similar principles to learn, but the simpler game allows for more inventive play. Bowen Baker at OpenAI and his colleagues wanted to see if the team-based dynamics of the OpenAI Five could be used to generate skills that could one day be useful to humans.
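The setup described above is competitive self-play: hiders and seekers share one environment, one team's reward is the other's loss, and each round of improvement forces the other side to adapt. OpenAI's actual agents were trained with large-scale reinforcement learning; the toy Python loop below is only a hypothetical illustration of that zero-sum, alternating-improvement dynamic, with a single "skill" number standing in for each team's learned policy.

```python
# Toy, hypothetical illustration of self-play in a zero-sum hide-and-seek game.
# The real agents learn rich policies; here each team is reduced to one number.
import random

def episode(hider_skill, seeker_skill, steps=100):
    """Return total hider reward; seekers receive the negative (zero-sum)."""
    hider_reward = 0
    for _ in range(steps):
        hidden = random.random() < hider_skill / (hider_skill + seeker_skill)
        hider_reward += 1 if hidden else -1
    return hider_reward

# Crude stand-in for learning: whichever team lost the last episode "improves",
# mimicking the alternating pressure that self-play puts on both sides.
hider_skill, seeker_skill = 1.0, 1.0
for generation in range(5):
    reward = episode(hider_skill, seeker_skill)
    if reward > 0:
        seeker_skill *= 1.1   # seekers must adapt to better hiders
    else:
        hider_skill *= 1.1    # hiders must adapt to better seekers
    print(generation, round(hider_skill, 2), round(seeker_skill, 2), reward)
```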


Robot can launch out of the water and glide like a flying fish

New Scientist

Like a flying fish gliding above the water's surface, a robot can now propel itself out of water into flight. Mirko Kovac and his colleagues at Imperial College London have developed a robot that can lift itself out of water and travel through the air for up to 26 metres. The robot weighs 160 grams and could be used for ocean monitoring and sampling. It could take water samples by jumping in and out of the water in cluttered environments, avoiding obstacles such as ice in cold regions or floating objects after a flood. "In these situations, it's important to fly there quickly, take a sample and come back," says Kovac.


Police robot can be flung through windows and distract suspects

New Scientist

Police robots thrown through a broken window could be used to distract suspects before officers enter a room. The idea is to add a distracting device that produces a loud bang and a brilliant flash to small robots already used by many US police departments. Weighing about half a kilo, Throwbots can be tossed through windows or over walls and driven around to explore building interiors with video, audio and infra-red sensors.


UK court backs police use of face recognition, but fight isn't over

New Scientist

A man from Cardiff, UK, says the police breached his human rights when they used facial recognition technology, but today a court ruled that the police's actions were lawful. That is, however, hardly the end of the matter. South Wales Police has been trialling automated facial recognition (AFR) technology since April 2017. Other forces around the country are trialling similar systems, including London's Metropolitan Police. The man, Ed Bridges, may have been snapped during a pilot called AFR Locate.


AI facial recognition software now works for wild chimpanzees too

New Scientist

An artificial intelligence that detects, tracks and recognises chimpanzees could make studying animals in the wild more efficient. Arsha Nagrani at the University of Oxford in the UK and her colleagues have developed a facial recognition AI that can detect and identify the individual chimpanzees captured in video footage recorded in the wild. Using the AI, they can cut down the time and resources needed to track animals in their natural habitat. The algorithm could help researchers and wildlife conservationists study the complex behaviours of chimpanzees and other primates more efficiently. The team trained the AI on 50 hours of archival footage – spanning 14 years – of chimpanzees in Bossou, Guinea in West Africa.
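The system is described as a pipeline that detects faces in video frames, tracks them and identifies individuals. As a hypothetical sketch of that structure, the Python below uses stand-in functions for detection and embedding (the team's real components are trained neural networks) and matches each face to a gallery of known individuals by nearest-neighbour similarity; the function names and the 128-dimensional embedding size are assumptions.

```python
# Hypothetical sketch of a detect -> embed -> identify pipeline for video frames.
# Detection and embedding are stand-ins; identification is nearest-neighbour matching.
import numpy as np

def detect_faces(frame):
    """Stand-in: return cropped face images found in one video frame."""
    return []  # a trained face detector would go here

def embed_face(face_crop):
    """Stand-in: map a face crop to a fixed-length feature vector."""
    return np.zeros(128)

def identify(face_embedding, gallery):
    """Match an embedding to the closest known individual by cosine similarity."""
    best_name, best_score = None, -1.0
    for name, ref in gallery.items():
        score = float(np.dot(face_embedding, ref) /
                      (np.linalg.norm(face_embedding) * np.linalg.norm(ref) + 1e-9))
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy usage: a gallery of reference embeddings for known individuals (built from
# labelled archival footage in a real system) and one random query embedding.
gallery = {"individual_a": np.random.rand(128), "individual_b": np.random.rand(128)}
query = np.random.rand(128)
print(identify(query, gallery))

# In a full pipeline, each decoded video frame would pass through detect_faces(),
# every crop through embed_face(), and every embedding through identify().
```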


AIs that deblur faces could make people on CCTV easier to identify

New Scientist

Artificial intelligence is turning the "Enhance!" cliché of TV crime dramas into reality. Vishal Patel at Johns Hopkins University in the US and his colleagues have developed an AI that can automatically deblur photographs of people's faces. The technology could eventually be used to improve facial recognition on long-distance surveillance images, such as those taken from a drone, says Patel.
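Patel's model is not detailed in the summary, but face deblurring is usually framed as an image-to-image problem: a network is trained on pairs of blurred and sharp face crops and learns to reconstruct the sharp version. The PyTorch sketch below is a hypothetical minimal example of that formulation with random stand-in images; the Deblurrer class and its layer sizes are assumptions, not the published architecture.

```python
# Hypothetical sketch: a small encoder-decoder network trained to map blurred
# face crops to sharp ones, minimising the pixel-wise reconstruction error.
import torch
import torch.nn as nn

class Deblurrer(nn.Module):
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, blurred):
        return self.decode(self.encode(blurred))

# Toy training step with random stand-in images in place of real blurred/sharp pairs.
model = Deblurrer()
blurred = torch.randn(4, 3, 64, 64)   # stand-in batch of blurred face crops
sharp = torch.randn(4, 3, 64, 64)     # corresponding sharp targets
loss = nn.functional.l1_loss(model(blurred), sharp)
loss.backward()
print(float(loss))
```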


Robot pilot that can grab the flight controls gets its plane licence

New Scientist

A robot pilot is learning to fly. It has passed its pilot's test and flown its first plane, but it has also had its first mishap. Unlike a traditional autopilot, the ROBOpilot Unmanned Aircraft Conversion System literally takes the controls, pressing on foot pedals and handling the yoke using robotic arms. It reads the dials and meters with a computer vision system. The robot can take off, follow a flight plan and land without human intervention.


Mini-brains grown in a lab show neural activity like preterm babies

New Scientist

Miniature brains grown in a lab exhibit remarkably similar activity to preterm babies' brains. This dispels the idea that human brains need to develop in a womb or be connected to other organs to function. Scientists have long been trying to grow realistic models of human brains to better understand how our brains work and make it easier to test new treatments for neurological disorders. However, until now, it was assumed that these models wouldn't be able to recreate the sophisticated connections found in real brains. "We previously assumed that the human brain needs some input from other organs and from the mother's uterus to thrive," says Alysson Muotri at the University of California, San Diego.