Information Technology: AI-Alerts

Ex-Google worker fears 'killer robots' could cause mass atrocities

The Guardian

A new generation of autonomous weapons or "killer robots" could accidentally start a war or cause mass atrocities, a former top Google software engineer has warned. Laura Nolan, who resigned from Google last year in protest at being sent to work on a project to dramatically enhance US military drone technology, has called for all AI killing machines not operated by humans to be banned. Nolan said killer robots not guided by human remote control should be outlawed by the same type of international treaty that bans chemical weapons. Unlike drones, which are controlled by military teams often thousands of miles away from where the flying weapon is being deployed, Nolan said killer robots have the potential to do "calamitous things that they were not originally programmed for". Nolan, who has joined the Campaign to Stop Killer Robots and has briefed UN diplomats in New York and Geneva over the dangers posed by autonomous weapons, said: "The likelihood of a disaster is in proportion to how many of these machines will be in a particular area at once. What you are looking at are possible atrocities and unlawful killings even under laws of warfare, especially if hundreds or thousands of these machines are deployed. There could be large-scale accidents because these things will start to behave in unexpected ways."

Two Major Saudi Oil Installations Hit by Drone Strike, and U.S. Blames Iran

NYT > Middle East

Drone attacks claimed by Yemen's Houthi rebels struck two key oil installations inside Saudi Arabia on Saturday, damaging facilities that process the vast majority of the country's crude output and raising the risk of a disruption in world oil supplies. The attacks immediately escalated tensions in the Persian Gulf amid a standoff between the United States and Iran, even as key questions remained unanswered -- where the drones were launched from, and how the Houthis managed to hit facilities deep in Saudi territory, some 500 miles from Yemeni soil. Secretary of State Mike Pompeo accused Iran of being behind what he called "an unprecedented attack on the world's energy supply" and asserted that there was "no evidence the attacks came from Yemen." He did not, however, specify an alternative launch site, and the Saudis themselves refrained from pointing the finger directly at Iran. President Trump condemned the attack in a phone call with Saudi Crown Prince Mohammed bin Salman and offered support for "Saudi Arabia's self defense," the White House said in a statement, adding that the United States "remains committed to ensuring global oil markets are stable and well supplied."

Nvidia Open-Sources Its Deep Learning Inference Accelerator "NVDLA"


Most of the computational effort in deep learning inference consists of mathematical operations that fall into four groups: convolutions, activations, pooling, and normalization. All four share characteristics that make them well suited to special-purpose hardware: their memory access patterns are highly predictable and they are readily parallelized. Designing custom hardware accelerators for deep learning is popular, but achieving state-of-the-art performance and efficiency with a new design is a complex and challenging problem. To help developers advance the adoption of efficient AI inference in custom hardware designs, Nvidia opened the source of the hardware design of the NVIDIA Deep Learning Accelerator in 2017. The NVIDIA Deep Learning Accelerator is both scalable and highly configurable; its modular design maintains flexibility and simplifies integration, and it promotes a standardized, open architecture to address the computational demands of inference.
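The four operation classes named above can be sketched in a few lines. The following is a minimal NumPy illustration of a convolution, activation, pooling, and normalization stage applied to a toy feature map; it is not NVDLA code, just a sketch of the operations such accelerators target:

```python
import numpy as np

def conv2d(x, k):
    """Valid 2-D convolution: the dominant operation in inference."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * k)
    return out

def relu(x):
    """Activation: element-wise, trivially parallel."""
    return np.maximum(x, 0.0)

def max_pool(x, s=2):
    """Pooling: strided, highly predictable memory access."""
    h, w = x.shape[0] // s * s, x.shape[1] // s * s
    return x[:h, :w].reshape(h // s, s, w // s, s).max(axis=(1, 3))

def normalize(x, eps=1e-6):
    """Normalization: zero mean, unit variance over the map."""
    return (x - x.mean()) / (x.std() + eps)

x = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 feature map
k = np.ones((3, 3)) / 9.0                      # averaging kernel
y = normalize(max_pool(relu(conv2d(x, k))))
print(y.shape)  # (2, 2)
```

Each stage loops over data in a fixed, data-independent pattern, which is exactly the property that makes these operations easy to lay down in silicon.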

This prosthetic arm combines manual control with machine learning – TechCrunch


Prosthetic limbs are getting better every year, but the strength and precision they gain doesn't always translate to easier or more effective use, as amputees have only a basic level of control over them. One promising avenue being investigated by Swiss researchers is having an AI take over where manual control leaves off. To visualize the problem, imagine a person with their arm amputated above the elbow controlling a smart prosthetic limb. With sensors placed on their remaining muscles and other signals, they may fairly easily be able to lift their arm and direct it to a position where they can grab an object on a table. The many muscles and tendons that would have controlled the fingers are gone, and with them the ability to sense exactly how the user wants to flex or extend their artificial digits.
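One common way such shared control is implemented (a generic sketch with made-up data, not the Swiss team's actual system) is to classify features extracted from the residual muscle signals into a small set of grip patterns, so the AI handles the fine finger control while the user handles gross arm positioning:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: two EMG features (mean amplitude,
# zero-crossing rate) per signal window, labelled with the grip the
# user intended. Entirely synthetic; real systems use many electrodes
# and much richer feature sets.
grips = ["open", "pinch", "power"]
centers = np.array([[0.2, 0.8], [0.6, 0.3], [0.9, 0.6]])
X = np.vstack([c + 0.05 * rng.standard_normal((50, 2)) for c in centers])
y = np.repeat(np.arange(3), 50)

# Nearest-centroid classifier: the learned layer picks the grip;
# the user's remaining muscles still provide high-level positioning.
centroids = np.array([X[y == g].mean(axis=0) for g in range(3)])

def predict_grip(features):
    d = np.linalg.norm(centroids - features, axis=1)
    return grips[int(np.argmin(d))]

print(predict_grip(np.array([0.62, 0.28])))  # "pinch"
```

The division of labour is the point: manual control gets the arm to the object, and the classifier fills in the finger coordination the amputation removed.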

Why facial recognition's racial bias problem is so hard to crack


Jimmy Gomez is a California Democrat, a Harvard graduate and one of the few Hispanic lawmakers serving in the US House of Representatives. But to Amazon's facial recognition system, he looks like a potential criminal. Gomez was one of 28 US Congress members falsely matched with mugshots of people who've been arrested, as part of a test the American Civil Liberties Union ran last year of the Amazon Rekognition program. Nearly 40 percent of the false matches by Amazon's tool, which is being used by police, involved people of color. This is part of a CNET special report exploring the benefits and pitfalls of facial recognition.
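Audits like the ACLU's come down to comparing false-match rates across demographic groups. A minimal sketch of that computation, with purely illustrative numbers rather than the ACLU's actual data:

```python
# Hypothetical audit records: (group, falsely_matched) pairs for a
# set of probe photos run against a mugshot database. The group
# labels and counts are invented for illustration.
records = ([("A", True)] * 11 + [("A", False)] * 89
           + [("B", True)] * 17 + [("B", False)] * 83)

def false_match_rate(records, group):
    """Share of a group's probes that were wrongly matched."""
    hits = sum(1 for g, fm in records if g == group and fm)
    total = sum(1 for g, fm in records if g == group)
    return hits / total

for g in ("A", "B"):
    print(g, round(false_match_rate(records, g), 3))
```

A gap between the per-group rates, as in this synthetic example, is what signals the kind of disparity the ACLU test surfaced.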

Russia scraps robot Fedor after unsuccessful space odyssey

The Japan Times

MOSCOW – It's mission over for a robot called Fedor that Russia blasted to the International Space Station, the developers said Wednesday, admitting he could not replace astronauts on spacewalks. "He won't fly there any more. There's nothing more for him to do there, he's completed his mission," Yevgeny Dudorov, executive director of robot developers Androidnaya Tekhnika, told RIA Novosti news agency. The silvery anthropomorphic robot cannot fulfill its assigned task to replace human astronauts on long and risky space walks, Dudorov said. Fedor -- short for Final Experimental Demonstration Object Research -- was built to assist space station astronauts.

Robot can launch out of the water and glide like a flying fish

New Scientist

Like a flying fish gliding above the water's surface, a robot can now propel itself out of water into flight. Mirko Kovac and his colleagues at Imperial College London have developed a robot that can lift itself out of water and travel through the air for up to 26 metres. The robot weighs 160 grams and could be used for ocean monitoring and sampling. It could take water samples by jumping in and out of the water in cluttered environments, avoiding obstacles such as ice in cold regions or floating objects after a flood. "In these situations, it's important to fly there quickly, take a sample and come back," says Kovac.

Look out for potential bias in chemical data sets


There might be disadvantages to using tried and trusted methods. Like most research fields, materials science has embraced 'big data', including machine-learning models and techniques. These are being used to predict new materials and properties, and to devise routes to existing drugs and chemicals. But machine learning requires training data, such as data on reagents, conditions and starting materials. These are usually gleaned from the literature and are human-generated. The choice of reagents that researchers use could come, for example, from experience or from previously published work.
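The bias mechanism described above is easy to see in miniature: a model fit to literature-mined frequencies simply reproduces the field's publishing habits. A toy sketch (reagent names and counts are illustrative, not real mined data):

```python
from collections import Counter

# Hypothetical reaction records mined from the literature. The
# distribution reflects what chemists habitually publish, not an
# unbiased survey of what actually works best.
literature = ["Pd/C", "Pd/C", "Pd/C", "Pd/C", "PtO2", "Raney Ni"]

counts = Counter(literature)

def recommend():
    """Frequency-based 'model': suggests the most-published reagent."""
    return counts.most_common(1)[0][0]

print(recommend())  # "Pd/C" -- less-published but viable reagents never surface
```

Any learner trained on such records inherits this skew, which is why the authors urge caution about human-generated chemical data sets.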

Cities Are Trying--Again--to Plan for Autonomous Vehicles


On the one hand, autonomous vehicles offer an excellent opportunity to rethink how American cities operate, down to each lane line, crosswalk, and curb. Two years ago, the National Association of City Transportation Officials, representing 81 North American cities, published its first planning guide to self-driving vehicles, highlighting the possibilities. If everyone moves around on electric-powered transit and robotaxis, no one needs to own a car. No one needs to park a car. So that first version outlined an elegant--albeit fanciful--vision of the cities of the future.

When the AI Professor Leaves, Students Suffer, Study Says


A study by researchers from the University of Rochester found an exodus of artificial intelligence (AI) professors from North American universities to the private sector has reduced the prospect that graduate students will found new AI companies. Those graduates who did start a company usually attracted less venture capital, with the field of deep learning especially affected, according to "Artificial Intelligence, Human Capital, and Innovation," by Michael Gofman and Zhao Jin. This academic attrition could hinder innovation and economic expansion over time, the researchers suggest. The technology industry mostly ignored deep learning's potential until 2010, but interest grew as the Internet produced more data and new computer chips reduced the analytical burden. Large tech companies have hired many academic specialists, including two recent recipients of the ACM A.M. Turing Award honored for their work on neural networks.