Robotics & Automation


Trashy Robots

The Atlantic

There's nothing, it would seem, that Peter Kokis can't turn into a robot. The Brooklyn performance artist makes cyborgs out of 100 percent recycled materials--oftentimes salvaged from the trash. He builds the 170-pound costumes on his kitchen table. When he's done, Kokis parades through the streets, a veritable Transformer among mortals. "I look for complexity in everyday objects," Kokis says in Aaron Craig's short documentary One Man's Trash.


A robot dog has learned to run faster with machine learning

#artificialintelligence

Reinforcement learning has helped a four-legged bot move a bit like a real animal, without being taught how to take each step. The news: Roboticists want their creations to mimic animals because animals have evolved to move with remarkable energy efficiency. But the eerily lifelike movement of robots like Boston Dynamics' SpotMini is usually coded by hand. Now researchers have combined simulation with a technique called reinforcement learning to teach a dog-like robot called ANYmal to run faster and recover from falls. Crucially, it did so without any manual intervention.
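
The article doesn't go into implementation detail, but the basic recipe it describes -- let the robot try gaits in simulation and keep whatever earns more reward, instead of hand-coding each step -- can be sketched in a few lines. This is a toy illustration only: the stand-in "simulator", the linear policy, and the reward terms are hypothetical, not the actual ANYmal training setup.

```python
# Minimal sketch: optimize a locomotion policy in a toy simulator by rewarding
# forward speed and penalizing wasted energy. Purely illustrative.
import numpy as np

OBS_DIM, ACT_DIM = 24, 12          # joint state in, joint commands out (made-up sizes)

def simulate_episode(policy_weights, steps=200, seed=0):
    """Roll out a linear policy in a stand-in simulator and return total reward."""
    rng = np.random.default_rng(seed)
    obs = rng.standard_normal(OBS_DIM)
    total_reward = 0.0
    for _ in range(steps):
        action = np.tanh(policy_weights @ obs)              # commands in [-1, 1]
        # Stand-in dynamics: next state depends on the action plus a little noise.
        obs = (0.9 * obs
               + 0.1 * np.concatenate([action, action])
               + 0.01 * rng.standard_normal(OBS_DIM))
        forward_velocity = float(obs[0])                    # pretend obs[0] is speed
        energy_cost = 0.01 * float(action @ action)         # discourage wasteful gaits
        total_reward += forward_velocity - energy_cost
    return total_reward

# Simple trial-and-error search: perturb the policy, keep changes that score better.
policy = np.zeros((ACT_DIM, OBS_DIM))
best = simulate_episode(policy)
rng = np.random.default_rng(1)
for _ in range(300):
    candidate = policy + 0.05 * rng.standard_normal(policy.shape)
    score = simulate_episode(candidate)
    if score > best:
        policy, best = candidate, score
print("best simulated return:", best)
```

The real system uses far more sophisticated reinforcement learning and an accurate physics simulator, but the loop above captures why no manual gait programming is needed: the reward signal, not a human, decides which movements are kept.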


4 Promising IoT Use Cases in Agriculture

#artificialintelligence

Agriculture is undergoing a renaissance. IoT and artificial intelligence are enabling farmers to manage crops and livestock more reliably and efficiently. Autonomous farming equipment, livestock monitoring systems, and precision farming solutions are empowering farmers to feed our increasingly hungry and environmentally unstable world. As we begin 2019, it's exciting to reflect on all the Internet of Things (IoT) industry changes that occurred in 2018 and the trends that lie ahead in 2019. Many industries have been and will continue to be affected by the growth and maturation of IoT: school campuses will be safer, cars will be smarter, homes will be sleeker and more intuitive, and businesses will deliver more value more efficiently.


Making Autonomous Vehicles Safer

#artificialintelligence

While self-driving vehicles are beta-tested on some public roads in real traffic situations, the semiconductor and automotive industries are still getting a grip on how to test and verify that vehicle electronics systems work as expected. Testing can be high stakes, especially when done in public. Some of the predictions about how humans will interact with autonomous vehicles (AVs) on public roads are already coming true, but human creativity is endless. There have been attacks on Waymo test vehicles in Arizona, a DUI arrest of a Tesla driver asleep at 70 mph on a freeway, and hacks using oranges and aftermarket gadgets to trick Tesla's Autopilot into thinking the driver's hands are on the wheel. But are those unsafe human behaviors any more dangerous than the drumbeat of technology hype, unrealistic marketing, and a lack of teeth in regulating AV testing on public roads, in the factory, and in the design lab?


The Achilles' Heel of AI Computer Vision

#artificialintelligence

Would you ride in an autonomous vehicle if you knew that it was subject to visual problems? How about undergoing cancer treatment based on a computer's interpretation of radiological images such as an X-ray, ultrasound, CT, PET, or MRI scan, knowing that computer vision could easily be fooled? Computer vision has a problem--it only takes slight changes to the input data to fool machine learning algorithms into "seeing" things wrong. Recent advances in computer vision are largely due to improved pattern-recognition capabilities delivered by deep learning, a type of machine learning. Machine learning is a subset of artificial intelligence in which a computer learns concepts from input data without explicit programming, either through supervised learning, where the training data is labeled, through unsupervised learning, where it is not, or through a combination of the two.
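
To make the "slight changes fool the model" point concrete, here is a toy illustration of one standard way such perturbations are crafted (a fast-gradient-sign-style attack). The classifier below is a made-up logistic-regression "image" model, not a real radiology or driving system; it simply shows how a barely visible nudge to every pixel can shift the model's confidence.

```python
# Toy adversarial perturbation: move each input pixel a tiny step in the
# direction that increases the model's loss, then compare predictions.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(64)          # weights of a "trained" toy classifier (8x8 image)
b = 0.0

def predict_prob(x):
    """Probability that x belongs to the positive class (e.g. "lesion present")."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

x = rng.standard_normal(64)                          # a clean input
label = 1.0 if predict_prob(x) > 0.5 else 0.0        # whatever the model currently says

# Gradient of the logistic loss with respect to the *input*, not the weights.
grad_x = (predict_prob(x) - label) * w

# Fast-gradient-sign step: epsilon controls how small (imperceptible) the change is.
epsilon = 0.1
x_adv = x + epsilon * np.sign(grad_x)

print("clean prediction:      ", round(float(predict_prob(x)), 3))
print("adversarial prediction:", round(float(predict_prob(x_adv)), 3))
print("max per-pixel change:  ", round(float(np.abs(x_adv - x).max()), 3))
```

Deep networks are attacked with the same idea at much larger scale, which is why small stickers on a stop sign or faint noise on a scan can change what the system "sees".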


A Choice of Grippers Helps Dual-Arm Robot Pick Up Objects Faster Than Ever

IEEE Spectrum Robotics Channel

We've been following Dex-Net's progress towards universal grasping for several years now, and today in a paper in Science Robotics, UC Berkeley is presenting Dex-Net 4.0. The new and exciting bit about this latest version of Dex-Net is that it's able to successfully grasp 95 percent of unseen objects at a rate of 300 per hour, thanks to some added ambidexterity that lets the robot dynamically choose between two different kinds of grippers. For some context, humans are able to pick objects like these nearly twice as fast, between 400 and 600 picks per hour. And my guess would be that human success rates are as close to 100 percent as you can reasonably expect, perhaps achieving 100 percent if you allow for multiple tries to pick the same object. We set a very, very high bar for the machines.
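
The "ambidexterity" in that result amounts to scoring candidate grasps for each gripper and executing whichever scores highest. The sketch below shows that selection logic in miniature; the quality-scoring functions are hypothetical stand-ins, not Dex-Net's trained grasp-quality networks, and the threshold value is made up.

```python
# Minimal sketch of dual-gripper grasp selection: score candidates for each
# gripper, pick the single best grasp, or give up if nothing looks reliable.
import numpy as np
from dataclasses import dataclass

@dataclass
class Grasp:
    gripper: str          # "parallel_jaw" or "suction"
    pose: np.ndarray      # where/how to grasp, in the camera frame
    quality: float        # predicted probability the grasp will succeed

def score_parallel_jaw(depth_image, candidates):
    """Stand-in for a learned quality model for the two-finger gripper."""
    return [Grasp("parallel_jaw", pose, float(np.random.uniform(0.3, 0.95)))
            for pose in candidates]

def score_suction(depth_image, candidates):
    """Stand-in for a learned quality model for the suction-cup gripper."""
    return [Grasp("suction", pose, float(np.random.uniform(0.3, 0.95)))
            for pose in candidates]

def plan_pick(depth_image, jaw_candidates, suction_candidates, threshold=0.5):
    """Choose the best grasp across both grippers, or return None below threshold."""
    grasps = (score_parallel_jaw(depth_image, jaw_candidates)
              + score_suction(depth_image, suction_candidates))
    best = max(grasps, key=lambda g: g.quality)
    return best if best.quality >= threshold else None

# Usage with dummy data: a fake depth image and a few random candidate poses.
depth = np.zeros((480, 640))
candidates = [np.random.rand(4) for _ in range(10)]
chosen = plan_pick(depth, candidates, candidates)
print(chosen.gripper if chosen else "no confident grasp -- re-image the bin")
```

Letting the policy refuse low-confidence grasps is part of how a system like this keeps its success rate high while still sustaining hundreds of picks per hour.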


This robot dog can recover from a vicious kick using AI

#artificialintelligence

Researchers at ETH Zurich in Switzerland taught a four-legged robot dog a valuable life skill: how to get up again after it gets knocked down. And yes, it involved evil scientists kicking and shoving an innocent robot. The researchers used an AI model to teach ANYmal, a doglike robot made by ANYbotics, how to right itself after being knocked onto its side or back in a variety of physical environments -- as opposed to giving the robot a detailed set of instructions for only one specific environment. The results were published today in a paper in Science Robotics. In simple terms, the robot tried again and again to right itself in simulation, and learned from the instances when a movement didn't end up righting it.
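
The key to that trial-and-error learning is the scoring signal: every simulated attempt gets a number reflecting whether the robot ended up upright, so failed attempts automatically push the policy in a better direction. Below is a rough sketch of what such a score could look like; the quantities and weights are illustrative, not ETH Zurich's actual reward function.

```python
# Illustrative "righting" reward: favor ending upright, quickly, without
# thrashing the joints. A failed attempt scores lower than a successful one,
# which is exactly the feedback the learning algorithm needs.
import numpy as np

def recovery_reward(body_up_vector, joint_torques, time_taken):
    """Score one simulated righting attempt."""
    world_up = np.array([0.0, 0.0, 1.0])
    uprightness = float(np.dot(body_up_vector, world_up))   # 1.0 upright, -1.0 on its back
    effort_penalty = 0.001 * float(np.sum(np.square(joint_torques)))
    time_penalty = 0.05 * time_taken                         # prefer quick recoveries
    return uprightness - effort_penalty - time_penalty

print(recovery_reward(np.array([1.0, 0.0, 0.0]), np.zeros(12), 3.0))  # still on its side
print(recovery_reward(np.array([0.0, 0.0, 1.0]), np.zeros(12), 3.0))  # back on its feet
```

Because the score depends only on the outcome, not on a scripted sequence of moves, the same training procedure transfers across the variety of environments the researchers tested.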


Meet Nimbo: Turing Video's Autonomous Security Robot

#artificialintelligence



Robot Recreates the Walk of a 290-Million-Year-Old Creature

U.S. News

Evolutionary biologist John Nyakatura at Humboldt University in Berlin has spent years studying a 290-million-year-old fossil dug up in central Germany's Bromacker quarry in 2000. The four-legged plant-eater lived before the dinosaurs and fascinates scientists "because of its position on the tree of life," said Nyakatura. Researchers believe the creature is a "stem amniote" -- an early land-dwelling animal that later evolved into modern mammals, birds and reptiles.


Japan robot hotel fires most of its 'annoying' robotic staff

The Independent

A hotel in Japan has laid off more than half of its robotic staff following complaints from some guests about the practical limitations of the machines. Among the 243 robots employed by the Henn-na Hotel, which roughly translates as "Weird Hotel," were a velociraptor receptionist, an automated gardener and a one-armed claw that handles left luggage. The facility, which made headlines in 2015 when it opened in Nagasaki Prefecture, also made use of more experimental machines, such as a bedside-table-sized butler capable of arranging a wake-up call or announcing the weather forecast. Glitches with this robot saw it wake up guests who were snoring loudly after mistaking the noise for a voice command, The Wall Street Journal reported. The Henn-na Hotel describes the concept as "excitement meets comfort" thanks to "state-of-the-art" technologies.