In episode ten of season three we discuss the rate of change (prompted by Tim Harford), take a listener question about the power of kernels, and talk with Peter Donnelly in his capacity with the Royal Society's Machine Learning Working Group about their work on the public's views of AI and ML. See all the latest robotics news on Robohub, or sign up for our weekly newsletter.
The new Blade Runner sequel will return us to a world where sophisticated androids made with organic body parts can match the strength and emotions of their human creators. As someone who builds biologically inspired robots, I'm interested in whether our own technology will ever come close to matching the "replicants" of Blade Runner 2049. The reality is that we're a very long way from building robots with human-like abilities. But advances in so-called soft robotics show a promising way forward for technology that could be a new basis for the androids of the future. From a scientific point of view, the real challenge is replicating the complexity of the human body.
Nature has a way of making complex shapes from a set of simple growth rules. The curve of a petal, the swoop of a branch, even the contours of our face are shaped by these processes. What if we could unlock those rules and reverse engineer nature's ability to grow an infinitely diverse array of shapes? Scientists from Harvard's Wyss Institute for Biologically Inspired Engineering and the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have done just that. In a paper published in the Proceedings of the National Academy of Sciences, the team demonstrates a technique to grow any target shape from any starting shape.
I am happy to announce that Robots Podcast will be renamed to "Robohub Podcast". This name change is to avoid confusion about how the podcast and Robohub relate, a question we frequently get. The answer is that they are part of the same effort to connect the global robotics community to the world -- and they were founded by many of the same people. The podcast began in 2006 as "Talking Robots" and was launched by Dr. Dario Floreano at EPFL in Switzerland and his PhD students. Several of those PhD students then went on to launch the "Robots Podcast", which will celebrate its 250th episode at the end of this year (make sure to check the whole playlist)!
Abstract: "This presentation will highlight the history of autonomous vehicle development at Ford Motor Company and elsewhere within the industry, with an emphasis on discussing some of the difficult remaining challenges to be solved. Additionally, examples illustrating the broader range of potential applications for AI and Robotics within the transportation industry will be touched upon."
In this interview, Gerdes discusses developing a model for high-performance control of a vehicle; their autonomous race car, an Audi TTS named 'Shelley,' and how its autonomous performance compares to amateur and professional race car drivers; and an autonomous, drifting DeLorean named 'MARTY.' Chris Gerdes is a Professor of Mechanical Engineering at Stanford University, Director of the Center for Automotive Research at Stanford (CARS) and Director of the Revs Program at Stanford. His laboratory studies how cars move, how humans drive cars and how to design future cars that work cooperatively with the driver or drive themselves. When not teaching on campus, he can often be found at the racetrack with students, instrumenting historic race cars or trying out their latest prototypes for the future.
Think of self-driving cars that can communicate with traffic lights, smart city sensor systems, savvy home appliances, industrial automation systems, connected health innovations, personal drones, robots and more. In the SoftBank and Huawei robot demonstration, a robotic arm played an air hockey game against a human. Sensor data from the game was streamed to the cloud, and the calculated result was then forwarded to the robotic arm control server to drive the arm. Other demonstrations by SoftBank and Huawei included real-time ultra-high-definition camera data compressed, streamed, and then displayed on a UHD monitor; immersive video scenery captured by 180-degree four-lens cameras, uploaded and then downloaded to smartphones and tablets; remote rendering by a cloud GPU server; and the robot demo.
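The air-hockey demo boils down to a sense → cloud-compute → actuate round trip. A minimal sketch of that loop, with all function names and the naive intercept prediction being illustrative assumptions rather than details of the actual SoftBank/Huawei system:

```python
import json

# Hypothetical sketch of the demo's pipeline: sensor data goes up to a
# cloud service, which computes a command that is forwarded to the
# robot-arm control server. All names and numbers are illustrative.

def sense_puck(frame_id):
    """Stand-in for the camera/sensor capture of the air-hockey puck."""
    return {"frame": frame_id, "puck_x": 0.42, "puck_vx": -1.3}

def cloud_compute(sensor_msg):
    """Stand-in for the cloud step: predict where to intercept the puck."""
    # Naive linear prediction 0.1 s ahead (assumed, for illustration).
    intercept_x = sensor_msg["puck_x"] + sensor_msg["puck_vx"] * 0.1
    return {"frame": sensor_msg["frame"], "target_x": intercept_x}

def arm_control(command):
    """Stand-in for the robotic-arm control server executing the command."""
    return f"move paddle to x={command['target_x']:.2f}"

# One round trip, serialized as JSON as it might travel over the network.
uplink = json.dumps(sense_puck(0))
command = cloud_compute(json.loads(uplink))
downlink = json.dumps(command)
print(arm_control(json.loads(downlink)))
```

The point of routing the compute through the cloud, rather than doing it on the arm, is that it only works for a fast-reacting game like air hockey if the network's round-trip latency is very low, which is what the demo was showcasing.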
Mike Salem from Udacity's Robotics Nanodegree is hosting a series of interviews with professional roboticists as part of their free online material. This week we're featuring Mike's interview with Felipe Chavez, Co-Founder and CEO of Kiwi. Kiwi is a mobile robot company delivering food to hungry college students across the University of California, Berkeley campus. Listen to Felipe explain some of the challenges Kiwi faces when deploying its robots.
Denmark's Nilfisk Holding A/S was listed on the NASDAQ stock exchange under the symbol NLFSK after being spun off from NKT A/S, a Danish conglomerate. Nilfisk is one of the world's leading suppliers of professional cleaning equipment, with a strong brand and a vision for growth in robotics. In that pursuit, Blue Ocean Robotics and Nilfisk recently announced a strategic partnership to develop a portfolio of intelligent cleaning machines and robots to add to the Nilfisk line of industrial cleaners. "We already have good experiences with this, and we are looking forward to starting this partnership with Blue Ocean Robotics, which complements our other partnerships very well."
For somebody (like myself) who does not work with industrial robot safety standards every day, it can get confusing very fast when people start rattling off safety standard numbers. The main ones are R15.06, the industrial robot safety standard; the proposed R15.08 standard for industrial mobile robot safety; and R15.606, the collaborative robot safety standard. Robot Integrator is a certification (and a way to make money) from the Robotic Industries Association (RIA) that helps provide people trained to fill the safety role while integrating and designing new robot systems. R15.08, which is expected to be ratified as a standard in 2019, covers things like mobile robots, manipulators on mobile robots, and manipulators operating while the mobile base is also moving.