Sony Partners With CMU to Develop Food Prep and Delivery Robots

IEEE Spectrum Robotics Channel

Last week, Sony and Carnegie Mellon University announced a collaboration "on artificial intelligence (AI) and robotics research." Usually, these announcements pretty much just end there, with the implication being that giant corporation X will support academic research institution Y by funding ongoing research or a string of new initiatives. This Sony/CMU announcement is a bit more exciting because of how specific it is: The project will be about food. Researchers will focus on defining the domain of food ordering, preparation, and delivery. Initially, they will build on existing manipulation and mobile robots, and plan to develop new domain-specific robots for particular food preparation tasks and for mobility in small, confined spaces.


Reconfigurable Path Planning for an Autonomous Unmanned Aerial Vehicle

AAAI Conferences

In this paper, we present a motion planning framework for a fully deployed autonomous unmanned aerial vehicle which integrates two sampling-based motion planning techniques, Probabilistic Roadmaps (PRM) and Rapidly-exploring Random Trees (RRT). Additionally, we incorporate dynamic reconfigurability into the framework by integrating the motion planners with the control kernel of the UAV in a novel manner, with little modification to the original algorithms. The framework has been verified both in simulation and in actual flight. Empirical results show that these techniques, used within such a framework, offer a surprisingly efficient method for dynamically reconfiguring a motion plan in response to unforeseen contingencies that may arise during plan execution. The framework is generic and can be used on other platforms.
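The abstract does not include pseudocode. As a rough illustration of the RRT half of such a framework, here is a minimal 2D sketch in Python; the collision check, step size, and goal bias are placeholder assumptions for illustration, not parameters from the paper.

```python
import random, math

# Minimal 2D RRT sketch: repeatedly sample a point, extend the nearest tree
# node a fixed step toward it, and stop once the goal is within one step.
# The collision check is a stub; a real planner would test against a map.

STEP = 0.5          # extension step size (placeholder)
GOAL_BIAS = 0.05    # probability of sampling the goal directly (placeholder)
MAX_ITERS = 5000

def collision_free(p, q):
    return True  # stub: assume free space

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def rrt(start, goal, bounds):
    nodes = [start]
    parent = {start: None}
    for _ in range(MAX_ITERS):
        sample = goal if random.random() < GOAL_BIAS else (
            random.uniform(*bounds[0]), random.uniform(*bounds[1]))
        nearest = min(nodes, key=lambda n: dist(n, sample))
        d = dist(nearest, sample)
        if d == 0:
            continue
        step = min(STEP, d)
        new = (nearest[0] + step * (sample[0] - nearest[0]) / d,
               nearest[1] + step * (sample[1] - nearest[1]) / d)
        if not collision_free(nearest, new):
            continue
        nodes.append(new)
        parent[new] = nearest
        if dist(new, goal) <= STEP and collision_free(new, goal):
            # Walk parent links back from the goal to recover the path.
            path, n = [goal], new
            while n is not None:
                path.append(n)
                n = parent[n]
            return list(reversed(path))
    return None  # no path found within the iteration budget

if __name__ == "__main__":
    print(rrt((0.0, 0.0), (9.0, 9.0), bounds=((0, 10), (0, 10))))
```

In a reconfigurable framework like the one described, a loop of this kind would be re-invoked (or its tree pruned and regrown) whenever the control kernel reports a contingency that invalidates the current plan.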


Autonomous system uses quadcopters to help wheeled robots climb steep cliffs

#artificialintelligence

Sheer cliff faces present a traversal challenge for most wheeled robots on the market, but researchers at the University of Tokyo say they've developed a two-robot framework that works reliably in their testing. In a newly published paper on the preprint server arXiv.org, the authors explain: "[We] propose a novel cooperative system for an Unmanned Aerial Vehicle (UAV) and an Unmanned Ground Vehicle (UGV) which utilizes the UAV not only as a flying sensor but also as a tether attachment device. [It enhances] the poor traversability of the UGV by not only providing a wider range of scanning and mapping from the air, but also by allowing the UGV to climb steep terrains with the winding of the tether." The UGV is permanently attached via a mechanized winch and cable to the UAV, a custom-made quadcopter with an Nvidia Jetson TX2 chipset, a flight controller, and a raft of sensors including a modular fisheye camera, a time-of-flight sensor, an inertial measurement unit (IMU), and a laser sensor.
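To see why the tether matters, a back-of-envelope model (not from the paper) is enough: on a slope steeper than the wheels' friction can handle, the anchored tether must supply the shortfall in pulling force. The sketch below assumes the tether pulls parallel to the slope and simple Coulomb friction; the mass, slope, and friction values are illustrative only.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def min_tether_tension(mass_kg, slope_deg, mu):
    """Extra pull (N) a UGV needs to climb a slope its wheel traction alone cannot."""
    theta = math.radians(slope_deg)
    gravity_along_slope = mass_kg * G * math.sin(theta)
    max_wheel_traction = mu * mass_kg * G * math.cos(theta)
    return max(0.0, gravity_along_slope - max_wheel_traction)

# Example: a 20 kg rover on a 60-degree slope with friction coefficient 0.6.
print(f"{min_tether_tension(20, 60, 0.6):.1f} N of tether tension required")
```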


Intel's tiny Euclid computer can be the brains of a robot

PCWorld

A compact computer called Euclid from Intel should make the development of robots much easier. Euclid looks much like the Kinect camera for Xbox consoles, but it's a self-contained PC that can be the guts of a robot. It's possible to install the Euclid computer where the "eyes" of a human-like robot would typically be placed. Intel demonstrated the Euclid computer in a robot moving on stage during CEO Brian Krzanich's keynote at the Intel Developer Forum on Tuesday. Euclid has a built-in 3D RealSense camera that can serve as a robot's eyes, capturing depth imagery in real time.
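RealSense cameras are accessed through Intel's RealSense SDK (librealsense). As a hedged illustration of grabbing depth data from a RealSense device, here is a short sketch using the current pyrealsense2 bindings; note that these bindings target newer RealSense cameras than the one built into Euclid, so treat this as illustrative rather than as Euclid's own software stack.

```python
import pyrealsense2 as rs

# Start a depth stream from an attached RealSense camera and read one frame.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    if depth:
        # Distance, in meters, to whatever is at the center of the frame.
        print(depth.get_distance(320, 240))
finally:
    pipeline.stop()
```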