Results


Microsoft drone simulator helps you prevent real-world crashes

Engadget

Microsoft just published an open-source simulator, the Aerial Informatics and Robotics Platform, that helps designers test and train autonomous machines in realistic conditions without wrecking expensive prototypes. You can see what the vehicle would see (including simulated sensor data), and the software ties into both existing robotic hardware platforms and machine learning systems to speed up development. Moreover, the simulator isn't necessarily confined to testing hardware. Microsoft sees its tech helping with all kinds of computer vision and machine learning code.
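For context, here is a minimal sketch of how a developer might pull the simulated camera feed the excerpt describes, assuming the platform's later Python client (the `airsim` package); none of this code appears in the article itself:

```python
# Hedged sketch: assumes a running simulator instance and the "airsim"
# Python client that Microsoft later shipped with this platform.
import airsim

client = airsim.MultirotorClient()   # connect to the simulator over its RPC port
client.confirmConnection()

# Request one simulated front-camera frame, i.e. "what the vehicle would see".
responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene,
                        pixels_as_float=False, compress=True)
])

frame = responses[0]
print(f"received {len(responses)} image(s), first is {frame.width}x{frame.height} px")
```

Frames retrieved this way can then be fed into whatever computer vision or machine learning pipeline the developer is testing, which is the workflow the article highlights.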


Autonomous driving - do it yourself!

#artificialintelligence

The ALV (Autonomous Land Vehicle) project used lidar sensors, computer vision and robotic control to drive a car at low speed. On the other hand, there are approaches similar to the ALVINN and Nvidia concepts mentioned above, which map the road image directly to steering commands (see the sketch below). OpenAI Universe makes experiments with computer games particularly easy, as it provides a complete environment for testing AI agents. Computer games are becoming complex enough to emulate the real world, so there is active research on collecting data in virtual environments and evaluating models trained on that data in real traffic.
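As a rough illustration of the "road image directly to steering commands" idea, here is a hedged PyTorch sketch of an end-to-end regression network in the spirit of ALVINN and Nvidia's PilotNet; the layer sizes are illustrative and not taken from either system:

```python
# Illustrative end-to-end steering regressor: image in, single steering angle out.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(48 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 1),          # single output: steering command
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# One forward pass on a dummy 66x200 RGB road image (PilotNet's input resolution).
model = SteeringNet()
angle = model(torch.randn(1, 3, 66, 200))
print(angle.shape)  # torch.Size([1, 1])
```

Such a network is typically trained on (image, recorded steering angle) pairs, which is exactly the kind of data the virtual-environment research mentioned above tries to generate cheaply.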


Intelligent vision systems and AI for the development of autonomous driving

#artificialintelligence

Like much of the technology needed to support and enable autonomous vehicles, intelligent vision systems already exist and are used in other industries, for example in industrial robots. This will require processing power that is only now becoming available, through advances in System-on-Chip platforms, advanced software, deep learning algorithms and open source projects. It is enabled by the development of Heterogeneous System Architectures (HSA): platforms that combine powerful general-purpose Microprocessor Units (MPUs) with highly parallel Graphics Processing Units (GPUs). The software infrastructure needed to develop intelligent vision systems, such as OpenCV (Open Source Computer Vision) and OpenCL (Open Computing Language), requires high-performance processing platforms to execute its advanced algorithms.
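To make the OpenCV/OpenCL pairing concrete, here is a hedged Python sketch using OpenCV's transparent API, where wrapping an image in a `UMat` lets the same calls run on an OpenCL device when one is available; the input frame is a random placeholder rather than a real camera image:

```python
# Hedged illustration of OpenCV with OpenCL acceleration via the transparent API.
import cv2
import numpy as np

cv2.ocl.setUseOpenCL(True)           # ask OpenCV to use an OpenCL device if one exists

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)  # stand-in road frame
gpu_frame = cv2.UMat(frame)          # UMat routes subsequent calls through OpenCL

gray = cv2.cvtColor(gpu_frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)     # simple edge map, a building block for lane detection

print("OpenCL active:", cv2.ocl.useOpenCL())
print("edge map shape:", edges.get().shape)   # .get() copies the result back to a NumPy array
```

The point of the example is the design choice the article alludes to: the vision algorithm stays the same, while the heterogeneous platform decides whether it executes on the MPU or the GPU.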


Video Friday: iCub Does Yoga, Wooden Walking Robot, and Wind Tunnel for Drones

IEEE Spectrum Robotics Channel

We're told that Markus bet that this thing could only work in theory, and lost: this video introduces the monospinner, the mechanically simplest controllable flying machine in existence. With the company's ultra-low-power, high-performance Myriad 2 processor inside, the Fathom Neural Compute Stick can run fully trained neural networks at under 1 watt of power. As a tinkerer and builder of various robots and flying contraptions, I've been dreaming of getting my hands on something like the Fathom Neural Compute Stick for a long time. Last year in Seoul, KAIST's Unmanned Systems Research Group participated in an autonomous car demo in downtown Seoul.


Drive.ai to test 'deep learning' autonomous cars on California roads

#artificialintelligence

Another self-driving car startup is about to hit the roads. The Wall Street Journal reports that Drive.ai, a Silicon Valley startup that received $12 million in funding last year, has been granted a license to test autonomous vehicles on California roads. It's the 13th company to receive permission. But given that it has been granted a license to hit the road, it seems likely that it has stable self-driving tech ready to put on a vehicle.