Protecting computer vision from adversarial attacks

#artificialintelligence

Advances in computer vision and machine learning have made it possible for a wide range of technologies to perform sophisticated tasks with little or no human supervision. From autonomous drones and self-driving cars to medical imaging and product manufacturing, many computer applications and robots use visual information to make critical decisions. Cities increasingly rely on these automated technologies for public safety and infrastructure maintenance. However, compared to humans, computers see with a kind of tunnel vision that leaves them vulnerable to attacks with potentially catastrophic results. For example, a human driver, seeing graffiti covering a stop sign, will still recognize it and stop the car at an intersection; a computer vision system, by contrast, can be fooled by carefully crafted markings into misreading or missing the sign entirely.
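
To make the threat concrete, the sketch below shows a fast gradient sign method (FGSM) attack, one of the simplest adversarial-example constructions: a small, nearly imperceptible perturbation of the input is enough to change a classifier's prediction. This is an illustrative sketch only, not the attack studied in the article; the ResNet-18 model, the random stand-in image, and the assumed ImageNet label index are placeholders.

import torch
import torch.nn.functional as F
import torchvision.models as models

# Pretrained ImageNet classifier used purely as a stand-in victim model.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_attack(image, label, epsilon=0.03):
    """Return a perturbed copy of `image` (shape [1, 3, H, W]) that raises
    the classification loss, i.e. pushes the model away from `label`."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the direction that increases the loss, keep pixels in [0, 1].
    return (image + epsilon * image.grad.sign()).clamp(0.0, 1.0).detach()

# Usage with a random stand-in image; a real attack would start from a
# correctly classified photo, e.g. of a stop sign.
x = torch.rand(1, 3, 224, 224)
y = torch.tensor([919])  # assumed ImageNet index for "street sign" (placeholder)
x_adv = fgsm_attack(x, y)
print("clean:", model(x).argmax(1).item(), "adversarial:", model(x_adv).argmax(1).item())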


Swaayatt Robots: Pioneering Reinforcement Learning in Autonomous Driving

#artificialintelligence

The startup focuses on developing self-driving technology for unstructured environments, and India's road network is full of them. In the thick of it is founder and CEO Sanjeev Sharma, whose interest in robotics was sparked back in 2009, when he watched videos of Team MIT at the 2007 DARPA Urban Challenge. Over time he knew he wanted to home in on research enabling autonomous driving in the most difficult traffic scenarios, but it wasn't until 2014, when Sharma deferred his PhD at the University of Massachusetts for a year, that he established Swaayatt Robots. Fast forward eight years: the field knows far more about autonomous mobility than it did in 2014, yet safety remains a huge challenge. Even before purchase and operating costs enter the picture, we are still some way from solving driver safety in uncontrolled, unstructured environments -- but Swaayatt Robots is trying to fix that.


Study Finds That Majority Of Drivers Distrust Hands-Free Driving Systems

#artificialintelligence

Automakers are jumping into the field of advanced driver assistance systems (ADAS) with both feet, trying to stuff as many features into their new cars as they can. The Insurance Institute for Highway Safety (IIHS), though, wanted to find out what consumers actually want. The survey shows that the majority of consumers are fairly conservative when it comes to ADAS. After surveying 1,000 drivers about three partially automated driving features (lane centering, automated lane changing, and driver monitoring), the IIHS found that consumers prefer systems that keep them more in control and include more safeguards. Although consumer interest in ADAS technology is strong, drivers grow more suspicious as the technology becomes more hands-free.


New Method Helps Self-Driving Cars Create 'Memories'

#artificialintelligence

A team of researchers at Cornell University has developed a new method enabling autonomous vehicles to create "memories" of previous experiences, which can then be used in future navigation. This will be especially useful when these self-driving cars can't rely on their sensors in bad weather. Current self-driving cars that use artificial neural networks have no memory of the past, meaning they are constantly "seeing" things for the first time, no matter how many times they've driven the exact same road. Kilian Weinberger, a professor of computer science, is the senior author of the research.
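
As a rough illustration of the "memory" idea, the sketch below caches compact features from past drives, keyed by a coarse map location, and fuses them with the features computed on the current pass, so a scene degraded by rain or snow can still benefit from clear-weather traversals of the same road. This is a toy approximation under assumed interfaces (the encoder and detect callables are hypothetical placeholders), not the Cornell team's published pipeline.

from collections import defaultdict
import numpy as np

class TraversalMemory:
    """Toy location-keyed feature memory for a perception stack.

    encoder(scan) -> 1-D np.ndarray feature vector (placeholder for a
    learned point-cloud or image encoder); detect(feature) -> detections
    (placeholder for the downstream detector)."""

    def __init__(self, encoder, detect, cell_size_m=5.0):
        self.encoder = encoder
        self.detect = detect
        self.cell_size_m = cell_size_m
        self.bank = defaultdict(list)  # grid cell -> features from past drives

    def _cell(self, xy):
        # Quantize the vehicle's (x, y) position into a coarse grid-cell key.
        return tuple(np.floor(np.asarray(xy) / self.cell_size_m).astype(int))

    def perceive(self, scan, xy):
        feat = self.encoder(scan)
        past = self.bank[self._cell(xy)]
        if past:
            # Fuse current features with the average of remembered ones, so the
            # detector is no longer "seeing" this stretch of road for the first time.
            feat = 0.5 * feat + 0.5 * np.mean(past, axis=0)
        self.bank[self._cell(xy)].append(feat)
        return self.detect(feat)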


Mysterious Drones Strike Russian Oil Refinery, Sending Ball Of Flame Into Sky

International Business Times

Mysterious drones hit a major Russian oil refinery in Novoshakhtinsk in the Rostov region, near the border with Ukraine, the plant's management said Wednesday. The incident came as the war between Ukraine and Russia is about to enter its fifth month. The strike sent a ball of flame and black smoke into the sky, prompting the suspension of operations, the authorities told local media. A video shared on the Telegram messaging service shows a drone with a twin-boom tail configuration crashing into the refinery. There is speculation that it was a "kamikaze" drone strike conducted by the Ukrainian Armed Forces, according to The Drive.


Amazon builds its first fully autonomous mobile robot for warehouses

ZDNet

The robot, called Proteus, will soon be deployed in fulfillment centers and sort centers, 10 years after Amazon established its robotics business with the acquisition of the robotics firm Kiva Systems. The e-commerce giant has long said its ultimate aim is to build warehouse robots that work "alongside" humans rather than replacing them. Unlike other warehouse robots, Proteus is designed to do so safely. "Historically, it's been difficult to safely incorporate robotics in the same physical space as people," Amazon explained in a blog post. "We believe Proteus will change that while remaining smart, safe, and collaborative."


Researchers release open-source photorealistic simulator for autonomous driving

Robohub

VISTA 2.0 is an open-source simulation engine that can create realistic environments for training and testing self-driving cars. Hyper-realistic virtual worlds have been heralded as the best driving schools for autonomous vehicles (AVs), since they've proven fruitful test beds for safely trying out dangerous driving scenarios. Tesla, Waymo, and other self-driving companies rely heavily on data to power expensive, proprietary photorealistic simulators, since nuanced near-crash data usually isn't easy or desirable to collect and recreate. To that end, scientists from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) created "VISTA 2.0," a data-driven simulation engine in which vehicles can learn to drive in the real world and recover from near-crash scenarios. What's more, all of the code is being open-sourced to the public.


Proteus is Amazon's first fully autonomous warehouse robot

Engadget

In a post looking back over the past 10 years since it purchased robotics company Kiva, Amazon has revealed its new machines, including its first fully autonomous warehouse robot. It's called Proteus, and it was designed to move around Amazon's facilities on its own while carrying carts full of packages. The company said the robot uses "advanced safety, perception and navigation technology" it developed so it can do its work without hindering human employees. In the video Amazon posted, you can see Proteus moving under the carts and transporting them to other locations. It emits a green beam ahead of it as it moves, and it stops if a human worker steps in front of the beam.


Researchers release open-source photorealistic simulator for autonomous driving

#artificialintelligence

VISTA 2.0 builds off of the team's previous model, VISTA, and it's fundamentally different from existing AV simulators since it's data-driven -- meaning it was built and photorealistically rendered from real-world data -- thereby enabling direct transfer to reality. While the initial iteration supported only single car lane-following with one camera sensor, achieving high-fidelity data-driven simulation required rethinking the foundations of how different sensors and behavioral interactions can be synthesized. Enter VISTA 2.0: a data-driven system that can simulate complex sensor types and massively interactive scenarios and intersections at scale. With much less data than previous models, the team was able to train autonomous vehicles that could be substantially more robust than those trained on large amounts of real-world data. "This is a massive jump in capabilities of data-driven simulation for autonomous vehicles, as well as the increase of scale and ability to handle greater driving complexity," says Alexander Amini, CSAIL PhD student and co-lead author on two new papers, together with fellow PhD student Tsun-Hsuan Wang.
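
At its core, data-driven simulation replays logged drives while letting a learning agent deviate from the recorded trajectory, synthesizing the views it would have seen and penalizing it when it strays too far. The sketch below illustrates that closed loop with a deliberately simplified, hypothetical ReplaySim class and a hand-written proportional controller; it is not the VISTA 2.0 API, whose real interface and rendering are documented in the project's open-source release.

import numpy as np

class ReplaySim:
    """Hypothetical stand-in for a data-driven simulator. It steps along a
    logged human drive and tracks the ego vehicle's lateral offset from that
    trajectory; a real engine such as VISTA 2.0 would also synthesize
    photorealistic camera/LiDAR views from the perturbed viewpoint."""

    def __init__(self, log_length=500, dt=0.1, speed=10.0):
        self.log_length, self.dt, self.speed = log_length, dt, speed

    def reset(self):
        self.t = 0
        self.heading = 0.0
        self.lateral_offset = 1.0  # start off the logged line to exercise recovery
        return np.array([self.lateral_offset, self.heading])

    def step(self, steering):
        # Toy kinematics: steering changes heading; heading drifts the car
        # sideways relative to the recorded human trajectory.
        self.heading += steering * self.dt
        self.lateral_offset += self.speed * np.sin(self.heading) * self.dt
        self.t += 1
        obs = np.array([self.lateral_offset, self.heading])
        crashed = abs(self.lateral_offset) > 1.5          # left the lane
        done = crashed or self.t >= self.log_length
        reward = -abs(self.lateral_offset) - (10.0 if crashed else 0.0)
        return obs, reward, done

# Closed-loop rollout with a trivial proportional "policy"; this loop is where
# a learned policy would practice recovering from near-crash states inside the
# simulator instead of on a real road.
sim = ReplaySim()
obs, done, total = sim.reset(), False, 0.0
while not done:
    steering = -0.5 * obs[0] - 1.0 * obs[1]   # steer back toward the logged path
    obs, reward, done = sim.step(steering)
    total += reward
print(f"episode return: {total:.1f}")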