Together, these sensors detect and visualise everything around the truck, including cars, pedestrians and lamp posts. The system works with Caesium, a cloud-based platform (also developed by Oxbotica) that can manage and coordinate fleets of autonomous vehicles. The company sells a "smart platform" that gives other companies access to its delivery infrastructure -- the technology behind its apps, warehouses and delivery vehicles. "So it's very important for us to keep innovating and to keep doing exciting technology projects, because that will give us a competitive advantage going forward."
Creating a self-driving car sounds as if it should not be difficult, yet it is taking a while. Autonomous vehicles have been making headlines for years, but few of us have ever ridden in one or even seen one. Flying a plane is harder than driving a car, yet pilots have enjoyed autopilot for decades. The answer is clear -- or, more precisely, clear vision: pilots have been able to rely on autopilot for decades because they operate in clear, open skies.
Intelligent machines powered by artificial intelligence (AI) -- computers that can learn, reason and interact with people and the surrounding world -- are no longer science fiction. Thanks to a new computing model called deep learning, running on powerful graphics processing units (GPUs), AI is transforming industries from consumer cloud services to healthcare to factories and cities. Many of these applications are in place already, providing new services to millions around the world. However, no industry is poised for as significant a change as the $10 trillion transportation industry. The automotive market is next, and the opportunity to develop advanced self-driving vehicles holds the promise of dramatically safer driving and new mobility services.
The automaker's new Portal concept is a battery-powered, semi-autonomous, connected vehicle that it says was designed by millennials for themselves. It boasts a 250-mile range and can take on 150 miles' worth of electricity in 20 minutes at a fast-charge station. Its 100 kWh battery pack is integrated into the floor, which helps maximize interior space. Front and rear sliding doors create the large entry portals that inspire the minivan's name, while its six captain's chairs have fold-up seat bottoms and are mounted on rails that allow the cabin to be easily reconfigured to accommodate cargo or passengers as needed. The driver can be a passenger some of the time thanks to a suite of cameras, radar, LiDAR and ultrasonic sensors, plus high-definition maps augmented by GPS and car-to-car and car-to-infrastructure communications. Together these enable Level 3 autonomy, which allows the Portal to drive itself on some highways under human supervision.
That luggage rack and those antlers hold state-of-the-art camera and sensor technology that Ford hopes will keep it ahead of the increasingly crowded pack. An antler-like arm extending from each side of the company's autonomous test vehicle carries a LiDAR sensor, while the so-called luggage racks on the roof hold three cameras (a fourth nestles underneath the windshield). According to a Medium blog post by Chris Brewer, chief program engineer for Ford's Autonomous Vehicle Development, the new autonomous research vehicle can orient itself by comparing what its LiDAR, radar and other sensors pick up against detailed 3D maps, in what Brewer called "mediated perception." Ford will show its new autonomous research vehicles at the CES technology trade show in Las Vegas and the NAIAS automotive trade show in Detroit.
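The "mediated perception" idea above -- orienting the vehicle by comparing live sensor returns against a prior 3D map -- can be illustrated with a toy localization sketch. This is not Ford's implementation; the map points, scan, candidate poses and function names below are all hypothetical, and a real system would search continuously over millions of map points.

```python
import math

# Illustrative prior-map landmarks (2D for simplicity; real maps are 3D).
MAP_POINTS = [(5.0, 0.0), (0.0, 5.0), (-5.0, 0.0)]

def transform(scan, x, y, theta):
    """Apply a candidate pose (x, y, heading) to raw scan points."""
    c, s = math.cos(theta), math.sin(theta)
    return [(x + c * px - s * py, y + s * px + c * py) for px, py in scan]

def fit_error(scan, pose):
    """Mean distance from each transformed scan point to its nearest map point."""
    pts = transform(scan, *pose)
    return sum(min(math.dist(p, m) for m in MAP_POINTS) for p in pts) / len(pts)

def best_pose(scan, candidates):
    """Pick the candidate pose whose transformed scan best matches the map."""
    return min(candidates, key=lambda pose: fit_error(scan, pose))

# A scan taken one metre forward of the origin sees every landmark
# shifted by -1 in x relative to the vehicle.
scan = [(4.0, 0.0), (-1.0, 5.0), (-6.0, 0.0)]
pose = best_pose(scan, [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
print(pose)  # → (1.0, 0.0, 0.0): this pose aligns the scan with the map exactly
```

Production systems replace this brute-force candidate search with scan-matching algorithms such as ICP, but the principle -- the pose that best reconciles sensors with the map is the vehicle's location -- is the same.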
There is no question that the portability and omnipresence of cameras in today's society have improved driver safety -- video of a vehicle crash helps people find out specifically what went wrong. But what if you could build artificial intelligence into those in-vehicle camera systems, predicting problems on the road and preventing disaster? San Diego, California-based Netradyne has developed technology designed to do just that, integrating cameras and deep learning in Driver-i, a "vision-based" system mounted in or on commercial vehicles that uses machine learning to predict and prevent accidents in the commercial transportation industry. According to Pandya, the age of machines controlling humans is far off.
Mapmaking used to be the domain of a select group of cartographers who would gather, review, and plot out data onto sheets of paper. The chances that you actually knew a cartographer were probably pretty slim -- but not anymore. Today and in the future, virtually everyone is, or will be, a contributor to the increasingly detailed maps that represent the world we live in. As our vehicles become increasingly automated, they need ever more detailed maps -- not just the maps we get from Google or Apple on our smartphones. The self-driving car will need much more information.
Our self-driving future will initially be extremely expensive. That's why GM and Ford are working on autonomous systems for ride-hailing ahead of selling cars to individuals. Meanwhile, Korean automaker Hyundai is researching another approach: a system that uses less computing power and is therefore cheaper. Of course, this vehicle, like all autonomous cars, won't be available for a very long time, but what Hyundai showed off in Las Vegas looks promising. The two test Ioniqs (one hybrid and one pure electric) were fitted with cameras in the windshield, radar behind the automaker's logo, and LiDAR sensors in the front and sides of the bumper.
Carmakers and tech firms competing to develop automated vehicles seek a combination of sensors and cameras that provide maximum perception and visibility of surroundings at a cost that's manageable for mass production. Velodyne, a leading maker of laser-based LiDAR (Light Detection and Ranging) sensors, says it has designed a new solid-state version of its technology that provides 3D imaging for automated vehicle systems and will cost less than $50 per unit when manufactured at high volume. Velodyne will continue to sell spinning "puck"-style LiDAR units for automated vehicles developed for on-demand ride services, though the new non-spinning solid-state chip, developed with Los Angeles-based Efficient Power Corp., or EPC, is designed to be integrated into the exterior of vehicles for retail car buyers, Jellen said. Google and many automakers believe sensor fusion -- combining information from radar, LiDAR and cameras -- is the best option to ensure visibility under all driving conditions.
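One simple way to picture the sensor fusion the article mentions is inverse-variance weighting: each sensor's range estimate counts in proportion to how precise that sensor is assumed to be. This is a minimal sketch, not any vendor's method; the noise figures and the `fuse` helper are illustrative assumptions (real stacks fuse full tracks with Kalman-style filters, not single scalars).

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of (value, variance) readings.

    More precise sensors (smaller variance) get larger weights, so the
    fused estimate leans toward them.
    """
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * val for w, (val, _) in zip(weights, measurements))
    return fused / sum(weights)

# Hypothetical distance-to-lead-vehicle readings in metres, with an
# assumed measurement variance per sensor:
readings = [
    (25.2, 0.5),   # radar: good range accuracy
    (25.0, 0.1),   # LiDAR: best range accuracy, so it dominates the result
    (26.0, 2.0),   # camera: coarse depth estimated from vision
]
print(round(fuse(readings), 2))  # → 25.07, pulled close to the LiDAR reading
```

The design point this illustrates: no single sensor is trusted outright. Radar degrades gracefully in rain, LiDAR gives precise geometry, cameras add semantics, and weighting by confidence lets the system keep a usable estimate when any one modality is degraded.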
Tesla has provided a glimpse into its self-driving future. The company recently revealed that all of its cars "from here on out" will be made with the technology necessary to become self-driving. Now, Tesla founder Elon Musk has released a video showing exactly what the self-driving cars will "see" as they navigate the streets. All Tesla cars being produced, including the Model 3, are now being built with full autonomous capabilities. Model S and Model X vehicles with the new hardware are "already in production." Musk claimed the technology will soon enable "full autonomy all the way from LA to New York," saying this would be achieved "without the need for a single touch."