Robocar Startup Gets Backing From Uber Rival Grab, Plans Singapore Move


After announcing plans this month to supply self-driving vehicles for Lyft's ride-hailing network, the autonomous tech developer has scored financial backing from Southeast Asian rideshare powerhouse Grab and plans to expand into Singapore. The Singapore office will study that market as a potential place to deploy vehicles equipped with the company's software and self-driving hardware kits in government and business fleets, Tandon said. Amid the rush by auto and tech firms to perfect robotic vehicles, Tandon and his co-founders, all former researchers at Stanford University's Artificial Intelligence Lab, founded the company to specialize in deep-learning-based driving software for business, government and shared vehicle fleets. Though small relative to well-funded programs at Waymo, General Motors' Cruise, Uber's Advanced Technologies Group and Ford's Argo AI, the Mountain View, California-based startup has made quick progress.

Deep learning weekly piece: testing autonomous driving (virtually)


Let me cut to the chase: below is a video of my fully autonomous car driving around in a virtual testing environment. To train that software, SDCs must drive for thousands of hours and millions of miles on the road to accumulate enough information to learn how to handle both usual road situations and unusual ones (such as a woman in an electric wheelchair chasing a duck with a broom in the middle of the road). To save on this incredibly expensive training (which requires thousands of hours of safety drivers, plus the safety risks of putting a training vehicle on public roads), SDC developers turn to virtual environments to train their cars. To train the deep learning algorithm, I drive a car with sensors around a track in a simulator a few times (think: any car racing video game) and record the images that the sensors (in this case, cameras) "see" inside the simulator.
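The recording step above produces (camera image, steering angle) pairs, and the learning step fits a model that maps pixels to steering. As a minimal sketch of that idea, here is a toy version with synthetic data and plain ridge regression in place of the convolutional network a real pipeline would use; the frame sizes, labels, and regularization strength are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frames recorded in the simulator: 200 tiny 16x16
# grayscale "camera" images, plus the steering angle logged with each
# (here synthesized as a linear function of the pixels).
frames = rng.random((200, 16, 16))
true_w = rng.normal(size=16 * 16)
steering = frames.reshape(200, -1) @ true_w * 0.01

X = frames.reshape(len(frames), -1)                 # flatten each image
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)   # normalize pixels

# Ridge regression: the simplest possible "learn steering from pixels"
# model; real systems train a convolutional network instead.
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ steering)

pred = X @ w
mse = float(np.mean((pred - steering) ** 2))
print(f"training MSE: {mse:.6f}")
```

The structure is the same as in the full deep-learning setup: drive, record, fit a pixels-to-steering model, then let the model drive.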

Driverless cars: Tim Cook says Apple AI is applicable to more than just cars


The firms have established a startup support programme at Volkswagen's Data Lab to provide technical and financial support for international startups developing machine learning and deep learning applications for the automotive industry. Volvo Cars, Autoliv and Zenuity will use Nvidia's AI car computing platform as the foundation for their own advanced software development. Nvidia has partnered with automotive supplier ZF and camera perception software supplier Hella to pursue New Car Assessment Program (NCAP) safety certification for the mass deployment of self-driving vehicles. The firms will use Nvidia's Drive AI platform to develop software for scalable advanced driver assistance systems that connect their imaging and radar sensor technologies to autonomous driving functionality.

Artificial Intelligence vs. Machine Learning: What's the Difference


When a machine makes decisions in tough situations the way an experienced human being would, we call it artificial intelligence. Machine learning can be considered a part of artificial intelligence because it works toward the same goal of intelligent behavior. Finally, in the 21st century, after successful applications of machine learning, artificial intelligence came back into the boom. Because machine learning produces results by analyzing large amounts of data, those results tend to be accurate and useful, and the time required is very small.

Artificial Intelligence Tech Will Arrive in Three Waves


Modria, which specializes in the creation of smart justice systems, took the job and devised an automated system that relies on the knowledge of lawyers and divorce experts. First-wave AI systems are usually based on clear and logical rules. Well, it turns out that even 'primitive' software like Modria's justice system and Google Maps are fine examples of AI. One year later, when DARPA opened Grand Challenge 2005, five groups successfully made it to the end of the track.

Intel Banks on Artificial Intelligence EE Times


Last year, Intel Corp. acquired neural-network hardware maker Nervana and built Nervana's chip, integrating it with Intel's own on-processor deep-learning and artificial-intelligence (AI) capabilities. This month, Intel Capital invested in AI startups CognitiveScale, Aeye Inc., and Element AI. Intel is investing in AI startups, acquiring others, and blending the mix with its own AI expertise to ensure a leadership position in machine learning, deep learning, and brainlike neural networks based on its AI hardware and software. Element AI's new world Perhaps the most enigmatic of the AI startups in which Intel Capital has invested is Element AI, which claims to be conjuring an "AI-First World" that "elevates collective wisdom."

News in AI and machine learning


I'm Nathan Benaich -- welcome to issue #18 of my AI newsletter! In it I synthesise a narrative that analyses and links important happenings, data, research and startup activity from the AI world. Grab your hot beverage of choice and enjoy the read! If you're looking to invest, research, build, or buy AI-driven companies, do hit reply and drop me a line. In a massive deal this quarter, Intel agreed to purchase Mobileye for $15.3bn.

Deep learning boosted AI. Now the next big thing in machine intelligence is coming


Inside a simple computer simulation, a group of self-driving cars are performing a crazy-looking maneuver on a four-lane virtual highway. Half are trying to move from the right-hand lanes just as the other half try to merge from the left. It seems like just the sort of tricky thing that might flummox a robot vehicle, but they manage it with precision. I'm watching the driving simulation at the biggest artificial-intelligence conference of the year, held in Barcelona this past December. What's most amazing is that the software governing the cars' behavior wasn't programmed in the conventional sense at all.

Vehicle Artificial Perception-Building Experimental Systems


This experimental system includes four main modules: the Sensor Acquisition Module, the Vision Module, the Occupant-System Communication Module and the Artificial Perception Operation Module. The Sensor Acquisition Module, for example, is responsible for communicating with and receiving data from the sensors, while the Main Module makes predictions and decisions to send to the engine's control module. Data from the Sensor Acquisition Module and the Vision Module is then transmitted through a communication network to the Artificial Perception Operation Module, a powerful computer running intelligent software capable of predicting the behavior of surrounding objects. Like lidar, radar sensors also detect objects based on reflection; however, radar uses electromagnetic waves to scan objects.
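The data flow described above can be sketched as a few cooperating classes. This is a hypothetical illustration only: the module names come from the article, but the message formats and the constant-velocity prediction logic are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class SensorAcquisitionModule:
    """Communicates with the sensors and collects raw readings."""

    def read(self):
        # Stand-in radar readings: (object_id, range_m, closing_speed_mps).
        return [("obj-1", 42.0, -3.5), ("obj-2", 80.0, 0.0)]


@dataclass
class VisionModule:
    """Labels what the cameras see."""

    def detect(self):
        return {"obj-1": "vehicle", "obj-2": "sign"}


@dataclass
class ArtificialPerceptionOperationModule:
    """Fuses sensor and vision data and predicts object behavior."""

    def predict(self, radar, labels):
        predictions = {}
        for obj_id, range_m, speed in radar:
            # Naive constant-velocity prediction, one second ahead.
            predictions[obj_id] = {
                "label": labels.get(obj_id, "unknown"),
                "range_in_1s": range_m + speed,
            }
        return predictions


sensors = SensorAcquisitionModule()
vision = VisionModule()
perception = ArtificialPerceptionOperationModule()

result = perception.predict(sensors.read(), vision.detect())
print(result["obj-1"])  # the fused prediction for the nearest object
```

The point of the sketch is the separation of concerns: acquisition and vision produce data, a network carries it, and a single perception module does the predicting.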

How is predictive data shaping the auto industry


How is predictive data changing the automotive industry and what changes can we expect to see in the future? Connected and autonomous cars are going to benefit most from the inclusion of predictive data because their design centers on data collection and processing. As more and more connected cars hit the roads, data management is going to become an essential tool. Predictive data has already shown potential for preventative maintenance, but this same application could be used to predict software problems and security flaws as well.
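The preventative-maintenance use mentioned above can be made concrete with a small sketch: fit a trend to telemetry and extrapolate to the failure threshold. The brake-pad readings and wear limit below are invented for illustration, not real fleet data.

```python
import numpy as np

# Hypothetical connected-car telemetry: brake-pad thickness (mm)
# measured at several odometer readings (km).
km_driven = np.array([0, 5_000, 10_000, 15_000, 20_000], dtype=float)
pad_mm = np.array([12.0, 11.1, 10.3, 9.4, 8.6])
wear_limit_mm = 3.0  # assumed minimum safe thickness

# Fit a linear wear trend: thickness ~= slope * km + intercept.
slope, intercept = np.polyfit(km_driven, pad_mm, 1)

# Solve for the distance at which the trend crosses the wear limit.
km_at_limit = (wear_limit_mm - intercept) / slope
print(f"schedule service before ~{km_at_limit:,.0f} km")
```

Real predictive-maintenance systems use far richer models over many signals, but the principle is the same: learn a trend from collected data, then act before the predicted failure point.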