Results


Is Your Tesla Eligible For HW 2.5 Update For Full Autopilot?

International Business Times

However, the company did not implement that update; instead, in less than a year, it has already started equipping all of its vehicles on the production line, including the Model 3, with the new HW 2.5 hardware, Electrek reported Wednesday. According to the Electrek report, the company opted for the upgrade because HW 2.0 was not capable of enabling Level 5 autonomy, that is, fully autonomous driving with no need for human intervention. Tesla's vehicles are based on Nvidia's Drive PX 2 platform for autonomous driving. The company is also getting its cars ready for the day it can actually issue an over-the-air software update and enable full autonomy on its vehicles.


Tesla quietly upgrades Autopilot hardware in new cars

Engadget

Electrek has learned that Tesla is quietly equipping new Model 3, S and X production units with upgraded Autopilot hardware (HW 2.5). In other words, every HW 2.0 or later car should still have the foundations for self-driving functionality. And while it's "highly unlikely" that these vehicles will need an upgrade once full autonomy is an option, Tesla will upgrade them to 2.5 for free. Tesla likely has more headroom for vehicle upgrades than this, but it can't do anything that would limit driverless tech to post-2.0 vehicles.


Cheap lidar sensors are going to keep self-driving cars in the slow lane

#artificialintelligence

The race to build mass-market autonomous cars is creating big demand for laser sensors that help vehicles map their surroundings. Most driverless cars make use of lidar sensors, which bounce laser beams off nearby objects to create 3-D maps of their surroundings. Each beam is separated by an angle of 0.4° (smaller angles between beams mean higher resolution), with a range of 120 meters. Austin Russell, the CEO of lidar startup Luminar, says his company actively chose not to use solid-state hardware in its sensors, because it believes that while mechanically steering a beam is more expensive, it currently provides more finely detailed images that are critical for safe driving.
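To get a rough sense of why that 0.4° beam separation matters, note that the lateral gap between adjacent beams grows linearly with distance, so resolution degrades exactly where reaction time matters most. A minimal sketch of the small-angle arithmetic (the helper name is ours, not from any lidar vendor's API):

```python
import math

def lidar_point_spacing(angular_sep_deg: float, range_m: float) -> float:
    """Approximate lateral distance between adjacent lidar beams at a given range.

    For small angles, the arc-length approximation spacing = range * angle(rad)
    is accurate to well under 1%.
    """
    return range_m * math.radians(angular_sep_deg)

# Beams separated by 0.4 degrees, object at the sensor's 120 m maximum range:
spacing = lidar_point_spacing(0.4, 120.0)
print(f"{spacing:.2f} m between adjacent beam hits")  # roughly 0.84 m
```

At maximum range, adjacent beams land almost a meter apart, so a pedestrian might be hit by only one or two beams per sweep, which is the resolution concern the article raises.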


Lyft Self-Driving Game

WIRED

Conventional wisdom on self-driving used to go like this: a smart tech company, like Google's Waymo, writes the self-driving software. Today, Lyft announced it's getting into the self-driving business, launching its own unit to build autonomous vehicle software and hardware. Until today, Lyft's strategy seemed to hinge on hopping between carmakers like General Motors and tech companies like Waymo, striking deals that would put autonomous vehicles on the Lyft platform. Now lots of hardware companies use Android as their operating system, and Google's own phones are still around.


Vayyar launches 3D sensors that give self-driving cars interior awareness

#artificialintelligence

A lot of the sensor hardware being developed by component suppliers in the autonomous driving industry focuses on getting a clear picture of what's happening outside the car, but Vayyar's new 3D sensors provide a detailed look inside the car, including information about passengers. These embedded sensors are also small and low-cost compared to other sensing solutions, and can provide real-time info about what's going on in a car, including monitoring passenger vital signs and even keeping track of whether a driver is nodding off behind the wheel. Future autonomous-car in-vehicle services could also make use of the tech, tailoring the display and delivery of info on in-car screens and modifying environment controls, for instance. Vayyar also notes that the sensors can be used to automatically detect and send information about survivors within a vehicle in case of an accident, potentially giving emergency responders an early leg up.


Intel Banks on Artificial Intelligence

EE Times

Last year, Intel Corp. acquired neural-network hardware maker Nervana and built Nervana's chip, integrating it with Intel's own on-processor deep-learning and artificial-intelligence (AI) capabilities. This month, Intel Capital invested in AI startups CognitiveScale, Aeye Inc., and Element AI. Intel is investing in AI startups, acquiring others, and blending the mix with its own AI expertise to ensure a leadership position in machine learning, deep learning, and brainlike neural networks based on its AI hardware and software. Perhaps the most enigmatic of the AI startups in which Intel Capital has invested is Element AI, which claims to be conjuring an "AI-First World" that "elevates collective wisdom."


Lyft partners with NuTonomy to develop and test driverless fleet in Boston

Los Angeles Times

As ride-hailing leader Uber struggles to develop robot car technology on its own, arch-foe Lyft is busy teaming up with others. On Tuesday, Lyft said it will work with driverless software company NuTonomy to develop fleets of driverless vehicles. Last month, the San Francisco company announced a partnership with Waymo, the driverless-car project of Google parent Alphabet. "We're seeing the formation of partnerships between automotive companies, tech companies and ride-hailing companies," said Karl Iagnemma, NuTonomy's chief executive.


Yandex's on-demand taxi service debuts its self-driving car project

#artificialintelligence

Yandex notes that it has navigation, geolocation, computer vision, and machine learning expertise from other ongoing products and services, including Yandex.Navigator and Yandex.Maps. "We have been using computer vision technologies in a number of our services for quite a while," the company says. As a result of its combined software development efforts, as well as recent work on applying that tech to the automotive space, Yandex's self-driving software for its prototype vehicle is developed completely in house, the company tells me. The vehicle in the video isn't yet navigating real city streets, but Yandex says that testing on public roads is coming within a year, if all goes as planned.


Robot buses will hit the streets of Dubai by 2019

Daily Mail

The automated transport system will feature 25 driverless group rapid transit (GRT) vehicles capable of carrying 24 passengers each, larger versions of the pods being used in Masdar City. They will connect stations on Bluewaters Island with the Nakheel Harbour and Tower Metro Station, approximately 1.5 miles (2.5 km) apart. Separately, Tesla announced an updated version of its Autopilot hardware, named HW2, in December last year.