Machine Learning at HPC User Forum: Drilling into Specific Use Cases

@machinelearnbot

Dr. Weng-Keen Wong of the NSF echoed much the same distinction between specific- and general-case algorithms during his talk "Research in Deep Learning: A Perspective From NSF," a distinction also raised by Nvidia's Dale Southard during the disruptive-technology panel. Tim Barr's (Cray) "Perspectives on HPC-Enabled AI" showed how Cray's HPC technologies can be leveraged for machine and deep learning in vision, speech, and language. Fresh off HPE's integration of SGI technology into its stack, HPE's talk not only highlighted the newer software platforms that learning systems leverage, but demonstrated that HPE's portfolio of systems and experience in both HPC and hyperscale environments is impressive indeed. Stand-alone image recognition is really cool, but as expounded upon above, the true benefit of deep learning comes from an integrated workflow in which data sources are ingested by a general-purpose deep learning platform, with outcomes that benefit business, industry, and academia.


One autonomous car will use 4,000 GB of data per day

@machinelearnbot

And it's going to be significantly more than the amount of data that the average person generates today. "Each car driving on the road will generate about as much data as 3,000 people," Krzanich says. And just a million autonomous cars will generate 3 billion people's worth of data, he says. The car will have to learn about such things as cones in the road and other hazards, which Krzanich calls technical data.
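A quick back-of-the-envelope check of Krzanich's figures. The per-person data rate is an assumption (roughly 1.5 GB per day, a figure commonly paired with this quote but not stated in the excerpt):

```python
# Sanity-check the quoted figures: 4,000 GB per car per day, and an
# ASSUMED ~1.5 GB generated per average person per day.
gb_per_car_per_day = 4_000
gb_per_person_per_day = 1.5          # assumption, not stated in the excerpt

people_equivalent_per_car = gb_per_car_per_day / gb_per_person_per_day
cars = 1_000_000
people_equivalent_fleet = cars * people_equivalent_per_car

print(f"one car is roughly {people_equivalent_per_car:,.0f} people's worth of data")
print(f"one million cars: {people_equivalent_fleet / 1e9:.1f} billion people's worth")
```

Under that assumption the numbers come out to roughly 2,700 people per car and about 2.7 billion people's worth for a million cars, consistent with the "about 3,000" and "3 billion" figures in the quote.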


Google's Waymo Using Intel Chips For Its Self-Driving Minivans

@machinelearnbot

Waymo, the Google self-driving project that spun out to become a business under Alphabet, said Monday it's using Intel chips as part of a compute platform that allows its self-driving Chrysler Pacifica hybrid minivans to process huge amounts of data so they can make decisions in real time while navigating city streets. "As the most advanced vehicles on the road today, our self-driving cars require the highest-performance computers to make safe driving decisions in real time," Waymo CEO John Krafcik said in an emailed statement. However, it wasn't until Waymo started the Chrysler Pacifica minivan project that it began working more closely with the chipmaker. "By working closely with Waymo, Intel can offer Waymo's fleet of vehicles the advanced processing power required for level 4 and 5 autonomy."


Regulating AI – The Road Ahead

@machinelearnbot

Summary: With tongue only slightly in cheek about the road ahead, we report on the House of Representatives' just-passed "Federal Automated Vehicles Policy," as well as similar policy just emerging in Germany. Just today (9/6/17) the US House of Representatives released its 116-page "Federal Automated Vehicles Policy". Equally interesting, just two weeks ago the German federal government published its guidelines for Highly Automated Vehicles (HAV being the new name of choice for these vehicles). On the six-point automation scale, in which 0 is no automation and 5 means the automated system can perform all driving tasks under all conditions, the new policy applies to level 3 or higher (though its broad standards also apply to the partial automation of levels 1 and 2).


GM unit says it has 'mass producible' autonomous cars

@machinelearnbot

The General Motors unit developing autonomous vehicles said Monday it has begun rolling out the first "mass producible" self-driving cars that could be available once regulations allow. GM is now in position to begin delivering and deploying autonomous cars on a large scale when regulations are in place to permit their operation. And US Transportation Secretary Elaine Chao was expected to make an announcement on autonomous technology this week. "We will achieve success by integrating the best software and hardware to deploy truly driverless vehicles at scale."


The Future of Self-Driven Buses - Amyx Internet of Things (IoT)

@machinelearnbot

Proterra is taking its buses to a new level by developing self-driven mass-transit vehicles. The purpose of operating the self-driven electric buses in various traffic situations is to figure out the changes needed in a city's infrastructure to prepare for autonomous public transportation. Proterra's self-driving bus study is said to be a yearlong phase that tests the functionality of the sensors under different road conditions, such as weather changes and traffic. The technology points to a radically different future, and it seems inevitable that significant numbers of both manned and unmanned public transport vehicles will hit the roads.


Leading AI country will be 'ruler of the world,' says Putin

@machinelearnbot

AI experts worldwide are also concerned, and the U.N. has been urged to address lethal autonomous weapons. Armed with a heavy machine gun, one such "mobile robotic complex … can detect and destroy targets, without human involvement." "In autonomous mode, the vehicle can automatically identify, detect, track and defend [against] enemy targets based on the pre-programmed path set by the operator," the company said. Designed initially for the DMZ, Super aEgis II, a robot-sentry machine gun built by Dodaam Systems, can identify, track, and automatically destroy a human target 3 kilometers away, assuming that capability is turned on.


Deep learning weekly piece: testing autonomous driving (virtually)

@machinelearnbot

Let me cut to the chase: below is a video of my fully autonomous car driving around in a virtual testing environment. To train that software, SDCs must drive for thousands of hours and millions of miles on the road to accumulate enough information to learn how to handle usual road situations as well as unusual ones (such as when a woman in an electric wheelchair chases a duck with a broom in the middle of the road). To save on the incredibly expensive training (which requires thousands of hours of safety drivers, plus the safety risks of having a training vehicle on public roads), SDC developers turn to virtual environments to train their cars. To train the deep learning algorithm, I drive a car with sensors around a track in a simulator a few times (think: any car racing video game) and record the images that the sensors (in this case, cameras) "see" inside the simulator.
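The record-then-train loop described here is essentially behavioral cloning: log (camera frame, steering angle) pairs while a human drives, then fit a model that maps frames to steering. A minimal sketch follows, with synthetic frames and a plain linear least-squares fit standing in for real simulator images and a deep network; the function names and data shapes are illustrative, not from the article:

```python
import numpy as np

# Toy stand-in for simulator recording: each "image" is a small flattened
# grayscale array, and the steering angle is a noisy linear function of the
# pixels (a placeholder for road curvature seen by the camera).
rng = np.random.default_rng(0)

def record_lap(n_frames=200, h=8, w=8):
    """Simulate logging camera frames and steering angles during a lap."""
    frames = rng.normal(size=(n_frames, h * w))
    true_weights = rng.normal(size=h * w)
    steering = frames @ true_weights + rng.normal(scale=0.01, size=n_frames)
    return frames, steering

# "Training" reduced to its skeleton: fit a regressor from frames to
# steering angles. In a real SDC pipeline, a convolutional network
# would replace this linear model.
frames, steering = record_lap()
weights, *_ = np.linalg.lstsq(frames, steering, rcond=None)

# At drive time, the learned model predicts a steering angle per frame.
new_frame = rng.normal(size=frames.shape[1])
predicted_angle = float(new_frame @ weights)
train_mse = float(np.mean((frames @ weights - steering) ** 2))
print(f"training MSE: {train_mse:.5f}")
```

The point of the sketch is the shape of the pipeline, not the model: record supervised pairs in the simulator, fit a predictor, then let the predictor steer.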


Transforming from Autonomous to Smart: Reinforcement Learning Basics

@machinelearnbot

With the rapid increases in computing power, it's easy to be seduced into thinking that raw computing power can solve problems like smart edge devices (e.g., cars, trains, airplanes, wind turbines, jet engines, medical devices). In chess, the complexity of each piece's moves increases only slightly (rooks can move forward and sideways a variable number of spaces, bishops can move diagonally a variable number of spaces, etc.). Now think about the number and breadth of "moves," or variables, that need to be considered when driving a car in a nondeterministic (random) environment: weather (precipitation, snow, ice, black ice, wind), time of day (daytime, twilight, nighttime, sunrise, sunset), road conditions (potholes, bumpiness, slickness), and traffic conditions (number of vehicles, types of vehicles, different speeds, different destinations). It's nearly impossible for an autonomous car manufacturer to operate enough vehicles in enough different situations to generate the amount of data that can be gathered virtually inside Grand Theft Auto.
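The trial-and-error learning the title alludes to can be illustrated with the simplest reinforcement-learning algorithm, tabular Q-learning, on a toy one-dimensional "road" (states 0 through 4, reward only at the goal). This is a didactic sketch of the basic update rule, not anything from the article, and a real driving problem would have an astronomically larger, nondeterministic state space:

```python
import numpy as np

# Tabular Q-learning on a 5-state chain: the agent must learn, purely by
# trial and error, that moving right eventually earns a reward at state 4.
rng = np.random.default_rng(1)

n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.5, 0.9, 0.2

def step(state, action):
    """Environment dynamics: move left/right; reward 1 on reaching the goal."""
    nxt = max(0, state - 1) if action == 0 else min(n_states - 1, state + 1)
    reward = 1.0 if nxt == n_states - 1 else 0.0
    return nxt, reward, nxt == n_states - 1

for _ in range(500):                 # episodes of trial and error
    s, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit the current estimate, sometimes explore
        a = int(rng.integers(n_actions)) if rng.random() < epsilon else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update: move Q(s, a) toward reward + discounted best future value
        Q[s, a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s, a])
        s = s2

greedy = [int(np.argmax(Q[s])) for s in range(n_states)]
print("greedy action per state:", greedy)
```

After training, the greedy policy in every non-terminal state is "move right", learned entirely from experienced rewards rather than from a programmed rule, which is the core idea the article contrasts with brute-force computation.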


Artificial intelligence helps to keep tired drivers awake

@machinelearnbot

Panasonic has come up with an artificial intelligence platform designed to keep the driver comfortably awake at all times. The Japanese-developed device takes the form of an in-car system that monitors the driver, detects drowsiness as it comes on, and then reacts. Assessing the offering, PC Magazine notes there are five levels of drowsiness the device can register: not drowsy, slightly drowsy, drowsy, very drowsy, and seriously drowsy.