2018-08


Driverless taxi debuts in Tokyo in 'world first' trial ahead of Olympics

The Guardian

A self-driving taxi has successfully taken paying passengers through the busy streets of Tokyo, raising the prospect that the service will be ready in time to ferry athletes and tourists between sports venues and the city centre during the 2020 Summer Olympics. ZMP, a developer of autonomous driving technology, and the taxi company Hinomaru Kotsu claim that the road tests, which began this week, are the first in the world to involve driverless taxis and fare-paying passengers. The trial took place as Toyota and the transport giant Uber said they were intensifying efforts to develop a self-driving vehicle, pitting themselves against rival initiatives in Japan, the US and Europe. Toyota will invest $500m in the venture, which will develop vehicles based on the carmaker's Sienna minivans, with a view to starting tests in 2021, the firms said this week. Uber and Waymo, owned by Google spinoff Alphabet, have started testing their vehicles on public roads in the US, but Uber suffered a serious setback in March when one of its self-driving vehicles struck and killed a pedestrian during a trial in Arizona.


Toyota to invest $500m in Uber in driverless car deal

BBC News

Japanese carmaker Toyota is to invest $500m (£387m) in Uber and expand a partnership to jointly develop self-driving cars. The firm said this would involve the "mass-production" of autonomous vehicles that would be deployed on Uber's ride-sharing network. It is being viewed as a way for both firms to catch up with rivals in the competitive driverless car market. The deal also values Uber at some $72bn, despite its mounting losses. That is up 15% since its last investment round in May but matches a previous valuation in February.


Google Needs To Make Machine Learning Their Growth Fuel

#artificialintelligence

Google's fuel for future growth is AI and machine learning, and how effectively it strengthens its subsidiary businesses with these technologies will determine how profitably it grows. These and many other fascinating insights are from CB Insights' report, Google Strategy Teardown (PDF, 49 pp., opt-in). The report explores how Alphabet, Google's parent company, is relying on Artificial Intelligence (AI) and machine learning to capture new streams of revenue in enterprise cloud computing and services. The report also looks at how Alphabet can combine search, AI, and machine learning to revolutionize logistics, healthcare, and transportation. It's a thorough teardown of Google's potential acquisitions, strategic investments, and partnerships needed to maintain search dominance while driving revenue from new markets.


Robotic Implants

Communications of the ACM

MIT CSAIL's origami robot is packaged in an ingestible ice pill. In 2013, University of Sheffield roboticist Dana Damian was doing postdoctoral research at Harvard Medical School affiliate Boston Children's Hospital when she learned of a procedure called the Foker technique. The surgery, performed on children with a rare congenital defect of the esophagus, calls for doctors to attach sutures to part of an infant's esophagus, then tie them off on the baby's back. Over time, the sutures lengthen the esophagus by pulling on it, stimulating tissue growth. Although the technique can be effective, the risk of infection and complication is high, and the baby must remain under sedation for weeks.


Alexa Is Shielding Children From the Truth

Slate

Being a child must be terribly confusing--hence all of the "why" and "how" questions. With no guidebook, no references, no context--no understanding of history and how society came to be, or of reproduction and how they came to be--the world is mystifying for its newest members, and growing up is a gradual process of demystification. It's no wonder kids have so many questions. It's also no wonder that they are enthralled by Alexa, the disembodied know-it-all on hand to answer their stream of queries. Smart speakers are the perfect players for their game of Twenty Million Questions.


Fad Or The Future? Robot-Made Burgers Wow The Crowds In San Francisco

NPR Technology

Alex Vardakostas had a dream about creating a robot burger-maker in college. He's now poised to open a restaurant with his Creator Burger robot in September. An audience gathers around the transparent 14-foot-long "culinary instrument" in a restaurant called Creator in San Francisco's SoMa neighborhood.


Sony's Aibo Robot Dog Is Coming to America

IEEE Spectrum Robotics

This past November, Sony announced that it was reviving its robot dog Aibo. The iconic robotic pet, introduced in 1999, won a lot of fans all over the world but had been discontinued for over a decade. The company said the new Aibo, with more advanced mechatronics and AI, would be available for purchase early this year, but only in Japan. Now Sony is announcing that, after selling 20,000 new Aibos to Japanese consumers, it is also making the robot canine available in the United States. At an event at its U.S. headquarters in New York City, the company said it will be offering a "limited first litter edition" bundle starting next month, with expected delivery before the holidays.


This drone lets you zoom in while you fly

USATODAY - Tech Top Stories

Chinese drone leader DJI unveiled two new drone models at a press conference in New York City, both bringing higher-quality cameras and the ability to zoom in while flying. They are updates to the Mavic line: the Mavic 2 Pro has a camera made by legendary imaging company Hasselblad, and the Mavic 2 Zoom lets users get closer to the action with its zoom lens. This matters because drones typically use wide-angle lenses to show the expanse of the area you're flying over; with the zoom, users will be able to zero in on other things on the ground. Flight time on a single battery, which was around 20 minutes with the original Mavic Pro, increases to 31 minutes on the new 2 Pro.


A Monitor's Ultrasonic Sounds Can Reveal What's on the Screen

WIRED

You probably assume that someone can only see what's on your computer screen by looking at it. But a team of researchers has found that they can glean a surprising amount of information about what a monitor displays by listening to and analyzing the unintended, ultrasonic sounds it emits. The technique, presented at the Crypto 2018 conference in Santa Barbara on Tuesday, could allow an attacker to initiate all sorts of stealthy surveillance by analyzing livestreams or recordings taken near a screen--say from a VoIP call or video chat. From there, the attacker could extract information about what content was on the monitor based on acoustic leakage. And though distance degrades the signal, especially when using low-quality microphones, the researchers could still extract monitor emanations from recordings taken as far as 30 feet away in some cases.
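The article describes the attack only at a high level. As an illustrative sketch, and not the researchers' actual pipeline, the snippet below shows one way to surface content-dependent energy in the near-ultrasonic band of a recording made close to a screen; the file name, the 15-22 kHz band, and the spectrogram parameters are assumptions, and a real attack would feed such features to a trained classifier rather than simply summing them.

```python
# Illustrative sketch only -- not the Crypto 2018 authors' pipeline.
# Assumes a WAV recording captured near a monitor, sampled at 44.1 kHz
# or higher so the near-ultrasonic band is actually present.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("near_monitor_recording.wav")  # hypothetical file
if audio.ndim > 1:
    audio = audio.mean(axis=1)  # mix stereo down to mono

# Short-time spectrum of the recording.
freqs, times, power = spectrogram(audio, fs=rate, nperseg=4096, noverlap=2048)

# Keep only the 15-22 kHz bins, an assumed band where such emanations can appear.
band = (freqs >= 15_000) & (freqs <= 22_000)
leakage = power[band]  # shape: [band_bins, time_frames]

# A real attack would classify these features; here we just summarize them.
print("ultrasonic band energy per frame:", leakage.sum(axis=0)[:10])
```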


Artificial General Intelligence Is Here, and Impala Is Its Name - ExtremeTech

#artificialintelligence

However, even these reinforcement learning algorithms couldn't transfer what they'd learned about one task to acquiring a new task. To realize this achievement, DeepMind supercharged a reinforcement learning algorithm called A3C. In so-called actor-critic reinforcement learning, of which A3C is one variety, acting and learning are decoupled so that one neural network, the critic, evaluates the other, the actor. Together, they drive the learning process. This was already the state of the art, but DeepMind added a new off-policy correction algorithm called V-trace to the mix, which made the learning more efficient and, crucially, better able to achieve positive transfer between tasks.
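The excerpt names V-trace but doesn't show it. Below is a minimal NumPy sketch of the V-trace value targets as defined in the IMPALA paper (Espeholt et al., 2018); the function name, single-trajectory layout, and default clipping thresholds are illustrative assumptions, not DeepMind's released code.

```python
# Minimal sketch of V-trace targets for one trajectory (Espeholt et al., 2018).
# Names, array layout, and defaults are illustrative assumptions; episode
# boundaries (dones) are omitted for brevity.
import numpy as np

def vtrace_targets(rewards, values, bootstrap_value,
                   behaviour_log_probs, target_log_probs,
                   gamma=0.99, rho_bar=1.0, c_bar=1.0):
    """Return V-trace targets v_s for a trajectory of length T.

    rewards, values, behaviour_log_probs, target_log_probs: arrays of shape [T].
    bootstrap_value: value estimate V(x_T) for the state after the trajectory.
    """
    # Truncated importance weights: rho_t scales each TD error,
    # c_t controls how far corrections propagate backwards in time.
    log_rhos = target_log_probs - behaviour_log_probs
    rhos = np.minimum(rho_bar, np.exp(log_rhos))
    cs = np.minimum(c_bar, np.exp(log_rhos))

    next_values = np.append(values[1:], bootstrap_value)
    deltas = rhos * (rewards + gamma * next_values - values)

    # Backward recursion: v_s - V(x_s) = delta_s + gamma * c_s * (v_{s+1} - V(x_{s+1})).
    acc = 0.0
    corrections = np.zeros(len(rewards))
    for t in reversed(range(len(rewards))):
        acc = deltas[t] + gamma * cs[t] * acc
        corrections[t] = acc
    return values + corrections
```

The two clipping thresholds trade bias for variance: rho_bar bounds the value function the learner converges to, while c_bar limits how far each correction propagates back along the trajectory, which is what keeps learning stable when the actors' behaviour policy lags behind the learner's target policy.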