Engineers get a feel for robotic fingers


A robotic gripper that can screw in lightbulbs or use a screwdriver without needing to 'see' them or be pre-programmed to recognize them has been developed at the University of California San Diego (UCSD).

Philip Hammond to say UK will have self-driving cars by 2021 in budget 'fit for the future'

The Guardian

Driverless cars will be on Britain's roads by 2021 as a result of sweeping regulatory reforms that will put the UK at the forefront of a post-Brexit technological revolution, chancellor Philip Hammond will say this week. In his budget on Wednesday, Hammond will allow driverless cars to be tested without any human operator inside or outside the car, and without the legal constraints and rules that apply in many other EU nations and much of the US. The move – welcomed by the UK motor industry – is part of an attempt by Hammond and the Treasury to project a more upbeat message about the prospects for the UK economy after Brexit, and to focus on opportunities as well as risks. Carmakers have warned that they may have to move at least some production abroad if there is no deal to keep Britain inside the EU single market and customs union, at least for a two-year transition period. But Mike Hawes, chief executive of the Society of Motor Manufacturers and Traders, said it was good news that the government was taking a lead by making the UK attractive to those seeking to develop, test and build an entirely new generation of cars.

Back-Flipping Robot Is A Giant Leap For Robot Kind


MIT's Atlas robot, nicknamed Helios, completes the driving task at the June 2015 DARPA Robotics Challenge Finals. Helios is a second-generation Atlas, developed for DARPA by Boston Dynamics. Thursday, Nov. 16, 2017 was a day filled with news.

Stanford Algorithm Can Diagnose Pneumonia Better Than Radiologists

IEEE Spectrum Robotics Channel

Stanford researchers have developed a machine-learning algorithm that can diagnose pneumonia from a chest x-ray better than a human radiologist can. And it learned how to do so in just about a month. The Machine Learning Group, led by Stanford adjunct professor Andrew Ng, was inspired by a data set released by the National Institutes of Health on 26 September. The data set contains 112,120 chest X-ray images labeled with 14 different possible diagnoses, along with some preliminary algorithms. The researchers asked four Stanford radiologists to annotate 420 of the images for possible indications of pneumonia.
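Because the 14 possible diagnoses in the NIH data set are not mutually exclusive (a single X-ray can show several conditions), this kind of model is usually framed as multi-label classification: one independent probability per diagnosis, trained with per-label binary cross-entropy. The article doesn't detail the Stanford group's loss, so this is only an illustrative sketch of that common formulation, with made-up numbers:

```python
import numpy as np

def multilabel_bce(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy over independent diagnosis labels.

    y_true: (n_samples, n_labels) array of 0/1 ground-truth labels
    y_pred: (n_samples, n_labels) array of predicted probabilities
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1 - y_true) * np.log(1 - y_pred)))

# One hypothetical image, 14 candidate diagnoses; only label 7
# (say, pneumonia) is actually present.
y_true = np.zeros((1, 14))
y_true[0, 7] = 1.0

# A model that is fairly confident: 0.9 on the true label, 0.1 elsewhere.
y_pred = np.full((1, 14), 0.1)
y_pred[0, 7] = 0.9

print(round(multilabel_bce(y_true, y_pred), 4))  # → 0.1054
```

Each label contributes its own cross-entropy term, so the model can flag pneumonia and, say, effusion on the same image without the labels competing against each other as they would under a softmax.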

3 Questions: Lisa Parks on drones, warfare, and the media

MIT News

Drones have become a common part of warfare, but their use remains a subject of public contention. Lisa Parks, a professor in MIT's program in Comparative Media Studies/Writing and director of its Global Media Technologies and Cultures Lab, has spent extensive time analyzing this public debate. Now, she has co-edited a new volume examining the subject, while contributing a piece to it herself. The book, "Life in the Age of Drone Warfare," has just been published by Duke University Press. MIT News talked with Parks this week about the impact and public perception of drones.

How to set up voice dictation on your computer and save your aching fingers

Popular Science

If you're using Microsoft's word processor on a Windows computer, you have several voice-recognition options. This section will address three of them, mostly focusing on the Windows Speech Recognition program built into this operating system. The integrated voice-recognition service will work on any Windows application, including Microsoft Word. To launch it, type "windows speech recognition" into the search box on the taskbar, then click the app when it appears. The first time you run this software, you'll need to teach the utility to recognize your voice.

It's About to Get Way, Way Easier to Put AI Everywhere


Google has a vision for a world full of cheap and tiny smart devices, and it hopes its software will power them all. A couple of years back, Google launched an open-source machine-learning software library called TensorFlow. It has since exploded in popularity, to the point where it's now used by the likes of Airbnb, eBay, Uber, Snapchat, and Dropbox to power their AI development. Its appeal is obvious: it allows relative beginners to build and train neural networks without needing a PhD in artificial intelligence. As a result, the library now forms a major component of Google's business plan.
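That beginner-friendliness comes largely from TensorFlow's high-level Keras API, where defining and training a small network takes only a few lines. A minimal sketch, assuming a recent TensorFlow installation; the toy task (classifying 2-D points by whether their coordinates sum past 1) is invented for illustration:

```python
import numpy as np
import tensorflow as tf

# Toy data: 256 random 2-D points; label is 1 when x0 + x1 > 1.
rng = np.random.default_rng(0)
X = rng.random((256, 2)).astype("float32")
y = (X.sum(axis=1) > 1.0).astype("float32")

# A small fully connected network with one hidden layer.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Train briefly and predict.
model.fit(X, y, epochs=20, verbose=0)
preds = model.predict(X, verbose=0)
print(preds.shape)  # one probability per input point: (256, 1)
```

No gradient computations or optimizer math are written by hand; the library handles backpropagation, which is the accessibility the article is describing.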

Artificial Intelligence Is Now Your Coworker


Last fall, Google Translate rolled out a new-and-improved artificial intelligence translation engine that it claimed was, at times, "nearly indistinguishable" from human translation. Jost Zetzsche could only roll his eyes. The German native had been working as a professional translator for 20 years, and he'd heard time and time again that his industry would be threatened by advances in automation. Every time, he'd found, the hype was overblown, and Google Translate's makeover was no exception. It certainly wasn't the key to translation, he thought.