The UK government is investing millions in the development of micro-robots designed to work in underground pipe networks and dangerous sites such as decommissioned nuclear facilities. Airborne and underwater versions could also inspect and maintain difficult-to-reach locations such as offshore windfarms or oil and gas pressure vessels. Led by Prof Kirill Horoshenkov at the University of Sheffield and backed by a £7.2m government grant, the collaborative research programme will also involve scientists from Birmingham, Bristol and Leeds universities. It is hoped that the 1cm-long devices will use sensors and navigation systems to find and mend cracks in pipes, avoiding disruption from roadworks estimated to cost the economy £5bn a year. A further £19.4m will fund research into the use of robotics in hazardous environments, including drones for oil pipeline monitoring and artificial intelligence able to establish the need for repairs on satellites in orbit.
What jobs will AI probably not destroy? The jobs most susceptible to automation in the near term are those that are fundamentally routine or predictable in nature. If you have a boring job, where you come to work and do the same kinds of things again and again, you should probably worry. The tasks within jobs like this are likely to be encapsulated in the data that organizations collect. So it may only be a matter of time before a powerful machine learning algorithm comes along that can automate much of this work.
In early December, researchers at DeepMind, the artificial-intelligence company owned by Google's parent corporation, Alphabet Inc., filed a dispatch from the frontiers of chess. A year earlier, on Dec. 5, 2017, the team had stunned the chess world with its announcement of AlphaZero, a machine-learning algorithm that had mastered not only chess but shogi, or Japanese chess, and Go. The algorithm started with no knowledge of the games beyond their basic rules. It then played against itself millions of times and learned from its mistakes. In a matter of hours, the algorithm became the best player, human or computer, the world has ever seen.
Stanford University engineers have developed a method for locating every solar panel in the contiguous U.S. from a massive satellite image database via a deep learning computer model. The researchers used a pre-trained model called Inception as the basis for the DeepSolar neural network's clustering and classifying of pixels in images. DeepSolar scanned more than 1 billion image "tiles," each covering an area bigger than a neighborhood but smaller than a zip code; each tile had 102,400 pixels, and DeepSolar classified each pixel in each tile, determining whether or not it was likely part of a solar panel. The network completed this task in less than a month, ascertaining that regions with more sun exposure had greater solar panel adoption than areas with less average sunlight.
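As a rough illustration of the per-pixel, per-tile approach described above (this is not DeepSolar's actual pipeline, whose pixel scores come from an Inception-based CNN; every function name and threshold here is hypothetical), the sketch below classifies each pixel of a 320x320 tile, i.e. 102,400 pixels as in the summary, with a stand-in intensity score, then flags tiles that contain enough positive pixels:

```python
import numpy as np

TILE_SIDE = 320  # 320 x 320 = 102,400 pixels per tile, matching the summary

def classify_pixels(tile, threshold=0.5):
    """Stand-in per-pixel classifier: returns a boolean mask of pixels
    whose 'solar score' exceeds a threshold. In DeepSolar the score
    comes from a CNN; here raw pixel intensity is a placeholder."""
    return tile > threshold

def tile_has_panels(tile, min_pixels=50):
    """Flag a tile as containing a panel if enough pixels score positive."""
    return bool(classify_pixels(tile).sum() >= min_pixels)

# Synthetic tile: dim background noise plus one bright 20x20 'panel'.
rng = np.random.default_rng(0)
tile = rng.uniform(0.0, 0.4, size=(TILE_SIDE, TILE_SIDE))
tile[100:120, 100:120] = 0.9  # 400 panel-like pixels

print(tile_has_panels(tile))
```

Aggregating tile-level flags over a billion tiles is what turns a pixel classifier into a national map; the thresholds above are illustrative only.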
In September of this year, Amazon hosted a press event in the steamy Spheres at its Seattle headquarters, announcing a dizzying array of new hardware products designed to work with the voice assistant Alexa. The company also debuted new capabilities for Alexa that showcased how it has been trying to give its voice assistant what is essentially a better memory. At one point during the presentation, Amazon executive Dave Limp whispered a command to Alexa to play a lullaby. This year, the companies making voice-controlled products tried to turn them into sentient gadgets: Alexa can have the computer version of a "hunch" and predict human behavior, while Google Assistant can carry on a conversation without requiring you to repeatedly say the wake word.
The day after Christmas is always a good day to see which products and apps people were most excited about this holiday season. According to Wednesday's top free apps charts on Google Play for Android and the App Store for iPhone, plenty of people received Google Home, Alexa-enabled speakers and Fitbits this holiday season. Amazon's Alexa app took the top spot on both app stores' lists of the top free apps midday Wednesday, while Fitbit was in the No. 5 slot. Google Home took the No. 3 spot on the Android Play Store and came in seventh on iPhone. Since all three apps are needed to set up their respective devices, it's likely that many people received Amazon Echo or Google Home speakers, Chromecast streaming sticks and Fitbit trackers.
Although deep learning holds enormous promise for advancing new discoveries in genomics, it also should be implemented mindfully and with appropriate caution. Deep learning should be applied to biological datasets of sufficient size, usually on the order of thousands of samples. The 'black box' nature of deep neural networks is an intrinsic property and does not necessarily lend itself well to complete understanding or transparency. Subtle variations in the input data can have outsized effects and must be controlled for as well as possible. Importantly, deep learning methods should be compared with simpler machine learning models with fewer parameters, to ensure that the additional model complexity afforded by deep learning has not led to overfitting of the data.
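The last point, comparing a complex model against a simpler baseline to detect overfitting, can be shown with a minimal sketch (synthetic data, not a genomics dataset; polynomial degree stands in here for model capacity). A higher-capacity model always fits the training split at least as well, so only held-out validation error reveals whether the extra parameters captured signal or noise:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data: a noisy linear relationship, split into train / validation.
x = rng.uniform(-1, 1, size=200)
y = 2.0 * x + rng.normal(0, 0.3, size=200)
x_tr, y_tr = x[:100], y[:100]
x_va, y_va = x[100:], y[100:]

def fit_and_score(degree):
    """Fit a polynomial of the given degree on the training split and
    return (training MSE, held-out validation MSE)."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    tr = float(np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2))
    va = float(np.mean((np.polyval(coeffs, x_va) - y_va) ** 2))
    return tr, va

tr_simple, va_simple = fit_and_score(1)    # baseline: 2 parameters
tr_complex, va_complex = fit_and_score(12)  # high capacity: 13 parameters
print(f"degree 1:  train MSE {tr_simple:.3f}, validation MSE {va_simple:.3f}")
print(f"degree 12: train MSE {tr_complex:.3f}, validation MSE {va_complex:.3f}")
```

The high-degree fit reduces training error simply because its basis contains the linear one; if its validation error is no better than the baseline's, the added complexity has only modeled noise, which is exactly the check recommended above.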