The U.S. Army's Autonomous Remote Engagement System is mounted on the Picatinny Lightweight Remote Weapon System and coupled with an M240B machine gun. It's part of a program that reduces the time to identify targets using automatic target detection and user-specified target selection. Killer robots have been a staple of TV and movies for decades, from Westworld to The Terminator series.
More than a month after a self-driving Uber struck and killed a pedestrian crossing the street in Arizona, it's still not clear what sort of failure might explain the crash--or how to prevent it from happening again. While the National Transportation Safety Board investigates, Uber's engineers are sitting on their hands, their cars parked. The crash and its inconclusive aftermath reflect poorly on a newborn industry predicated on the idea that letting computers take the wheel can save lives, ease congestion, and make travel more pleasant. It's an industry dashing toward adulthood--Google sister company Waymo plans to launch a robo-taxi service this year, General Motors is aiming for 2019--and now, suddenly, on the verge of being rejected by a public that hasn't even experienced it yet. In other words, AV makers are clearing the technological hurdles and tripping over the psychological ones.
When Amazon first introduced developer tools that let people build stuff for Alexa, the company made a conscious decision to call these functions "skills" rather than apps. It was a subtle way of making Alexa seem capable, while also suggesting to developers that building these skills would be a low lift. With just a "few lines of code," Amazon promised, "you can build entirely new experiences designed around voice." Amazon says most Echo users in the US have tried these third-party skills at least once, but getting them to work can be tricky. Alexa's voice skills often require super specific queries, and until Amazon started paying attention to the discovery process, taking the time to find new skills felt like a non-essential burden. Now, Amazon has decided to make Alexa's skills all about you: your dad jokes, your homework, your birthday.
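Amazon's "few lines of code" claim refers to the fact that a custom skill is essentially a handler that receives a JSON request from the Alexa service and returns a JSON response containing the speech to play back. A minimal sketch of that exchange, assuming the standard Alexa request/response envelope (the "dad joke" skill and its contents are invented here for illustration):

```python
# Minimal sketch of an Alexa custom-skill handler: the Alexa service
# POSTs a JSON request, and the skill returns JSON telling the device
# what to say. The joke text is a made-up placeholder.
DAD_JOKE = "I only know 25 letters of the alphabet. I don't know y."

def handle_request(event: dict) -> dict:
    """Route an incoming Alexa request to a spoken response."""
    request_type = event.get("request", {}).get("type")
    if request_type == "LaunchRequest":
        # User opened the skill without asking for anything specific.
        text = "Welcome. Ask me for a dad joke."
        end_session = False
    elif request_type == "IntentRequest":
        # User invoked one of the skill's intents.
        text = DAD_JOKE
        end_session = True
    else:
        text = "Goodbye."
        end_session = True
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }
```

In practice developers would use Amazon's Alexa Skills Kit SDK rather than hand-building the response dictionary, but the underlying exchange is this simple, which is what made the "low lift" pitch plausible.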
The MIT Statistics and Data Science Center (SDSC), a part of the Institute for Data, Systems, and Society (IDSS), announced two new academic programs today: the MicroMasters program in Statistics and Data Science, and the Interdisciplinary Doctoral Program in Statistics, both beginning in the fall. The MicroMasters program, currently under development by MIT faculty, will be offered online through edX. "Digital technologies are enabling us to bring MIT's data science curriculum to learners around the world regardless of their location or socioeconomic status," says Vice President for Open Learning Sanjay Sarma. The curriculum includes foundational knowledge of data science methods and tools, a deep dive into probability and statistics, and opportunities to learn, implement, and experiment with data analysis techniques and machine learning algorithms. "The demand for data scientists is growing rapidly," says Dean for Digital Learning Krishna Rajagopal.
Apple has created a new robot – not for building products, but for ripping iPhones apart. The robot, named Daisy, can take nine different iPhone models apart and extract the important parts of them, in ways traditional recyclers cannot. They can then be used all over again, helping to cut wastage out of the process of making phones. The new announcement is part of Apple's broad plans for Earth Day, the event held on 22 April each year to mark green efforts. Apple also said that it would encourage people to recycle more of their phones, so that they can be broken up by Daisy: for every iPhone handed in until 30 April through its GiveBack recycling scheme, it will make a donation to Conservation International.
Neither example cited by Gualtieri is science fiction-worthy, but techniques like natural language processing (NLP) and deep learning have the potential to help companies save time and improve operations on a massive scale. That's not conjecture: A DataScience.com client once leveraged NLP to parse thousands of customer support inquiries and online reviews for information about the most critical issues facing its product, eliminating hundreds of hours spent on manual searches. The improvements the company made based on that information resulted in a 500-point increase in net promoter score.
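The client's actual pipeline isn't published, but the general idea of using NLP to surface the most critical product issues from a pile of support inquiries can be sketched crudely: normalize the text, extract terms, and tally how often known issue keywords appear across tickets. This is an illustrative stand-in, not the DataScience.com client's method, and the keyword list and tickets are invented:

```python
# Illustrative sketch only: a crude keyword tally over support tickets,
# standing in for the far richer NLP pipeline described above.
from collections import Counter
import re

# Hypothetical vocabulary of known issue terms.
ISSUE_TERMS = {"crash", "battery", "login", "slow", "sync"}

def top_issues(tickets, n=3):
    """Count how many tickets mention each known issue term."""
    counts = Counter()
    for ticket in tickets:
        # Count each term once per ticket, regardless of repetition.
        words = set(re.findall(r"[a-z]+", ticket.lower()))
        counts.update(words & ISSUE_TERMS)
    return counts.most_common(n)

tickets = [
    "App crash on startup, battery drains fast",
    "Crash when I open settings",
    "Login fails after update",
]
print(top_issues(tickets))  # → [('crash', 2), ('battery', 1), ('login', 1)]
```

A production system would use techniques like topic modeling or embedding-based clustering rather than a fixed keyword list, but the payoff is the same: thousands of tickets reduced to a ranked list of issues without manual reading.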
When businesses identify a problem that can be solved through machine learning, they brief the data scientists and analysts to create a predictive analytics solution. In many cases, the turnaround time for delivering a solution is pretty long. Even for experienced data scientists, developing machine learning models that can accurately predict the results is always challenging and time-consuming. The complex workflow involved in building machine learning models has multiple stages. Some of the significant steps include data acquisition, data exploration, feature engineering, model selection, experimentation and prediction.
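The stages above can be sketched end to end as a toy pipeline. This is a deliberately minimal illustration using synthetic data and a trivial 1-nearest-neighbour model; every number and threshold is made up:

```python
# Toy walk-through of the workflow stages: acquisition, exploration,
# feature engineering, model selection/experimentation, and prediction.
import random

random.seed(0)

# 1. Data acquisition: synthetic (feature, label) pairs.
raw = [random.uniform(0, 10) for _ in range(40)]
data = [(x, 1 if x > 5 else 0) for x in raw]

# 2. Data exploration: check the class balance.
positives = sum(label for _, label in data)
print(f"{positives}/{len(data)} positive examples")

# 3. Feature engineering: min-max scale features to [0, 1].
lo, hi = min(raw), max(raw)
data = [((x - lo) / (hi - lo), y) for x, y in data]

# 4. Model selection / experimentation: hold out 25% of the data and
#    compare 1-nearest-neighbour against a majority-class baseline.
split = int(len(data) * 0.75)
train_set, test_set = data[:split], data[split:]

def knn_predict(x):
    return min(train_set, key=lambda point: abs(point[0] - x))[1]

majority = round(sum(y for _, y in train_set) / len(train_set))
knn_acc = sum(knn_predict(x) == y for x, y in test_set) / len(test_set)
base_acc = sum(majority == y for _, y in test_set) / len(test_set)
print("1-NN accuracy:", knn_acc, "baseline accuracy:", base_acc)

# 5. Prediction: use whichever model won the experiment.
model = knn_predict if knn_acc >= base_acc else (lambda x: majority)
```

Real projects replace each stage with far heavier machinery (databases, visualization, learned features, cross-validation), which is exactly why the turnaround time the passage describes tends to be long.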
It takes a lot of practice to fly a drone with confidence. Whether it's a multirotor or a fixed-wing drone, there are a lot of complicated things going on all at once, and most of the control systems are not even a little bit intuitive. The first-person viewpoint afforded by drone-mounted cameras and VR headsets helps, but you're still stuck with trying to use a couple of movable sticks to manage a flying robot, which takes both experience and concentration. EPFL has developed a much better system for drone control, taking away the sticks and replacing them with intuitive and comfortable movements of your entire body. It's an upper-body soft exoskeleton called FlyJacket, and with it on, you can pilot a fixed-wing drone by embodying the drone--put your arms out like wings, and pitching or rolling your body will cause the drone to pitch or roll, all while you experience it directly in immersive virtual reality.
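The core idea behind this style of control can be sketched as a simple mapping from torso angles to normalized drone commands. The angle limit and the linear mapping below are assumptions for illustration, not EPFL's published FlyJacket parameters:

```python
# Sketch of body-posture drone control: torso pitch/roll angles (e.g.
# from an IMU on the jacket) become commands in [-1.0, 1.0]. The 30-degree
# limit and linear scaling are invented here, not FlyJacket's actual values.
def body_to_command(angle_deg: float, max_angle: float = 30.0) -> float:
    """Map a body angle to a normalized command, clamped to [-1, 1]."""
    clamped = max(-max_angle, min(max_angle, angle_deg))
    return clamped / max_angle

def control_from_posture(pitch_deg: float, roll_deg: float) -> dict:
    """Turn a body posture into pitch/roll commands for the drone."""
    return {
        "pitch": body_to_command(pitch_deg),  # lean forward/back
        "roll": body_to_command(roll_deg),    # tilt arms like wings
    }

print(control_from_posture(15.0, -45.0))  # → {'pitch': 0.5, 'roll': -1.0}
```

Clamping matters here: a pilot stretching or stumbling should saturate the command rather than send the drone an extreme input, which is part of what makes whole-body control feel safer and more intuitive than raw sticks.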