software


Google Helps NASA Find 2 New Exoplanets Using Machine Learning

@machinelearnbot

Alphabet's Google and NASA said on Thursday that advanced computer analysis identified two new planets around distant stars, including one that is part of the first star system with as many planets as Earth's solar system. The research by Google and the University of Texas at Austin, which used data from NASA, raised the prospect of new insights into the universe by feeding data into computer programs that can churn through information faster and in more depth than humanly possible, a technique known as machine learning. In this case, software learned the differences between planets and other objects by analysing thousands of data points, achieving 96 percent accuracy, NASA said at a news conference. The data came from the Kepler telescope, which NASA launched into space in 2009 as part of a planet-finding mission that is expected to end next year as the spacecraft runs out of fuel. The software's artificial "neural network" combed through data about 670 stars, which led to the discovery of planets Kepler-80g and Kepler-90i.
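
As a rough illustration of the kind of supervised classification described above (not the actual Google/NASA pipeline, which was a convolutional network trained on Kepler light curves), here is a minimal sketch in which a small neural network learns to separate planet-like dips from other signals; the feature layout, network size, and synthetic data are all assumptions made purely for demonstration.

```python
# Minimal sketch (not the Google/NASA model): a small neural-network
# classifier separating "planet" from "non-planet" signals, trained on
# synthetic stand-in features. Everything here is illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical features: 64 brightness measurements per candidate signal.
n_samples, n_features = 2000, 64
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)  # 1 = planet, 0 = false positive

# Give the "planet" rows a faint brightness dip so there is a pattern to learn.
X[y == 1, 28:36] -= 0.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real Kepler data the inputs would be processed light-curve segments rather than random noise, but the overall workflow -- label known examples, train a network, score held-out candidates -- is the same.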


Night of the Test(Automation)Busters

#artificialintelligence

Then we, the internal teams, took over and tried to add new features while not breaking the existing ones… …today the monolith is still in place, but most of its functionality has been replaced by microservices that communicate via asynchronous messaging and deliver their own frontends. In this session we will talk about the challenges we faced over the past three years, about the "best practices" that failed while scaling up from 0 to 40 teams, and the new challenges we are facing today. In BDD, the formalized examples use a natural-language DSL driven by the Given/When/Then keywords. At the same time, property-based testing (PBT) uses abstract (mathematical) formulas to declare expectations for the output values given some constraints on the input. The PBT tools try to disprove that the application fulfills these requirements by taking samples from the valid input value space, as in the sketch below.
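
As a concrete illustration of the property-based testing idea described above, here is a minimal sketch using the Hypothesis library for Python (the tool choice and the run-length-encoding example are assumptions, not something named in the session): the property declares an expectation over an entire input space, and the tool samples from that space trying to disprove it.

```python
# Minimal property-based-testing sketch using Hypothesis (tool choice is an
# assumption; the session does not name a specific PBT library).
from itertools import groupby
from hypothesis import given, strategies as st

def rle_encode(text: str) -> list[tuple[str, int]]:
    """Hypothetical function under test: run-length encode a string."""
    return [(char, len(list(run))) for char, run in groupby(text)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (character, count) pairs back into the original string."""
    return "".join(char * count for char, count in pairs)

@given(st.text())  # sample arbitrary strings from the valid input space
def test_decode_inverts_encode(text):
    # Property: decoding an encoding must reproduce the original input.
    assert rle_decode(rle_encode(text)) == text
```

A BDD scenario states one concrete Given/When/Then example in natural language; the property above instead covers every string Hypothesis can generate, and any counterexample found is reported as a failing test.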


Video Friday: Giant Robotic Chair, Underwater AI, and Robot Holiday Mischief

IEEE Spectrum Robotics Channel

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!). Let us know if you have suggestions for next week, and enjoy today's videos. This is the FX-2 Giant Human Riding Robot from KAIST HuboLab and Rainbow Robotics, and that's all I know about it. Yuichiro Katsumoto, a "gadget creator" based in Singapore, wrote in to share this video of the robotic kinetic typography that he's been working on for the last year.


The skeptic's guide to smart home gadgets

Washington Post

Before you buy any "smart" gadgets, make sure they're not dumb. This holiday season, a third of Americans plan to buy a smart home device, according to the Consumer Technology Association. But just hooking up the Internet to a door lock, kettle or dog bowl (yes, that's a thing) doesn't make it smart. The trick is figuring out which ones are worth the cost, trouble and inevitable security risks. I've been in those weeds.


Congratulations to Semio, Apellix and Mothership Aeronautics

Robohub

The Robot Launch global startup competition is over for 2017. We've seen startups from all over the world and all sorts of application areas, and we'd like to congratulate the overall winner Semio, and runners-up Apellix and Mothership Aeronautics. All three startups met the judges' criteria: to be an early-stage platform technology in robotics or AI with great impact, large market potential and a near-term customer pipeline. Semio, from Southern California, is a software platform for developing and deploying social robot skills. Ross Mead, founder and CEO of Semio, said he was greatly looking forward to spending more time with The Robotics Hub and was excited about the potential for Semio moving forward.


Musk Says Tesla Is Building Its Own Chip for Autopilot

#artificialintelligence

Rockets, electric cars, solar panels, batteries--whirlwind industrialist Elon Musk has set about reinventing one after another. Thursday, he added another ambitious project to the list: Future Tesla vehicles will run their self-driving AI software on a chip designed by the automaker itself. "We are developing customized AI hardware chips," Musk told a room of AI experts from companies such as Alphabet and Uber on the sidelines of the world's leading AI conference. Musk claimed that the chips' processing power would help Tesla's Autopilot automated-driving function save more lives, more quickly, by hastening the day it can drive at least 10 times more safely than a human. "We get there faster if we have dedicated AI hardware," he said.


Five Ways Machines Protect Your Business From Cyberthreats

#artificialintelligence

Today, it's nearly impossible to ignore the avalanche of cybersecurity noise competing for your attention. For many of us (even those of us in the industry), just getting a grasp on the ever-expanding terminology can be frustrating. You can't help but notice the deluge of terms such as "artificial intelligence," "machine learning" and "expert systems." Simply put, these phrases refer to technologies and approaches at the core of the new cyberworld battleground. When I'm engaging with our customers or audiences during speaking sessions at industry events, they frequently ask about these confusing terms.


Andrew Ng Says Enough Papers, Let's Build AI Now! – Synced – Medium

#artificialintelligence

While the scientific community continues looking for new breakthroughs in artificial intelligence, Andrew Ng believes the tech we need is already here: "Stop publishing, and start transforming people's lives with technology!" The three-day conference drew over 1,400 attendees from 17 different countries to the Santa Clara Convention Center. Ng's keynote speech was titled "AI is the new electricity". The number of papers submitted across arxiv-sanity categories such as machine learning, computer vision, and speech recognition has risen dramatically since 2012, says OpenAI's Senior Engineer Andrej Karpathy.


AI Defining Transportation's Future at GTC Japan – NVIDIA Blog

#artificialintelligence

Whether they drive themselves or improve the safety of their driver, tomorrow's vehicles will be defined by software. However, much of that software won't be written by hand by developers; it will be produced by processing data. To prepare for that future, the transportation industry is integrating AI car computers into cars, trucks and shuttles and training them using deep learning in the data center. A benefit of such a software-defined system is that it's capable of handling a wide range of automated driving -- from Level 2 to Level 5. Speaking in Tokyo at the last stop on NVIDIA's seven-city GPU Technology Conference world tour, NVIDIA founder and CEO Jensen Huang demonstrated how the NVIDIA DRIVE platform provides this scalable architecture for autonomous driving. "The future is surely a software defined car," said Huang.


Unlocking marine mysteries with artificial intelligence

MIT News

Each year the melting of the Charles River serves as a harbinger of warmer weather. Shortly thereafter come budding trees, longer days, and flip-flops. For students of class 2.680 (Unmanned Marine Vehicle Autonomy, Sensing and Communications), the newly thawed river means it's time to put months of hard work into practice. "In underwater marine robotics, there is a unique need for artificial intelligence -- it's crucial," says MIT Professor Henrik Schmidt, the course's co-instructor. "And that is what we focus on in this class."