When data flows faster than it can be processed

@machinelearnbot

No matter what processing or algorithms are used, astronomical amounts of data pile up so fast that a growing proportion must be deleted every day before anyone can even look at it, let alone analyze it with even the most rudimentary tools. An example is the astronomical data used to detect new planets, asteroids, and other objects: it arrives faster, and in larger volumes, than it can be processed in the cloud using massive parallelization. Good sampling may be one solution: carefully decide which data to analyze and which to ignore, before even looking at it. Another is to develop better compression algorithms, so that one day, when we have more computing power, we can analyze all the data previously collected but never examined. Perhaps in 2030 we could watch, hour by hour, the 20-year evolution of a faraway supernova that began in 2010 but went undetected because its data sat parked on a sleeping server.
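One classic way to "carefully select which data to analyze" from a stream that arrives faster than it can be stored is reservoir sampling, which keeps a uniform random sample of fixed size without ever holding the full stream. The snippet below is a minimal sketch of that idea, not anything the article describes; the function name and parameters are illustrative.

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of k items from a stream of unknown length."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace an existing item with probability k / (i + 1),
            # which keeps every item seen so far equally likely to survive.
            j = random.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir

# Sample 100 observations from a stream of a million, using O(k) memory.
sample = reservoir_sample(range(1_000_000), 100)
```

The appeal for a telescope-scale pipeline is that memory use depends only on the sample size, never on how much data has flowed past.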


Sr. Embedded Algorithm Software Engineer

#artificialintelligence

Nest is looking for a software engineer to join our ever-growing embedded software team. At Nest you'll have the opportunity to shape the consumer electronics business, join a world-class engineering team, and help revolutionize the next unloved device in our homes. You'll be responsible for architecting, designing, and implementing the heart that makes our future products tick.


The Algorithm Breaker – Narendra Nath Joshi – Medium

@machinelearnbot

There is no doubt that an appreciable amount of emphasis is placed on algorithm design, which is often preceded or succeeded by algorithm analysis. Algorithm analysis is an estimate of the resources, such as time and storage, necessary to execute an algorithm. Most algorithms are designed to work with inputs of arbitrary size. Usually, the efficiency (i.e., the running time) of an algorithm is stated as a function of input size, in terms of its time complexity and space complexity. This approach, tried and tested against time and scale, manages fairly well to provide comprehensive analysis and scope for improvement, if any.
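To make the time/space framing concrete, here is a small sketch (my own illustration, not from the article) contrasting two routines whose costs grow differently with input size n: a linear scan that is O(n) time and O(1) extra space, and a bubble sort that is O(n²) time but also O(1) extra space once the copy is made.

```python
def linear_search(items, target):
    """O(n) time, O(1) extra space: one pass over the input."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def bubble_sort(items):
    """O(n^2) time in the worst case, O(n) for the working copy."""
    a = list(items)
    n = len(a)
    for i in range(n):
        # After pass i, the last i elements are in their final positions.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```

Stating both costs as functions of n is exactly what lets an analysis say, before running anything, that doubling the input roughly doubles the search time but quadruples the sort time.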


Automatic Programming

AITopics Original Links

Our approach to automatic programming is based on reuse of generic algorithms through views. A generic algorithm performs some task, such as sorting a linked list of records, based on abstract descriptions of the data on which the program operates. A view describes how actual application data corresponds to the abstract data used in the generic algorithm. Given a view, a generic algorithm can be specialized by a compilation process to produce a version that performs the task directly on the application data.
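The generic-algorithm-plus-view idea can be sketched in Python, where a view is a function mapping application records onto the abstract data the generic algorithm expects. This is only a rough analogue of the approach described; the real system specializes algorithms by compilation, whereas here the names, records, and the `view` parameter are all hypothetical illustrations.

```python
def generic_sort(items, view):
    """Generic algorithm: sorts any records, given a view onto an abstract key."""
    return sorted(items, key=view)

def specialize(generic_algorithm, view):
    """Crude stand-in for compile-time specialization: bind the view once,
    yielding a version that works directly on the application data."""
    return lambda items: generic_algorithm(items, view)

# Application data: employee records stored as (name, salary) tuples.
employees = [("ann", 50000), ("bob", 42000)]

# View: the abstract sort key corresponds to field 1 of each tuple.
salary_view = lambda record: record[1]

sort_by_salary = specialize(generic_sort, salary_view)
result = sort_by_salary(employees)
```

The same `generic_sort` could be reused on entirely different application data simply by supplying a different view, which is the reuse the passage describes.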


MIT's new algorithm predicts how much pain a person is in by looking at photos

#artificialintelligence

Pain-predicting AI could help doctors discover if any of their patients are faking it. Isaac Asimov's First Law of Robotics states that a robot may not injure a human being or, through inaction, allow a human being to come to harm. But that does not mean a computer can't tell us whether a person is in pain -- and then neatly rank that pain level into some objective measure, like a computer science textbook written by the author of Fifty Shades of Grey. The work in question was carried out by researchers at the Massachusetts Institute of Technology (MIT). They developed an artificial intelligence that is able to predict how much pain a person is in by looking at an image.