When data flows faster than it can be processed


No matter what processing or algorithms are used, astronomical amounts of data keep piling up so fast that a growing proportion must be deleted each day before anyone can even look at it, let alone analyze it with even the most rudimentary tools. An example is the astronomical data used to detect new planets, asteroids, and so on: it arrives faster, and in larger volumes, than it can be processed in the cloud using massive parallelization. Good sampling may be one solution: carefully select which data to analyze, and which to ignore, before even looking at it. Another is to develop better compression algorithms so that one day, when we have more computing power, we can analyze all the data previously collected but never examined. Perhaps in 2030 we could watch the hourly evolution of a faraway supernova that took place in 2010 and unfolded over 20 years, but went undetected because the data was parked on a sleeping server.
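The "carefully select which data to analyze" idea can be made concrete with reservoir sampling, a classic way to keep a fixed-size uniform random sample of a stream that is too large to store. The sketch below (function names are my own, not from the article) uses Algorithm R:

```python
import random


def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of k items from a stream of unknown,
    possibly unbounded, length using only O(k) memory (Algorithm R)."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Replace an existing item with probability k / (i + 1),
            # which keeps every item seen so far equally likely to survive.
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir
```

Each item in the stream ends up in the final sample with probability exactly k/n, so decisions about what to keep can be made on the fly, before any analysis happens.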

Sr. Embedded Algorithm Software Engineer


Nest is looking for a software engineer to join our ever-growing embedded software team. At Nest you'll have the opportunity to shape the consumer electronics business, join a world-class engineering team, and help revolutionize the next unloved device in our homes. You'll be responsible for architecting, designing, and implementing the heart that makes our future products tick.

The Algorithm Breaker – Narendra Nath Joshi – Medium


There is no doubt that an appreciable amount of emphasis is placed on algorithm design, which is often preceded or succeeded by algorithm analysis. Algorithm analysis is an estimate of the resources, such as time and storage, that an algorithm requires. Most algorithms are designed to work with inputs of arbitrary size, so the efficiency (i.e., running time) of an algorithm is usually stated as a function of input size, in terms of time complexity and space complexity. This approach, tried and tested against time and scale, manages fairly well to provide comprehensive analysis and scope for improvement, if any.
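As an illustration of stating efficiency as a function of input size, a minimal sketch (not from the article) that instruments binary search, whose worst-case time complexity is O(log n), by counting element probes:

```python
def binary_search(seq, target):
    """Search a sorted sequence, returning (index, probes).

    probes counts how many elements were examined; for a sorted
    sequence of length n it never exceeds floor(log2(n)) + 1,
    which is what the O(log n) bound promises.
    """
    lo, hi, probes = 0, len(seq) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if seq[mid] == target:
            return mid, probes
        elif seq[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes
```

For a 1024-element input the probe count stays at or below 11, matching the floor(log2(1024)) + 1 bound regardless of which target is requested.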

Do you know what is bigger than Big Data?


I think I do, and it is the 'appification' of analytics. What I mean by this is the reduction of a complex analytic activity, such as market segmentation, down to a single button on your computer interface. That's what it looks like, but the impacts are more profound. It makes it possible for analytics to be successfully done by people who may not understand how it works, but do understand the 'why' and the 'when' they need to do it. For example, a marketer in a company can access more sophisticated views of their campaigns without needing a specialist analyst.

Automatic Programming

AITopics Original Links

Our approach to automatic programming is based on reuse of generic algorithms through views. A generic algorithm performs some task, such as sorting a linked list of records, based on abstract descriptions of the data on which the program operates. A view describes how actual application data corresponds to the abstract data used by the generic algorithm. Given a view, a generic algorithm can be specialized by a compilation process to produce a version that operates directly on the application data.
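A rough sketch of the generic-algorithm-plus-view idea (the names and the reduction of a "view" to a single key function are my simplifying assumptions, not the paper's formalism): the sort below knows nothing about the application records except the abstract key the view exposes.

```python
from dataclasses import dataclass
from typing import Callable, List, TypeVar

T = TypeVar("T")


def generic_sort(items: List[T], key_view: Callable[[T], float]) -> List[T]:
    """Generic algorithm: insertion sort written purely against the
    abstract data description (an orderable key) supplied by a view."""
    result = list(items)
    for i in range(1, len(result)):
        item = result[i]
        j = i - 1
        while j >= 0 and key_view(result[j]) > key_view(item):
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = item
    return result


@dataclass
class SensorReading:
    """Application data whose concrete layout the generic algorithm never sees."""
    sensor_id: str
    value: float


def reading_view(r: SensorReading) -> float:
    """View: maps an application record onto the abstract key."""
    return r.value
```

In the paper's setting the specialization is done by a compiler that inlines the view into the generic algorithm; here the view is simply passed at run time, which conveys the separation of concerns without the compilation step.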