Let's start with understanding "Predictive Analytics". This term originated as an evolution from "Descriptive Analytics", or just plain "Analytics". Descriptive analytics refers to the process of distilling large amounts of data into summary information that is more easily consumed by humans. Example techniques used in descriptive analytics include counts and averages to answer a question such as "What were my average sales by region last quarter?" By its nature, descriptive analytics is a backward-looking view of "what happened."
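To make the "average sales by region" question concrete, here is a minimal descriptive-analytics sketch in plain Python. The sales records and region names are hypothetical, purely for illustration; a real pipeline would pull from a database or a dataframe library.

```python
from collections import defaultdict

# Hypothetical last-quarter sales records (illustrative data only).
sales = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 100.0},
    {"region": "South", "amount": 60.0},
]

def average_sales_by_region(records):
    """Distill raw rows into per-region averages (descriptive analytics)."""
    totals = defaultdict(lambda: [0.0, 0])  # region -> [running sum, count]
    for row in records:
        acc = totals[row["region"]]
        acc[0] += row["amount"]
        acc[1] += 1
    return {region: s / n for region, (s, n) in totals.items()}

print(average_sales_by_region(sales))  # {'North': 110.0, 'South': 70.0}
```

The point of the example is the direction of the computation: many raw rows flow in, one small human-readable summary comes out, and nothing about the future is inferred.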
"Michael, we are bigger than US Steel." Over the holiday season I said this to my friend Jeremy Geelan while comparing the Mobile industry to the IoT. The term Internet of Things was coined by the British technologist Kevin Ashton in 1999 to describe a system where the Internet is connected to the physical world via ubiquitous sensors. After languishing in the depths of academia (at least here in Europe…), IoT had its Netscape moment early in 2014 when Google acquired Nest. Mobile is huge and has dominated the tech landscape for the last decade, so 50 billion connected devices by 2020 is a massive number by any measure, and no one doubts that number any more.
The Art of Service's predictive model results enable businesses to discover and apply the most profitable technologies and applications, attract the most profitable customers, and therefore maximize value from their investments. The Predictive Analytics algorithm evaluates and scores technologies and applications. The platform monitors over ten thousand technologies and applications for months, looking for interest swings in a topic, concept, technology or application, not just a count of mentions. It then makes forecasts about the velocity of that interest over time, with peaks representing a topic breaking into the mainstream. Data sources include trend data, employment data, employee skills data, and signals like advertising spend, advertisers, search counts, instruction and courseware availability, patents issued, and books published.
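The "velocity of interest" idea above can be sketched very simply: take a time series of mention counts and look at its rate of change rather than its level. This is a minimal illustration under my own assumptions, not The Art of Service's actual algorithm; the weekly counts are invented.

```python
def interest_velocity(mentions):
    """Week-over-week change (first difference) in interest counts."""
    return [b - a for a, b in zip(mentions, mentions[1:])]

def peak_velocity_week(mentions):
    """1-based index of the week where interest accelerated fastest."""
    v = interest_velocity(mentions)
    return max(range(len(v)), key=lambda i: v[i]) + 1

# Hypothetical weekly mention counts for one technology topic.
weekly_mentions = [12, 14, 13, 20, 45, 90, 95, 97]
print(interest_velocity(weekly_mentions))   # [2, -1, 7, 25, 45, 5, 2]
print(peak_velocity_week(weekly_mentions))  # 5
```

Notice that raw counts keep rising through week 8, but the velocity peaks at week 5 and then collapses: the swing into the mainstream shows up in the derivative, not in the totals, which is why counting mentions alone misses it.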
The above question seems to haunt most people who were doing statistical predictive modeling before the term machine learning came into play. Nowadays it seems anyone who has run a classification problem with one of the advanced algorithms, like a Neural Network or a Support Vector Machine, calls themselves a machine learning expert. But is this machine learning? We have had these statistical/mathematical models since the early 1960s. The only reason they were not all popular back then is that they were too demanding for the computing power available at the time.
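To underline the point about the age of these models: the perceptron, an ancestor of today's neural networks, dates from the late 1950s, and its entire training loop fits in a few lines of plain Python. The toy AND-gate data below is my own illustrative choice.

```python
def train_perceptron(samples, labels, epochs=20, lr=1):
    """Rosenblatt's perceptron (1958): learn weights and a bias
    for a linearly separable binary classification problem."""
    n = len(samples[0])
    w, b = [0] * n, 0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy linearly separable problem: logical AND.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # [0, 0, 0, 1]
```

Nothing here requires modern hardware; what changed is that the multi-layer and kernelized descendants of this loop, applied to millions of samples, only became tractable with today's computing power.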