Microsoft, Machine Learning, And "Data Wrangling": ML Leverages Business Intelligence For B2B

#artificialintelligence

"Data wrangling" was an interesting phrase to hear in the machine learning (ML) presentations at Microsoft Ignite. Interesting because data wrangling is from business intelligence (BI), not from artificial intelligence (AI). Microsoft understands ML incorporates concepts from both disciplines. Further discussions point to another key point: Microsoft understands that business-to-business (B2B) is just as fertile for ML as business-to-consumer (B2C). ML applications with the most press are voice, augmented reality and autonomous vehicles.


Azure Data Factory v2: Hands-on overview

ZDNet

The second major version of Azure Data Factory, Microsoft's cloud service for ETL (Extract, Transform and Load), data prep and data movement, was released to general availability (GA) about two months ago. Cloud GAs come so fast and furious these days that it's easy to be jaded. But data integration is too important to overlook, and I wanted to examine the product more closely. Roughly thirteen years after its initial release, SQL Server Integration Services (SSIS) is still Microsoft's on-premises state of the art in ETL. It's old, and it's got tranches of incremental improvements in it that sometimes feel like layers of paint in a rental apartment.
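
For readers unfamiliar with the pattern, the following is a minimal, hand-rolled sketch of what an ETL pipeline does conceptually. It is plain Python against SQLite, not Azure Data Factory or SSIS code, and the file, table, and column names are invented:

```python
import csv
import sqlite3

# Extract: read rows from a source file (hypothetical orders.csv with
# columns id, amount, fx_rate).
with open("orders.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: clean and reshape in flight (here, normalize to USD).
for row in rows:
    row["amount_usd"] = round(float(row["amount"]) * float(row["fx_rate"]), 2)

# Load: write the transformed rows into a destination store.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount_usd REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(r["id"], r["amount_usd"]) for r in rows],
)
conn.commit()
conn.close()
```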


Global Big Data Conference

#artificialintelligence

Qualified data providers include category-leading brands such as Reuters, who curate data from over 2.2 million unique news stories per year in multiple languages; Change Healthcare, who process and anonymize more than 14 billion healthcare transactions and $1 trillion in claims annually; Dun & Bradstreet, who maintain a database of more than 330 million global business records; and Foursquare, whose location data is derived from 220 million unique consumers and includes more than 60 million global commercial venues. For qualified data providers, AWS Data Exchange makes it easy to reach the millions of AWS customers migrating to the cloud by removing the need to build and maintain infrastructure for data storage, delivery, billing, and entitling. Enterprises, scientific researchers, and academic institutions have been using third-party data for decades to conduct research, power applications and analytics, train machine-learning models, and make data-driven decisions. But, as these customers subscribe to more third-party data, they often have to wait weeks to receive shipped physical media, manage sensitive credentials for multiple File Transfer Protocol (FTP) hosts and periodically check for updates, or code to several disparate application programming interfaces (APIs). These methods are inconsistent with the modern architectures customers are developing in the cloud.
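
To show the kind of access pattern this replaces, here is a minimal sketch that lists entitled data sets through boto3's `dataexchange` client; it assumes AWS credentials are already configured, and the region is only an example:

```python
import boto3

# Minimal sketch: enumerate the AWS Data Exchange data sets this
# account is entitled to, instead of polling FTP hosts for updates.
client = boto3.client("dataexchange", region_name="us-east-1")

response = client.list_data_sets(Origin="ENTITLED")
for data_set in response["DataSets"]:
    print(data_set["Id"], data_set["Name"])
```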


Internet of Things and Bayesian Networks

@machinelearnbot

As big data becomes more of a cliché with every passing day, is the Internet of Things the next marketing buzzword to take over our lives? So what exactly is the Internet of Things (IoT), and why are we going to hear more about it in the coming days? IoT today denotes advanced connectivity of devices, systems, and services that goes beyond machine-to-machine communication and covers a wide variety of domains and applications, particularly in manufacturing and in power, oil, and gas utilities. One IoT application is an automobile with built-in sensors that alert the driver when tyre pressure is low. Another is built-in sensors on power-plant equipment that transmit real-time data, enabling better transmission planning and load balancing.
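
To make the tyre-pressure example concrete, here is a minimal sketch of the sense-and-alert loop such a device runs; the sensor read is simulated and the threshold is illustrative, not an automotive specification:

```python
import random
import time

LOW_PRESSURE_PSI = 30.0  # illustrative threshold, not a real spec

def read_tyre_pressure() -> float:
    """Stand-in for a real sensor read; returns a simulated PSI value."""
    return random.uniform(26.0, 36.0)

# Poll the sensor and alert when pressure drops below the threshold;
# a plant sensor streaming readings upstream follows the same pattern.
for _ in range(5):
    psi = read_tyre_pressure()
    if psi < LOW_PRESSURE_PSI:
        print(f"ALERT: tyre pressure low ({psi:.1f} PSI)")
    else:
        print(f"OK: {psi:.1f} PSI")
    time.sleep(1)
```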