Collaborating Authors: minnick


A "Glass Box" Approach to Responsible Machine Learning - insideBIGDATA

#artificialintelligence

Machine learning doesn't always have to be an abstruse technology. The multi-parameter, hyperparameter-laden methodology of complex deep neural networks, for example, is only one form this branch of cognitive computing takes. There are other machine learning varieties (and even some involving deep neural networks) in which a model's results, how they were determined, and which factors influenced them, are much more transparent. It all depends on how well organizations understand their data provenance. Comprehending just about everything that happened to a model's training data, as well as to the production data the model encounters, is integral to explaining, refining, and improving its results.
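The provenance idea above can be sketched as a minimal lineage log that records each transformation applied to a training dataset. This is an illustrative, hypothetical design, not part of any product mentioned in these articles; every name here (`LineageLog`, `fingerprint`, the sample rows) is invented for the sketch:

```python
import hashlib
import json
from datetime import datetime, timezone

def fingerprint(rows):
    """Stable hash of a dataset snapshot, so each lineage step is verifiable."""
    payload = json.dumps(rows, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

class LineageLog:
    """Records what happened to the data, step by step (a 'glass box' trail)."""
    def __init__(self):
        self.steps = []

    def record(self, name, rows):
        # Append an auditable entry for this transformation, then pass the
        # data through unchanged so calls can be chained.
        self.steps.append({
            "step": name,
            "fingerprint": fingerprint(rows),
            "row_count": len(rows),
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return rows

log = LineageLog()
raw = [{"age": 34, "income": 51000}, {"age": None, "income": 62000}]
raw = log.record("ingest", raw)
clean = log.record("drop_nulls", [r for r in raw if r["age"] is not None])
```

With a trail like this, explaining a model's behavior starts from a concrete record of what its training data went through, rather than from guesswork.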


Databricks launches data sharing initiative, machine learning offering

#artificialintelligence

Databricks has launched a project to create an open-source data sharing protocol for securely sharing data across organisations in real time, independent of the platform on which the data resides. The Delta Sharing initiative, part of Databricks' open-source Delta Lake project, has already attracted support from a number of data providers, including NASDAQ, S&P and FactSet, and leading IT vendors including Amazon Web Services, Microsoft and Google Cloud, according to Databricks. Databricks is also expanding its technology portfolio with a new machine learning system and the addition of new data pipeline and data governance capabilities to its flagship Databricks Lakehouse Platform, which combines aspects of data warehouse and data lake systems. Delta Sharing is the latest open-source initiative from Databricks, one of the most closely watched big data startups. Founded by the developers of the Apache Spark analytics engine, Databricks markets the Databricks Lakehouse Platform, its flagship unified data analytics platform.
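As a sketch of how platform-independent access works in practice: under the open Delta Sharing protocol, a data recipient is given a small JSON profile naming the provider's REST endpoint and a credential, which open-source connector clients then use to list and load shared tables. The endpoint and token below are placeholders, not real values:

```python
import json

# A Delta Sharing recipient profile (format per the open Delta Sharing
# protocol spec); endpoint and bearer token are placeholders.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "example-token",
}

# The profile is distributed as a small JSON file, e.g. "config.share".
profile_json = json.dumps(profile, indent=2)
```

A connector (for example, the open-source delta-sharing Python client) reads a profile like this and addresses tables with a `<profile>#<share>.<schema>.<table>` coordinate, so the recipient never needs to run the provider's platform.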


Databricks Unveils New Machine Learning Solution

#artificialintelligence

Databricks today unveiled a new cloud-based machine learning offering that's designed to give engineers everything they need to build, train, deploy, and manage ML models. The new offering is designed to bridge the gap in existing machine learning products that arises from focusing too much on data engineering, ML model creation, or the deployment aspects of the machine learning cycle, Databricks says. "Many ML platforms fall short because they ignore a key challenge in machine learning: they assume that data are available at high quality and ready for training," Databricks says in its announcement. "That requires data teams to stitch together solutions that are good at data but not AI, with others that are good at AI but not data." To address this gap, Databricks lets users switch between user "experiences" that it exposes, including data science/engineering, SQL analytics, and machine learning experiences, to access tools and features relevant to their everyday workflow.


AWS AI tools focus on developers

#artificialintelligence

Over the last few years, AWS has invested heavily in making it easier for developers and engineers to create and deploy AI models, Minnick said, speaking with TechTarget at the AWS re:Invent 2019 user conference in Las Vegas in December 2019. AWS' efforts to simplify the machine learning lifecycle were on full display at re:Invent. During the opening keynote, led by AWS CEO Andy Jassy, AWS revealed new products and updates for Amazon SageMaker, AWS' full-service suite of machine learning development, deployment and governance products. Those products and updates included new and enhanced tools for creating and managing notebooks, automatically building machine learning models, debugging models and monitoring models. SageMaker Autopilot, a new AutoML product, in particular, presents an accessible way for users who are new to machine learning to create and deploy models, according to Minnick.
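To make the AutoML workflow concrete, here is a minimal sketch of the kind of request an Autopilot job takes through the AWS SDK (boto3) `create_auto_ml_job` call. The job name, S3 bucket, target column, and role ARN are all placeholders; the request is built as a plain dictionary here rather than sent, since actually submitting it requires AWS credentials:

```python
# Sketch of SageMaker Autopilot job parameters. In practice you would
# submit this with:
#   boto3.client("sagemaker").create_auto_ml_job(**request)
# All names below (bucket, job name, role ARN, target column) are
# placeholders for illustration.
request = {
    "AutoMLJobName": "demo-autopilot-job",
    "InputDataConfig": [{
        "DataSource": {"S3DataSource": {
            "S3DataType": "S3Prefix",
            "S3Uri": "s3://example-bucket/train/",
        }},
        # The column Autopilot should learn to predict.
        "TargetAttributeName": "label",
    }],
    "OutputDataConfig": {"S3OutputPath": "s3://example-bucket/output/"},
    "RoleArn": "arn:aws:iam::123456789012:role/ExampleSageMakerRole",
}
```

Given a request like this, Autopilot explores candidate pipelines (feature preprocessing plus algorithm and hyperparameter choices) and surfaces the trained candidates for review, which is what makes it approachable for users new to machine learning.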