data and code


We will publish both the data and code if the paper is accepted; this was an oversight on our part in not making that clear.

Neural Information Processing Systems

We thank the reviewers for their thoughtful reviews; below we address their major concerns. This variability would be expected even from different recording sessions for the same subject. This allows researchers to add multiple covariates (e.g., different experimental …). Also related to Reviewer 1's comments, it is certainly possible to have different numbers … Another major point raised by the reviewers was the sensitivity of our results to our initialization procedure. It is not necessary, but it simplifies the inference derivation.




Data Analysis for Business, Economics, and Policy

#artificialintelligence

This textbook provides future data analysts with the tools, methods, and skills needed to answer data-focused, real-life questions, to choose and apply appropriate methods to answer those questions, and to visualize and interpret results to support better decisions in business, economics, and public policy. Data wrangling and exploration, regression analysis, prediction with machine learning, and causal analysis are comprehensively covered, along with when, why, and how the methods work, and how they relate to each other. As the most effective way to communicate data analysis, running case studies play a central role in this textbook. Each case starts with an industry-relevant question and answers it by using real-world data and applying the tools and methods covered in the textbook. Learning is then consolidated by over 360 practice questions and 120 data exercises.


Introducing Databricks Machine Learning: a Data-native, Collaborative, Full ML Lifecycle Solution

#artificialintelligence

Today, we announced the launch of Databricks Machine Learning, the first enterprise ML solution that is data-native, collaborative, and supports the full ML lifecycle. This launch introduces a new purpose-built product surface in Databricks specifically for machine learning (ML) that brings together existing capabilities, such as managed MLflow, and introduces new components, such as AutoML and the Feature Store. Databricks ML supports the full ML lifecycle by handling any data type at any scale, enabling users to train models with the ML framework of their choice, and managing model deployment, from large-scale batch scoring to low-latency online serving. Many ML platforms fall short because they ignore a key challenge in ML: they assume that high-quality data is already prepared and available for training. That gap forces data teams to stitch together solutions that are good at data but not AI with others that are good at AI but not data.


Turing's Enduring Importance

AITopics Original Links

When Alan Turing was born 100 years ago, on June 23, 1912, a computer was not a thing; it was a person. Computers, most of whom were women, were hired to perform repetitive calculations for hours on end. The practice dated back to the 1750s, when Alexis-Claude Clairaut recruited two fellow astronomers to help him plot the orbit of Halley's comet. Clairaut's approach was to slice time into segments and, using Newton's laws, calculate the changes to the comet's position as it passed Jupiter and Saturn. The team worked for five months, repeating the process again and again as they slowly plotted the course of the celestial bodies.