

Computer Age Statistical Inference: Algorithms, Evidence and Data Science

@machinelearnbot

The twenty-first century has seen a breathtaking expansion of statistical methodology, both in scope and in influence. "Big data," "data science," and "machine learning" have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. This book takes us on a journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. The book integrates methodology and algorithms with statistical inference, and ends with speculation on the future direction of statistics and data science.


E-Spirit's New Intelligent Content Engine Drives AI-Based Personalization

@machinelearnbot

E-Spirit has added an artificial intelligence-powered personalization content engine into its FirstSpirit Digital Experience Hub. The Dortmund, Germany-based web content management provider is calling it the FirstSpirit Intelligent Content Engine. Michael Gerard, chief marketing officer for e-Spirit, told CMSWire in an interview this week the Intelligent Content Engine helps complete the company's digital experience offering with personalization. "The FirstSpirit Intelligent Content Engine will help marketers to finally tap into the power of data to create and deliver highly individualized content to their users in real-time, synchronized across any channel to outperform their competitors," Udo Straesser, chief revenue officer for e-Spirit, said in a statement.


Interesting Application of the Greedy Algorithm for Egyptian Fractions

@machinelearnbot

Here we discuss a new system for representing numbers, for instance constants such as Pi, e, or log 2, using rational fractions. Each iteration doubles the precision (the number of correct decimals computed), making it converge much faster than existing systems, such as continued fractions, for representing any positive real number. Note that E could be any set of strictly positive numbers (for instance, prime numbers, or numbers that are not even integers), as long as the sum of the inverses of the elements of E is infinite. With other number representations (continued fractions or base-10 representation), some classes of numbers result in periodicity.
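For readers who want a concrete reference point, the sketch below shows the classical greedy (Fibonacci-Sylvester) expansion of a rational number into unit fractions; it is not the doubling-precision scheme described above, just the textbook greedy algorithm the title alludes to, with an illustrative function name and term cap.

```python
from fractions import Fraction
from math import ceil

def egyptian_greedy(x: Fraction, max_terms: int = 6):
    """Greedily expand x in (0, 1) as a sum of distinct unit fractions.

    At each step, subtract the largest unit fraction 1/n not exceeding
    the remainder, i.e. n = ceil(1/x), then repeat on what is left."""
    denominators = []
    while x > 0 and len(denominators) < max_terms:
        n = ceil(1 / x)
        denominators.append(n)
        x -= Fraction(1, n)
    return denominators

# Example: 5/121 = 1/25 + 1/757 + 1/763309 + ...
print(egyptian_greedy(Fraction(5, 121)))
```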


Domino habits for data science

@machinelearnbot

Inculcating discipline [Understanding business justification] – Explore and document 'why' your data is there. What are the technical systems / business processes that generated this data? So go on, bring out the machine learning and deep learning packages, and enjoy.


corenlp

@machinelearnbot

This library connects to Stanford CoreNLP either via HTTP or by spawning processes; by default it connects via StanfordCoreNLPServer on port 9000. CoreNLP also expects, by default, the StanfordCoreNLP package to be placed (unzipped) inside the path ${YOUR_NPM_PROJECT_ROOT}/corenlp/. NOTE 1: The examples below assume that StanfordCoreNLP is running on port 9000.
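Since the Node client wraps the StanfordCoreNLPServer HTTP protocol, a minimal sketch of that protocol itself may help; the example below (in Python, using the requests library) assumes a server already running on localhost:9000 and uses an illustrative annotator list.

```python
import json
import requests

# StanfordCoreNLPServer accepts the text as the POST body and the pipeline
# configuration as a JSON-encoded "properties" URL parameter.
props = {"annotators": "tokenize,ssplit,pos", "outputFormat": "json"}
resp = requests.post(
    "http://localhost:9000/",
    params={"properties": json.dumps(props)},
    data="Stanford CoreNLP runs on port 9000.".encode("utf-8"),
)
resp.raise_for_status()

# Print (word, part-of-speech) pairs for each sentence in the response.
for sentence in resp.json()["sentences"]:
    print([(tok["word"], tok["pos"]) for tok in sentence["tokens"]])
```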


How Facebook Is Using Artificial Intelligence

@machinelearnbot

With a vision that "artificial intelligence can play a big role in helping bring the world closer together," Facebook has opened a new AI research lab in Montreal as part of Facebook AI Research (FAIR). In addition, Facebook announced $7 million in AI support for the Canadian Institute for Advanced Research (CIFAR), the Montreal Institute for Learning Algorithms (MILA), McGill University, and the Université de Montréal (over 5 years). The company allocated $6 million to the Université de Montréal and $1 million to McGill University for AI research (over 5 years). Last year, Facebook introduced DeepText, a deep learning-based text understanding engine.


Practical Deep Learning with PyTorch - Udemy

@machinelearnbot

Although many courses are either very mathematical or too practical in nature, this course strikes a careful balance between the two to provide a solid foundation in deep learning, which you can explore further if you are interested in research in the field of deep learning and/or applied deep learning. It is purposefully made for anyone without a strong background in mathematics. And for those with a strong background, it would accelerate your learning in understanding the different models in deep learning. This is not a course that places heavy emphasis on the mathematics behind deep learning.


A Solution to Missing Data: Imputation Using R

@machinelearnbot

If the missing values are neither MAR (Missing At Random) nor MCAR (Missing Completely At Random), then they fall into the third category of missing values, known as Not Missing At Random (NMAR). The package provides four different methods to impute values, with the default model being linear regression for continuous variables and logistic regression for categorical variables. In R, I will use the NHANES dataset (National Health and Nutrition Examination Survey data from the US National Center for Health Statistics). The NHANES data used here is a small dataset of 25 observations, each with 4 features: age, BMI, hypertension status, and cholesterol level.
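The article's own code is in R; as a rough analogue only, the sketch below uses scikit-learn's IterativeImputer in Python to illustrate the same idea of regression-based imputation, with made-up values standing in for the four NHANES columns (age, BMI, hypertension, cholesterol).

```python
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Illustrative stand-in for the NHANES example: four features with gaps.
df = pd.DataFrame({
    "age": [1, 2, 1, 3, 1],
    "bmi": [22.7, np.nan, 30.1, 25.4, np.nan],
    "hyp": [1, np.nan, 1, 2, 1],
    "chl": [187, 187, np.nan, np.nan, 229],
})

# Each incomplete column is modeled as a regression on the other columns,
# cycling until the imputed values stabilize.
imputer = IterativeImputer(max_iter=10, random_state=0)
completed = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
print(completed.round(1))
```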


Tensorflow Tutorial: Part 1 – Introduction

@machinelearnbot

This series consists of excerpts from a webinar tutorial series I conducted as part of the United Network of Professionals. Many applications today have TensorFlow embedded as part of their machine learning stack. Let's explore the TensorFlow environment and how its flexible architecture makes implementation so easy. This means you can execute code locally on your laptop with a CPU, or with a GPU if you have one.
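As a minimal illustration of running TensorFlow locally and checking for a GPU, here is a short sketch; it uses the current TensorFlow 2 eager API, whereas the webinar this excerpt comes from may have used the older graph-and-session style.

```python
import tensorflow as tf

# List any GPUs TensorFlow can see on this machine (empty list = CPU only).
print("GPUs visible:", tf.config.list_physical_devices("GPU"))

# Pin a small matrix multiplication to the CPU; swap in "/GPU:0" to run the
# same computation on a GPU when one is available.
with tf.device("/CPU:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[5.0], [6.0]])
    print(tf.matmul(a, b).numpy())
```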


Science, data science and causality - unresolved contradictions

@machinelearnbot

On the other hand, statistics, the science of all sciences, has lived in peace with the idea that "correlation is not causation" (but no more than that flat statement) for about a century, and generates an uncountable number of models in which causation never slept for a minute. But the gap between three grand approaches - classical statistical inference (based on the idea of significance, or non-randomness), statistical (machine) learning (based on the idea of error minimization on testing data), and causality theory per se - does not seem to narrow. Within the linear model, certain criteria for distinction were proposed in I. Mandel (2017), Troublesome Dependency Modeling: Causality, Inference, Statistical Learning, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2984045 (section 4.2.2). For binary variables, the problem of estimating causal coefficients is solved analytically in S. Lipovetsky and I. Mandel, Modeling Probability of Causal and Random Impacts. And these or similar questions should be answered if data science claims to be this new paradigm.