Mathematical & Statistical Methods


From the NFL to MIT: The Double Life of John Urschel

MIT Technology Review

As a result, Urschel has been profiled in Sports Illustrated and the Washington Post and featured on HBO's Real Sports; written columns (featuring math puzzles) for The Players' Tribune; and even appeared in a nationally broadcast television commercial last season for Bose headphones, along with J.J. Watt, the superstar defensive end of the Houston Texans. But why is a highly paid pro football player grinding through problems in the math study room? Urschel's papers cover areas such as spectral graph theory and feature titles like "Spectral Bisection of Graphs and Connectedness"--which appeared in Linear Algebra and Its Applications in the spring of 2014, around the time the Ravens drafted him. Despite the extraordinary degree of commitment required to play football at his level, Urschel remains fully committed to pursuing math at the highest level, too, Zikatanov says.


Quantum Machine Learning Computer Hybrids at the Center of New Start-Ups

#artificialintelligence

Creative Destruction Lab, a technology program affiliated with the University of Toronto's Rotman School of Management in Toronto, Canada, hopes to nurture numerous quantum machine learning start-ups in only a few years. Currently, researchers are mostly focused on using the emergent technology of quantum computers to help machine learning programs solve problems more quickly, or on using typical machine learning techniques to add stability and potency to quantum computers. In the same way that quantum cryptography and quantum random number generation have been developed without large-scale quantum computers, he says, so too could the field of quantum machine learning. On the other hand, when it comes to generating random numbers, typical machine learning falls short, according to Wittek.


SciPy Cheat Sheet: Linear Algebra in Python

#artificialintelligence

One of those packages is SciPy, another of the core packages for scientific computing in Python, which provides mathematical algorithms and convenience functions built on the NumPy extension of Python. You'll see that this SciPy cheat sheet covers the basics of linear algebra that you need to get started: it provides a brief explanation of what the library has to offer and how you can use it to interact with NumPy, and goes on to summarize topics in linear algebra, such as matrix creation, matrix functions, basic routines that you can perform with matrices, and matrix decompositions from scipy.linalg. Sparse matrices are also included, with their own routines, functions, and decompositions from the scipy.sparse module. Don't miss our other Python cheat sheets for data science that cover NumPy, Scikit-Learn, Bokeh, Pandas and the Python basics.
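For a concrete feel of what the cheat sheet summarizes, here is a minimal sketch of typical scipy.linalg and scipy.sparse calls; the matrices are illustrative and not taken from the cheat sheet itself.

import numpy as np
from scipy import linalg, sparse
from scipy.sparse.linalg import spsolve

# Dense linear algebra with scipy.linalg
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = linalg.solve(A, b)            # solve A x = b
d = linalg.det(A)                 # determinant
A_inv = linalg.inv(A)             # matrix inverse
P, L, U = linalg.lu(A)            # LU decomposition
w, V = linalg.eig(A)              # eigenvalues and eigenvectors

# Sparse matrices with scipy.sparse
S = sparse.csr_matrix(np.eye(4))  # compressed sparse row storage
y = spsolve(S, np.ones(4))        # sparse direct solve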


Four Weird Mathematical Objects

@machinelearnbot

Here I discuss four interesting mathematical problems (mostly involving famous unsolved conjectures) that even high school kids can understand. The field itself has been a source of constant innovation -- especially in developing distributed architectures, HPC (high performance computing), and quantum computing to try to solve (to no avail so far) these very difficult yet basic problems. And the data sets involved in these problems are incredibly massive and entirely free: they consist of all the integers and real numbers! The first two problems have been addressed on Data Science Central (DSC) before; the other two are presented here on DSC for the first time.


Clojure Linear Algebra Refresher (2) - Eigenvalues and Eigenvectors

#artificialintelligence

If there are a scalar \(\lambda\) and a non-zero vector \(\mathbf{x}\) such that \(A\mathbf{x} = \lambda\mathbf{x}\), we call such a scalar an eigenvalue and such a vector an eigenvector. There can be more than one eigenvalue for a given matrix, and there is an infinite number of eigenvectors corresponding to one eigenvalue. An infinite number of vectors correspond to this \(\lambda\) value, but they are all linearly dependent: find one base vector, and you can construct any other by scaling it. As an exercise, you might check whether any linear combination of these two eigenvectors (column 0 and column 2 from the result matrix) is indeed an eigenvector (it should be!).
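The article itself works through these ideas in Clojure with the Neanderthal library; purely as an illustration, the defining property can be checked with NumPy (the matrix below is made up for the example, not taken from the article).

import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 4.0, 9.0]])

w, V = np.linalg.eig(A)            # eigenvalues w, eigenvectors as columns of V

# Check A x = lambda x for every eigenpair
for lam, x in zip(w, V.T):
    assert np.allclose(A @ x, lam * x)

# Scaling an eigenvector gives another eigenvector for the same eigenvalue
x0 = V[:, 0]
assert np.allclose(A @ (5.0 * x0), w[0] * (5.0 * x0))

Note that a linear combination of two distinct eigenvectors is again an eigenvector only when the two share the same eigenvalue, which is presumably the case for the two columns the article asks you to combine.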


Stay ahead of cyberattacks and fraud with predictive analytics

@machinelearnbot

In the United States, for example, insurance fraud--excluding health insurance fraud--incurs an estimated $40 billion in costs every year, boosting premiums across the board. As companies struggle to cut costs by mitigating the effects of fraud, predictive analytics algorithms scrutinize claims in a multistage process designed to help insurance companies efficiently detect and eliminate fraudulent activity by revealing insights into fraudulent patterns and claims data. By implementing IBM SPSS predictive analytics solutions, the Infinity Property and Casualty Corporation of Birmingham, Alabama, gained the ability to closely scrutinize claims histories, flagging suspicious claims for further investigation while fast-tracking legitimate claims. To learn more, discover the full scope of IBM SPSS predictive analytics capabilities.



Ridge Regression and the Lasso

#artificialintelligence

This post will be about two methods that slightly modify ordinary least squares (OLS) regression – ridge regression and the lasso. Like OLS, ridge attempts to minimize the residual sum of squares of a given model. However, ridge regression includes an additional 'shrinkage' term – the square of the coefficient estimates – which shrinks the coefficient estimates towards zero. Two interesting implications of this design are that when λ = 0 the OLS coefficients are returned, and as λ → ∞ the coefficients approach zero.
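As an illustrative sketch of that λ behaviour, scikit-learn exposes the penalty weight as alpha (playing the role of λ); the data below is synthetic and not from the post.

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([3.0, -2.0, 0.0, 0.0, 1.0]) + rng.normal(scale=0.5, size=100)

ols = LinearRegression().fit(X, y)
print("OLS:  ", np.round(ols.coef_, 3))

# alpha near 0 essentially reproduces the OLS coefficients;
# very large alpha shrinks every coefficient towards zero
for alpha in (1e-8, 1.0, 1e4):
    ridge = Ridge(alpha=alpha).fit(X, y)
    print(f"ridge alpha={alpha}:", np.round(ridge.coef_, 3))

# The lasso replaces the squared penalty with an absolute-value (L1) penalty
lasso = Lasso(alpha=0.1).fit(X, y)
print("lasso:", np.round(lasso.coef_, 3))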


Day88: Ridge Regression

#artificialintelligence

I ended up thinking about ridge regression today, and realized how foggy my recollection of it is. Therefore, in today's post I'll look into ridge regression. The data used is the Digit Recognizer MNIST data set available on Kaggle. From what I understood, ridge regression uses an L2 penalty, which does not drive feature weights to exactly zero (whereas an L1 penalty does).
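A small synthetic sketch of that difference (the post itself uses the Kaggle Digit Recognizer MNIST data, which is not reproduced here):

import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 50))
true_coef = np.zeros(50)
true_coef[:5] = [4.0, -3.0, 2.0, -1.5, 1.0]    # only 5 informative features
y = X @ true_coef + rng.normal(scale=0.5, size=200)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# The L2 penalty shrinks weights but almost never makes them exactly zero;
# the L1 penalty drives many weights to exactly zero
print("ridge weights equal to zero:", int(np.sum(ridge.coef_ == 0.0)))
print("lasso weights equal to zero:", int(np.sum(lasso.coef_ == 0.0)))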