determinant


A machine learning classifier approach for identifying the determinants of under-five child …

#artificialintelligence

This paper aimed to explore the efficacy of machine learning (ML) approaches in predicting under-five undernutrition in Ethiopian administrative …


Linear Algebra Part-2

#artificialintelligence

Unfortunately, no one can be told what the matrix is. You have to see it for yourself. In this blog, let's continue from where the previous blog left off. The blog covers 3-D linear transformations, determinants, inverse matrices, column space, null space, and non-square matrices as transformations between dimensions. Example: consider a linear transformation that takes 3-dimensional vectors as inputs and produces 3-dimensional vectors as outputs.
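
For concreteness, here is a minimal numpy sketch (not taken from the blog itself) of the kind of 3-D transformation described above: a 3x3 matrix applied to a 3-dimensional input vector, together with its determinant and inverse.

```python
import numpy as np

# A 3-D linear transformation represented as a 3x3 matrix
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 2.0]])

v = np.array([1.0, 2.0, 3.0])
w = A @ v                      # 3-D input vector mapped to a 3-D output vector

det_A = np.linalg.det(A)       # volume-scaling factor of the transformation
A_inv = np.linalg.inv(A)       # exists because det(A) != 0

print(w, det_A)                   # transformed vector and determinant
print(np.allclose(A_inv @ w, v))  # the inverse maps the output back to the input
```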


Understanding human-robot interaction critical in design of rehabilitation systems

#artificialintelligence

Robotic body-weight support (BWS) devices can play a key role in helping people with neurological disorders to improve their walking. The team that developed the advanced body-weight support device RYSEN in 2018 has since gained more fundamental insight into BWS, but also concludes that improvement in this field is necessary. They find that recommendations for optimal therapy settings have to be customized to each device, and that developers should be more aware of the interaction between the patient and the device. The researchers published the results of their evaluation in Science Robotics on Wednesday, September 22. Stroke, spinal cord injury and other neurological disorders can lead to impairments that severely impact quality of life. Intensive gait neurorehabilitation training can help these individuals regain mobility and lower the workload of rehabilitation therapists.


Mathematics For Machine Learning Course (FREE)

#artificialintelligence

Fabio Mardero is a data scientist from Italy. He graduated in physics and in statistical and actuarial sciences. He is currently working at a well-known Italian insurance company as a data scientist and non-life technical provisions evaluator. Linear Algebra and Mathematical Foundation: This course covers key elements of machine learning, vector spaces, matrices, linear independence and basis, and linear maps. Analytic Geometry: This course covers Lengths and Distances, Angles and Orthogonality, Orthogonal Projections, and Rotations.
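
As a small illustration of one of the Analytic Geometry topics listed above (not part of the course material itself), the following numpy sketch computes the orthogonal projection of a vector onto the column space of a matrix and checks that the residual is orthogonal to that subspace.

```python
import numpy as np

# Orthogonal projection onto the column space of A: P = A (A^T A)^{-1} A^T
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

P = A @ np.linalg.inv(A.T @ A) @ A.T
b_proj = P @ b

# The residual b - b_proj is orthogonal to every column of A
print(b_proj)
print(np.allclose(A.T @ (b - b_proj), 0.0))
```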


Machine learning and algorithmic fairness in public and population health - Nature Machine Intelligence

#artificialintelligence

Until now, much of the work on machine learning and health has focused on processes inside the hospital or clinic. However, this represents only a narrow set of tasks and challenges related to health; there is greater potential for impact by leveraging machine learning in health tasks more broadly. In this Perspective we aim to highlight potential opportunities and challenges for machine learning within a holistic view of health and its influences. To do so, we build on research in population and public health that focuses on the mechanisms between different cultural, social and environmental factors and their effect on the health of individuals and communities. We present a brief introduction to research in these fields, data sources and types of tasks, and use these to identify settings where machine learning is relevant and can contribute to new knowledge. Given the key foci of health equity and disparities within public and population health, we juxtapose these topics with the machine learning subfield of algorithmic fairness to highlight specific opportunities where machine learning, public and population health may synergize to achieve health equity.

Algorithmic solutions to improve treatment are starting to transform health care. Mhasawade and colleagues discuss in this Perspective how machine learning applications in population and public health can extend beyond clinical practice. While working with general health data comes with its own challenges, most notably ensuring algorithmic fairness in the face of existing health disparities, the area provides new kinds of data and questions for the machine learning community.


Identifiability of AMP chain graph models

arXiv.org Machine Learning

We study identifiability of Andersson-Madigan-Perlman (AMP) chain graph models, which are a common generalization of linear structural equation models and Gaussian graphical models. AMP models are described by DAGs on chain components which themselves are undirected graphs. For a known chain component decomposition, we show that the DAG on the chain components is identifiable if the determinants of the residual covariance matrices of the chain components are monotone non-decreasing in topological order. This condition extends the equal variance identifiability criterion for Bayes nets, and it can be generalized from determinants to any super-additive function on positive semidefinite matrices. When the component decomposition is unknown, we describe conditions that allow recovery of the full structure using a polynomial time algorithm based on submodular function minimization. We also conduct experiments comparing our algorithm's performance against existing baselines.
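
The sketch below (hypothetical data, plain numpy) illustrates only the monotonicity condition stated in the abstract, assuming the chain component decomposition, a candidate topological order, and the residual covariance matrices are already available; it is not the authors' recovery algorithm, which relies on submodular function minimization.

```python
import numpy as np

def dets_nondecreasing(residual_covs):
    """Check whether the determinants of the residual covariance matrices,
    listed in topological order of the chain components, are non-decreasing."""
    dets = [np.linalg.det(S) for S in residual_covs]
    return all(d0 <= d1 + 1e-12 for d0, d1 in zip(dets, dets[1:]))

# Hypothetical residual covariances for three chain components in topological order
covs = [
    np.array([[1.0]]),
    np.array([[1.5, 0.2],
              [0.2, 1.0]]),
    np.array([[2.0, 0.1],
              [0.1, 1.5]]),
]
print(dets_nondecreasing(covs))  # True for this example
```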


How to Decompose a Tensor with Group Structure

arXiv.org Machine Learning

In this work we study the orbit recovery problem, which is a natural abstraction for the problem of recovering a planted signal from noisy measurements under unknown group actions. Many important inverse problems in statistics, engineering and the sciences fit into this framework. Prior work has studied cases where the group is discrete and/or abelian. However, fundamentally new techniques are needed in order to handle more complex group actions. Our main result is a quasi-polynomial time algorithm to solve orbit recovery over $SO(3)$, i.e. the cryo-electron tomography problem, which asks to recover the three-dimensional structure of a molecule from noisy measurements of randomly rotated copies of it. We analyze a variant of the frequency marching heuristic in the framework of smoothed analysis. Our approach exploits the layered structure of the invariant polynomials, and simultaneously yields a new class of tensor decomposition algorithms that work in settings where the tensor is not low-rank but rather where the factors are algebraically related to each other by a group action.


Rectangular Flows for Manifold Learning

arXiv.org Machine Learning

Normalizing flows are invertible neural networks with tractable change-of-volume terms, which allows optimization of their parameters to be efficiently performed via maximum likelihood. However, data of interest is typically assumed to live in some (often unknown) low-dimensional manifold embedded in high-dimensional ambient space. The result is a modelling mismatch since -- by construction -- the invertibility requirement implies high-dimensional support of the learned distribution. Injective flows, mapping from low- to high-dimensional space, aim to fix this discrepancy by learning distributions on manifolds, but the resulting volume-change term becomes more challenging to evaluate. Current approaches either avoid computing this term entirely using various heuristics, or assume the manifold is known beforehand and therefore are not widely applicable. Instead, we propose two methods to tractably calculate the gradient of this term with respect to the parameters of the model, relying on careful use of automatic differentiation and techniques from numerical linear algebra. Both approaches perform end-to-end nonlinear manifold learning and density estimation for data projected onto this manifold. We study the trade-offs between our proposed methods, empirically verify that we outperform approaches ignoring the volume-change term by more accurately learning manifolds and the corresponding distributions on them, and show promising results on out-of-distribution detection.
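
To make the volume-change term concrete, here is a small numpy illustration (not the paper's method) that evaluates 0.5 * log det(J^T J) for a hypothetical injective map from R^2 into R^3, with the Jacobian approximated by finite differences; the paper instead computes gradients of this term tractably using automatic differentiation and numerical linear algebra.

```python
import numpy as np

def f(z):
    # Hypothetical injective map from R^2 into R^3 (a toy "injective flow")
    return np.array([z[0], z[1], z[0] ** 2 + np.sin(z[1])])

def jacobian_fd(fun, z, eps=1e-6):
    # Finite-difference Jacobian of shape (D, d)
    d = z.size
    cols = []
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        cols.append((fun(z + e) - fun(z - e)) / (2.0 * eps))
    return np.stack(cols, axis=1)

z = np.array([0.3, -1.2])
J = jacobian_fd(f, z)

# Change-of-volume term for an injective map: 0.5 * log det(J^T J)
sign, logdet = np.linalg.slogdet(J.T @ J)
print(0.5 * logdet)
```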


Density estimation on low-dimensional manifolds: an inflation-deflation approach

arXiv.org Machine Learning

Normalizing Flows (NFs) are universal density estimators based on neural networks. However, this universality is limited: the density's support needs to be diffeomorphic to a Euclidean space. In this paper, we propose a novel method to overcome this limitation without sacrificing universality. The proposed method inflates the data manifold by adding noise in the normal space, trains an NF on this inflated manifold and, finally, deflates the learned density. Our main result provides sufficient conditions on the manifold and the specific choice of noise under which the corresponding estimator is exact. Our method has the same computational complexity as NFs and does not require computing an inverse flow. We also show that, if the embedding dimension is much larger than the manifold dimension, noise in the normal space can be well approximated by Gaussian noise. This allows our method to be used to approximate arbitrary densities on non-flat manifolds, provided that the manifold dimension is known.
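
As a rough illustration of the inflation step only, under the Gaussian-noise approximation mentioned in the abstract (hypothetical data; flow training and the deflation correction are omitted), consider points on a circle embedded in R^3:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data on a 1-D manifold (a circle) embedded in R^3
theta = rng.uniform(0.0, 2.0 * np.pi, size=1000)
X = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)

# Inflation: add isotropic Gaussian noise as a stand-in for noise in the
# normal space (the approximation for large embedding dimension)
sigma = 0.05
X_inflated = X + sigma * rng.normal(size=X.shape)

# X_inflated now has full-dimensional support, so an ordinary normalizing flow
# can be trained on it; the learned density is then deflated as in the paper.
print(X_inflated.shape)
```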


Training Humans to Train Robots Dynamic Motor Skills

arXiv.org Artificial Intelligence

Learning from demonstration (LfD) is commonly considered a natural and intuitive way to allow novice users to teach motor skills to robots. However, it is important to acknowledge that the effectiveness of LfD depends heavily on the quality of teaching, something that may not be assured with novices. It remains an open question how best to guide demonstrators to produce informative demonstrations, beyond ad hoc advice for specific teaching tasks. To this end, this paper investigates the use of machine teaching to derive an index for determining the quality of demonstrations and evaluates its use in guiding and training novices to become better teachers. Experiments with a simple learner robot suggest that guidance and training of teachers through the proposed approach can lead to up to a 66.5% decrease in error in the learnt skill.