
Council Post: What's Holding Back NLP In The Enterprise?


Dan Turchin is the Chief Executive Officer of PeopleReign, the system of intelligence for IT and HR employee service. A toddler learns language by seeing and doing. At peak learning age, three-year-olds learn about 400 words a month. Most toddlers immediately understand phenomenally complex properties of an everyday object like a cup: cups have weight and volume, and they hold fluid.

Bayesian Nonparametrics for Non-exhaustive Learning Machine Learning

Non-exhaustive learning (NEL) is an emerging machine-learning paradigm designed to confront the challenge of non-stationary environments characterized by a non-exhaustive training set lacking full information about the available classes. Unlike traditional supervised learning, which relies on fixed models, NEL uses self-adjusting machine learning to better accommodate the non-stationary nature of real-world problems, which is at the root of many recently discovered limitations of deep learning. Some of these hurdles led to a surge of interest in several research areas relevant to NEL, such as open-set classification and zero-shot learning. The presented study, motivated by two important applications, proposes a NEL algorithm built on a highly flexible, doubly non-parametric Bayesian Gaussian mixture model that can grow arbitrarily large in terms of the number of classes and their components. We report several experiments that demonstrate the promising performance of the introduced model for NEL.
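The paper's specific doubly non-parametric model is not spelled out here, but the core idea of a mixture whose effective number of components is inferred from the data rather than fixed can be illustrated with scikit-learn's Dirichlet-process variational mixture. This is a hedged sketch on synthetic data, not the authors' algorithm; the data, seeds, and 5% weight threshold are all illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)

# Synthetic 2-D data drawn from three Gaussian "classes"; the model
# is NOT told the true count, mimicking a non-exhaustive setting.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.3, size=(100, 2)),
    rng.normal(loc=[3.0, 3.0], scale=0.3, size=(100, 2)),
    rng.normal(loc=[0.0, 3.0], scale=0.3, size=(100, 2)),
])

# A Dirichlet-process prior caps the components at 10 and lets the
# variational fit decide how many receive appreciable weight.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Components with non-negligible posterior weight (threshold is arbitrary).
active = int(np.sum(dpgmm.weights_ > 0.05))
print(f"components with weight > 5%: {active}")
```

In this toy run the fit typically concentrates the mixture weight on a few components matching the clusters present, while the unused components keep near-zero weight and remain available if later data reveals a new class.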

Predicting Known Unknowns with TensorFlow Probability -- Industrial AI, Part 2


While the overall trend in crack size with respect to cycles is increasing, it is not strictly monotonic. At certain instances, the observed crack size is smaller than the previous measurement. This may very well be due to measurement error, a common feature of field data. In other words, the data is noisy.
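That behavior is easy to reproduce: a strictly increasing underlying signal plus zero-mean observation noise will occasionally yield a reading below its predecessor. The sketch below is illustrative only; the power-law growth curve, noise scale, and cycle range are assumptions, not the article's actual crack data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical underlying crack growth (mm): strictly increasing in
# load cycles. Power-law shape chosen for illustration only.
cycles = np.linspace(0.0, 1e5, 50)
true_crack = 0.5 + 2.0 * (cycles / 1e5) ** 1.5

# Field measurements add zero-mean noise, so consecutive observations
# can decrease even though the true crack size never does.
observed = true_crack + rng.normal(scale=0.08, size=cycles.shape)

# Count observations that came in below the previous reading.
dips = int(np.sum(np.diff(observed) < 0))
print(f"observations below their predecessor: {dips}")
```

Here the true signal has no decreases (`np.diff(true_crack)` is non-negative everywhere), yet the noisy series shows several apparent drops, which is exactly why a probabilistic model that separates trend from measurement noise is useful.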