- South America > Paraguay > Asunción > Asunción (0.04)
- North America > United States > Virginia (0.04)
- South America > Paraguay > Asunción > Asunción (0.04)
- North America > United States > Virginia (0.04)
- North America > United States > Virginia (0.04)
- Asia > Middle East > Jordan (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Information Technology > Data Science (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.67)
On Integrated Clustering and Outlier Detection
Lionel Ott, Linsey Pang, Fabio T. Ramos, Sanjay Chawla
The advantages of combining clustering and outlier selection include: (i) the resulting clusters tend to be compact and semantically coherent; (ii) the clusters are more robust against data perturbations; and (iii) the outliers are contextualised by the clusters, making them more interpretable. We provide a practical subgradient-based algorithm for the problem and also study the theoretical properties of the algorithm in terms of approximation and convergence. Extensive evaluation on synthetic and real data sets attests to both the quality and scalability of our proposed method.
- Asia > Afghanistan > Parwan Province > Charikar (0.04)
- North America > United States > New York > New York County > New York City (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
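The integrated idea above can be illustrated with a minimal sketch (my own toy variant in the spirit of k-means with outlier discarding, not the paper's subgradient formulation): at each iteration the points farthest from their nearest centroid are set aside as outliers, and centroids are refit on the inliers only, so clustering and outlier selection inform each other.

```python
import numpy as np

def kmeans_with_outliers(X, k, n_outliers, n_iters=50):
    """Joint clustering and outlier detection (toy sketch).

    Each iteration: assign points to their nearest centroid, flag the
    n_outliers points with the largest assignment distance as outliers,
    and recompute centroids from the inliers only.
    """
    centroids = X[:k].copy()  # simple deterministic initialization
    for _ in range(n_iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)  # (n, k)
        assign = d.argmin(axis=1)
        nearest = d.min(axis=1)
        outliers = np.argsort(nearest)[-n_outliers:]  # farthest points
        inlier_mask = np.ones(len(X), bool)
        inlier_mask[outliers] = False
        for j in range(k):
            members = inlier_mask & (assign == j)
            if members.any():
                centroids[j] = X[members].mean(axis=0)
    return centroids, assign, outliers

# Two tight clusters plus two far-away points that should be flagged.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 2)),
               rng.normal(5, 0.1, (20, 2)),
               [[20.0, 20.0], [-20.0, -20.0]]])
cents, assign, outliers = kmeans_with_outliers(X, k=2, n_outliers=2)
```

Because the far points never pull on the centroid updates, the recovered clusters stay compact, which is exactly advantage (i) claimed in the abstract.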
Tight Mutual Information Estimation With Contrastive Fenchel-Legendre Optimization
Guo, Qing, Chen, Junya, Wang, Dong, Yang, Yuewei, Deng, Xinwei, Carin, Lawrence, Li, Fan, Tao, Chenyang
Successful applications of InfoNCE and its variants have popularized the use of contrastive variational mutual information (MI) estimators in machine learning. While featuring superior stability, these estimators crucially depend on costly large-batch training, and they sacrifice bound tightness for variance reduction. To overcome these limitations, we revisit the mathematics of popular variational MI bounds through the lens of unnormalized statistical modeling and convex optimization. Our investigation not only yields a new unified theoretical framework encompassing popular variational MI bounds but also leads to a novel, simple, and powerful contrastive MI estimator named FLO. Theoretically, we show that the FLO estimator is tight, and it provably converges under stochastic gradient descent. Empirically, our FLO estimator overcomes the limitations of its predecessors and learns more efficiently. The utility of FLO is verified using an extensive set of benchmarks, which also reveals the trade-offs in practical MI estimation.
- Asia > Middle East > Jordan (0.04)
- South America > Paraguay > Asunción > Asunción (0.04)
- North America > United States > Virginia (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
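The abstract does not spell out the FLO estimator itself, but the InfoNCE baseline it improves on is easy to sketch: given a batch of paired samples and a critic score matrix, the bound is the mean of the true-pair score minus the log-mean-exp over negatives, and it is capped at log(batch size), which is why large batches are needed (the limitation FLO targets).

```python
import numpy as np

def infonce_bound(scores):
    """InfoNCE mutual-information lower bound from a critic score matrix.

    scores[i, j] = f(x_i, y_j); diagonal entries score the true pairs.
    The estimate is mean_i [ s_ii - log mean_j exp(s_ij) ] and can never
    exceed log(batch_size).
    """
    m = scores.max(axis=1, keepdims=True)  # stabilize the log-mean-exp
    lme = m.squeeze(1) + np.log(np.exp(scores - m).mean(axis=1))
    return float((np.diag(scores) - lme).mean())

# Correlated Gaussian pair: y = x + noise; a simple bilinear critic.
rng = np.random.default_rng(0)
x = rng.normal(size=(256, 8))
y = x + 0.1 * rng.normal(size=(256, 8))
scores = (x @ y.T) / np.sqrt(8)
est = infonce_bound(scores)
```

The log(batch size) ceiling is structural: even for strongly dependent variables the estimate here cannot pass log 256 ≈ 5.55 nats, illustrating the tightness/variance trade-off the abstract describes.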
FLO: Fast and Lightweight Hyperparameter Optimization for AutoML
Integrating ML models into software is of growing interest. Building accurate models requires the right choice of hyperparameters for the training procedures (learners) applied to a given training dataset. AutoML tools provide APIs that automate this choice, usually by running many trials with different hyperparameters on the training dataset. Since training and evaluating complex models can be time- and resource-consuming, existing AutoML solutions need a long time or large resources to produce accurate models for large-scale training data. This prevents AutoML from being embedded in software that must repeatedly tune hyperparameters and produce models to be consumed by other components, such as large-scale data systems. We present FLO, a fast and lightweight hyperparameter optimization method, and use it to build an efficient AutoML solution. Our method minimizes evaluation cost rather than the number of iterations needed to find accurate models. Our main idea is a holistic consideration of the relations among model complexity, evaluation cost, and accuracy. FLO has strong anytime performance and significantly outperforms Bayesian optimization and random search for hyperparameter tuning on a large open-source AutoML benchmark. Our AutoML solution also outperforms top-ranked AutoML libraries on a majority of the tasks in this benchmark.
- North America > United States > New York > New York County > New York City (0.14)
- North America > United States > Virginia > Albemarle County > Charlottesville (0.14)
- North America > United States > Washington > King County > Redmond (0.04)
- Asia > Middle East > Jordan (0.04)
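The core idea of budgeting by evaluation cost rather than trial count can be sketched as follows (an illustrative cheapest-first loop, not the actual FLO algorithm; the config format and saturating-accuracy model are my assumptions):

```python
import math

def cost_aware_search(configs, evaluate, budget):
    """Cheapest-first hyperparameter search sketch.

    configs  : list of (cost, params) pairs
    evaluate : params -> validation score (higher is better)
    budget   : total evaluation cost allowed; the budget caps the cost
               spent, not the number of trials run
    """
    best_score, best_params, spent = -math.inf, None, 0.0
    for cost, params in sorted(configs, key=lambda c: c[0]):
        if spent + cost > budget:
            break  # remaining budget cannot cover this trial
        spent += cost
        score = evaluate(params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score, spent

# Toy setting: "complexity" drives both evaluation cost and (saturating)
# accuracy, so cheap configs give most of the accuracy per unit cost.
configs = [(c, {"complexity": c}) for c in (1, 2, 4, 8, 16)]
evaluate = lambda p: 1 - 1 / (1 + p["complexity"])
best, score, spent = cost_aware_search(configs, evaluate, budget=7)
```

With a budget of 7 the loop evaluates the cost-1, cost-2, and cost-4 configurations and stops before the cost-8 one, getting a strong anytime result without ever paying for the most expensive trials.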
The 200 billion dollar chatbot disruption (part two)
In the last post, we highlighted the disruption that chatbot technologies are poised to make in call centers. To recap, Generations X and Y now show a preference for text-based communication over voice, so consumers increasingly want to talk with brands via messaging platforms like WhatsApp and Facebook Messenger. Simultaneously, there has been an explosion of conversational A.I. tools and frameworks in which natural language processing can be used to automate customer support inquiries. As the last installment discussed, this trend gives companies a compelling opportunity to drastically reduce the costs of running their call centers. But the disruption goes further.
- Information Technology > Services (0.36)
- Marketing (0.32)
- Information Technology > Communications > Social Media (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.64)
Do-It-Yourself Cat Door Recognizes Your Feline
Forster's cat, Timothy, is pretty cute, but Forster is tired of him bringing in dead or dying birds and mice and dropping them on the carpet. Eight years ago, an image-recognition software company solved the same problem with its company cat, Flo. Quantum Picture developed a cat door that let Flo in, but locked her out if it saw she was carrying something in her mouth. At the time, the door connected to a desktop computer that ran the program that snapped pictures of Flo and analyzed them as she approached the door. Now, Forster is determined to build Flo's door for Timothy on weekends, in between his usual consulting work in Simi Valley, Calif.
- North America > United States > California > Ventura County > Simi Valley (0.25)
- North America > United States > Massachusetts > Hampshire County > Amherst (0.05)
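The control logic described in the story is simple enough to sketch (a hypothetical reconstruction; the actual Quantum Picture system is not public, and the stub classifier and threshold here are my assumptions): each approach triggers a snapshot, a classifier scores it for prey, and the door unlocks only when the score is below a threshold.

```python
def door_controller(frames, classify, threshold=0.5):
    """Prey-detecting cat door sketch.

    frames   : iterable of images, in whatever form the classifier accepts
    classify : image -> estimated probability the cat is carrying prey
    Returns one "unlock"/"lock" decision per frame.
    """
    decisions = []
    for frame in frames:
        prey_prob = classify(frame)
        decisions.append("unlock" if prey_prob < threshold else "lock")
    return decisions

# Stub classifier: pretend each image is a dict with a precomputed score.
frames = [{"prey": 0.05}, {"prey": 0.92}, {"prey": 0.30}]
decisions = door_controller(frames, classify=lambda f: f["prey"])
```

In a real build the stub would be replaced by an image-recognition model, and the decision would drive the door's lock actuator instead of returning a list.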