Track Xplorer: A System for Visual Analysis of Sensor-based Motor Activity Predictions

arXiv.org Artificial Intelligence

With the rapid commoditization of wearable sensors, detecting human movements from sensor datasets has become increasingly common over a wide range of applications. To detect activities, data scientists iteratively experiment with different classifiers before deciding which model to deploy. Effective reasoning about and comparison of alternative classifiers are crucial in successful model development. This is, however, inherently difficult in developing classifiers for sensor data, where the intricacy of long temporal sequences, high prediction frequency, and imprecise labeling make standard evaluation methods relatively ineffective and even misleading. We introduce Track Xplorer, an interactive visualization system to query, analyze, and compare the predictions of sensor-data classifiers. Track Xplorer enables users to interactively explore and compare the results of different classifiers, and assess their accuracy with respect to the ground-truth labels and video. Through integration with a version control system, Track Xplorer supports tracking of models and their parameters without additional workload on model developers. Track Xplorer also contributes an extensible algebra over track representations to filter, compose, and compare classification outputs, enabling users to reason effectively about classifier performance. We apply Track Xplorer in a collaborative project to develop classifiers to detect movements from multisensor data gathered from Parkinson's disease patients. We demonstrate how Track Xplorer helps identify possible systemic data errors early on, effectively track and compare the results of different classifiers, and reason about and pinpoint the causes of misclassifications.
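
The abstract's "algebra over track representations" suggests set-like operations for filtering, composing, and comparing per-time-step prediction tracks. The sketch below is a hypothetical illustration of what such operations might look like; the Track class, the operator choices, and the use of boolean NumPy arrays are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

class Track:
    """A hypothetical track: a boolean array marking, per time step,
    whether a classifier predicted the activity of interest."""
    def __init__(self, name, values):
        self.name = name
        self.values = np.asarray(values, dtype=bool)

    # Compose tracks with set-like operators.
    def __and__(self, other):   # intersection: both tracks fire
        return Track(f"({self.name} AND {other.name})", self.values & other.values)

    def __or__(self, other):    # union: either track fires
        return Track(f"({self.name} OR {other.name})", self.values | other.values)

    def __sub__(self, other):   # difference: fires here but not there
        return Track(f"({self.name} MINUS {other.name})", self.values & ~other.values)

    def agreement(self, other):
        """Fraction of time steps on which two tracks agree (a simple comparison)."""
        return float(np.mean(self.values == other.values))

# Usage: compare two classifier outputs against ground-truth labels.
ground_truth = Track("labels",  [1, 1, 0, 0, 1, 1, 0, 0])
model_a      = Track("model_a", [1, 0, 0, 0, 1, 1, 1, 0])
model_b      = Track("model_b", [1, 1, 0, 1, 1, 0, 0, 0])

false_positives_a = model_a - ground_truth                      # predicted but not labeled
disagreement = (model_a | model_b) - (model_a & model_b)        # where the two models differ
print(model_a.agreement(ground_truth), model_b.agreement(ground_truth))
```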


Future of Artificial Intelligence - You May Question the Capabilities but Cannot Ignore

#artificialintelligence

In the current era, emerging technologies are capturing our interest. Enterprises are racing to adopt business analytics, data science, machine intelligence, robotics, cryptocurrency, blockchain, the Internet of Things, cybersecurity, augmented reality and much more. Autonomous vehicles, artificial intelligence, drones and virtual reality are trending topics of discussion. Even as GDPR comes into force on May 25 and concerns about cybersecurity mount, we can neither give up our smartphone addiction nor stay away from social media. We can fathom the impact of technology when we see the changes in consumer behavior.


RetainVis: Visual Analytics with Interpretable and Interactive Recurrent Neural Networks on Electronic Medical Records

arXiv.org Machine Learning

In the past decade, we have seen many successful applications of recurrent neural networks (RNNs) on electronic medical records (EMRs), which contain histories of patients' diagnoses, medications, and various other events, in order to predict the current and future states of patients. Despite the strong performance of RNNs, it is often very challenging for users to understand why the model makes a particular prediction. This black-box nature of RNNs can impede their wide adoption in clinical practice. Furthermore, we have no established method to interactively leverage users' domain expertise and prior knowledge as inputs for steering the model. Therefore, our design study aims to provide a visual analytics solution to increase the interpretability and interactivity of RNNs via a joint effort of medical experts, artificial intelligence scientists, and visual analytics researchers. Following an iterative design process among the experts, we design, implement, and evaluate a visual analytics tool called RetainVis, which couples a recently proposed, interpretable RNN-based model called RETAIN with visualizations for users' exploration of EMR data in the context of prediction tasks. Our study shows the effective use of RetainVis for gaining insights into how the RNN models EMR data, using real medical records of patients with heart failure, cataract, or dermatological symptoms. Our study also demonstrates how we made substantial changes to the state-of-the-art RNN model called RETAIN in order to make use of temporal information and increase interactivity. This study will provide a useful guideline for researchers who aim to design more interpretable and interactive visual analytics tools for RNNs.
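
RETAIN's interpretability comes from two levels of attention: visit-level weights (alpha) and variable-level weights (beta) that together determine how much each past visit, and each clinical code within it, contributes to a prediction. The snippet below is a minimal NumPy sketch of that aggregation step under simplifying assumptions (random weights standing in for the two trained attention RNNs, made-up dimensions); it is not the RetainVis implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
num_visits, num_codes, emb_dim = 5, 20, 8

# One patient: a multi-hot vector of clinical codes per visit.
visits = rng.integers(0, 2, size=(num_visits, num_codes)).astype(float)

# Stand-ins for trained parameters (assumptions for illustration only).
W_emb   = rng.normal(size=(num_codes, emb_dim))  # code embedding
w_alpha = rng.normal(size=emb_dim)               # visit-level attention vector
W_beta  = rng.normal(size=(emb_dim, emb_dim))    # variable-level attention matrix
w_out   = rng.normal(size=emb_dim)               # output layer

v = visits @ W_emb                               # visit embeddings, shape (T, d)

# In RETAIN these weights come from two RNNs run in reverse time;
# here we apply the attention transforms directly to the embeddings.
alpha_scores = v @ w_alpha                                    # one scalar per visit
alpha = np.exp(alpha_scores) / np.exp(alpha_scores).sum()     # softmax over visits
beta = np.tanh(v @ W_beta)                                    # per-dimension weights, (T, d)

# Context vector: attention-weighted sum over visits.
context = (alpha[:, None] * beta * v).sum(axis=0)

# Risk score for the prediction task (e.g., heart failure onset).
risk = 1.0 / (1.0 + np.exp(-(context @ w_out)))
print(f"predicted risk: {risk:.3f}")

# Interpretability: each visit's contribution is roughly alpha[t] * (beta[t] * v[t]) @ w_out.
contributions = alpha * ((beta * v) @ w_out)
print("per-visit contributions:", np.round(contributions, 3))
```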


Innovations in Artificial intelligence, Machine Learning, Cloud, and Blockchain

#artificialintelligence

This edition of ITCC TOE provides a snapshot of the emerging ICT led innovations in machine learning, blockchain, cloud computing, and artificial intelligence. This issue focuses on the application of information and communication technologies in alleviating the challenges faced across industry sectors in areas such as brick & mortar retail, e-commerce, data labelling, 5G, photo and video editing, manufacturing, talent and business intelligence, amongst others. ITCC TechVision Opportunity Engine (TOE)'s mission is to investigate emerging wireless communication and computing technology areas including 3G, 4G, Wi-Fi, Bluetooth, Big Data, cloud computing, augmented reality, virtual reality, artificial intelligence, virtualization and the Internet of Things and their new applications; unearth new products and service offerings; highlight trends in the wireless networking, data management and computing spaces; provide updates on technology funding; evaluate intellectual property; follow technology transfer and solution deployment/integration; track development of standards and software; and report on legislative and policy issues and many more. The Information & Communication Technology cluster provides global industry analysis, technology competitive analysis, and insights into game-changing technologies in the wireless communication and computing space. Innovations in ICT have deeply permeated various applications and markets.


Why You Must Treat Artificial Intelligence (AI) As A Very Special Technology

#artificialintelligence

There are lots of technologies that attract our attention – and money – these days. We're obsessed with blockchain, cryptocurrency, IoT, big data analytics, cybersecurity, 3-D printing and drones. We're excited about virtual reality, augmented reality and mixed reality. We love talking about driverless cars, ships and planes. We can't wait for 5G and Wi-Fi domes that solve all of our network access problems; and while we're getting a little worried about social media and privacy, we're still addicted to our ever-more-powerful smartphones.


Most companies still don't understand big data and AI -- and their potential?

#artificialintelligence

For nearly every industry, from communications to energy, architecture to real estate, the power of big data to provide intelligent insight can't be overstated. Both the fine-grained detail and the big picture become visible, captured through careful analysis of huge datasets. But make no mistake: huge isn't an exaggeration. When a dataset reaches into billions of points of information, it simply exceeds human capacities. That's where cutting-edge artificial intelligence (AI) and machine learning come in.