"We are just starting to discover the countless ways we can apply cognitive computing to healthcare," said Ryan Pellet, senior vice president of consulting and services for Welltok. "We are excited to have addressed a costly and cumbersome issue with our proprietary technology and IBM Watson, and will continue to explore opportunities to simplify the consumer's experience and drive new, more effective ways to engage with and satisfy them."
Scientists can now monitor and record the activity of hundreds of neurons concurrently in the brain, and ongoing technology developments promise to increase this number manyfold. However, simply recording the neural activity does not automatically lead to a clearer understanding of how the brain works. In a new review paper published in Nature Neuroscience, Carnegie Mellon University's Byron M. Yu and Columbia University's John P. Cunningham describe the scientific motivations for studying the activity of many neurons together, along with a class of machine learning algorithms, known as dimensionality reduction, for interpreting that activity. In recent years, dimensionality reduction has provided insight into how the brain distinguishes between different odors, makes decisions in the face of uncertainty, and is able to think about moving a limb without actually moving it. Yu and Cunningham contend that using dimensionality reduction as a standard analytical method will make it easier to compare activity patterns in healthy and abnormal brains, ultimately leading to improved treatments and interventions for brain injuries and disorders.
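To make the idea concrete, here is a minimal sketch of one common dimensionality reduction method, principal component analysis (PCA), applied to simulated population recordings. The data, latent dimensionality, and noise level are all invented for illustration; the review by Yu and Cunningham covers PCA alongside several other methods, and this is not drawn from their paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 100     # neurons recorded simultaneously
n_timepoints = 500  # time bins of recorded activity

# Simulate activity driven by 3 shared latent factors plus
# independent per-neuron noise (a toy stand-in for real recordings)
latents = rng.standard_normal((n_timepoints, 3))
loadings = rng.standard_normal((3, n_neurons))
activity = latents @ loadings + 0.5 * rng.standard_normal((n_timepoints, n_neurons))

# PCA via SVD of the mean-centered data matrix
centered = activity - activity.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
variance_explained = s**2 / np.sum(s**2)

# Project the 100-dimensional activity onto the top 3 components,
# recovering a low-dimensional "neural trajectory" over time
low_dim = centered @ Vt[:3].T   # shape: (500, 3)
```

Because only three latent factors drive the simulated population, the first three principal components capture most of the variance, which is exactly the kind of low-dimensional structure these methods are used to find in real neural data.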
IBM (NYSE: IBM) today revealed a series of new servers designed to help propel cognitive workloads and to drive greater data center efficiency. Featuring a new chip, the Linux-based lineup incorporates innovations from the OpenPOWER community that deliver higher levels of performance and greater computing efficiency than available on any x86-based server. Collaboratively developed with some of the world's leading technology companies, the new Power Systems are uniquely designed to propel artificial intelligence, deep learning, high performance data analytics and other compute-heavy workloads, which can help businesses and cloud service providers save money on data center costs. The three new systems expand IBM's Linux server portfolio, a specialized line of servers co-developed with fellow members of the OpenPOWER Foundation. They join the Power Systems LC lineup, which is designed to outperform x86-based servers on a variety of data-intensive workloads.