IBM


IBM, JDRF partnership using machine learning methods to tackle Type 1 diabetes

#artificialintelligence

The research collaboration will attempt to create an entry point into the field of precision medicine -- combining JDRF's connections to research teams around the globe, and its subject matter expertise in T1D research, with the technical capability and computing power of IBM. IBM scientists will look across at least three different data sets and apply machine learning algorithms to help find patterns and factors that may be at play, with the goal of identifying ways to delay or prevent T1D in children. As a result, JDRF will be in a better position to identify the top predictive risk factors for T1D, cluster patients based on those risk factors, and explore a number of data-driven models for predicting onset. "The deep expertise our team has in artificial intelligence applied to healthcare data makes us uniquely positioned to help JDRF unlock the insights hidden in this massive data set and advance the field of precision medicine towards the prevention and management of diabetes."


Watson Lab welcomes high school interns with access to AI and cognitive APIs - Watson

#artificialintelligence

Key Points: – We're kicking off Watson Lab's high school internship program for the spring semester – The curriculum prepares the students to work as software developers at IBM during their senior year. At Watson Lab we are piloting a high school internship program for the spring semester. One of the teachers, David Conover, IBM Champion for Cloud 2016 and 2017, introduced me to the Superintendent of Pflugerville ISD.


What's in a bad user review? Maybe your next breakthrough - Watson

#artificialintelligence

In fact, some reviews -- even negative ones -- can reveal insights that improve the customer experience. Negative feedback in particular can surface changes that boost your app's popularity, or at least deliver a better experience for your customers. The trick is distilling meaningful signals from reviews, both positive and negative. IBM is making it easier to analyze relevant customer reviews and uncover actionable intelligence buried in the data.


IBM aims machine learning at type 1 diabetes with JDRF partnership

#artificialintelligence

The partnership is meant to give type 1 diabetes research a foothold in emerging precision medicine efforts, officials say, combining JDRF's global research network with the computing power of IBM. The models that emerge should quantify the risk for juvenile diabetes from the combined dataset using this foundational set of features, officials say. That will enable JDRF to better identify top predictive risk factors, cluster patients based on them, and explore a number of data-driven models for predicting onset. Further on, the partners aim to put big data to work understanding the root causes of type 1 diabetes and hope to apply analytics to more complex datasets, such as microbiome, genomics, or transcriptomics data.
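The clustering step described above can be sketched with a toy example. Everything below is hypothetical for illustration -- the risk scores are made up, and a real analysis would use many features, not one -- but a minimal 1-D k-means shows the idea of grouping patients by risk:

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Toy 1-D k-means: group scalar risk scores into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        # Assign each value to its nearest center
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        # Recompute each center as its cluster's mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical per-patient risk scores in [0, 1] -- not real data
scores = [0.05, 0.1, 0.12, 0.7, 0.75, 0.8]
print(kmeans_1d(scores, k=2))
```

On this toy data the two cluster centers separate a low-risk group from a high-risk group; with real multi-feature data the same assign-then-update loop runs over feature vectors instead of scalars.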


Make your bets – Chatbot's Life

#artificialintelligence

With the evolution of the gaming industry, programming techniques, and processing power, computers have become increasingly challenging opponents in games typically reserved for human beings -- such as checkers, chess, and backgammon. In an article published in 1950, American mathematician Claude Shannon -- one of the most important names in the history of computer science -- estimated that chess allows at least 10¹²⁰ (ten to the power of one hundred and twenty) distinct possible games (this became known as the Shannon number). The number of simulations depends on the complexity of the program as well as the computing and storage power of the computer. In 1997, IBM's Deep Blue computer defeated then-world chess champion Garry Kasparov, after losing their first match in 1996.
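Shannon's figure is a back-of-the-envelope game-tree estimate, and the arithmetic behind it is easy to reproduce. The branching factor and game length below are his rough assumptions, not exact values:

```python
import math

# Shannon's rough assumptions: about 30 legal moves per position,
# and a typical game of about 40 moves per player (80 half-moves).
exact_estimate = 30 ** 80              # roughly 10^118
# Shannon rounded 30 * 30 ≈ 10^3 options per pair of moves,
# giving his famous figure of (10^3)^40:
shannon_number = (10 ** 3) ** 40       # exactly 10^120
print(f"~10^{math.log10(exact_estimate):.0f}")
print(f"10^{math.log10(shannon_number):.0f}")
```

The gap between 10^118 and 10^120 is just the rounding of 900 up to 1,000 per move pair; either way, the game tree is far too large to enumerate, which is why chess programs rely on search heuristics rather than brute force.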


IBM SPSS: Statistical Data Analysis Made Easy - Udemy

@machinelearnbot

Analyses ranging from simple statistics like descriptive statistics, graphs, cross-tabulation, correlation, and regression analysis, through hypothesis testing techniques like the t-test, chi-square, and ANOVA, to multivariate techniques like factor analysis, cluster analysis, conjoint analysis, multiple ANOVA, multiple regression, and hierarchical linear models can be run with a few clicks. Likewise, tests of normality like the K-S test and Shapiro-Wilk test, Levene's test of homogeneity of variances, Fisher's least significant difference (LSD) test, Cronbach's scale reliability, and many other complex statistical techniques can be calculated with ease. The course also covers normality tests, tests of homogeneity, and multiple comparison tests. After attending this course you will be able to create an SPSS file, define variables, enter data, run descriptive statistics, create graphs, find relationships between variables, test the linearity and normality of the data, run hypothesis tests, and interpret the results.
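As an illustration of the kind of two-sample comparison the course runs in SPSS with a few clicks, the statistic behind a t-test can also be computed by hand. The data below are made up for demonstration:

```python
import statistics

# Two hypothetical small samples (e.g. measurements from two groups)
a = [5.1, 4.9, 5.4, 5.0, 5.2]
b = [4.6, 4.8, 4.5, 4.9, 4.7]

mean_a, mean_b = statistics.mean(a), statistics.mean(b)
var_a, var_b = statistics.variance(a), statistics.variance(b)
n_a, n_b = len(a), len(b)

# Welch's t statistic: difference in means over the standard error
t = (mean_a - mean_b) / ((var_a / n_a + var_b / n_b) ** 0.5)
print(round(t, 2))
```

A large |t| (here about 3.77) suggests the group means differ by more than sampling noise would explain; SPSS adds the degrees of freedom and p-value on top of this same calculation.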


How to put the IBM built-in data scientist to work for you - Cloud computing news

#artificialintelligence

To help operations teams take action, the technology delivers insights that include forecasts, discovered relationships, correlations, and anomaly history. When IT environments change, the IBM technology simply adapts and learns the "new normal," avoiding the need to manually adjust data models and thresholds. One IBM banking client is using IBM Operations Analytics technology to manage its online banking application. In the third post, Kristian Stewart, senior technical staff member for IBM Analytics and Event Management, will explain how our approach delivers effectiveness and efficiency gains, at massive scale, through actionable insights from event data.
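The "new normal" idea described above can be sketched as a rolling baseline: flag a metric as anomalous only when it strays far from a recent window, so that a sustained shift eventually becomes the baseline instead of a stream of alerts. This is a toy illustration of the concept, not IBM's actual algorithm:

```python
import statistics
from collections import deque

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag points more than `threshold` std devs from the rolling mean.

    Because the baseline is recomputed from the sliding window, a
    sustained level shift is absorbed as the 'new normal' after a few
    samples, rather than triggering an alert on every point.
    """
    recent = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(series):
        if len(recent) == window:
            mean = statistics.mean(recent)
            std = statistics.pstdev(recent) or 1e-9
            if abs(x - mean) / std > threshold:
                anomalies.append(i)
        recent.append(x)
    return anomalies

# Hypothetical response-time metric with one spike at index 7
latencies = [100, 102, 99, 101, 100, 98, 101, 400, 100, 99]
print(detect_anomalies(latencies))
```

Only the spike is flagged; a production system would layer forecasting and correlation across many metrics on top of this kind of adaptive threshold.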


IBM's Watson to Listen in on 911 Calls – MeriTalk

#artificialintelligence

APCO International recently announced that its new guide card software, APCO IntelliComm™, will use IBM Watson Speech-to-Text and Watson Analytics to improve the scripts used by 911 operators. "Its extensive capabilities and unique analytic features will enable public safety communications professionals to improve response times and the quality of care on the scene while enhancing post-action data that's key to continuous improvement back at the PSAP." The APCO IntelliComm™ software will use Watson Speech-to-Text and other IBM Watson and machine learning capabilities to understand the context of emergency calls. That way, call center directors can quickly modify training and response communications, as well as provide on-the-spot coaching.


IBM's Watson is Becoming a Crime Fighter - Learn How It is Helping the Financial Industry

#artificialintelligence

IBM's newest cognitive computing offering is Financial Crimes Insight with Watson, which is designed to help banks spot financial crimes such as money laundering. The mission of this latest incarnation of Watson, the brainchild of the company's newly formed Watson Financial Services division, is to "[help] organizations efficiently manage financial investigation efforts through streamlined research and analysis of unstructured and structured data." This new suite of Watson products is aimed at helping financial institutions manage their regulatory and fiduciary obligations. For example, in addition to the Financial Crimes Insight with Watson product, IBM is also offering Watson Regulatory Compliance, which focuses on assisting financial institutions in understanding and addressing constantly changing regulatory requirements.


IBM: Wait Is Over for Deep Learning Light Reading

#artificialintelligence

Last week IBM Corp. (NYSE: IBM) announced software that cuts the time to train deep neural networks from weeks to hours, or from hours to minutes depending on the use case, while also improving accuracy. In a recent interview, IBM's Hillery Hunter walked Light Reading through the breakthrough, why it matters, timelines for deployment, and what it means for the telecom industry specifically. Hillery Hunter: It is very interesting, because deep learning works by feeding the neural networks many pieces of labeled data. Because of that, people have tolerated really long learning and model training times.