
Machine Learning using Advanced Algorithms and Visualization


Machine learning is the subfield of computer science that gives computers the ability to learn without being explicitly programmed. The course walks you through an example on letter recognition, where you train a program to recognize letters using a support vector machine, examine the results, and plot a confusion matrix. Tim Hoolihan is Senior Director of Data Science at DialogTech, a marketing analytics company focused on conversations.
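The workflow described above (train a support vector machine, examine the results, compute a confusion matrix) can be sketched with scikit-learn. This is a minimal illustration, not the course's actual code: it uses scikit-learn's built-in digits dataset as a stand-in for letter images, and the hyperparameters are illustrative assumptions.

```python
# Sketch of the described workflow: train an SVM classifier, check
# accuracy, and build a confusion matrix (which could then be plotted).
# The digits dataset stands in for letter data; parameters are illustrative.
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", gamma=0.001, C=10.0)  # assumed hyperparameters
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))

# Rows are true classes, columns are predicted classes; off-diagonal
# entries show which classes the model confuses with each other.
cm = confusion_matrix(y_test, pred)
```

A real letter-recognition exercise would swap in a letter dataset and typically visualize `cm` with a heatmap to spot systematic confusions.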

Redefining banking through AI and big data


The most useful AI technologies in the financial sector will be natural language processing for answering customers' questions; machine learning for processing back-office operations, replacing humans especially in tedious, repetitive tasks; and expert systems with predictive power, able to trade stocks automatically. The natural language processing system is handling over 30,000 conversations per month, satisfying over 75% of the bank's clients, who prefer to deal with transactions in the app or online. The innovation consists of replacing statistical models with cognitive, predictive models that track account activity to fight crime in its early stages, or even before it happens. There is still room for improvement in predictive modeling, fraud detection and prevention, as well as automated financial advice.

Learning from Experience: FDA's Treatment of Machine Learning


While the 21st Century Cures Act that passed last December exempted certain CDS from regulation, and indeed FDA intends to exempt even more, FDA will continue to regulate high-risk CDS. Initially such software was placed in class III – the highest regulatory oversight, for products with the greatest risk – but more recently FDA has placed that software in class II, for products of only moderate risk. In its 2012 guidance documents, FDA lists the information it expects, such as algorithm design, features, models, classifiers, the data sets used to train and test the algorithm, and the test data hygiene used. FDA has also begun to receive submissions to clear software that employs machine learning in what the agency refers to as "adaptive systems" – systems that evolve over time based on new evidence collected in the field after the device goes to market.

Turing Laureates Celebrate Award's 50th Anniversary

Communications of the ACM

Among the 22 Turing Laureates in attendance at the conference were: Front row, from left: Whitfield Diffie (2015), Martin Hellman (2015), Robert Tarjan (1986), Barbara Liskov (2008). Butler Lampson, the 1992 Turing Laureate ("for contributions to the development of distributed, personal computing environments and the technology for their implementation: workstations, networks, operating systems, programming systems, displays, security, and document publishing"), said, "There's plenty of room at the top; there's room in software, algorithms, and hardware." A panel on Moore's Law was moderated by John Hennessy (left) and included Doug Burger, Norman Jouppi, Butler Lampson (1992), and Margaret Martonosi.

Charles W. Bachman

Communications of the ACM

Charles William "Charlie" Bachman, the "father of databases" who received the ACM A.M. Turing Award for 1973 for creating the first database management system, died June 13 at the age of 92. Born in Manhattan, KS, in 1924, Bachman earned his B.S. in mechanical engineering in 1948, as well as an M.S. in mechanical engineering from the University of Pennsylvania. He went to work for Dow Chemical in 1950, using mechanical punched-card computing devices to solve networks of simultaneous equations representing data from Dow plants. In 1957, Bachman became head of Dow's Data Processing Department, through which he became a member of Share Inc., and a founding member of the Share Data Processing Committee. In 1960, Bachman joined the General Electric (GE) Production Control Services Group in New York City, using a factory in Philadelphia to test designs for a system to automate factory planning, scheduling, operational control, and inventory control. The resulting MIACS was based on the ...

Artificial intelligence and digital communication are disrupting the contact center space


The customer service (contact center) space is changing faster than the market has ever seen up to this point. Cloud vendors are turning the conservative call center space into cutting-edge contact centers by helping operators give up the expense and complexity of their hosted gear for easy-to-use, budget-friendly software-as-a-service products in the cloud. But even as digital channels have steadily reduced inbound call volume over the last five years, it was not until the recent resurgence of artificial intelligence that contact center operators seriously questioned their entire operations model. To put that in perspective, that's somewhere between a third and a half of the total projected cloud contact center software market value in the same time frame.

Flipboard on Flipboard


The researchers analyzed how computers interpreted words from Google News and an 840-billion-word dataset used by computer scientists, and they found that machines linked "male" and "man" with STEM fields and "woman" and "female" with chores. Another study published last summer found that when software based on Google News was asked "Man is to computer programmer as woman is to X," it responded, "homemaker." The way artificial intelligence identifies words and images is based on the way people use them, so in order to promote a more egalitarian world, engineers would have to intervene in the creation of the software. Eric Horvitz, director of Microsoft Research, told Wired that Microsoft has a committee for this.
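The "man is to computer programmer as woman is to X" result comes from simple vector arithmetic over word embeddings: the answer is the vocabulary word whose vector is nearest to v(programmer) − v(man) + v(woman). A minimal sketch with tiny hand-made vectors (the vocabulary and vectors below are invented for illustration; the studies used embeddings trained on Google News text):

```python
# Toy illustration of embedding-analogy arithmetic:
# answer(a : b :: c : ?) = word nearest to v(b) - v(a) + v(c).
# Vectors are hand-crafted for illustration only, not trained embeddings.
import numpy as np

emb = {
    "man":        np.array([1.0, 0.0, 0.2]),
    "woman":      np.array([0.0, 1.0, 0.2]),
    "programmer": np.array([1.0, 0.1, 0.9]),
    "homemaker":  np.array([0.1, 1.0, 0.9]),
    "stem":       np.array([0.9, 0.1, 0.8]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c):
    """Return the word nearest to v(b) - v(a) + v(c), excluding the inputs."""
    target = emb[b] - emb[a] + emb[c]
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("man", "programmer", "woman"))  # → homemaker
```

Because these toy vectors bake in the same male/career and female/home associations the researchers measured, the analogy reproduces the biased answer; with trained embeddings the bias comes from patterns in the source text rather than from any explicit rule.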

Microsoft launches Project Brainwave for real-time artificial intelligence


The system's ultra-low latency lets it process requests as fast as it receives them. Microsoft's Doug Burger added that the system architecture reduces latency, since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them. Microsoft is also planning to bring the real-time AI system to users in Azure. "With the 'Project Brainwave' system incorporated at scale and available to our customers, Microsoft Azure will have industry-leading capabilities for real-time AI," Burger noted.

Robot Process Automation: Reality of execution vs perception of future


With Robotic Process Automation (RPA) finding a purposeful and powerful friend in analytics, it can only turn into a bigger deal, because it is clear that process automation is the next logical step in the future of customer experience. Involve the IT team and SMEs: Ensure that your IT team understands how process automation tools differ from other tools in terms of security and deployment measures. Assess availability of in-house skills: Several skills are required, including selecting suitable processes and the best-suited tools, setting up the environment, building and testing, writing the necessary scripts, monitoring run times, and more.

Rob High: The future of AI-powered chatbots


Watson powers the company's chatbot GWYN (Gifts When You Need) and helps it detect user tone. GWYN interacts with online customers using natural language and is designed to understand the human intention behind each purchase by interpreting and asking several questions. Last year, an AI teaching assistant powered by IBM Watson helped moderate an online forum for a computer science class at Georgia Tech, and most students didn't find out they were interacting with AI. IBM Watson ran a pilot with the Australian government around Nadia, a virtual assistant platform that helps disabled people get information about government services.