How Cognitive Bias in AI Impacts Business Outcomes

#artificialintelligence

With billions of dollars at stake, decision-makers need to set boundaries and parameters for AI to avoid the downsides of the technology. It is critical to know how to avoid common mistakes with neural networks before trusting them in your solution stack. Different AI techniques process information differently, and it is essential to understand how each works before applying it in business. For instance, some inputs -- such as the reasoning behind the outcome of an insurance claim -- may have no straightforward machine-learning representation because they are open to multiple interpretations. In such cases, a neural network's output is unlikely to be of high quality.


ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING

#artificialintelligence

Artificial Intelligence (A.I.) will soon be at the heart of every major technological system in the world, including cyber and homeland security, payments, financial markets, biotech, healthcare, marketing, natural language processing, computer vision, electrical grids, nuclear power plants, air traffic control, and the Internet of Things (IoT). While A.I. seems to have only recently captured the attention of humanity, it has in fact existed as a technological discipline for over 60 years. In the late 1950s, Arthur Samuel wrote a checkers-playing program that could learn from its mistakes and thus, over time, became better at the game. MYCIN, the first rule-based expert system, was developed in the early 1970s and could diagnose blood infections from the results of various medical tests; it performed better than non-specialist doctors. While Artificial Intelligence is becoming a major staple of technology, few people understand the benefits and shortcomings of A.I. and Machine Learning technologies. Machine learning is the science of getting computers to act without being explicitly programmed, and it is applied in fields such as computer vision, speech recognition, NLP, web search, biotech, risk management, cyber security, and many others.
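Samuel's idea of "acting without being explicitly programmed" can be made concrete with a toy sketch: instead of hardcoding a decision rule, the program derives one from labeled examples. The data, function name, and threshold strategy below are purely illustrative assumptions, not from any of the articles above.

```python
# A minimal sketch of learning a rule from data instead of hardcoding it:
# find the numeric threshold that best separates labeled examples.

def learn_threshold(examples):
    """Pick the threshold that best separates (value, label) pairs."""
    values = sorted(v for v, _ in examples)
    # Candidate thresholds: midpoints between adjacent observed values.
    candidates = [(a + b) / 2 for a, b in zip(values, values[1:])]
    best_t, best_correct = values[0], -1
    for t in candidates:
        # Count how many examples the rule "value > t" classifies correctly.
        correct = sum((v > t) == label for v, label in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Labeled training data: values above roughly 5 are "positive" (True).
data = [(1, False), (2, False), (4, False), (6, True), (7, True), (9, True)]
t = learn_threshold(data)
print(t)      # learned decision boundary: 5.0
print(8 > t)  # classify a new, unseen value: True
```

The program was never told "5 is the boundary"; it recovered that rule from the examples, which is the essence of the machine-learning definition quoted above, writ small.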


Exploring the Artificially Intelligent Future of Finance

#artificialintelligence

With technological enhancements increasing computing power, decreasing its cost, easing access to big data, and improving algorithms, interest in artificial intelligence, machine learning, and its subset, deep learning, has surged in recent years. The popularity of smartphones, wearables, and social media platforms has led to an explosion in the amount of data being recorded, and AI offers the only practical way to make use of it. Digital disruption in financial services has produced hundreds of emerging startups offering new ways for people to bank, forcing traditional institutions to undergo an innovation overhaul and integrate new technological advancements in order to compete. To celebrate London's 3rd Annual (15-22 July), we spoke to experts in the field to find out how and why, and, most importantly, what we can expect in the future. What have been the leading factors enabling recent advancements and the uptake of deep learning? Jan: Astonishing increases in computing power and data availability in recent years have been the main drivers of deep learning technology.