Computers have become adept at extracting patterns from very large collections of data. For example, shopping transactions can reveal consumers' preferences and message traffic on social networks can reveal political trends.
When you think of the words "data" and "mine", no doubt the idea of data mining comes first. However, just as we find value in mining the rich resources of data, we can also apply advanced data techniques to real-world mining -- that is, extracting natural resources from the earth. The world is just as dependent on natural resources as it is on data resources, so it makes sense to examine how the evolving fields of artificial intelligence and machine learning are affecting mining and natural resource extraction. Mining has always been a hazardous profession, since extracting minerals, natural gas, petroleum, and other resources requires working in conditions that endanger human life. Increasingly, we need to go to harsher environments, such as deep under the ocean or deep inside the earth, to extract the resources we still depend on.
AI has become the need of the hour, and industries across the board are now integrating analytics and AI into their decision-making processes. Bhagirath Kumar Lader, Chief Manager (Business Information System) at GAIL, led a session briefing business leaders on Artificial Intelligence essentials for today's age. Lader is one of the key members of the digital transformation team at GAIL and has deep knowledge of how AI, ML and DL are crucial to businesses. He gave a quick overview of the motivation for AI, AI essentials, and AI hype versus reality, while walking through use cases. While AI is a crucial part of business, one of the key drivers of its adoption is its ability to make decisions, which is usually considered the forte of humans.
Singapore has kicked off efforts to develop a framework to ensure the "responsible" adoption of artificial intelligence (AI) and data analytics in credit risk scoring and customer marketing. Two teams comprising banks and industry players have been tasked to establish metrics that can assist financial institutions in ensuring the "fairness" of their AI and data analytics tools in these instances. The Monetary Authority of Singapore (MAS) said a whitepaper detailing the metrics would be published by year-end along with an open source code to enable financial institutions to adopt the metrics. These organisations then would be able to integrate the open source code into their own IT systems to assess the fairness of their AI applications, the industry regulator said in a statement Friday. It added that the open source code would be deployed on the online global marketplace and sandbox, API Exchange (APIX), which enabled fintech and FSI companies to integrate and test applications via a cloud-based platform.
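The MAS whitepaper and open source code were not yet published at the time of writing, so the metrics themselves are not specified here. As an illustration only, one widely used fairness metric for a credit-approval model is the demographic parity difference: the gap in approval rates between two groups. A minimal sketch, assuming binary approve/deny decisions and two hypothetical group labels "A" and "B":

```python
# Illustrative only -- NOT the MAS metrics. Computes the demographic parity
# difference: the absolute gap in approval rates between two groups.
def demographic_parity_difference(approved, group):
    """approved: list of 0/1 decisions; group: parallel list of 'A'/'B' labels."""
    rate = {}
    for g in ("A", "B"):
        decisions = [a for a, grp in zip(approved, group) if grp == g]
        rate[g] = sum(decisions) / len(decisions)  # approval rate per group
    return abs(rate["A"] - rate["B"])

# Hypothetical decisions from a credit-scoring model:
approvals = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(demographic_parity_difference(approvals, groups))  # → 0.5
```

A value near zero suggests similar treatment across groups; a large gap flags the model for closer review. Real fairness assessments typically combine several such metrics rather than relying on one.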
Automation has been used for decades in a wide range of industries to boost efficiency and productivity, reduce waste and ensure quality and safety. Emerging technologies such as Artificial Intelligence (AI), Natural Language Processing (NLP) and big data analytics are now being combined with automation, to deal with more complex problems and bring further improvements to business processes. This convergence of automation and intelligence is known as hyper automation. Also known as cognitive or smart automation, hyper automation is at the forefront of the 4th Industrial Revolution and is gradually making its way into every aspect of business, delivering unprecedented results. There are a number of factors driving the adoption of hyper automation among enterprises, including the ability to improve operational and service performance.
Talend has released the latest update to its Talend Data Fabric platform, adding several new features, including AI/ML capabilities, to more quickly reveal latent intelligence held inside dispersed enterprise data. The Talend Winter '20 release delivers trusted data quickly, reliably and at first sight for faster business outcomes, according to Talend execs. "The innovations introduced in Talend Data Fabric will provide our customers with dramatically improved efficiency, optimized productivity and scale, and an accelerated path to revealing value from data," said Ciaran Dynes, Talend's senior vice president of products, in a statement. Here's a list of notable features in Talend's Winter '20 release, and how they deliver value. Data Inventory: This new cloud-based app automatically inventories and quality-checks data to reveal trusted data quickly and easily.
The insurance industry is well past the time when a timely response and a balanced price-quality relationship were enough to define customer experience. The advent of Artificial Intelligence, Machine Learning, and Advanced Analytics has disrupted the insurance industry and reshaped the way it operates. Insurtech firms are now using their AI and ML capabilities to drive high-quality customer experiences, increase loyalty, and generate new revenue while simultaneously reducing costs. The vision of insurance firms, today and for the future, is one where customers and customer experience come first. AI and ML models built on top of a Customer Data Platform lead to improved customer experience through hyper-personalization.
This course focuses on how to use KNIME Analytics Platform for in-database processing and writing/loading data into a database. Get an introduction to the Apache Hadoop ecosystem and learn how to write/load data into your big data cluster running on premise or in the cloud on Amazon EMR, Azure HDInsight, Databricks Runtime or Google Dataproc. Learn about the KNIME Spark Executor, preprocessing with Spark, machine learning with Spark, and how to export data back into KNIME/your big data cluster. This course lets you put everything you've learnt into practice in a hands-on session based on the use case: eliminating missing values by predicting their values based on other attributes. This course consists of four 75-minute online sessions run by one of our KNIME data scientists. Each session has an exercise for you to complete at home, and together we will go through the solution at the start of the following session.
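The course's use case — predicting missing values from other attributes — is a form of model-based imputation. The course itself implements this with KNIME and Spark nodes; as a language-neutral sketch of the idea, the snippet below fits an ordinary least-squares line on the complete rows of a toy two-column dataset and uses it to fill the missing values (the data and column names are hypothetical):

```python
# Illustrative sketch of model-based imputation: predict a missing y from x
# using a least-squares line fitted on the complete rows.
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def impute(rows):
    """rows: (x, y) pairs where y may be None; fill each None with a prediction."""
    complete = [(x, y) for x, y in rows if y is not None]
    a, b = fit_line(*zip(*complete))
    return [(x, y if y is not None else a * x + b) for x, y in rows]

# Toy dataset: the third row is missing its y value.
data = [(1, 2.0), (2, 4.1), (3, None), (4, 8.0)]
print(impute(data))  # the None at x=3 is replaced by roughly 6.0
```

In practice (and in the course), the predictor is usually a multivariate model over many attributes rather than a single regressor, but the principle — train on complete rows, predict the gaps — is the same.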
A major marketing firm has turned to IBM Watson Studio, and its data, to create an interactive platform that predicts the risk, readiness and recovery periods for counties hit by the coronavirus. Global digital marketing firm Wunderman Thompson launched its Risk, Readiness and Recovery map, an interactive platform that helps enterprises and governments make market-level decisions, amid the coronavirus pandemic. The platform, released May 21, uses Wunderman Thompson's data, as well as machine learning technology from IBM Watson, to predict state and local government COVID-19 preparedness and estimated economic recovery timetables for businesses and governments. The idea for the Risk, Readiness and Recovery map, a free version of which is available on Wunderman Thompson's website, originated two months ago as the global pandemic accelerated, said Adam Woods, CTO at Wunderman Thompson Data. "We were looking at some of the visualizations that were coming in around COVID-19, and we were inspired to really say, let's look at the insight that we have and see if that can make a difference," Woods said.
Capstone (3 Credits): A semester-long group project in which teams of students propose and select project ideas, conduct and communicate their work, receive and provide feedback (in informal group discussions and formal class presentations), and deliver compelling presentations along with a web-based final deliverable. Includes relevant readings, case discussions, and real-world examples and perspectives from panel discussions with leading data science experts and industry practitioners.