Data Science


Artificial Intelligence - Atos

#artificialintelligence

Worldwide spending on artificial intelligence is expected to reach €40 billion in 2020. Human-centric industries such as financial services, retail and healthcare are expected to be the biggest spenders, closely followed by asset-intensive industries such as manufacturing, energy & utilities, and transport.


[L4-BD] Introduction to Big Data with KNIME Analytics Platform - Online

#artificialintelligence

This course focuses on how to use KNIME Analytics Platform for in-database processing and for writing/loading data into a database. Get an introduction to the Apache Hadoop ecosystem and learn how to write/load data into your big data cluster running on premise or in the cloud on Amazon EMR, Azure HDInsight, Databricks Runtime or Google Dataproc. Learn about the KNIME Spark Executor, preprocessing with Spark, machine learning with Spark, and how to export data back into KNIME/your big data cluster. The course lets you put everything you've learnt into practice in a hands-on session based on the use case of eliminating missing values by predicting them from the other attributes. It consists of four 75-minute online sessions run by one of our KNIME data scientists. Each session has an exercise for you to complete at home, and together we will go through the solution at the start of the following session.
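As a rough illustration of that exercise idea, here is a minimal PySpark sketch that predicts a column's missing values from the other attributes. The column names and the file path are hypothetical, and the course itself builds the equivalent workflow with KNIME's visual Spark nodes rather than hand-written code.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("impute-by-prediction").getOrCreate()

# Hypothetical dataset with a numeric "age" column that has gaps.
df = spark.read.csv("measurements.csv", header=True, inferSchema=True)

assembler = VectorAssembler(inputCols=["height", "weight"], outputCol="features")
known = assembler.transform(df.dropna(subset=["age"]))        # rows where "age" is present
unknown = assembler.transform(df.filter(df["age"].isNull()))  # rows to impute

# Train on the complete rows, then predict the missing values.
model = LinearRegression(featuresCol="features", labelCol="age").fit(known)
imputed = model.transform(unknown)   # adds a "prediction" column
imputed.select("height", "weight", "prediction").show(5)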


Global Big Data Conference

#artificialintelligence

A major marketing firm has turned to IBM Watson Studio, and its data, to create an interactive platform that predicts the risk, readiness and recovery periods for counties hit by the coronavirus. Global digital marketing firm Wunderman Thompson launched its Risk, Readiness and Recovery map, an interactive platform that helps enterprises and governments make market-level decisions, amid the coronavirus pandemic. The platform, released May 21, uses Wunderman Thompson's data, as well as machine learning technology from IBM Watson, to predict state and local government COVID-19 preparedness and estimated economic recovery timetables for businesses and governments. The idea for the Risk, Readiness and Recovery map, a free version of which is available on Wunderman Thompson's website, originated two months ago as the global pandemic accelerated, said Adam Woods, CTO at Wunderman Thompson Data. "We were looking at some of the visualizations that were coming in around COVID-19, and we were inspired to really say, let's look at the insight that we have and see if that can make a difference," Woods said.


fbprophet

#artificialintelligence

Prophet implements a procedure for forecasting time series data based on an additive model in which non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It works best with time series that have strong seasonal effects and several seasons of historical data. Prophet is robust to missing data and shifts in the trend, and typically handles outliers well.
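For orientation, a minimal usage sketch of the fbprophet Python API. The CSV path is a placeholder; Prophet expects an input frame with a "ds" datestamp column and a "y" value column.

import pandas as pd
from fbprophet import Prophet  # newer releases ship under the package name "prophet"

# Placeholder file: any daily series with "ds" (date) and "y" (value) columns.
df = pd.read_csv("example_daily_series.csv")

m = Prophet()                  # seasonalities and holiday effects are configurable
m.fit(df)

future = m.make_future_dataframe(periods=30)   # extend 30 days beyond the history
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())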


Business Analytics or a Data Science Degree?

#artificialintelligence

Capstone (3 Credits): A semester-long group project in which teams of students propose and select project ideas, conduct and communicate their work, receive and provide feedback (in informal group discussions and formal class presentations), and deliver compelling presentations along with a web-based final deliverable. Includes relevant readings, case discussions, and real-world examples and perspectives from panel discussions with leading data science experts and industry practitioners.


Global Big Data Conference

#artificialintelligence

B2B software sales and marketing teams love hearing the term "artificial intelligence" (AI). AI has a smoke-and-mirrors effect: when we say "AI is doing this," our buyers often know so little about AI that they don't ask the hard questions. In industries like the DevTools space, it is crucial that buyers understand both what products do and what their limitations are, to ensure that these products meet their needs. If the purpose of AI is to make good decisions for humans, then to accept that "AI is doing this" is to accept that we don't really know how the product works or whether it is making good decisions for us.


13 Top Python Libraries You Should Know in 2020

#artificialintelligence

Python is one of the most popular programming languages, and it provides a lot of libraries to help developers with their work. Which of them will be the most popular in 2020? And which are worth your time? Here are our picks for the 13 top Python libraries.


DSC Data Science Search Engine

#artificialintelligence

Embracing Responsible AI from Pilot to Production (May 27): On average, 80% of AI projects fail to make it to production. But it is possible to successfully launch AI, at scale, that is built responsibly and works for everyone. How you scale from pilot to production is critical to ensuring AI success while continuing to be a good corporate citizen through responsible productization.


Global Big Data Conference

#artificialintelligence

Last Tuesday, Google shared a blog post highlighting the perspectives of three women of color employees on fairness and machine learning. I suppose the comms team saw trouble coming: The next day NBC News broke the news that diversity initiatives at Google are being scrapped over concern about conservative backlash, according to eight current and former employees speaking on condition of anonymity. The news led members of the House Tech Accountability Caucus to send a letter to CEO Sundar Pichai on Monday. Citing Google's role as a leader in the U.S. tech community, the group of 10 Democrats questioned why, despite years of corporate commitments, Google's workforce diversity still lags behind that of the U.S. population. The 10-member caucus specifically questioned whether Google employees working with AI receive additional bias training.


Coles shuffles data management into the cloud

ZDNet

Machine learning might be high on the agenda for the data science team at Coles, but according to Richard Glew, Coles head of engineering and operations, they are currently limited by the existing on-premise environment. "Even if we can do something, being able to do something quickly is another matter. We've got a lot of issues [like] where is our data, do we have the right hardware, how long does it take to get it … all the usual stuff with an on-prem environment," he said, speaking as part of the Databricks Data and AI APAC virtual conference. In a move to expand the possibility of enabling machine learning, advanced analytics, and data exchange, the company is currently developing an electronic data processing platform (EDP) to change the way it manages and stores data. "Our EDP platform is designed to be a universal data repository for all the data we want to share internally or externally as an organisation, and we fully catalogue that," Glew said.