If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Artificial intelligence (AI), the replication of human-like intelligence in software and machines, is disrupting the most diverse industry segments. After all, this is an industry that has grown an average of 20% per year for the past five years, according to a survey by BBC Research. Many organizations have already joined the "future" and gained ground by applying AI efficiently in everyday activities. For example, some banks have started to perform financial services without human help, and farms use drones capable of identifying points in a crop that need more irrigation and automatically triggering sprinklers. Even so, AI is not set to replace the recruiter's work: the importance of the interview, the empathy, and the sparkle in the eye that we sometimes feel when interviewing a candidate.
Data science is one of the most appealing industries, with many opportunities. With humans generating roughly 2.5 quintillion bytes of data per day, the data landscape is in constant motion, almost mirroring global connectivity itself. New technologies for taming this data deluge are introduced year after year, and the transformation is likely to continue into the coming decade. The demand for data practitioners in this fast-moving world is very real: according to one report, data-related jobs are anticipated to add around 1.5 lakh (150,000) new openings after 2020.
This material is for you if:

- You use high-level software libraries (e.g., scikit-learn, Keras, TensorFlow) to train or deploy machine learning algorithms, and would now like to understand the fundamentals underlying the abstractions, enabling you to expand your capabilities.
- You're a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems.
- You're a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline.
- You're a data analyst or AI enthusiast who would like to become a data scientist or data/ML engineer, and so you're keen to deeply understand the field you're entering from the ground up (very wise of you!)
At the start of the 2010s, the hype around Big Data really took off. As expectations around advanced analytics and analyzing unstructured data grew, the role of "Data Scientist" appeared on the upward slope of the Gartner hype cycle (see figure below). At the same time, challenges with implementing various important new data platforms referenced in the Gartner graphic (e.g., MapReduce and other distributed systems, and Database Platform as a Service) were starting to become apparent, and these started to appear on the downward slope of the same hype cycle. These platforms didn't magically provide Data Scientists with the data they required, and it became clear that a lot of design and engineering was needed to align these data platforms with what the Data Scientists needed. Also, a huge amount of hype and expectation was developing around NoSQL databases, but this focused mainly on the needs of web-scale applications and agile development rather than the needs of data analysis.
Can we integrate the power of Python calculations with Tableau? That question encouraged me to start exploring the possibility of using Python calculations in Tableau, and I ended up with TabPy. How can we use TabPy to integrate Python and Tableau? In this article, I will introduce TabPy and walk through an example of how we can use it. TabPy is an Analytics Extension from Tableau that enables users to execute Python scripts and saved functions from Tableau.
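To make this concrete, here is a minimal sketch of how a Python function can be deployed to TabPy and then called from a Tableau calculated field. It assumes a TabPy server running locally on its default port (9004); the function name `add_vat` and the VAT rate are purely illustrative.

```python
# Sketch: deploying a Python function to TabPy so Tableau can call it.
# Assumes a TabPy server is running locally on its default port (9004);
# the function name "add_vat" and the VAT rate are illustrative.

def add_vat(prices, rate=0.2):
    # TabPy passes Tableau measures in as lists and expects a list back,
    # with one result per input row.
    return [round(p * (1 + rate), 2) for p in prices]

def deploy_add_vat():
    # Requires the tabpy-client package and a running TabPy server.
    import tabpy_client
    client = tabpy_client.Client("http://localhost:9004/")
    client.deploy("add_vat", add_vat,
                  "Adds VAT to a list of prices", override=True)

# Once deployed, a Tableau calculated field can query the saved function:
#   SCRIPT_REAL("return tabpy.query('add_vat', _arg1)['response']", SUM([Price]))
```

The key idea is that Tableau ships the aggregated measure to TabPy as a list, and the function must return one result per input row so Tableau can map the results back onto the view.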
In the past five years, one trend that has made AI more accessible and acted as the driving force behind several companies is automated machine learning (AutoML). Many companies, such as H2O.ai, DataRobot, Google, and SparkCognition, have created tools that automate the process of training machine learning models. All the user has to do is upload the data and select a few configuration options; the AutoML tool then automatically tests different machine learning models and hyperparameter combinations and comes up with the best ones. Does this mean that we no longer need to hire data scientists? Not at all: AutoML simply makes the jobs of data scientists a little easier by automating a small part of the data science workflow.
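Conceptually, the search these tools perform is simple. The sketch below is an illustration only (using scikit-learn and its built-in iris dataset, not the internals of any particular AutoML product) of what gets automated: trying several model families and hyperparameter grids and keeping the best performer by cross-validated score.

```python
# A minimal sketch of what AutoML tools automate: trying several model
# families and hyperparameter settings, then keeping the best performer
# by cross-validated score. Uses scikit-learn's built-in iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Candidate (estimator, hyperparameter grid) pairs to search over.
candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"n_estimators": [50, 100]}),
]

best_score, best_model = 0.0, None
for estimator, grid in candidates:
    search = GridSearchCV(estimator, grid, cv=5)
    search.fit(X, y)
    if search.best_score_ > best_score:
        best_score, best_model = search.best_score_, search.best_estimator_

print(f"best model: {type(best_model).__name__}, CV accuracy: {best_score:.3f}")
```

Real AutoML products add far more on top of this loop (feature engineering, ensembling, early stopping, smarter search strategies than a grid), but the select-train-compare cycle is the core of what they take off the data scientist's plate.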
With a focus on building scalable capabilities for generating, operationalising, and measuring data-driven insights for its clients, ZS has a strong advanced data science group within its Business Consulting function. The team focuses on integrating transformative AI-enabled solutions and data products across industries such as healthcare, life sciences, telecommunications, high tech, and retail. As the company leverages deep industry expertise and leading-edge analytics to create solutions that work in real life, the data science team plays a vital role in driving these functions. The advanced data science team at ZS works across a range of solutions, from foundational research in deep learning, natural language processing (NLP), optimisation, and operational research to full-scale productisation.
It's been almost one year since the Covid-19 pandemic started, and data scientists worldwide have been analyzing data gathered during the pandemic to inform policies. As we have seen, policymaking has not been straightforward. This time of social isolation has been an opportunity for policymakers to figure out the right approach to making sense of the data and to gain flexibility in community-based policy decisions. On Nov 17th, 2020, XPrize and Cognizant announced their Pandemic Response Challenge.
There has been a lot of talk in recent years about using AI capabilities, or machine-learned solutions, to improve digital products, services, and user experience. If we can cut through the hype and have the necessary building blocks in place, as covered in Four Hurdles To Creating Value From Data, there is legitimate value to be had from using data and AI to make better products in many verticals. For tech companies that are founded on strong engineering practices and find themselves knee-deep in data, such as Uber, Twitter, and Tesla, the use of AI techniques to extract value is likely already part of their culture. Others face a different set of circumstances yet still see the need to adopt data and AI to remain competitive; these companies may not know where to start, or may underestimate the changes involved. GE's former CEO, Jeff Immelt, shared from his experience during the digitization of the industrial world that the transformation involved more than just software and technology people.