Feature Stores need an HTAP Database

#artificialintelligence

A Feature Store is a collection of organized and curated features used for training and serving Machine Learning models. Keeping features up to date, serving feature vectors, and creating training data sets requires a combination of transactional (OLTP) and analytical (OLAP) database processing. A database built for this kind of mixed workload is called an HTAP database, for hybrid transactional/analytical processing. The most useful Feature Stores incorporate data pipelines that continuously keep their features up to date through either batch or real-time processing that matches the cadence of the source data. Because these features are always up to date, they provide an ideal source of feature vectors for inference.
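The two paths described above, a write path that keeps features fresh and a read path that serves feature vectors for inference, can be sketched with a minimal in-memory toy. This is an illustration only, not a real HTAP database or any particular feature-store product; all names here are hypothetical:

```python
from typing import Dict, List

class FeatureStore:
    """Toy in-memory feature store: upserts model the OLTP-style write
    path that keeps features fresh, and vector lookups model the
    low-latency read path used at inference time."""

    def __init__(self, feature_names: List[str]):
        self.feature_names = feature_names
        self._rows: Dict[str, Dict[str, float]] = {}

    def upsert(self, entity_id: str, features: Dict[str, float]) -> None:
        # Write path: merge the latest computed features for an entity.
        self._rows.setdefault(entity_id, {}).update(features)

    def get_vector(self, entity_id: str) -> List[float]:
        # Read path: return an ordered feature vector for a model,
        # defaulting missing features to 0.0.
        row = self._rows.get(entity_id, {})
        return [row.get(name, 0.0) for name in self.feature_names]

store = FeatureStore(["clicks_7d", "avg_basket_value"])
store.upsert("user_42", {"clicks_7d": 12.0})
store.upsert("user_42", {"avg_basket_value": 35.5})
print(store.get_vector("user_42"))  # [12.0, 35.5]
```

A real HTAP-backed store would additionally run analytical queries over these same rows (for example, to build training sets), which is exactly the mixed workload the article argues for.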


Literature Should Be Taught Like Science - Issue 97: Wonder

Nautilus

In the past quarter century, enrollment in college English departments has sunk like the Pequod in Moby-Dick. Meanwhile enrollment in science programs has skyrocketed. Elon Musk, not Herman Melville, is the role model of the digital economy. But it doesn't have to be that way, says Angus Fletcher, 44, an English professor at Ohio State University. Fletcher is part of a "group of renegades," he says, who are on a mission to plug literature back into the electric heart of contemporary life and culture. Fletcher has a plan--"apply science and engineering to literature"--and a syllabus, Wonderworks: The 25 Most Powerful Inventions in the History of Literature, his new book. Before the England-born Fletcher got his Ph.D. in literature at Yale, he earned an undergraduate degree in neuroscience, followed by a four-year stint in a neurophysiology lab at the University of Michigan. He switched careers when he realized the biology of the brain wouldn't take him far enough toward understanding our need for stories. "What's special about the human brain is its power of storytelling," Fletcher says.


ML for Business Managers: Build Regression model in R Studio

#artificialintelligence

In this section we will learn what Machine Learning means and the different terms associated with it. You will see examples so that you understand what machine learning actually is. The section also covers the steps involved in building a machine learning model, not just linear models but any machine learning model.


A technique to estimate emotional valence and arousal by analyzing images of human faces

#artificialintelligence

In recent years, countless computer scientists worldwide have been developing deep neural network-based models that can predict people's emotions based on their facial expressions. Most of the models developed so far, however, merely detect primary emotional states such as anger, happiness and sadness, rather than more subtle aspects of human emotion. Past psychology research, on the other hand, has delineated numerous dimensions of emotion, for instance, introducing measures such as valence (i.e., how positive an emotional display is) and arousal (i.e., how calm or excited someone is while expressing an emotion). While estimating valence and arousal simply by looking at people's faces is easy for most humans, it can be challenging for machines. Researchers at Samsung AI and Imperial College London have recently developed a deep-neural-network-based system that can estimate emotional valence and arousal with high levels of accuracy simply by analyzing images of human faces taken in everyday settings.


Advancing More Ethical Artificial Intelligence

#artificialintelligence

A business school takes a multidisciplinary approach to teaching students about the critical role of ethics in the deployment of artificial intelligence. San Francisco has a long history of discovery--from the Gold Rush to the tech revolution. The city also has a history of embracing people-centered social justice. It makes sense, then, that faculty at San Francisco State University (SFSU) would want to combine the two as we explore the implications of one of the next frontiers of discovery: artificial intelligence. I have found that business schools largely discuss AI within other topic areas such as product development or marketing.


AI could have profound effect on way GCHQ works, says director

The Guardian

GCHQ's director has said artificial intelligence software could have a profound impact on the way it operates, from spotting otherwise missed clues to thwart terror plots to better identifying the sources of fake news and computer viruses. Jeremy Fleming's remarks came as the spy agency prepared to publish a rare paper on Thursday defending its use of machine-learning technology to placate critics concerned about its bulk surveillance activities. "AI, like so many technologies, offers great promise for society, prosperity and security. Its impact on GCHQ is equally profound," he said. "While this unprecedented technological evolution comes with great opportunity, it also poses significant ethical challenges for all of society, including GCHQ." AI is considered controversial because it relies on computer algorithms to make decisions based on patterns found in data.


Machine Learning Engineers Are In High Demand. So, What Do They Do?

#artificialintelligence

With every organization digitizing its operations and taking advantage of data science tools, artificial intelligence, and machine learning, the demand for professionals in this domain is consistently high. With machine learning being an important aspect of all automation tools, machine learning engineers are in the highest demand. According to Brandon Purcell, Senior Analyst at Forrester Research, "one hundred percent of any company's future success depends on adopting machine learning. For companies to be successful in the age of the customer, they need to anticipate what customers want, and machine learning is absolutely essential for that." Let's understand why the demand for machine learning engineers is greater than ever.


Global Big Data Conference

#artificialintelligence

Aquarium, a startup from two former Cruise employees, wants to help companies refine their machine learning model data more easily and move their models into production faster. Today the company announced a $2.6 million seed round led by Sequoia with participation from Y Combinator and a number of angel investors, including Cruise co-founders Kyle Vogt and Dan Kan. When the two co-founders, CEO Peter Gao and head of engineering Quinn Johnson, were at Cruise, they learned that finding areas of weakness in the model data was often what prevented a model from getting into production. Aquarium aims to solve this issue. "Aquarium is a machine learning data management system that helps people improve model performance by improving the data that it's trained on, which is usually the most important part of making the model work in production," Gao told me.


The Ultimate Guide to Machine Learning Frameworks - The New Stack

#artificialintelligence

We have seen an explosion in developer tools and platforms related to machine learning and artificial intelligence during the last few years. From cloud-based cognitive APIs to libraries to frameworks to pre-trained models, developers have many choices when infusing AI into their applications. AI engineers and researchers choose a framework to train machine learning models. These frameworks abstract the underlying hardware and software stack to expose a simple API in languages such as Python and R. For example, an ML developer can leverage the parallelism offered by GPUs to accelerate a training job without changing much of the code written for the CPU. These frameworks expose simple APIs that translate to the complex mathematical computations and numerical analysis often needed for training machine learning models. Apart from training, machine learning frameworks also simplify inference -- the process of using a trained model to perform prediction or classification on live data.
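The train-then-infer split these frameworks wrap can be sketched without any framework at all, using a toy one-parameter linear model fit by gradient descent. This is an illustrative sketch under simplified assumptions; real frameworks layer automatic differentiation, GPU kernels, and batching on top of essentially this loop:

```python
# "Training": fit y = w * x to toy data by gradient descent.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true relationship: w = 2

w = 0.0
lr = 0.05
for _ in range(200):
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

# "Inference": apply the trained model to new, unseen data.
def predict(x: float) -> float:
    return w * x

print(round(w, 2))             # ~2.0
print(round(predict(5.0), 1))  # ~10.0
```

A framework like TensorFlow or PyTorch hides the gradient computation behind its API; the developer writes only the model and the loss, and the same code can run on CPU or GPU.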


3 ways to get into reinforcement learning

#artificialintelligence

When I was in graduate school in the 1990s, one of my favorite classes was neural networks. Back then, we didn't have access to TensorFlow, PyTorch, or Keras; we programmed neurons, neural networks, and learning algorithms by hand with the formulas from textbooks. We didn't have access to cloud computing, and we coded sequential experiments that often ran overnight. There weren't platforms like Alteryx, Dataiku, SageMaker, or SAS to enable a machine learning proof of concept or manage the end-to-end MLOps lifecycle. I was most interested in reinforcement learning algorithms, and I recall writing hundreds of reward functions to stabilize an inverted pendulum.
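A reward function of the kind described, for keeping an inverted pendulum upright, might look like the sketch below. The state layout and the penalty weight are hypothetical assumptions for illustration, not the author's actual code:

```python
import math

def pendulum_reward(angle: float, angular_velocity: float) -> float:
    """Reward for balancing an inverted pendulum.

    angle: radians from vertical (0.0 = perfectly upright)
    angular_velocity: radians per second

    Reward is highest when the pendulum is upright and still; it
    penalizes both tilt and fast swinging.
    """
    upright_bonus = math.cos(angle)             # 1.0 upright, -1.0 hanging down
    spin_penalty = 0.1 * angular_velocity ** 2  # discourage fast spinning
    return upright_bonus - spin_penalty

print(pendulum_reward(0.0, 0.0))  # 1.0, the best achievable reward
```

Tuning weights like the `0.1` spin penalty is exactly the kind of iteration that can require writing many variants of a reward function before the learned policy stabilizes the pendulum.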