Deep learning is a powerful statistical technique for classifying patterns using large training data sets and multilayer neural networks. It's essentially a method for machines to learn from all kinds of data, structured or unstructured, that is loosely modeled on the way a biological brain learns new capabilities. Machine learning can be applied to just about any domain of knowledge, given our ability to gather valuable data in almost any area of interest. But machine learning methods are narrower and more specialized than human learners, and there are many tasks for which they are not effective given the current state of the art.
As the field of "big data" has emerged as a tool for solving all sorts of scientific and societal questions, one of the main challenges that remains is whether, and how, multiple sets of data from various sources could be combined to determine cause-and-effect relationships in new and untested situations. Now, computer scientists from UCLA and Purdue University have devised a theoretical solution to that problem. Their research, which was published this month in the Proceedings of the National Academy of Sciences, could help improve scientists' ability to understand health care, economics, the environment and other areas of study, and to glean much more pertinent insight from data. The study's authors are Judea Pearl, a distinguished professor of computer science at the UCLA Henry Samueli School of Engineering and Applied Science, and Elias Bareinboim, an assistant professor of computer science at Purdue University who earned his doctorate at UCLA. Big data involves using mountains and mountains of information to uncover trends and patterns.
Gartner, Inc. today highlighted the top strategic technology trends that organizations need to explore in 2019. Analysts presented their findings during Gartner Symposium/ITxpo, which is taking place here through Thursday. Gartner defines a strategic technology trend as one with substantial disruptive potential that is beginning to break out of an emerging state into broader impact and use, or one that is a rapidly growing trend with a high degree of volatility, reaching tipping points over the next five years. "The Intelligent Digital Mesh has been a consistent theme for the past two years and continues as a major driver through 2019. Trends under each of these three themes are a key ingredient in driving a continuous innovation process as part of a ContinuousNEXT strategy," said David Cearley, vice president and Gartner Fellow.
What's the first thing that comes to mind when you hear the following phrases? These phrases probably evoke thoughts such as "fake," "not real," or even "shabby." Artificial is such a harsh adjective. The word "artificial" is defined as "imitation; simulated; sham" with synonyms such as fake, false, mock, counterfeit, bogus, phony and factitious. The word "artificial" may not be the right term to use to describe "Artificial Intelligence," because "artificial intelligence" is anything but fake, false, phony, or a sham.
Deep learning techniques do a good job of building models by correlating data points. But many AI researchers believe more work needs to be done to understand causation, not just correlation. The field of causal deep learning -- useful in determining why something happened -- is still in its infancy, and it is much more difficult to automate than neural networks. Much of AI is about finding hidden patterns in large amounts of data. Soumendra Mohanty, executive vice president and chief data analytics officer at L&T Infotech, a global IT services company, said, "Obviously, this aspect drives us to the 'what,' but rarely do we go down the path of understanding the 'why.'"
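The gap between the "what" (correlation) and the "why" (causation) can be made concrete with a toy simulation. In the hypothetical setup below (all variable names are illustrative, not from the study discussed above), a hidden confounder Z drives both X and Y. X and Y come out strongly correlated even though neither causes the other, and holding Z roughly fixed makes the association largely disappear:

```python
import random
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)

# Confounder Z causes both X and Y; there is no X -> Y arrow at all.
z = [random.gauss(0, 1) for _ in range(10_000)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]

# Marginally, X and Y look strongly related.
print("corr(X, Y):", round(pearson(x, y), 2))

# Condition on Z by keeping only samples where Z is nearly constant:
# within that slice, the X-Y association largely vanishes.
sliced = [(xi, yi) for xi, yi, zi in zip(x, y, z) if abs(zi) < 0.1]
xs, ys = zip(*sliced)
print("corr(X, Y | Z ~ 0):", round(pearson(list(xs), list(ys)), 2))
```

A pattern-finding model trained on (X, Y) alone would happily predict Y from X; only a causal account that includes Z reveals why the two move together.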