If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
In many respects, we are reinventing modern programming tools for the A.I. age. Models and expensive resources like talent, data, and computing power are currently centralized within large tech corporations. TensorFlow, TensorFlow Hub, AutoML, Algorithmia, and cloud computing are all examples of the increasing decentralization of artificial intelligence. Decentralization accelerates development (1,000 brains are better than 100) and makes A.I. safer (more people are involved to check and balance development).
Businesses have entered the most rapid period of technological change in history, and artificial intelligence (AI) is on the cusp of revolutionizing the entire workforce, Ginni Rometty, chairman, president, and CEO of IBM, said in a keynote address at the 2018 Gartner Symposium/IT Expo in Orlando on Tuesday. "The pace is unabated," Rometty said. "You have to change the way you work, because this isn't going to stop." AI has become one of the great, meaningless buzzwords of our time. In this video, the Chief Data Scientist of Dun & Bradstreet explains AI in clear business terms.
New technical artifacts connected to the Internet constantly share, process, and store huge amounts of data. This practice is what unifies the concept of the Internet of Things ("IoT") with the concept of Big Data. With the growing dissemination of Big Data and computing techniques, technological evolution and economic pressure spread rapidly, and algorithms have become a great resource for innovation and business models. This rapid diffusion of algorithms and their increasing influence, however, have consequences for the market and for society, consequences which include questions of ethics and governance. Consider automated systems that turn on the lights and warm dinner when they sense you're returning home from work, smart bracelets and insoles that share with your friends how far you've walked or cycled during the day, or sensors that automatically warn farmers when an animal is sick or pregnant.
Having worked in the cryptography space for over two decades, and having been an active participant in the cryptocurrency evolution since its inception, I take a deep interest in the subject. In particular, I believe that the intersection of artificial intelligence (AI) and blockchain is an exciting but challenging new development. Matt Turck recently discussed why the topic matters and highlighted interesting projects in the space, referring to AI (big data, data science, machine learning) and blockchain (decentralized infrastructure) as the defining technologies of the next decade. Evidently, the time is already ripe for these new concepts, despite their being novel and still underdeveloped. Currently, AI startups are overwhelmingly being acquired by companies such as IBM, Apple, Facebook, Amazon, Google, Intel, and Alibaba, among others.
The development of smart cities and their fast-paced deployment is resulting in the generation of large quantities of data at unprecedented rates. Unfortunately, most of the generated data is wasted without extracting potentially useful information and knowledge, because of the lack of established mechanisms and standards that benefit from the availability of such data. Moreover, the highly dynamic nature of smart cities calls for a new generation of machine learning approaches that are flexible and adaptable enough to cope with the dynamicity of data, perform analytics, and learn from real-time data. In this article, we shed light on the challenge of underutilizing the big data generated by smart cities from a machine learning perspective. In particular, we present the phenomenon of wasting unlabeled data. We argue that semi-supervision is a must for smart cities to address this challenge. We also propose a three-level learning framework for smart cities that matches the hierarchical nature of big data generated by smart cities, with the goal of providing different levels of knowledge abstraction. The proposed framework is scalable to meet the needs of smart city services. Fundamentally, the framework benefits from semi-supervised deep reinforcement learning, where a small amount of data that has users' feedback serves as labeled data, while a larger amount without such feedback serves as unlabeled data. This paper also explores how deep reinforcement learning and its shift toward semi-supervision can handle the cognitive side of smart city services and improve their performance, by providing several use cases spanning the different domains of smart cities. We also highlight several challenges as well as promising future research directions for incorporating machine learning and high-level intelligence into smart city services.
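The core idea, a small labeled pool (data with user feedback) guiding the use of a much larger unlabeled pool, can be illustrated with a minimal pseudo-labeling sketch. This is not the paper's framework; it is a generic semi-supervised loop on synthetic "sensor" data, using a simple nearest-centroid classifier, with all names and values invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small labeled set: stands in for readings that received user feedback.
labeled_X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
labeled_y = np.array([0, 0, 1, 1])

# Large unlabeled set: readings without feedback, drawn near two clusters.
unlabeled_X = np.vstack([
    rng.normal(0.0, 0.3, size=(50, 2)),   # near class 0
    rng.normal(5.0, 0.3, size=(50, 2)),   # near class 1
])

def centroids(X, y):
    """Mean feature vector per class."""
    return np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(X, cents):
    """Assign each point to its nearest class centroid."""
    d = np.linalg.norm(X[:, None, :] - cents[None, :, :], axis=2)
    return d.argmin(axis=1)

# Step 1: fit on the small labeled set only.
cents = centroids(labeled_X, labeled_y)

# Step 2: pseudo-label the unlabeled pool, then refit on everything,
# letting the unlabeled majority refine the decision boundary.
pseudo_y = predict(unlabeled_X, cents)
all_X = np.vstack([labeled_X, unlabeled_X])
all_y = np.concatenate([labeled_y, pseudo_y])
cents = centroids(all_X, all_y)

print(predict(np.array([[0.2, -0.1], [4.8, 5.1]]), cents))  # -> [0 1]
```

A production system would of course replace the nearest-centroid model with a deep network and fold the feedback signal into a reinforcement learning reward, but the labeled/unlabeled split works the same way.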
Abu Sebastian, an author on the paper, explained that executing certain computational tasks in the computer's memory would increase the system's efficiency and save energy. "If you look at human beings, we compute with 20 to 30 watts of power, whereas AI today is based on supercomputers which run on kilowatts or megawatts of power," Sebastian said. "In the brain, synapses are both computing and storing information. In a new architecture, going beyond von Neumann, memory has to play a more active role in computing." The IBM team drew on three different levels of inspiration from the brain.
It's a time-tested science fiction trope, guaranteed to strike fear into the heart of anyone who ever watched The Terminator and wondered, "Is that really possible?" But, in reality, machine learning and cloud technologies are already being used to develop robots that are better, smarter, faster, and more useful than ever before. And it's happening faster than many people think. In fact, cloud technology is proving to be the tipping point from the basic, single-purpose robotics of years past--think assembly line robots, or the machines that vacuum our floors and wash our dishes--to devices that can think, act, and work alongside humans seamlessly. Big Data, artificial intelligence, machine learning, and more are now being used to develop robots' own neural networks, making use of today's large data sets to train machines on behavior.
Over at Argonne, Madeleine O'Keefe writes that the Lab is supporting CERN researchers working to interpret Big Data from the Large Hadron Collider (LHC), the world's largest particle accelerator. The LHC is expected to output 50 petabytes of data this year alone, the equivalent of nearly 15 million high-definition movies--an amount so enormous that analyzing it all poses a serious challenge to researchers. Centered around the ATLAS experiment, these efforts are especially important given what is coming up for the accelerator. In 2026, the LHC will undergo an ambitious upgrade to become the High-Luminosity LHC (HL-LHC). The aim of this upgrade is to increase the LHC's luminosity--the number of events detected per second--by a factor of 10. "This means that the HL-LHC will be producing about 20 times more data per year than what ATLAS will have on disk at the end of 2018," says Taylor Childers, a member of the ATLAS collaboration and computer scientist at the ALCF who is leading the effort at the facility.
Every second, approximately 6,000 tweets are posted on Twitter. That's a significant amount of data -- and it represents only one social media platform out of hundreds. Social media offers an enormous volume of unstructured data that can generate knowledge and help make better decisions on a larger scale. While humans are clearly efficient data generators, computers have a difficult time processing and analyzing the sheer volume of data. Arizona State University Associate Professor Ming Zhao leads the development of GEARS, a big data computing infrastructure designed for today's demanding big data challenges.
Today marks the start of the fall Strata Data Conference in New York City, which has traditionally been the big data community's biggest show of the year. It's been a wild ride for the big data crowd in 2018, one that's brought its share of highs and lows. Now it's worth taking some time to consider where big data has come, and where it's possibly headed in the future. Here are five things to keep in mind as the Strata Data Conference kicks off. We've said this before, but it bears repeating: Hadoop is just one of many technologies angling for relevance in today's increasingly heterogeneous at-scale computing environment.