If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Big data, the Internet of Things, and artificial intelligence hold such disruptive power that they have inverted the dynamics of technology leadership. When science and technology meet social and economic systems, you tend to see something akin to what the late Stephen Jay Gould called "punctuated equilibrium" in his description of evolutionary biology. Something that has been stable for a long period is suddenly disrupted radically--and then settles into a new equilibrium.[1] Gould pointed out that fossil records show that species change does not advance gradually but often massively and disruptively. After the mass extinctions that have occurred several times across evolutionary eras, a minority of species survived, and the voids in the ecosystem rapidly filled with massive speciation.

[1] See Stephen Jay Gould, Punctuated Equilibrium (Cambridge, MA: Harvard University Press, 2007).
You're sitting at home minding your own business when you get a call from your credit card's fraud detection unit asking if you've just made a purchase at a department store in your city. It wasn't you who bought expensive electronics using your credit card – in fact, it's been in your pocket all afternoon. So how did the bank know to flag this single purchase as most likely fraudulent? Credit card companies have a vested interest in identifying fraudulent transactions. According to the Federal Reserve Payments Study, Americans used credit cards to pay for 26.2 billion purchases in 2012.
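At its core, the fraud-flagging question above is an anomaly detection problem. As a minimal illustration -- not the bank's actual model, which uses far richer features such as merchant, location, and timing -- a transaction can be scored against a cardholder's historical spending:

```python
from statistics import mean, stdev

def is_anomalous(amount, history, threshold=3.0):
    """Flag a transaction whose amount lies more than `threshold`
    standard deviations above the cardholder's historical mean."""
    if len(history) < 2:
        return False  # not enough data to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return amount != mu
    z = (amount - mu) / sigma
    return z > threshold

# A cardholder who usually spends $20-60 per purchase...
history = [25.0, 40.0, 18.0, 55.0, 32.0, 47.0, 29.0]
print(is_anomalous(1899.99, history))  # an expensive electronics purchase
print(is_anomalous(38.50, history))    # an ordinary purchase
```

Against this history, the $1,899.99 purchase scores well beyond the threshold and is flagged, while the $38.50 one is not.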
There's a big connection between my love for water sports and hardware design -- both involve observing waves and planning several moves ahead. Four years ago, when we started sketching the POWER9 chip from scratch, we saw an upsurge of modern workloads driven by artificial intelligence and massive data sets. We are now ready to ride this new tide of computing with POWER9. It is a transformational architecture and an evolutionary shift from the archaic ways of computing promoted by x86. POWER9 is loaded with industry-leading new technologies designed for AI to thrive.
With the boom in digital technologies, the world is producing over 2.5 exabytes of data every day. To put that into perspective, it is equivalent to the memory of 5 million laptops or 150 million phones. The deluge of data is forecast to grow with each passing day, and with it the need for powerful hardware that can support it. Hardware advancement here means faster computing or processing speed and larger storage systems. Companies worldwide are investing in powerful computing, with R&D teams constantly racing to build improved processors.
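The laptop and phone equivalences above can be sanity-checked with simple arithmetic: dividing 2.5 exabytes across 5 million laptops or 150 million phones implies roughly 500 GB per laptop and about 17 GB per phone, which are plausible device capacities. A quick sketch:

```python
DAILY_DATA_EB = 2.5
BYTES_PER_EB = 10**18

daily_bytes = DAILY_DATA_EB * BYTES_PER_EB
per_laptop_gb = daily_bytes / 5_000_000 / 10**9   # 5 million laptops
per_phone_gb = daily_bytes / 150_000_000 / 10**9  # 150 million phones

print(f"{per_laptop_gb:.0f} GB per laptop")  # 500 GB
print(f"{per_phone_gb:.1f} GB per phone")    # 16.7 GB
```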
It was just a few weeks ago that Sophia, an artificial intelligence (AI)-powered humanoid robot developed by Hong Kong-based Hanson Robotics, was given honorary citizenship by Saudi Arabia. While AI is making huge inroads into our day-to-day life, was this something expected to happen so soon? Meanwhile, even though the popular perception is that AI is only going to be a job killer, reports say that from 2020 onwards, AI will start adding more jobs than it takes away. According to a report by research firm Gartner, Inc., AI will create 2.3 million new jobs while eliminating only 1.8 million jobs in 2020. At the recently held Wall Street Journal CEO Council meeting in Washington DC, I was fortunate enough to listen to two eminent futurists and authors--Martin Ford and Jerry Kaplan--who are known for their pioneering work in the field of AI.
The Data Science Trends for 2018 are largely a continuation of some of the biggest trends of 2017, including Big Data, Artificial Intelligence (AI), and Machine Learning (ML), along with newer technologies like Blockchain, Edge Computing, Serverless Computing, Digital Twins, and others that employ various practices and techniques within the Data Science industry. The Dataconomy article titled "The Future of Big Data Is Open Source" aptly captures the industry buzz that dominated late 2016 and all of 2017. Back then, Big Data and Data Science were the biggest industry buzzwords, but a lot has changed since. In 2016 and 2017, Big Data was a market differentiator for businesses, and it continues to be one. Now, as we enter 2018, numerous new technologies will coordinate within the greater foundations of Data Science and Big Data and expand those industries into new spaces that are only beginning to be understood and appreciated.
The 22nd International Cloud Expo, colocated with the 1st DXWorldEXPO, has announced that its Call for Papers is open. The event, to be held June 5-7, 2018, at the Javits Center in New York City, and November 6-8, 2018, at the Santa Clara Convention Center in Santa Clara, CA, brings together Cloud Computing, Digital Transformation, Big Data, Internet of Things, DevOps, Machine Learning, and WebRTC in one location. The organizers have also announced the conference tracks for Cloud Expo 2018; Digital Transformation (DX) is a major focus with the introduction of DXWorldEXPO within the program.
When subatomic particles smash together at the Large Hadron Collider in Switzerland, they create showers of new particles whose signatures are recorded by four detectors. The LHC captures 5 trillion bits of data -- more information than all of the world's libraries combined -- every second. After the judicious application of filtering algorithms, more than 99 percent of those data are discarded, but the four experiments still produce a whopping 25 petabytes (25 × 10^15 bytes) of data per year that must be stored and analyzed. That is a scale far beyond the computing resources of any single facility, so the LHC scientists rely on a vast computing grid of 160 data centers around the world, a distributed network that is capable of transferring as much as 10 gigabytes per second at peak performance. The LHC's approach to its big data problem reflects just how dramatically the nature of computing has changed over the last decade.
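The scale of that filtering can be checked with back-of-the-envelope arithmetic using the figures above (this ignores the LHC's actual duty cycle, so treat it as a rough sketch): 5 trillion bits per second is 625 GB/s, or roughly 20 exabytes over a year of continuous running, so keeping only 25 PB means storing on the order of 0.1 percent of the raw stream -- consistent with "more than 99 percent discarded":

```python
raw_bits_per_s = 5e12                  # 5 trillion bits per second
raw_bytes_per_s = raw_bits_per_s / 8   # 625 GB/s

seconds_per_year = 365 * 24 * 3600
raw_bytes_per_year = raw_bytes_per_s * seconds_per_year

stored_bytes_per_year = 25e15          # 25 petabytes kept after filtering

fraction_kept = stored_bytes_per_year / raw_bytes_per_year
print(f"raw stream: {raw_bytes_per_year / 1e18:.1f} EB/year")
print(f"kept after filtering: {fraction_kept:.3%} of the raw stream")
```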
The digital transformation underway at Under Armour is erasing any stale stereotypes that athletes and techies don't mix. While hardcore runners sporting the company's latest microthread singlet can't see Hadoop, Apache Hive, Apache Spark, or Presto, these technologies are teaming up to track some serious mileage. Under Armour is working on a "connected fitness" vision that connects body, apparel, activity level, and health. By combining the data from all these sources into an app, consumers will gain a better understanding of their health and fitness, and Under Armour will be able to identify and respond to customer needs more quickly with personalized services and products. The company stores and analyzes data about food and nutrition, recipes, workout activities, music, sleep patterns, purchase histories, and more.
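The "connected fitness" idea of combining sources boils down to joining per-user records from separate systems into one profile. As a toy sketch in plain Python (the field names and sources here are hypothetical, not Under Armour's actual schema, and a real pipeline would do this join in Spark or Presto over far larger data):

```python
# Hypothetical per-user records from separate data sources
workouts = {"user42": {"miles_run": 5.2, "avg_heart_rate": 148}}
nutrition = {"user42": {"calories_in": 2150}}
sleep = {"user42": {"hours_slept": 7.5}}

def build_profile(user_id, *sources):
    """Merge one user's records from several data sources into a
    single profile dict, skipping sources with no data for the user."""
    profile = {"user_id": user_id}
    for source in sources:
        profile.update(source.get(user_id, {}))
    return profile

print(build_profile("user42", workouts, nutrition, sleep))
```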