
The Pivotal Management Challenge of the AI Era


History indicates that major technological changes can take about half a century to go from first laboratory designs to broad adoption. Alan Turing proposed the Turing machine, laying the foundations of computation, in 1936; the first general-purpose "Turing-complete" system was built in 1945; and "The Computer" was only named "Machine of the Year" by Time in 1982, about half a century later. The foundations of the internet were laid in the 1960s, but consumers did not broadly use and benefit from it until the mid-to-late 1990s. For most people, artificial intelligence was strictly a sci-fi concept until recent years. Yet, if you go by the above timeline, the AI revolution may actually be running at least two decades late.

Are Computers That Win at Chess Smarter Than Geniuses?


But then there was the Chinese game of Go, estimated to be 4,000 years old, which offers more "degrees of freedom" (possible moves, strategy, and rules) than chess: about 2 × 10^170 legal board positions. As futurist George Gilder tells us in Gaming AI, it was a rite of passage for aspiring intellects in Asia: "Go began as a rigorous rite of passage for Chinese gentlemen and diplomats, testing their intellectual skills and strategic prowess. Later, crossing the Sea of Japan, Go enthralled the Shogunate, which brought it into the Japanese Imperial Court and made it a national cult." Then AlphaGo, from Google's DeepMind, appeared on the scene in 2016. As the Chinese American titan Kai-Fu Lee explains in his bestseller AI Superpowers, the riveting encounter between man and machine across the Go board had a powerful effect on Asian youth. Though mostly unnoticed in the United States, AlphaGo's 2016 defeat of Lee Sedol was avidly watched by 280 million Chinese viewers, and Sedol's loss was a shattering experience. The Chinese saw DeepMind as an alien system defeating an Asian man at the epitome of an Asian game.

'Christmas slots went in five hours': how online supermarket Ocado became a lockdown winner

The Guardian

Ocado's warehouse in Erith, 15 miles east of London on the Thames estuary, is staffed by 1,050 "personal shoppers". Outnumbering them are 1,800 robots the size of small washing machines. You see them by climbing to the top level of the vast warehouse – at 564,000 sq ft, it is more than three times the size of St Peter's in Rome – where a sign tells you that photography is strictly prohibited. The online supermarket is paranoid that rivals will glimpse the technology it believes to be revolutionary. From the viewing platform you can watch these metal cubes endlessly whiz around, moving thousands of plastic crates as if they were playing an enormous game of chess. You occasionally spot bottles of bleach or rosé, packets of noodles and dog biscuits, before they are sent down to a lower level. "I find it quite mesmerising, like robotic ballet," says Mel Smith, CEO of Ocado Retail, the UK arm of the business. "The day I decided I wanted this job was when I went to [the warehouse] and thought, this is absolutely the future."

What is Artificial Intelligence?


The term artificial intelligence (AI) refers to computing systems that perform tasks normally considered within the realm of human decision-making. These software-driven systems and intelligent agents incorporate advanced data analytics and Big Data applications. AI systems leverage this knowledge repository to make decisions and take actions that approximate cognitive functions, including learning and problem-solving. AI, introduced as an area of science in the mid-1950s, has evolved rapidly in recent years and has become a valuable, even essential, tool for orchestrating digital technologies and managing business operations.

Artificial Intelligence Usage on the Rise


Machine learning and artificial intelligence (AI) have captured our imaginations for decades but, until recently, had limited practical application. Steven Kemler, an entrepreneurial business leader and Managing Director of the Stone Arch Group, says that with recent increases in available data and computing power, AI already affects our lives on many levels, and that going forward, self-teaching algorithms will play an increasingly important role in both society and business. In 1997, IBM's Deep Blue became the first computer system to beat a reigning world chess champion (Garry Kasparov), significantly elevating interest in the practical applications of AI. These practical uses still took years to develop; the worldwide market for AI technology did not reach $10 billion until 2016. Since then, AI market growth has accelerated significantly, reaching $50 billion in 2020 and expected to exceed $100 billion by 2024, according to the Wall Street Journal.

Pitting Computers Against Each Other . . . in Chess

Communications of the ACM

For those of us involved in programming computers to play chess, it has been a great adventure. The ACM's annual tournaments began in 1970 (50 years ago!) and were hosted by the organization year after year for a quarter-century. They were terrific catalysts for progress in the field and deserve major credit for the eventual 1997 defeat of then-World Champion Garry Kasparov. I feel human intelligence has been vastly overrated: we humans haven't even learned how not to fight wars over various explanations of how the universe or man came into being.

AlphaZero, a novel Reinforcement Learning Algorithm, deployed in JavaScript


In this blog post, you will learn about and implement AlphaZero, an exciting and novel reinforcement learning algorithm used to beat world champions in games like Go and chess. You will use it to master a pencil-and-paper game (Dots and Boxes) and deploy it as a web app, entirely in JavaScript. AlphaZero's key and most exciting aspect is its ability to achieve superhuman play in board games without relying on external knowledge: it learns to master a game by playing against itself (self-play) and learning from those experiences. We will leverage a "simplified, highly flexible, commented, and easy to understand" Python implementation of AlphaZero by Surag Nair, available on GitHub. You can go ahead and play the game here. The web app and JavaScript implementation are available here; this code was ported from that Python implementation.
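The self-play idea at the heart of AlphaZero can be sketched in a few lines. Below is a minimal, illustrative Python sketch (the names `CoinGame` and `play_episode` are invented for illustration and are not from Surag Nair's repository): one self-play episode produces (state, move, outcome) training examples, with each example relabeled by whether the player who made that move went on to win. In the real algorithm, the random policy is replaced by Monte Carlo tree search guided by a neural network, and the examples are used to train that network.

```python
import random

class CoinGame:
    """Toy game: players alternately take 1 or 2 coins; taking the last coin wins."""
    def __init__(self, coins=7):
        self.coins = coins

    def legal_moves(self, state):
        return [m for m in (1, 2) if m <= state]

    def next_state(self, state, move):
        return state - move

def play_episode(game, policy):
    """Play one self-play episode and return (state, move, value) examples,
    where value is +1 if the player who moved went on to win, else -1."""
    examples, state, player = [], game.coins, 1
    while state > 0:
        move = policy(game, state)
        examples.append((state, move, player))
        state = game.next_state(state, move)
        if state == 0:
            winner = player  # the player who took the last coin wins
        player = -player
    # Relabel each example from the mover's point of view.
    return [(s, m, 1 if p == winner else -1) for s, m, p in examples]

random.seed(0)
data = play_episode(CoinGame(), lambda g, s: random.choice(g.legal_moves(s)))
```

Repeating this loop many times, and training a policy/value network on the accumulated examples, is the essence of AlphaZero-style learning without external knowledge.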

Machine Learning vs. AI: What's the Difference?


Every time Netflix recommends a new binge-worthy show, Amazon suggests a related product, or Google helps you find the name of that one actor on the tip of your tongue, you're experiencing machine learning at work. All of these real-world applications use a subset of artificial intelligence technology to find patterns, solve problems, and accomplish tasks. But although machine learning, deep learning, and artificial intelligence (AI) are related, the differences between them can be confusing. In this post, we'll break down the differences between these exciting technologies in plain language and explore how they're relevant to your business. Let's start with some definitions: artificial intelligence is the study of how to build programs that can solve problems in a similar way to humans; it's about replicating human problem-solving and intelligence in machines. When working to develop AI, scientists quickly realized that teaching an AI every single thing it needed to know to perform its intended function was a non-starter.
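To make that contrast concrete, here is a toy Python sketch (entirely illustrative, not from the article): a hand-coded rule must be spelled out explicitly by a programmer, while a "learned" rule is inferred from labeled examples, which is the essence of the machine learning approach.

```python
from collections import Counter

def handcoded_spam(subject):
    # Rule-based approach: every rule must be written out by hand.
    return "winner" in subject.lower()

def learn_keyword(examples):
    """Toy 'learning': pick the word most associated with the positive label."""
    pos, neg = Counter(), Counter()
    for subject, label in examples:
        (pos if label else neg).update(subject.lower().split())
    return max(pos, key=lambda w: pos[w] - neg[w])

# Labeled examples stand in for training data.
examples = [("you are a winner", True), ("meeting at noon", False),
            ("winner claim prize", True), ("lunch plans", False)]
keyword = learn_keyword(examples)  # the rule is inferred, not hand-written
```

Real machine learning systems generalize this idea with statistical models over far richer features, but the division of labor is the same: the programmer supplies data, and the algorithm extracts the pattern.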

Can AI Tell Us When To Use AI And When Not To?


Or as Mount Sinai Hospital's Robert Freeman said, "these projects are about 5 percent technology, and 95 percent change management". Developments such as that highlighted by the MIT team are a fascinating indication of the progress being made, but it's clear that there is a long way to go before such technologies are a mainstream part of healthcare as we know it.

3 Reasons you should learn Artificial Intelligence in 2020


Some would say that being an AI expert is the job of the century. I think that point of view puts a lot of pressure on the candidates who plan to take a shot at the artificial intelligence industry. It might create the impression that only the most brilliant students who have had straight A's since high school are good enough for an AI role. Learning and practicing AI is not a cakewalk. You need to put in the hours and struggle for solutions.