As someone educated in science and engineering, I've always considered the pursuit of new technologies a higher calling. As someone raised Roman Catholic, I also tend to pay attention when another high call comes in -- like from the Vatican. Last year, the Vatican reached out to our company, IBM. Pope Francis was worried about technology's effects on society and families around the world and its potential to widen the gap between the rich and poor. How could the world harness AI for the greater good while reducing its potential to be a force for evil?
This blog takes a deep dive into the timeline of AI, beginning at the very start: the 1940s. The term "artificial intelligence" was coined by John McCarthy, widely regarded as the father of AI, in 1956. But the groundwork for the AI revolution was laid earlier, in the 1940s. Today, roughly 37% of organizations have implemented AI in some form, a 270% increase over the past four years. AI has taken multiple forms over the years.
It was the renowned British computer scientist Alan Turing – the man who helped crack the Nazi Enigma code – who first proposed a test to see whether a machine could show the kind of intelligent behaviour seen in humans. In the decades since the Turing test was proposed, computers have become so capable that we often don't realise when we're talking to them. That helpful customer service rep who walked you through your problem over a messaging service? Probably a chatbot, programmed to react to key words in your requests. We have gone from computers that can bamboozle a chess grandmaster to intelligent systems that will drive our cars.
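A keyword-triggered chatbot of the kind described above can be sketched in a few lines. The rules and canned replies here are invented purely for illustration; real customer-service bots use far larger rule sets or learned models:

```python
# Minimal sketch of a rule-based chatbot that reacts to key words in a
# message. RULES maps a trigger keyword to a canned reply; any message
# that matches no keyword falls through to a default hand-off.

RULES = {
    "refund": "I can help with refunds. Could you share your order number?",
    "password": "To reset your password, use the 'Forgot password' link.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

DEFAULT = "Let me connect you with a human agent."

def reply(message: str) -> str:
    """Return the first canned reply whose keyword appears in the message."""
    text = message.lower()
    for keyword, response in RULES.items():
        if keyword in text:
            return response
    return DEFAULT

print(reply("I never received my REFUND"))   # matches the "refund" rule
print(reply("What is the meaning of life?")) # no keyword -> default reply
```

The point of the sketch is how little "intelligence" is required to pass for a helpful rep in a narrow setting: simple substring matching on lowercased input is often enough to handle the most common requests.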
Advanced computers have defeated chess masters and learned how to pick through mountains of data to recognize faces and voices. Now, a billionaire developer of software and artificial intelligence is teaming up with top universities and companies to see if A.I. can help curb current and future pandemics. Thomas M. Siebel, founder and chief executive of C3.ai, an artificial intelligence company in Redwood City, Calif., said the public-private consortium would spend $367 million in its initial five years, aiming its first awards at finding ways to slow the new coronavirus that is sweeping the globe. "I cannot imagine a more important use of A.I.," Mr. Siebel said in an interview. The new research consortium, the Digital Transformation Institute, includes commitments from Princeton, Carnegie Mellon, the Massachusetts Institute of Technology, the University of California, the University of Illinois and the University of Chicago, as well as C3.ai and Microsoft.
From intelligent personal assistants to home robots, technology once thought of as a sci-fi dream is now embedded in everyday life. But this leap from dream to reality didn't happen overnight. There is no single 'eureka' moment in a field as vast as AI. Rather, the technology we enjoy today is the result of countless milestones, delivered by often-forgotten people across a huge range of projects. So, let's pay homage to some of that work.
AI for sales is nothing new. But what are the real benefits of AI? Is it mostly buzz, or does it deliver real value? Sales tools using machine learning and deep learning are already widespread in the market today. And while AI's impact may still be limited to certain types of activities, its power and scope will continue to grow.
From 1960 to 2020, the field of AI has seen both tremendous sprints of progress and strenuous "AI winters". Headlines have accompanied each breakthrough, proclaiming that computers are becoming intelligent and will soon surpass humans, followed by pessimistic takes on how limited the current technology is. The field is now at its highest point ever, yet no sign of a general form of intelligence has been achieved, nor is there a conclusive definition of what intelligence is or what consciousness should look like. For more than fifty years, the field has been chasing one of the most elusive targets science has ever seen. For a long time, the public view was that anything that could play chess well must be intelligent.
Over the past few years, artificial intelligence has matured into a collection of powerful technologies that are delivering competitive advantage to businesses across industries. Global AI adoption and investment are soaring. By one account, 37 percent of organizations have deployed AI solutions -- up 270 percent from four years ago.[1] Analysts forecast global AI spending will more than double over the next three years, topping US$79 billion by 2022.[2] Deloitte's State of AI in the Enterprise, 2nd Edition offers a global perspective on AI early adopters, based on a survey of 1,900 IT and business executives from seven countries and a variety of industries.[3] These adopters are increasing their spending on AI technologies and realizing positive returns. Almost two-thirds (65 percent) report that AI technologies are enabling their organizations to move ahead of the competition. Sixty-three percent of the leaders surveyed already view AI as "very" or "critically" important to their business success, and that number is expected to grow to 81 percent within two years.
Some tasks that AI performs are actually not impressive. Think about your camera recognizing and auto-focusing on faces in pictures. That technology has been around since 2001, and it doesn't tend to excite people. Why not? Because you can do that too: you can focus your eyes on someone's face effortlessly. In fact, it's so easy you don't even know how you do it.
Human intelligence is the quality of the brain that learns, extracts knowledge, and acquires abstract concepts from its surroundings, whereas artificial intelligence is the ability of a machine to mimic those tasks by learning from the data it receives. Intelligence is a quality that belongs to humans, and if machines could play the game right, our lives would become much easier. Timo Elliott, Innovation Evangelist at SAP, said: "The rise of artificial intelligence is raising the premium on tasks that only humans can do: it is freeing workers from drudgery and allowing them to spend time on more strategic and valuable business activities. Instead of forcing people to spend time and effort on tasks that we find hard but computers find easy, we will be rewarded for doing what humans do best -- and artificial intelligence will help make us all more human." However, despite significant advancements, AI still cannot match human intelligence in most respects.