If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Facebook has in recent months amped up a push to bring in professors from top universities to work part time on long-term artificial intelligence research, but the company says it views universities as partners rather than as competitors to poach top talent from. In an interview with reporters, Facebook AI Research chief scientist Yann LeCun gave new details about this "dual affiliation" program, which lets professors do research for both Facebook and their universities. The program, most recently expanded to the University of Washington, Carnegie Mellon University in Pittsburgh and Oxford University in the U.K., has come under fire over fears of a "brain drain" at top research institutions. That is far from the case, argues LeCun, who pointed out that Facebook brings in only one or two people from each school so as not to disrupt the work of their departments. "We are careful; we don't hire five people from the same university," LeCun said.
Hephaestus, the Greek god of blacksmiths, metalworking and carpenters, was said to have fashioned artificial beings in the form of golden robots. Myth finally moved toward truth in the 20th century, as AI developed in a series of fits and starts, finally gaining major momentum--and reaching a tipping point--by the turn of the millennium. Here's how the modern history of AI and ML unfolded, starting in the years just following World War II. In 1950, while working at the University of Manchester, legendary code breaker Alan Turing (subject of the 2014 movie The Imitation Game) published a paper titled "Computing Machinery and Intelligence." It became famous for proposing what is now known as the "Turing test."
When you take a minute to stop and look around, the technological advancements of today could be perceived as something out of a futuristic novel. Cars are learning to drive, hands-free devices can turn on your lights or toast your bread, and flying drones are circling the skies. While the full promise of Artificial Intelligence (AI) and Machine Learning (ML) hasn't yet been realized, impressive progress has certainly been made. As a location technology platform, we at Foursquare understand the impact that AI and ML can have on the way people live and move throughout the world. Take, for instance, our own Pilgrim SDK technology, a sophisticated contextual awareness engine.
Lots of 11-year-olds would find this a great idea, especially if the alternative was a homework assignment on French verbs. Welcome to Move Mirror, where you move in front of your webcam. Google is embracing the idea of making machine learning more accessible to coders and makers, in the hope of inspiring them to play around with the technology. Move Mirror's intent is to showcase computer vision techniques such as pose estimation, and to do it in fun ways.
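For readers who want to tinker, here is a minimal sketch of what webcam pose estimation can look like in code. It relies on the open-source MediaPipe and OpenCV libraries as an assumption of convenience; Move Mirror itself runs pose estimation in the browser, so this illustrates the general technique rather than Google's implementation.

```python
# Minimal webcam pose estimation sketch (illustrative; not Move Mirror's stack).
import cv2                # OpenCV: camera capture and display
import mediapipe as mp    # MediaPipe: pretrained pose model

mp_pose = mp.solutions.pose
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # open the default webcam
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures frames in BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Draw the detected skeleton over the live video.
            mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                      mp_pose.POSE_CONNECTIONS)
        cv2.imshow("Pose", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
            break
cap.release()
cv2.destroyAllWindows()
```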
Deep learning (also known as deep structured learning or hierarchical learning) is part of a broader family of machine learning methods based on learning data representations, as opposed to task-specific algorithms. Learning can be supervised, semi-supervised or unsupervised. Deep learning architectures such as deep neural networks, deep belief networks and recurrent neural networks have been applied to fields including computer vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation, bioinformatics, drug design, autonomous locomotion and board game programs, where they have produced results comparable to, and in some cases superior to, human experts. With massive amounts of computational power, machines can now recognize objects and translate speech in real time. Deep learning models are vaguely inspired by information processing and communication patterns in biological nervous systems, yet they differ in many structural and functional respects from biological brains, which makes them hard to reconcile with neuroscientific evidence.
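To ground the definition, here is a minimal sketch of a supervised deep neural network using the Keras API; the layer sizes and the MNIST digit dataset are illustrative assumptions, not anything the text prescribes. The point is that the network learns its own representation of the raw pixels rather than relying on a hand-engineered, task-specific algorithm.

```python
# A small supervised deep network: learned representations, not hand-coded rules.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # raw pixels in
    tf.keras.layers.Dense(128, activation="relu"),    # first learned representation
    tf.keras.layers.Dense(64, activation="relu"),     # deeper representation
    tf.keras.layers.Dense(10, activation="softmax"),  # class probabilities out
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# MNIST digits: the network extracts its own features from pixel data.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
model.fit(x_train / 255.0, y_train, epochs=5)
model.evaluate(x_test / 255.0, y_test)
```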
Will robots eventually put writers out of a job? Are we going to see computers writing the next Great American Novel? Over the last decade or so, artificial intelligence (AI) has become increasingly sophisticated, and it's influencing the world of writing in a number of interesting ways. AI is all about machines learning and adapting. Instead of being programmed in minute detail with everything they need to know to accomplish a particular task, they're given instructions that allow them to learn from experience (much as people do).
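To make that contrast concrete, here is a minimal sketch using scikit-learn: rather than enumerating a rule for every possible phrasing, the program fits a classifier to a handful of labeled examples. The tiny review dataset is invented purely for illustration.

```python
# Learning from examples instead of exhaustive hand-coded rules.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["I loved this story", "A dull, plodding chapter",
         "Brilliant prose throughout", "The pacing felt tedious"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative (invented examples)

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)  # the "experience" the model learns from

# The model generalizes to phrasings it was never explicitly programmed for.
print(clf.predict(["What a brilliant chapter"]))
```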
After no end of false starts, the technology needed for artificial intelligence is finally here. Over the past ten or so years, the amount of progress made in the field has been stunning, and the technology is already finding its way into a number of different industries. It's already clear what benefits AI can bring to a range of applications. Because computers can analyse data and draw conclusions, they can interpret inputs in fractions of a second. As a result, they can tailor the experience they provide to the individual user.
The events of the past few weeks have provided much fodder for this columnist. First, there was the American grandstanding on immigration, and then the events at Cognizant Technology Solutions Corp., Infosys Ltd and Tata Consultancy Services Ltd. More recently, Cloudflare Inc., which hosts information for close to two million websites, including Uber Technologies Inc. and OKCupid, had an Internet security disaster that saw the leak of passwords, cookies and private messages from adult dating sites. This "Cloudbleed" was discovered on 17 February 2017, but had evidently been around for many months. Decisions, decisions--what does one chew on first?
Researchers have shown that it is possible to train artificial neural networks directly on an optical chip. The breakthrough demonstrates that an optical circuit can perform a critical function of an electronics-based artificial neural network, and it could lead to less expensive, faster and more energy-efficient ways to perform complex tasks such as speech or image recognition. "Using an optical chip to perform neural network computations more efficiently than is possible with digital computers could allow more complex problems to be solved," said research team leader Shanhui Fan of Stanford University. "This would enhance the capability of artificial neural networks to perform tasks required for self-driving cars or to formulate an appropriate response to a spoken question, for example. It could also improve our lives in ways we can't imagine now."