If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Looking through this lens, ML seems a lot like statistical modelling. In statistical modelling, we collect data and verify that it is clean -- that is, we complete, correct, or delete any incomplete, incorrect, or irrelevant parts -- and then use this clean dataset to test hypotheses and make predictions and forecasts. The idea behind statistical modelling is to represent complex issues in relatively generalizable terms, which is to say, terms that explain most of the events studied. Effectively, we programme the algorithm to perform certain functions based on the data we submit.
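The workflow described above -- clean the data, then fit a model on the clean dataset and use it to predict -- can be sketched in a few lines of Python. The dataset and the simple least-squares model here are illustrative assumptions, not taken from the text:

```python
# Sketch of the statistical-modelling workflow: clean, fit, predict.
# The records and the linear model are invented for illustration.

def clean(records):
    """Keep only complete records: drop any pair containing None."""
    return [(x, y) for x, y in records if x is not None and y is not None]

def fit_line(data):
    """Ordinary least-squares fit of y = a*x + b over the cleaned data."""
    n = len(data)
    mean_x = sum(x for x, _ in data) / n
    mean_y = sum(y for _, y in data) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in data)
    var = sum((x - mean_x) ** 2 for x, _ in data)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

raw = [(1, 2.1), (2, 3.9), (None, 5.0), (3, 6.1), (4, None), (4, 8.2)]
data = clean(raw)               # incomplete rows removed
a, b = fit_line(data)           # model fitted on the clean dataset
predict = lambda x: a * x + b   # the fitted model makes forecasts
```

The point is the order of operations: the hypothesis test or forecast only happens after the cleaning step, exactly as the paragraph describes.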
At its annual developers conference on Wednesday, the tech giant announced the second generation of its custom chip, the Tensor Processing Unit, optimized to run its deep learning algorithms. By comparison, Nvidia's latest-generation GPU for the data center, the Tesla V100, delivers 120 teraflops of performance, Nvidia said. Through the Google Cloud, anybody can rent Cloud TPUs -- much as people can already rent GPUs on the Google Cloud. "Google's use of TPUs for training is probably fine for a few workloads for the here and now, but given the rapid change in machine learning frameworks, sophistication, and depth, I believe Google is still doing much of their machine learning production and research training on GPUs," said tech analyst Patrick Moorhead.
In a paper published in April of this year, researchers at Princeton found that when an AI algorithm is trained on ordinary human language found online, it can acquire the cultural biases embedded in particular patterns of wording and language. For example, using an analysis of word proximity within 10-word strings to assess the strength of the association between two words, the researchers found that a set of traditionally African American names had more unpleasant associations than a set of traditionally European American names. While many people assume that artificial intelligence algorithms are objective tools making objective calculations, these tools are in fact created from and trained on large sets of data (images, text, video, etc.), and they inherit whatever biases those data carry. The danger of overusing artificial intelligence in marketing is that our dominant, biased discourses will remain dominant and biased, especially if we assume an AI tool is taking an objective tack.
The impressive artificially intelligent software was developed to advance machine learning capabilities, including natural language processing, reasoning, and knowledge retrieval. Unlike human security researchers, the software can draw on knowledge of 75,000 known software vulnerabilities, plus the stream of security papers that is published continually. The user interface suggests relevant products using natural language processing and data analysis, helping the customer make the right purchase. Watson alone can identify and suggest treatments for life-threatening illnesses, defend against cybercrime, help you cook your dinner, persuade consumers to buy things, and enable autonomous vehicles.
This company has developed a new anti-cancer drug called BPM 31510 (against pancreatic, breast, liver, or brain cancer), which was discovered by an algorithm. In addition to the start-ups, all the major technology companies have begun to apply Big Data and artificial intelligence to health, using data from millions of people to find treatments. Big Data and artificial intelligence, combined with genetic analysis, allow researchers to search for and find patterns among patients with rare diseases who may be separated by distance but carry the same mutation.
Dubbed TPU 2.0 or the Cloud TPU, the new chip is a sequel to the custom-built processor that has helped drive Google's own AI services, including its image recognition and machine translation tools, for more than two years. Amazon and Microsoft offer GPU processing via their own cloud services, but they don't offer bespoke AI chips for both training and executing neural networks. Companies and developers undertake this training with help from GPUs, sometimes thousands of them, running inside the massive data centers that underpin the world's internet services. Training on traditional CPUs -- the generalist chips inside the servers that drive online software -- simply takes too much time and electrical power.
Custom Decision Service provides a "contextual decision-making API that sharpens with experience" -- essentially an abstraction over Cognitive Services' reinforcement-learning capabilities that helps adapt the content in an application (think personalized interfaces, A/B testing, content recommendations) in real time. The media services offer Audio Transcription, Video Indexer, Face tracking and identification, Speaker indexing, Visual text recognition, Voice activity detection, Scene detection, Keyframe extraction, Sentiment analysis, Translation, Visual content moderation, Keywords extraction, and Annotation. Other language services include the Bing Spell Check API, which detects and corrects spelling mistakes; the Web Language Model API, which helps build knowledge graphs using predictive language models; the Text Analytics API, which performs topic modelling and sentiment analysis; and the Translator Text API, which performs automatic text translation. On the knowledge side, there are the Recommendations API to help predict and recommend items, the Knowledge Exploration Service to enable interactive search experiences over structured data via natural language inputs, the Entity Linking Intelligence Service for named-entity recognition and disambiguation, the Academic Knowledge API (search over academic content in the Microsoft Academic Graph), the QnA Maker API, and the newly minted Custom Decision Service, which provides a contextual decision-making API with reinforcement-learning features.
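These services are all reached over REST with a subscription key. As a hedged sketch, here is roughly what assembling a sentiment request for the Text Analytics API might look like; the endpoint URL, region, and API version are assumptions that change between releases, so consult the current Cognitive Services documentation for the exact shape:

```python
# Sketch of building a Text Analytics sentiment request. The URL and
# API version ("v2.0") are assumptions; only the request is built here,
# nothing is sent over the network.
import json

def build_sentiment_request(documents, key, region="westus"):
    """Assemble (url, headers, body) for a sentiment-analysis call."""
    url = f"https://{region}.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment"
    headers = {
        "Ocp-Apim-Subscription-Key": key,   # Cognitive Services auth header
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(documents, 1)
        ]
    })
    return url, headers, body

url, headers, body = build_sentiment_request(["I loved the demo."], key="<YOUR-KEY>")
# `body` would then be POSTed to `url`, e.g. with urllib.request.
```

The same pattern -- a regional endpoint, a key in the `Ocp-Apim-Subscription-Key` header, and a JSON document batch -- recurs across the language and knowledge APIs listed above.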
Twitter engineers this week unveiled the social media platform's deep-neural-network-driven ranking algorithm. Among the results, wrote Nicolas Koumchatzky, a software engineer on Cortex, Twitter's AI team, are "more relevant timelines now, and in the future, as this opens the door for us to use more of the many novelties that the deep learning community has to offer, especially in the areas of [natural language processing], conversation understanding and media domains." The Cortex team of data scientists and machine-learning researchers works on Twitter's deep learning platform. Having worked out most of the kinks in its deep learning models and platform, Twitter's Koumchatzky said, "online experiments have also shown significant increases in metrics such as tweet engagement and time spent on the platform."
The initiative will be driven by a government-wide partnership comprising the NRF, the Smart Nation and Digital Government Office (SNDGO), the Economic Development Board (EDB), the Infocomm Media Development Authority (IMDA), SGInnovate, and the Integrated Health Information Systems (IHiS). AI.SG will bring together research institutions, AI start-ups, and companies developing AI products to grow knowledge, create tools, and develop talent to power Singapore's AI efforts. AI.SG will work with companies to use AI to raise productivity, create new products, and translate and commercialize solutions from labs to the market. Mr Tan Kok Yam, Deputy Secretary, Smart Nation and Digital Government Office, said: "Through AI.SG, we intend to work with AI research performers, start-ups and companies to audaciously tackle tough challenges in areas such as transportation and urban management."
Over the past couple of years they've open-sourced around 2.5 million lines of machine learning platform code between them (that's 650 human-years' worth of code!). Whilst we're on the topic of AI-powered voice assistants, Amazon Alexa and Google Home of course deserve a mention -- these are the devices encouraging the first steps towards AI-filled homes. One great example of this is Benevolent AI -- a British company using artificial intelligence to make connections, in the search for cures to illnesses, that humans simply can't. This kind of AI implementation must be embraced; innovation at companies such as Benevolent AI can provide positive steps forward on some of mankind's most hard-hitting challenges.