If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
If you're in the financial services industry or interested in predicting market movements with machine learning, you may be eager to learn how to move your trading-signal and forecast-generation code into the cloud. You can easily scale up your computational loads, distribute data processing pipelines to run in parallel on multiple machines, speed up complex analytics, offload the management of data storage, and ultimately eliminate the need for multiple data centers. In this post, we'll show how to build a data processing pipeline that starts with a market data feed as the input and uses machine learning to generate real-time forecasts as the output, with all application components running natively in Google Cloud. In the sections below, you'll learn how to build a complete end-to-end application that subscribes to the Thomson Reuters FX (foreign exchange) data feed published on a Cloud Pub/Sub topic, incrementally trains a TensorFlow neural network model, generates real-time forecasts of FX rates, and saves the forecasts into BigQuery for subsequent analysis. First, we'll focus on Cloud Pub/Sub as the connector used to link multiple application components.
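The subscribe-train-forecast loop described above can be sketched in miniature. The sketch below is a hypothetical stand-in, not code from the post: an in-process queue plays the role of the Cloud Pub/Sub subscription, and an exponentially weighted moving average stands in for the incrementally trained TensorFlow model; the names `feed`, `on_message`, and `state` are invented for illustration.

```python
import json
import queue

# Stand-in for the Pub/Sub subscription: a simple in-process queue.
# In the real application, a google-cloud-pubsub subscriber would
# deliver Thomson Reuters FX ticks and invoke a callback per message.
feed = queue.Queue()

# An incremental "model": an exponentially weighted moving average
# standing in for the incrementally trained neural network.
state = {"forecast": None}
ALPHA = 0.3  # weight given to the newest observation

def on_message(raw):
    tick = json.loads(raw)          # e.g. {"pair": "EURUSD", "rate": 1.18}
    prev = state["forecast"]
    rate = tick["rate"]
    state["forecast"] = rate if prev is None else ALPHA * rate + (1 - ALPHA) * prev
    return state["forecast"]        # in the post, this row is written to BigQuery

# Simulate three ticks arriving on the feed, then drain the queue.
for r in (1.18, 1.20, 1.19):
    feed.put(json.dumps({"pair": "EURUSD", "rate": r}))

while not feed.empty():
    forecast = on_message(feed.get())
```

The point of the shape is that each component is decoupled: the feed publisher, the model update, and the sink only share the message format, which is what makes it natural to swap the queue for Pub/Sub and the moving average for a TensorFlow model.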
Qualcomm on Wednesday announced the Snapdragon 845 Mobile Platform, a System on a Chip built for immersive multimedia experiences including extended reality (XR), on-device artificial intelligence and high-speed connectivity. The SoC will power next-generation Android flagship smartphones and Windows 10 notebooks based on ARM architecture. Xiaomi reportedly will use it in its forthcoming Mi 7 flagship, to be released next year. The SoC incorporates Qualcomm's Spectra 280 image signal processor (ISP), its Adreno 630 visual processing subsystem, and the company's secure processing unit (SPU), which enables improved biometrics security and user or application data key management. The 845 supports Google's TensorFlow and TensorFlow Lite, and Facebook's Caffe/Caffe2 frameworks, as well as the new Open Neural Network Exchange.
WebRTC services have already permeated corporate communications in the form of videoconferencing solutions. However, WebRTC has the potential to go beyond that and catalyze a new class of services offering more than calls, with capabilities such as mass-scale real-time media broadcasting, enriched and augmented video, and person-to-machine and machine-to-machine communications. In his session at @ThingsExpo, Luis Lopez, CEO of Kurento, introduced the technologies required for implementing these ideas and some early experiments performed in the Kurento open source software community in areas such as entertainment, video surveillance, interactive media broadcasting, gaming and advertising. He concluded with a discussion of their potential business applications beyond plain call models.

Speaker bio: Dr. Luis Lopez is an associate professor at Universidad Rey Juan Carlos in Madrid, where he works on the creation of advanced multimedia communication technologies.
Checking for care

The project is being developed as part of Microsoft's Healthcare NeXT initiative. The company is trying to find ways of offering digital healthcare experiences that let users get immediate information on common ailments. Microsoft has partnered with Aurora Health Care for its latest chatbot service, creating the "Aurora Digital Concierge" for patients. The smartphone app allows users to determine the level of care needed for their condition. As users answer the bot's questions, the app suggests possible causes for the symptoms being experienced.
The age of highly accessible, open source machine learning tools is upon us. No longer niche, machine learning is used by everyone from data scientists to Japanese cucumber farmers. But what is machine learning? Machine learning is exactly what it sounds like -- software that can learn to solve a problem. Using large sets of data, an algorithm can be trained to understand that data.
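That idea of "learning from data" can be shown in a few lines. This toy sketch (not from any of the articles above) fits a single parameter to example data by gradient descent: the "model" starts knowing nothing and is nudged toward the pattern hidden in the examples, here the slope of `y = 2x`.

```python
# A toy illustration of "software that learns to solve a problem":
# recover the slope of y = 2x from example data by gradient descent.
data = [(x, 2.0 * x) for x in range(1, 11)]  # training set: inputs and known answers

w = 0.0      # the model: a single learned parameter, initially ignorant
lr = 0.005   # learning rate: how far to nudge w on each example

for _ in range(200):             # repeatedly show the whole data set to the model
    for x, y in data:
        error = w * x - y        # how wrong is the current guess for this example?
        w -= lr * error * x      # nudge w in the direction that reduces the error
```

After training, `w` sits very close to 2.0: the algorithm was never told the rule, only shown examples of it, which is the essence of the definition above.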
There's a video of Gal Gadot having sex with her stepbrother on the internet. But it's not really Gadot's body, and it's barely her own face. It's an approximation, face-swapped to look like she's performing in an existing incest-themed porn video. The video was created with a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.
The emergence of real-time streaming analytics use cases has shifted the center of gravity for managing real-time processes. Because they operate in the moment, streaming engines have by nature been confined to performing rudimentary operations such as monitoring, filtering, and light transformations of data. But as the need for more complex operations has grown, such as using streaming data to retrain machine learning models, data pipelines have gained new prominence. Data pipelines pick up where streaming and message queuing systems leave off. They provide end-to-end management of data flows from ingest through buffering, filtering, transformation and enrichment, and basic analytic functions that can be squeezed into real time.
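The ingest → filter → enrich → analytics flow described above can be sketched as composable stages. This is a minimal illustration under assumed record shapes (a `value` field and an invented `flag` field), not any particular pipeline product's API; real pipelines add buffering, back-pressure, and fault tolerance between stages.

```python
# Each stage is a generator, so records flow through one at a time,
# mirroring how pipeline stages process streams rather than batches.
def ingest(source):
    for record in source:
        yield dict(record)                   # copy raw events into the pipeline

def filter_stage(records, min_value):
    # Drop records below a threshold, the kind of "light transformation"
    # a streaming engine alone could also do.
    return (r for r in records if r["value"] >= min_value)

def enrich(records):
    # Annotate each record with derived information.
    for r in records:
        r["flag"] = "high" if r["value"] >= 50 else "normal"
        yield r

def analytics(records):
    # A basic analytic function squeezed in at the end of the flow.
    total = count = 0
    for r in records:
        total += r["value"]
        count += 1
    return {"count": count, "mean": total / count if count else None}

events = [{"value": v} for v in (5, 42, 77, 3, 60)]
result = analytics(enrich(filter_stage(ingest(events), min_value=10)))
```

Because each stage only consumes and yields records, stages can be rearranged or distributed across machines without changing their internals, which is the property that gives pipelines their end-to-end reach.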
From machine learning-powered fraud defense on Shopify to Salesforce's Einstein, over the past couple of years, SaaS industry leaders have invested heavily in artificial intelligence R&D and have rapidly acquired AI companies to give themselves a lead over the competition. Thanks to cloud computing services democratizing access to AI, we may be on the cusp of a new age when emerging SaaS providers begin to roll out AI applications that really solve problems for consumers. I spoke to a number of experts at this year's SAAS NORTH conference in Ottawa, Canada about the evolving nature of AI in the SaaS industry. According to their insights, here's what the future might hold:

The traditional SaaS model is based on rolling monthly subscriptions, meaning SaaS companies need to constantly improve and nurture their customer relationships to keep clients returning month by month. Leo Lax, founder of Ottawa-based SaaS accelerator L-Spark, said, "AI is helping to reduce the manual labor that was involved in the building of the customer relationship and allowing vendors of SaaS to interface with customers in a more meaningful way."
Healthcare is at a two-tined fork: one tine leads to repeating the same mistakes others have already made, while the more enlightened tine leads to learning from those mistakes instead. Which way it goes is not a foregone conclusion. Hospitals' development and implementation of emerging technologies under the artificial intelligence, cognitive computing and machine learning rubric is nascent enough that there's still time to choose which tine to take. Taking the well-trodden path will be neither easy nor exactly inexpensive, so why pick it? There's a lot to be gained -- and piles of money to be saved -- by learning from those who have gone before.
"Deep Learning" computer systems, based on artificial neural networks that mimic the way the brain learns from an accumulation of examples, have become a hot topic in computer science. In addition to enabling technologies such as face- and voice-recognition software, these systems could scour vast amounts of medical data to find patterns that could be useful diagnostically, or scan chemical formulas for possible new pharmaceuticals. But the computations these systems must carry out are highly complex and demanding, even for the most powerful computers. Now, a team of researchers at MIT and elsewhere has developed a new approach to such computations, using light instead of electricity, which they say could vastly improve the speed and efficiency of certain deep learning computations. Their results appear today in the journal Nature Photonics in a paper by MIT postdoc Yichen Shen, graduate student Nicholas Harris, professors Marin Soljacic and Dirk Englund, and eight others.