As a speed limit sign appeared in the field of view, the program identified it, extracted the speed limit from it, and displayed it to warn the driver. This mattered because the navigation system's speed limit data could be outdated or simply inaccurate, so reading the actual sign was more reliable. Continued rapid growth in processing power and algorithms enabled the quick transformation from detecting speed signs (albeit not in real time) in 2010 to completely and autonomously controlling a driverless vehicle in 2016. Computer vision has already reached the point where we trust it with passengers' (and pedestrians') lives.
At its F8 developer conference in San Jose today, Facebook is announcing the launch of Caffe2, a new open source framework for deep learning, a trendy type of artificial intelligence (AI). Today's announcement builds on Facebook's contributions to the Torch open source deep learning framework and more recently the PyTorch framework that the Facebook Artificial Intelligence Research (FAIR) group conceived. There are several other open source deep learning frameworks for people to use for all kinds of purposes. Public cloud infrastructure provider Amazon Web Services (AWS) has added experimental support for Caffe2 in its Deep Learning AMI (Amazon machine image).
Personalized product recommendations in email are a proven way to increase customer engagement, click-through rates (CTRs), and ultimately sales. To create personalized recommendations for every customer, AI algorithms are trained on huge sets of data, including each customer's past behavior, activity, and purchase history. AI makes it possible to send highly targeted campaigns that integrate dynamic product recommendations in real time, resulting in a lift in CTR and sales. Segmented emails have been proven to get more opens, higher CTRs, and more conversions, besides driving customer engagement.
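To make the idea concrete, here is a minimal sketch of how purchase history can drive recommendations. It uses a simple item co-occurrence approach on made-up data; the product names and the method are illustrative assumptions, not any vendor's actual algorithm.

```python
# Toy item-to-item recommender: products frequently bought together
# are recommended to customers who own one of them.
from collections import Counter
from itertools import combinations

# Each inner list is one customer's (synthetic) purchase history.
histories = [
    ["running shoes", "water bottle", "socks"],
    ["running shoes", "socks", "headband"],
    ["water bottle", "yoga mat"],
    ["running shoes", "water bottle"],
]

# Count how often each ordered pair of products appears in the same history.
co_counts = Counter()
for basket in histories:
    for a, b in combinations(sorted(set(basket)), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(purchased, k=2):
    """Rank products most often co-purchased with what the customer owns."""
    scores = Counter()
    for item in purchased:
        for (a, b), count in co_counts.items():
            if a == item and b not in purchased:
                scores[b] += count
    return [item for item, _ in scores.most_common(k)]

print(recommend(["running shoes"]))  # -> ['socks', 'water bottle']
```

A production system would replace the co-occurrence counts with a trained model and render the results into the email at send time, but the shape of the pipeline, from history to per-customer ranking, is the same.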
Researchers from the United Kingdom stated that a self-taught artificial intelligence machine could pave the way toward predicting heart attacks better than doctors. The system compared four machine learning algorithms: random forest, logistic regression, gradient boosting, and neural networks. Doctors traditionally predict heart attacks using eight risk factors and signs, including age, blood pressure, and cholesterol level. The study found that some of the strongest predictors of heart attack were not among those in the ACC/AHA guidelines.
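The four model families named above all have standard implementations. The sketch below trains each on synthetic patient records with three of the risk factors mentioned (age, blood pressure, cholesterol); the data, features, and settings are assumptions for illustration, not the researchers' actual dataset or code.

```python
# Hypothetical comparison of the four algorithm families on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 500
# Synthetic risk factors: age, systolic blood pressure, cholesterol.
age = rng.uniform(30, 80, n)
bp = rng.uniform(100, 180, n)
chol = rng.uniform(150, 300, n)
X = np.column_stack([age, bp, chol])
# Synthetic label: heart-attack risk rises with all three factors, plus noise.
risk = 0.03 * age + 0.02 * bp + 0.01 * chol
y = (risk + rng.normal(0, 0.3, n) > np.median(risk)).astype(int)

# Standardize features so the linear and neural models train well.
X = (X - X.mean(axis=0)) / X.std(axis=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "logistic regression": LogisticRegression(max_iter=1000),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
    "neural network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0),
}
scores = {name: m.fit(X, y).score(X, y) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: train accuracy {acc:.2f}")
```

A real study would of course evaluate on held-out patients rather than training accuracy, and would use the full set of clinical variables.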
The Internet of Things (IoT) and artificial intelligence (AI), often supported by Big Data, will combine, overlap and in other ways systematically drive telecommunications and enterprise IT forward during the coming years. Today, IBM Watson and Harman Professional Solutions unveiled an offering that illustrates the potential of these new tools. Voice-Enabled Cognitive Rooms, which are outfitted with IBM Watson IoT and Harman speaker capabilities, respond to spoken questions and commands. The changes that the IoT, AI and Big Data portend are so fundamental that an entirely new approach is necessary, according to Samsung.
This is the compelling question posed by Loren McDonald of IBM Watson Marketing during his presentation at the recent Digital Summit conference in Los Angeles. I was able to see a demo of IBM Watson's Cognitive Technology for marketing at the World of Watson conference, and the possibilities were impressive. If you're wondering which roles and tasks are at risk, Loren shared a list during his talk. He says that center-brain marketing melds right-brain creativity with left-brain analytical thinking and technology to fuel success in a future driven by machine learning. Do you think artificial intelligence and cognitive technology will replace part or all of your job?
The agency is now accepting research proposals for the program's first funding opportunity via a Broad Agency Announcement, published last week. Through the four-year program, dubbed the Lifelong Learning Machines (L2M) program, DARPA plans to fund the development of "substantially more capable systems that are continually improving and updating from experience." As DARPA notes in the announcement, today's artificial intelligence systems cannot adapt to situations for which they were not already trained or programmed. The project is divided into two technical areas: one around developing the lifelong learning system itself, and the other around studying how biological systems learn in order to inform new algorithms that could help machines do the same.
Community Technology Preview (CTP) 2.0 is the first production-quality preview of SQL Server 2017, available on both Windows and Linux. In this preview, Microsoft added a number of new capabilities: the ability to run advanced analytics using Python in a parallelized and highly scalable way, the ability to store and analyze graph data, and features that help you manage SQL Server for high performance and uptime, including the Adaptive Query Processing family of intelligent database features and resumable online indexing.
There has been a lot of buzz in the chatbot community about how the eBay ShopBot team extracts important information from naturally expressed phrases to provide better search results. ShopBot uses its Knowledge Graph to understand user requests and generate follow-up questions that refine a request before searching eBay's inventory. For example, if the user types "Black" in the chat window while searching for messenger bags, the natural language understanding (NLU) component remembers that the user was searching for messenger bags and recognizes black as a color that overrides any previously selected color. As a result, eBay ShopBot will present black messenger bags to the user.
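The override behavior described above amounts to slot-based dialog state tracking. Here is a toy sketch of that idea, assuming a fixed color vocabulary and a two-slot state; this is an illustration of the concept, not eBay's actual NLU implementation.

```python
# Toy dialog state: the product category persists across turns,
# while a newly mentioned color replaces any earlier color choice.
KNOWN_COLORS = {"black", "brown", "red", "blue"}

def update_state(state, utterance):
    """Merge one user utterance into the dialog state dict."""
    text = utterance.strip().lower()
    if text in KNOWN_COLORS:
        state["color"] = text      # new color overrides the previous one
    else:
        state["category"] = text   # otherwise treat it as a new search query
    return state

state = {}
update_state(state, "messenger bags")  # initial search
update_state(state, "brown")           # first color refinement
update_state(state, "Black")           # overrides the previous color
print(state)  # -> {'category': 'messenger bags', 'color': 'black'}
```

The resulting state ("black messenger bags") is what would be handed to the inventory search, which is why the bot can answer a one-word message like "Black" sensibly.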
The third is flexibility--the flexibility for developers to compose multiple cloud services into various design patterns for AI, and the flexibility to leverage Windows, Linux, Python, R, Spark, Hadoop, and other open source tools in building such systems. In addition to several advanced machine learning algorithms from Microsoft, R Server 9.1 introduces pretrained neural network models for sentiment analysis and image featurization, adds support for SparklyR, SparkETL, and SparkSQL, and adds GPU support for deep neural networks. Built on the proven business intelligence (BI) engine in Microsoft SQL Server Analysis Services, Azure Analysis Services delivers enterprise-grade BI semantic modeling capabilities with the scale, flexibility, and management benefits of the cloud. It helps you integrate data from a variety of sources--for example, Azure Data Lake, Azure SQL DW, and a variety of databases on-premises and in the cloud--and transform it into actionable insights.