"We may be in the eternal spring of AI," says Andrew Ng, a luminary in the field of machine learning. Ng, who co-founded and led Google's Brain project, sat down for an interview with ZDNet to discuss his just-published "playbook" for how to use the technology, which is available as a free download. He dismissed worries that artificial intelligence may be entering another of its periodic "winters," when interest and funding drop off sharply. Machine learning, in the form of "connectionist" theories that model computing loosely on the neurons of the brain, has gone through boom-and-bust cycles: flowering initially with Frank Rosenblatt's "perceptron" in the late 1950s, cooling in the late 1960s, re-emerging in the late 1980s only to fall out of favor again, and suddenly back in vogue in the last several years. Those periodic coolings have been termed "AI winters."
The first wave of artificial intelligence has been about experts: brilliant technologists doing cutting-edge research and building advanced systems in places like Silicon Valley. The second wave of AI will be about practitioners: traditional developers becoming AI rockstars and addressing a wide range of business problems. Access to AI will be democratized. During this transition, we believe AI – particularly deep learning – will begin to resemble a general-purpose computing platform, a topic we explored in a recent Forbes article. But a new set of tools will be necessary to make that vision a reality.
Cloud computing is a set of shared, networked resources that provide services such as backup and synchronization. Yet one of the last computing tasks to be absorbed into the cloud is data analysis. Perhaps this is because scientists are typically good at programming, and so they enjoy having a machine on their desks. Or perhaps it is because lab equipment is wired directly to a PC to record the data. Or perhaps it is because data sets can be so large that moving them is time-consuming.
Semiconductor Engineering sat down to discuss artificial intelligence (AI), machine learning, and chip and photomask manufacturing technologies with Aki Fujimura, chief executive of D2S; Jerry Chen, business and ecosystem development manager at Nvidia; Noriaki Nakayamada, senior technologist at NuFlare; and Mikael Wahlsten, director and product area manager at Mycronic. What follows are excerpts of that conversation. To read part one, click here. SE: Artificial neural networks, the precursor of machine learning, were a hot topic in the 1980s. In neural networks, a system crunches data and identifies patterns.
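The idea of a system that "crunches data and identifies patterns" can be made concrete with a minimal sketch (not from the roundtable): a single artificial neuron of the kind the panel refers to, trained with the classic perceptron rule to learn the logical AND of two inputs. The data set and learning rate here are illustrative choices, not anything from the article.

```python
import numpy as np

# Four input patterns and the pattern to learn: logical AND.
data = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
labels = np.array([0, 0, 0, 1])

w = np.zeros(2)  # weights, one per input
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):  # a few passes over the data suffice for AND
    for x, y in zip(data, labels):
        pred = 1 if x @ w + b > 0 else 0
        # Perceptron rule: nudge weights toward the correct output.
        w += lr * (y - pred) * x
        b += lr * (y - pred)

print([1 if x @ w + b > 0 else 0 for x in data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the rule is guaranteed to converge; it was the discovery that a single such neuron cannot learn XOR that helped trigger the first "AI winter" mentioned above.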
The billions of devices that are expected to proliferate in the coming years at the "edge" of networks, such as autonomous vehicles and embedded Internet-of-Things devices, present manufacturers with a quandary: they want to add smarts to the devices via machine learning, but they can't know exactly what to add until they test their neural networks and see what works out there in the marketplace. Coming in to save the day, so it contends, is a six-year-old startup named Efinix, based in Santa Clara, which has been refining the art of programmable chips. It now says that its customers can use its parts to first test a market for AI and then, once the right neural nets are developed, mass-produce chips to serve those nets. The company's chief executive, Sammy Cheung, took some time to talk with ZDNet about Efinix's technology on the sidelines of the Linley Group Fall Processor Conference last week, hosted by venerable semiconductor analysis firm The Linley Group.
The explosion of AI and machine learning is changing the very nature of computing, so says one of the biggest practitioners of AI, Google. Google software engineer Cliff Young gave the opening keynote on Thursday morning at the Linley Group Fall Processor Conference, a popular computer-chip symposium put on by venerable semiconductor analysis firm The Linley Group, in Santa Clara, California. Said Young, the use of AI has reached an "exponential phase" at the very same time that Moore's Law, the decades-old rule of thumb about semiconductor progress, has ground to a standstill. "The times are slightly neurotic," he mused. "Digital CMOS is slowing down, we see that in Intel's woes in 10-nanometer [chip production], we see it in GlobalFoundries getting out of 7-nanometer, at the same time that there is this deep learning thing happening, there is economic demand."
The new technological era is one in which task-specific hardware and software are on the rise. This year at Google I/O 2018, Google launched a new generation of its Tensor Processing Unit (TPU), already in use to turbocharge a set of its products. Now enhanced Julia capabilities have been announced for the TPU ecosystem: to remain relevant in the new era, Julia Computing has developed a method for running suitable sections of Julia programs on TPUs using an API and the Google XLA compiler. This development adds more options alongside TensorFlow for leveraging Google Cloud.
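The workflow described above, tracing suitable sections of a high-level program and handing them to Google's XLA compiler, can be sketched with JAX, which follows the same pattern in Python (this is an illustrative analogue, not Julia Computing's API). The function and shapes below are made up for the example.

```python
import jax
import jax.numpy as jnp

def affine(x, w, b):
    # An ordinary numerical function written in the host language.
    return jnp.dot(x, w) + b

# jax.jit traces the function and compiles it through XLA; on a
# machine with a TPU runtime attached, the same call would run on
# the TPU rather than the CPU.
compiled = jax.jit(affine)

x = jnp.ones((2, 3))
w = jnp.ones((3, 4))
b = jnp.zeros(4)
print(compiled(x, w, b))  # a 2x4 array in which every entry is 3.0
```

The appeal for TPU vendors and users alike is that only the "suitable sections" (pure numerical kernels) need to be compiled; the rest of the program stays in the host language.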
Video surveillance and security play a huge part in our everyday lives and are becoming both increasingly widespread and intelligent, thanks in large part to the integration of analytics and edge computing that has emerged from the expansion of the Internet of Things (IoT) and the Fourth Industrial Revolution. While the IoT has spread into a significant share of industrial and commercial sectors, video surveillance in particular is being radically transformed by the convergence of multiple technologies: surveillance systems, connected IoT devices, edge computing, and artificial intelligence and machine learning algorithms. Through this transformation, video analytics has become an essential technology for those deploying video surveillance and security systems, and the IoT, alongside maturing edge computing, is helping to develop even more intelligent and capable video analytics solutions. In this article, we'll look at how the introduction of edge computing into video analytics has begun to transform the security and surveillance efforts of businesses and organisations all over the world.
Companies of all sizes are embracing cloud computing services to reduce infrastructure costs, increase agility and build new services. Over a third of the companies Lopez Research interviewed are using public cloud infrastructure services, but what's impressive now is the range of services companies are adopting. Advanced analytics using cognitive functions is an excellent example of the cloud computing market's expansion beyond infrastructure. Many companies have the issue of legacy infrastructure, but few can say their business goes back hundreds of years. As Tassilo Festetics, Vice President of Global Solutions at AB InBev, said in a telephone interview, "The world changed around us. At one point, sales were dictated by how fast a salesperson could drive a motorcycle between locations. Today, AB InBev needs to digitize all aspects of its business to be successful."