Artificial intelligence and machine learning (AI/ML) have the potential to generate enormous business value for semiconductor companies at every step of their operations, from research and chip design through production and sales. But our recent survey of semiconductor-device makers shows that only about 30 percent of respondents are already generating value through AI/ML. Notably, these companies have made significant investments in AI/ML talent, as well as in data infrastructure, technology, and other enablers, and have already fully scaled their initial use cases. The remaining respondents, about 70 percent, are still in the pilot phase, and their progress has stalled. We believe that adoption of AI/ML will accelerate dramatically in the semiconductor industry over the next few years, and taking steps to scale up now will allow companies to capture the full benefits of these technologies. This article focuses on device makers, including integrated device manufacturers (IDMs), fabless players, foundries, and semiconductor assembly and test services, or SATS (for more information on our research, see sidebar, "Our methodology").
The volume of data being generated by a spectrum of devices continues to skyrocket. Now the question is what can be done with that data. By Cisco's estimates, annual Internet traffic will reach 3.3 zettabytes by 2021, up from 1.2 zettabytes in 2016. Traffic during the busiest 60-minute period of the day grew 51% in 2016, compared with 32% growth in overall traffic.
Figure 1: Historical and projected growth of data.
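The two traffic figures above imply a steady compound growth rate. A quick sketch of that arithmetic (the 2016 and 2021 values come from the Cisco estimate cited above; the growth rate is derived, not reported):

```python
# Derive the compound annual growth rate (CAGR) implied by Cisco's
# projection: 1.2 zettabytes of annual traffic in 2016 growing to
# 3.3 zettabytes by 2021.
start_zb, end_zb = 1.2, 3.3
years = 2021 - 2016

# CAGR = (end / start)^(1 / years) - 1
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied CAGR: {cagr * 100:.1f}% per year")  # roughly 22% per year
```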
Amid the shift toward more complex chips at advanced nodes, many chipmakers are exploring or turning to advanced forms of machine learning to help solve some big challenges in IC production. Machine learning, a subset of artificial intelligence (AI), uses advanced algorithms to recognize patterns in data and to learn and make predictions from that information. In the fab, machine learning promises faster and more accurate results in select areas, such as finding and classifying defects in chips. It is also being applied in other process steps, though challenges remain in deploying it there. Machine learning has been used in computing and other fields for decades, and it first appeared in semiconductor production in the 1990s, when some saw it as a way to help automate steps on manually driven fab equipment. Since then, it has made staggering progress in computing and elsewhere.
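To make the defect-classification idea concrete, here is a minimal, illustrative sketch of pattern recognition on defect data. The feature values and class labels are entirely made up for illustration; production fab systems work on inspection images with far richer features and models (often deep networks), not two hand-picked numbers:

```python
# Minimal nearest-centroid classifier: assign a defect to the class
# whose average (centroid) feature vector is closest.
# Features here are hypothetical: (defect size in um, edge contrast).
import math

# Hypothetical labeled examples from past inspections.
TRAINING = {
    "particle": [(0.8, 0.90), (1.0, 0.85), (0.9, 0.95)],
    "scratch":  [(3.5, 0.40), (4.0, 0.35), (3.8, 0.45)],
}

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(feature):
    """Return the class label whose centroid is nearest (Euclidean)."""
    return min(CENTROIDS, key=lambda lbl: math.dist(feature, CENTROIDS[lbl]))

print(classify((0.95, 0.9)))  # a small, high-contrast defect -> "particle"
```

The nearest-centroid rule stands in for the pattern-recognition step the article describes: the system learns what each defect class "looks like" from labeled examples, then assigns new defects to the closest learned pattern.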
Wally Rhines, president and CEO of Mentor, a Siemens Business, sat down with Semiconductor Engineering to discuss a wide range of industry and technology changes and how they will play out over the next few years. What follows are excerpts of that conversation. SE: What will happen in the end markets? Rhines: The end markets are perhaps more exciting from a design perspective right now than they have been in recent years. Everyone is intrigued with the electronic design opportunities that have been emerging in the automotive industry.
John Kibarian, president and CEO of PDF Solutions, sat down with Semiconductor Engineering to talk about the impact of data analytics on everything from yield and reliability to the inner structure of organizations, how the cloud and edge will work together, and where the big threats are in the future. SE: When did you recognize that data would be so critical to hardware design and manufacturing? Kibarian: It goes back to 2014, when we realized that consolidation in foundries was part of a bigger shift toward fabless companies. Every fabless company was going to become a systems company, and many systems companies were rapidly becoming fabless. We had been using our analytics to help customers with advanced nodes, and one of them told me that they were never going to build another factory again. Before that, our analytics had been used for material review board (MRB) work and for better control of supply chain and packaging.