The democratization of artificial intelligence and machine learning is transforming information technology. Developers and operations teams already have access to intelligent tools that make sense of vast data sets and help troubleshoot issues. Now AI and ML are trickling down to everyday business users. Intelligent-experience technology is putting AI and ML in laypeople's hands, enabling them to transform the way actual business processes -- not IT processes -- are completed.
Edge computing brings groundbreaking capabilities to enterprise cloud organizations, including nearly instant code transfer, reduced latency, and enhanced performance. The lightning speed of edge compute comes from the placement of the platform: unlike public cloud, edge compute sits as close as possible to the point of interaction with humans, electronics, and other connected devices. Edge compute becomes increasingly relevant as applications evolve -- virtual reality, augmented reality, and video analytics all rely on artificial intelligence. The real-time code transfer that AI needs must be extremely precise, and as AI evolves, every millisecond counts, according to Paul Savill (pictured), senior vice president of core network and technology solutions at CenturyLink Inc.
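The placement argument comes down to simple physics: a signal in fiber can't beat roughly 200,000 km/s, so distance sets a hard floor on round-trip time no matter how fast the servers are. A back-of-the-envelope sketch (the distances below are illustrative assumptions, not figures from CenturyLink):

```python
# Back-of-the-envelope propagation delay: why platform placement matters.
# Light in optical fiber travels at roughly 200,000 km/s (about 2/3 of c).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds,
    ignoring processing, queuing, and routing overhead."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Hypothetical distances: a regional public-cloud data center vs. a
# metro edge node near the point of interaction.
cloud_rtt = round_trip_ms(2000)  # ~20 ms floor before any compute happens
edge_rtt = round_trip_ms(50)     # ~0.5 ms floor
```

Even in this idealized case, the distant region spends tens of milliseconds just on the wire -- the budget an AR or video-analytics application would rather spend on inference.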
Applications are the tip of the iceberg in cloud-native computing. When monoliths shard into microservices and containers (a virtualized method for running distributed applications), the underlying infrastructure feels it -- and so do administrators and DevOps (development and operations) teams. How can they possibly spread themselves thin enough to handle all these distributed components? They can't, and they shouldn't try, according to Vijoy Pandey (pictured), vice president and chief technology officer of cloud computing at Cisco Systems Inc. Distributed apps require support -- from network systems, databases, and the like -- that doesn't obey the rules that worked for physical and virtual infrastructure, Pandey said.
Lift and shift has not been the greatest friend to those migrating to the cloud. Many hustled legacy applications into the cloud to little, if any, positive effect; some wound up "repatriating" apps once the bill arrived. Likewise, valuable data from on-premises systems can't be dumped cold into the cloud; after all, next-gen machine-learning applications in the cloud stand or fall on well-governed, quality data. Older systems hold operational-processing data that helps machine-learning algorithms -- in the cloud or on-prem -- make predictions. There is, however, a generational gap to bridge between old data and advanced analytics technologies.
Less than 10% … that's how many artificial-intelligence test projects are estimated to be deployed into full-scale production in enterprise environments, according to a recent report from the International Institute for Analytics. There are a number of reasons for this surprisingly small figure, including overwhelming amounts of data and a lack of easy-to-use tools to analyze it. It's a problem that calls for operationalizing AI and machine learning -- making it consistently accessible and repeatable. "Ultimately, if you want to get business value from those models and all of the hard work that you've done, it has to be injected into the business process," said Anant Chintamaneni (pictured), vice president and general manager of BlueData at Hewlett Packard Enterprise Co. "Operationalization of machine learning is ultimately the key, and that's the progression that enterprises have to make." Burris was joined for a digital community event by co-host Stu Miniman (@stu), and they also interviewed Nanda Vijaydev, distinguished technologist and lead data scientist at HPE; Patrick Osborne, vice president and general manager of big data, analytics, and scale-out data platforms at HPE; and Wikibon analyst James Kobielus (@jameskobielus).
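"Injecting a model into the business process" usually means giving the process a stable scoring entry point while model versions are promoted behind it, so retraining never forces callers to change. A minimal sketch of that pattern -- the `ModelRegistry` class and its method names are illustrative assumptions, not BlueData's or HPE's API:

```python
# Minimal sketch of operationalizing a model: business code calls one
# stable scoring function; new model versions are registered and promoted
# behind it, making deployment repeatable instead of a one-off handoff.

class ModelRegistry:
    def __init__(self):
        self._models = {}        # version label -> scoring callable
        self._production = None  # version currently serving traffic

    def register(self, version, model):
        """Record a trained model under a version label."""
        self._models[version] = model

    def promote(self, version):
        """Point production traffic at a version (repeatable rollout)."""
        self._production = version

    def score(self, features):
        """The stable entry point the business process calls."""
        return self._models[self._production](features)

registry = ModelRegistry()
registry.register("v1", lambda x: x * 0.5)  # stand-in for a trained model
registry.promote("v1")
registry.score(10)                          # scored by v1

registry.register("v2", lambda x: x * 0.6)  # retrained model
registry.promote("v2")                      # callers are unchanged
registry.score(10)                          # now scored by v2
```

The point of the pattern is the separation: data scientists iterate on versions, while the business process depends only on `score`, which is what makes the deployment step repeatable rather than a hand-crafted event.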