Editor's note: Today's guest post comes from Lumiata, an AI-for-healthcare platform. Here's the story of how they use Google Cloud to power their platform, from data preparation and model building through deployment, to tackle challenges inherent to healthcare organizations. If ever there was a year for healthcare innovation, 2020 was it. At Lumiata, we've been on a mission to deliver smarter, more cost-effective healthcare since 2013, but the COVID-19 pandemic added new urgency to our vision of making artificial intelligence (AI) easy and accessible. Using AI went from a nice-to-have to a must-have for healthcare organizations.
Google on Monday said that it's partnering with Siemens to advance AI deployments in industrial use cases. More specifically, Siemens is integrating Google Cloud data analytics and AI capabilities into its Digital Industries Factory Automation portfolio. The integration gives Google a major partner in the manufacturing space, one of six key verticals the cloud company is targeting. The integration, the companies said, should make it easier for manufacturers to manage factory data, run cloud-based AI and machine learning models on top of it, and deploy algorithms at the network edge. Over the next few months, the companies will share more about the specific Google Cloud tools that will be integrated into the Siemens portfolio and offered as a joint solution, a Google spokesperson told ZDNet.
According to IDG's 2020 Cloud Computing Study, 92% of organizations have at least part of their IT environment in the cloud. Traditional cloud security approaches must therefore evolve to keep up with the dynamic infrastructure and challenges that cloud environments present – most notably, the flood of data and insights generated within the cloud. More than one-third of IT security managers and security analysts ignore threat alerts when the queue is full. This common problem is driving high demand for machine learning-based analytics, which help security teams sift through massive amounts of data, prioritize risks and vulnerabilities, and make more informed decisions. A word of caution, however, when using machine learning-based technology: the age-old rule of garbage in, garbage out applies to security-focused machine learning engines.
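To make the triage idea concrete, here is a toy sketch (an illustrative assumption on our part, not any vendor's actual engine): a tiny logistic regression is trained on synthetic historical alerts, then used to rank an overflowing queue so the riskiest items surface first. Feature names and the synthetic data are invented for illustration.

```python
import math

# Each alert is a feature vector: [severity, asset_criticality, anomaly_score].

def sigmoid(z):
    """Logistic function mapping a raw score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=1000):
    """Fit logistic regression weights with plain stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            g = p - yi  # gradient of the log-loss w.r.t. the raw score
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def score(w, b, x):
    """Predicted probability that an alert is a true positive."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Synthetic history: high-severity alerts on critical assets were real incidents.
X = [[0.9, 0.8, 0.7], [0.8, 0.9, 0.6], [0.2, 0.1, 0.3], [0.1, 0.2, 0.1]]
y = [1, 1, 0, 0]
w, b = train(X, y)

# Rank a new queue by descending risk so analysts triage the worst first.
queue = {"printer-noise": [0.1, 0.1, 0.2], "db-exfil": [0.9, 0.9, 0.8]}
ranked = sorted(queue, key=lambda name: score(w, b, queue[name]), reverse=True)
```

The garbage-in, garbage-out caveat applies directly here: if the labeled history (`X`, `y`) is noisy or biased, the ranking will be too.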
To say that Kubernetes, everyone's top container orchestration pick, is hard to master is an understatement. Kubernetes doesn't so much have a learning curve as a learning cliff. But Canonical's MicroK8s lets you learn to climb it at home. And with its latest release, it's easier than ever to set up a baby Kubernetes cluster using inexpensive Raspberry Pi or NVIDIA Jetson single-board computers (SBCs). MicroK8s is a tiny Kubernetes cluster platform.
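Getting a node up takes only a few commands, sketched below on an Ubuntu-based board; the exact join address and token come from the `add-node` output on the first node, and are shown here only as placeholders:

```shell
# Install MicroK8s from the snap store on each board
sudo snap install microk8s --classic

# Wait until the node's Kubernetes services are up
microk8s status --wait-ready

# On the first board: print a join command (address + token) for new nodes
microk8s add-node

# On each additional Raspberry Pi or Jetson, paste the command it printed, e.g.:
# microk8s join <first-node-ip>:25000/<token>
```

Repeating the last step for each extra SBC is what turns a single node into that baby cluster.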
The term artificial intelligence (AI) was first used by John McCarthy during a 1956 workshop at Dartmouth College. The first AI application programs, for playing checkers and chess, were developed in 1951. From the 1950s until the 2010s, AI cycled through rises and falls: over the years, vendors, universities, and institutions invested in it, and hopes ran sometimes high, sometimes low.
We have a vision of a Network Compute Fabric where the lines between networking and computing disappear. On the journey there, edge cloud computing provides a critical stepping-stone where computing is pushed very close to where it is needed. This distribution of computing capabilities in the network creates new challenges for its management and operation. We argue that a data-centric approach that extensively uses artificial intelligence (AI) and machine learning (ML) technologies to realize specific management functions is a good candidate to tackle these challenges. As can be seen in Figure 1, edge computing services can be provided through compute/storage resources at different locations in a network, such as on-premises at a customer/enterprise site (industrial control, for example) or at access and local/regional sites (telco operators, for example).
In our last post, we demonstrated how to develop a machine learning pipeline and deploy it as a web app using PyCaret and the Flask framework in Python. If you haven't heard about PyCaret before, please read this announcement to learn more. In this tutorial, we will use the same machine learning pipeline and Flask app that we built and deployed previously. This time we will demonstrate how to deploy a machine learning pipeline as a web app using the Microsoft Azure Web App Service. In order to deploy a machine learning pipeline on Microsoft Azure, we will have to containerize our pipeline using Docker, a platform for packaging an application and its dependencies into a portable container image.
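As a preview of that containerization step, a minimal Dockerfile for a Flask app might look like the sketch below; the file names `app.py` and `requirements.txt`, the Python version, and the port are assumptions for illustration, not the exact layout of our project:

```dockerfile
# Base image with Python preinstalled
FROM python:3.8-slim

# Install dependencies first so Docker can cache this layer
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the pipeline and Flask app into the image
COPY . .

# Expose the port the Flask app listens on (assumed) and start it
EXPOSE 5000
CMD ["python", "app.py"]
```

The image can then be built locally with `docker build -t pycaret-flask-app .` and pushed to a container registry for Azure to pull.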
One of the initial hesitations many enterprise organizations had about moving to the cloud in the last decade was security. Significant money had been invested in corporate firewalls, and now technology companies were suggesting that corporate data reside outside that security barrier. Those early questions were addressed, and information began to move into the cloud. However, nothing stands still: growing volumes of data and network traffic intersect with increasingly complex attacks, and artificial intelligence (AI) is now being used to keep things safe. That initial hesitation was met by data centers improving hardware and network security, while cloud software providers, both cloud hosts and application providers, pushed software security past what the cloud initially offered.
Baking is as much science as it is art. Perhaps to find out whether the former's more important, Google Cloud AI is taking on a Great British Bake Off winner in a dessert face-off. Sara Robinson, an amateur baker and Google Cloud developer advocate, built a machine learning model that examined hundreds of baking recipes (including ones for traybakes, cookies and scones) to help her come up with a new one. The model generated lists of ingredients and amounts that were used as the basis for step-by-step recipes. The model was able to come up with hybrid recipes and Robinson opted for one that had a machine learning-generated cake batter on top of a machine learning-generated cookie.
Google has hired former Intel executive Uri Frank to lead its custom chip division. Google is one of many companies that have taken to chipmaking in the last few years to build competitive moats. The Intel veteran will serve as Vice President of Engineering for server chip design at Google. Frank has over two decades of experience in custom CPU design and delivery, and his design engineering expertise from Intel will come in handy for Google.