cloud computing


Federal Government Inching Toward Enterprise Cloud Foundation - AI Trends

#artificialintelligence

The federal government continues its halting effort to field an enterprise cloud strategy, with Lt. Gen. Jack Shanahan, who leads the Defense Department's Joint AI Center (JAIC), commenting recently that the absence of an enterprise cloud platform has made the government's pursuit of AI more challenging. "The lack of an enterprise solution has slowed us down," stated Shanahan during an AFCEA DC virtual event held on May 21, according to an account in FCW. However, "the gears are in motion," with the JAIC using an "alternate platform," for example, to host a newer anti-COVID effort. That effort, Project Salus, is a data-aggregation platform that employs predictive modeling to help supply equipment needed by front-line workers. The platform hosting Salus was previously used for the ill-fated Project Maven, a DOD effort that was to employ AI image recognition to improve drone-strike accuracy.


Baidu to Ramp Up Investments in Cloud Computing, Artificial Intelligence

#artificialintelligence

Baidu (NASDAQ: BIDU) is making a big push into cutting-edge IT segments. The company announced Thursday that it will be allocating more capital to investments in developing corners of the market, particularly artificial intelligence (AI), cloud computing, and data centers. This project will unfold over the next 10 years, in an attempt by the China-based company to build out assets for future tech needs. This piggybacks on the Chinese government's ambition to develop what it calls "new infrastructure" throughout the country to dramatically modernize its economy. Baidu did not specify how much it would spend on its new infrastructure efforts.


Enabling the Return To Work initiative using SAP Conversational AI & Qualtrics

#artificialintelligence

As many parts of the world remain in lockdown due to the global pandemic, some countries have started to ease restrictions. In Australia and New Zealand in particular, schools have reopened, workers are heading back to their offices, and restaurants and retail stores are beginning to resume trade under a new set of guidelines. These guidelines can vary from one state to another, so businesses that operate offices in different states need to provide relevant updates to their employees in order to comply with the new regulations. The most common practice among employers is to send out regular emails outlining the guidelines. Chatbots are beginning to play a vital role in providing real-time, up-to-date information.


Deploy Machine Learning Pipeline on AWS Fargate - KDnuggets

#artificialintelligence

In our last post on deploying a machine learning pipeline in the cloud, we demonstrated how to develop a machine learning pipeline in PyCaret, containerize it with Docker, and serve it as a web application using Google Kubernetes Engine. If you haven't heard of PyCaret before, please read this announcement to learn more. In this tutorial, we will use the same machine learning pipeline and Flask app that we built and deployed previously. This time we will demonstrate how to containerize and deploy a machine learning pipeline serverlessly using AWS Fargate. The tutorial covers the entire workflow: building a Docker image locally, uploading it to Amazon Elastic Container Registry, creating a cluster, and then defining and executing a task on AWS-managed infrastructure, i.e. AWS Fargate.
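The "defining a task" step in the workflow above centers on an ECS task definition that tells Fargate what container to run and with what resources. As a rough sketch only (the family name, account ID, region, ECR image URI, role, and port are all assumptions, not values from the tutorial), such a definition might look like:

```json
{
  "family": "pycaret-flask-app",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "512",
  "memory": "1024",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "pycaret-flask-app",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/pycaret-flask-app:latest",
      "essential": true,
      "portMappings": [
        { "containerPort": 5000, "protocol": "tcp" }
      ]
    }
  ]
}
```

Fargate requires the `awsvpc` network mode and task-level `cpu`/`memory` values, which is why those fields appear here; the image URI points at the Docker image pushed to Amazon Elastic Container Registry in the earlier step.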


What Does the Future Hold for Edge Computing?

#artificialintelligence

Edge computing can roughly be defined as the practice of processing and storing data either where it's created or close to where it's generated -- "the edge" -- whether that's a smartphone, an internet-connected machine in a factory or a car. The goal is to reduce latency, or the time it takes for an application to run or a command to execute. While that sometimes involves circumventing the cloud, it can also entail building downsized data centers closer to where users or devices are. Anything that generates a massive amount of data and needs that data to be processed as close to real time as possible can be considered a use case for edge computing: think self-driving cars, augmented reality apps and wearable devices.


AI For All: The US Introduces New Bill For Affordable Research

#artificialintelligence

Yesterday, AIM published an article on how difficult it is for small labs and individual researchers to persevere in the high-compute, high-cost industry of deep learning. Today, US policymakers introduced a new bill intended to make deep learning research affordable for all. The National AI Research Resource Task Force Act was introduced in the House by Representative Anna G. Eshoo (D-CA) and her colleagues. The bill was met with unanimous support from top universities and companies engaged in artificial intelligence (AI) research. Well-known supporters include Stanford University, Princeton University, UCLA, Carnegie Mellon University, Johns Hopkins University, OpenAI, Mozilla, Google, Amazon Web Services, Microsoft, IBM, and NVIDIA, among others.


How Does AIOps Integrate AI and Machine Learning into IT Operations?

#artificialintelligence

This calls for an increase in budgetary allocation and for more computing power to be added from outside core IT. AIOps bridges the gap between service management, performance management, and automation within the IT ecosystem to pursue the continuous goal of improving IT operations. AIOps creates a game plan that delivers within new, accelerated IT environments, identifying patterns in monitoring, service desk, capacity addition, and data automation across hybrid on-premises and multi-cloud environments.


Advancing Azure service quality with artificial intelligence: AIOps

#artificialintelligence

"In the era of big data, insights collected from cloud services running at the scale of Azure quickly exceed the attention span of humans. It's critical to identify the right steps to maintain the highest possible quality of service based on the large volume of data collected. In applying this to Azure, we envision infusing AI into our cloud platform and DevOps process, becoming AIOps, to enable the Azure platform to become more self-adaptive, resilient, and efficient. AIOps will also support our engineers to take the right actions more effectively and in a timely manner to continue improving service quality and delighting our customers and partners. This post continues our Advancing Reliability series highlighting initiatives underway to keep improving the reliability of the Azure platform. The post that follows was written by Jian Zhang, our Program Manager overseeing these efforts, as she shares our vision for AIOps, and highlights areas of this AI infusion that are already a reality as part of our end-to-end cloud service management."--Mark


Review the top sessions from recent cloud conferences

#artificialintelligence

If there's a silver lining to social distancing, it's the fact that it gives us a chance to catch up on content we otherwise might have missed. There are always too many sessions to attend at cloud conferences -- from service introductions and updates to best practices and use cases -- that could change the way you use cloud technologies. The global health crisis has made it unlikely any of us will gather for a conference in 2020. Given the dangers of COVID-19, it seems unwise for thousands of professionals from around the world to gather in a crowded convention center. While the in-person conference experience is off the table for the near future, there are plenty of resources still available to review from cloud conferences over the past year.