Google Cloud


Google Ups its AI Game

#artificialintelligence

Google Cloud is rolling out an "AI Hub" supplying machine learning content ranging from data pipelines to TensorFlow modules. It also announced a new pipeline component for the Google-backed Kubeflow open-source project, the machine learning stack built on Kubernetes that, among other things, packages machine learning code for reuse. The AI marketplace and the Kubeflow pipelines are intended to accelerate development and deployment of AI applications, Google said Thursday (Nov. The new services follow related AI efforts such as expanding access to updated Tensor Processing Units (TPUs) on Google Cloud. The AI Hub is described as a community for accessing "plug-and-play" machine learning content.
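To illustrate what "packaging machine learning code for reuse" looks like in practice, here is a minimal sketch of a two-step Kubeflow pipeline using the kfp Python SDK (v1-style ContainerOp API). The container images, step names, and file paths are placeholders for illustration, not part of Google's announcement.

```python
import kfp
from kfp import dsl


@dsl.pipeline(name="demo-training-pipeline",
              description="Preprocess data, then train a model.")
def demo_pipeline():
    # Each step is a reusable container image; these image names are placeholders.
    preprocess = dsl.ContainerOp(
        name="preprocess",
        image="gcr.io/example-project/preprocess:latest",
        arguments=["--output", "/tmp/clean.csv"],
    )
    train = dsl.ContainerOp(
        name="train",
        image="gcr.io/example-project/train:latest",
        arguments=["--input", "/tmp/clean.csv"],
    )
    # Declare the dependency so training only starts after preprocessing finishes.
    train.after(preprocess)


if __name__ == "__main__":
    # Compile the pipeline into a package that can be uploaded to a Kubeflow cluster.
    kfp.compiler.Compiler().compile(demo_pipeline, "demo_pipeline.yaml")
```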


Mayo Clinic, Google show how they're deploying cloud-based AI to combat COVID-19

#artificialintelligence

One of the effects of the COVID-19 public health emergency is that it has added urgency and speed to technology transformations that were already underway, such as cloud migration and deployments of artificial intelligence and machine learning. Few places show that shift more clearly than Rochester, Minnesota-based Mayo Clinic, which six months before the pandemic arrived in the United States had embarked on a decade-long strategic partnership with Google Cloud. "Our partnership will propel a multitude of AI projects currently spearheaded by our scientists and physicians, and will provide technology tools to unlock the value of data and deliver answers at a scale much greater than today," said Mayo CIO Cris Ross at the time. Shortly after the partnership was announced, toward the end of 2019, the health system hired longtime CIO Dr. John Halamka as president of Mayo Clinic Platform, tasking him with leading a cloud-hosted, AI-powered digital transformation across the enterprise. In the months since, like the rest of the world, Mayo Clinic has found itself tested and challenged by the pandemic and its ripple effects, but it has also embraced the moment as an inflection point, an opportunity to push forward with an array of new use cases to drive quality improvement, improve efficiency, and boost the health of patients and populations in the years ahead.


Train your TensorFlow model on Google Cloud using TensorFlow Cloud

#artificialintelligence

Posted by Jonah Kohn and Pavithra Vijay, Software Engineers at Google. TensorFlow Cloud is a Python package that provides APIs for a seamless transition from debugging and training your TensorFlow code in a local environment to distributed training in Google Cloud.
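The package's central entry point is a run() call that packages local training code and submits it as a Google Cloud training job. The sketch below assumes a hypothetical local training script (train_model.py), a requirements.txt for its dependencies, and a Google Cloud project already set up for training; the exact defaults may differ by package version.

```python
import tensorflow_cloud as tfc

# Submit an existing local Keras training script to run on Google Cloud.
# "train_model.py" and "requirements.txt" are hypothetical local files.
tfc.run(
    entry_point="train_model.py",         # script containing the model definition and model.fit()
    requirements_txt="requirements.txt",  # extra pip dependencies the script needs
    distribution_strategy="auto",         # let TensorFlow Cloud pick a tf.distribute strategy
    chief_config="auto",                  # default machine configuration for the chief worker
    worker_count=0,                       # single-machine job for this sketch
)
```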


LF Edge Project Announces the Release of Fledge v1.8

#artificialintelligence

SAN FRANCISCO, Calif., July 31, 2020 – LF Edge, an umbrella organization within the Linux Foundation that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system, announced the maturation of its Fledge project, which has issued its 1.8 release and moved to the Growth Stage within the LF Edge umbrella. Fledge is an open source framework for the Industrial Internet of Things (IIoT), used to implement predictive maintenance, situational awareness, safety and other critical operations. Fledge v1.8 is the first release since the project moved to the Linux Foundation; however, it is the ninth release of the project's code, which has over 60,000 commits, averaging 8,500 commits per month. Concurrently, Fledge has matured into a Stage 2, or "Growth Stage," project within LF Edge.


Google claims its new TPUs are 2.7 times faster than the previous generation

#artificialintelligence

Google's fourth-generation tensor processing units (TPUs), whose existence wasn't publicly revealed until today, can complete AI and machine learning training workloads in close-to-record wall-clock time. That's according to the latest set of metrics released by MLPerf, the consortium of over 70 companies and academic institutions behind the MLPerf suite for AI performance benchmarking. It shows clusters of fourth-generation TPUs surpassing the capabilities of third-generation TPUs -- and even those of Nvidia's recently released A100 -- on object detection, image classification, natural language processing, machine translation, and recommendation benchmarks. Google says its fourth-generation TPU offers more than double the matrix multiplication TFLOPS of a third-generation TPU, where one TFLOPS is equivalent to 1 trillion floating-point operations per second. It also offers a "significant" boost in memory bandwidth while benefiting from unspecified advances in interconnect technology.
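For context on how training workloads typically target Cloud TPUs, here is a minimal TensorFlow 2.x sketch using tf.distribute.TPUStrategy. The TPU resource name and the toy model are placeholders, and the snippet illustrates general Cloud TPU usage rather than anything specific to the fourth-generation hardware or the MLPerf submissions described above.

```python
import tensorflow as tf

# Connect to a Cloud TPU; "my-tpu" is a placeholder for your own TPU resource name.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Build and compile the model inside the strategy scope so its variables live on the TPU.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# A subsequent model.fit(...) call would then run the training loop on the TPU cores.
```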


Get The Best From Your E-Commerce Platform With Recommendations AI

#artificialintelligence

With the continuing shift to digital, especially in the retail industry, ensuring a highly personalized shopping experience for online customers is crucial for establishing customer loyalty. In particular, product recommendations are an effective way to personalize the customer experience as they help customers discover products that match their tastes and preferences. Google has spent years delivering high-quality recommendations across our flagship products like YouTube and Google Search. Recommendations AI draws on that rich experience to give organizations a way to deliver highly personalized product recommendations to their customers at scale. Today, we are pleased to announce that Recommendations AI is now publicly available to all customers in beta.


What Is Google's AI Adoption Framework - Analytics India Magazine

#artificialintelligence

Google Cloud has recently launched its AI Adoption Framework whitepaper, authored by Donna Schut, Khalid Salama, Finn Toner, Barbara Fusinska, Valentine Fontama, and Lak Lakshmanan, to provide a guiding framework for enterprises to leverage the power of AI effectively. Google Cloud's AI Adoption Framework is built on four pillars of an organisation -- "people, process, technology, and data." The Google Cloud blog further noted that these four pillars should follow six critical themes -- "learn, lead, access, scale, automate, and secure" -- for AI success. According to Google, "These themes are foundational to the AI adoption framework." The first theme concerns the scale of 'learning' within an organisation, which includes upskilling existing employees, recruiting new talent, and augmenting analytics and engineering professionals with "experience partners." This learning process helps organisations decide which analytics and machine learning skills the business requires, so they can plan their hiring accordingly amid this crisis. Second comes 'leading', which concerns whether or not the leaders of an organisation provide enough support and guidance for data scientists and engineers to deploy machine learning and artificial intelligence in business projects. This step helps businesses understand the structure of the team, the cost of the projects, and the governance needed to encourage cross-functional collaboration in the organisation. Next is 'access' to data, where companies assess their data management strategies and whether analytics professionals are able to collect, share, discover, and analyse data and other ML artefacts.


Google signs up Verizon for its AI-powered contact center services – TechCrunch

#artificialintelligence

Google today announced that it has signed up Verizon as the newest customer of its Google Cloud Contact Center AI service, which aims to bring natural language recognition to the often inscrutable phone menus that many companies still use today (disclaimer: TechCrunch is part of the Verizon Media Group). For Google, that's a major win, but it's also a chance for the Google Cloud team to highlight some of the work it has done in this area. It's also worth noting that the Contact Center AI product is a good example of Google Cloud's strategy of packaging up many of its disparate technologies into products that solve specific problems. "A big part of our approach is that machine learning has enormous power but it's hard for people," Google Cloud CEO Thomas Kurian told me in an interview ahead of today's announcement. "Instead of telling people, 'well, here's our natural language processing tools, here is speech recognition, here is text-to-speech and speech-to-text -- and why don't you just write a big neural network of your own to process all that?' Very few companies can do that well. We thought that we can take the collection of these things and bring that as a solution to people to solve a business problem. And it's much easier for them when we do that and […] that it's a big part of our strategy to take our expertise in machine intelligence and artificial intelligence and build domain-specific solutions for a number of customers."


An Architecture for Artificial Intelligence Storage

#artificialintelligence

As we've talked about in the past, the focus on data – how much is being generated, where it's being created, the tools needed to take advantage of it, the shortage of skilled talent to manage it, and so on – is rapidly changing the way enterprises are operating both in the datacenter and in the cloud and is dictating many of the product roadmaps being developed by tech vendors. Automation, analytics, artificial intelligence (AI) and machine learning, and the ability to easily move applications and data between on-premises and cloud environments are the focus of much of what OEMs and other tech players are doing. And all of this is being accelerated by the COVID-19 pandemic, which is speeding up enterprise movement to the cloud and forcing companies to adapt to a suddenly widely distributed workforce, trends that won't be changing any time soon as the coronavirus outbreak tightens its grip, particularly in the United States. OEMs over the past several months have been particularly aggressive in expanding their offerings in the storage sector, which is playing a central role in helping enterprises bridge the space between the datacenter, the cloud, and the network edge and in dealing with the vast amounts of structured and – in particular – unstructured data being created. That can be seen in announcements that some of the larger vendors have made over the past few months.


Verizon taps Google Cloud for Contact Center AI

ZDNet

Google Cloud on Monday announced that Verizon is piloting its Contact Center AI technology to improve its customer experiences. The deal shows Google making progress in its broad plans to win over the telecommunications industry with its cloud and AI tools. Google's Contact Center AI software, which became generally available last November, enables businesses to deploy virtual agents for basic customer interactions. The service promises more intuitive customer support through natural-language recognition. When a customer contacts Verizon through a voice call or chat, they can simply say or type their request -- there's no need to follow menu prompts or option trees.
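Contact Center AI's virtual agents are built on Google's Dialogflow technology, so the underlying interaction can be sketched with the google-cloud-dialogflow Python client. The project ID, session ID, and customer utterance below are placeholders, and this is a general illustration of detecting an intent from free-form text, not a description of Verizon's actual deployment.

```python
from google.cloud import dialogflow_v2 as dialogflow

# Placeholders: your own GCP project and an arbitrary session identifier.
PROJECT_ID = "example-project"
SESSION_ID = "customer-session-123"

client = dialogflow.SessionsClient()
session = client.session_path(PROJECT_ID, SESSION_ID)

# The customer states a request in natural language instead of navigating a menu tree.
query_input = dialogflow.QueryInput(
    text=dialogflow.TextInput(text="I'd like to change my data plan",
                              language_code="en-US")
)
response = client.detect_intent(request={"session": session, "query_input": query_input})

print("Matched intent:", response.query_result.intent.display_name)
print("Agent reply:", response.query_result.fulfillment_text)
```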