Google I/O 2022, Google's largest developer conference, kicked off with a keynote speech from Alphabet CEO Sundar Pichai. The keynote included major announcements such as the launch of the Pixel Watch, updates on PaLM and LaMDA, and advances in AR and immersive technology. Let us look at the key highlights. "Recently we announced plans to invest USD 9.5 billion in data centers and offices across the US. One of our state-of-the-art data centers is in Mayes County, Oklahoma. I'm excited to announce that, there, we are launching the world's largest, publicly-available machine learning hub for our Google Cloud customers," Sundar Pichai said.
Google has revealed that five of its datacenters are now running almost entirely on carbon-free 'clean' electricity. According to Urs Hölzle, Google's senior vice president of cloud infrastructure, the search and ad giant buys power from more than 50 renewable energy projects with a combined capacity of 5.5 gigawatts, which equates to about one million solar rooftops. Google announced the update in line with this week's Earth Day on April 22, a campaign to raise awareness about environmental issues that has been running since 1970. Datacenters that power the cloud have become a major source of carbon dioxide (CO2) emissions, and tech giants have been showcasing their progress in making the infrastructure that delivers email, productivity apps, e-commerce and games less harmful to the environment. Google CEO Sundar Pichai today noted the company's existing goal of operating entirely carbon-free by 2030, rather than merely offsetting its carbon footprint. "Within a decade we aim for every Google data center, cloud region, and office campus to run on clean electricity every hour of every day," writes Pichai.
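A quick sanity check of the two figures quoted above shows they are mutually consistent: 5.5 gigawatts spread over one million rooftops implies a per-rooftop size in line with a typical residential solar installation. This is illustrative arithmetic only, not a figure from Google:

```python
# Figures quoted in the article:
TOTAL_CAPACITY_GW = 5.5          # contracted renewable capacity
ROOFTOP_EQUIVALENTS = 1_000_000  # "about one million solar rooftops"

# 1 GW = 1,000,000 kW, so capacity per rooftop in kW is:
kw_per_rooftop = TOTAL_CAPACITY_GW * 1_000_000 / ROOFTOP_EQUIVALENTS
print(f"Implied capacity per rooftop: {kw_per_rooftop:.1f} kW")
# A typical residential rooftop system is on the order of 5-6 kW,
# so the comparison in the article holds up.
```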
Google may be buying heaven only knows how many GPUs to run HPC and AI workloads on its eponymous public cloud, and it may have talked recently about its commitment to pushing the industry to innovate at the SoC level while staying out of designing its own compute engines, but the company is still building its own Tensor Processing Units, or TPUs for short, to support its TensorFlow machine learning framework and the applications it drives, both within Google and as a service for Google Cloud customers. If you were expecting a big reveal of the TPUv4 architecture from the search engine giant and machine learning pioneer at its Google I/O 2021 conference this week, you were no doubt, like us, sorely disappointed. In his two-hour keynote address, Google chief executive officer Sundar Pichai, who is also CEO of Google's parent company, Alphabet, only briefly discussed the forthcoming TPUv4, a custom ASIC designed by Google and presumably built by Taiwan Semiconductor Manufacturing Co. like every other leading-edge compute engine on Earth. As the name suggests, the TPUv4 chip is Google's fourth generation of bfloat16-crunching machine learning processors, which it weaves together with host systems and networking to create what amounts to a custom supercomputer. "This is the fastest system that we have ever deployed at Google – a historic milestone for us," Pichai explained in his keynote.
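The bfloat16 format these chips are built around keeps float32's 8-bit exponent (and thus its dynamic range) but only 7 mantissa bits, so a bfloat16 value is just a float32 with the low 16 bits of its bit pattern dropped. A minimal sketch of that conversion (illustrative only; it uses simple truncation, whereas real hardware typically rounds to nearest-even):

```python
import struct

def to_bfloat16(x: float) -> float:
    """Emulate bfloat16 by zeroing the low 16 bits of the float32 encoding."""
    # Reinterpret the float32 value as its 32-bit integer bit pattern.
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    # Keep sign (1 bit), exponent (8 bits), and top 7 mantissa bits.
    truncated = bits & 0xFFFF0000
    return struct.unpack("<f", struct.pack("<I", truncated))[0]

print(to_bfloat16(3.14159))  # 3.140625 — only ~3 significant decimal digits survive
print(to_bfloat16(1.0))      # 1.0 — powers of two are represented exactly
```

The design trade-off this illustrates is why bfloat16 suits training: range matters more than precision for gradients, and converting to and from float32 is a cheap bit-level truncation.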
Google CEO Sundar Pichai says his company's $2.5bn datacenter expansion across the US will create thousands of jobs in engineering, operations and sales. Google will be expanding its offices and datacenters in 14 states across the US, Pichai said in a blog post detailing a groundbreaking event for a new datacenter in Clarksville/Montgomery County, Tennessee. Google announced plans in 2015 to build the Tennessee datacenter in a former semiconductor manufacturing facility. The $2.5bn will go toward opening or expanding datacenters in Alabama, Oregon, Tennessee, Virginia and Oklahoma.
Google this week pushed back against claims by earlier research that large AI models can contribute significantly to carbon emissions. In a paper coauthored by Google AI chief scientist Jeff Dean, researchers at the company say that the choice of model, datacenter, and processor can reduce a model's carbon footprint by a factor of up to 100, and that "misunderstandings" about the model lifecycle contributed to "miscalculations" in impact estimates. Carbon dioxide, methane, and nitrous oxide levels are at their highest in the last 800,000 years. Together with other drivers, greenhouse gases likely catalyzed the global warming observed since the mid-20th century. It's widely believed that machine learning models, too, have contributed to this adverse environmental trend.
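The accounting framing behind such estimates is straightforward: operational emissions for a training run are roughly the accelerator energy, multiplied by the datacenter's power usage effectiveness (PUE), multiplied by the local grid's carbon intensity. A hedged sketch of how those choices compound, using illustrative numbers that are assumptions rather than figures from the paper:

```python
def training_co2e_kg(accelerator_kwh: float, pue: float,
                     grid_kg_co2e_per_kwh: float) -> float:
    """Rough operational CO2e (kg) for a training run:
    accelerator energy x datacenter overhead (PUE) x grid carbon intensity."""
    return accelerator_kwh * pue * grid_kg_co2e_per_kwh

# Same hypothetical workload (10 MWh at the accelerators) in two settings:
# an inefficient datacenter on a coal-heavy grid vs. an efficient one on
# a largely carbon-free grid. All parameter values are made up for illustration.
dirty = training_co2e_kg(10_000, pue=1.6, grid_kg_co2e_per_kwh=0.7)
clean = training_co2e_kg(10_000, pue=1.1, grid_kg_co2e_per_kwh=0.03)
print(f"Dirty setting: {dirty:,.0f} kg CO2e; clean setting: {clean:,.0f} kg CO2e")
print(f"Reduction factor from datacenter + grid choice alone: {dirty / clean:.0f}x")
```

Even before considering a more efficient model or processor, the siting factors alone produce a large multiplier, which is the shape of the argument the Google researchers make.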