At its annual developer conference, Ignite, Microsoft on Wednesday unveiled the long-anticipated custom cloud computing chip for its Azure cloud service, called Azure Maia 100, which it said is optimized for tasks such as generative AI. The Maia 100 is the first in a series of Maia accelerators for AI, the company said. With 105 billion transistors, it is "one of the largest chips on 5-nanometer process technology," said Microsoft, referring to the size of the smallest features of the chip, five billionths of a meter. In addition, the company introduced its first microprocessor built in-house for cloud computing, the Azure Cobalt 100.
If you want to run artificial intelligence (AI) and machine learning applications such as large language models (LLMs) at scale, you must run them on Kubernetes. That's where the Kubernetes AI toolchain operator -- the latest addition to Microsoft's Azure Kubernetes Service (AKS) -- comes in. AKS already makes Kubernetes on Azure easier. Instead of working it out by hand, AKS's built-in code-to-cloud pipelines and guardrails give you a faster way to start developing and deploying cloud-native apps in Azure. With its unified management and governance for on-premises, edge, and multi-cloud Kubernetes clusters, AKS also makes it simpler (there's no such thing as "simple" when it comes to Kubernetes) to integrate with Azure security, identity, cost management, and migration services.
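To make the idea concrete, here is a minimal sketch of how an LLM workload might be submitted to such a cluster programmatically using the Kubernetes Python client. The custom resource group, version, kind, and spec fields below are illustrative placeholders rather than the operator's documented schema, and the GPU SKU and model preset are assumptions; check the CRDs installed on your cluster before relying on any of these names.

```python
# Minimal sketch: submitting an LLM inference workload to an AKS cluster
# through a custom resource handled by an AI toolchain operator.
# The group/version/kind/plural and spec fields are placeholders, not a
# documented API; the GPU SKU and model preset are assumptions.
from kubernetes import client, config


def deploy_llm_workspace(name: str, namespace: str = "default") -> dict:
    """Create a hypothetical 'Workspace' custom resource asking the operator
    to provision GPU capacity and serve a preset model."""
    config.load_kube_config()  # assumes kubeconfig already points at the AKS cluster
    api = client.CustomObjectsApi()

    workspace = {
        "apiVersion": "example.kaito.io/v1alpha1",  # placeholder group/version
        "kind": "Workspace",                        # placeholder kind
        "metadata": {"name": name, "namespace": namespace},
        "spec": {
            "resource": {"instanceType": "Standard_NC24ads_A100_v4"},  # example GPU SKU
            "inference": {"preset": "llama-2-7b"},                     # example model preset
        },
    }

    return api.create_namespaced_custom_object(
        group="example.kaito.io",  # must match the placeholder apiVersion above
        version="v1alpha1",
        namespace=namespace,
        plural="workspaces",
        body=workspace,
    )


if __name__ == "__main__":
    deploy_llm_workspace("llm-demo")
```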
There is no quick fix for closing this expectation-reality gap, but the first step is to foster honest dialogue between teams. Then, business leaders can begin to democratize ML across the organization. Democratization means both technical and non-technical teams have access to powerful ML tools and are supported with continuous learning and training. Non-technical teams get user-friendly data visualization tools to improve their business decision-making, while data scientists get access to the robust development platforms and cloud infrastructure they need to efficiently build ML applications. At Capital One, we've used these democratization strategies to scale ML across our entire company of more than 50,000 associates.
In the digital age, the cloud has become a cornerstone of many organizations' operations. However, relying on one cloud provider for multiple business functions can be a risky choice, according to a new Gartner survey. For the second quarter in a row, Gartner identified cloud concentration as a top five emerging risk for organizations. The results were based on a survey that asked 294 risk executives about their views on emerging risk.
Modern banking is a far cry from the analog processes of yesteryear. Today's smartphone-wielding customers demand hyper-personalized transactions woven seamlessly into everyday life, one-to-one personal service with their data at agents' fingertips, and instantaneous financial insights--and some are even pressing for features like blockchain integration and support for digital currencies. But the ability to serve these customers is not assured: Gartner predicts that by 2025 more than 85% of organizations will move forward with cloud principles, but will not yet be able to fully use cloud-native architectures and technologies. These tools will be key to banks' ability to move to digital ecosystem platforms and develop new services, partner with other players, work effectively with colleagues, and meet customer expectations. Financial institutions are under pressure to future-proof and accommodate emerging technologies such as artificial intelligence (AI), machine learning (ML), and cloud computing--and they are also facing significant infrastructural strains.
The Asus Chromebook Plus CX34 offers reliable performance, a 1080p webcam, and a stunning design at a reasonable price point. What more could you ask for? Back in the day, Chromebooks were nothing more than low-powered machines that ran a web browser as your operating system, an extension of Google Chrome if you will. They were durable and largely virus-free, making them a popular choice in the educational market. Nowadays, they're capable of cloud gaming, which I never thought I'd see.
This article is from Big Technology, a newsletter by Alex Kantrowitz. On a Saturday night in mid-September, a senior Google engineer shared some rough news with more than 50 colleagues. Part of the company's cloud services offering was failing Anthropic, a darling A.I. startup and key strategic customer, and they'd have to work overtime to fix it. To repair the faulty part of its service--an underperforming and unstable NVIDIA H100 cluster--Google Cloud leadership initiated a seven-day-per-week sprint for the next month. The downside of not making it work, the senior engineer said, was "too large, for Anthropic (most importantly), for Google Cloud, and for Google," according to documents I reviewed.
A massive leak from the FTC v. Microsoft court battle showed Microsoft's roadmap for a mid-generation Xbox Series X console, but that wasn't the only news. The same document also revealed Microsoft's tentative plans for the next-generation Xbox -- what it calls a "hybrid game platform." The system would combine local hardware and cloud computing to create an "immersive game & app platform" arriving around 2028, according to a leaked May 2022 presentation hidden inside another PDF. "Our vision: Develop a next generation hybrid game platform capable of leveraging the combined power of the client and cloud to deliver deeper immersion and entirely new classes of game experiences," one of the slides reads. "Optimized for real time game play and creators, we will enable new levels of performance beyond the capabilities of the client hardware alone."
At this week's big Google Cloud conference, Google Next '23, executives at the tech giant made numerous announcements and forward-looking statements to help developers and IT managers plan for future cloud buildouts and innovations. The ZDNET editorial team deployed en masse to explore the wide range of announcements, which you can find here on ZDNET. My task was to look at Google's work in the areas of developer, data, and AI cloud. Here, Google is doing a lot to empower developers to build out the next generation of cloud-based applications and incorporate generative AI capabilities, where appropriate. Back in May, at Google I/O, Google announced a new capability called Duet AI for developers.