Deloitte has launched the Deloitte Center for AI Computing, designed to accelerate the development of artificial intelligence offerings for its clients. The center is built on NVIDIA DGX A100 systems, creating a supercomputing architecture intended to help Deloitte's clients become AI-fueled organizations. The accelerated computing platform pairs NVIDIA graphics processing units with the company's networking and software for advanced data processing, analytics, and AI, bringing massive parallel processing capability and speed to deep learning, machine learning, and data science workloads, the company said. Deloitte's State of AI in the Enterprise survey found that more than half of respondents reported spending more than $20 million over the past year on AI technology and talent. Nearly all adopters said they were using AI to improve efficiency, while mature adopters are also harnessing the technologies to boost differentiation.
Leading-edge techniques like deep learning are quickly gaining traction as today's enterprises attempt to extract real-time insights from massive data volumes. However, many businesses looking to get started with deep learning may be unsure of how to acquire the tools and expertise required for success. New Centers of Excellence (CoEs) from Hewlett Packard Enterprise (HPE) and NVIDIA are addressing these key challenges and providing access to the technological tools and skills that will help customers in every industry better utilize these innovations. Many businesses today are striving to fully leverage all of their data as a rapidly expanding 'Internet of Things' generates massive amounts of data every day. It has become quite a task to analyze, classify, recognize, and categorize such large data volumes, not to mention convert them into actionable intelligence that can be used to drive competitive advantage.
The Global Artificial Intelligence for Healthcare Applications market report compiles thoroughly categorized information covering multi-faceted industry developments, with detailed references to market share, size, and revenue predictions, along with an overall regional outlook. The report offers a dependable overview of the competitive landscape, with detailed assessment of business verticals. Following a systematic research initiative and subsequent evaluation, the global Artificial Intelligence for Healthcare Applications market, tracking its past growth performance, is anticipated to deliver a strong ROI and is therefore likely to remain on a favorable growth curve in the coming years. This report on the global Artificial Intelligence for Healthcare Applications market organizes its information into primary and supplementary streams, represented in tables, pie charts, graphs, and the like to maximize reader understanding.
Demand for some of Nvidia's chips has been so hot that it has outpaced the company's ability to increase production, adding to chip-supply shortages riling the semiconductor industry. Nvidia's newest graphics cards were a holiday sensation, Chief Financial Officer Colette Kress said during an earnings call. She added that some inventories are likely to remain low in the first quarter even as Nvidia increases supply. "Throughout our supply chain, stronger demand globally has limited the availability of capacity and components," Ms. Kress said. President Biden on Wednesday signed an executive order directing a broad review of supply chains for semiconductors and other critical materials.
Nvidia on Wednesday published fourth quarter financial results above market expectations, with record revenue in both its Gaming and Data Center segments. Fourth quarter non-GAAP earnings per diluted share were $3.10 on revenue of $5 billion, up 61 percent year-over-year. Analysts were expecting earnings of $2.81 on revenue of $4.82 billion. For the full fiscal year, non-GAAP earnings per diluted share were $10. Revenue was a record $16.68 billion, up 53 percent.
Nvidia reported revenues of $5.0 billion for its fourth fiscal quarter ended January 31, up 61% from a year earlier. The revenues and non-GAAP earnings per share of $3.10 beat expectations as new gaming hardware and AI products generated strong demand. A year ago, Nvidia reported non-GAAP earnings per share of $1.89 on revenues of $3.1 billion. The Santa Clara, California-based company makes graphics processing units (GPUs) that can be used for games, AI, and datacenter computing. While many businesses have been hit hard by the pandemic, Nvidia has seen a boost in those areas.
Perhaps in no other technology have there been so many decades of large year-over-year improvements as in computing. It is estimated that a third of all productivity increases in the U.S. since 1974 have come from information technology [4], making it one of the largest contributors to national prosperity. The rise of computers is due to technical successes, but also to the economic forces that financed them. Bresnahan and Trajtenberg [3] coined the term general purpose technology (GPT) for products, like computers, that have broad technical applicability and whose product improvement and market growth can fuel each other for many decades. But they also predicted that GPTs could run into challenges at the end of their life cycle: as progress slows, other technologies can displace the GPT in particular niches and undermine this economically reinforcing cycle. We are observing such a transition today as improvements in central processing units (CPUs) slow, and applications move to specialized processors, for example, graphics processing units (GPUs), which can do fewer things than traditional universal processors but perform those functions better. Many high-profile applications are already following this trend, including deep learning (a form of machine learning) and Bitcoin mining. With this background, we can now be more precise about our thesis: "The Decline of Computers as a General Purpose Technology." We do not mean that computers, taken together, will lose technical abilities and thus 'forget' how to do some calculations.
Good news for folks looking to learn about the latest AI development techniques: Nvidia is now allowing the general public to access the online workshops it provides through its Deep Learning Institute (DLI). The GPU giant announced today that selected workshops in the DLI catalog will be open to everybody. These workshops were previously available only to companies that wanted specialized training for their in-house developers, or to folks who had attended the company's GPU Technology Conferences. Two of the open courses will take place next month, including "Fundamentals of Accelerated Computing with CUDA Python," which explores developing parallel workloads with CUDA and NumPy and costs $500. There is also "Applications of AI for Predictive Maintenance," which explores technologies like XGBoost, LSTM, Keras, and TensorFlow, and costs $700.
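To give a flavor of the kind of data-parallel workload a CUDA Python course covers, here is a minimal sketch in plain NumPy: the element-wise operation below is exactly the pattern a GPU kernel (for example, one written with Numba's `@cuda.jit`) would distribute across thousands of threads. The function name and data are illustrative assumptions, not taken from Nvidia's course materials, and the example runs on the CPU only.

```python
import numpy as np

def saxpy(a, x, y):
    """Compute a*x + y element-wise, the canonical data-parallel example.

    Each output element depends only on the corresponding input elements,
    so on a GPU every element could be computed by an independent thread.
    Here NumPy's vectorized arithmetic stands in for the parallel kernel.
    """
    return a * x + y

# Example data: one million elements, the scale where GPUs start to pay off.
n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.full(n, 2.0, dtype=np.float32)

result = saxpy(np.float32(3.0), x, y)  # every element is 3*1 + 2 = 5
```

Because the computation has no dependencies between elements, moving it to a GPU is mostly a matter of mapping array indices to threads, which is the core idea such a workshop teaches.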
Google and Apple are following through on their promise to play nice with each other. After announcing in December that the Apple TV app would come to Chromecasts with Google TV, the companies have shared today that the service is now globally available. With the integration, you'll not only be able to use Apple's app from your Chromecast's interface, but you can also access your purchased films and shows, as well as personalized suggestions without having to cast your iPhone to the TV. In the US, Google TV users will see Apple Originals in their personalized recommendations and search results. You'll also be able to ask the Google Assistant to open the Apple app or play one of its exclusive titles.
Yet according to a new paper, it may be the secret sauce for an entirely new kind of computer--one that combines quantum mechanics with the brain's inner workings. The result isn't just a computer with the ability to learn. The mechanisms that allow it to learn are directly embedded in its hardware structure--no extra AI software required. The computer model also simulates how our brains process information, using the language of neuron activity and synapses rather than the churning silicon-based CPUs in our current laptops. The main trick relies on the quantum spin properties of cobalt atoms.