Hardware


The 10 Best Shows on Apple TV+ Right Now

WIRED

Slowly but surely, Apple TV+ is finding its feet. The streaming service, which at launch we called "odd, angsty, and horny as hell," has evolved into a diverse library of dramas, documentaries, and comedies. It's also fairly cheap compared with services like Netflix, and Apple often throws in three free months when you buy a new iPhone, iPad, Mac, or Apple TV. Curious but not sure where to start? Below are our picks for the best shows on the service.


What Is AI Computing?

#artificialintelligence

Mathematical instruments mark the history of human progress. They've enabled trade, helped navigate oceans, and advanced understanding and quality of life. The latest tool propelling science and industry is AI computing: the math-intensive process of running machine learning algorithms, typically on accelerated systems and software. It can extract fresh insights from massive datasets, learning new skills along the way. It's the most transformational technology of our time because we live in a data-centric era, and AI computing can find patterns no human could.
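The "accelerated systems" in question are usually GPUs. A minimal sketch of what that looks like in practice, assuming only that PyTorch is installed (the device fallback keeps it runnable without a GPU):

```python
import torch

# Run the same dense math on a GPU when one is available,
# falling back to the CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Two large random matrices: the kind of math-intensive workload
# that dominates machine learning.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

c = a @ b  # one matmul: ~2 * 4096^3, roughly 137 billion floating-point ops
print(device, c.shape)
```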


IBM goes big on quantum-computer hardware, software

#artificialintelligence

IBM has rolled out its most powerful quantum-computing system so far: the Osprey, a 433-qubit machine with more than three times as many qubits as its current 127-qubit Eagle system. It also reiterated its plan to ship a 1,121-qubit processor, called Condor, in 2023. At the IBM Quantum Summit 2022, the company said it is continuing development of a modular quantum platform called System Two that will combine multiple processors into a single system, with new communication links that IBM says will use hybrid-cloud middleware to integrate quantum and classical workflows. In addition, IBM said it will continue to prototype quantum software applications for specific use cases; by 2025, developers will be able to explore quantum machine-learning applications and more. Big Blue is following the quantum roadmap it laid out earlier this year, which set these long-term goals.
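For a concrete sense of what such a workflow looks like at the software level, here is a minimal sketch using Qiskit, IBM's open-source quantum SDK. It assumes only `pip install qiskit`; submitting to real IBM hardware would go through the separate qiskit-ibm-runtime service.

```python
# Build a two-qubit Bell-state circuit and simulate it classically,
# the "classical" side of a hybrid quantum/classical workflow.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expect {'00': 0.5, '11': 0.5}
```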


The Morning After: NVIDIA's GeForce Now Ultimate is a high-end cloud gaming service

Engadget

While Google shuttered Stadia for good this week, other cloud gaming services are expanding their offerings. NVIDIA is upgrading its GeForce Now service with a raft of features, thanks to new SuperPODs equipped with RTX 4080 GPUs. This looks to be the first truly high-end cloud gaming experience. The renamed Ultimate plan now supports refresh rates of up to 240Hz at full HD, or 4K at 120fps, plus an expanded set of usable widescreen resolutions (3,840x1,600, 3,440x1,440, and 2,560x1,080). NVIDIA is also adding better HDR support on both Macs and PCs, along with the ability to use full ray tracing with DLSS 3 in supported games.
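As a back-of-the-envelope illustration of what those modes mean in raw pixel throughput (pre-encode; actual streaming bandwidth depends on the video codec, and the refresh rate assumed for the ultrawide mode below is our own, not NVIDIA's spec):

```python
# Raw pixels per second for the listed Ultimate-tier modes.
# The ultrawide refresh rate is an assumption for illustration.
modes = {
    "1080p @ 240Hz": (1920, 1080, 240),
    "4K @ 120fps": (3840, 2160, 120),
    "3840x1600 @ 120Hz (assumed)": (3840, 1600, 120),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e9:.2f} gigapixels/s")
```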


Nvidia shows how surprisingly hard it is for a robot to pick up a chicken wing

ZDNet

It's football playoff season in America, a time when it's all too easy to pick up and chow down on a chicken wing without much thought. It turns out, however, that the robot that helped prepare your chicken had to put in a great deal of effort to pick up that piece of meat. Nvidia on Thursday showcased how the Massachusetts-based startup Soft Robotics is deploying its technology, including on-site GPUs and the Isaac Sim robotics simulation toolkit, to make it easier to deploy robots designed to handle foods like chicken wings. Food processing and packaging plants may seem like an obvious place to deploy robots: foods like chicken wings are quickly moved across conveyor belts as they're uniformly cooked and prepared for consumption.
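The hard part is that a wing is deformable and every pile looks different, which is why grasps get rehearsed in simulation first. A purely hypothetical sketch of that sample-and-score pattern (plain Python, not Isaac Sim's actual API; `simulate_grasp` is an illustrative stand-in for a physics rollout):

```python
import random

def simulate_grasp(pose):
    """Stand-in for a simulated physics rollout that returns a success score.
    A real system would run the gripper against a soft-body model."""
    x, y, theta = pose
    return 1.0 - abs(x) - abs(y) - 0.1 * abs(theta) + random.uniform(-0.05, 0.05)

# Sample many candidate gripper poses over the object and keep the best.
candidates = [
    (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-3.14, 3.14))
    for _ in range(1000)
]
best = max(candidates, key=simulate_grasp)
print("best candidate pose (x, y, theta):", best)
```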


How Nvidia's CUDA Monopoly In Machine Learning Is Breaking - OpenAI Triton And PyTorch 2.0

#artificialintelligence

Over the last decade, the landscape of machine learning software development has undergone significant changes. Many frameworks have come and gone, but most have relied heavily on Nvidia's CUDA and performed best on Nvidia GPUs. However, with the arrival of PyTorch 2.0 and OpenAI's Triton, Nvidia's dominant position in this field, built mainly on its software moat, is being disrupted. This report touches on why Google's TensorFlow lost out to PyTorch, why Google hasn't been able to capitalize publicly on its early leadership in AI, the major components of machine learning model training time, the memory capacity/bandwidth/cost wall, model optimization, why other AI hardware companies haven't yet been able to dent Nvidia's dominance, why hardware will start to matter more, how Nvidia's competitive advantage in CUDA is being wiped away, and a major win one of Nvidia's competitors has scored at a large cloud for training silicon. The 1,000-foot summary is that the default software stack for machine learning models will no longer be Nvidia's closed-source CUDA. The ball was in Nvidia's court, and it let OpenAI and Meta take control of the software stack. That ecosystem built its own tools because of Nvidia's failure with its proprietary tools, and now Nvidia's moat will be permanently weakened.
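Both pieces of that shift are public APIs today. A minimal sketch of each, assuming torch>=2.0 and triton are installed (the Triton launch additionally needs a CUDA GPU):

```python
import torch

# PyTorch 2.0: torch.compile traces the model (TorchDynamo) and codegens
# fused kernels (TorchInductor), emitting Triton on NVIDIA GPUs rather
# than relying on hand-written CUDA libraries for many ops.
model = torch.nn.Linear(512, 512)
compiled = torch.compile(model)
out = compiled(torch.randn(8, 512))

# OpenAI Triton: a GPU kernel written in Python. This is the canonical
# vector-add example from the Triton tutorials.
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n, BLOCK: tl.constexpr):
    pid = tl.program_id(axis=0)
    offs = pid * BLOCK + tl.arange(0, BLOCK)
    mask = offs < n
    x = tl.load(x_ptr + offs, mask=mask)
    y = tl.load(y_ptr + offs, mask=mask)
    tl.store(out_ptr + offs, x + y, mask=mask)

if torch.cuda.is_available():
    n = 1 << 20
    x = torch.randn(n, device="cuda")
    y = torch.randn(n, device="cuda")
    result = torch.empty_like(x)
    add_kernel[(triton.cdiv(n, 1024),)](x, y, result, n, BLOCK=1024)
    assert torch.allclose(result, x + y)
```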


Nvidia, Evozyne create generative AI model for proteins - IT-Online

#artificialintelligence

Using a pretrained AI model from Nvidia, startup Evozyne has created two proteins with significant potential in healthcare and clean energy. A joint paper released today describes the process and the biological building blocks it produced. One aims to cure a congenital disease; the other is designed to consume carbon dioxide to help reduce global warming. Initial results show a new way to accelerate drug discovery and more. "It's been really encouraging that even in this first round the AI model has produced synthetic proteins as good as naturally occurring ones," says Andrew Ferguson, Evozyne's co-founder and a co-author of the paper.
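As a purely hypothetical illustration of how a generative sequence model proposes candidate proteins (this is not Evozyne's or NVIDIA's actual model; the uniform stand-in below only shows the shape of the sampling loop):

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

class UniformProteinLM:
    """Stand-in for a pretrained model. A real model would return learned,
    context-dependent probabilities for the next residue."""
    def next_residue_probs(self, prefix):
        return [1.0] * len(AMINO_ACIDS)

def sample_sequence(model, length=60):
    """Autoregressively sample a candidate amino-acid sequence."""
    seq = ""
    for _ in range(length):
        probs = model.next_residue_probs(seq)
        seq += random.choices(AMINO_ACIDS, weights=probs)[0]
    return seq

print(sample_sequence(UniformProteinLM()))
```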


Nvidia unveils new AI workflows to help the retail industry with loss prevention • TechCrunch

#artificialintelligence

In a recent episode of "Customer Wars," a woman put a chainsaw down her pants in an effort to steal it. And you might have caught the video of thieves stealing from Ulta. Videos like this are all over the internet, and the retail industry is reporting that theft continues to rise. Target attributed hundreds of millions of dollars in profit losses in 2022 to organized retail theft, while Walmart recently said increased thefts may result in higher prices and store closures. "Shrinkage" is the term retailers use to describe losses due to product theft, damage, or misplacement.


Nvidia builds an AI cloud platform for power users and digital novices

MIT Technology Review

In this installment of "The cloud hub: From cloud chaos to clarity," Shanker Trivedi, Nvidia's head of enterprise business, explains why he believes AI is shaping up to be the greatest technology force of our time. He describes how the company is combining its world-class hardware and robust developer community to build a cloud-based AI platform for power users and digital novices alike.

