NVIDIA - Knowing the power of Artificial Intelligence

#artificialintelligence

NVIDIA is a technology company that specializes in the design and manufacturing of graphics processing units (GPUs) and system-on-a-chip (SoC) units for the gaming and professional markets. The company was founded in 1993 and is headquartered in Santa Clara, California. NVIDIA's GPUs are widely used in gaming PCs, workstations, and data centers for tasks such as machine learning, data analytics, and scientific simulation. The company also makes products such as Tegra mobile processors and the Jetson embedded platform, and it develops the CUDA parallel computing platform, the proprietary architecture that lets its GPUs process large amounts of data in parallel.
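
To make the parallel-processing idea concrete, here is a minimal sketch of CUDA-style data parallelism using CuPy, a GPU array library. CuPy is our choice for illustration, not something the article mentions, and the snippet assumes a CUDA-capable GPU is present:

```python
# Minimal sketch of GPU data parallelism (illustrative; assumes a CUDA GPU
# and the cupy package, which is not mentioned in the article).
import cupy as cp

# Allocate 100 million floats directly in GPU memory.
x = cp.random.random(100_000_000).astype(cp.float32)

# Each element-wise operation below runs as thousands of parallel GPU
# threads, which is the core idea behind CUDA's parallel-processing model.
y = cp.sqrt(x) * 2.0 + 1.0

print(float(y.mean()))  # copy a single scalar result back to the host
```

The same NumPy-style expression that would loop over elements on a CPU is dispatched across the GPU's many cores, which is what makes these chips useful for machine learning and simulation workloads.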


Cyberpunk 2077 levels up frame rates with Nvidia's DLSS 3

PCWorld

It's been a little over two years since CD Projekt Red's sci-fi opus landed, and the general consensus is that, yeah, it's pretty much fixed now. While Cyberpunk 2077 was plagued with bugs and performance issues at launch, the developer has worked tirelessly to improve it. The latest update adds support for Nvidia's DLSS 3, which works with the latest GeForce cards to dramatically improve frame rates even with demanding effects enabled. For the uninitiated, DLSS stands for Deep Learning Super Sampling. It's an incredibly complex technique that combines custom hardware processing with hundreds or thousands of hours of graphics analysis for each game.
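
As a rough conceptual sketch of what "super sampling" means here: render fewer pixels, then upscale. The toy PyTorch snippet below does the naive version with plain bicubic interpolation; real DLSS replaces that interpolation with a proprietary trained network running on tensor cores and fed with motion vectors (PyTorch and the frame sizes are illustrative assumptions, not part of DLSS):

```python
# Conceptual sketch of the super-sampling idea behind DLSS: render at low
# resolution, then upscale. Real DLSS replaces the bicubic filter below
# with a trained neural network; this is only the naive baseline.
import torch
import torch.nn.functional as F

# A stand-in 1280x720 rendered frame: (batch, RGB channels, height, width).
low_res_frame = torch.rand(1, 3, 720, 1280)

# Upscale to 4K. DLSS's advantage is that a learned model reconstructs
# detail that simple bicubic filtering cannot recover.
high_res_frame = F.interpolate(
    low_res_frame, size=(2160, 3840), mode="bicubic", align_corners=False
)

print(high_res_frame.shape)  # torch.Size([1, 3, 2160, 3840])
```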


Nvidia's New Eye Contact Feature Will Make You Question Reality

#artificialintelligence

Nvidia has released a new camera feature called Eye Contact that edits a live video feed so it looks like the subject is always looking into the camera, even when they aren't. The feature, part of a new Nvidia Broadcast update, is intended for content creators who want to maintain eye contact with the camera at all times. In one demo, scenes from Jurassic Park are edited with Eye Contact so that the actors are always looking at the camera. The feature is yet another example of AI becoming easily usable by everyday consumers.


Nvidia CEO says AI will need regulation, social norms

#artificialintelligence

STOCKHOLM, Sweden – Nvidia Chief Executive Officer Jensen Huang said on Tuesday, January 24, that the burgeoning field of artificial intelligence (AI) will create powerful tools that require legal regulation and social norms that have yet to be worked out. Huang is one of the most prominent figures in artificial intelligence because Nvidia's chips are widely used in the field, including in a supercomputer that Microsoft Corp built for startup OpenAI, in which Microsoft said Monday it was making a multibillion-dollar investment. Huang was speaking at an event in Stockholm, where officials said Tuesday they were upgrading Sweden's fastest supercomputer using Nvidia tools to, among other things, develop a large language model that will be fluent in Swedish. "Remember, if you take a step back and think about all of the things in life that are either convenient, enabling or wonderful for society, it also has probably some potential harm," Huang said. Lawmakers such as Ted Lieu, a Democrat from California in the U.S. House of Representatives, have called for the creation of a US federal agency that would regulate AI. In an opinion piece in the New York Times on Monday, Lieu argued that systems such as the facial recognition used by law enforcement agencies can misidentify innocent people from minority groups.


The 10 Best Shows on Apple TV+ Right Now

WIRED

Slowly but surely, Apple TV+ is finding its feet. The streaming service, which at launch we called "odd, angsty, and horny as hell," has evolved into a diverse library of dramas, documentaries, and comedies. It's also fairly cheap compared to services like Netflix, and Apple often throws in three free months when you buy a new iPhone, iPad, Mac, or Apple TV. Curious but don't know where to get started? Below are our picks for the best shows on the service.


What Is AI Computing?

#artificialintelligence

Mathematical instruments mark the history of human progress. They've enabled trade, helped navigate oceans, and advanced understanding and quality of life. The latest tool propelling science and industry is AI computing: the math-intensive process of running machine learning algorithms, typically on accelerated systems and software. It can extract fresh insights from massive datasets, learning new skills along the way. It's the most transformational technology of our time because we live in a data-centric era, and AI computing can find patterns no human could.
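
As a hedged toy example of that "math-intensive process," the PyTorch sketch below trains a tiny linear model by gradient descent; the same tensor math runs on a CPU or, if one is available, an accelerator, which is exactly the work GPUs speed up. The model, data, and sizes are invented for illustration:

```python
# Toy sketch of AI computing: repeated tensor math (matrix multiplies,
# gradients) is the workload accelerators exist to speed up.
# Model, data, and sizes here are invented for illustration.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Synthetic regression data and a linear model trained by gradient descent.
X = torch.randn(10_000, 64, device=device)
true_w = torch.randn(64, 1, device=device)
y = X @ true_w + 0.01 * torch.randn(10_000, 1, device=device)

w = torch.zeros(64, 1, device=device, requires_grad=True)
optimizer = torch.optim.SGD([w], lr=0.1)

for step in range(200):
    loss = ((X @ w - y) ** 2).mean()  # the "math-intensive" part
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final loss on {device}: {loss.item():.6f}")
```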


IBM goes big on quantum-computer hardware, software

#artificialintelligence

IBM has rolled out its most powerful quantum-computing system so far: the Osprey, a 433-qubit machine with more than three times the qubits of its current 127-qubit Eagle system. It also reiterated its plan to ship a 1,121-qubit processor, called Condor, in 2023. At the IBM Quantum Summit 2022, the company said it is continuing development of a modular quantum platform called System Two, which will combine multiple processors into a single system with new communication links and, IBM says, use hybrid-cloud middleware to integrate quantum and classical workflows. In addition, IBM said it will continue to prototype quantum software applications for specific use cases; by 2025, it expects developers to be able to explore quantum machine-learning applications and more. Big Blue is following the quantum roadmap it laid out earlier this year, which sets these long-term goals.
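
As a small, hedged illustration of the quantum circuits these machines run, here is a minimal two-qubit entanglement example in Qiskit, IBM's open-source SDK. It is only a building block and says nothing about Osprey-scale workloads:

```python
# Minimal Qiskit sketch: a two-qubit Bell-state circuit, the kind of
# primitive that scales up on machines like Eagle and Osprey.
# (Illustrative only; not a 433-qubit workload.)
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 in superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

# Simulate classically here; real runs would target IBM hardware via
# its cloud services.
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # ~{'00': 0.5, '11': 0.5}
```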


The Morning After: NVIDIA's GeForce Now Ultimate is a high-end cloud gaming service

Engadget

While Google shuttered Stadia for good this week, other cloud gaming services are expanding their offerings. NVIDIA is upgrading its GeForce Now service with a bunch of features, thanks to the addition of new SuperPODs equipped with RTX 4080 GPUs. This seems to be the first truly high-end cloud gaming experience. The renamed Ultimate plan now supports refresh rates of up to 240Hz at full HD, 4K at 120 fps, and an expanded set of widescreen resolutions (3,840x1,600, 3,440x1,440 and 2,560x1,080). NVIDIA is also adding better HDR support on both Macs and PCs, along with the ability to use full ray tracing with DLSS 3 in supported games.


Nvidia shows how surprisingly hard it is for a robot to pick up a chicken wing

ZDNet

It's football playoff season in America, a time when it's all too easy to pick up and chow down on a chicken wing without much thought. It turns out, however, that the robot that helped prepare your chicken had to put in a great deal of effort to pick up that piece of meat. Nvidia on Thursday showcased how the Massachusetts-based startup Soft Robotics is deploying its technology, including on-site GPUs and the Isaac Sim robotics simulation toolkit, to make it easier to deploy robots designed to handle foods like chicken wings. Food processing and packaging plants may seem like an obvious place to deploy robots, but foods like chicken wings are quickly moved across conveyor belts as they're uniformly cooked and prepared for consumption, leaving robots little time to grasp them reliably.


How Nvidia's CUDA Monopoly In Machine Learning Is Breaking - OpenAI Triton And PyTorch 2.0

#artificialintelligence

Over the last decade, the landscape of machine learning software development has undergone significant changes. Many frameworks have come and gone, but most have relied heavily on Nvidia's CUDA and performed best on Nvidia GPUs. However, with the arrival of PyTorch 2.0 and OpenAI's Triton, Nvidia's dominant position in this field, owed mainly to its software moat, is being disrupted. This report touches on why Google's TensorFlow lost out to PyTorch, why Google hasn't been able to capitalize publicly on its early leadership in AI, the major components of machine-learning model training time, the memory capacity/bandwidth/cost wall, model optimization, why other AI hardware companies haven't been able to make a dent in Nvidia's dominance so far, why hardware will start to matter more, how Nvidia's competitive advantage in CUDA is being wiped away, and a major win one of Nvidia's competitors has landed at a large cloud for training silicon. The 1,000-foot summary is that the default software stack for machine learning models will no longer be Nvidia's closed-source CUDA. The ball was in Nvidia's court, and it let OpenAI and Meta take control of the software stack. That ecosystem built its own tools because of Nvidia's failure with its proprietary tools, and now Nvidia's moat will be permanently weakened.
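
The PyTorch 2.0 shift the report describes shows up in a single API call: torch.compile captures the model graph and, by default, lowers it through the TorchInductor backend, which emits Triton kernels instead of hand-written CUDA. A minimal sketch (the model and shapes are made up for illustration):

```python
# Sketch of the PyTorch 2.0 workflow discussed above: torch.compile
# captures the model graph and, by default, lowers it through
# TorchInductor, which generates Triton kernels rather than CUDA ones.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 10),
)

compiled_model = torch.compile(model)  # one-line opt-in to the new stack

x = torch.randn(32, 128)
out = compiled_model(x)  # first call triggers compilation, then reuses it
print(out.shape)         # torch.Size([32, 10])
```

The significance for the report's thesis is that the kernel-generation step no longer has to go through Nvidia's closed-source toolchain, which is exactly where the moat is eroding.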