NVIDIA RTX 5090 outperforms AMD and Apple running local OpenAI language models

PCWorld 

Developers and creatives who want greater control and privacy over their AI are increasingly turning to locally run models like OpenAI's new gpt-oss family, which is both lightweight and remarkably capable on end-user hardware. Indeed, the smaller gpt-oss-20b model can run on consumer GPUs with just 16GB of memory. That opens the door to a wide range of hardware, with NVIDIA GPUs emerging as the best way to run these sorts of open-weight models. As nations and companies rush to develop bespoke AI solutions for a range of tasks, open-source and open-weight models like gpt-oss-20b are finding much broader adoption.