rtx 3080
Can A Gamer Train A Mathematical Reasoning Model?
While large language models (LLMs) have achieved remarkable performance on a variety of tasks, including mathematical reasoning, their development typically demands prohibitive computational resources. Recent advances have reduced the cost of training capable models, yet even these approaches rely on high-end hardware clusters. In this paper, we demonstrate that a single average gaming GPU can train a solid mathematical reasoning model by integrating reinforcement learning and memory optimization techniques. Specifically, we train a 1.5B-parameter mathematical reasoning model on an RTX 3080 Ti with 16GB of memory that achieves comparable or better performance on mathematical reasoning benchmarks than models several times its size, despite the resource-constrained environment. Our results challenge the paradigm that state-of-the-art mathematical reasoning necessitates massive infrastructure, democratizing access to high-performance AI research. https://github.com/shinandrew/YouronMath.
Homebrew RTX 3080 passive cooler is a 23 pound copper block
If you want any kind of power in a modern PC build, you're going to have to move some air around at some point. That's one of the fundamentals of computer hardware: Barring a super-portable laptop or a teeny-tiny desktop the size of a Rubik's cube, you need fans to cool your components. Unless, of course, you're ready to strap about 25 pounds of metal to it and turn it into a very expensive radiator. Just ask this Reddit user, who's attempting to passively cool an Nvidia RTX 3080. The post outlines one person's attempt to get a high-end graphics card to operate without any kind of moving parts, removing the stock cooling shroud and replacing it with a Frankenstein monster of copper and steel.
- Information Technology > Hardware (0.95)
- Information Technology > Artificial Intelligence (0.79)
FusionAI: Decentralized Training and Deploying LLMs with Massive Consumer-Level GPUs
Tang, Zhenheng, Wang, Yuxin, He, Xin, Zhang, Longteng, Pan, Xinglin, Wang, Qiang, Zeng, Rongfei, Zhao, Kaiyong, Shi, Shaohuai, He, Bingsheng, Chu, Xiaowen
The rapid growth of the memory and computation requirements of large language models (LLMs) has outpaced the development of hardware, hindering people who lack large-scale high-end GPUs from training or deploying LLMs. However, consumer-level GPUs, which constitute a larger market share, are typically overlooked in LLM work due to their weaker compute performance, smaller storage capacity, and lower communication bandwidth. Additionally, users may have privacy concerns when interacting with remote LLMs. In this paper, we envision a decentralized system that unlocks the vast untapped potential of consumer-level GPUs for pre-training, inference, and fine-tuning of LLMs with privacy protection. However, such a system faces critical challenges, including limited CPU and GPU memory, low network bandwidth, and the variability that arises from peer and device heterogeneity. To address these challenges, our system design incorporates: 1) a broker with a backup pool to support the dynamic joining and quitting of computing providers; 2) hardware-aware task scheduling to improve system efficiency; 3) abstracting ML procedures into directed acyclic graphs (DAGs) to achieve model and task universality; and 4) abstracting the intermediate representation and execution planes to ensure compatibility across devices and deep learning (DL) frameworks. Our performance analysis demonstrates that 50 RTX 3080 GPUs can achieve throughput comparable to that of 4 H100 GPUs, which are significantly more expensive.
Tested: Nvidia's GeForce RTX 4080 offers dazzling creator performance, with a catch
The Nvidia GeForce RTX 4080's slow-paced announcement and eventual launch has been… a fascinating road. From its mis-named little brother getting "unlaunched" to the RTX 4080 Founders Edition being the same size as the massive RTX 4090 for some reason, to the melting 12VHPWR power connectors--which are still on this card, yes. Nvidia has quite the PR battle with the $1,200 GeForce RTX 4080, but if you ignore all the noise, does this new card deliver the goods for content creators? Thankfully, performance-wise it does mostly live up to the hype; I just find myself wishing we had higher VRAM capacities on such expensive graphics cards. This review is focused on work, instead of play, similar to my recent RTX 4090 content creation analysis.
Nvidia's monstrous GeForce RTX 4090 and 4080 revealed: 7 must-know details
More than two excruciatingly long years after the RTX 30-series reveal, a new generation of graphics cards is finally here. Nvidia CEO Jensen Huang unveiled the hotly anticipated GeForce RTX 4090 and not one, but two different RTX 4080 variants during the "Project Beyond" reveal event that kicked off GTC 2022. These graphics cards look absolutely monstrous, full stop, with the RTX 4090 leaving the RTX 3090 Ti stumbling in its wake. But while the performance of these next-gen GeForce GPUs promises to melt your face, the raw speeds are far from the only interesting aspect of this launch. Here are seven must-know facts from the RTX 4090 and RTX 4080 reveal, from ray tracing advancements to staggeringly high new sticker prices for Nvidia's GPUs. Nvidia clearly designed its new "Ada Lovelace" architecture to scream through ray tracing tasks.
- Information Technology > Hardware (0.66)
- Information Technology > Graphics (0.66)
- Information Technology > Artificial Intelligence (0.50)
Spider-Man Remastered swings onto the PC platform
After spending many hours completing Spider-Man Remastered's main story and being unable to tear ourselves away from the screen, editors at BabelTechReviews (BTR) recommend it as a great game with a few flaws. Mark Poppin, Mario Vasquez, and I have collaborated on this review, each of us playing the game for 20 or more hours. We cover the gameplay, updated performance with Thursday's patch, and IQ (image quality), which includes ray tracing and testing of AMD's and Nvidia's upscaling solutions. Spider-Man Remastered was originally released as a PS4 exclusive in 2018 and was remastered for PS5 two years later. Sony then gave it a complete makeover when porting it to the PC, complete with ray-traced reflections, and together with all the downloadable content, it was released on August 12, 2022.
- Leisure & Entertainment > Games > Computer Games (0.48)
- Energy > Oil & Gas > Upstream (0.41)
15 Great Deals on Gaming Gear for Prime Day
Video gaming can be an expensive hobby, but that just makes scoring deals all the more important. Whether you're looking for a new gaming laptop, a better mouse, or any gadget crammed full of RGB LEDs, there are plenty of deals for you today. The WIRED Gear team tests products year-round. We sorted through hundreds of thousands of deals by hand to make these picks. Crossed out products are out of stock or no longer discounted.
- Information Technology > Hardware (0.75)
- Information Technology > Artificial Intelligence > Games (0.40)
Asus ROG Zephyrus S17 review: This gaming laptop oozes luxurious power
The Asus ROG Zephyrus S17 is a performance powerhouse that doesn't skimp on anything you need to enhance your gaming experience. The latest Asus ROG Zephyrus S17 has just about everything you'd want in a gaming laptop. You can expect loads of power thanks to the pairing of an 11th-generation Intel Core i9 processor and an Nvidia RTX 3080 GPU. A 17.3-inch 4K display with a 120Hz refresh rate also adds to the premium experience, even if it's not the pinnacle of 4K. This laptop also spoils you with a robust metal chassis, a six-speaker sound system, and a per-key-lit RGB optical-mechanical keyboard – features that any discerning gamer would find hard to resist.
- Information Technology > Artificial Intelligence > Games > Computer Games (0.49)
- Information Technology > Hardware > Memory (0.47)
GeForce Now's new subscription tier is the ultimate cloud gaming experience
Just as Xbox starts to make some major strides with its cloud gaming service, Nvidia comes out with a whopper of an update to its cloud gaming platform, GeForce Now. As of last month, anyone can pre-order a membership to Nvidia's new RTX 3080 subscription tier, which provides more resolution options, features, and benefits than any other dedicated cloud gaming service out there--and for the moment it's also the only cloud gaming platform that supports true 4K gaming. We recently tried out the new subscription tier and all it has to offer. We were incredibly impressed with how smoothly the games ran, even at a level of detail so precise you can see pores on the characters' skin. If you're new to cloud gaming or wondering if subscribing to GeForce Now's RTX 3080 tier is worth it, here's what you need to know.
- Information Technology > Game Technology (1.00)
- Information Technology > Cloud Computing (1.00)
- Information Technology > Artificial Intelligence > Games > Computer Games (0.40)