Yesterday NVIDIA kicked off its week at CES by announcing the GeForce RTX 2060, the lowest-cost Turing GPU to date at just $349 USD, which aims to deliver roughly the performance of the previous-generation GeForce GTX 1080. I only received my RTX 2060 yesterday for testing, but I have been putting it through its paces since and have initial benchmark results to deliver, ranging from OpenGL/Vulkan Linux gaming performance through various interesting GPU compute workloads. This round of testing also includes graphics cards going back to the Maxwell-based GeForce GTX 960 for an interesting look at how NVIDIA's Linux GPU performance has evolved.

The GeForce RTX 2060 features 1920 CUDA cores, a 1365MHz base clock and 1680MHz boost clock, 6GB of GDDR6 video memory, and is rated for 37T RTX-OPS and 5 Giga-Rays/s. In comparison, the GeForce RTX 2070 Founder's Edition has 2304 CUDA cores and a 1710MHz boost clock, and is rated for 45T RTX-OPS and 6 Giga-Rays/s; but the RTX 2060 launches at just $349 USD compared to $599 USD for the Founder's Edition model of the RTX 2070.
This article originally appeared in The Motley Fool. It's no secret that both NVIDIA (NASDAQ:NVDA) and Advanced Micro Devices (NASDAQ:AMD), the only two makers of high-performance graphics processors, have seen their respective graphics processor businesses soar thanks to the boom in cryptocurrencies. In a nutshell, as cryptocurrencies like Ethereum and ZCash, which are generally produced by running computations on graphics processors, have grown in value, so too has demand for the graphics processors that can produce them. That production process is known as mining. In recent months, however, cryptocurrency prices have dropped, and specialized hardware for mining these cryptocurrencies has emerged.
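To give a flavor of what those mining computations involve, here is a minimal proof-of-work sketch in Python. This is a toy illustration, not the actual Ethash (Ethereum) or Equihash (ZCash) algorithm: the SHA-256 hash, the `mine` helper, and the leading-zeros difficulty scheme are all illustrative assumptions. Real mining algorithms are deliberately memory-hard so that GPUs (and later ASICs) dominate, but the core loop, repeatedly hashing until the output clears a difficulty target, looks much like this:

```python
import hashlib

def mine(data: str, difficulty: int = 4) -> int:
    """Toy proof-of-work: find a nonce so that SHA-256(data + nonce)
    begins with `difficulty` hex zeros. Lower difficulty = faster."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # this nonce "wins" the block
        nonce += 1

# Each extra zero of difficulty multiplies the expected work by 16,
# which is why miners throw massively parallel GPU hardware at it.
nonce = mine("block-header", difficulty=4)
print(nonce)
```

Raising `difficulty` by one hex digit makes the search roughly 16 times longer on average, which is exactly the knob real networks turn as more hash power comes online.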
When it comes to gaming PCs, nothing matters more than your graphics card. To push as many pixels as possible, you're going to want the fastest graphics card you can afford, but ever-shifting prices and product lineups make it difficult to keep track of what's available. In honor of keeping frame rates high, we decided to rank all the major available discrete GPUs from Nvidia GeForce and AMD Radeon, starting with the fastest graphics card available and working on down. This list focuses on each company's most current GPU lineups and doesn't include significantly older graphics cards (yet). Price-to-performance is not a consideration here, just pure performance.
Seven long months after the next-generation "Volta" graphics architecture debuted in the Tesla V100 for data centers, the Nvidia Titan V finally brings the bleeding-edge tech to PCs in traditional graphics card form. But make no mistake: This golden-clad monster targets data scientists, with a tensor core-laden hardware configuration designed to optimize deep learning tasks. You won't want to buy this $3,000 GPU to play Destiny 2.
Two long years after the GeForce RTX 20-series kicked off real-time ray tracing and uncompromising 4K performance in earnest, Nvidia finally took the wraps off its hotly anticipated successors. The first GeForce RTX 30-series GPUs are damned impressive, culminating in the ferocious GeForce RTX 3090 with 24GB of cutting-edge GDDR6X memory and enough power to game on an 8K display (sometimes). But glancing over the launch lineup, one thing sticks out: Why is there no GeForce RTX 3080 Ti? The RTX 2080 Ti was the heavy hitter in the 20-series lineup, after all. Nvidia never talks about unannounced products, but there are several reasons to skip a high-end Ti out of the gate this generation, and a lot of it has to do with the competition.