Based on its $350 MSRP, the RTX 2060 comes in at a cost of $4.06 per frame. That gives the GPU a slightly better price-to-performance ratio than the GTX 1070, Vega 56 and the Radeon RX 590. It's also around 22% more cost effective than the RTX 2070, which is obviously a step in the right direction.
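For readers who want to reproduce the metric, cost per frame is simply the card's price divided by its average frame rate across the benchmark suite. A minimal sketch, where the ~86 fps average for the RTX 2060 is inferred from $350 / $4.06 rather than a figure quoted above:

```python
def cost_per_frame(msrp_usd, avg_fps):
    """Dollars paid per frame of average benchmark performance."""
    return msrp_usd / avg_fps

# ~86.2 fps is inferred from the $4.06-per-frame figure, not measured here.
print(f"RTX 2060: ${cost_per_frame(350, 86.2):.2f} per frame")
```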
Notice a trend on the graph above? Anything mid-range or better will set you back more than it ever has before. There are a few factors worth considering. Are today's GPUs more complex than they were a few years ago? Have manufacturing costs increased? We know the cost of building a modern fab is astronomical, and yields for these big, complex GPUs aren't great. Is more investment being made into research and development? The answer to that one is unquestionably yes.

A quick look at power consumption shows that the RTX 2060 consumes slightly less power than the RTX 2070, placing it roughly on par with the GTX 1080. It still reduced total system consumption by around 20% when compared to Vega 56, but that's hardly surprising: overall power efficiency hasn't changed much since Pascal, and AMD's hardware is at a clear disadvantage.
Having said all that, a key explanation for why Turing is such a massive and complex GPU is ray tracing, a rendering technology supported by the inclusion of dedicated hardware known as RT cores. Ray tracing is amazing, but the current hardware implementation isn't good or fast enough to be of any real benefit to gamers, and that's the real problem. We plan to test ray tracing performance with the RTX 2060 soon, just as we did with previous RTX releases. That will need to go into a separate article as there's a lot to cover: not a lot of games, of course, but plenty of performance and visual quality to go over.
We believe gamers would be far more accepting of the GeForce 20 series if it had never had any RT cores, either removing them entirely to reduce the silicon size, and therefore the price, or replacing them with more CUDA cores for greater performance. But that's not the situation we find ourselves in, at least for now. A long-rumored GeForce 11 series may do away with RT cores, but that may only cover budget-priced graphics cards. Currently, high-end graphics cards are essentially an all-Nvidia affair. At the extreme high end we have the Titan RTX, a permanent experiment on Nvidia's part, usually targeting data scientists and those needing Quadro-level power but wanting to spend less. For gamers the card makes no sense, and it really shouldn't be seen as a consumer-focused release: at $2,500 it packs just 6% more cores than the RTX 2080 Ti, with its main draw being 24GB of GDDR6 memory.
The RTX 2080 Ti is the fastest graphics card for gamers. It was meant to start at $1,000, but from AIBs you can expect to pay closer to $1,300. It's not great value, but it comes as close as it gets to usable RT cores (again, more games will be needed to actually use them and to see how well they play with RTX on). Then we have the RTX 2080 for $700, a GPU that steps in for the GTX 1080 Ti, offering essentially the same performance for the same price. And the RTX 2070, which is 5-10% faster than the GTX 1080 for the same price. The GTX 10 GPUs are now two years old, of course.

With the arrival of the GeForce RTX 2060, we get a GPU that's a bit more affordable than the GTX 1070 and offers 10-15% more performance. That's better than what we got with the RTX 2070, in our book, so there are two ways you can look at this: we can judge the RTX 2060 based on what you'd historically expect after two years of development, or we can judge it based on current market conditions. After two years we'd expect at least a 30% boost in performance for the same money, so the RTX 2060, which is really a GTX 1070 replacement, should be providing the kind of performance we're seeing from the $700 RTX 2080. The problem with that is the TU104 silicon is 72% larger than the GP104 silicon used by the GTX 1070, 1070 Ti and 1080, yet we're only getting a 15% increase in CUDA cores. The CUDA cores are much improved, but even so, that's only resulted in a performance boost of about 30% on average when comparing the RTX 2080 to the GTX 1080.
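To make the generational-value argument concrete, here's a rough sketch of performance per dollar. The 10-15% uplift over the GTX 1070 is taken at its midpoint, and the GTX 1070's $379 launch MSRP is the assumed reference price, so treat the output as a ballpark figure rather than a measurement:

```python
def value_gain(perf_uplift, old_price, new_price):
    """Fractional improvement in performance per dollar over the old card."""
    return (1 + perf_uplift) * (old_price / new_price) - 1

# Midpoint of the 10-15% uplift, GTX 1070 taken at its $379 launch MSRP.
gain = value_gain(0.125, 379, 350)
print(f"RTX 2060 vs GTX 1070: {gain:.0%} more performance per dollar")
```

Under those assumptions the RTX 2060 works out to roughly 20% more performance per dollar, which falls short of the 30%-plus you'd historically expect after two years.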
The other way to look at this is as a result of current market conditions. Nvidia keeps raising the bar in an experiment to see what gamers are willing to pay. Then there's the competition, the competition that's supposed to help drive pricing down, but AMD has been largely absent outside of the budget and mid-range GPU game for quite some time. Vega 64 graphics cards have been scarcely available ever since release. There's just a handful of AIB models and most are very expensive. As of writing, Newegg had just a single Asus model selling for $460, plus two reference-design models for $400, but those are horrible if we're honest. Then we have models from MSI, Gigabyte and Sapphire, all priced upwards of $550. That's more than a base RTX 2070, which is generally the better offering.
Given what we saw from the RTX 2080 and 2070, we were expecting the eventual RTX 2060 to be less desirable than what we received. The reality is we're getting performance that typically sits between the GTX 1070 Ti and 1080, with pricing that's slightly better than the GTX 1070's MSRP. For an RTX graphics card this is a massive win. It also makes the RTX 2060 the best value mid-range offering, beating out Vega 56 and all the Pascal cards priced over $300. That won't make it an instant upgrade if you already own a fast enough graphics card. Still, the old GTX 980 Ti flagship was a $650 graphics card, for example, so the fact that the RTX 2060 beats it should be impressive enough for us to take on this generation.

Cons: RTX features remain an unknown, particularly with the reduced RT core count. It's also not a drop-in GTX 1060 replacement: it delivers much better performance, but with a larger price tag and higher power consumption.