Nvidia RTX 2000 series cards announced

Probably not a shocker to anyone who has been following along, but the new RTX line of GPUs was announced this morning. The first two cards on sale will be the 2080 Ti and the 2080, a departure from past releases where the Ti came several months later as a product-stack refresh. The new chip is called Turing, and the big sell is the RT cores, hence RTX (RT = ray tracing). Nvidia is pushing a new hybrid rendering approach that renders most of the scene in the classic rasterized way while handling lighting and shadows with ray tracing. The demos they showed do look incredible, but as with all rendering tech it will take time to proliferate.

No real performance numbers yet, aside from Nvidia's own marketing material. We do have prices, though, and they aren't cheap. The Founders Edition cards come in at $1,199 for the 2080 Ti, $799 for the 2080, and $599 for the 2070. MSRPs on the cards (i.e., what you'll pay once the Founders Edition premium is gone) are $999 for the 2080 Ti, $699 for the 2080, and $499 for the 2070. The one performance detail Nvidia did toss out is that the 2070 is faster than the current Titan Xp.

Rumor is that GTX isn't going away but will become the lower end of their stack. The GTX 2060 will likely top that range, and rumor has it the 2060 will basically be a binned 1080 Ti-level chip and remain Pascal.

I'm interested to see what the performance numbers look like. If the 2080 Ti is a big enough jump over the 1080 Ti, I'll likely upgrade once aftermarket-cooler versions start appearing.

If you're chomping at the bit and want to order an FE of one of the cards, here you go: https://www.nvidia.com/en-us/geforce/20-series/
I am mostly interested in these announcements because of a price drop of the 10xx cards :) I jumped into the announcement stream to check it out and their presentation was pretty interesting.

Their integration of ray tracing functionality is a big advancement, as the demonstrations show. Ray tracing has existed for a long time; it's basically a method for rendering a realistic scene pixel by pixel by firing rays from the camera's perspective. As soon as you add in reflections, refractions, shadows, sub-surface scattering, etc., ray tracing becomes computationally very expensive. On a CPU, without acceleration structures such as tree-based triangle clustering (e.g. bounding volume hierarchies), even a simple scene with a few planes and spheres can take 5-10 minutes to render. And that's a single frame.
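To make the pixel-by-pixel idea concrete, here's the simplest possible case sketched in Python: one primary ray per pixel tested against a single sphere, with no reflections or shadows (which is where the cost really explodes). The camera, image plane, and sphere here are made up for illustration:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance along the ray to the nearest sphere hit, or None."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the camera count

# Fire one primary ray per pixel from the origin at a sphere 5 units away.
width, height = 4, 4
hits = 0
for y in range(height):
    for x in range(width):
        # Map the pixel to a point on an image plane at z = -1.
        u = (x + 0.5) / width * 2 - 1
        v = (y + 0.5) / height * 2 - 1
        d = (u, v, -1.0)
        if intersect_sphere((0, 0, 0), d, (0, 0, -5), 2.0) is not None:
            hits += 1
```

A real renderer repeats this for millions of pixels, then recursively spawns more rays at every hit point for shadows and reflections, which is exactly why it gets expensive so fast.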

The question is, are you going to be paying for the ray tracing part of the card without being able to use it everywhere? By the looks of it, small game development companies with custom rendering engines will not be able to make use of it. The presentation shows it as an on/off switch, but I doubt it's that easy to just add to any engine.

At first I was curious to see what all the artificial intelligence talk around the Turing processor was about. NVIDIA has been making large contributions to recent research, for example in generative networks and image-to-image translation; in fact, they have already moved on to video generation.

So we know NVIDIA is not clueless when it comes to deep learning, but how exactly it will affect rendering performance for games is still somewhat unclear to me. The way I understand it, they will use a model (an artificial neural network) trained to enhance rendered images as a sort of post-processing step. Given enough data, there already exist methods for generating super-resolution versions of images, i.e. increasing the resolution of an image using deep learning.
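To make the "render low, upscale as post-processing" idea concrete, here's the dumbest possible upscaler in plain Python. The pitch (as I understand it, so treat this framing as my assumption) is that a trained network replaces a filter like this one, hallucinating plausible high-frequency detail instead of just duplicating pixels:

```python
def upscale_nearest(img, factor):
    """Nearest-neighbour upscale: each low-res pixel becomes a factor x factor block.
    A learned super-resolution model would replace this step, predicting new
    detail rather than copying existing pixels."""
    return [
        [img[y // factor][x // factor] for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

# A tiny 2x2 "image" upscaled 2x to 4x4.
low_res = [[1, 2],
           [3, 4]]
high_res = upscale_nearest(low_res, 2)
```

The GPU renders fewer pixels per frame, and the post-processing step fills in the rest; the open question is how good the filled-in part looks.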

A lot of questions remain with regard to their DNN module. Designing and training such models takes a lot of research, time, and data. Will they monetize this by essentially selling multiple tiers of graphics fidelity, separate from the GPUs?

Image generation using deep learning is still very new and far from perfect. Many results show missing details or warped physics. Will these artifacts show up in games as well? I guess that depends on how much of each rendered frame they want to leave up to the AI to complete.
Interesting times indeed...

This short was created by Epic, Nvidia, and ILMxLAB to showcase the new ray tracing techniques; at the time, they had to run it on four Titan cards to produce it.

Nvidia announced today that a single 2080 Ti ran it at exactly the same fps and quality...

Pricey though, ffs: £1,200 (!), but considering the performance... hmm. However you cut it, GPUs are getting bloody expensive.

That short is rendering those graphics in real time, by the way, not as a pre-rendered cutscene. That's up there with some of the best CGI I've seen, and it's rendering in real time on a single GPU? Crazy how powerful these GPUs are getting now.

The issue with these cards, to me, is that they're very transitional. We're looking at maybe a 20% raw-power increase over the equivalent last-generation card, but we're being charged up to the next price bracket for features that are years away from being mainstream. In the case of the Ti, we're being charged almost double for what looks on paper like a 25% power increase. You're effectively paying a serious premium for features you don't even know will be a "thing" in a few years.

Yeah, the RTX series has complicated, powerful silicon, but in terms of raw power to bring to bear on current games, and the way games work today, it's kind of a wet fart. All the oomph is in the tensor and RT cores, which may be completely useless in a few years if this hybrid ray-tracing stuff doesn't take off and become the norm.
It's a good point, Brainling, but I really can't see it not becoming a normal thing in games. History dictates that once a few devs start using new techniques on a few big titles, they become the norm. For me personally, my 980 Ti is still all I need for running 1440p games on Ultra, and it runs VR at 1.3x supersampling with no problems at all. I'm going to skip this and possibly the next iteration of cards; it all depends what happens with VR, to be honest. I'm keeping a close eye on the Pimax 8K and StarVR (thanks Netjun for the heads-up on that one) and will see how they turn out before even looking at a new GPU.

£1,200 for a GPU is a LOT of money whichever way you look at it, so I would need something other than 4K/8K 2D gaming to tempt me into a big outlay; if I were going to get one of these 2080 Tis, I would build a whole new rig around it. Right now, my old faithful is still running everything absolutely fine (even better now that I've stripped it down, cleaned all the dust out, and rebuilt it), so I see no need. If I were on a 7-series card or something like that, though, this would definitely be a good time to build a new rig with the 2080 Ti as the centrepiece.

One thing for sure though is it's a good time to be a PC gamer, we are really seeing a resurgence in our chosen format, and that's good for all of us!
I do a lot of gaming at 4K because I have a 4K HDR TV I can connect my PC to easily. For me the 1080 Ti is starting to get a bit long in the tooth, which is crazy to say given that it's been the most powerful consumer GPU available for 18 months. So the 2080 Ti has some appeal to me... but at those price points it needs to blow my hair back, and on paper it's not looking like it will. I hope I'm wrong; I hope the numbers come out and it's an overwhelming upgrade, but I'm not seeing it right now based on the specs that have been released. Given the CUDA core count and clock rates, it looks to be about a 20-25% raw floating-point upgrade over the 1080 Ti. So if there's more to be gained, it will have to come from architectural upgrades around the core FPUs.
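Back-of-the-envelope, the 20-25% figure falls straight out of the spec sheets (peak FP32 ≈ 2 ops per fused multiply-add × CUDA cores × clock). The clocks below are the published reference boost numbers, so treat this as an estimate, not a benchmark:

```python
def tflops(cores, boost_mhz):
    """Peak FP32 throughput: 2 ops (fused multiply-add) per core per cycle."""
    return 2 * cores * boost_mhz * 1e6 / 1e12

gtx_1080_ti = tflops(3584, 1582)  # reference boost clock
rtx_2080_ti = tflops(4352, 1545)  # reference boost; the FE is clocked at 1635 MHz

uplift = rtx_2080_ti / gtx_1080_ti - 1
print(f"{gtx_1080_ti:.1f} -> {rtx_2080_ti:.1f} TFLOPS, +{uplift:.0%}")
```

That lands around +19% at reference clocks and roughly +25% at the FE's higher boost, which is exactly the 20-25% range people are quoting; anything beyond that has to come from the architecture itself.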
The 2080 Ti got here today. It came out of the box with zero boost-clock offset, which is to be expected for the "unbinned" version of any card. So far I have +150MHz on the core and +600MHz on the memory, and it hasn't batted an eyelash or gone over 75C. That's still 9 degrees of thermal headroom, and I'm still pushing the clocks with no issue. Once I find the stable equilibrium point on the OC, I'll post comparison numbers against my 1080 Ti FE in the benchmarks thread.
