GeForce RTX 2080 Ti = So Much MOAR POWER, It HERTZ!

Christmas has come early to PC gamers this year, in the form of the spectacular new GeForce RTX 2080 and 2080 Ti cards.

I just received mine, hurrah, so I have a more powerful graphics card than Chief Blur Buster currently has! Now Chief can be jealous until NVIDIA gives him one (har har).

Depending on where you live, it is still very hard to find these cards in stock, and I expect the soon-to-be-released 2070, at a much more affordable price point, might be even harder to find. I used NowInStock.Net to get an immediate notification and ordered mine the second it popped up.

Show us the goods, pics or it didn’t happen:

The suspense is killing me!

What a beauty. I picked the Gigabyte model mostly because my new 32-core AMD Threadripper 2 workstation is built around a Gigabyte Aorus Xtreme X399 motherboard, and I wanted some consistency. Plus, it showed up earlier on my stock alert notification. Bonus.

Virtually all of the 2080 Tis have the default 3 x DP 1.4, 1 x HDMI 2.0b, and 1 x VirtualLink USB-C connector.

Since I have only one G-Sync monitor, an Acer XB270HU, and the rest of my displays (various monitors and projectors) use HDMI, I bought a pair of DP 1.4 to HDMI 2.0b adapters from Amazon, so I can drive my HDMI displays while still connecting some of the new dual-cable VR headsets coming out soon, either the StarVR One or the Pimax 5K+, which, incidentally, I intend to review at some point for Blur Busters.

This is better than putting more HDMI 2.0b connectors on the card, since HDMI 2.0b has lower maximum bandwidth than DP 1.4. And aside from the currently-nonexistent VirtualLink VR headsets, I’ve heard native DisplayPort VR headsets are coming too.

The backplate on this card is super nice, almost like plate armor, which, aside from cosmetics, also apparently helps with cooling. Nice, errr, backside, baby!


By way of an overdue introduction, my name is Bela, and I’m an AAA graphics engineer, semi-retired from the grind after ten long years of grueling hard work. These days I make indie VR games and do graphics and machine learning research, when I’m not playing VR games, of course. I’ve always been a big fan of this site, and I have mad projects cooking with Mark, the owner here. Good things are coming, so stay tuned!

To me and many others out there, VR is a childhood dream come true, and while it’s still experiencing growing pains, let me just say, I’m very excited to play through my whole VR game catalog again with my new 2080 Ti.

I’ve been spending a lot of time in Beat Saber and Skyrim VR, and I’m looking forward to playing In Death and other newer titles on this card. When I do benchmarking, it will be primarily VR-focused. Perhaps “bench-marketing” would be a better term. Although I don’t consider myself an Nvidia fanboy by any means, the truth is AMD has a lot of catching up to do. Benchmarks bring visits to the sites hosting them, and they sell cards (provided the cards do well or offer good value); that’s really their purpose, so I think “bench-marketing” is apt and honest.

Most gaming sites don’t focus on VR games in their testing, so we hope to provide something more specialized here at Blur Busters that you can’t necessarily get elsewhere. When I do any direct GPU comparisons, it will be against my older, min-VR-spec GTX 970, which this card replaces. I tend to skip generations of GPUs to keep my upgrade budget reasonable, but the 970 was getting a bit behind the times, especially for VR, which not only pushes our GPUs on resolution but also demands a steady 90 Hz / 120 Hz framerate.
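To put those refresh rates in perspective, here’s a quick back-of-the-envelope frame budget calculation (a toy Python sketch of my own, not tied to any actual VR runtime):

```python
def frame_budget_ms(refresh_hz):
    """Per-frame render budget at a fixed refresh rate. Miss it even once
    and the VR runtime has to drop or reproject a frame, which is far more
    noticeable in a headset than on a desktop monitor."""
    return 1000.0 / refresh_hz

for hz in (90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
```

At 90 Hz that works out to about 11.1 ms per frame, and at 120 Hz only about 8.3 ms, which is why a steady framerate matters more in VR than a high average one.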

To folks out there rightly skeptical of marketing mumbo jumbo re: Ray Tracing, let me try to calm your nerves a bit: the new ray tracing acceleration exposed via DX12 (DXR) and new Vulkan extensions, which take advantage of these new cards’ capabilities, is going to be used extensively, and already is, to great effect, in games like EA’s Battlefield V.

Aside from reflection and shadowing improvements (shadow maps, die!), there is a ton of relevant academic and industry research going on in machine learning (ML), for example 1 sample-per-pixel (SPP) denoising of ray-traced global illumination (GI), which makes normally noisy Monte Carlo integration feasible, albeit at lower resolutions. GI makes a massive difference in the visual fidelity of games, and it has tremendous gameplay implications too, like requiring fewer artificial lights to keep things going. I can only imagine what a Skyrim VR mod would look like where you could use realistic lighting and actually be able to walk around at night in the woods, by moonlight alone.
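To see why 1 SPP rendering is so noisy in the first place, here’s a toy Monte Carlo integration sketch (plain Python of my own, nothing to do with any actual renderer): the spread of the estimate shrinks only as 1/sqrt(N), so a single sample per pixel is wildly noisy, and that noise is exactly what the ML denoisers clean up.

```python
import math
import random

def mc_estimate(f, a, b, n, rng):
    """Monte Carlo estimate of the integral of f over [a, b] using n
    uniform random samples."""
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

def estimate_spread(f, a, b, n, trials, rng):
    """Standard deviation of the estimate across many trials -- a stand-in
    for the per-pixel noise you'd see in a 1 SPP ray-traced image."""
    estimates = [mc_estimate(f, a, b, n, rng) for _ in range(trials)]
    mean = sum(estimates) / trials
    return math.sqrt(sum((e - mean) ** 2 for e in estimates) / trials)

rng = random.Random(42)
# The integral of sin(x) over [0, pi] is exactly 2; watch the noise fall
# as the sample count rises (roughly 4x less noise per 16x more samples).
for n in (1, 16, 256):
    print(f"{n:4d} SPP -> spread {estimate_spread(math.sin, 0.0, math.pi, n, 2000, rng):.3f}")
```

Throwing 256x the samples at every pixel is exactly what real-time rendering cannot afford, hence denoising the 1 SPP result instead.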

For traditional rasterized games, extra performance and quality at 4K come from Deep Learning Super Sampling (DLSS), where Nvidia trains a neural network on each game in advance on its supercomputers, so the card’s Tensor cores can upscale a lower-resolution render toward native quality.

Aside from all that, hardware video acceleration is top notch. Since I hate low framerates not only in games but in movies too, I am a big fan of motion smoothing (sue me), and the performance of this card with SVP on UHD HDR rips (of UHD Blu-rays I purchased legally) is amazing. My GTX 970 couldn’t even decode UHD rips without dropping frames, even with interpolation disabled, but this new RTX 2080 Ti brings 2160p24 high-bitrate HDR content all the way to 1440p 144 Hz, barely breaking a sweat at 16-20% GPU occupancy. Highly recommended for smooth video nuts. MEMC (motion estimation / motion compensation) tech will only keep advancing too, with fewer artifacts expected thanks to Nvidia’s latest slow-motion research.
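As a rough illustration of what the interpolation workload looks like (a toy calculation of my own, not how SVP actually works internally): 24 fps film content divides evenly into a 144 Hz display, so the interpolator has to generate five brand-new frames for every original one.

```python
def interpolation_factor(source_fps, display_hz):
    """How many display refreshes each source frame spans. An integer ratio
    means judder-free pacing; a non-integer ratio forces uneven frame timing."""
    if display_hz % source_fps:
        raise ValueError("display rate is not an integer multiple of source rate")
    return display_hz // source_fps

factor = interpolation_factor(24, 144)
print(f"{factor} refreshes per source frame, {factor - 1} of them interpolated")
```

So at 2160p24 on a 144 Hz monitor, five out of every six displayed frames are synthesized, which is why even 16-20% GPU occupancy on a 2080 Ti is impressive here.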


Another thought when considering which cards to pick for VR use is whether to go SLI. My new motherboard is a full-sized E-ATX board with dual x16 slots, so that’s not a problem. HOWEVER, if I want to install a PCIe x4 card, say, a WiGig wireless VR module for the HTC Vive, that’s where I could run into problems with a pair of 2.75-slot GPUs covering the rest of my spare x8 slots.

So, plan your build ahead! If I ever need a VR WiGig module, I believe I can replace the motherboard’s built-in 802.11ac Wi-Fi card with the Intel one, but there is a Vive add-on that uses a PCIe x4 slot, which could be a problem unless I can squeeze a PCIe riser ribbon under the RTX fans. For VR SLI builds, I’d therefore recommend a dual-slot GPU, like the Nvidia Founders Edition cards. They might run a bit slower than the beefier-cooled versions, but you will still be able to use all your motherboard’s PCIe slots for other things, without worrying about riser cables, which can be flaky.

One last thing: make sure you get a beefy power supply!

Review sites are showing 300+ Watts used by overclocked RTX 2080 Tis, so 600 Watts for two GPUs if you go SLI. I think VR is the perfect use case for SLI, personally, and the new NVLink connector is quite robust in terms of performance scaling.

In my system, the CPU itself can draw 800+ Watts, for a 32-core AMD Threadripper overclocked to 4 GHz. I figured 1600 Watts total was enough to handle anything I could throw at this system, so I got the highly rated EVGA SuperNOVA 1600 T2.
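Plugging in the figures from this article (300 W per overclocked 2080 Ti, times two for SLI, plus my 800 W CPU estimate), the math works out like this. The 100 W “rest of system” line for drives, fans, and RAM is just my own rough allowance, not a measurement:

```python
def psu_margin(component_watts, psu_rating_watts):
    """Sum the worst-case component draws and report how much headroom
    the power supply has left at peak load."""
    peak = sum(component_watts.values())
    return peak, psu_rating_watts - peak

# Figures from this article; "rest_of_system" is a rough assumption.
parts = {"gpus_in_sli": 2 * 300, "cpu": 800, "rest_of_system": 100}
peak, margin = psu_margin(parts, psu_rating_watts=1600)
print(f"peak draw {peak} W, margin {margin} W on a 1600 W unit")
```

That leaves only a slim margin at full SLI load, which is exactly why I went for the biggest consumer unit I could find.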

P.S. My RTX 2080 Ti also supports G-SYNC, as do all currently-on-market NVIDIA cards. Check out Supported NVIDIA GPUs for G-SYNC.

Good luck and happy gaming!