
NVIDIA GeForce RTX 2080 Ti & RTX 2080 Review

SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,841
Location
Montreal
Hellblade: Senua’s Sacrifice

Hellblade: Senua’s Sacrifice Performance


Hellblade is a game we wanted to feature here not just because of its amazing graphics and its use of the ubiquitous Unreal Engine 4, but also because it comes from a small studio and deals with mental illness. It’s a great game and you should pick it up.

Our benchmark run starts at the opening of the Fire Realm section and is a simple walkthrough of the various zones within this level.


RTX2080-REVIEW-35.jpg

RTX2080-REVIEW-48.jpg


You can tell that NVIDIA has its Unreal Engine 4 optimizations done right, and that’s pretty important since dozens of games use the engine. Here the RTX cards absolutely dominate everything else, especially at 4K where the 2080 Ti jumps out to a nearly 60% lead!
 

Hitman (DX12)

Hitman DX12 Performance


The Hitman franchise has been around in one form or another for the better part of two decades and this latest version is arguably the best looking. It supports both the DX11 and DX12 APIs and has a ton of graphics options, some of which are only available under DX12.

For our benchmark we avoid the in-game benchmark since it doesn’t represent actual in-game situations. Instead we use the second mission, set in Paris: we walk into the mansion, mingle with the crowds and eventually end up in the fashion show area.


RTX2080-REVIEW-36.jpg

RTX2080-REVIEW-49.jpg


What you will likely see as these benchmarks go by is that NVIDIA’s core optimizations have allowed RTX cards to excel in DX12. Much like Battlefield, Hitman proves there have been some major improvements in this field when compared to Pascal cards, particularly at 4K where the new shading horsepower really comes into play.
 

Overwatch

Overwatch Performance


Overwatch continues to be one of the most popular games around right now and while it isn’t particularly stressful on a system’s resources, its Epic setting can provide a decent workout for all but the highest-end GPUs. To eliminate as much variability as possible, we use a simple offline Bot Match for this benchmark so performance isn’t affected by outside factors like ping times and network latency.

RTX2080-REVIEW-37.jpg

RTX2080-REVIEW-50.jpg


Moving on to a much more popular game like Overwatch, there’s nothing to complain about at either 1440P or 4K, but there is something I want to mention here. While the RTX 2080 Ti posts performance gains of around 50% and the RTX 2080 does about the same in its battle against the GTX 1080, we can’t forget how much more these new graphics cards cost. They aren’t inexpensive, but they do deliver some amazing performance in today’s games.
 

Middle Earth – Shadow of War

Middle Earth – Shadow of War Performance


Much like the original Middle Earth – Shadow of Mordor, the new Shadow of War takes place in a relatively open environment, offering plenty of combat and sweeping vistas. The benchmark run we chose starts on a rooftop in Minas Ithil and gradually moves through the streets, vaulting over buildings along the way. A few explosions are thrown in for good measure too. Due to the ultra-high-resolution textures, even the best GPUs can be brought below 60FPS here.

RTX2080-REVIEW-38.jpg

RTX2080-REVIEW-51.jpg


Shadow of War taxes the entire system, most of all the GPU’s texture pipeline. Neither RTX card has any issues pushing completely playable framerates at 1440P and 4K while improving upon their predecessors by about 50% once again.
 

Rainbow 6: Siege

Rainbow 6: Siege Performance


Rainbow 6: Siege has been around for a while now but it continually receives updates and remains one of the best co-op multiplayer games on the market. Meanwhile, its UHD Texture Pack allows for some great-looking unit models and ensures even the best graphics cards are brought to their knees.

As with most online titles, we needed to avoid multiplayer due to its highly variable, random performance. Instead, the High Value Target mission in Situations was used.


RTX2080-REVIEW-39.jpg

RTX2080-REVIEW-52.jpg


Yeah, by this point I know I’m starting to repeat myself, so let’s just roll the Rainbow Six results; they’re pretty much in line with the other games tested in this review.
 

Warhammer 2: Total War (DX12)

Warhammer 2: Total War DX12 Performance


Much of this game can be CPU-limited, but cranking the in-game details to max puts a massive amount of stress on the GPU. Unfortunately, after years in beta, the DX12 implementation is still not all that great.

For this benchmark, we load up an ultra-high-points multiplayer saved battle between the Empire and Skaven. That means plenty of special effects, with Hellfire rockets and warp fire being thrown everywhere. We then go through a set of pans and zooms to replicate gameplay.


RTX2080-REVIEW-40.jpg

RTX2080-REVIEW-53.jpg


This is going to be a tough one: while Warhammer 2: Total War seems like an extremely challenging game to run, it actually isn’t, provided you use it in DX11 mode. Creative Assembly’s DX12 implementation has been in beta for years now and is still far from optimized; it actually runs worse than DX11 in my experience. We’ve included it here to show that even without in-engine optimizations, the RTX series has enough horsepower to deliver playable in-battle framerates.
 

Wolfenstein - New Colossus (Vulkan)

Wolfenstein - New Colossus Vulkan Performance


This is the only Vulkan game in this entire review, simply because the API just isn’t used all that much; it never has been. With that said, it has allowed the id Tech engine to render an amazingly detailed world while still offering great overall performance.

For this test the Manhattan level is used so we can combine interior, exterior and preset combat elements into one 60-second run.


RTX2080-REVIEW-41.jpg

RTX2080-REVIEW-54.jpg


Believe it or not, the latest Wolfenstein game provides one of the most surprising, yet not completely unexpected, results of this entire review. At 1440P things go really well for the RTX 2080 Ti and RTX 2080 since they’re able to chew through framerates despite every detail setting being maxed out. But when the resolution is increased to 4K, the RTX 2080’s performance simply falls through the floor while the GTX 1080 Ti and RTX 2080 Ti surge ahead.

The reason for this is pretty simple: the 8GB framebuffer and lower memory bandwidth cause a severe bottleneck on the Manhattan mission we chose. Even Vega 64’s higher memory bandwidth allows it to finally become somewhat competitive.
 

Witcher 3

Witcher 3 Performance


Besides being one of 2015’s most highly regarded games, The Witcher 3 also happens to be one of the most visually stunning. This benchmark sequence has us riding through a town and running through the woods; two elements that will likely take up the vast majority of in-game time.

RTX2080-REVIEW-42.jpg

RTX2080-REVIEW-55.jpg


The final game in this 13-title test is The Witcher 3, and the RTX cards return to their usual leadership positions at both 1440P and 4K.
 

Analyzing Temperatures & Frequencies Over Time

Analyzing Temperatures & Frequencies Over Time


Modern graphics card designs make use of several advanced hardware- and software-based algorithms in an effort to strike an optimal balance between performance, acoustics, voltage, power and heat output. Traditionally this leads to maximized clock speeds within a given set of parameters. Conversely, if either of those last two metrics (heat and power consumption) becomes a limiting factor, it is quite likely that voltages and the resulting core clocks will be reduced to ensure the GPU remains within design specifications. We’ve seen this happen quite aggressively on some AMD cards, while NVIDIA’s reference cards also tend to fluctuate their frequencies. To be clear, in most situations this is a feature by design rather than a problem.

In many cases clock speeds won’t be touched until the card in question reaches a preset temperature, whereupon the software and onboard hardware work in tandem to carefully regulate other areas such as fan speeds and voltages, ensuring maximum frequency output without an overly loud fan. Since this algorithm typically doesn’t kick into full force in the first few minutes of gaming, the “true” performance of many graphics cards won’t be revealed by a typical 1-3 minute benchmarking run. That’s why we use a 5-minute warm-up period before all of our benchmarks.
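The warm-up methodology above can be sketched in code. The snippet below is a minimal, hypothetical helper, assuming clock and temperature samples were captured once per second (for example by polling `nvidia-smi --query-gpu=clocks.gr,temperature.gpu --format=csv,noheader,nounits` on an NVIDIA system); the CSV layout and the 5-minute window mirror the text, but the function names and log format are illustrative, not anything from the review.

```python
import csv
import io

def parse_clock_log(csv_text):
    """Parse 'seconds,core_mhz,temp_c' rows captured once per second
    while the card warms up under a static game load."""
    samples = []
    for row in csv.reader(io.StringIO(csv_text.strip())):
        t, mhz, temp = (int(x) for x in row)
        samples.append((t, mhz, temp))
    return samples

def sustained_clock(samples, warmup_s=300):
    """Average core clock over readings taken after the warm-up window.
    A short 1-3 minute run never reaches these samples, which is why
    it overstates a card's 'true' performance."""
    post = [mhz for t, mhz, _ in samples if t >= warmup_s]
    return sum(post) / len(post) if post else None
```

With a hypothetical log where the clock settles from 1900MHz down to 1845MHz, `sustained_clock` reports the post-warm-up figure rather than the inflated opening seconds.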

RTX2080-REVIEW-57.jpg

So let’s start things off with the overall temperatures observed by both GPU-Z and EVGA’s Precision over the course of a 5-minute static benchmark in Hellblade where the character is literally standing still. As you can see, the RTX 2080 doesn’t go above 72 degrees Celsius, which seems to be NVIDIA’s default peak for this card. The Boost algorithm won’t allow it to go beyond that relatively low temperature, which means you could probably gain some additional clock speed headroom by simply increasing the Temperature Limit in overclocking software.

The RTX 2080 Ti, on the other hand, shows us something interesting too, since its temperatures are allowed to peak 5 degrees higher and that’s where they remain. Basically, both of these are pretty cool-running cards, but it really does look like NVIDIA gave their higher-end GPU a bit more leeway. That’s likely because it needs that headroom to achieve expected performance numbers.

Something did strike me as interesting though: NVIDIA could have easily squeezed some more performance out of the RTX 2080 by simply allowing it to hit a higher temperature like the RTX 2080 Ti. Maybe this was done to ensure a bit more separation between the two cards, or it could have been for another reason altogether. Only NVIDIA knows and they’re not telling.

RTX2080-REVIEW-56.jpg

And here you can see what kind of effect NVIDIA’s GPU Boost 3.0 technology has on clock speeds as temperatures increase. Rather than pushing fan speeds up until the RTX series gets overly loud, it fluctuates core frequencies to balance temperature and power consumption. This also goes to show why it is VERY important to either perform longer benchmark runs or ensure the cards have a warm-up period before testing each game.

While the RTX 2080 remains around 1900MHz for the first thirty seconds or so, by the end of the 5-minute test that gets reduced to 1845MHz, which is right in line with NVIDIA’s Boost specification. The RTX 2080 Ti gets a bit more shaved off, going from 1785MHz to 1680MHz after 5 minutes, but it too remains right near the 1635MHz Founders Edition spec. Not bad at all, though a bit different from previous generations where we saw cards exceeding their stated Boost speeds by (sometimes) pretty significant margins.
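Those clock figures reduce to a couple of simple ratios. Here is a small sketch (the function name is my own) using the RTX 2080 Ti numbers from the text: 1785MHz initially, 1680MHz sustained, against the 1635MHz Founders Edition Boost spec.

```python
def clock_drop(initial_mhz, sustained_mhz, spec_boost_mhz):
    """Return (percentage drop from the initial clock, MHz above the
    rated Boost spec) for a sustained post-warm-up frequency."""
    drop_pct = round(100 * (initial_mhz - sustained_mhz) / initial_mhz, 1)
    over_spec = sustained_mhz - spec_boost_mhz
    return drop_pct, over_spec

# RTX 2080 Ti figures from the text: a ~5.9% drop that still lands
# 45MHz above the 1635MHz Founders Edition specification.
print(clock_drop(1785, 1680, 1635))  # → (5.9, 45)
```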

For those of you wondering about those odd spikes on the RTX 2080 Ti, those are points where the fans increase their rotational speed by a completely unnoticeable ~200 RPM. That effectively lowers temperatures a bit and the core is subsequently able to boost to a higher frequency.

All things considered, I was expecting the TU102 within NVIDIA’s RTX 2080 Ti to exhibit a bit more fluctuation than it did. We’ve already seen other hot-running cards like AMD’s Vega 64 and Fury series literally cut 300 to 400MHz off their initial frequencies. So count me moderately impressed.
 

Acoustical Testing / System Power Consumption

Acoustical Testing


What you see below are the baseline idle dB(A) results for a relatively quiet open-case system (specs are in the Methodology section) without a GPU, along with the results for each individual card in idle and load scenarios. The meter we use has been calibrated and is placed at seated ear level exactly 12” away from the GPU’s fan. For the load scenarios, Hellblade: Senua’s Sacrifice at 4K is used to generate a constant load on the GPU(s) over the course of 10 minutes.

RTX2080-REVIEW-58.jpg

NVIDIA promised their RTX 2080 series cards – at least in Founders Edition form – would be quiet at both idle and load, and that’s exactly what happened here. Now I can’t attest to these numbers beating any of the triple-slot and 2.5-slot versions the board partners are coming out with, but considering the compactness of the FE heatsink and the temperatures we saw on the last page, this is one impressive feat of engineering.

There is, however, one small caveat to add to this particular test. What it doesn’t show is the racket the RTX 2080 Ti’s coils make when the card is spitting out 160 FPS or more; they squeal and whine in a high-pitched symphony of sorrow. Go below 160 FPS and things quiet down substantially, while dropping below 100 to 120 FPS retains that blissfully near-silent operation. Luckily, G-SYNC and putting the card into a case (rather than the open test bench I use) fix the problem quickly.
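Another common way to stay under a coil-whine threshold like that 160 FPS point, besides G-SYNC, is a plain framerate cap. The toy limiter below is purely illustrative (the callback and cap value are my own, not anything from the review); it just sleeps out the remainder of each frame's time budget so the render loop can't run away.

```python
import time

def frame_limited_loop(render_frame, cap_fps=120, frames=3):
    """Call `render_frame` at most `cap_fps` times per second by
    sleeping out whatever is left of each frame's time budget."""
    budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()  # the actual per-frame work goes here
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```

In-game limiters and driver-level caps do the same thing far more precisely; the point is only that bounding the framerate bounds how hard the power delivery circuitry is driven.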


System Power Consumption


For this test we hooked our power supply up to a UPM power meter that logs the power consumption of the whole system twice every second. To stress the GPU as much as possible we used 10 minutes of Hellblade: Senua’s Sacrifice running a static scene, while letting the card sit at a stable Windows desktop for 15 minutes to determine peak idle power consumption.

RTX2080-REVIEW-59.jpg

I want to start with the RTX 2080 Ti because, in its pre-overclocked Founders Edition form, it consumes quite a bit of electricity under load. The RTX 2080 isn’t that bad since it manages to outperform the GTX 1080 Ti while requiring less juice. Meanwhile, those idle numbers are right in line with expectations as both cards throttle down to 300MHz to 400MHz to ensure efficient operation.

However, this chart doesn’t tell the whole story: as a graphics card processes more information, CPU and system memory load increase as well. So other components are making both of these cards look more power hungry than they really are relative to slower, older GPUs. From a performance-per-watt standpoint, the RTX 2080 and RTX 2080 Ti are far superior to anything released up to now.
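The performance-per-watt caveat above can be made concrete with a back-of-the-envelope calculation. This is only a sketch with made-up numbers, not data from the review's charts: subtracting the idle baseline from the loaded system draw gives a rough GPU-attributable figure to divide the average FPS by.

```python
def perf_per_watt(avg_fps, load_system_w, idle_system_w):
    """Rough efficiency metric: FPS per watt of GPU-attributable draw.
    Still imperfect, since CPU and memory power also rise when a
    faster GPU pushes more frames, exactly as the text warns."""
    return avg_fps / (load_system_w - idle_system_w)

# Hypothetical example: 120 FPS at 400W system load over a 100W idle
# baseline works out to 0.4 FPS per watt.
```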
 