

Alan Wake 2 - performance tested - big VRAM required

Bond007

Well-known member
Joined
Jun 24, 2009
Messages
8,861
Reaction score
1,903
Location
Nova Scotia
A couple of Alan Wake 2 articles are up on TPU: one on FSR/DLSS and one on performance. Feel free to go through them, but I just skimmed and noticed the huge VRAM usage.



11 GB for 1440p max (no RT/PT/FG), almost 13 GB for 4K max (no RT/PT/FG)... toggle all the options on and it will chew up almost 18 GB at 4K.

If you go through the rest of their performance review, you will see that this is a hard-hitting game... older graphics cards need not even try dialing the settings up (though apparently turning down the settings does make a huge improvement to FPS).

Here is a quote from their conclusion for those that don't want to go through everything:

"Hardware requirements of the game are pretty crazy—similar to other titles that we saw this year. In order to reach 60 FPS at 1080p with highest settings you need an RTX 4060 Ti, RX 7700 XT or faster, and that's with RT disabled. Got a 1440p monitor? Then you need an RX 6900 XT, RX 7800 XT or RTX 4070. 4K60? That won't be easy. Only NVIDIA's GeForce RTX 4090 can achieve more than 60 FPS. AMD's best, the Radeon RX 7900 XTX, reaches only 54 FPS—and that's without ray tracing. Once we've turned on ray tracing, performance suffers even more, making upscaling a requirement for all but the most powerful cards. But wait, there's more. Alan Wake 2 supports Path Tracing, too, which comes with another brutal performance hit. Without upscaling, the RTX 4090 only gets 83 FPS at 1080p, 60 FPS at 1440p, and 32 FPS at 4K.

The performance scaling is pretty good though. You can roughly double the FPS just with settings. As mentioned before there's support for DLSS and FSR, too, which can further boost performance. DLSS Frame Generation can provide an additional boost; unfortunately there is no support for FSR 3 Frame Generation. Thanks to good work from the map designers, the game still looks good at lowest settings, because many of the story-telling effects are hand-crafted and optimized to work well, even at lower quality settings.

Depending on the settings, VRAM requirements range from "not so bad" at lower resolutions, hovering around 6-14 GB, to "you need a 16 GB+ card," if you're playing at 4K with settings maxed."


[Attached chart: vram.png]
 
Why is that necessary? Why invite the backlash that poor performance will bring? I'd rather it have a little less eye candy and run better.
Because some of us want games to look as good as they can. And you can still get the performance you want by turning down the settings. I hate it when devs aim for the lowest-hanging fruit. Ideally you want a nice amount of scaling from the settings, so that people like me with 4090 cards can experience the best the game can do, while people with much cheaper cards still have an enjoyable experience.

Plus, hopefully with some updates to both the game and drivers we can get a bit more performance as time goes on. Before you know it, it's late 2025 and someone with a 5090 or an 8900 XTX is playing at 4K with all the settings cranked and getting 60 FPS. So no reason to aim for mediocrity.
 
Bond007 (thread starter)
I don't mind the increased VRAM/performance requirements, as long as there are settings that can bring it back to mass-market levels (in this case I would say mass market is an 8 GB GPU @ 1080p-1440p, as those have been mainstream for a number of years)... and it also needs to provide increased visual quality for that rise in requirements. That said, this is a BIG jump on the high end.

I wish they had tested a couple of 4/6 GB GPUs to see if they get hit as hard as I would expect (with lowered settings).
 
I think mass market should really be expected to have 10-12 GB of VRAM, or that's what GPU companies should have given us. I know we've had a lot of 8 GB GPUs released, but my 1080 Ti even had 11 GB.
 
Because some of us want games to look as good as they can. And you can still get the performance you want by turning down the settings. I hate it when devs aim for the lowest-hanging fruit. Ideally you want a nice amount of scaling from the settings, so that people like me with 4090 cards can experience the best the game can do, while people with much cheaper cards still have an enjoyable experience.

Plus, hopefully with some updates to both the game and drivers we can get a bit more performance as time goes on. Before you know it, it's late 2025 and someone with a 5090 or an 8900 XTX is playing at 4K with all the settings cranked and getting 60 FPS. So no reason to aim for mediocrity.
I never said anything about lowest-hanging fruit. But 30 FPS is a non-starter, honestly. I'll just play it on my PS5 if that's the case.
 
The 4090 has 24GB, might as well use it!

I do wonder if the chart is slightly skewed because of that, though: is the game simply maximizing use of what's available? People tend to harp on RAM usage, but really, you want your RAM to be used. What good is it sitting empty?

And weren't we told raster rendering is dead? This seems to drive that point home too. DLSS+FG @ 1440p seems to get you back to 100 FPS, or probably 60 FPS @ 4K.
 
The 4090 has 24GB, might as well use it!

I do wonder if the chart is slightly skewed because of that, though: is the game simply maximizing use of what's available? People tend to harp on RAM usage, but really, you want your RAM to be used. What good is it sitting empty?

And weren't we told raster rendering is dead? This seems to drive that point home too. DLSS+FG @ 1440p seems to get you back to 100 FPS, or probably 60 FPS @ 4K.
I was sure I read an article in the last year or so from a developer stating that GPU RAM usage apps don't tell the full story and shouldn't be trusted. GN might have covered it?

Anyway, I'm not privy to the inner workings of how devs load textures and use RAM, but we're sure to see more VRAM issues as resolutions continue to increase.
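For anyone who wants to watch this on their own machine, here's a minimal sketch (my own, not from any article) that parses the kind of line nvidia-smi's CSV query mode prints. The sample values are made up, and keep in mind this number is memory the driver has *allocated*, not what the game actually touches each frame:

```python
# Minimal sketch: read used vs. total VRAM from nvidia-smi's CSV
# query mode. The real command would be:
#
#   nvidia-smi --query-gpu=memory.used,memory.total \
#              --format=csv,noheader,nounits
#
# The sample line below stands in for real output (values made up),
# so this runs without a GPU present.

def parse_vram_csv(line):
    """Parse 'used, total' MiB values from one nvidia-smi CSV line."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

sample = "17894, 24564"   # hypothetical 24 GB card under load, in MiB
used, total = parse_vram_csv(sample)
print(f"VRAM: {used} / {total} MiB ({used / total:.0%} allocated)")
```

Polling that once a second while playing would show whether the game grows to fill whatever card it finds, which is the "skewed chart" question above.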
 
A few years back there were definitely articles I was reading about how texture compression and transfer were improving, and how that could lead to more moderate VRAM demands on video cards, even at higher resolutions, in the future. We are clearly not seeing that yet!

My guess is that as bus speeds and processing power increased, and continue to, there has been less pressure to keep the transferred and VRAM-stored textures compact, and thus the VRAM requirement continues to increase.
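To put rough numbers on that compression point, here's a quick back-of-envelope sketch (my own arithmetic, assuming uncompressed RGBA8 at 4 bytes/texel, BC7 block compression at 1 byte/texel, and a full mip chain adding roughly a third on top of the base level):

```python
# Back-of-envelope VRAM cost for a single texture, with and without
# GPU block compression. BC7 stores each 4x4 texel block in 16 bytes
# (1 byte/texel); uncompressed RGBA8 is 4 bytes/texel. A full mip
# chain adds about one third on top of the base level.

def texture_mib(width, height, bytes_per_texel, mipmaps=True):
    base = width * height * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

for size in (2048, 4096):
    raw = texture_mib(size, size, 4.0)   # uncompressed RGBA8
    bc7 = texture_mib(size, size, 1.0)   # BC7-compressed
    print(f"{size}x{size}: RGBA8 ~{raw:.0f} MiB, BC7 ~{bc7:.0f} MiB")
```

So a single 4K texture is roughly 85 MiB raw versus roughly 21 MiB with BC7; multiply by the hundreds of textures in view and it's easy to see why skipping aggressive compression balloons the numbers in the chart.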

I see the same thinking in younger-generation programmers who give little to no thought to memory management when writing their code, because we have a boatload of memory in desktop PCs. Heck, I've been guilty of that myself at times! I'd guess game devs are similarly focusing their attention on other aspects and just assuming the video cards will keep up.

If we build it, they will come... :)
 
Bond007 (thread starter)
A few years back there were definitely articles I was reading about how texture compression and transfer were improving, and how that could lead to more moderate VRAM demands on video cards, even at higher resolutions, in the future. We are clearly not seeing that yet!

My guess is that as bus speeds and processing power increased, and continue to, there has been less pressure to keep the transferred and VRAM-stored textures compact, and thus the VRAM requirement continues to increase.

I see the same thinking in younger-generation programmers who give little to no thought to memory management when writing their code, because we have a boatload of memory in desktop PCs. Heck, I've been guilty of that myself at times! I'd guess game devs are similarly focusing their attention on other aspects and just assuming the video cards will keep up.

If we build it, they will come... :)
As long as it doesn't cause an issue with consoles, they probably don't care that much... though this may be one of the few times in gaming that consoles have more "VRAM" available than many PCs.
 
