
The big AMD GPU reveal.....BANG!

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
6,644
Location
Edmonton, AB
I am not sure how much it matters YET. It is a hard thing to quantify, since you can't measure memory pressure without in-engine tools or two otherwise-identical cards with different VRAM amounts (which don't exist now that Nvidia killed the higher-RAM variants).

However, the part that sucks is that the 1070, back in 2016(!), had 8GB. No change in 4 years... come on.
There are lots of ways to see RAM usage in games. Most performance apps do it, although some of them include virtual memory too, so you need to be careful how you're analysing the 'use'. Other ways include in-game tools (Warzone, etc. has this capability).

If games don't require it, I'm fine with not eating additional cost for some rocks to be stuck on my card for no reason.

This weird focus on 'bigger = better' with RAM is bizarre to me.
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,878
Location
Kelowna
There are lots of ways to see RAM usage in games. Most performance apps do it, although some of them include virtual memory too, so you need to be careful how you're analysing the 'use'. Other ways include in-game tools (Warzone, etc. has this capability).

If games don't require it, I'm fine with not eating additional cost for some rocks to be stuck on my card for no reason.

This weird focus on 'bigger = better' with RAM is bizarre to me.
That is the problem, though: performance apps can only look at allocated VRAM, and typically don't indicate actual usage patterns. Games request everything they can get but may only use a fraction of it in a given scene. The only true way to know how much a scene ACTUALLY uses is in-game tools. Nothing external will ever know.

Steve mentions this a lot in his videos.

The RAM thing is mainly about having a card that isn't going to run into bottlenecks now that we have consoles with 16GB of shared memory, 8-10GB of which could be used by the GPU. That is what is worrying people, mainly.

If games today can't max out 8, why not put on 4? Or 6? Then just throw out your card and buy a new one when a new high-fidelity game like Cyberpunk comes out...

Edit: As an example, RTSS has been shown to report over 2x the actual RAM usage (i.e. the requested amount) in FS2020.
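
For what it's worth, here's a minimal sketch of the wall external tools hit, assuming an NVIDIA card and the pynvml Python bindings (the module choice and formatting are just for illustration). NVML reports device-wide allocated VRAM, so a number like this is the 'requested' figure, with no visibility into what a scene actually touches:

```python
# Minimal sketch, assuming an NVIDIA GPU and the pynvml bindings
# (pip install nvidia-ml-py). NVML reports device-wide *allocated*
# VRAM -- it cannot see how much of that allocation the current
# scene actually touches.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)            # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)
print(f"total:     {mem.total / 2**30:.2f} GiB")
print(f"allocated: {mem.used  / 2**30:.2f} GiB")  # requested, not truly used
nvmlShutdown()
```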
 

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
6,644
Location
Edmonton, AB
You're conflating two different things. This isn't (or wasn't) a discussion about maximum RAM size; it's about the optimal RAM size for a card at a given resolution. Yes, 4K or higher resolutions can use more than 8GB of RAM, especially with anti-aliasing enabled. No doubt. Does the card NEED it?

As said, is a 3070 16GB a good 1440p card because it has 8GB more RAM? My point is that it isn't important. Graphics texture load is a function of displayed resolution & texture quality level. Nothing more. Running 1440p on the latest and greatest games isn't utilizing 8GB today, or tomorrow, because that's a function of resolution & quality. Sure, if you want to run 4K on a 3070 with 16GB of RAM it's possible, but at what framerate? Having 32GB isn't going to make it magically run 40fps faster.

This is why, when you use the in-game tools (which we both agree are the metric for actual data use) and only increase resolution, the VRAM usage increases: it's loading bigger textures into VRAM. Similarly, the usage goes up when you add in additional feature sets like RTSS, as you mentioned as well. Framebuffer hits are real with any AA function.
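
For a rough sense of the framebuffer side of that, a sketch assuming a plain RGBA8 render target and MSAA multiplying the stored samples (real pipelines add depth/stencil and intermediate buffers on top):

```python
# Rough framebuffer math: an RGBA8 target stores 4 bytes per pixel, and
# 4x MSAA keeps 4 samples per pixel, so the AA hit scales with resolution.
def framebuffer_mib(width: int, height: int,
                    bytes_per_px: int = 4, msaa_samples: int = 1) -> float:
    return width * height * bytes_per_px * msaa_samples / 2**20

print(framebuffer_mib(2560, 1440))                  # 1440p, no AA: ~14 MiB
print(framebuffer_mib(3840, 2160, msaa_samples=4))  # 4K + 4x MSAA: ~127 MiB
```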

TL;DR: GPU horsepower matters just as much as RAM. You don't NEED 16GB of RAM on a 2060. Why? Because it's never going to run 4K at high texture settings with (xxAA) enabled. Or well, not without being a slideshow.

Addendum: Again, I'd also argue that the future is upscaling for performance uplift, as shown by DLSS. So I will reiterate, as I previously stated, that VRAM usage in the future will actually shrink.
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,878
Location
Kelowna
Why are cards not shipping with 6GB then? Are you saying that texture quality level is never going to change? Hasn't changed in the 4 years since the 1070 came out? 1440p is stuck at 4-6GB for the rest of eternity?

The 3070 is 3x faster than a 1070 yet has the same memory limit? Scene density has clearly gone up massively in 4 years, so you need more textures for that, do you not?

I am not saying they need 16GB, that is silly, but 10GB on a 3070 would give it more breathing room. Was 8GB too much on an FE 1070? 2070? 2070S?
 

Marzipan

Well-known member
Joined
Nov 21, 2007
Messages
12,067
Location
Prince Rupert, British Columbia, Canuckistan
Why are cards not shipping with 6GB then? Are you saying that texture quality level is never going to change? Hasn't changed in the 4 years since the 1070 came out? 1440p is stuck at 4-6GB for the rest of eternity?

The 3070 is 3x faster than a 1070 yet has the same memory limit? Scene density has clearly gone up massively in 4 years, so you need more textures for that, do you not?

I am not saying they need 16GB, that is silly, but 10GB on a 3070 would give it more breathing room. Was 8GB too much on an FE 1070? 2070? 2070S?
Herd mentality... bigger must be better. :p
 

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
6,644
Location
Edmonton, AB
Why are cards not shipping with 6GB then? Are you saying that texture quality level is never going to change? Hasn't changed in the 4 years since the 1070 came out? 1440p is stuck at 4-6GB for the rest of eternity?

The 3070 is 3x faster than a 1070 yet has the same memory limit? Scene density has clearly gone up massively in 4 years, so you need more textures for that, do you not?

I am not saying they need 16GB, that is silly, but 10GB on a 3070 would give it more breathing room. Was 8GB too much on an FE 1070? 2070? 2070S?

I'll try to explain again. GPU horsepower is what processes the contents of memory (VRAM) into displayed pixels.

Take two cards running the exact same settings in a given game with given textures: the 1070 8GB gets 60fps (40 billion pixels processed in 1 second), the 2070 gets 90fps (60 billion pixels processed in 1 second). Those are arbitrary numbers, by the way.

Increasing the 1070 to 16GB isn't going to change the 60fps, is it?
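
To put that in one place, a minimal sketch of the argument (hypothetical function, reusing the arbitrary numbers above): frame rate is throughput divided by per-frame work, and VRAM capacity never appears as an input, so long as everything fits.

```python
# Minimal sketch of the argument above, with the post's arbitrary numbers.
# Frame rate = pixel throughput / pixels shaded per frame. VRAM capacity
# is not an input; it only matters once the working set no longer fits.
def fps(pixels_per_second: float, pixels_per_frame: float) -> float:
    return pixels_per_second / pixels_per_frame

workload = 40e9 / 60        # per-frame work that yields 60fps on the "1070"
print(fps(40e9, workload))  # 1070-class throughput -> 60.0 fps
print(fps(60e9, workload))  # 2070-class throughput -> 90.0 fps
print(fps(40e9, workload))  # double the 1070's VRAM: nothing changes, still 60.0
```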
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,878
Location
Kelowna
I'll try to explain again. GPU horsepower is what processes the contents of memory (VRAM) into displayed pixels.

Take two cards running the exact same settings in a given game with given textures: the 1070 8GB gets 60fps (40 billion pixels processed in 1 second), the 2070 gets 90fps (60 billion pixels processed in 1 second). Those are arbitrary numbers, by the way.

Increasing the 1070 to 16GB isn't going to change the 60fps, is it?
No, but that has nothing to do with the textures that make up that scene.

If a new scene has X million triangles and is using 4K x 4K textures, you can have at most 125-ish of them addressable in VRAM. Say a new game comes out using 8K mega-textures? Well, now you are out of VRAM at 63... So what are you going to do with the other 63? Turn down quality?

The GPU can still do the X million triangles at 244 FPS, but now it needs to go back to system memory every second scanline, and your FPS will TANK.
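
For concreteness, a rough sketch of the budget math behind that example, assuming uncompressed RGBA8 textures (4 bytes per texel) and ignoring mipmaps, geometry, framebuffers, and driver overhead. The 125/63 figures above imply slightly different per-texel assumptions, and real engines stream and compress, so treat these counts as ceilings:

```python
# Rough texture-budget math, assuming uncompressed RGBA8 (4 bytes/texel)
# and ignoring mipmaps, geometry, framebuffers, and driver overhead.
def texture_bytes(side_px: int, bytes_per_texel: int = 4) -> int:
    return side_px * side_px * bytes_per_texel

def textures_that_fit(vram_gib: int, side_px: int) -> int:
    return (vram_gib * 2**30) // texture_bytes(side_px)

print(textures_that_fit(8, 4096))  # 128 -> roughly the "125-ish" above
print(textures_that_fit(8, 8192))  # 32  -> doubling each side quadruples the cost
```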
 

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
6,644
Location
Edmonton, AB
No, but that has nothing to do with the textures that make up that scene.

If a new scene has X million triangles and is using 4K x 4K textures, you can have at most 125-ish of them addressable in VRAM. Say a new game comes out using 8K mega-textures? Well, now you are out of VRAM at 63... So what are you going to do with the other 63? Turn down quality?

The GPU can still do the X million triangles at 244 FPS, but now it needs to go back to system memory every second scanline, and your FPS will TANK.

Your opening statement is incorrect. It has everything to do with the textures that make up the screen.

You've just told me that you've increased texture quality by quadrupling the pixel count of the textures: from 4K x 4K to 8K x 8K.

As I've already said, twice, of course this is going to have an impact on VRAM usage. And of course that is going to be impacted by GPU horsepower, since you're processing 4x the pixels per frame. That's why these are functions of each other. As I've already said, twice.
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,878
Location
Kelowna
Your opening statement is incorrect. It has everything to do with the textures that make up the screen.
That makes no sense. You said a card can process 40 billion pixels per second as an example; that has nothing to do with the source. To the fill rate (assuming appropriate memory bandwidth), a pixel is a pixel, whether it comes from 1 texture or 125,000 across the entire buffer.

Now, if you are talking about effective resolution, as in whether a 1440p game needs 4K or 8K ultra texture modes, that is debatable. But filtering quality goes up the better the source, since displayed pixels are more often than not an average of texels. Otherwise we would still be using 1024x1024 or 256x256 textures for everything.
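
A toy illustration of that 'pixels are an average' point, assuming nothing beyond NumPy: a 2x box filter (a crude stand-in for real texture filtering) averages each 2x2 block of texels into one displayed pixel, so a higher-resolution source gives every output pixel more samples to average.

```python
# Toy sketch: a 2x box-filter downsample standing in for real texture
# filtering. Each displayed pixel is the average of a 2x2 texel block.
import numpy as np

def box_downsample(tex: np.ndarray) -> np.ndarray:
    """Halve a (2H, 2W) grayscale texture by averaging 2x2 texel blocks."""
    h, w = tex.shape[0] // 2, tex.shape[1] // 2
    return tex.reshape(h, 2, w, 2).mean(axis=(1, 3))

tex_hi = np.random.rand(8, 8)    # stand-in for a high-res texture
tex_lo = box_downsample(tex_hi)  # each output pixel averages 4 texels
print(tex_lo.shape)              # (4, 4)
```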
 

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
6,644
Location
Edmonton, AB
Ah, I think we've found the disconnect. You're aware that textures are made up of pixels?

256x256 'texture' = 65,536 pixels
1024x1024 = 1,048,576 pixels
4K x 4K (3840 x 3840) = 14,745,600 pixels

This is why it has everything to do with 'the source'. The source is textures. Texture quality improves? More pixels pushed. Just because you're downsizing it to a 1440p monitor doesn't mean anything. Monitor resolution is not tied to texture resolution.

Your 'filter quality' is that downsizing: after processing the high-quality textures, the result is filtered down to display on screen. (This is also where anti-aliasing comes in, to take out some of the 'noise' from that compression.)

You're welcome to pull up any 1440p 'texture' and display it 1:1 on your 1440p monitor to see this, by the way. You can also try it with a 4K one, 1:1 on a 1440p monitor, then set it to its true aspect ratio if you'd like. I think Skyrim has some mod packs you can pull images out of.
 