You're conflating two different things. This isn't (or wasn't) a discussion about maximum VRAM size; it's about the optimal VRAM size for a card at a given resolution. Yes, 4K or higher resolutions can use more than 8GB of VRAM, especially with anti-aliasing enabled. No doubt. But does the card NEED it?
As I said: is a 3070 with 16GB a good 1440p card because it has 8GB more VRAM? My point is that it isn't important. Texture memory use is a function of display resolution and texture quality level, nothing more. Running 1440p on the latest and greatest games doesn't use 8GB today, and won't tomorrow, because that's a function of resolution and quality. Sure, if you want to run 4K on a 3070 with 16GB it's possible, but at what framerate? Having 32GB isn't going to make it magically run 40fps faster.
This is why, when you use the in-game tools (which we both agree are the metric for actual data use) and only increase the resolution, VRAM usage increases: the game is loading bigger textures into VRAM. Usage likewise goes up when you add in additional feature sets like RTSS, as you mentioned as well. And framebuffer hits are real with any AA method.
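To put rough numbers on the framebuffer side of this, here's a back-of-envelope sketch. The function name, the triple-buffering assumption, and the 4-bytes-per-pixel color format are my own illustrative choices; real GPUs add compression, padding, render-target variety, and driver overhead, so treat these as lower bounds rather than measured figures:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, msaa_samples=1, buffers=3):
    """Rough VRAM cost of the color buffers at a given resolution.

    Assumes triple buffering (buffers=3) and 32-bit color.
    MSAA multiplies the per-pixel storage of multisampled targets.
    """
    return width * height * bytes_per_pixel * msaa_samples * buffers / 2**20

# Resolution alone pushes the cost up, and MSAA multiplies it again:
print(f"1440p, no AA: {framebuffer_mib(2560, 1440):.0f} MiB")   # ~42 MiB
print(f"4K, no AA:    {framebuffer_mib(3840, 2160):.0f} MiB")   # ~95 MiB
print(f"4K, 4x MSAA:  {framebuffer_mib(3840, 2160, msaa_samples=4):.0f} MiB")  # ~380 MiB
```

The point of the sketch: framebuffers themselves are small next to textures, but every step up in resolution or AA multiplies them, and the textures loaded to match that resolution scale alongside them.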
TL;DR: GPU horsepower matters just as much as VRAM. You don't NEED 16GB of VRAM on a 2060. Why? Because it's never going to run 4K at high texture settings with (xxAA) enabled. Or at least, not without being a slideshow.
Addendum: I'd also argue that the future is upscaling for performance uplift, as shown by DLSS. So I'll reiterate what I said before: VRAM usage in the future will actually shrink.