
PCIe 6.0 Almost Done & Should Be Ready 2021

danmitch1

Well-known member
Joined
Dec 15, 2007
Messages
2,318
I guess that makes my brand new Z390 chipset old already lol....

So wait, does that mean your devices won't use up that many lanes, since one lane supports roughly 8 GB/s?
 

Izerous

Well-known member
Folding Team
Joined
Feb 7, 2019
Messages
3,650
Location
Edmonton
My understanding was that single video cards were not saturating the 3.0 spec as it was, and this is all essentially because NVMe drives are able to fully saturate 3.0 lanes. An 8x improvement from 3.0 -> 6.0 over the next couple of years could make for some insane NVMe RAID setups.
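For context, a minimal back-of-the-envelope sketch (Python) of the per-lane math behind that 8x figure. The per-lane rates are the commonly quoted nominal numbers per direction, before protocol overhead, not measured throughput:

```python
# Nominal PCIe per-lane bandwidth by generation (per direction,
# before protocol overhead). Approximate round numbers, not measurements.
PER_LANE_GBPS = {
    "3.0": 1.0,   # 8 GT/s with 128b/130b encoding -> ~1 GB/s per lane
    "4.0": 2.0,
    "5.0": 4.0,
    "6.0": 8.0,   # 64 GT/s PAM4 with FLIT encoding -> ~8 GB/s per lane
}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A Gen3 x4 NVMe SSD tops out around 4 GB/s, so a single Gen6 lane
# (~8 GB/s) could in principle feed two of them.
for gen in PER_LANE_GBPS:
    print(f"PCIe {gen}: x1 ~ {link_bandwidth_gbps(gen, 1):.0f} GB/s, "
          f"x4 ~ {link_bandwidth_gbps(gen, 4):.0f} GB/s, "
          f"x16 ~ {link_bandwidth_gbps(gen, 16):.0f} GB/s")
```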
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,878
Location
Kelowna
I guess that makes my brand new Z390 chipset old already lol....

So wait, does that mean your devices won't use up that many lanes, since one lane supports roughly 8 GB/s?
Yeah, legacy devices and hubs can get away with a single lane vs. 4/8/16, e.g. 10G Ethernet and M.2 SSDs of our generation in a 1x slot kind of thing.

Now whether or not manufacturers do that, and don't hog lanes for the sake of form factor, is a different issue. It's great for the PCH though. 4x of Gen6 would be able to drive a lot...

Power and signal requirements look scary though. Gen4 is bad enough.

I still wish we got broad OCuLink support...
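To put the "4x of Gen6 would be able to drive a lot" point in rough numbers, here is a purely illustrative Python sketch; the device bandwidth figures are assumed round numbers, not measurements:

```python
# Rough tally of what a x4 PCIe 6.0 PCH-style uplink (~32 GB/s one way,
# nominal) could feed. Device figures are illustrative assumptions.
PER_LANE_GBPS_GEN6 = 8.0
UPLINK_GBPS = PER_LANE_GBPS_GEN6 * 4  # x4 Gen6 link

devices_gbps = {
    "10G Ethernet NIC": 1.25,           # 10 Gb/s ~= 1.25 GB/s
    "Gen3 x4 NVMe SSD #1": 3.5,
    "Gen3 x4 NVMe SSD #2": 3.5,
    "Gen4 x4 NVMe SSD": 7.0,
    "USB 3.2 Gen 2x2 controller": 2.5,  # 20 Gb/s ~= 2.5 GB/s
}

total = sum(devices_gbps.values())
print(f"Devices want ~{total:.2f} GB/s of a ~{UPLINK_GBPS:.0f} GB/s uplink "
      f"({total / UPLINK_GBPS:.0%} utilization if all ran flat out at once)")
```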
 

Lysrin

Well-known member
Joined
Mar 10, 2014
Messages
7,852
Location
Nova Scotia

This article has some interesting info on the subject, linking to the conclusion:
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/7.html

Interesting paras:

"Our last PCI-Express scaling article was close to 20 months ago, with the GeForce GTX 1080 "Pascal," which we admit isn't exactly the predecessor of the RTX 2080 Ti, but was the fastest graphics card you could buy then. The GTX 1080 did not saturate PCI-Express 3.0 x16 by a long shot, and we observed no changes in performance between gen 3.0 x16 and gen 3.0 x8 at any resolution.

We are happy to report that the RTX 2080 Ti is finally able to overwhelm PCIe gen 3.0 x8, posting a small but tangible 2%–3% performance gain when going from gen 3.0 x8 to gen 3.0 x16, across resolutions. Granted, these are single-digit percentage differences, and you won't be able to notice them in regular gameplay, but graphics card makers expect you to pay like $100 premiums for factory overclocks that fetch essentially that much more performance out of the box. The performance difference isn't nothing, just like with those small out-of-the-box performance gains, but such small differences are impossible to notice in regular gameplay."
 

Lysrin

Well-known member
Joined
Mar 10, 2014
Messages
7,852
Location
Nova Scotia
that's curious. so really it's only going to be an issue for anyone that wants to run SLI or CrossFireX since most x16 slots default to two x8.

took a long time too... how long have we been PCIe based now? nigh unto 10 years I think...

Yes, but even then I don't think it is an "issue". As the article says, there is a percentage difference, but not enough to actually perceive in a normal gaming experience etc. So when the GPUs come out that really need PCIe 4 or more... who knows. I won't start worrying about turfing my current motherboard for a while yet ;)

And yeah, it has been a long time. The fast GPU slot before that was AGP iirc?
 