
NVIDIA's GTX 1080 & GTX 1070 Detailed (Comment Thread)


Dark Knight

Guest
I'm somewhat surprised that Nvidia chose to go with GDDR5X instead of HBM, and that they kept the SLI connectors in place. AMD introduced bridgeless CrossFire with the R9 series back in 2013, yet Nvidia is now introducing a completely new SLI bridge. I'd like to see how AMD answers with Polaris. If both new architectures offer significant gains, prices for previous-generation cards (especially used ones) should come down, and there should be more competition between AMD and Nvidia's current-generation products.
 

Soultribunal

Moderator
Staff member
Joined
Dec 8, 2008
Messages
9,417
Location
Orangeville
Dark Knight said:
I'm somewhat surprised that Nvidia chose to go with GDDR5X and not HBM and that they chose to leave the SLI connectors in place. AMD introduced bridgeless Crossfire with the R9 series in 2013 and Nvidia is choosing to use a different SLI bridge now which is completely new. I'd like to see how AMD will answer with Polaris but if both those new architectures/releases can offer significant gains then it will mean that prices should come down for previous generations especially used and should offer more competition with current generation products between AMD and Nvidia.

My guess would be supply of the material. They went with something more proven and easier to supply and engineer into their mainstream flagship cards.

I am sure we will eventually see HBM trickle down through the whole lineup.

-ST
 

AkG

Well-known member
Joined
Oct 24, 2007
Messages
5,270
I would wait for 1440p or better results, as 1080p is just not going to stress these cards and can actually hide flaws. Plus, who in their right mind buys a flagship card for 1080p?
 

Mayoo

Well-known member
Joined
Apr 6, 2011
Messages
298
Location
Québec, Canada
I can't wait to see final pricing and the STRIX versions come out. My 570s are getting tired. I'm glad I waited for these; the performance increase seems substantial.

That being said, am I the only one who is a bit sad that they didn't include USB 3.1 Type-C for output? It could technically deliver more than enough bandwidth, and it would remove the need for a separate monitor power cable (remember that USB Power Delivery can supply up to 100 W). It's minor, but I would have loved to see it.
 

AkG

Well-known member
Joined
Oct 24, 2007
Messages
5,270
To add USB 3.1 and have it able to power a monitor, they would have to add another 8-pin PCIe connector (a 6-pin is only good for 75 W) and further beef up the power subsystem. I don't see that happening outside of a custom halo model.
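As a back-of-the-envelope check on that power math (connector limits from the PCIe spec, board power from NVIDIA's published figure for the reference GTX 1080; the exact headroom on any given card is an assumption):

```python
# Rough power budget for driving a 100 W USB-PD monitor from a graphics card.
# Connector limits per the PCI Express spec; 180 W is the reference
# GTX 1080 board power.

PCIE_SLOT_W = 75      # PCIe x16 slot can supply up to 75 W
PIN6_W = 75           # one 6-pin PCIe power connector
PIN8_W = 150          # one 8-pin PCIe power connector
GTX1080_TDP_W = 180   # reference GTX 1080 board power
USB_PD_MAX_W = 100    # USB Power Delivery ceiling (20 V @ 5 A)

# Reference GTX 1080 draws from the slot plus one 8-pin:
available = PCIE_SLOT_W + PIN8_W          # 225 W total
headroom = available - GTX1080_TDP_W      # 45 W to spare
print(f"Headroom without a USB-PD load: {headroom} W")

# A full 100 W monitor load blows past that headroom:
needed = GTX1080_TDP_W + USB_PD_MAX_W     # 280 W
print(f"Needed with USB-PD load: {needed} W vs {available} W available")
# An extra 6-pin (75 W) would still fall short of the 55 W shortfall plus
# margin; an extra 8-pin (150 W) covers it, which is the point above.
```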
 

SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
Dark Knight said:
I'm somewhat surprised that Nvidia chose to go with GDDR5X and not HBM and that they chose to leave the SLI connectors in place. AMD introduced bridgeless Crossfire with the R9 series in 2013 and Nvidia is choosing to use a different SLI bridge now which is completely new. I'd like to see how AMD will answer with Polaris but if both those new architectures/releases can offer significant gains then it will mean that prices should come down for previous generations especially used and should offer more competition with current generation products between AMD and Nvidia.

NVIDIA is brilliant for avoiding the HBM cluster*uck. It increases BOM costs, has limited capacity, adds unneeded complications to the manufacturing process through the interposer, and is still having availability issues. GDDR5X, on the other hand, offers a near-perfect mix of yields, commonality, capacity and bandwidth.

As for the SLI connector, the bandwidth of Pascal's SLI interface would cause issues if run over a PCI-E slot. Maybe with PCI-E 4.0.
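For context on the bandwidth side of that argument, a quick sketch of the peak-bandwidth arithmetic using published specs (the GTX 1080's 10 Gbps GDDR5X on a 256-bit bus versus the Fury X's first-generation HBM on a 4096-bit bus):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# Figures below are the published specs for each card.

def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return gbps_per_pin * bus_bits / 8

gtx1080_gddr5x = bandwidth_gbs(10.0, 256)   # GDDR5X: 10 Gbps, 256-bit bus
fury_x_hbm = bandwidth_gbs(1.0, 4096)       # HBM1: 1 Gbps, 4096-bit bus

print(f"GTX 1080 (GDDR5X): {gtx1080_gddr5x:.0f} GB/s")
print(f"Fury X   (HBM1):   {fury_x_hbm:.0f} GB/s")
```

HBM still wins on raw bandwidth and power, which is why the debate here is about cost, supply, and whether a 320 GB/s GDDR5X card actually needs more.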
 

Dark Knight

Guest
SKYMTL said:
NVIDIA is brilliant for avoiding the HBM cluster*uck. It increases BOM costs, has limited capacity, adds unneeded complications to the manufacturing process through the interposer, is still having availability issues, etc. etc. GDDR5X on the other hand has a perfect mix of yields, commonality, capacity and bandwidth.

As for the SLI connector, the bandwidth of Pascal's SLI interface would cause issues when running over a PCI-E slot. Maybe with PCI-E 4.0.

Perhaps for the current Pascal release with the 1070/1080, but if Nvidia releases flagship parts (Titan Y? 1080 Ti?), those would likely use HBM2, and we probably wouldn't see them until about a year later, half a refresh cycle on. GDDR5 and GDDR5X were intended as transitional technologies, just like GDDR3 was. Interestingly, AMD first made the jump to GDDR5 in 2008 with RV770 and the R700 series while Nvidia was still on GDDR3 (and would be until 2010 with GF100). Eight years later, AMD looks set to move to HBM2 within the next year or two while Nvidia stays on GDDR5X. It looks like the same situation as eight years ago with GT200 versus RV770/R700.
 

SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
Dark Knight said:
Perhaps for their current Pascal release with the 1070/1080 but if Nvidia releases their flagship parts (Titan Y?, 1080Ti) those parts would likely use HBM2 but we wouldn't likely see those parts till about a year later or half a refresh cycle later. GDDR5 and GDDR5X are and were intended to be transitional technologies just like GDDR3 was. Interestingly enough AMD first made the jump to GDDR5 in 2008 with the RV770 and R700 series while Nvidia was still on GDDR3 (and would be until 2010 with GF100). 8 years later and AMD looks like they will be going HBM2 either this year or in the next year or two while Nvidia will still be on GDDR5X. Looks like the same situation as 8 years ago with GT200 and RV770/R700.

You have perfectly highlighted one of the major reasons why AMD's GPU division has struggled for the last 5 years. If they go with HBM2 this year for gaming-facing cards, it will put another nail in the coffin.
 
