
NVIDIA's GTX 1080 & GTX 1070 Detailed (Comment Thread)


Dark Knight

Guest
I'm not a fan of Steam for their Hardware Survey. But maybe that's just me.
I don't move in rare circles, but I am on most HW forums and in lots of gaming circles (though under one of my multitude of names). So I get a little bit of a pulse on what's out there.
It's still going to take time, but it will come, at least I believe it will.

I would also be inclined to believe the Steam hardware survey. 1080p and even lower resolutions are still mainstream and the most common resolutions encountered. 1440p and 1600p won't become mainstream until they come down further in price and hardware can deliver the same performance at 1440p or 1600p as it does at 1080p. I had a chance to try 1600p (albeit in stores) many years ago, and I still haven't seen 1440p/1600p achieve mainstream status. Enthusiasts make up such a small niche that even if their share were to increase, it still wouldn't compare to the prevalence of 1080p. For example, prior to acquiring my 1152p monitors I ran 1080p, and before that a 5:4 1280x1024 LCD monitor.
 

SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
I agree. I just don't understand why AMD is aiming Polaris at lower to mid-range users. It peeves me that AMD won't have higher-end cards. I predict that Nvidia will win over AMD, depending on whether Nvidia got A-Sync right. I guess it won't matter much if the GTX 1080/1070 turn out to be powerhouses in the market.

Simply put: because while AMD is losing quite badly in desktop market volumes, they are getting absolutely murdered in the notebook market. They need something that can scale into both segments and, at least for the time being, that means mid-tier offerings.

AMD certainly has a chance here though. They need a quick launch of Polaris and then a rapid move to the higher end before the GeForce lineup's cadence leads to a GP100 offspring. GP104 looks to be a killer core but the eventual full unveiling of Pascal into the desktop market is the stuff nightmares are made of for AMD.

While NVIDIA's current lead over the Fury X will likely be in the 45% to 50% range and seems insurmountable in the short term, if AMD is able to realize the same benefits from their own manufacturing process as NVIDIA has from theirs, it is possible for them to compete against or even beat the GTX 1080. They just need to be A LOT faster at rollout than they have been in the past.

One of the biggest challenges for AMD will have been predicting NVIDIA's moves. Remember, it takes upwards of two years to bring a new architecture and its eventual rollout to fruition. In other words, with Pascal launched, NVIDIA is already well into Volta's design. What that means is relatively straightforward: if AMD was banking on the very likely possibility of NVIDIA sticking to their typical 25% to 35% generational performance increases, things aren't going to turn out very well.

With the GTX 1080 we are seeing a transition that would typically take an architecture a year to complete. Think of it as NVIDIA jumping from the GTX 780 Ti all the way to the GTX 980 Ti while skipping the GTX 980 completely.
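To put some rough numbers on that (illustrative figures only, not benchmark results; the 30% value is just an assumed midpoint of that typical range):

    # Illustrative only: a hypothetical ~30% "normal" generational bump, compounded.
    typical_gain = 0.30
    baseline = 1.00                                        # normalize previous-gen performance to 1.0

    one_generation = baseline * (1 + typical_gain)         # ~1.30x: the step AMD may have planned against
    two_generations = baseline * (1 + typical_gain) ** 2   # ~1.69x: roughly the double-jump described above

    print(f"one typical generation:  {one_generation:.2f}x")
    print(f"two typical generations: {two_generations:.2f}x")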

While AMD likely isn't too concerned about a niche card like the GTX 1080, the GTX 1070 may end up being what keeps them up at night. IF its in-game performance gravitates somewhere between the GTX 980 Ti and Titan X, its $379 price is a potentially catastrophic development for Polaris' pricing structure. Even at $299 (mid-tier card pricing), a Polaris card would need to match current R9 Fury performance levels.
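A quick performance-per-dollar comparison shows the squeeze. The only figures taken from the discussion above are the $379 and $299 price points; the relative performance index and the Fury street price are assumptions purely for illustration:

    # Hypothetical relative-performance index (GTX 980 = 100); prices in USD.
    # Performance values are assumed placeholders, not measured results.
    cards = {
        "GTX 1070 @ $379":           {"perf": 145, "price": 379},  # assumed: between 980 Ti and Titan X
        "Polaris @ $299 (needed)":   {"perf": 130, "price": 299},  # assumed: ~R9 Fury level, per the argument above
        "R9 Fury @ ~$450 (assumed)": {"perf": 130, "price": 450},
    }

    for name, card in cards.items():
        print(f"{name:27s} perf per dollar = {card['perf'] / card['price']:.3f}")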


From what I understand, it is Polaris 11 you speak of that is geared towards lower to mid-range users. Polaris 10 will be for higher tiers, and then there is Vega, the highest-end SKU with HBM 2.0. I think for once AMD has thrown Nvidia under the bus, at least for a while, by getting exclusivity on HBM memory: we can see how hard they are struggling with poor yields, and it has had them scrambling to a memory type with only one producer.

I think the opposite. I think AMD did NVIDIA a huge favor by reserving a good portion of HBM wafers. For the record though, AMD doesn't have exclusivity on the technology or production capacity. Also, SK Hynix is currently the ONLY producer of HBM, and their yield levels have been in the absolute shitter since day one, even according to Hynix's own investor briefings and as evidenced by the poor availability of the Fury / Fury X.

What NVIDIA has done is effectively let AMD be the guinea pig again and again when it comes to new memory technologies (which G5X, for the record, is not) and new process technologies. The result has been expensive delays, cancelled product lineups and a highly challenging competitive landscape that has forced AMD to price their wares below market levels. That's not an optimal situation for any company to be in.
 

Soultribunal

Moderator
Staff member
Joined
Dec 8, 2008
Messages
9,426
Location
Orangeville
I would also be inclined to believe the Steam hardware survey. 1080p and even lower resolutions are still mainstream and the most common resolutions encountered. 1440p and 1600p won't become mainstream until they come down further in price and hardware can deliver the same performance at 1440p or 1600p as it does at 1080p. I had a chance to try 1600p (albeit in stores) many years ago, and I still haven't seen 1440p/1600p achieve mainstream status. Enthusiasts make up such a small niche that even if their share were to increase, it still wouldn't compare to the prevalence of 1080p. For example, prior to acquiring my 1152p monitors I ran 1080p, and before that a 5:4 1280x1024 LCD monitor.

I think perhaps we got a little off track from the point I was trying to make.
It was actually in relation to the end of Jokester's post. He stated that most people aren't aiming for 4K gaming. What I was trying to get across was that the technology is catching up and becoming affordable to the point where more people will join the 4K crowd.
There are many people in my circle who don't have Steam (so they never do those surveys) and run these setups (the last LAN party at work had only me running a 1080p monitor, and there were 30 of us). So I don't weigh those surveys nearly as heavily as other people might. Again, I am not saying 1080p isn't mainstream. It is. But cheaper, better technology is catching up to the point where I think there will be a shift. And what I stated initially still holds true: there are more people on 4K today than last year, and there will be more on 4K a year from now than today.
The halo cards, well, I agree those are for a select breed. But being able to drive higher resolutions for less cost than last year is the new reality that will shift it, IMO.

I suspect, to be honest, the average/typical system lies somewhere between the Steam surveys and what we see in various forums. People, especially in the case of monitors, don't upgrade very often; hell, I've never even owned a 1080p monitor.

Your last comment (and I honestly don't mean this as a flame) crystallised something I had noticed while researching my CPU upgrade but could never quite put my finger on. Too often, it seems, those who work professionally with computers, whether it be IT, content creation or what have you, think their work knowledge/requirements/expectations transfer, unquestionably, to the (for lack of a better term) real world. Of course, technology advances, becoming cheaper in the process. However, newer technologies don't always supplant older, established ones. A pure solid-state PC is foreseeable, provided you don't need TBs of bulk storage, but SSDs are still almost ten times the price per GB of HDDs, and if 4K content ever arrives/takes off, HDDs become even more attractive.
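For a rough sense of that gap, here's a back-of-the-envelope cost-per-GB comparison; the prices below are assumed ballpark figures, not quotes:

    # Assumed ballpark prices (USD) -- illustrative only.
    hdd_price, hdd_capacity_gb = 100.0, 3000.0   # e.g. a 3 TB desktop HDD
    ssd_price, ssd_capacity_gb = 150.0, 500.0    # e.g. a mainstream 500 GB SATA SSD

    hdd_per_gb = hdd_price / hdd_capacity_gb
    ssd_per_gb = ssd_price / ssd_capacity_gb

    print(f"HDD: ${hdd_per_gb:.3f}/GB   SSD: ${ssd_per_gb:.3f}/GB")
    print(f"SSD is roughly {ssd_per_gb / hdd_per_gb:.0f}x the price per GB")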

In other words, I've started researching my next upgrade: 6/8 TB of storage or a 400 GB Intel 750. People who don't visit HW/gaming forums, I suspect, don't consider such things. They're far more likely to be rational, maybe even sane. :haha:

I don't take it as a flame at all. I love discussing technology; I'm a technologist at heart. I don't think my knowledge always translates over into the business world... except when it does.
Because I have an understanding of the latest technology out there, I've been able to use it to the benefit of my company.
I have 22 techs under my wing, along with their related hardware. Because I switched them all over to solid state, I have not replaced an HDD in 2 years, whereas the previous 3 years were a much different story. That knowledge carried over to the real world in a good way for me.
My adaptation of an 'enthusiast grade' PSU into the Nortel BCMs we support allowed me to reduce the failure rate to less than 1% (they were using cheap Sparkle PSUs that were factory-chosen by Nortel; I now put good Delta-built, Antec-branded units inside). Another spin on adapting what I know to the 'real world'.

Not everything I know will bridge that gap, but more and more I see it applied in my daily life and in what I do at work. I see it applied with my brother as well, who is a Technical Director at a teaching hospital.

Anywho, I do agree with the principle of what you were saying. I just feel there is a place for new technologies to be adopted and for older ones to remain, but the split (ratio, if you will) is changing.
Faster than we might admit.

Our solid-state server is expensive, yes, and its storage capacity is reduced. But our access times are unquestionably faster, and the productivity gained over the course of a year with our employees outweighs the cost of that solution by a factor of 10. You just have to be willing to make the attempt.

Anywho, back to the GPUs this thread is actually about, since I went off on a bit of a technology rant there, lol.

-ST
 

Lysrin

Well-known member
Joined
Mar 10, 2014
Messages
7,862
Location
Nova Scotia
Taking us in a slightly different direction, the GPU upgrade vs. G-Sync monitor upgrade question comes to mind again with the arrival of these cards.

The demonstrated performance of the 1080, shown as faster than 2x 980 in SLI, means a single card ought to eat up gaming at 1440p. So with the fps the new 1080s are supposed to put out, does that change the best use of money for an upgrade? Is it better to go with a 1080 that'll likely push beyond 60 fps at 1440p pretty much all the time, or are you still going to get an overall better gaming experience by sticking with a 980, for example, and adding G-Sync to your setup?

I'm thinking the latter, because high fps on an older 60 Hz monitor like mine, without G-Sync, is going to result in visual issues that make you want to use v-sync or adaptive v-sync, and then you get into the pros and cons of that again. Is that right, or is Pascal changing more things than I realise that make it not a simple answer?
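A rough way to frame it: a fixed 60 Hz panel scans out every ~16.7 ms no matter when frames finish, so any frame rate not locked to the refresh means frames land mid-scan (tearing) or get held back (v-sync latency/stutter). The frame rates below are just assumed examples:

    # Illustrative only: frame time vs. a fixed 60 Hz refresh interval.
    refresh_hz = 60.0
    refresh_interval_ms = 1000.0 / refresh_hz        # ~16.7 ms between scan-outs

    for fps in (45, 60, 90, 120):                    # assumed example frame rates
        frame_time_ms = 1000.0 / fps
        frames_per_refresh = refresh_interval_ms / frame_time_ms
        print(f"{fps:3d} fps -> {frame_time_ms:5.1f} ms/frame, "
              f"{frames_per_refresh:.2f} frames per refresh")
    # Anything other than exactly 1.00 frames per refresh means tearing (no sync) or delayed /
    # duplicated frames (v-sync); a variable-refresh panel sidesteps this by refreshing when
    # the frame is ready.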
 

Groove

Well-known member
Joined
Jan 14, 2009
Messages
435
Location
Ottawa(ish)
Taking us in a slightly different direction, the GPU upgrade vs. G-Sync monitor upgrade question comes to mind again with the arrival of these cards.

After a little over a month of using G-Sync, I can say that I will never go back to a regular monitor, and if I personally had to choose between swapping my 980 for a 1080 or buying a G-Sync monitor, I would choose the G-Sync monitor hands down! That being said, I'm totally aware that a 980 is nowhere near the level of performance we can expect from the 1080, but for the next year or so I don't see the 980 struggling to the point where games will be unplayable, even at 1440p. If you're trying to figure out which one to choose as an upgrade in the next few weeks/months, I would definitely go with the G-Sync panel; plus, I don't see the point of running a 1080 on a 1440p 60 Hz panel.
 

Soultribunal

Moderator
Staff member
Joined
Dec 8, 2008
Messages
9,426
Location
Orangeville
After a little over a month of using G-Sync, I can say that I will never go back to a regular monitor, and if I personally had to choose between swapping my 980 for a 1080 or buying a G-Sync monitor, I would choose the G-Sync monitor hands down! That being said, I'm totally aware that a 980 is nowhere near the level of performance we can expect from the 1080, but for the next year or so I don't see the 980 struggling to the point where games will be unplayable, even at 1440p. If you're trying to figure out which one to choose as an upgrade in the next few weeks/months, I would definitely go with the G-Sync panel; plus, I don't see the point of running a 1080 on a 1440p 60 Hz panel.

The 980 will be more than potent enough for many users still.

Good to hear about G-Sync; that's something I haven't quite bitten the bullet on yet, though it's nice to hear the feedback about it.

-ST
 

beck2448

Member
Joined
Dec 16, 2010
Messages
21
Excited to see the top out-of-the-box overclocks the AIB partners can achieve. Should be interesting.
 

Lysrin

Well-known member
Joined
Mar 10, 2014
Messages
7,862
Location
Nova Scotia
I've gone from "shut up and take my money" to HURRY UP AND TAKE MY MONEY!
:biggrin: Must have all the new things!!! I am very interested, but I still think that the advice people have given me, to get a new monitor with G-Sync to pair with my 980, is money better spent.
 
