To be perfectly honest with you, I'm still not sold on 4K either. Give me a 144Hz, 16:9, 2560x1440, 30" monitor and I'm in heaven.
Fair enough, but you don't need to spend $1,200 US on a card to enjoy that. Just glad you didn't give it a DAM Good, DAM Good Value, or innovation award. I've seen a few reviewers that got their hands on the card give it a gold seal of approval, and that just gives Nvidia justification to push the asking price even higher on the next launch.
Well what can be said? I think it would run my 1440p ultrawide just fine! :haha:

I think including the 1440p results in the review is a good call (no offence intended ontariotl) because that resolution is becoming the new 1080p, so seeing a card's performance at that res is certainly beneficial. I would love it if HWC would add 3440x1440 for at least a few of the really crushing games. I know the performance would fall between 2560x1440 and 4K, but it would still be nice to see actual numbers because the difference isn't linear.

The price, yes, is crazy, and I was going to ask who this card is really for until I read the review conclusion. I think you summed it up perfectly, SKYMTL. And I'm with you: my responsible self wouldn't let me buy it, but who wouldn't want that level of performance in their games if they could find the money? It would be amazing!!!
But who knows... I didn't think I would spend north of $1K on a monitor either, but I have an X34 on my desk now and I couldn't be more pleased! ... well, I could be a little more pleased if I added... :biggrin:
No offence taken whatsoever. We all have a voice here. Spending $1K on a monitor can be justified since it's a product that stays with you through the several video card refreshes you'll likely make in its lifetime.

As much as I would love the ultrawide aspect of 3440x1440, the problem I have is that most game developers don't support it natively (21:9 in general). Jumping through hoops to mod an ini file or use a 3rd-party app gets tiresome and frustrating, just like in the eyefinity/surround gaming days a few years back, which is why I abandoned it. Even Crysis, which shipped before ultrawide reached the consumer level, supports 21:9 natively. Why can't ALL newer games follow suit?
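For anyone who hasn't had the pleasure, the hoop-jumping usually looks something like this. It's a made-up example, since the file location and key names vary from engine to engine:

```ini
; Hypothetical settings.ini tweak (key names differ per game/engine)
[Display]
ResolutionSizeX=3440
ResolutionSizeY=1440
; Some engines also letterbox to 16:9 unless the aspect lock is disabled
ConstrainAspectRatio=0
```

And that's the easy case; plenty of games need a hex edit or a 3rd-party FOV fixer on top of it.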
This is an embarrassing release for Nvidia. Not only are they not allowing any partners to sell the card (which means a Canadian RMA is not an option), but they also don't allow non-reference designs or voltage unlocking (which was allowed on Kepler, for example). Then to top it off, they named it the same as Maxwell's Titan and haven't provided any real differentiating features. It's a money grab, and I agree with SKYMTL on that.
I'm not going to consider this video card, especially not at the $1,200 USD price point. That's more than the previous Titans were priced at, and it still lacks the compute/workstation capabilities of the original GK110 Titans.
I also saw someone on Kijiji in Toronto trying to sell one for $2,300 when, even after currency conversion, the card costs nowhere near his "firm" asking price. I had a good chuckle at the ad, because even with tax factored in, the person who posted it is just taking advantage of gullible consumers.
Just like Nvidia trying to sell this at $1,200 US, there is always someone who will try to grab their slice of the pie as well. Unfortunately, there will probably be a sucker to take them up on that offer.
I think Nvidia is keeping this card to themselves because they probably don't have the yields to distribute it to their partners this time around. I also think Nvidia is testing the waters on how much people are willing to pay for the latest and greatest, since there's no competition at the high end at the moment.
What AMD needs is pretty clear:
- AVOID HBM AT ALL COSTS. HBM1 gives them limited capacity while HBM2 adds an overly complicated design, huge cost increases and limited benefits over GDDR5X.
- Somehow achieve GTX 1070 / GTX 1080 performance at a comparable TDP. With 14nm obviously falling short of expectations on the power front, that may be a challenge. A card that lands under GTX 1080 FE performance with a TDP of 200W+ would be a disaster.
- Make sure there's product in the channel.
That should be pretty simple, right? Right?
If I recall correctly, isn't it AMD/ATI's way to test a new process on a smaller die in a lower-performance GPU to get the kinks out before rolling it out on their new flagship? It may be falling short of expectations on the 480, but maybe Vega will improve on the TDP front.
I'm worried that AMD is waiting for HBM2 stock levels to improve before rolling out Vega. They should have just gone the GDDR5X route as well. There is a time to experiment, when you have the funds and can afford failure, but now isn't that time; AMD already had its experiment with the Fury line and HBM1. If your competition isn't jumping on the bandwagon, it's usually for good reason, and time to move on.
It makes me wonder: if AMD could have matched the 1070/1080 clock for clock with the 480, where would it be standing at this moment? I wish I had a 480 and at least a 1070 so I could match their clock speeds (downclocking the 1070, obviously) and see who really has the better per-clock performance in benchmarks, TDP aside. I've seen CPU reviews done that way in the past.
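The lazy version is just dividing frame rate by clock speed, something like the quick script below. The fps and clock numbers in it are completely made up for illustration, and performance doesn't scale linearly with clock anyway, which is why actually downclocking both cards is the more honest test:

```python
# Back-of-the-envelope per-clock comparison.
# All fps and clock figures below are made up for illustration only.

cards = {
    # name: (average fps in some game, core clock in MHz)
    "RX 480":   (60.0, 1266),
    "GTX 1070": (85.0, 1683),
}

for name, (fps, clock_mhz) in cards.items():
    # Crude metric: frames per second per GHz of core clock.
    print(f"{name}: {fps / (clock_mhz / 1000):.1f} fps per GHz")
```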
I remember in the golden days no one gave two shits about TDP levels. All anyone asked was "can it play Crysis?".
As for "make sure there's product in the channel", that can be said for Nvidia as well.