
Newegg Leaks Lots of RX 6000 Specs

Bond007

Well-known member
Joined
Jun 24, 2009
Messages
5,944
Location
Nova Scotia
With an architecture update and a node shrink it is hard to jump to much of a conclusion from the specs alone. It looks OK on paper (better than what's out there right now), but it doesn't instill huge confidence. The 6700XT really doesn't look like an upgrade at all on paper, though the architecture may prove otherwise. I really hope it is competitive at RTX 3070-3080 levels at a minimum.
 

Izerous

Well-known member
Folding Team
Joined
Feb 7, 2019
Messages
1,265
Location
Edmonton
If they can at least go neck and neck with a 3070 but at a lower price, they might be able to do enough volume in sales to make the difference, especially after NVIDIA came in with the 30 series at much better prices than the 20 series. The majority of the market isn't the halo cards but the ones under them.

I'm not expecting a whole lot out of the AMD cards to be honest, and haven't for years. They could really use a win on this side, but I don't think it is going to happen.

I kinda half expect a form of multi-GPU to make a comeback though, but not like the 4870x2 / 6990 etc. where it was 2 GPUs on a card. The CPU chiplet design is working well for them... a similar chiplet approach on the GPU side could be really interesting to improve yields and potentially, in turn, cut costs. It would also spread heat out, and could even get idle power consumption way down in theory, since you could shut down chiplets completely.

Something like Card A uses 2 chiplets, Card B uses 3 chiplets, Card C uses 4 chiplets. Then it all comes down to yields and 1 production line.
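As a rough illustration of the yield angle, here's a toy Poisson defect model. The defect density and die areas are made-up numbers purely for illustration, not anything AMD has published:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: probability a die of the given area is defect-free."""
    return math.exp(-area_mm2 * defects_per_mm2)

D0 = 0.001  # illustrative defect density (0.1 defects per cm^2)

# One big monolithic 500 mm^2 GPU vs. four 125 mm^2 chiplets.
monolithic = die_yield(500, D0)
chiplet = die_yield(125, D0)

print(f"monolithic 500 mm^2 yield:   {monolithic:.1%}")
print(f"single 125 mm^2 chiplet:     {chiplet:.1%}")
# Note: under this simple model, needing all four chiplets good works
# out the same as the monolithic yield -- the real win is that a bad
# chiplet only scraps 125 mm^2, and good ones can be binned into the
# smaller 2- and 3-chiplet SKUs instead of being wasted.
print(f"all four chiplets good:      {chiplet ** 4:.1%}")
```

So the per-die yield jump (and the ability to salvage partial sets) is where the cost savings would come from.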
 

JD

Moderator
Staff member
Joined
Jul 16, 2007
Messages
10,155
Location
Toronto, ON
Will be interesting to see how it all plays out, since technically developers have more experience with RDNA2 now, as it's powering the new consoles too. The PS5 only has 36 CUs, though, compared to the XSX at 52 CUs. That would make the PS5 a cut-down 6700XT and the XSX a cut-down 6800XT. It's always impressive what consoles can pull off with "less".
 

Mr. Friendly

Well-known member
Joined
Nov 21, 2007
Messages
5,520
Location
British Columbia
If the 6900 XT doesn't do better than the 3080... it was a bad idea. You'd automatically assume a 6900 XT will compete with the 3090, but if it can't, it leaves AMD with egg on its face and customers with a very sour taste in their mouth.
 

Entz

Well-known member
Joined
Jul 17, 2011
Messages
1,659
Location
Kelowna
It will come down to boost clocks as well. Clocks are why the PS5 and XSX are not light years apart despite the CU difference.

With mature drivers the 5700XT can hang with a 2070S/2080, and with the 6900XT having twice the cores, more bandwidth, a healthy boost, and the RDNA2 changes, double is not out of the realm of possibility. It may not beat a 3090-level card, but it could sit between the 3080 and 3090. If priced right (3080 money) it would be a winner.

The power differences are stark, so if AMD really needed to, they could drop a last-minute BIOS update on everyone again and increase it by 20% for higher clocks lol.
 
Last edited:

Sagath

Moderator
Staff member
Folding Team
Joined
Feb 7, 2009
Messages
4,910
Location
Edmonton, AB
If the 6900 XT doesn't do better than the 3080... it was a bad idea. You'd automatically assume a 6900 XT will compete with the 3090, but if it can't, it leaves AMD with egg on its face and customers with a very sour taste in their mouth.

I would disagree. Consumers, even uninformed ones, usually only care about price/performance. This is why the 1660 and 2060 were the best-selling cards on the market, and even the 5700XT sold well.

The high-end market isn't the majority of sales, or even the majority of the in-use market.
 

Dwayne

Well-known member
Joined
Aug 17, 2008
Messages
1,374
Location
Courtenay, BC
It will come down to boost clocks as well. Clocks are why the PS5 and XSX are not light years apart despite the CU difference.

With mature drivers the 5700XT can hang with a 2070S/2080, and with the 6900XT having twice the cores, more bandwidth, a healthy boost, and the RDNA2 changes, double is not out of the realm of possibility. It may not beat a 3090-level card, but it could sit between the 3080 and 3090. If priced right (3080 money) it would be a winner.

The power differences are stark, so if AMD really needed to, they could drop a last-minute BIOS update on everyone again and increase it by 20% for higher clocks lol.

I don't think that doubling the cores / CUs is going to yield anywhere close to double the performance. Not even Ampere, with its node shrink, increased performance that much. RDNA 2 is an evolutionary iteration on the same node with improvements; call it 7nm+ if you will. And even with a claimed 50% increase in performance per watt, doubling the cores is going to increase the power required. Toss on the mythical 2.2 GHz boost speed, at under 300 W? I just don't see much of this being in the realm of reasonable.

Before the Ampere launch, people were talking about the best RDNA 2 card beating the 2080 Ti; the only question was by how much. I suspect that is true, and that the best card AMD cranks out will fall somewhere between a 2080 Ti (3070) and the 3080. Which is to say it will be a fine product, and if priced aggressively it will draw a lot of attention. But a lot of AMD fans are setting themselves up for major disappointment by believing some of the leaked information.

You mention the consoles: the PS5's GPU runs 36 CUs at 2.233 GHz, while the Xbox's GPU runs 52 CUs at 1.825 GHz. I think both companies are running the cores as aggressively as they dare, and that more cores means slowing the clock down, as seen with the Xbox's clock speed. It stands to reason; just look at the clock speeds of the 2080S versus the 2080 Ti, or the 3080 versus the 3090. An increase in cores means more power draw. More clock speed means more power draw. To keep power at a "reasonable" level you balance cores against speed.
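The CU-versus-clock trade-off works out numerically, too. A quick sketch using RDNA 2's 64 shaders per CU and 2 FP32 ops (one FMA) per shader per clock, with the console CU counts and clocks quoted above:

```python
def fp32_tflops(cus: int, ghz: float, shaders_per_cu: int = 64) -> float:
    """Peak FP32 throughput: shaders x 2 ops (FMA) per clock x clock rate."""
    return cus * shaders_per_cu * 2 * ghz / 1000

ps5 = fp32_tflops(36, 2.233)  # fewer CUs, higher clock
xsx = fp32_tflops(52, 1.825)  # more CUs, lower clock

print(f"PS5: {ps5:.2f} TFLOPS")  # ~10.3
print(f"XSX: {xsx:.2f} TFLOPS")  # ~12.1
# 44% more CUs, but only ~18% more peak throughput once clocks are
# factored in -- which is why the two aren't light years apart.
```

Same reason the 3090's huge core count advantage over the 3080 doesn't translate into a proportional lead: the bigger chip has to clock lower to stay inside its power budget.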

It will be interesting to see the leaks and guesses coming out, though. But I remember the "wait for xxxx" hype around each AMD offering from Vega onward, and the disappointment when it wasn't an "NVIDIA killer" after all.
 

Mr. Friendly

Well-known member
Joined
Nov 21, 2007
Messages
5,520
Location
British Columbia
I would disagree. As consumers, even uninformed ones, only care about Price/Performance usually. This is why the 1660 and 2060 were the best selling cards on the market, and even the 5700xt was highly purchased.

The high end market isnt a majority of sales, or even the majority of the in-use market.
So you're saying someone who buys a 6900 XT thinking it competes with the 3090, and then finds out it's only as good as a 3070, will be happy?

I think not... they'd be choked. So AMD has a lot riding on something named the 6900 XT...
 

sswilson

Moderator
Staff member
Joined
Dec 9, 2006
Messages
19,949
Location
Moncton NB
So you're saying someone who buys a 6900 XT thinking it competes with the 3090, and then finds out it's only as good as a 3070, will be happy?

I think not... they'd be choked. So AMD has a lot riding on something named the 6900 XT...

But why would they "think it's to compete with the 3090"? It's been a long time since we've had parity between flagship GPUs; it's the mid-range equivalents in price/performance that tell the real story IMO.
 