
ATI Radeon HD3870 X2 1GB Review

Review by SKYMTL, HardwareCanuck Review Editor

HD3870 X2 versus HD3870 512MB Crossfire

[Image: HD3870X2-3.jpg]
[Image: HD3870_26.jpg]

The HD3870 X2 is essentially a pair of HD3870 cards with higher core clock speeds and slower GDDR3 memory, so it is obvious that parallels will be drawn between the two. Since we had a pair of HD3870 512MB cards kicking around, I decided to run a few quick head-to-head tests.

[Image: HD3870X2-36.JPG]

[Image: HD3870X2-32.JPG]

[Image: HD3870X2-67.JPG]

The HD3870 X2 provided better overall performance than a pair of standard HD3870 cards in every game we used to benchmark the two against one another. The difference is never large, but it seems that two GPUs on one card connected through a PLX bridge are a bit more efficient than two cards in two slots connected by a Crossfire bridge. Even though the HD3870 X2 looks to be a bit more expensive than a pair of individual HD3870 512MB cards, it has the benefit of using only one PCI-E slot while offering better overall performance.
 

Early Driver Performance Issues & Solutions

A graphics card like this is either laughed at or accepted based on how well its drivers support modern games and benchmarks. Writing and perfecting drivers for what is essentially a Crossfire setup on a single card is a long and arduous task, and the amount of work involved makes my head spin just thinking about it. Yet the ATI driver team has been working furiously, and we received revised drivers a few days before this review was to be published. All of the benchmarks you have seen are based on these new drivers.

That being said, when we first received the HD3870 X2 there were some driver issues which you will likely encounter when you receive the card. Below is a window into some of the conflicts we experienced and how you can work around them while waiting for the next driver revision.

Once again, please note that these issues were only experienced with EARLY drivers and are included here only to give you some idea of how to bypass them.


World in Conflict (DX9)

[Image: HD3870X2-24.JPG]

Problem: While the DX10 benchmarks ran without much of a hitch, things fell apart when the game booted in DX9 mode. As you can see above, I experienced a fair amount of image corruption at default settings, to the point where the in-game menus were completely illegible.

[Image: HD3870X2-27.JPG]

Solution: Turning off Catalyst AI fixed the corruption, but it resulted in extremely low frame rates. Nonetheless, the game was playable and the menus were completely free of artifacts.


Call of Juarez (DX10)

Problem: The problems with this game proved to be a real head-scratcher, since Call of Juarez has been used by ATI in the past to show off its cards. With the HD3870 X2, however, the in-game DX10 benchmark refused to run when AA was enabled. The screen turned black and there was no way to recover short of manually resetting the system.

Solution: Once again, nothing I did fixed this problem other than disabling Catalyst AI, but since that turns off ATI's optimizations, performance suffers greatly.

Thankfully, of all the tests I put the HD3870 X2 through, these were the only two games that showed any glaring issues with the older drivers. Now on to the shining achievement of the ATI team…


A New Hope…..

With only three days to go before the NDA lifted, ATI pulled a rabbit out of its collective hat with a new driver release. The performance difference was so extreme that I couldn't ignore it: I had to redo EVERY test and rewrite a significant part of this review. Not only were the issues mentioned above completely solved, but the performance difference in games like Crysis, World in Conflict and Call of Juarez was like night and day. It is as if the entire driver team suddenly sat up, shouted "Eureka!" and boosted performance by leaps and bounds. They breathed new life into the HD3870 X2, making a great card out of what was merely an OK one.

Stay tuned for these drivers to be released to the general public.
 

The Trials of Overclocking

Some cards embrace overclocking with open arms, but there are a few things you should be aware of before overclocking the HD3870 X2. First of all, there is the little matter of the power connectors.

[Image: HD3870X2-6.jpg]

As with the HD2900XT, there is one 6-pin and one 8-pin PCI-E connector on this card. We tried a few combinations to see whether it was possible to overclock with a pair of 6-pin connectors plugged in; these are the results we got.

[Image: HD3870X2-25.JPG]
[Image: HD3870X2-23.JPG]

On the left is the ATI Overdrive option that appears when both the 8-pin AND the 6-pin PCI-E connectors are installed. When only two 6-pin connectors are plugged in, there is no Overdrive option, though the card operates perfectly normally at stock speeds in any game. There are ways around the need for an 8-pin connector for overclocking; if you need any pointers, feel free to post in the HD3870 X2 comment thread in our forums.

Even with the 8-pin connector installed, overclocking didn’t go very far with this engineering sample.

Max Overclocks:

Core: 865MHz
Memory: 1892MHz (DDR)

These clocks represent a pretty pathetic overclock of 50MHz on the cores and 92MHz on the memory. Considering the GDDR3 memory is rated at 2GHz (DDR), I suspect that either ATI tightened the timings or there is insufficient voltage running to it to allow higher overclocks. The core is another matter altogether; all signs point to the dreaded issue that plagued the early HD3870 512MB cards, which couldn't clock past the 862 to 867MHz mark. Those cards needed a BIOS flash to overclock past that point, but flashing the BIOS on a card with two GPUs is a bit beyond the scope of this review. It is possible that this is an issue with our engineering sample that won't carry over into the actual production cards.
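Since GDDR3 speeds are quoted at the effective DDR rate (twice the actual clock), a quick bit of arithmetic shows how far short of its rating the memory fell. This is nothing more than the standard double data rate conversion applied to the figures above:

```python
# GDDR3 is double data rate: effective (DDR) speed = 2 x actual clock.
rated_ddr = 2000       # MHz (DDR), what the chips are rated for
achieved_ddr = 1892    # MHz (DDR), the best our sample managed

print(rated_ddr / 2)              # 1000.0 : actual clock at the rated speed
print(achieved_ddr / 2)           # 946.0  : actual clock at our maximum
print(rated_ddr - achieved_ddr)   # 108    : MHz (DDR) left on the table
```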

[Image: HD3870X2-56.JPG]

As you can see, these minor overclocks net us less than 200 points in 3DMark06 and would provide next to no performance increase in games. Also, as we discussed before, you have to overclock each GPU and its memory separately, just as you would in a dual-card Crossfire setup. The process goes something like this (there is a short code sketch after the list):

1. Overclock GPU #1
2. Overclock the memory associated with GPU #1
3. Test the clocks
4. If the overclock passes, accept it
5. Overclock GPU #2
6. Overclock the memory associated with GPU #2
7. Test the clocks
8. If the overclock passes, accept it
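For the curious, here is a minimal sketch of that test-and-accept loop in Python. To be clear, set_clocks() and is_stable() are hypothetical stand-ins for whatever your overclocking tool exposes, NOT a real ATI API, and the stock clocks and stability walls are assumptions used only to make the demo run:

```python
# A minimal sketch of the per-GPU test-and-accept loop described above.
# set_clocks() and is_stable() are hypothetical stand-ins, not a real API.

STEP = 5                               # MHz added per attempt
STOCK = {"core": 825, "memory": 900}   # assumed stock clocks (MHz, actual)
WALL = {"core": 865, "memory": 946}    # pretend stability walls for the demo

current = {}  # (gpu_id, domain) -> the clock we last programmed

def set_clocks(gpu_id, domain, mhz):
    # A real tool would program the hardware here.
    current[(gpu_id, domain)] = mhz

def is_stable(gpu_id, domain):
    # A real test would loop a 3D workload and watch for artifacts or hangs.
    return current[(gpu_id, domain)] <= WALL[domain]

def find_max_stable(gpu_id, domain, limit_mhz):
    best = STOCK[domain]
    mhz = best + STEP
    while mhz <= limit_mhz:
        set_clocks(gpu_id, domain, mhz)    # steps 1/2 and 5/6: raise a clock
        if not is_stable(gpu_id, domain):  # steps 3 and 7: test it
            break
        best = mhz
        mhz += STEP
    set_clocks(gpu_id, domain, best)       # steps 4 and 8: accept the last pass
    return best

# Each GPU and its memory are tuned independently, exactly as listed above.
for gpu_id in (1, 2):
    core = find_max_stable(gpu_id, "core", 900)
    mem = find_max_stable(gpu_id, "memory", 1000)
    print(f"GPU #{gpu_id}: core {core}MHz, memory {mem}MHz ({2 * mem}MHz DDR)")
```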

ATI should make things easier by just allowing one GPU to determine the clocks of the other. It would simplify all our lives.
 

Heat & Acoustical Performance

[Image: HD3870X2-75.jpg]

Overall, I was quite surprised at the stock heatsink's ability to cool both cores while maintaining relatively quiet operation. 86°C on one core and 81°C on the other is quite good considering this engineering sample had a somewhat bastardized heatsink design, with two different materials used for the GPU contact plates. On the other hand, I would suggest letting the card cool down before you remove it from your case, since the backplate affixed to the underside gets burning hot.

At first I was worried the fan would make the same racket as the leaf-blower type affair used on the HD2900XT, but I am happy to report that it stayed quite silent through most of the tests. There were some instances where the fan sped up, but even then it stayed at a volume that was acceptable to me. Just remember: I can only report what I experienced, and since these acoustic impressions are subjective, you may or may not find the fan on this card loud.


Power Consumption

This test was done a bit differently from the power consumption tests we have conducted in the past. In this case, Company of Heroes is used with AA enabled to shift the load onto the graphics card so that the CPU does not impact the results as much.

Please remember that this is the power consumption for the WHOLE SYSTEM.

[Image: HD3870X2-72.JPG]

While I was expecting the 55nm cores to do some good, this engineering sample consumes copious amounts of power. This could be down to the early nature of the card we had, but if it carries over into the retail cards, you will need at least a quality 600W power supply for a single card and a quality 850W or higher unit for two of them. Even if it would not have made much of a difference, I would be very interested to see what power consumption would have looked like had the GDDR3 been switched out for GDDR4 modules. On the other hand, there are TWO cores on this card as well as a full gigabyte of memory, so it is understandable why the power consumption is so high. Looked at that way, the power needs of the HD3870 X2 are well within reason.
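To put those PSU recommendations in rough perspective, here is a quick back-of-the-envelope calculation. Note that the wall draw and efficiency figures below are illustrative assumptions on my part, not measurements from our test bench:

```python
# Back-of-the-envelope PSU sizing. The wall draw and efficiency numbers
# are illustrative assumptions, NOT measurements from our test system.

wall_draw = 450     # assumed whole-system draw at the wall, one card (W)
efficiency = 0.80   # assumed PSU efficiency under load

dc_load = wall_draw * efficiency  # power the PSU actually has to deliver (W)
for psu_rating in (500, 600, 850):
    load_pct = 100 * dc_load / psu_rating
    print(f"{psu_rating}W PSU: {dc_load:.0f}W DC load = {load_pct:.0f}% of rating")

# Under these assumptions, a 500W unit would already sit at 72% of its rating
# before any overclocking, while a 600W unit keeps a much healthier margin.
```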
 

Aftermarket Cooler & Water Block Installation

PLEASE NOTE BEFORE READING ANY FURTHER: IT HAS COME TO OUR ATTENTION THAT EVEN THOUGH THE HR-03 REV. A FITS ON THE HD3870 X2, MANY OTHER COOLERS WILL NOT. THIS INCLUDES ANY 7600GT / X850 COMPATIBLE WATER BLOCK OR AIR COOLER.

I was originally going to skip this section since I didn't have access to any coolers compatible with the core placement of the HD3870 X2. Yet when I looked a bit closer, I noticed a few things that everyone should know about the mounting hole offset and other stumbling blocks you may come across. Before we start, let me make three things clear:

1. Two Thermalright HR-03 Rev. A heatsinks will not fit on this card. One is used here to demonstrate the mounting hole offset and nothing more.

2. We will be testing various heatsinks from Zalman and other manufacturers for compatibility, so stay tuned for an upcoming article on the subject.

3. The PLX switch chip produces very little heat, but the large VRMs need heatsinks if you are going with an aftermarket cooler.

[Image: HD3870X2-78.jpg]
[Image: HD3870X2-76.jpg]

So here we are: the HR-03 Rev. A mounted without a problem even though, as you can see, it is not actually compatible with the card itself. The holes line up perfectly with the 7600-series mounting plate that comes with this particular heatsink.

[Image: HD3870X2-77.jpg]
[Image: HD3870X2-79.jpg]

Here is what the standard HD3870 512MB mounting holes look like next to those of the HD3870 X2. They aren't the same, so I am guessing ATI went with a closer offset in order to free up as much PCB space as possible for necessary components.
 

Conclusion

Without a doubt, the ATI HD3870 X2 is a good card, but at the same time it lives and dies by driver and in-game Crossfire support. There are quite a few games, new and old, that do not support Crossfire, and ATI's driver team faces the monumental task of optimizing this dual-GPU card for those situations; as we have seen, they are making great headway. Even though the card we reviewed here today was an early engineering sample, the potential of ATI's new flagship is simply mind-blowing. It feels good to finally be able to say it: ATI has a flagship card. Against all odds, they have crammed two GPU cores onto a single PCB the same length as an 8800GTX's, and its performance is nearly flawless. The HD3870 X2 can fight toe-to-toe with the best Nvidia currently has to offer and in many cases beats everything else hands down. At many points this card shows its brilliance by outpacing even the mighty 8800GTX by a good 20%, though at other times performance falls short of our expectations.

With the 8800GTX and 8800 Ultra set to depart the market in a few weeks' time, ATI has positioned itself perfectly to take advantage of the vacuum created in the high-end category. As it stands, the HD3870 X2 sits at the very top of the performance heap more often than not, and until Nvidia comes up with an answer it will continue to reign supreme. As we saw over the course of this review, its drivers are maturing rapidly, and performance should only get better.

Then there is the question of price. From where I am standing, it looks like the HD3870 X2 will retail for about $500 CAD (and higher) at many retailers. Personally, I think this is great value for the money considering the performance we have seen from this card. That price makes it about $20 more expensive than buying two separate HD3870 512MB cards and running them in Crossfire. Considering the HD3870 X2 offers higher performance than two separate cards and has the added convenience of using a single PCI-E slot, I think this is a win-win situation.

However, with all this praise come a few words of warning. As with all good things, there are still some areas where improvements have to be made, but that is usually the story with any freshly released card. The DX10 performance in World in Conflict stands out quite glaringly as one area that needs work, and ATI's perennially poor scores in Lost Planet continue with this card as well. These should be easily fixed in upcoming driver revisions, but the power consumption is something consumers should watch out for. If you think your 500W power supply will tango with one of these cards, you are sadly mistaken; go out and buy a suitably powerful unit instead.

Finally, mention really has to be made of the overclocking potential of our sample; to put it delicately, it was pathetic. Enthusiasts want to see a good amount of overclocking headroom when they spend $500 on a graphics card, and the headroom this card has will not make any perceptible difference in games or synthetic benchmarks. Whether this will change with the retail cards is not yet known, but all indications point towards disappointing overclocking potential for the first run of HD3870 X2 cards. We hope to get our hands on a retail HD3870 X2 soon to compare its performance and overclocking against this engineering sample, so stay tuned.

After watching ATI fight an uphill battle for the better part of a year, many a naysayer was counting the days until the company rolled over and died. Thankfully, it looks like ATI is back in the game under AMD's tutelage, and there are some very exciting times ahead in the graphics card industry.


Pros:

- Heart-stopping performance
- Price
- HDMI dongle

Cons:

- Sometimes driver optimizations go AWOL
- Fan spins up quite quickly
- Overclocking? What overclocking?


 