
Sapphire Radeon HD 5970 2GB OC Edition Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,857
Location
Montreal

Sapphire Radeon HD 5970 2GB OC Review




Product Number: 21156-01-50R
Price: $600USD / $675CAD
Warranty: 2-years



ATI is on a roll. There is no denying that the boys in red have hammered a succession of nails into NVIDIA’s DX11 aspirations by being first to market with not one but a whole series of brand new, segment-leading DX11 cards. The HD 5800-series was first on the scene and proved that these new cards could compete with the best of the previous generation and then some. However, in many people’s opinions, there was one thing missing: ATI firmly marking their turf by laying claim to the fastest graphics card in the world. That’s where the HD 5970 2GB comes into the picture.

At its most basic, the new HD 5970 is a dual GPU card that makes use of an on-board PLX bridge chip to handle the communication between the two cores. Each GPU core is able to address a whopping 1GB of GDDR5 memory which will hopefully make the bandwidth issues of the HD5800-series of cards a thing of the past. From a pure performance standpoint, this card’s potential is simply out of this world.

We all remember the HD 4870 X2 and the older yet no less significant HD 3870 X2 dual GPU cards so some of you may be wondering where the “X2” moniker went. Well, ATI has decided to do away with old naming conventions for one reason or another and believe it or not, we welcome this change. It cements the HD 5900-series as the current high performance cards in ATI’s lineup while keeping a clear distinction between all of their product ranges.

In this review we will be looking at something unique: a pre-overclocked ATI card being released right alongside the reference-clocked version. That’s right, at launch there will be two different HD 5970 cards being released by the likes of Sapphire, XFX and other ATI board partners: one with standard speeds and another with some increased performance potential. Along with this somewhat shocking revelation, there are several other things that make the HD 5970 a cut above but we will go into those a bit later in this review.

Our introduction wouldn’t be complete without some speculation about the HD 5970’s pricing and availability and on both fronts, it isn’t pretty. We should be looking at an initial “launch” price of about $600USD or $675CAD which will make it the most expensive card on the market by a long shot. However, this price is likely to skyrocket in the days following launch since we hear it will be next to impossible to find. The retailers we have spoken to are all expecting less than 10 cards in total at launch which makes this a paper launch that we are sure will be passed off as a hard launch.

With NVIDIA’s Fermi cards firmly behind the iron curtain somewhere in Santa Clara, ATI has a clear path to complete market domination with their HD 5970. Let’s hope they make the most out of it.


 

A Look at the ATI 5000-series




As you can probably tell by the chart above, all of the HD 5000-series fit perfectly into ATI’s current lineup. At the top of the heap we have the ultra high performance dual GPU HD 5970 which carries most of the same specifications as a pair of HD 5870s. There are however some sacrifices that had to be made in the clock speed department in order to keep power consumption within reasonable levels. So, while this card has the same number of texture units and stream processors as the HD 5870, its core and memory run at speeds identical to the HD 5850.
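To put those specifications in perspective, peak single-precision throughput can be sketched from stream processor count and core clock, since each of ATI's stream processors can issue one multiply-add (two floating-point operations) per cycle. A rough back-of-the-envelope calculation, assuming the reference HD 5970's 725MHz core clock:

```python
# Rough peak single-precision throughput estimate for ATI's current parts.
# Assumes 2 FLOPs (one multiply-add) per stream processor per clock.
def peak_gflops(stream_processors, core_clock_mhz):
    return stream_processors * 2 * core_clock_mhz / 1000.0  # GFLOPS

hd5870 = peak_gflops(1600, 850)      # single GPU at 850MHz
hd5970 = 2 * peak_gflops(1600, 725)  # two GPUs at HD 5850-like clocks

print(f"HD 5870: {hd5870:.0f} GFLOPS")  # 2720
print(f"HD 5970: {hd5970:.0f} GFLOPS")  # 4640
```

The headline figures ATI quotes (roughly 2.7 TFLOPS for the HD 5870 and 4.6 TFLOPS for the HD 5970) fall straight out of this arithmetic.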

Judging from paper specifications alone, the HD 5870 is a technological marvel considering it packs all of the rendering potential of ATI’s past flagship card and then some while not being saddled by an inefficient dual processor design. The fact that this new card could trump the performance of a HD 4890 just a few months after that card’s release is nothing short of stunning.


The HD 5850 on the other hand looks to be the purebred price / performance leader of the new ATI lineup. Barring slightly lower clock speeds for both the core and memory along with eight disabled texture units (and the 160 stream processors tied to them), it is basically a clone of the HD 5870. This is the card ATI hopes will compete directly with the GTX 285 for the near future and then come into its own when DX11 games make their way into the market. We believe this card will appeal to the majority of early adopters since it allows them to buy class-leading DX9 and DX10 performance now without gambling $400 on unproven DX11 potential.

We can also see that ATI did some careful price cutting prior to launch: the HD 4890 offers significantly less performance than a HD 5850 and is now priced accordingly. As such, this previously high end card will stick around for the next few months in the $200 price bracket but that isn’t to say that it will stay there indefinitely...


Meanwhile, we now have the HD 5700-series of code-named Juniper cards as well with the HD 5770 and HD 5750. The HD 5770 1GB is one of the first sub-$200 cards to come stock with 1GB of memory, and that GDDR5 memory is paired with some hefty clock speeds as well. However, even though upon first glance the HD 5770 looks like it can compete with the HD 4890, this isn’t the case. According to ATI, the 128-bit memory interface will limit this card’s performance so it lies right within its stated price range. We should also mention that ATI won’t be replacing the HD 4890 until at least the first quarter of 2010 even though the HD 5770 is looking to take over from the HD 4850.

The HD 5750 on the other hand is simply a cut down HD 5770 with lower clocks, less SPs and a cut down number of Texture Units. It is this card that ATI sees going head to head with the NVIDIA GTS 250 and 9800 GT. It uses GDDR5 memory but there will be both 512MB and 1GB versions released to cater to the $100 market along with those looking for a little jump in performance.

So there you have it. In the high stakes game of poker that is the GPU industry, ATI has shown its hand. All that is left is for the competition to respond.
 

A Focus on DX11


It has been a hair under three years since the release of Windows Vista and with it the DirectX 10 API. In that amount of time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles released in that same time. Let’s hope DX11 does a bit better than that.


DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which will demand fewer system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up other applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component in Windows 7 and according to Microsoft, will also be adopted into Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.


Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.




Through the use of Compute Shader programs in Shader Model 5.0, developers are able to use additional graphical features such as order independent transparency, ray tracing, and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and as mentioned before, also allow for programs requiring parallel processing to be accelerated on the GPU.


For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.


Some of the technologies that ATI is championing are DX11’s new Depth of Field, OIT (Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing how each of these works, it is tessellation which ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now and with the dawn of DX11 these units will finally be put to their full use. OIT, on the other hand, allows true transparency to be added to an object in a way that is more efficient resource-wise than the standard alpha blending method currently used.


Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must have games, A-list titles like the upcoming Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested. What we like to see is at least three DX11 games available before the Christmas buying season; BattleForge is already on the market and will have DX11 support added through a patch.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

 

OpenCL: The Next Big Thing?



As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power by engines such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which in v2.0 supports OpenCL).


“Build it and they will come,” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since there were two standards being pushed at the same market. CUDA in particular is having a hard time of it since it is vendor-specific, with no hardware support outside of NVIDIA’s own products. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language) along with DirectX Compute come into play. OpenCL is a completely open, royalty-free standard managed by a non-profit organization called the Khronos Group, which also has control over OpenGL and OpenAL.


At its most basic level, OpenCL is able to be executed across multiple mediums such as GPUs, CPUs and other types of processors. This makes it possible to prioritize workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial, task-specific work. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.
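To illustrate the dispatch idea (and only the idea — this is not OpenCL code; a real program would enumerate devices and compile kernels through the OpenCL API), here is a toy Python sketch with hypothetical workload names:

```python
# Toy sketch of the heterogeneous-dispatch idea behind OpenCL.
# Workload names and the crude two-way classification are hypothetical,
# purely for illustration of "send work to the processor that suits it".
WORKLOADS = {
    "matrix_multiply":    "data_parallel",  # thousands of independent ops
    "physics_broadphase": "data_parallel",
    "game_script":        "serial",         # branchy, task-specific logic
    "file_io":            "serial",
}

def pick_device(workload):
    """Send data-parallel work to the GPU, serial work to the CPU."""
    return "GPU" if WORKLOADS[workload] == "data_parallel" else "CPU"

print(pick_device("matrix_multiply"))  # GPU
print(pick_device("game_script"))      # CPU
```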


So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic environments for players. Bullet is an open-source physics engine with an ever-expanding library for soft body dynamics, 3D collision detection and other calculations. Meanwhile, PixeLux’s DMM (Digital Molecular Matter) engine uses the Finite Element Analysis method of calculating physics within a game. In past applications, it has been used to calculate actions which have an impact on the game’s environment such as tumbling rubble or debris movement.


With Stream moving to OpenCL, ATI is truly moving towards an open platform for developers which they are hoping will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 

ATI’s Eyefinity Technology



The term Surround Gaming may not mean much to many of you who are reading this article but with the advent of ATI’s new Eyefinity technology, now is a good time to educate yourself. Basically, Eyefinity will give users the ability to use multiple monitors all running from the same graphics card. In the past, simple dual monitor setups have been used by many graphics, CAD or other industry professionals in order to increase their productivity but gaming on more than one monitor was always a bit of a clunky affair. Granted, some products like Matrox’s TripleHead2Go were able to move multi monitor setups into the public’s perception but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or even more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.


While the price of two or even three new monitors may be a bit daunting at first, good 20” and even 22” LCDs have come down in price to the point where some are retailing below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump into a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.


The reason that many in the professional field are familiar with multi monitor setups comes down to one simple fact: they dramatically increase productivity. Imagine watching a dozen stocks without having to minimize windows all the time, or working in Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tool palettes on the third. The possibilities are virtually limitless if the technology is implemented properly.


When it comes to a purely gaming perspective, the thought of a massive view of the battlefield or the ability to see additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors will naturally mean decreased performance considering the massive amount of real-estate that would need rendering. This will mean tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
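The rendering cost scales directly with the number of pixels drawn each frame. A quick calculation, assuming three common 1920x1080 panels side by side:

```python
# Pixel counts for single vs. triple-monitor (Eyefinity-style) setups.
def pixels(width, height, monitors=1):
    return width * height * monitors

single = pixels(1920, 1080)     # 2,073,600 pixels
triple = pixels(1920, 1080, 3)  # 6,220,800 pixels (5760x1080 effective)

print(f"Triple-head renders {triple / single:.0f}x the pixels per frame")
```

Tripling the pixel load per frame is why image quality settings may have to come down even on a card this fast.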


According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.


This technology is all made possible through the use of DisplayPort connectors but this also provides a bit of a limitation as well. Above we can see a number of 3-screen output combinations which the current HD 5800-series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-supporting monitors tend to carry a price premium over standard screens which will increase the overall cost of an Eyefinity setup. Luckily the other two monitors can either use DVI or a combination of DVI and HDMI for connectivity.
 

Hemlock Under the Microscope


With the release of the HD 5970 (code named Hemlock), ATI feels that they now have a card that can take over the ultra high end segment and lay claim to the title of being the fastest graphics card in the world. Behind this card there are some interesting features which should allow ATI to keep their newfound performance lead in the short term.


At the heart of the HD 5970 beats a pair of underclocked HD 5870 cores which are interconnected by a second generation PLX PCI-E switch. This is the same basic setup that we saw with the HD 4870 X2 where the switch connects directly to each core’s PCI-E 2.1 bus interface and acts a lot like an on-board Crossfire interconnect.


One of the most interesting aspects of the HD 5970 is the fact that ATI has put a ton of emphasis on its overclocking abilities. They bill this as being an “unlocked” card which can reach at least 850MHz on the core with memory speeds in excess of 4.8GHz while running at stock volts. In addition, various board partners will be coming out with voltage tuning software in order to push things even further.


The HD 5970 is truly designed as an enthusiast’s dream card. As you can see above, ATI is pushing design features which are normally seen on high-end custom boards from the likes of Gigabyte, ASUS and others. With high-end Volterra digital VRMs and the much talked about ceramic supercapacitors, this seems to be a card that was designed to be pushed to the limits.


In order to keep power consumption and heat in check while offering some headroom for higher than stock clock speeds, ATI was forced to think beyond what most people think a reference card should incorporate. You see, most reference cards from both NVIDIA and ATI / AMD follow the lowest common denominator when it comes to component choices in order to save cost. Not with the HD 5970.

Heat is controlled by the incorporation of a vapour chamber into the contact plate of the heatsink along with a temperature controlled fan which exhausts air through a full-length exhaust grille. In addition, the choice was made to use lower leakage cores that have been binned for decreased power consumption. This is an ultra high end card and ATI’s choices reflect that.


While we already mentioned the specifications of this card, what should pop out here is the overall power consumption. It is nothing short of remarkable. Even though the HD 5970 packs a pair of HD 5870 cores, power consumption remains about 40W less than a pair of HD 5850s due to a simplified PCB and lower clock speeds. ATI recommends a 650W power supply but that might be a bit of an understatement as we will see in our power consumption testing.
 

Sapphire HD 5970 Specifications



Sapphire bills this HD 5970 as their OC Edition which means it comes with higher than stock clock speeds right out of the box. Naturally, there will be a reference edition at a slightly lower price but looking at the “overclock” given to this card, we hope there isn’t much of a price premium. We are looking at a whole 10MHz on the core and 40MHz on the memory. This should allow it to look a bit better than a standard HD 5970 on paper but it won’t make any noticeable difference when it comes to gameplay.


Packaging and Accessories



For the HD 5000-series, Sapphire has gone with a standard box design along with a dreadlock-toting Ruby lookalike for their mascot. All things considered, we were expecting a larger box but this compact design means that shipping will actually cost both you and Sapphire less in the long run.


Within the box, we see that the card itself is well protected by a thick cardboard holder as well as an anti-static bubble wrap sleeve. There is also a foam insert for some additional lateral protection.


Sapphire has packaged a virtual cornucopia of extras along with their HD 5970 OC. You get two DX11 titles in the form of BattleForge and DiRT 2, but it is important to remember that no discs are included. Rather, you will have to download both games from dedicated servers.

In addition to the games, there is the usual Crossfire bridge, 6-pin to 8-pin power cable, Molex to 6-pin adaptor and driver CD. Sapphire has also included a pamphlet showing where you can download their new Redline overclocking and voltage tuning software. Finally, Sapphire has also included a DVI to HDMI dongle and a mini-DisplayPort to DisplayPort adaptor.
 

A Closer Look at the Sapphire HD 5970 2GB OC



The HD 5970 2GB is one heck of an imposing card but ATI has designed it in such a way that it closely resembles the rest of the HD 5000-series cards. Its full length, all-black heatsink shroud spans both cores and is covered in Sapphire’s unique branding sticker.


Much like the HD 5870, HD 5850 and HD 5770, this card uses just the right amount of red highlights to remind users of its ATI lineage. However, unlike other 5-series products, the HD 5970 carries slots along the entire side of the heatsink shroud. These do dump some hot air back into the card’s immediate vicinity but there isn’t enough heat buildup for you to worry about.

We should also mention that the HD 5970 has a single Crossfire connector since no more than two of these cards can be linked together for quad Crossfire.


Once again we see the oddball red-fringed vents at the back of an ATI 5-series heatsink. These are in place to help ventilate the VRMs and other components at the rearmost portion of the PCB while aiding air intake.


The HD 5970 also comes with a full coverage backplate in order to disperse the immense heat generated by the GDDR5 memory modules placed on the card’s back. While it may help with heat issues, we are going to recommend that you let the card stand idle for at least 5 minutes after turning off the system before you handle it. This backplate gets insanely hot and you WILL burn yourself if you are not careful.

The back also incorporates a pair of LEDs which indicate the status of the card.


Considering this card uses about 300W and can supposedly overclock like no-one’s business, ATI thought it prudent to include a single 8-pin PCI-E connector alongside a 6-pin. Since most good power supplies now carry 8-pin connectors, this shouldn’t be an issue for any of you.
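The connector choice follows directly from the PCI Express power budget: the slot supplies up to 75W, a 6-pin connector another 75W and an 8-pin connector 150W. A quick sanity check shows why ATI went with the 8-pin + 6-pin layout:

```python
# PCI Express power delivery limits (per the PCI-E electromechanical spec).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

# HD 5970 layout: slot + 6-pin + 8-pin.
hd5970_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"8-pin + 6-pin card budget: {hd5970_budget}W")  # 300W

# A dual 6-pin layout would cap the card at 225W -- not enough here.
dual_six_pin = SLOT_W + 2 * SIX_PIN_W
print(f"Dual 6-pin budget: {dual_six_pin}W")  # 225W
```

A card drawing around 300W needs every watt of that 300W ceiling, which is exactly why a dual 6-pin configuration was never an option.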

The I/O bracket marks a departure from previous 5000-series cards we have looked at since there is a full-length exhaust grille above the connectors. This supposedly facilitates the additional airflow needed to cool the two cores.

Directly below the exhaust grille are the two DVI connectors as well as a mini DisplayPort output.


There is no denying that the HD 5970 is the longest card in the history of modern GPUs. It dwarfs even the 11.5” HD 5870 by measuring about 12.25” (31cm) which will make it a tight fit into anything but the largest of EATX cases. It used to be that upgrading your graphics card meant perhaps upgrading your power supply but with this card, you may need to upgrade your case as well.
 

Under the Heatsink


Please remember that removing the heatsink from your card will void your warranty.


Once the heatspreader on the back of the HD 5970 is removed we can start to see some of the inner workings of this card. The backplate itself is made out of very thin aluminum and makes direct contact with the Hynix GDDR5 memory modules. These H5GQ1H24AFR ICs are rated at a massive 2.5GHz DDR (5Gbps) at 1.5V, which bodes well for overclocking (we’ll be covering overclocking in a separate article), and are laid out in an 8x128MB pattern per GPU.
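Assuming each GPU talks to its 1GB over a 256-bit bus (eight 32-bit ICs), per-GPU memory bandwidth follows from bus width and effective data rate. A quick sketch comparing the card's stock 4Gbps effective speed against the ICs' 5Gbps rating:

```python
# Per-GPU memory bandwidth from bus width and GDDR5 effective data rate.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps  # bytes/transfer * rate = GB/s

stock = bandwidth_gbs(256, 4.0)  # HD 5970 at stock (1.0GHz GDDR5)
rated = bandwidth_gbs(256, 5.0)  # at the Hynix ICs' 5Gbps rating

print(f"Stock: {stock:.0f} GB/s per GPU")       # 128
print(f"At IC rating: {rated:.0f} GB/s per GPU")  # 160
```

The gap between the two numbers is the memory overclocking headroom these ICs leave on the table.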


There should be no doubt that the HD 5970 is a complicated card to make and we can now see why the cores are placed so close to the “front” of the card: the power distribution area is massive. It contains a quartet of huge solid capacitors as well as the much ballyhooed Volterra digital VRMs that can be programmed via software applications.


Here we have the true heart of this beast: the two RV870 cores along with the PLX switch that facilitates communication between the two chips. We’ve also got a bit of a mystery on our hands. ATI advertises that this card was “designed to take full advantage of PCI-E 2.1 with (a) Gen2 PLX bridge” but it seems that this PLX chip is actually a PCI-E 1.1 / 1.0a unit. According to the number written on the heatspreader, this is a PEX 8547 which uses 48 lanes. We are guessing that ATI picked this particular switch due to its low latency when compared to other PLX products.
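The Gen1-versus-Gen2 distinction matters because it halves the bandwidth available on each link. With the 8b/10b encoding both generations use, per-direction link throughput works out as follows (a quick sketch):

```python
# Per-direction PCI Express link bandwidth. Gen1 runs at 2.5GT/s and Gen2
# at 5.0GT/s per lane; 8b/10b encoding means only 8 of every 10 bits are data.
def link_bw_gbs(lanes, gt_per_s):
    return lanes * gt_per_s * (8 / 10) / 8  # GB/s per direction

gen1_x16 = link_bw_gbs(16, 2.5)  # 4.0 GB/s
gen2_x16 = link_bw_gbs(16, 5.0)  # 8.0 GB/s

print(f"Gen1 x16: {gen1_x16:.1f} GB/s, Gen2 x16: {gen2_x16:.1f} GB/s")
```

If the bridge really is a Gen1 part, each x16 link tops out at 4GB/s per direction rather than the 8GB/s a true Gen2 switch would offer.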
 

Test System & Setup

Processor: Intel Core i7 920 (ES) @ 4.0GHz (Turbo Mode Enabled)
Memory: Corsair 3x2GB Dominator DDR3 1600MHz
Motherboard: Gigabyte EX58-UD5
Cooling: CoolIT Boreas mTEC + Scythe Fan Controller
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX1000W
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows Vista Ultimate x64 SP1


Graphics Cards:

Sapphire HD 5970 OC
HD 5970 (Reference)
ATI HD 4870 X2 (Reference)
Sapphire HD 5870 1GB (Reference)
XFX HD 5850 1GB (Reference)
2x HD 5850 1GB
Sapphire HD 4890 1GB (Reference)
NVIDIA GTX 295 (Reference)

Drivers:

ATI 9.12 Beta (HD 5970)
ATI 9.10 WHQL
NVIDIA 191.07 WHQL


Applications Used:

Call of Duty: World at War
Call of Juarez: Bound in Blood
Crysis: Warhead
Dawn of War II
Fallout 3
Far Cry 2
Left 4 Dead
Tom Clancy’s HawX


*Notes:

- All games tested have been patched to their latest version

- The OS has had all the latest hotfixes and updates installed

- All scores you see are the averages after 4 benchmark runs

- All game-specific methodologies are explained above the graphs for each game

- All IQ settings were adjusted in-game
 