
XFX Radeon HD 5770 1GB GDDR5 Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
HD5850-17.jpg


XFX Radeon HD 5770 1GB GDDR5 Review



Manufacturer's Part Number: HD-577A-ZNFC
Price: Approx. $160USD
Warranty: Double Lifetime



For gamers, the last month has been a dream come true as ATI has effectively opened the floodgates on its 40nm manufacturing process, bringing an entire range of well-priced DX11 cards to market. The release of the high-end HD 5870 was quickly followed by the launch of the HD 5850; a card many people are calling the new 8800 GT due to its amazing price / performance ratio. The fun continues today with the simultaneous launch of two performance-oriented mid-range cards: the HD 5770 1GB and the HD 5750, which will be available in both 1GB and 512MB versions. In this review we will be looking at the XFX HD 5770 1GB GDDR5.

Unlike the HD 5870 and HD 5850, the positioning of the HD 5770 is a bit more ambiguous since its price of around $160USD doesn’t particularly line up with anything in NVIDIA’s current lineup. At this point, the GTS 250 512MB sits at around $125 while the less popular 1GB version comes in at $140, which means ATI is hoping their new card bridges the gap between the GTS 250 and the GTX 260 216. In essence, the gap we are referring to is presently filled in ATI's lineup by the HD 4850 512MB, whose $125 price point is a good 20% lower than that of the HD 5770. Depending on the HD 5770’s performance, this might mean consumers will be forced to choose between a less expensive DX10 card and a costlier DX11 GPU.

In the grand scheme of things, ATI is still aiming to lead the pack when it comes to both the price consumers have to pay for performance and performance per watt. Their 40nm manufacturing process seems to have matured to the point where both of these goals are within reach, which is why we are seeing this sudden glut of DX11 cards.

The only issue thus far with the 5-series launch has been the sparse availability of products. A combination of high demand and very slow restocking has led to shortages of the HD 5870 and the virtual non-existence of the HD 5850. Whether these problems will carry over to the 5700-series is anyone’s guess but from what we hear, these lower-end cards should be available in good quantities come launch.

The HD 5770 1GB we are reviewing today isn’t the usual overclocked card we have all come to expect from XFX; rather, it is based on the ATI reference design and sports stock clocks. We are sure that in the future XFX will flex their overclocking might, but the stock nature of this GPU doesn’t stop it from carrying the usual Double Lifetime Warranty. This warranty basically gives you, along with the second owner (should you choose to sell the card), a modder-friendly lifetime warranty. Along with their 5-star customer and technical support, XFX hopes that these little touches will differentiate them from the competition.

XFX-HD5770-17.jpg
 

A Look at the ATI 5000-series


XFX-HD5770-69.jpg

As you can probably tell from the chart above, all of the HD 5000-series cards fit perfectly into ATI’s current lineup. In essence, the HD 5870 takes the place of the expensive-to-produce and comparatively inefficient dual-GPU HD 4870 X2 as top dog for the time being. Judging from paper specifications alone, the HD 5870 is a technological marvel considering it packs all of the rendering potential of ATI’s past flagship card and then some while not being saddled with an inefficient dual-processor design. The fact that this new card can trump the performance of an HD 4890 just a few months after that card’s release is nothing short of stunning.

XFX-HD5770-16.jpg

The HD 5850, on the other hand, looks to be the purebred price / performance leader of the new ATI lineup. Barring slightly lower core and memory clock speeds along with eight disabled texture units (and the 160 stream processors associated with them), it is basically a clone of the HD 5870. This is the card ATI hopes will compete directly with the GTX 285 for the near future and then come into its own when DX11 games make their way onto the market. We believe this card will appeal to the majority of early adopters since it allows them to buy class-leading DX9 and DX10 performance now without gambling $400 on unproven DX11 potential.

We can also see that ATI did some careful price cutting prior to launch: even though the HD 4890 offers significantly less performance than an HD 5850, it is now priced accordingly. As such, this previously high-end card will stick around for the next few months in the $200 price bracket, but that isn’t to say it will stay there indefinitely...

HD5870-100.jpg

In short order, ATI will have a full range of DX11 cards on the market; all of which have been the subject of rumours over the last quarter. To begin with, we have the two “Cypress” cards, the HD 5870 and HD 5850, which will be followed before the new year by the dual-GPU Hemlock card making use of two Cypress processors. Hemlock sticks to ATI’s mantra of never releasing a card that retails for above $500, but it will nonetheless take over the premier position in this DX11 lineup.

Meanwhile, we now have the HD 5700-series of cards, code-named Juniper, in the form of the HD 5770 and HD 5750. The HD 5770 1GB is one of the first sub-$200 cards to come stock with a 1GB framebuffer and, along with its GDDR5 memory, it sports some hefty clock speeds as well. However, even though at first glance the HD 5770 looks like it could compete with the HD 4890, this isn’t the case. According to ATI, the 128-bit memory interface will limit this card’s performance so that it lies right within its stated price range. We should also mention that ATI won’t be replacing the HD 4890 until at least the first quarter of 2010, even though the HD 5770 is looking to take over from the HD 4850.

The HD 5750, on the other hand, is simply a cut-down HD 5770 with lower clocks, fewer stream processors and a reduced number of texture units. It is this card that ATI sees going head to head with NVIDIA’s GTS 250 and 9800 GT. It uses GDDR5 memory and will be released in both 512MB and 1GB versions to cater to the $100 market along with those looking for a little jump in performance.

So there you have it. In the high stakes game of poker that is the GPU industry, ATI has shown its hand. All that is left is for the competition to respond.
 

A Focus on DX11


It has been a hair under three years since the release of Windows Vista and, with it, the DirectX 10 API. In that time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles released over the same period. Let’s hope DX11 does a bit better.

HD5870-109.jpg

DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which will demand fewer system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component of Windows 7 and, according to Microsoft, will also be brought to Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.

HD5870-110.jpg

Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.

HD5870-111.jpg


HD5870-104.jpg

Through the use of Compute Shader programs in Shader Model 5.0, developers are able to use additional graphical features such as order independent transparency, ray tracing, and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and as mentioned before, also allow for programs requiring parallel processing to be accelerated on the GPU.

HD5870-101.jpg

For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.

HD5870-105.jpg

Some of the technologies ATI is championing are DX11’s new Depth of Field, OIT (Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing how each of these works, it is tessellation which ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now and, with the dawn of DX11, these units will finally be put to their full use. OIT, on the other hand, allows true transparency to be added to an object in a way that is more efficient, resource-wise, than the standard alpha blending method currently used.

HD5870-102.jpg

Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must-have games, A-list titles like the upcoming Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested. We also like that at least three DX11 games will be available before the Christmas buying season, while BattleForge is already on shelves and will have DX11 support added through a patch.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

HD5870-103.jpg
 

OpenCL: The Next Big Thing?


HD5870-115.jpg

As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power by engines such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which in v2.0 supports OpenCL).

HD5870-113.jpg

“Build it and they will come” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since there were two standards being pushed at the same market. CUDA in particular is having a hard time of it since it is vendor-specific, without hardware support from anyone other than NVIDIA. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language), along with DirectX Compute, comes into play. OpenCL is a completely open, royalty-free standard managed by a non-profit organization called the Khronos Group, which also oversees OpenGL and OpenAL.

HD5870-114.jpg

At its most basic level, OpenCL can be executed across multiple mediums such as GPUs, CPUs and other types of processors. This makes it possible to assign workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial and task-specific work. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.
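
To make that execution model a little more concrete, here is a tiny Python sketch of how an OpenCL-style kernel works. The function names only mimic the real API for illustration (a real program would go through the C bindings or a wrapper library, and the runtime would run work items in parallel on whichever device is selected):

```python
# Conceptual sketch of the OpenCL execution model (NOT the real API):
# a "kernel" is a small function executed once per work item, and the
# runtime maps those work items onto the chosen device (GPU, CPU, ...).

def vector_add_kernel(a, b, out, gid):
    """One work item: computes a single output element, indexed by its
    global ID (gid), just like get_global_id(0) in an OpenCL kernel."""
    out[gid] = a[gid] + b[gid]

def enqueue(kernel, global_size, *args):
    """Stand-in for a kernel launch: run one work item per index.
    A real runtime would execute these in parallel on the device."""
    for gid in range(global_size):
        kernel(*args, gid)

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
out = [0.0] * 4
enqueue(vector_add_kernel, len(out), a, b, out)
print(out)  # [11.0, 22.0, 33.0, 44.0]
```

Because each work item touches only its own output element, the runtime is free to run them in any order or all at once; that independence is exactly what makes this kind of workload a natural fit for a GPU.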

HD5870-116.jpg

So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic environments for players. Bullet is an open-source physics engine with an ever-expanding library for soft body simulation, 3D collision detection and other calculations. Meanwhile, PixeLux’s DMM (Digital Molecular Matter) engine uses the Finite Element Method to calculate physics within a game. In past applications, it has been used to simulate actions which have an impact on the game’s environment, such as tumbling rubble or debris movement.

HD5870-117.jpg

With Stream moving to OpenCL, ATI is truly moving towards an open platform for developers which they are hoping will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 

ATI’s Eyefinity Technology


2404aba41c60e982.jpg

The term Surround Gaming may not mean much to many of you reading this article but with the advent of ATI’s new Eyefinity technology, now is a good time to get acquainted. Basically, Eyefinity gives users the ability to run multiple monitors from a single graphics card. In the past, simple dual monitor setups have been used by graphics, CAD and other industry professionals to increase their productivity, but gaming on more than one monitor has always been a bit of a clunky affair. Granted, products like Matrox’s TripleHead2Go did move multi-monitor setups into the public eye, but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.

2404aba41c633257.jpg

While the price of two or even three new monitors may seem daunting at first, good 20” and even 22” LCDs have come down in price to the point where some retail below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump to a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.

2404aba41c626f4b.jpg

The reason many in the professional field are familiar with multi-monitor setups comes down to one simple fact: they dramatically increase productivity. Imagine watching a dozen stocks without constantly minimizing windows, or using Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tool palettes on a third. The possibilities are virtually limitless if the technology is implemented properly.

2404aba41c634766.jpg

From a purely gaming perspective, the thought of a massive view of the battlefield or the ability to spot additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors naturally mean decreased performance considering the massive amount of screen real estate that needs rendering. This means tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
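
A quick back-of-the-envelope calculation (our own arithmetic, not ATI’s figures) shows why: rendering load scales roughly with the number of pixels drawn each frame, and a 3x1 landscape group of 1920x1200 panels triples the pixels a card has to push:

```python
# Rough pixel-count comparison between a single monitor and a 3x1
# Eyefinity group. Rendering cost is not exactly linear in pixels,
# but this gives a sense of the extra work the GPU takes on.

def pixels(width, height, monitors=1):
    return width * height * monitors

single = pixels(1920, 1200)               # one 24" panel
triple = pixels(1920, 1200, monitors=3)   # 3x1 landscape Eyefinity group

print(single)            # 2304000
print(triple)            # 6912000
print(triple / single)   # 3.0
```

Three times the pixels on a mid-range card is exactly the situation where lowering AA or detail settings becomes the price of admission.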

HD5870-15.jpg

According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.

2404aba41c635d0d.jpg

This technology is made possible through the use of DisplayPort, but that also brings a limitation. Above we can see a number of three-screen output combinations which the current HD 5800-series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-equipped monitors tend to carry a price premium over standard screens, which will increase the overall cost of an Eyefinity setup. Luckily, the other two monitors can use either DVI or a combination of DVI and HDMI for connectivity.
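
That connector rule boils down to a simple check. The sketch below is our own encoding of it for illustration only, not ATI’s actual driver logic:

```python
# Sanity-check for the three-screen output constraint described above:
# a triple-monitor group on these cards needs at least one of the
# screens attached via DisplayPort; the other two can be DVI or HDMI.

def valid_triple(outputs):
    """outputs: list of three connector types, one per attached monitor."""
    allowed = {"DVI", "HDMI", "DisplayPort"}
    return (len(outputs) == 3
            and set(outputs) <= allowed
            and "DisplayPort" in outputs)

print(valid_triple(["DVI", "DVI", "DisplayPort"]))   # True
print(valid_triple(["DVI", "DVI", "HDMI"]))          # False: no DisplayPort
```

In practice this means budgeting for at least one DP-equipped monitor (or, later on, an active adapter) when planning an Eyefinity setup.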
 

Packaging and Accessories


XFX-HD5770-1.jpg
XFX-HD5770-2.jpg

Typical of all of XFX’s new ATI cards, their HD 5770 1GB comes in a long, narrow box with an exterior design that is sure to stand out on any retailer’s shelf. While it may look to hold all of the information one would need, the box lacks mention of one critical thing: clock speeds.

XFX-HD5770-3.jpg

There are also quite a few stickers with ATI’s Eyefinity logo, but the thing that stuck out most was the fact that this card includes a voucher for a free full-version download of BattleForge instead of DiRT 2. However, we have to remember that DiRT 2 was included by ATI only with the HD 5800-series of cards and won’t necessarily be part of HD 5700-series packaging.

XFX-HD5770-5.jpg

You can tell that XFX uses the same box for their entire HD 5000-series of cards since there is plenty of space between the HD 5770 and the outer edge of the box. That being said, the card doesn’t move around at all and the protection afforded here is more than enough.

XFX-HD5770-4.jpg

As for accessories, we get the usual array of items including a Crossfire bridge, manual, driver CD, Molex to PCI-E power adaptor, DVI to VGA dongle and an XFX door sign. We already mentioned that a coupon for the full version of BattleForge is included as well; with the download, you are also given access to a special edition in-game card: the Fallen Sky Elf.
 

A Closer Look at the XFX HD 5770 1GB


XFX-HD5770-8.jpg
XFX-HD5770-6.jpg

The heatsink shroud is usually the one thing that stands out most about any graphics card and indeed the HD 5770 looks to be a spitting image of the HD 5870 and HD 5850. It is interesting to see ATI going with a dual-slot heatsink design on a mainstream card that sports a 40nm core, but who are we to question such a great-looking design?

XFX-HD5770-7.jpg

The sticker XFX uses on the heatsink goes well with the red highlights present around the reference card, but we can’t help feeling that the whole yellow-and-black “workzone” section is a bit overdone.

XFX-HD5770-9.jpg
XFX-HD5770-10.jpg

The rearmost portion of the HD 5770 has a pair of intake vents for the fan which also serve to cool the VRMs, so aside from looking a bit gaudy, they do serve a purpose. Unlike those found on the HD 5850 and HD 5870, however, these don’t work so well. First of all, the single PCI-E power connector is recessed quite far within its vent, which makes finding it once the card is installed a bit of a hit-and-miss affair. In addition, the whole shroud projects about a quarter inch past the PCB, making it look like a cheap, tacked-on afterthought rather than a seamless piece.

XFX-HD5770-14.jpg
XFX-HD5770-13.jpg

The side of the HD 5770 mirrors that of the higher end cards with an ATI Radeon-branded strip of red that houses a number of vents. These openings are strategically placed in order to take care of any excess hot air flow that doesn’t get exhausted out the backplate.

The backplate holds exactly what you would expect for an Eyefinity-compatible card: a pair of DVI connectors along with lone HDMI and DisplayPort connectors.

XFX-HD5770-11.jpg
XFX-HD5770-12.jpg

The back of the card is left without an aluminum heatsink plate even though it houses a quartet of GDDR5 memory modules. As with all of the other Radeon cards of the DX11 generation, the PCB is black.

The memory modules used are Hynix H5GQ1H24AFR ICs. These are rated for 1.25GHz (5Gbps effective) at 1.5V and are arranged as eight 128MB modules on the HD 5770.
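
For the curious, peak memory bandwidth falls straight out of these numbers. The short sketch below is our own calculation using the HD 5770’s reference 1200MHz memory clock (slightly below the ICs’ 1.25GHz rating), and it illustrates why ATI says the 128-bit bus keeps this card from catching the 256-bit HD 4890:

```python
# Peak GDDR5 bandwidth from clock speed and bus width (our own
# back-of-the-envelope math, not an official spec sheet figure).

def gddr5_bandwidth_gbs(clock_mhz, bus_width_bits):
    # GDDR5 transfers four bits per pin per memory clock (quad data rate)
    effective_gbps = clock_mhz * 4 / 1000        # Gbit/s per pin
    return effective_gbps * bus_width_bits / 8   # GB/s

hd5770 = gddr5_bandwidth_gbs(1200, 128)   # reference HD 5770: 1200MHz, 128-bit
hd4890 = gddr5_bandwidth_gbs(975, 256)    # reference HD 4890: 975MHz, 256-bit

print(hd5770)  # 76.8
print(hd4890)  # 124.8
```

Despite the hefty memory clock, the narrower interface leaves the HD 5770 with roughly 60% of the HD 4890’s peak bandwidth, which lines up with ATI’s own positioning of the card.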

XFX-HD5770-15.jpg

The HD 5770 is quite short, which is par for the course when it comes to sub-$200 cards. In total, its PCB is 8¼” long but since the heatsink shroud projects about ½” past the back edge, its total length is closer to 8¾”.
 

Test System & Setup

Processor: Intel Core i7 920 (ES) @ 4.0GHz (Turbo Mode Enabled)
Memory: Corsair 3x2GB Dominator DDR3 1600MHz
Motherboard: Gigabyte EX58-UD5
Cooling: CoolIT Boreas mTEC + Scythe Fan Controller
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX1000W
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows Vista Ultimate x64 SP1


Graphics Cards:

XFX HD 5770
XFX HD 5750
XFX HD 5850 (Reference)
ATI HD 4890 (Reference)
Sapphire HD 4850 (Reference)
Diamond HD 4770 (Reference)
EVGA GTX 260 216 (Reference)
EVGA GTS 250 1GB (Reference)
EVGA GTS 250 512MB (Reference)
9800 GT 512MB (Reference)


Drivers:

ATI 8.66 RC7 Beta (HD 5000-series)
ATI 9.9 WHQL
NVIDIA 191.07 WHQL


Applications Used:

Call of Duty: World at War
Call of Juarez: Bound in Blood
Crysis: Warhead
Dawn of War II
Fallout 3
Far Cry 2
Left 4 Dead
Tom Clancy’s HawX


*Notes:

- All games tested have been patched to their latest version

- The OS has had all the latest hotfixes and updates installed

- All scores you see are averages of 4 benchmark runs

- All game-specific methodologies are explained above the graphs for each game

- All IQ settings were adjusted in-game
 

Call of Duty: World at War


HD4890-27.jpg

To benchmark this game, we played through 10 minutes of the third mission (Hard Landing) starting from when the player first enters the swamp, through the first bunker until the final push onto the airfield. This was benchmarked using FRAPS.


1680 x 1050

XFX-HD5770-20.jpg


XFX-HD5770-21.jpg


1920 x 1200

XFX-HD5770-22.jpg


XFX-HD5770-23.jpg


2560 x 1600

XFX-HD5770-24.jpg


XFX-HD5770-25.jpg


Lately, Call of Duty: World at War has been particularly hard on ATI cards, but the HD 5770 actually does quite well. In the majority of the tests it stays just behind the GTX 260 216 and gradually pulls away from the GTS 250 1GB as the resolution increases. We can also see that the gap between it and the HD 5750 is significant, to say the least.
 

Call of Juarez II: Bound in Blood


VAPORX-2GB-84.jpg

CoJ is a bit of an oddity in that it offers no in-game AA options but, nonetheless, it looks incredible. For this benchmark we used a 10-minute gameplay sequence which included panoramic views of a town as well as gun battles. FRAPS was used to record framerates.

1680 x 1050

XFX-HD5770-26.jpg


1920 x 1200

XFX-HD5770-27.jpg


2560 x 1600

XFX-HD5770-28.jpg


Call of Juarez is an NVIDIA card’s worst nightmare, and the HD 5770 manhandles even the more expensive GTX 260 216 at every resolution. Once again, it is also light years ahead of the HD 5750 1GB.
 
