
Sapphire Radeon HD5570 1GB DDR3 Single & Crossfire Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,900
Location
Montreal







Product Number: 11167-04-40R
Warranty: 2 years
Price: Approx $85 USD
TechWiki Info: Sapphire Radeon HD5570 1GB



Just last week, we reviewed the new ATI HD 5450, which is destined to be the lowest-end card in the current 5000-series lineup. That card left a bit to be desired since it didn’t offer anything drastically different from last-generation products that are retailing for substantially less. Putting it behind them, ATI is today releasing yet another series of products destined to complete their assault on the sub-$100 price point.

The HD 5500 series is the spiritual successor to the outgoing HD 4550 and encompasses two cards for the time being: the HD 5550 and HD 5570. Both of these cards are based on the Redwood Pro core and are targeted straight at the NVIDIA GT 220 while replacing the HD 4670 and HD 4650. With the supposed capability to run games at or slightly above 1680 x 1050, they seem perfectly placed to suit the needs of budget gamers. However, budget gamers aren’t the only ones being targeted by these cards since they are equally well-equipped to tackle HD video and audio decoding. ATI has indeed found their way into the hearts of HTPC users everywhere and they intend to keep their preeminence here as well.

For the purposes of this review, we will be focusing on the gaming capabilities of the higher-end HD 5570 1GB. ATI does make some interesting claims about it being able to play games at up to 1080P, which would make this card much more than what first meets the eye considering its price of around $85 USD. This price also puts it into close proximity to the aforementioned GT 220, which is presently retailing for around $75 depending on whether you opt for the 1GB DDR2 part or the full-bore DDR3 unit. Even though the GT 220 is stated as ATI’s target, the $85 price of the higher-end SKU brings it to nearly the same price as 1GB GT 240s and within spitting distance of some GT 240 512MB GDDR5 models. It should also be mentioned that HD 4670 cards are retailing these days for around $80, but ATI is hoping that the HD 5570’s DX11 capabilities and Eyefinity support will sell you on its price.

All in all, it should be interesting to see how this product does when compared to other cards in its class. We also have some interesting tests lined up which will pit a pair of HD 5570s against the competition while running in software Crossfire mode. So, let’s not wait any longer and get right into this review.

 
HD 5570 1GB DDR3 Specifications




The HD 5570 is without a doubt an interesting card when you look at its specifications; at first glance, it seems to fit somewhere between the older HD 4650 and HD 4670. When compared to the HD 4670, it has more SPs but gets beaten cleanly in terms of texture horsepower and outright core speed, which could lead to a very close battle between these two lower-end cards. Several publications have also hinted at the possibility that the HD 5000 series’ shaders don’t perform quite as well as those on the HD 4000 series and from experience, we tend to agree. This, combined with the lower texture unit count, means the HD 4670 will pull ahead of the new HD 5570 in several key situations.


One of the main points that will attract many people to the HD 5570 is the fact that it is one of the most powerful single slot, low profile cards currently on the market. This will allow it to fit into some of the slimmest cases around and could prove to be just what the doctor ordered for people with small form factor HTPC cases.

Price-wise, the HD 5570 is a bit of a mixed bag when compared to both this generation’s cards and the HD 4000 series, but there should be enough separation to eliminate overlap. With the HD 4670 still hanging around on several retailers’ shelves at under $90, things should look good for this new card as long as it can perform up to expectations.
 

Focusing on DX11



It has been a hair under three years since the release of Windows Vista and with it the DirectX 10 API. In that amount of time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles released in that same time. Let’s hope DX11 does a bit better than that.


DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which will demand fewer system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up other applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component in Windows 7 and according to Microsoft, will also be brought to Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.


Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.




Through the use of Compute Shader programs in Shader Model 5.0, developers are able to use additional graphical features such as order independent transparency, ray tracing, and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and as mentioned before, also allow for programs requiring parallel processing to be accelerated on the GPU.


For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.


Some of the technologies that ATI is championing are DX11’s new Depth of Field, OIT (Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing you how each of these works, it is tessellation which ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now and with the dawn of DX11, these units will finally be put to their full use. OIT, on the other hand, allows true transparency to be added to an object in a way that is more efficient resource-wise than the standard alpha blending method currently used.
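To see why standard alpha blending forces engines to depth-sort transparent geometry every frame (the overhead that OIT avoids), consider this toy sketch; the single-float "colors" and 50% alpha values are made up purely for illustration:

```python
# Toy demonstration of why classic alpha blending is order-dependent,
# which is the problem Order Independent Transparency (OIT) solves.
# Colors are single floats in [0, 1] for simplicity.

def blend_over(dst, src, alpha):
    """Classic 'over' operator: composite src onto dst with the given alpha."""
    return src * alpha + dst * (1.0 - alpha)

def composite(background, layers):
    """Blend a list of (color, alpha) layers onto a background, in order."""
    color = background
    for src, alpha in layers:
        color = blend_over(color, src, alpha)
    return color

layers = [(1.0, 0.5), (0.0, 0.5)]  # a white and a black translucent fragment

back_to_front = composite(0.5, layers)
front_to_back = composite(0.5, list(reversed(layers)))

# The same two fragments give different results depending on draw order,
# so correct transparency normally requires sorting by depth first.
print(back_to_front, front_to_back)  # 0.375 vs 0.625
```

Because the result changes with draw order, the traditional approach must sort translucent surfaces back-to-front; OIT techniques resolve the correct result regardless of submission order.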


Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must-have games, A-list titles like the upcoming Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested. What we like to see is at least three DX11 games available before the Christmas buying season; BattleForge is already out and will have DX11 support added through a patch.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

 

OpenCL: The Next Big Thing?




As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power by engines such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which in v2.0 supports OpenCL).


“Build it and they will come,” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since there were two standards being pushed at the same market. CUDA in particular is having a hard time of it since it is vendor-specific, without hardware support from any other vendor. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language) along with DirectX Compute come into play. OpenCL is a completely open, royalty-free standard managed by a non-profit organization called the Khronos Group, which also has control over OpenGL and OpenAL.


At its most basic level, OpenCL is able to be executed across multiple mediums such as GPUs, CPUs and other types of processors. This makes it possible to prioritize workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial and task-specific work. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.
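As a conceptual toy (these are not real OpenCL API calls), the "route work to the processor that handles it best" idea can be sketched like this; the device names and relative throughput numbers are invented for illustration, whereas real OpenCL enumerates actual platforms and devices:

```python
# Toy model of OpenCL-style heterogeneous dispatch: send each workload to
# the processor type that handles it most efficiently. The throughput
# figures below are made-up illustrative values, not benchmarks.

THROUGHPUT = {
    ("GPU", "data_parallel"): 100,  # GPUs excel at wide, data-heavy work
    ("GPU", "serial"):          5,  # ...but stall on branchy serial code
    ("CPU", "data_parallel"):  10,
    ("CPU", "serial"):         50,  # CPUs excel at serial, task-specific work
}

def pick_device(kind, devices=("CPU", "GPU")):
    """Choose the device with the highest throughput for this workload kind."""
    return max(devices, key=lambda d: THROUGHPUT[(d, kind)])

print(pick_device("data_parallel"))  # GPU
print(pick_device("serial"))         # CPU
```

The point of a heterogeneous runtime is exactly this kind of scheduling decision, made once the code is written against a single portable API rather than one vendor-specific toolkit.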


So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic environments for players. Bullet Physics is an open-source physics engine with an ever-expanding library for soft body dynamics, 3D collision detection and other calculations. Meanwhile, PixeLux’s DMM (Digital Molecular Matter) engine uses the Finite Element Analysis method of calculating physics within a game. In past applications, it has been used to calculate actions which have an impact on the game’s environment, such as tumbling rubble or debris movement.


With Stream moving to OpenCL, ATI is truly moving towards an open platform for developers which they are hoping will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 

ATI’s Eyefinity Technology



*Note: Eyefinity is not available on all models of the HD 5570*


The term Surround Gaming may not mean much to many of you who are reading this article but with the advent of ATI’s new Eyefinity technology, now is a good time to educate yourself. Basically, Eyefinity gives users the ability to run multiple monitors from the same graphics card. In the past, simple dual-monitor setups have been used by many graphics, CAD and other industry professionals to increase their productivity, but gaming on more than one monitor was always a bit of a clunky affair. Granted, some products like Matrox’s TripleHead2Go were able to bring multi-monitor setups into the public consciousness, but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.


While the price of two or even three new monitors may seem daunting at first, good 20” and even 22” LCDs have come down in price to the point where some are retailing below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump into a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.


The reason that many in the professional world are familiar with multi-monitor setups comes down to one simple matter: they dramatically increase productivity. Imagine watching a dozen stocks without having to minimize windows all the time, or using Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tooltips on the third. The possibilities are virtually limitless if the technology is implemented properly.


When it comes to a purely gaming perspective, the thought of a massive view of the battlefield or the ability to see additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors will naturally mean decreased performance considering the massive amount of real-estate that would need rendering. This will mean tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
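The render-load arithmetic behind that performance hit is straightforward; a quick sketch (using a hypothetical 3x1 landscape group of 1680 x 1050 panels) shows how the pixel count the GPU must shade scales with the monitor count:

```python
# Rough arithmetic on why Eyefinity costs performance: the GPU has to
# shade every pixel on every screen, so a 3x1 group roughly triples
# the rendering workload versus a single panel of the same resolution.

def pixels(width, height, monitors=1):
    """Total pixels the GPU must render per frame for a monitor group."""
    return width * height * monitors

single = pixels(1680, 1050)              # one 1680x1050 screen
triple = pixels(1680, 1050, monitors=3)  # hypothetical 3x1 Eyefinity group

print(single, triple, triple / single)   # 1764000 5292000 3.0
```

Three times the pixels does not always mean exactly one-third the frame rate (some per-frame work is resolution-independent), but it explains why image-quality settings usually have to come down.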


According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.


This technology is all made possible through the use of DisplayPort connectors, but that also introduces a limitation. Above we can see a number of three-screen output combinations which the current HD 5800-series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-supporting monitors tend to carry a price premium over standard screens, which will increase the overall cost of an Eyefinity setup. Luckily, the other two monitors can use either DVI or a combination of DVI and HDMI for connectivity.
 

HD Audio and Video




One of the main drawing points of the lower-end cards in the HD 5000-series lineup is the fact that they are literally unmatched when it comes to HTPC use. Granted, the GT 210, 220 and 240 cards are the first from the green side of the pond to receive native audio processing without having to resort to a clunky S/PDIF cable, but their HD audio compatibility is limited to non-PAP (Protected Audio Path) implementations. Meanwhile, the HD 5000 series not only supports native HDMI audio with compatibility for AC3, 8-channel LPCM and DTS among others, but it also introduces PAP support for bitstream output of Dolby TrueHD, DTS-HD Master Audio, AAC and Dolby AC-3. This allows high-end audio from 7.1 sources to be passed unhindered from your computer to your receiver and is a huge step up from what the competition offers.

As for HD video, you get everything that you would expect from an ATI card: compatibility with HDMI 1.3 formats, an option for a DisplayPort connector and full support for ATI’s UVD 2.2.


Enhanced DVD Upscaling & Dynamic Contrast


While there are plenty of us who will use HD signals through the HD 5000-series of cards, whether we like it or not we will still be outputting lower-definition signals to our wonderful new HDTVs every now and then. In these cases, a standard 480i picture will look absolutely horrible if it is simply scaled up to fit a 1080P TV, so ATI provides the Avivo HD upscaling option in their drivers. What this does is take the low-resolution signal and clean it up, so to speak, so it looks better when displayed on a high-definition screen.
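As a rough illustration of what any upscaler starts from, here is the crudest possible approach, nearest-neighbour stretching, in a dozen lines; Avivo HD's actual algorithm is far more sophisticated (edge-aware filtering rather than pixel duplication), so treat this only as a baseline sketch:

```python
# Minimal nearest-neighbour upscale: the naive stretching that makes a
# 480-line source look blocky on a 1080-line panel, and which smarter
# upscalers like Avivo HD improve upon.

def upscale_nearest(row, out_width):
    """Stretch a 1-D row of samples to out_width via nearest-neighbour."""
    in_width = len(row)
    return [row[i * in_width // out_width] for i in range(out_width)]

row = [10, 20, 30, 40]                 # four source samples
stretched = upscale_nearest(row, 8)    # stretched to twice the width
print(stretched)                       # [10, 10, 20, 20, 30, 30, 40, 40]
```

Each source sample is simply repeated, which is exactly why unprocessed 480i content looks coarse at 1080P: a 1080 / 480 = 2.25x stretch duplicates detail instead of reconstructing it.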


Another interesting feature ATI has packed into their drivers is the Dynamic Contrast Adjustment. Personally, I more often than not adjust the contrast manually based on the application since the values from one game or movie to the next can vary a lot. ATI has taken the guesswork and thrown it out the window by providing a post-processing algorithm which will automatically (and smoothly) adjust the contrast ratio in real time.
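A bare-bones sketch of the underlying idea, a linear stretch of a frame's luma range, looks like this; ATI's real dynamic contrast is proprietary and temporally smoothed, so this is only a conceptual toy with made-up sample values:

```python
# Toy version of automatic contrast adjustment: linearly stretch a frame's
# luma samples so the darkest maps to 0 and the brightest to 255.

def stretch_contrast(luma):
    """Remap 8-bit luma samples to span the full 0-255 range."""
    lo, hi = min(luma), max(luma)
    if hi == lo:                      # flat frame: nothing to stretch
        return list(luma)
    scale = 255.0 / (hi - lo)
    return [round((v - lo) * scale) for v in luma]

frame = [60, 80, 100, 120]            # a washed-out, low-contrast frame
print(stretch_contrast(frame))        # [0, 85, 170, 255]
```

A driver-side algorithm must also avoid sudden jumps between frames, which is why ATI emphasizes that its adjustment happens smoothly in real time.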

While there are other benefits of using the 5000-series for audio and video pass-through for your home theater, we will stop here and get on with the rest of this review.
 

A Closer Look at the Sapphire HD 5570 1GB





As we have been saying for a while now, once you have seen one Sapphire box you have pretty much seen them all. As much as possible, Sapphire standardizes their box size, which saves them money but can mean that the end user pays more in shipping charges for smaller cards.



Accessories-wise, say hello to the bare minimum. You get a driver CD along with a disc containing SimHD for Messenger and a pair of low-profile brackets. There isn’t a single adaptor in sight, which is disappointing considering this particular HD 5570 comes without an HDMI connector.



The physical size of the HD 5570 should come as a pleasant surprise to many of you since it is the first card to offer relatively high performance in a low-profile-capable package. However, while it is compact, it does have an active heatsink as opposed to the passive one found on lower-end ATI cards. Sapphire has also done away with the usual black PCB and decked out their HD 5570 series in their usual blue attire.



The back of this particular card holds a quartet of 128MB Samsung memory modules and not much else. It is, however, interesting to see that Sapphire has gone with a solid installation for their heatsink, with screws and washers instead of lower-end pushpins.
 

A Closer Look at the Sapphire HD 5570 1GB (cont’d)



We did notice a slight difference in the heatsink design between the Sapphire card and a reference sample we were provided. From what we can tell, the Sapphire HD 5570 makes use of a solid aluminum-based heatsink that covers the GPU core but not the full die as opposed to the copper-based full coverage unit used on the ATI reference sample. While it may look like the Sapphire heatsink has a cut-down design, we believe it to be slightly superior due to the fact that it can draw in additional cool air from the back of the assembly.


The backplate shows us one of the reasons we are a bit disappointed in this particular card and indeed ATI’s reference sample as well. Basically, it forgoes an HDMI output for Eyefinity support and a legacy VGA output. We know that there will be some Sapphire HD 5570 cards released with HDMI outputs but we just wish this was one of them.



The length of this card is the same as the HD 5670 and HD 5450, which is about as short as a graphics card can get while staying within the limits of the PCI-E slot. One way or another, you WILL be able to fit this card into your case…length-wise.
 

Test System & Setup


Please note that this test system was specifically picked to run our budget GPUs hand in hand with a configuration that doesn't cost more than $500 CAD for the CPU, motherboard and memory.

Processor: Intel Core i5 750 @ 2.67GHz (Turbo Mode Enabled)
Memory: 2x2GB OCZ Platinum PC-15000 @ 6-7-6-17 1066MHz DDR
Motherboard: Gigabyte H57M-USB3
Cooling: Thermalright TRUE
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX520
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows 7 Ultimate N x64




Graphics Cards:

Sapphire HD 5670 1GB GDDR5
XFX HD 5750 (Reference)
Diamond HD 4770 (Reference)
ATI HD 4650 512MB
ATI HD 4670 512MB
Sparkle GT 240 1GB GDDR3
Sparkle GT 240 512MB GDDR5
Sparkle GT 220 1GB GDDR3
EVGA 9800 GT 512MB (Reference)
EVGA 9600 GT 512MB (Reference)
Palit 9600 GSO 384MB (Reference)


Drivers:

ATI 10.1 WHQL
ATI 10.2 Beta
NVIDIA 196.21 WHQL


Applications Used:

Batman: Arkham Asylum
Borderlands
Dawn of War II
DiRT 2
Dragon Age: Origins
Far Cry 2
Left 4 Dead 2


*Notes:

- All games tested have been patched to their latest version

- The OS has had all the latest hotfixes and updates installed

- All scores you see are the averages after 2 benchmark runs

- All game-specific methodologies are explained above the graphs for each game

- All IQ settings were adjusted in-game
 


Batman: Arkham Asylum (DX9)


Even though Batman: AA has its own in-game benchmarking tool, we found that its results are absolutely not representative of real-world gaming performance. As such, we used FRAPS to record a run-through of the first combat challenge, which is unlocked after completing the first of The Riddler’s tasks. It includes close-in combat with up to 8 enemies as well as ranged combat. In addition, we made sure to set the smoothframerate line in the game’s config file to “false”. No AA was used as the game engine does not natively support it.
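For reference, per-frame timings from a tool like FRAPS reduce to a single average FPS figure as total frames over total elapsed time, not as a mean of instantaneous per-frame FPS values; the millisecond frame times below are made up purely for illustration:

```python
# How frame-time logs reduce to an average FPS number: divide the frame
# count by the total elapsed time. The sample values are illustrative only.

def average_fps(frame_times_ms):
    """Average FPS for a run, given per-frame render times in milliseconds."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

run = [16.7, 16.7, 33.3, 16.7, 16.7]   # one hitch among otherwise smooth frames
print(round(average_fps(run), 1))
```

Note how a single slow frame drags the average below the 60 FPS that the smooth frames alone would suggest, which is one reason averages from multiple recorded runs paint a fairer picture than a single pass.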


1680 x 1050



1920 x 1200

 