
Sapphire Radeon HD 5830 1GB Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,857
Location
Montreal


Sapphire Radeon HD 5830 1GB GDDR5 Review





Product Number: 11169-00-51R
Price: Approx. $250 USD / $280 CAD
Warranty: 2-years



ATI has been attacking the DX11 market from every single price point as of late and they finally have what appears to be a full deck of cards which appeal to literally every market niche. They have released an astonishing nine cards in the last six months with a few more to come while the competition seems to be doing nothing but spinning their wheels and respinning chips. Even though NVIDIA’s March 26th launch of their GF100 parts is just around the corner, ATI isn’t by any means sitting back and watching the world pass them by considering they are actively working on refreshing their lineup in addition to releasing new products. This in itself is no small feat and it really shows how far this company has come since AMD stepped in to purchase them.

In what probably seems like a dream come true for many people looking to upgrade their GPUs and a recurring nightmare for reviewers, ATI is releasing yet another GPU today: the “Cypress LE”, or HD 5830 1GB as it will be commonly referred to. ATI’s HD 5000 series lineup does span almost every price bracket, but they felt that there was a big enough performance gap between the higher-end HD 5850 and the decidedly more mainstream HD 5770 to warrant a product that would bridge the two. Previously, this spot was taken up by the HD 4890 1GB, which was retailing for around $200 USD before getting the axe in preparation for the HD 5830 launch. With the HD 5850 currently sitting in what many consider to be the leading price / performance spot at around $315 USD and some HD 5770s retailing for less than $170, there was plenty of room to play with.

For all intents and purposes, this is a bit of a “freak” since what we are looking at is a card based on GPU cores that weren’t able to meet the binning requirements for use in the HD 5850. This translates into an identical die size and transistor count to the HD 5850, but in a product that is significantly less powerful. This is a great move from a cost savings standpoint since cores that were going unused can now be recycled into a competitive product. To make matters even more interesting, ATI is supposedly not giving their board partners a set reference design to base their products on. As such, at launch you will likely see all manner of HD 5830s: some based on HD 5850 PCBs, others using slightly lower-end starting points, and more cooler designs than you can shake a stick at.

When it comes to a card like the HD 5830, price is everything, and when we reached out to our retailer contacts they came back with answers. Basically, expect launch prices for this card to go all the way up to (and maybe even slightly over) $275 CAD / $245 USD depending on the accessory and game packages. This puts ATI’s new card in an interesting position: at $70 USD less than an HD 5850 and about $70 more than an HD 5770, we’re really talking about the middle ground here. As has become tradition with ATI launches these days, stock will also be a bit tight for the first few days at some locations but should improve quickly after that.

In this particular review we will be looking at the Sapphire HD 5830 1GB which makes use of a cooler that is akin to the one used on the HD 5850 Vapor-X and is based off of that card’s PCB. This should give it some incredible temperatures but will also make it significantly longer than the HD 5770. That being said, without further ado let’s get on with this review.

 

The HD 5830’s Specifications & Market Placement



As we mentioned in the introduction, the HD 5830 is tailor-made to slot into the substantial price gap between the HD 5850 and the HD 5770. It is also destined to replace the outgoing HD 4890 1GB. Unlike the HD 5770, which used a smaller, 1.04 billion transistor derivative of the HD 5850’s and HD 5870’s 2.15 billion transistor core, this new card uses that same enthusiast-grade Cypress core and redubs it the “Cypress LE”. However, as you can see from the specifications above, the HD 5830 uses a highly cut-down version of ATI’s high-end architecture. This is basically done by taking cores that failed to pass the binning process necessary for HD 5850 and HD 5870 cards and disabling some SPs, ROPs and texture units in order to make a product that fits into a more mainstream price segment.


The result of these die cuts is a video card that does bridge the gap between high-end and mainstream products, but at the cost of a significant amount of rendering horsepower. Many people were hoping that this card would have something along the lines of 1280 stream processors enabled, but ATI seems to have gone a bit wild here and ended up cutting 320 for a total of 1120 SPs. The ROPs and texture units also went under the knife, with the ROPs in particular being cut down to half of their count in the HD 5850. This can and will have a significant impact on overall rendering performance regardless of the fact that the HD 5830 has its core clocked 75MHz higher than its bigger brother. Memory speeds, however, stay the same.
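To put those cuts in perspective, peak shader throughput can be sketched with a bit of arithmetic. The clock speeds used below (725MHz for the HD 5850, 800MHz for the HD 5830) are the commonly quoted reference values rather than figures from this review, and the 2 FLOPs per SP per clock assumes one multiply-add per cycle:

```python
# Rough theoretical single-precision throughput for ATI's Cypress-based parts.
# Assumes each stream processor retires one multiply-add (2 FLOPs) per clock.
def theoretical_gflops(stream_processors, core_mhz, flops_per_clock=2):
    return stream_processors * core_mhz * flops_per_clock / 1000.0

hd5850 = theoretical_gflops(1440, 725)  # 2088.0 GFLOPS
hd5830 = theoretical_gflops(1120, 800)  # 1792.0 GFLOPS despite the +75MHz clock
print(f"HD 5830 retains {hd5830 / hd5850:.0%} of the HD 5850's shader power")
```

Note that this only captures shader math; with the ROP count halved, real-world rendering will fall further behind the HD 5850 than these numbers suggest.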


When compared to the outgoing HD 4890 on the other hand, this new card looks to hold an edge even though it has been theorized that the HD 5000-series’ SPs don’t work as well as those on the HD 4000 series. With more texture units and the same number of ROPs yet slightly lower core clock speeds, it seems like the HD 5830 1GB should hold a slight edge in some situations. Or at least we hope…

All of this cutting results in a card that is priced right at the midpoint between the HD 5850 and HD 5770 while targeting the now-discontinued GTX 260 216 and GTX 275 cards in terms of overall performance. Will this approach appeal to this card’s target audience? Only time will tell.
 

Focusing on DX11


It has been a hair under three years since the release of Windows Vista and with it the DirectX 10 API. In that amount of time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles released in that same time. Let’s hope DX11 does a bit better than that.


DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which demands fewer system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up other applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component of Windows 7 and, according to Microsoft, will also be brought to Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.


Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.




Through the use of Compute Shader programs in Shader Model 5.0, developers are able to use additional graphical features such as order independent transparency, ray tracing, and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and as mentioned before, also allow for programs requiring parallel processing to be accelerated on the GPU.
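For the curious, the shape of work a Compute Shader accelerates is a "kernel": one small function applied independently to every element of a large data set. This toy Python sketch (purely illustrative; real Compute Shaders are written in HLSL, and the pixel values and gain here are invented) mimics a trivial post-processing pass:

```python
# Illustrative stand-in for a data-parallel kernel: the same function runs on
# every element with no dependencies between elements, which is exactly the
# shape of work a GPU spreads across its stream processors.
def brighten(pixel, gain=1.2):
    # Scale each RGB channel, clamping to the usual 0-255 range.
    return tuple(min(255, int(c * gain)) for c in pixel)

def run_kernel(framebuffer, kernel):
    # On a GPU every invocation runs in parallel; here we simply map serially.
    return [kernel(p) for p in framebuffer]

frame = [(100, 150, 250), (10, 20, 30)]
print(run_kernel(frame, brighten))  # each pixel processed independently
```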


For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.


Some of the technologies that ATI is championing are DX11’s new Depth of Field, OIT (Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing you how each of these works, it is tessellation which ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now, and with the dawn of DX11 these units will finally be put to their full use. OIT, on the other hand, allows true transparency to be added to an object in a way that is more efficient resource-wise than the standard alpha blending method currently used.


Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must-have games, A-list titles like Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested. What we like to see is at least three DX11 games available before the Christmas buying season, while BattleForge is already on shelves and will have DX11 support added through a patch.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

 

OpenCL: The Next Big Thing?



As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power by engines such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which in v2.0 supports OpenCL).


“Build it and they will come” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since there were two standards being pushed for the same market. CUDA in particular is having a hard time of it since it is vendor-specific, without hardware support from any other vendor. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language) along with DirectX Compute come into play. OpenCL is a completely open standard managed by a non-profit organization called the Khronos Group, which also has control over OpenGL and OpenAL.


At its most basic level, OpenCL is able to be executed across multiple mediums such as GPUs, CPUs and other types of processors. This makes it possible to prioritize workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial, task-specific work. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.
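That routing idea can be sketched in a few lines. The workload descriptions and device labels below are invented for illustration; a real OpenCL host program would discover devices through the platform API and explicitly queue kernels to them:

```python
# Toy scheduler illustrating OpenCL's core idea: one program, with each
# workload routed to whichever processor handles its shape most efficiently.
def pick_device(workload):
    # Data-parallel jobs (the same op over huge arrays) suit the GPU;
    # branchy, serial jobs suit the x86 CPU.
    return "GPU" if workload["parallel"] else "CPU"

jobs = [
    {"name": "soft-body physics solve", "parallel": True},
    {"name": "game AI decision tree",   "parallel": False},
]
for job in jobs:
    print(job["name"], "->", pick_device(job))
```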


So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic environments for players. Bullet is an open-source physics engine with an ever-expanding library for soft body simulation, 3D collision detection and other calculations. Meanwhile, PixeLux brings their DMM (Digital Molecular Matter) engine, which uses the Finite Element Analysis method of calculating physics within a game. In past applications, it has been used to calculate actions which have an impact on the game’s environment such as tumbling rubble or debris movement.


With Stream moving to OpenCL, ATI is truly moving towards an open platform for developers which they are hoping will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 

ATI’s Eyefinity Technology



The term Surround Gaming may not mean much to many of you reading this article, but with the advent of ATI’s new Eyefinity technology, now is a good time to educate yourself. Basically, Eyefinity gives users the ability to run multiple monitors from the same graphics card. In the past, simple dual monitor setups have been used by many graphics, CAD and other industry professionals in order to increase their productivity, but gaming on more than one monitor was always a bit of a clunky affair. Granted, some products like Matrox’s TripleHead2Go were able to push multi-monitor setups into the public eye, but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.


The price of two or even three new monitors may be a bit daunting at first for many of you, but good 20” and even 22” LCDs have come down in price to the point where some are retailing below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump into a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.


The reason that many in the professional field are familiar with multi-monitor setups comes down to one simple fact: they increase productivity immensely. Imagine watching a dozen stocks without having to minimize windows all the time, or running Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tool palettes on a third. The possibilities are virtually limitless if the technology is implemented properly.


When it comes to a purely gaming perspective, the thought of a massive view of the battlefield or the ability to see additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors will naturally mean decreased performance considering the massive amount of real estate that needs rendering. This means tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
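The performance hit is easy to estimate, since the pixel count (and thus the fill-rate load) scales linearly with the number of screens. A quick back-of-the-envelope calculation, using a common 1920x1200 panel purely as an example:

```python
# Pixel counts for single vs. triple-monitor (3x1 landscape) setups.
def megapixels(width, height, monitors=1):
    return width * height * monitors / 1_000_000

single = megapixels(1920, 1200)        # 2.304 MP on one screen
eyefinity = megapixels(1920, 1200, 3)  # 6.912 MP across three screens
print(f"{eyefinity / single:.1f}x the pixels to render")  # 3.0x
```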


According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.


This technology is all made possible through the use of DisplayPort connectors, but this also imposes a bit of a limitation. Above we can see a number of 3-screen output combinations which the current HD 5800-series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-supporting monitors tend to carry a price premium over standard screens, which will increase the overall cost of an Eyefinity setup. Luckily, the other two monitors can use either DVI or a combination of DVI and HDMI for connectivity.
 

HD Audio and Video



One of the main drawing points of the lower-end cards in the HD 5000 series lineup is the fact that they are literally unmatched when it comes to HTPC use. Granted, the GT 210, 220 and 240 cards from NVIDIA are the first cards from the green side of the pond to receive native audio processing without having to resort to a clunky S/PDIF cable, but their HD audio compatibility is limited to non-PAP (Protected Audio Path) implementations. Meanwhile, the HD 5000 series not only supports native HDMI audio (with compatibility for AC3, 8-channel LPCM and DTS among others) but also introduces PAP support for bitstream output of Dolby TrueHD, DTS-HD Master Audio, AAC and Dolby AC-3. This allows high-end audio from 7.1 sources to be passed unhindered from your computer to your receiver and is a huge step up from what the competition offers.

As for HD video, you get everything that you would expect from an ATI card: compatibility with HDMI 1.3 formats, an option for a DisplayPort connector and full support for ATI’s UVD 2.2.


Enhanced DVD Upscaling & Dynamic Contrast


While plenty of us will use HD signals with the HD 5000-series of cards, whether we like it or not we will still be outputting lower definition signals to our wonderful new HDTVs every now and then. In these cases, a standard 480i picture will look absolutely horrible when scaled up to fit a high definition 1080p TV, so ATI provides the Avivo HD upscaling option in their drivers. What this does is take the low resolution signal and clean it up, so to speak, so it looks better when displayed on a high definition screen.


Another interesting feature ATI has packed into their drivers is the Dynamic Contrast Adjustment. Personally, I more often than not adjust the contrast manually based on the application since the values from one game or movie to the next can vary a lot. ATI has taken the guesswork and thrown it out the window by providing a post-processing algorithm which will automatically (and smoothly) adjust the contrast ratio in real time.

While there are other benefits of using the 5000-series for audio and video pass-through for your home theater, we will stop here and get on with the rest of this review.
 

Packaging and Accessories




According to the information we have, many of ATI’s board partners will be packaging their first run of HD 5830 cards with either Call of Duty: Modern Warfare 2 or Aliens versus Predator. Naturally, you will probably be able to buy products without the included games and save yourself a few bucks.

Sapphire chose to bundle their card with CoD: MW2 and their packaging reflects this right down to the battle hardened soldiers gracing both sides of their box. Other than the oversized advertisement for MW2, there are also numerous logos indicating all of the technologies and connectors this particular card comes with. All in all, it’s a busy looking box.



The interior packaging is done up to Sapphire’s usual high standards with the card sitting amid a sea of foam, cardboard and bubble-wrap. Trust us when we say this package will take a licking before something can actually damage your brand new card.



The accessory list is basic other than the inclusion of a nearly $60 game in Modern Warfare 2. You get a driver CD, two extremely long Molex to SATA cables, an oversized Crossfire bridge, a DVI to VGA dongle and the usual quick start guide. The Crossfire connector needs special mention since it is the perfect size for those of you with motherboards featuring PCI-E slots that are further apart than the standard Crossfire bridge can reach.
 

A Closer Look at the Sapphire Radeon HD 5830 1GB




For those of you suffering through a case of déjà vu right about now, we can sympathize since the Sapphire HD 5830 1GB is literally a spitting image of the HD 5850 Toxic we reviewed earlier this week. Naturally, there are quite a few differences between the two cards hiding under the massive heatsink shroud, but upon first glance they can easily be mistaken for one another.

Naturally, as with all new ATI cards, the heatsink makes the dominant visual impact with its massive expanse of black plastic and a centrally mounted 90mm fan which covers up the aluminum cooling fin assembly. The downside to this design compared to other ATI cards on the market is the fact that it doesn’t exhaust much of the heat from the core outside of your case. However, this is almost a non-issue since the 40nm core on this card doesn’t generate all that much heat in the first place. We can also see that Sapphire makes reference to Modern Warfare 2 on the card itself with a small yet effective sticker.


HD 5830 below, HD 5850 Toxic above

While the HD 5850 Toxic and HD 5830 may look the same, there are some subtle differences which start with the heatsink. First of all, the plastic shroud on Sapphire’s HD 5830 1GB uses a different design and actually protrudes over the back of the PCB by a good ½”, making this card slightly longer than its near-twin.


HD 5830 right, HD 5850 Toxic left

A closer look behind the fan reveals that the inner cooling assembly is also different, since Sapphire chose not to use their more expensive Vapor-X cooler on this lower-end card. The HD 5830 instead uses a slightly smaller cooling fin assembly, replacing the fins directly over the core with a slab of aluminum. On the other hand, cooling could also benefit from the two additional heatpipes used to whisk heat away from the GPU.


As alluded to in the previous paragraphs, the heatsink on this card makes use of five copper heatpipes that touch the main copper contact plate. These heatpipes quickly move the heat away from the core and out to the aluminum fins, where it is dispersed by fresh air from the fan. We can also see that Sapphire chose to leave the memory ICs on this card free of any heatsinks since they are confident the downwards airflow created by the fan will be sufficient to cool them.


The rearmost portion of the HD 5830 carries a pair of 6-pin PCI-E power connectors since, even though the specifications of this card can’t hold a candle to those of the HD 5850, it consumes almost as much power. Meanwhile, the backplate houses a pair of DVI connectors as well as outputs for DisplayPort and HDMI, giving this card Eyefinity compatibility.


When it comes to overall length, things don’t look too great for those of you expecting a design more compact than the HD 5850. The Sapphire HD 5830 is about 10 ½” from end to end, which makes it longer than the HD 5850 but shorter than the HD 5870, and it easily passes over the edge of a standard ATX motherboard. With the power connectors placed at the back of the card, you will need about 11” of clear space within your case if you have any hope of installing this card.
 

Test System & Setup

Processor: Intel Core i7 920 (ES) @ 4.0GHz (Turbo Mode Enabled)
Memory: Corsair 3x2GB Dominator DDR3 1600MHz
Motherboard: Gigabyte EX58-UD5
Cooling: CoolIT Boreas mTEC + Scythe Fan Controller (off for power consumption tests)
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX1000W
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows 7 Ultimate N x64 SP1


Graphics Cards:

Sapphire HD 5830 1GB
Sapphire HD 5870 1GB (Stock)
ATI HD 4890 1GB (Reference)
Sapphire HD 5850 1GB (Stock)
EVGA GTX 285 (Stock)
GTX 275 896MB (Stock)
GTX 295 (Stock)
EVGA GTX 260 216 (Stock)


Drivers:

ATI 10.3 Beta
NVIDIA 195.62 WHQL


Applications Used:

Batman: Arkham Asylum
Borderlands
Dawn of War II
DiRT 2
Dragon Age: Origins
Far Cry 2
Left 4 Dead 2


*Notes:

- All games tested have been patched to their latest version
- The OS has had all the latest hotfixes and updates installed
- All scores you see are the averages of 2 benchmark runs
- All game-specific methodologies are explained above the graphs for each game
- All IQ settings were adjusted in-game
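For reference, the averaging mentioned in the notes is nothing exotic; a minimal sketch (the frame-rate values are made up for illustration):

```python
# Average per-resolution FPS across repeated benchmark runs, matching the
# "averages of 2 benchmark runs" methodology described in the notes above.
def average_runs(runs):
    # runs: list of per-run average FPS values for one card at one resolution.
    return sum(runs) / len(runs)

dirt2_1920 = [58.3, 57.1]  # two FRAPS passes (invented numbers)
print(round(average_runs(dirt2_1920), 1))  # 57.7
```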
 

Batman: Arkham Asylum (DX9)


Even though Batman: AA has its own in-game benchmarking tool, we found that its results are absolutely not representative of real-world gaming performance. As such, we used FRAPS to record a run-through of the first combat challenge, which is unlocked after completing the first of The Riddler’s tasks. It includes close-in combat with up to 8 enemies as well as ranged combat. In addition, we made sure to set the smoothframerate line in the game’s config file to “false”. No AA was used as the game engine does not natively support it.


1680 x 1050



1920 x 1200



2560 x 1600

 