
SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,861
Location
Montreal


AMD’s 890GX Chipset: Low-End Price, High-End Features




Contributing Writers:
Patrick "Mac"
Mike "lemonlime"



Before we really get into this article, let’s rewind to November of 2007 when the original AMD Phenom processors were first released as B2 stepping chips. As many of you will remember, those were tough times for AMD since Intel had already been heavily marketing their quad core Kentsfield for almost a year. To make matters even worse, the original Phenoms failed to perform up to expectations and featured a TLB bug that popped up in certain situations. AMD however retained their fighting spirit, pushed through adversity to release the B3 stepping Phenoms and eventually went on to introduce the highly successful Phenom II and Athlon II lines of dual, tri and quad core processors. With these products, they have been able to concentrate on bringing value to the CPU market in the face of what seems to be rapidly increasing prices from Intel’s camp.

While AMD’s processors have quickly progressed from one generation to the next, the associated AM2+ and AM3 motherboards didn’t fare quite as well. 790FX-based boards were the flagship products back when they were released in 2007 and still remain so today. The same can be said about the other 790-series boards but many of them were replaced with the newer 785G products over the course of last year. Let’s be honest though; in terms of component lifecycles, nearly three years is a hell of a long time. The 700 series of chipsets were getting long in the tooth and while the 785G boards did breathe some life into things, AMD needed an update. This is where the new 800-series comes into play.

Today marks the official release of AMD’s new chipsets that are based on the new SB850 and SB810 southbridge chips. These new products are basically more evolved versions of the 700-series and will make up the backbone of the new Leo and Dorado platforms. At their most basic, the two new platforms will be targeting different ends of the market. AMD has stated the Leo will appeal to buyers looking at the upcoming Phenom II X6 and Zosma-based quad core processors while the Dorado aims for the lower end dual and quad core products. According to the information we have, these new chipsets pack a number of new features but will retail for almost the same amount as the outgoing products. If anything, this should cement AMD’s value-oriented goals into the mindsets of the buying public.

It should also be noted that today marks the first time the media is allowed to talk about the upcoming Thuban-based Phenom II X6 as a bona fide product. You won’t see any firm benchmarks yet (at least not from us) and actual availability is slated for sometime in the April / May timeframe, but at least we now know the name of this elusive 45nm, 6-core processor. However, the only reason AMD is announcing it now is to stave off some of the bleeding which will inevitably come with Intel’s upcoming Gulftown release.

For the purposes of this article we will be concentrating on the 890GX chipset since it targets a potentially lucrative market for AMD and one which will appeal to consumers looking at a value-oriented setup. There won’t be many benchmarks here since they will come in the separate motherboard reviews but we do have boards from ASUS and Gigabyte on-hand which will be on display a bit later in this article. So, treat this article like a primer for a whole series of 890-series reviews in the coming months.


 

Introducing the Leo and Dorado Platforms


With the release of the 800-series chipsets, AMD is now moving away from their older “Dragon” and “Pisces” platforms towards what they call “Leo” and “Dorado”. Leo will basically be the all-encompassing name for their high end and upper mainstream products while Dorado will end up targeting entry level consumers. Let’s take a look at how these new platforms compare to their predecessors.


Looking at the chart above, it should become apparent that the new SB8xx-based boards aren’t really going to offer anything revolutionary but will rather evolve AMD’s platforms to fit better with today’s market trends. In our opinion, the most important move from a compatibility standpoint is the omission of DDR2 support from the newer platforms. It was only a matter of time until the move to a DDR3 / AM3-exclusive platform was made and make no mistake about it; we feel this change has been a long time coming.

Much like the 790GX and 785G from the last generation, the lower-end 800-series boards will feature integrated graphics, but we will go into more detail about that a bit later. One of the more interesting things about the graphics aspects of these chipsets is the fact that yet another mid-range GX-series board from AMD will feature dual 8x PCI-E lanes for Crossfire use. However, it should be noted that it is up to the board manufacturer to implement the necessary on-board automatic switch to ensure full 16x operation if only the first PCI-E slot is populated. Also of note is the fact that none of these boards will be SLI certified for the time being. The 880G on the other hand makes do with a single 16x PCI-E slot.


890GX: Gigabyte and ASUS

One of the most significant evolutions between the 700-series and 800-series is the addition of SATA 6Gb/s compatibility to the southbridge. This could give AMD a serious leg up on Intel who would have had to change their CPU dies in order to support this new standard since all of their I/O functions are now controlled through the CPU package. Instead, all AMD had to do was add the functionality to their southbridge and it was off to the races. Unfortunately, while the number of USB 2.0 ports was increased on the SB800-series, SuperSpeed USB (or USB 3.0) will not be natively supported but can be added through a controller chip. Just remember that the 880G will not have support for SATA 6Gb/s.

AMD’s new line-up does look quite strong and it is great to see that pricing won’t see significant upwards movement when going from one generation to the next. We have to applaud AMD for holding the line on pricing and there is no doubt in our minds the new features will entice quite a few people to make the jump to a non-Intel platform.

These will basically be the last major revisions to AMD’s current chipset architecture before the release of the Scorpius and Lynx platforms in 2011. While not much is known about either of these platforms, we can tell you that they will offer the first brand new x86 architecture from AMD in quite some time. Basically, Scorpius will use a 32nm Zambezi CPU with up to 8 physical Bulldozer cores while Lynx will encompass 32nm dual, triple and quad core Llano CPUs. All in all, it looks like AMD will have a strong line-up far into the future.
 

890GX Features and Specifications

As with any new chipset launch, we’ll turn things over to the customary block diagram.


Image courtesy of AMD.

As with all of AMD’s platforms, the 800 series is a dual-chip solution encompassing the 890GX primary controller hub – we don’t feel right calling it a northbridge any more as the memory controller resides in the CPU – as well as the SB850 southbridge. As expected, the duties of the 890GX include the CPU interface and PCI-Express I/O control as well as graphics processing via the integrated graphics core dubbed Radeon HD 4290. You may notice that there isn’t much detail on the IGP depicted above, but not to worry, we’ll cover that in detail shortly.

You may notice that there is no mention of DDR2, only DDR3 and socket AM3 in the block diagram. Unlike earlier IGP platforms, it appears that AMD will be targeting the 890GX at AM3-based DDR3 systems – particularly with their new AM3-based Athlon II line of CPUs. Since memory control is exclusive to the CPU, and the CPU interfaces with the chipset using the same HyperTransport 3.0 bus, there is technically no reason that the 890GX couldn’t be adapted for use with AM2+ based systems. We’ll have to wait and see what board partners decide to produce, but we wouldn’t be surprised if the 890GX platform remains exclusively AM3, which makes sense as that seems to be the market direction – even for budget systems.

Let’s begin with the PCI-Express configuration supported by the 890GX. Like its predecessor, the 790GX, the 890GX has a healthy array of PCI Express 2.0 lanes, including a 16x 2.0 lane that can be evenly split into two 8x 2.0 lanes for Crossfire-X configurations. This was certainly nice to see, as the previous lower-end 785G chipset couldn’t split its 16x lane, and users were forced into a more crippling 16x/4x configuration that could be limiting with higher end cards. A dual 8x configuration gives buyers a lot of flexibility. Not only can they begin with nothing but integrated graphics, but they can move all the way up to a pair of higher end cards if they so desire.

Aside from the graphics card lanes, there are a total of six additional 1x 2.0 lanes available for integrated components – like audio and network controllers – and additional slots. If this weren’t enough, two more 1x 2.0 lanes are provided by the SB850 southbridge.

Moving on down to the shiny new SB850, we see that AMD has updated the chipset interconnect and is now using a 2GB/s interface dubbed “Alink Express III”. Although we’re not sure exactly what has changed, the older Alink Express II was essentially a 4x PCI-Express 1.1 lane, so bandwidth appears to have been increased – likely to a 4x 2.0 lane - for improved chipset to chipset communication performance.

Speaking of the southbridge, the most significant new feature that has been brought to the table is 6Gb/s SATA support. That’s right, those lucky enough to own one of the new SandForce SF-1500 based SSDs can now enjoy read/write performance well beyond 300MB/s. Aside from updated SATA support, the remainder of the southbridge is consistent with the older SB750. We unfortunately don’t get to enjoy integrated USB 3.0 support at this point in time, as the SB850 remains a USB 2.0 controller.
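For those keeping score at home, the 300MB/s and 600MB/s figures fall out of the link rates once you account for SATA’s 8b/10b encoding. A quick back-of-the-envelope sketch (the arithmetic and function name are ours, not AMD’s):

```python
# Effective SATA payload bandwidth, accounting for 8b/10b encoding
# (8 data bits are carried in every 10 bits on the wire).
def sata_effective_mb_s(link_gbps):
    line_rate_bps = link_gbps * 1e9        # raw line rate in bits/s
    payload_bps = line_rate_bps * 8 / 10   # strip 8b/10b encoding overhead
    return payload_bps / 8 / 1e6           # bits -> bytes -> MB/s

print(sata_effective_mb_s(3.0))  # SATA 3Gb/s ceiling: ~300 MB/s
print(sata_effective_mb_s(6.0))  # SATA 6Gb/s ceiling: ~600 MB/s
```

In other words, a drive pushing past 300MB/s simply has nowhere left to go on a 3Gb/s link, while the new 6Gb/s interface doubles that ceiling.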


Image courtesy of AMD.

On the topic of USB 3.0, we should note that AMD was very careful to point out that USB 3.0 controllers can be interfaced to the chipset using the 1x PCI-Express 2.0 lanes for a maximum theoretical throughput of 500MB/s. By comparison, Intel’s new H55 and H57 are limited to half-bandwidth lanes and a maximum of 250MB/s to off-chip USB 3.0 and SATA 6Gb/s controllers. This likely won’t be of concern for USB 3.0, but having on-chip SATA 6Gb/s support is certainly a benefit as the only bottleneck is the 2GB/s Alink interface between the chipsets and the 1x component interface lanes don’t need to be used at all.
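The 500MB/s vs. 250MB/s figures come straight out of the PCI-E 2.0 per-lane rate. Here is a small sketch of the math (the helper name is our own, and this simplified view ignores protocol overhead beyond 8b/10b encoding):

```python
# PCI-E 2.0 signals at 5GT/s per lane with 8b/10b encoding,
# which works out to 500MB/s of payload per lane per direction.
PCIE2_LANE_MB_S = 5e9 * 8 / 10 / 8 / 1e6  # = 500.0

def controller_ceiling(lane_mb_s, device_mb_s):
    """An add-on controller can never move data faster than the
    slower of its own interface and the chipset lane feeding it."""
    return min(lane_mb_s, device_mb_s)

# A SATA 6Gb/s (600MB/s) controller on a full-bandwidth 1x 2.0 lane
# versus a half-bandwidth lane like the H55/H57 provide:
print(controller_ceiling(PCIE2_LANE_MB_S, 600))      # ~500 MB/s
print(controller_ceiling(PCIE2_LANE_MB_S / 2, 600))  # ~250 MB/s
```

The takeaway: an off-chip SATA 6Gb/s controller hanging off a half-bandwidth lane would actually be slower than the 3Gb/s ports it replaces.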

So with that out of the way, let’s take a closer look at the new Radeon HD 4290 integrated GPU.
 


The Radeon HD 4290 IGP



Image courtesy of AMD

Perhaps the most anticipated improvement that the 890GX brings to the table is its IGP. When the 790GX first hit the streets, it was greeted by very positive reviews. Today, we hate to be the bearer of less than exciting news, but the 890GX IGP is nothing more than a higher clocked version of the one found on the 785G – hardly anything to get excited about.


Based on the same RV620 architecture, the HD 4290 is nothing more than a 700MHz variant of the DX10.1 HD 4200 found in the 785G chipset. This is definitely a little disappointing, as AMD has made terrific strides in the discrete graphics arena, with numerous DX11 cards at all price points and very potent performance. It appears that we’ll have to wait for AMD’s next generation – or should we say current generation – IGP.

Aside from a higher IGP core speed, the shader, texture unit and ROP counts all remain unchanged at 40/4/4. There was some speculation that AMD’s next IGP would employ 80 shaders for a nice performance boost, but this is not likely until we see a chipset die shrink from the current 55nm process.

All of the new features that the 785G brought to the table remain within the 890GX, including DirectX 10.1 support, HDMI 1.3 and implementation of AMD’s UVD2. UVD2 or “Unified Video Decoder 2” brings further enhancement in the form of multiple video stream acceleration (think picture-in-picture) as well as other image quality enhancing features. Lots more information on UVD can be found here.

Like the 790GX and 785G, the 890GX also supports “Hybrid Crossfire X”, although it appears that AMD is moving away from this name in favour of “Dual Graphics”. This essentially allows the IGP core to work in tandem with a similar discrete graphics core to improve performance. There has surprisingly been a lot of confusion and contradictory information as far as which cards can be paired with the older 785G chipset, but this time AMD is calling out the HD 5400 and HD 5500 series for this purpose.


Image courtesy of AMD.

Given the 4000 series IGP name, it seems odd that a brand new DX11 card would be paired with it, but AMD claims a solid 20-25% performance boost in some titles. At this point we can only assume that pairing a non-DX11 card with a DX10.1 IGP will forfeit hardware DX11 support in whatever game is launched in this mode, but we look forward to testing out the 890GX’s “Dual Graphics” in our lab.

We’re also pleased to see that AMD has continued to include Sideport DDR3 support for the 890GX. Sideport memory is essentially a single 64 or 128MB DDR3 IC that is dedicated for use by the IGP. Since the IGP normally has to “share” some of the main memory for its purposes, having some dedicated cache is beneficial as it is faster and can be accessed directly, i.e. without having to use the CPU’s memory controller and resources. Board partners have produced IGP systems without Sideport memory to reduce cost, but that is more common in the lower end budget models. We’d be surprised to see any 890GX boards without Sideport memory.
 


ASUS M4A89GTD PRO/USB3 890GX Motherboard Overview


One of ASUS’ first boards sporting the 890GX name will be their M4A89GTD PRO/USB3. While the “Pro” name may throw some of you off, this is actually a relatively basic board with additional support for USB 3.0 built in via a controller chip. Pricing should be around the $145 (non-USB 3.0) to $155 (the board we see below) range. Remember, this is only a preview of this product so expect us to go into more depth in the review itself.


The box looks to take a different approach when compared to past ASUS products and moves towards a “green” theme. All of the necessary information is there for the reading and the back of the box actually goes into quite a bit of detail.


The board itself is impressive to say the least, but don’t be misled by the picture above; the PCB is actually a deep brown, not black.


The M4A89GTD sports a relatively robust power distribution section capped off by an impressive looking heatsink design. One of the more interesting features of this motherboard is the tall 8-pin CPU connector which is designed to allow access without needing to worry about the heatsink getting in the way.


Making our way south along the side of the board we come to a switch for core unlocking on two or three core AMD chips as well as a Turbo Key toggle which is supposed to automatically overclock your CPU. Further down, there are a pair of SATA 6Gb/s connectors that are placed at a right angle to the board as well as four additional headers closer to the southbridge heatsink.


Even though for the most part the expansion slot connectors are spaced at very good intervals, ASUS has resorted to an archaic method for PCI-E lane switching. Basically, instead of including an on-board chip to automatically control the switch from one PCI-E 16x slot to two PCI-E 8x slots when two GPUs are installed, both mechanical PCI-E slots run natively at 8x mode. In order to get the bottom slot to work at full 16x performance, you need to install an included dummy card. And no, the top PCI-E slot will never work at 16x mode, even if you install the switcher into the bottom slot.


The backplate shows a great selection of connectors, with the two blue USB headers allowing for full USB 3.0 speeds and various display outputs (including an HDMI 1.3 connector) to round things out.
 


The Gigabyte 890GPA-UD3H 890GX Motherboard Overview


Lately, Gigabyte has done a good job in shying away from the complicated naming schemes many other manufacturers use for their boards. Basically, the 890G denotes the chipset, the “A” shows us that there is USB 3.0 installed and the UD3 is the Ultra Durable namesake followed by the product category. This board will be marketed for around the $110 - $130 USD mark which could vault it right into the lead in the price / performance segment.


As with many other motherboard boxes, this one from Gigabyte is chock full of information and marketing mumbo jumbo. Naturally, mentions are made of the 2oz copper PCB, Gigabyte’s 3 year warranty and the inclusion of both USB 3.0 and SATA 6Gb/s.


Once again we see a near-perfect layout from Gigabyte on a board that holds all the usual hallmarks of their design team. Blue, white and yet more shades of blue take over the colour scheme and while this may bother many people, at least Gigabyte has done away with the oddball green and pink colours they were using less than a year ago.


The area around the CPU socket is relatively clear of any obstructions which would impede the installation of larger coolers but it is still interesting to see Gigabyte going for such a large heatsink over the northbridge. The power layout on this board shows us a 4+1 phase design which may seem a bit anaemic by today’s lofty standards but it should be more than enough for current and upcoming AMD processors.


The southbridge heatsink is remarkably small and next to it lie eight right-angle SATA 6Gb/s connectors. Meanwhile, the expansion slot layout on this board is perfect and no, you don’t need a switch to change the PCI-E configuration like on the ASUS product we saw before. It also looks like Gigabyte went with three PCI-E 1x connectors to give users perfect placement for the installation of aftermarket sound cards.


The backplate on the 890GPA-UD3H is relatively simple but it provides all of the connectors you could possibly want. Much like the ASUS board, the two USB 3.0 connectors are finished in a blue colour to differentiate them from the others.
 


Test Systems and Setup



AMD System Setup

ASUS M4A785TD-V EVO (785G) [0512 BIOS]
ASUS M4A78T-E (790GX) [1708 BIOS]
GIGABYTE GA-890GPA-UD3H (890GX) [F2 BIOS]
AMD Phenom II X4 945
Thermalright Ultra-120 eXtreme
Thermalright TR-FDB-12-1600 120MM FAN - 63.7CFM 1600RPM
2x2GB Patriot Sector 5 Viper II DDR3-2000 8-8-8-20 @ DDR3-1333 7-7-7-18-1T
Tuniq Miniplant 950W
Western Digital 320GB WD3200AAKS-00B3A0
Windows Vista Ultimate SP1 64-bit (with all updates)
Catalyst 10.3 Beta Driver


Intel X4500 + NVIDIA IGP Test System

GIGABYTE GA-E7AUM-DS2H (F4 BIOS) + NVIDIA GeForce 196.34 drivers
GIGABYTE GA-EG41M-US2H (F2 BIOS) + Intel INF_allOS_9.1.1.1020_PV drivers

Intel Q9550

Memory: 2x2GB Crucial Ballistix Tracer DDR2-800 4-4-4-12-2T BL2KIT25664AR802A




For all of the benchmarks, appropriate lengths are taken to ensure an equal comparison through methodical setup, installation, and testing. The following outlines our testing methodology:

A) Windows is installed using a full format.

B) Chipset drivers and accessory hardware drivers (audio, network, GPU) are installed followed by a defragment and a reboot.

C)To ensure consistent results, a few tweaks were applied to Windows Vista and the NVIDIA control panel:
  • Sidebar – Disabled
  • UAC – Disabled
  • System Protection/Restore – Disabled
  • Problem & Error Reporting – Disabled
  • Remote Desktop/Assistance - Disabled
  • Windows Security Center Alerts – Disabled
  • Windows Defender – Disabled
  • Screensaver – Disabled
  • Power Plan - High Performance
  • NVIDIA PhysX – Disabled
  • V-Sync – Off

D) Programs and games are then installed & updated followed by another defragment.

E) Windows updates are then completed installing all available updates followed by a defragment.

F) Benchmarks are each run three times after a clean reboot for every iteration of the benchmark unless otherwise stated, and the results are then averaged. If there were any clearly anomalous results, the 3-loop run was repeated. If they remained, we mention it in the individual benchmark write-up.
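For the curious, step F boils down to something like the following sketch (the 5% anomaly threshold and function name are our own illustration, not a formal spec):

```python
# Average three benchmark runs and flag the set for a re-run if any
# single run strays too far from the mean. The 5% tolerance here is
# an assumed value for illustration purposes.
def average_runs(runs, tolerance=0.05):
    mean = sum(runs) / len(runs)
    anomalous = any(abs(r - mean) / mean > tolerance for r in runs)
    return mean, anomalous

score, rerun_needed = average_runs([61.2, 60.8, 61.0])
print(score, rerun_needed)
```

If `rerun_needed` comes back true, the whole 3-loop run is repeated from a clean reboot before anything makes it into the charts.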
 

IGP HD Video Decoding



In order to test the high definition video decoding capabilities of a few modern IGPs, we loaded up CyberLink PowerDVD Ultra 9.0.2320 and played high definition 1080p media with three different types of HD codecs, namely VC-1, H.264/AVC, and WMV HD. Hardware acceleration was enabled in PowerDVD to take advantage of the accelerated decoding capabilities of our IGPs and GPUs.

Chapter 24 of the Batman Begins Blu-ray DVD was our source for VC-1.
Chapter 8 of the Transformers Blu-ray DVD was our source for H.264/AVC.
The entire clip of [URL="http://www.microsoft.com/windows/windowsmedia/musicandvideo/hdvideo/contentshowcase.aspx"]The Living Sea (IMAX)[/URL] was our source for WMV HD.

What we are looking for is the lowest possible CPU utilization.
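As a side note, reducing the logged per-second utilization samples (e.g. a PerfMon export) to the min / average / max numbers you’ll see in the charts is straightforward; a hypothetical sketch with made-up sample values:

```python
# Reduce a list of per-second CPU utilization samples (in percent)
# to min / average / max figures.
def summarize_cpu(samples):
    return min(samples), sum(samples) / len(samples), max(samples)

# Hypothetical samples captured during a playback run:
low, avg, high = summarize_cpu([12.0, 15.5, 14.0, 13.5, 16.0])
print(low, avg, high)  # 12.0 14.2 16.0
```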

 

3DMark06 / 3DMark Vantage

Futuremark 3DMark06


3DMark06 v1.1.0
Graphic Settings: Default
Resolution: 1280X1024

Test: Specific CPU Score and Full Run 3Dmarks
Comparison: Generated Score

The Futuremark 3DMark series has been a part of the backbone of computer and hardware reviews since its inception. The trend continues today as 3DMark06 provides consumers with a solid synthetic benchmark geared for performance and comparison in the 3D gaming realm. It remains one of the most sought after statistics, as well as an excellent tool for accurate CPU comparison, and it will undoubtedly be used for years to come.





Futuremark 3DMark Vantage


3DMark Vantage v1.0.1
Graphic Settings: Entry Preset
Resolution: 1024X768

Test: Specific CPU Score and Full Run 3Dmarks
Comparison: Generated Score

3DMark Vantage is the follow-up to the highly successful 3DMark06. It uses DirectX 10 exclusively so if you are running Windows XP, you can forget about this benchmark. Along with being a very capable graphics card testing application, it also has very heavily multi-threaded CPU tests, such as Physics Simulation and Artificial Intelligence (AI), which makes it a good all-around gaming benchmark.


 

Far Cry 2 / World In Conflict

Far Cry 2


Far Cry 2 1.02
Resolution: 1280x1024
Anti Aliasing: 0
Quality Settings: Low
Global Settings: DX9 Enabled

Test 1: Ranch Long Demo
Comparison: FPS (Frames per Second)

Far Cry 2 is the hot new first-person shooter from Ubisoft's Montreal studio, and the first game to utilize the visually stunning Dunia Engine, which will undoubtedly be used by numerous future games. Using the included benchmarking tool, we ran the Ranch Long demo in DX9 mode at 1280x1024 with all settings set to low.




World in Conflict


World in Conflict v1.010
Resolution: 1280x1024
Anti-Aliasing: 0X
Anisotropic Filtering: 0X
Graphic Settings: Low
Test 1: Built-in Benchmark
Comparison: FPS (Frames per Second)

One of the most detailed and visually stunning real-time tactical games in recent history, World in Conflict remains a staple in our gaming lineup. For this test we used the in-game benchmarking tool.



 