
AMD Ryzen 5 2400G & Ryzen 3 2200G Review

SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,861
Location
Montreal
Gaming Performance (Battlefield 1 / COD: IW)

Battlefield 1


Battlefield 1 will likely become known as one of the most popular multiplayer games around, but it also happens to be one of the best looking titles on the market. It is extremely well optimized too, with even the lowest end cards able to run it at high detail levels.

In this benchmark we use a run-through of The Runner level after the dreadnought barrage is complete and you need to storm the beach. This area includes all of the game’s hallmarks in one condensed area with fire, explosions, debris and numerous other elements layered over one another for some spectacular visual effects.





Call of Duty: Infinite Warfare


The latest iteration in the COD series may not bring niceties like DX12 or a particularly unique playing style, but it is nonetheless a great looking game that remains quite popular.

This benchmark takes place during the campaign’s Operation Port Armor wherein we run through a sequence combining various indoor and outdoor elements along with some combat.




In these tests the onus is on the GPU rather than the CPU, but the 2400G's higher clock speeds do seem to benefit it in some instances. The move to a single CCX for these APUs could have either a positive or negative impact depending on the game, and that may be what we're seeing above as well.
 
Gaming Performance (Deus Ex / DOOM)

Deus Ex – Mankind Divided


Deus Ex titles have historically combined excellent storytelling with action-forward gameplay, and Mankind Divided is no different. This run-through uses the streets and a few sewers of the main hub city Prague along with a short action sequence involving gunplay and grenades.




Doom


Not many people saw a new Doom as a possible Game of the Year contender but that’s exactly what it has become. Not only is it one of the most intense games currently around but it looks great and is highly optimized. In this run-through we use Mission 6: Into the Fire since it features relatively predictable enemy spawn points and a combination of open air and interior gameplay.



With the graphics card dictating performance in these games (or the game engines themselves adding an artificial limit like DOOM does) there really isn't much to talk about here. I do want to mention that this is actually an important take-away from any of these reviews: CPU power takes a back seat to graphics output when gaming at high detail settings.
 
Gaming Performance (GTA V / Overwatch)

Grand Theft Auto V


In GTA V we take a simple approach to benchmarking: the in-game benchmark tool is used. However, due to the randomness within the game itself, only the last sequence is actually used since it best represents gameplay mechanics.



Overwatch


Overwatch happens to be one of the most popular games around right now and while it isn’t particularly stressful upon a system’s resources, its Epic setting can provide a decent workout for all but the highest end GPUs. In order to eliminate as much variability as possible, for this benchmark we use a simple “offline” Bot Match so performance isn’t affected by outside factors like ping times and network latency.

 

IGP Gaming Performance (Synthetic)



AMD has spoken a lot about the capabilities of Raven Ridge’s integrated graphics and that shouldn’t come as a surprise. In this generation of APUs they’re utilizing the Vega architecture which first debuted on the Vega 64 and Vega 56 last year.

While neither of those discrete GPUs made a particular splash due to limited availability and the fact they were simply launched too late, the Vega 11 and Vega 8 units installed in the 2400G and 2200G are something else entirely. Both efficient and miles better than anything previously built into an APU, there’s hope that alongside the new Zen microarchitecture, they’ll make for a compelling all-in-one solution.


With that being said, I need to talk a bit about the settings and games used for this section of the review. Each integrated graphics processor was given its maximum 2GB framebuffer in the BIOS while memory was set to 2667MHz at 14-14-14-32 1T timings. Memory speed actually has a huge effect upon performance, but more on that later. All other settings were loaded through Optimized Defaults. A GTX 1050 2GB was used as a baseline measurement for a discrete card since a GT 1030 wasn’t available at the time of writing.

While the next two pages will focus on games, this one runs a test we are all familiar with: 3DMark. Three separate tests are used: Cloud Gate, Sky Diver and Fire Strike. Each becomes progressively more demanding, and a comparative benchmark result is displayed at the end of each test.





I understand that adding a $150 discrete graphics card may seem to be a bit unfair here but it does play an important role. We can see that the Vega 11 and Vega 8 integrated GPUs perform admirably even when placed up against a dedicated GPU. Meanwhile, they are able to run all over the latest and greatest from Intel.
 
IGP Gaming Performance (Games)



Typically in these tests I focus on 720P gaming performance but not this time. In my opinion, 1080P is the new mainstream “low end” resolution and gameplay at anything below that is simply sub-optimal. Nearly all new systems come with a 1080P or greater display so if AMD’s integrated graphics struggle at that resolution, it’s time to go back to the drawing board. I’m not too worried about that here.

As for the games themselves, I’ve decided to target titles that are most likely to be played with an integrated graphics processor. I don’t hold any illusions that an IGP will play the latest Far Cry at high detail settings but these titles are nonetheless meant to represent a straightforward sampling of genres and well recognized games.











While I wouldn't classify the Vega 11 or Vega 8 as "fast" graphics solutions, they are without a doubt extremely impressive. At 1080P both are able to deliver consistently playable framerates in every single game.

To be honest with you, I did load the dice a bit by using the 100% render resolution in Destiny 2 but nonetheless, I couldn't help but be pleasantly surprised by what AMD has been able to accomplish here. It makes me wish Vega would have trickled down into lower price points in the discrete market.
 
IGP Performance VS Clock Speeds



On one of the previous pages I mentioned that memory speed has a pretty large effect upon the ability of AMD’s Vega graphics to perform at its best. While the 2667MHz setting I used for all the standard testing was relatively quick given the price and performance level of affordable kits these days, every one of these APU reviews has shown me that integrated graphics are bottlenecked by system memory.

Until AMD or Intel begins integrating high bandwidth memory directly onto the SoC package, that situation likely won’t change in any way. However, in an effort to break the Vega 11 and Vega 8 out of the proverbial memory frequency chains, I decided to give AMD’s in-house XMP profiles a shot in conjunction with a kit of G.Skill’s Flare-X Ryzen-certified memory. That meant 3200MHz at the exact same 14-14-14-32 1T timings as our standard tests. The results… well, they were once again eye-opening.






As expected, adding a bit more juice to the memory subsystem allowed both Vega graphics units to really stretch their legs. In most cases the improvement was somewhere in the neighborhood of 12% to 18% which certainly isn’t anything to sneeze at. Unfortunately, moving beyond that speed to 3400MHz at the same timings resulted in system lockups and loosening those timings would have thrown off the apples to apples comparison.
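As a rough sanity check on those numbers, the raw bandwidth headroom from the frequency bump can be computed directly. This is a simple sketch, assuming dual-channel DDR4 and linear bandwidth scaling:

```python
# Rough sketch: compare the theoretical memory-bandwidth increase
# from DDR4-2667 -> DDR4-3200 with the observed 12-18% FPS gains.
# Assumes a dual-channel configuration (2 x 64-bit) and that peak
# bandwidth scales linearly with transfer rate.

def bandwidth_gbps(mt_per_s, channels=2, bus_width_bytes=8):
    """Peak bandwidth in GB/s for DDR memory at a given MT/s."""
    return mt_per_s * channels * bus_width_bytes / 1000

base = bandwidth_gbps(2667)            # ~42.7 GB/s
fast = bandwidth_gbps(3200)            # ~51.2 GB/s
headroom = (fast / base - 1) * 100     # ~20% more raw bandwidth

print(f"{base:.1f} GB/s -> {fast:.1f} GB/s ({headroom:.0f}% headroom)")
```

The observed in-game gains of 12% to 18% track that ~20% theoretical ceiling fairly closely, which is consistent with the IGP being largely bandwidth-bound.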

As I alluded to above, the only thing that will likely hold people back from going this route is the current price of DRAM modules. Right now buying a 16GB kit that can consistently achieve 3200MHz with a minimum of fuss requires an investment that exceeds the price of the 2400G combined with an inexpensive B350 motherboard.


Turning to GPU Core Clocks

My next step was to address the core frequencies of the Ryzen 5 2400G’s GPU itself and let me tell you, that wasn’t an easy thing. The beta BIOS on the MSI B350 board consistently threw a hissy fit whenever the GPU clock was moved one iota. AMD’s Ryzen Master software was thankfully much better and I ended up getting to a constant 1438MHz which was the maximum that could be achieved without black screening during game testing or applying excess voltage to the equation.




As the frequencies moved up, the overall performance did indeed improve but nothing like it did with faster memory. It could be that memory speeds were once again the culprit and ended up limiting GPU throughput or this could have been just the law of diminishing returns. Either way, framerates were certainly impressive for an integrated graphics solution.
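To put that core overclock in perspective, here is a quick sketch of the relative gain. Note that the 1250MHz stock clock used below is my assumption for illustration; only the 1438MHz figure comes from the testing above:

```python
# Core-clock uplift achieved via Ryzen Master on the 2400G's IGP.
# NOTE: the 1250 MHz stock clock is an assumption used purely for
# illustration; the 1438 MHz result comes from the testing above.

stock_mhz = 1250
achieved_mhz = 1438

uplift_pct = (achieved_mhz / stock_mhz - 1) * 100
print(f"~{uplift_pct:.0f}% over the assumed stock clock")
# -> ~15% over the assumed stock clock
```

A roughly 15% core-clock bump yielding smaller gains than a 20% bandwidth bump fits the diminishing-returns explanation above.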
 

Power Consumption



I don’t typically dedicate a whole page to power consumption but there’s a pretty substantial story lurking behind the numbers you see below and how they directly relate to TDP claims from both Intel and AMD. Without getting too technical, the way these two companies go about measuring TDP is fundamentally different from one another.

What you need to know is that TDP values are a universally poor way to determine actual power consumption for end users since they are simply thermal design guidelines that are given to system integrators. As I say in every review, TDP is not actual power consumption so don’t take it as such.

As both Intel and AMD recommend, the best way to measure true power deltas between processors is via a simple (yet calibrated) power meter plugged into the wall outlet. That’s exactly what we do but add in a controlled 120V power input to eliminate voltage irregularities from impacting the results.
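The wall-meter approach boils down to simple arithmetic. Here is a minimal sketch of the perf-per-watt comparison it enables; the scores and wattages below are illustrative placeholders, not figures from this review:

```python
# Perf-per-watt from total system draw measured at the wall outlet.
# NOTE: all numbers below are illustrative placeholders, not data
# from this review.

def perf_per_watt(benchmark_score: float, wall_watts: float) -> float:
    """Benchmark points delivered per watt of total system draw."""
    return benchmark_score / wall_watts

# Two hypothetical systems under the same gaming load:
apu_system = perf_per_watt(benchmark_score=800, wall_watts=95)
dgpu_system = perf_per_watt(benchmark_score=900, wall_watts=140)

print(f"APU system:  {apu_system:.2f} pts/W")
print(f"dGPU system: {dgpu_system:.2f} pts/W")
```

Because the meter captures the whole system, this measures real deltas between platforms rather than the per-chip thermal guidance a TDP figure represents.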


From a performance per watt standpoint, both of these new APUs follow in the footsteps of other Ryzen processors. That means they’re able to deliver very good performance for the amount of power they require. Luckily, when a dedicated graphics card is installed the Vega graphics are turned off, but there still looks to be a small effect upon efficiency.


Once the IGP itself is turned on and loaded, system power requirements are middling at most, with the entire setup consuming under 100W when gaming. That’s quite impressive given the framerates achieved by both Raven Ridge APUs. Unfortunately, I don’t have an A12-9800 on-hand for comparative analysis since most of those ended up showing up in prebuilt systems rather than in the DIY market.
 

Overclocking Results - Hitting The Wall



Now before I get too far into this section, I have a bit of a rant. I've been overclocking in one way or another for the last 15 years and for the life of me I can't remember an experience that was as frustrating as this one. I'm not sure if this was due to the MSI B350i PRO AC motherboard or simply a byproduct of an existing platform coming to grips with a new processor. However, the amount of memory corruption, black screens, failed boots and other issues I encountered had me throwing my arms up in frustration over and over again.

The situation was so bad that after spending the better part of two days dialing in the Ryzen 5 2400G, I simply walked away from the system and never even tried to overclock the Ryzen 3 2200G. Maybe I'll revisit it sometime in the future but with an NDA lift looming, there was no way I was going to waste time going through the same torturous process.


With that said, AMD needs to be complimented for their continued development of the Ryzen Master Utility. It dialed in an overclock when MSI's BIOS fought me every step of the way. There were some times when it wouldn't apply an overclock but other than that, it performed admirably.

So what was actually achieved? Well not that much actually since like its standard Ryzen siblings, the Ryzen 5 2400G simply refused to go beyond the 4GHz mark. This has been a well known limitation, one which has been prevalent on many Zen-based chips I've seen thus far. In this case, I ended up hitting 3.95GHz on all cores which is decent but only a 350MHz increase over the APU's standard Base Clock of 3.6GHz. Memory went to 3400MHz and anything above that wasn't achievable, once again due to the usual limitations we've seen with Ryzen in the past. Honestly, for all the frustration endured during the overclocking journey, I would have much rather left this thing at its stock settings.
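Put in percentage terms, that headroom is modest. A quick sketch using the clocks quoted above:

```python
# Overclocking headroom on the Ryzen 5 2400G, using the clocks
# quoted above (3.6GHz base, 3.95GHz achieved all-core).

base_ghz = 3.60
achieved_ghz = 3.95

uplift_mhz = (achieved_ghz - base_ghz) * 1000
uplift_pct = (achieved_ghz / base_ghz - 1) * 100

print(f"+{uplift_mhz:.0f} MHz ({uplift_pct:.1f}% over base)")
# -> +350 MHz (9.7% over base)
```

A sub-10% all-core gain over base, bought with two days of instability, goes some way toward explaining why stock settings look so attractive here.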

Plus, the actual performance uplifts were good but I have to wonder if the effort was worthwhile in the end:

 
Conclusion – Finally, an APU Success?



By the time you read this conclusion I’ve probably rewritten it about four times. What started as a relatively straightforward review morphed into something else entirely and as it evolved, so too did my opinion about these Raven Ridge APUs. The Ryzen 5 2400G and Ryzen 3 2200G certainly weren’t going to burn up our charts but their primary goals were accomplished: to provide a good enough foundation upon which someone can build a basic system. Basically, they’re utilitarian workhorses but don’t take that to mean excitement is completely lacking because there’s a lot of cool stuff happening here.

Looking at raw CPU performance from a high level perspective shows that AMD seems to have balanced their architecture quite well to make up for the single CCX, reduced cache layout in these processors. There indeed are some very specific usage scenarios where high clock speeds and lower on-die latencies are trumped by the need for a higher amount of shared L3 cache but many of those fell within the synthetic testing. I didn’t run into any glaring deficiencies during testing and as a matter of fact, both of these chips performed admirably well.

Drill a bit further down into the results and a pattern begins to emerge where the 2400G ended up going head to head against the Ryzen 5 1500X while consistently beating the Ryzen 5 1400. That was to be expected given this new processor’s significant clock speed advantage over the CPU it replaces. However, those frequencies didn’t grant any advantage over the 1500X, which is likely due to Raven Ridge’s lack of XFR despite the additional frequency granularity granted by Precision Boost 2. The reduced L3 cache likely plays a role here as well.

Nearly the same situation was evident with the Ryzen 3 2200G. It remained well ahead of AMD’s own Ryzen 3 1200 and sometimes traded blows with the 1300X. However, the operative word here is “sometimes” since there were scenarios, particularly in real world testing, where the 1300X ended up walking all over its APU cousin. In those situations the $30 premium for the standard Ryzen 3 looked like money well spent.

One of the issues here is consistency, something which the standard Ryzen 5 and Ryzen 3 processors don't have any problems with. That's why if I was upgrading or building a new system right now and GPU prices turned me off, I'd simply wait or buy a 1300X / 1500X along with a used graphics card. These APUs have their place but they're not a straight-up replacement for a system that's primarily used for gaming or diverse workloads.

When their integrated Vega graphics processing units are enabled, the 2200G and 2400G provide some surprisingly decent in-game results as well. I ended up focusing on today’s more popular titles (of which many are quite basic) and the Vega 11 and Vega 8 had absolutely no trouble delivering playable framerates at 1080P provided some image quality sacrifices are made. Add in a bit higher memory frequencies along with a slightly boosted GPU core speed and the results go from surprising to downright impressive. Honestly, seeing framerates like this from integrated graphics makes me hugely disappointed we haven’t seen discrete Vega GPUs rolled into lower price points.

Thus far you’ve likely noticed a distinct lack of Intel mentions in this performance analysis. That’s simply because they don’t really have anything to compete against these APUs at this point in time. While Coffee Lake’s performance in standardized processing workloads exceeds Zen’s, their integrated graphics results look antiquated by comparison. Plus, the i5-8400 is the only sub-$200 Coffee Lake processor I have on-hand and that clearly competes with the Ryzen 5 1600 rather than offering a pricing alternative to Raven Ridge.

With that being said, despite several microarchitecture updates, these APUs constantly fall behind Intel’s Coffee Lake when gaming with a discrete card. Granted, the Titan X I’m using is horrendously overpowered but its tidal wave of horsepower always highlights potential failings. So while AMD has certainly made great strides in closing a yawning gap that existed as recently as last year, Zen still finds itself fighting an uphill battle in lightly threaded applications. But pair up a 2400G or 2200G with a mid-level card and the GPU will become the bottleneck long before those x86 cores strangle off performance.

So there you have it. The Ryzen 5 2400G and Ryzen 3 2200G get the job done and done well; they are actually the most adaptable all-in-one processors I’ve seen since Intel’s short-lived i7-5775R with Iris Pro graphics. The combination of Zen and Vega looks like a match made in heaven but as with previous generations, once again the ultimate challenge now falls to AMD. They need to find companies and individuals who want to hear more about the benefits of these new APUs. That’s been a challenge in the past but given the results we’ve seen here, I’m willing to bet they’ll find many, many more potential buyers who are now willing to listen.
 