
AMD's FreeSync: A Long-Term Review

SKYMTL
HardwareCanuck Review Editor
For the past three weeks or so, I've been living in a vacuum. Our lab received its AMD FreeSync monitor relatively late (on the day of launch, actually) and since then I've been avoiding reading anyone else's opinions about the technology, its benefits and its potential drawbacks. Simply put, I needed to formulate an unbiased opinion based upon my game-time experiences rather than judging FreeSync against technological expectations.

Unlike many of our other reviews, this one features my personal, somewhat biased opinions alongside the usual litany of raw performance testing and a bit of technical jargon. Yes, it's different, but the only way to truly experience and understand the benefits of technologies like FreeSync is from a first-hand perspective. This took a bit longer than I would have liked (the BenQ XL2730Z now has over 175 hours of use on it) but in the end I believe it results in a somewhat unique perspective.


Before I go on, there's a small admission I have to make. I'm a screaming fanboy of NVIDIA's G-SYNC technology. An Acer XB280HK has been my primary gaming display for the better part of four months now and I have logged over 250 hours on it. So when AMD said they were sending along a FreeSync monitor, I approached it with a mix of excitement and trepidation.

We've already seen how AMD's initiatives can be a mixed bag, with more recent technologies like Eyefinity and Mantle meeting with huge success while others like HD3D and TrueAudio ultimately failed to meet expectations. I simply didn't want FreeSync to go down the latter path since, taken at face value, it has so much to offer gamers.


In order to understand AMD's FreeSync, some basic knowledge of VESA's DisplayPort Adaptive Sync and V-Sync in general is necessary. Gamers who want the best possible performance and minimal mouse latency typically play with V-Sync disabled, which allows framerates to run independently of the monitor's refresh rate. This acts as a double-edged sword since screen tearing occurs as multiple frames tend to be displayed onscreen during a single monitor refresh cycle. The end result may be lightning-quick reaction times but also a distracting onscreen image filled with artifacts, since the display's refreshes aren't properly synchronized with the frames being delivered by the graphics card.

Meanwhile, those who care about achieving optimum image quality tend to enable V-Sync, but that caps framerates at the monitor's maximum refresh rate. It also increases mouse latency and introduces noticeable stuttering if the graphics card can't keep up with the vertical refresh rate and buffers frames in preparation for the next monitor refresh. When this happens, the previous frame is repeated before the current frame is finally pushed onto the screen.

Another problem with enabling V-Sync is a step-down effect which happens when the system is delivering framerates below the monitor's native refresh rate. On a 60Hz monitor, that can lead to framerates jumping and stuttering between 60, 30, 20, 15 and other whole-number divisions of the refresh rate as the graphics card tries to synchronize its output with the display. Panels with higher refresh rates somewhat mitigate the performance capping issues but still suffer from stuttering.
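To make that step-down arithmetic concrete, below is a minimal Python sketch of the simplified double-buffered case. The function name and sample values are purely illustrative; actual driver behavior is more complicated than this.

```python
import math

def vsync_effective_fps(render_fps: float, refresh_hz: int = 60) -> float:
    """Framerate actually displayed with V-Sync on (simplified double-buffered model)."""
    if render_fps >= refresh_hz:
        return refresh_hz  # capped at the panel's maximum refresh rate
    # Each frame must wait for the next refresh boundary, so the frame
    # interval gets rounded UP to a whole number of refresh periods.
    cycles_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / cycles_per_frame

for fps in (75, 59, 45, 29):
    print(f"GPU renders {fps} FPS -> panel shows {vsync_effective_fps(fps):.0f} FPS")
# 75 -> 60, 59 -> 30, 45 -> 30, 29 -> 20
```

Note how a GPU rendering 59 FPS gets knocked all the way down to 30: missing a single refresh boundary halves the displayed framerate.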


Adaptive Sync is a technology baked into the DisplayPort 1.2a protocol that is meant to eliminate the aforementioned tearing and stuttering by synchronizing the GPU and monitor through a framerate-aware variable refresh rate, so frames are displayed as soon as they're ready. However, while DisplayPort Adaptive Sync is the mechanism for achieving better onscreen fluidity, it requires system-side support to function properly.

FreeSync is simply the driver-side facilitator which allows refresh rate information to be passed between the source (in this case an AMD graphics card) and the panel. There's a handshake protocol through which the monitor tells the GPU the fastest and slowest rates at which it's ready to accept a frame. In effect this gives the GPU full knowledge of what's happening without having to actually poll the monitor first, while the display works off this information to vary its refresh rate accordingly.
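As a rough illustration of that handshake concept, here's a hedged Python sketch: the panel advertises a refresh window, and the GPU simply clamps each frame's scan-out interval into it. The class and values are my own invention for clarity, not AMD's actual implementation.

```python
class AdaptiveSyncWindow:
    """Toy model of a panel's advertised variable refresh window."""

    def __init__(self, min_hz: float, max_hz: float):
        self.min_interval = 1.0 / max_hz  # fastest the panel will accept a new frame
        self.max_interval = 1.0 / min_hz  # slowest before the panel must refresh anyway

    def scanout_interval(self, frame_time: float) -> float:
        """Clamp the GPU's frame time into the panel's refresh window."""
        return min(max(frame_time, self.min_interval), self.max_interval)

# The BenQ XL2730Z reviewed here advertises a 40Hz-144Hz window:
panel = AdaptiveSyncWindow(min_hz=40, max_hz=144)
for fps in (100, 200, 30):
    interval = panel.scanout_interval(1 / fps)
    print(f"{fps} FPS frame -> scanned out after {interval * 1000:.1f} ms")
# 100 FPS -> 10.0 ms (as-is), 200 FPS -> 6.9 ms (held to 144Hz), 30 FPS -> 25.0 ms (40Hz floor)
```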

Since all of this is accomplished within hardware, FreeSync operates independently of any game-level hiccups that may be encountered. In short, like G-SYNC, FreeSync should be compatible with every game in existence, regardless of the API (DirectX, Mantle, OpenGL, etc.) being used, since it doesn't rely on driver profiles to work.

As this functionality runs hand in hand with the DisplayPort specification, costs are kept down since additional hardware isn’t needed. It should also allow FreeSync to be easily ported over to notebooks, a market segment which could seriously benefit from this technology.


Quite a few discussions have homed in on how FreeSync stacks up against NVIDIA's competing G-SYNC since both technologies parallel one another in many ways. They claim to accomplish the same set of goals: smoothing out onscreen animations and eliminating the image artifacts normally associated with enabling or disabling V-Sync. Seems straightforward, right? Not so fast, because the two companies' methods vary quite a bit.

While AMD is harnessing Adaptive Sync's benefits without the need for proprietary, expensive hardware, G-SYNC panels require an add-on module which replaces a monitor's scaler. As a result, FreeSync monitors are generally less expensive than their competitors since the necessary protocols are included directly in their EDID. Interestingly enough, this also means that any DP 1.2a-equipped monitor could be FreeSync compatible provided there's compatible firmware for it.

Raw cost of entry may be a major determining factor in the FreeSync versus G-SYNC battle but some of the other differences are more nuanced. Even though AMD's technology can only run through DisplayPort, it allows other connectivity like DVI, HDMI and even VGA to be built into supporting products, something NVIDIA doesn't offer. This may not be a big deal for many gamers but during my time with G-SYNC I missed being able to hook my notebook up to the larger screen via its HDMI output.

A lot of what you see in the chart above is also pure marketing mumbo jumbo. For example, AMD’s FreeSync may support a wider refresh rate range than G-SYNC but you’ll always be limited in this respect by the panel itself, none of which even begin to approach the 9Hz to 240Hz claims. As I’ll explain a bit later, the so-called “performance penalty” needs to be taken with a grain of salt as well.


One of the main complaints gamers level at V-Sync is mouse lag. While I'm more averse to onscreen artifacts than to a nearly imperceptible amount of input hesitation, I also prefer slower-paced strategy games, think the current first person shooter genre is boring as hell…and I love rabbits. Even though I was once ranked among the top 50 Counter-Strike players globally (little known fact alert!), these days I'm a good bit older, the reflexes are shot to hell from too much wine, and I won't cry bloody murder at a missed headshot.

With that being said, I have to applaud AMD for the way they’re handling V-Sync here. Whereas NVIDIA automatically enables vertical synchronization whenever G-SYNC is turned on, FreeSync can operate independently of the screen’s vertical synchronization locks. This allows for a gaming experience tailored to your liking. Want the best possible motion quality with typical input latency? Turn on V-Sync alongside FreeSync. Want to keep FreeSync’s ability to minimize tearing and improve latency? Simply turn off V-Sync in whatever game you’re playing but allow FreeSync to do its thing.


Within all of these potential benefits of FreeSync, there are some notable areas where it falls short as well. For starters, it offers somewhat limited compatibility compared to G-SYNC. While AMD's feature is limited to current-generation cards, GeForce products that support G-SYNC date back to the GTX 600-series days, with everything faster than a GTX 650 Ti Boost Edition able to communicate with certified monitors.

Perhaps the largest miss for FreeSync is its lack of Crossfire support at launch. Adaptive synchronization technologies require higher framerates to showcase their true potential, and dips below the monitor's variable refresh rate window tend to cause the exact artifacts AMD is seeking to avoid (more on this later).

Luckily, AMD does natively support Virtual Super Resolution (VSR) alongside FreeSync, but without Crossfire, internally boosting rendering resolutions would be counter-intuitive. NVIDIA, on the other hand, has always supported multi-card configurations with G-SYNC. G-SYNC also works alongside the Dynamic Super Resolution (DSR) supersampling setting, though the current drivers don't allow for a combination of SLI, G-SYNC and DSR.


Activating FreeSync couldn’t be easier. Simply go into AMD’s Catalyst Control Center and turn it on within the display settings dialog area. Typically a pop-up will appear when Windows starts and you can check the settings by following the onscreen instructions.


My time with FreeSync wasn't completely smooth though. At random intervals the warning above would present itself but, from what I could gather, it didn't negatively affect anything. I have a feeling it was a false positive, so don't panic.

When taken at face value, FreeSync looks like a bona fide competitor to G-SYNC but when push comes to shove both solutions' goals are exactly the same: to provide a feature that will draw people towards a given graphics architecture. We have to remember that NVIDIA has a year-long lead on AMD, but does that actually translate into a drastically different first-hand gaming experience? That's what I'm going to explore in the upcoming pages while also endeavoring to explain a few of FreeSync's more intricate nuts and bolts.
 
Understanding FreeSync A Bit More


At first glance, it may look like AMD's FreeSync is simply a display-bound technology which should have a drastic impact upon in-game motion fluidity but minimal effect upon actual framerates. This couldn't be further from the truth since it has the ability to both improve and reduce in-game performance depending on how it is utilized. Remember, any system with FreeSync has four possible display configurations: FreeSync ON & V-Sync ON for the best possible image quality, FreeSync ON & V-Sync OFF, FreeSync OFF & V-Sync OFF, and V-Sync ON alone.

I’ve included a chart below which clearly highlights how these settings translate into framerates. Other than a visual representation of stuttering and tearing, I think this is the clearest visual reference of FreeSync’s benefits and shortcomings.


Disabling V-Sync is obviously the best way to go from a raw performance standpoint since the framerate is able to operate independently from the monitor’s refresh rate limitations. However, it introduces noticeable tearing of the image.

Turning on FreeSync while keeping V-Sync off doesn’t impact framerates at all but, as I’ll discuss in the gameplay impressions section, the visual benefits are minimal at best. From my understanding, this is the setting AMD has been using to compare their solution’s performance to G-SYNC but from a visual standpoint there’s very little to recommend it.

Looking at performance from a V-Sync enabled perspective highlights why this setting can be detrimental to onscreen motion consistency. Even on a 144Hz panel we can see how the framerates go through a series of step-downs as frames are delivered at a cadence that matches the panel's refresh rate. When framerates are above 144 things match up perfectly, but synchronizing below that point causes problems since the graphics card is forced to buffer ready frames as it waits for a refresh cycle to hit. This leads to frames being delivered at a ratio of 1:1, 1:2, 1:3, 1:4 and so on of the maximum refresh rate, which translates to 144FPS, 72FPS, 48FPS, 36FPS, etc. on this particular BenQ monitor depending on what the actual deliverable framerate is.

For example, if the GPU can deliver 100 frames per second it will sync at 72 since 144 isn't achievable, while 65FPS would drop to 48. In these situations stuttering becomes increasingly apparent as the frame times jump around like a jack-in-the-box. Meanwhile, a good portion of the graphics card's performance is left on the table to rot.
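Running the same simplified step-down model from earlier against this 144Hz panel reproduces those numbers and shows how much rendering capacity gets discarded; again, this is an illustrative sketch rather than the driver's actual algorithm.

```python
import math

def vsync_step_down(render_fps: float, refresh_hz: int = 144) -> float:
    """Displayed framerate with V-Sync on (simplified model)."""
    if render_fps >= refresh_hz:
        return refresh_hz
    return refresh_hz / math.ceil(refresh_hz / render_fps)

for fps in (100, 65):
    shown = vsync_step_down(fps)
    print(f"{fps} FPS deliverable -> {shown:.0f} FPS shown "
          f"({(fps - shown) / fps:.0%} of the GPU's output wasted)")
# 100 -> 72 shown (28% wasted); 65 -> 48 shown (26% wasted)
```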

Enabling FreeSync and V-Sync leads to a completely different situation than those described above. Its performance range under 144FPS is very similar to what a V-Sync disabled environment provides. Here, FreeSync allows framerates to vary without any impact upon visual quality (tearing and stuttering are completely eliminated) but above that point, framerates are still capped at the panel’s maximum refresh rate. Granted, once again some of the GPU’s raw horsepower is wasted in those few instances where it can effectively render more than 144 frames per second but users are left without the jarring step-downs that happen when just V-Sync is enabled.


Perhaps one of AMD’s most controversial claims about FreeSync is its ability to actually increase performance while G-SYNC tends to reduce framerates. Supposedly this is due to the fact that Adaptive Sync’s handshake protocols give the graphics card real-time knowledge of the monitor’s refresh states while G-SYNC has to constantly poll the monitor, taking up some core operational cycles. We aren’t talking about noticeable differences since even AMD’s most optimistic charts all highlight variances that are so minor they can be chalked up to any number of variables. However, the numbers are there and I needed to test them for myself.


The Unigine chart posted above should give some hints about what FreeSync can and can't do from a performance standpoint. When FreeSync is enabled and V-Sync is turned off, there's little to no negative impact. With that being said, when FreeSync operates identically to G-SYNC (that is to say, with V-Sync enabled to eliminate tearing) there's actually a very minor reduction in delivered frames under 144FPS, while the upper range is still capped at the vertical synchronization rate.

In order to look into this situation a bit closer, I took ten games and repeated run-throughs of each ten times. In each situation, G-SYNC and FreeSync + V-Sync were turned on and off for a true apples-to-apples comparison. These 100 data points were translated into the chart below.


Contrary to AMD's claims, all of the data points towards FreeSync negatively impacting framerates more than G-SYNC does. This could be due to additional driver overhead, GPU cycles being used to ensure FreeSync's constant functionality, or FreeSync's relative immaturity. Whatever it is, I feel like I'm splitting hairs here since there's just no way a gamer would notice any visible performance differences in either scenario.

In these scenarios I was ensuring that the framerates remained above 40 and below 144, since FreeSync behaves very differently when performance drops below the monitor's minimum refresh rate. Outside of this "sweet zone" it exhibits a worrying tendency to step down to even lower levels than V-Sync does, completely destroying motion fluidity in many games. This will be discussed more on the next page.
 
Performance Outside the “Zone”


Prior to this page I've discussed how well FreeSync performs when it's enabled alongside V-Sync and framerates remain relatively high. However, what happens when a more demanding group of settings is used and the graphics card is rendering fewer frames? Let's be honest: when someone invests big-time money in a FreeSync monitor and a higher-end graphics card, they want to maintain optimum image quality and high detail levels. In many of today's triple-A titles, that means performance between 30FPS and 75FPS.

Throughout testing I noticed a massive performance drop-off in games like GTA V, which caused framerates to absolutely tank whenever they dipped below 40. During particularly heavy action sequences it was nearly impossible to react since framerates were so abysmally low. This never happened with the G-SYNC equipped RoG Swift, so I was perplexed. Let's start off with the same Unigine test I conducted before, but this time at higher detail settings.


I obviously wasn’t going crazy when I noticed some games degenerating into literal slideshows with FreeSync enabled. Whenever framerates remained above 40, AMD’s technology delivered a superlative gaming experience by virtually eliminating stutter and tearing. It was awesome. Get below that and things fell apart in a hurry.

Compare the FreeSync & V-Sync ON results in the chart above to those with V-Sync turned off and it becomes apparent that whenever framerates dip below 37 or 38, they go into freefall all the way down to 20 or so. In these situations it seems like Adaptive Sync turns off, letting V-Sync take over synchronization of frames with refresh rates. The result is jarring to watch, destroys an otherwise good gaming experience and makes the slight stuttering of plain V-Sync look like child's play.


Grand Theft Auto V exhibits the exact same framerate cliff-diving whenever the action gets intense and, let me tell you, it kills immersion. To make matters worse, simply having V-Sync enabled arguably delivers a better experience since it doesn't cause instantaneous dips to 20FPS whenever framerates drop below 40. That area between 30 and 40FPS is perfectly playable territory in GTA, Shadow of Mordor and many other games but in cases like the one above, Adaptive Sync cuts it out of the equation completely.

So what is happening here? I asked Robert Hallock from AMD and his answer was wonderfully straightforward:

On this particular display (the BenQ XL2730Z), the LCD flickers if you rapidly switch between 40Hz, 144Hz and back to 40Hz. We’ve set the V-Sync rate to 40Hz on this display when you fall below the range, which means you would see factors of 40 as the V-Sync jumps, but flickering is completely eliminated.

This is all subject to change, as we can modify these behaviors in the driver. We’re always looking at stuff like this to find new/better ways to deal with the particulars of an LCD, given that each one has its own characteristics.


That’s a simple, effective response but I’ll chime in here as well. The entire idea behind Adaptive Sync is to implement variable panel refresh rates that synchronize properly with frames output from the graphics card. Instead of utilizing the typical ratio step downs for a 144Hz panel of 1:2 (72FPS), 1:3 (48FPS), 1:4 (36FPS) and so on, it fills in the spaces between those outputs so to speak.

We also have to take into account that current panel technology has refresh rate limits and in the XL2730Z's case the upper and lower limits are 144Hz and 40Hz respectively. This creates a "zone", for lack of a better word, between those two points that Adaptive Sync, and by association FreeSync, seems happy operating in.

Synchronizing panel refreshes and framerates below 40Hz is a challenge since the human eye begins picking up on peripheral flickering around the 35Hz (~29ms) mark, while discernible direct screen flicker can be detected at 30Hz (~34ms). If Adaptive Sync were allowed to tie framerates to refreshes below the 35FPS mark, flickering would quickly become an issue. Therefore, below the variable refresh rate zone FreeSync turns off and simply lets regular V-Sync operation take over. That's why turning off V-Sync while keeping FreeSync enabled didn't exhibit this problem.
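A quick sanity check of those frame-time figures, since refresh rate and refresh interval are simply reciprocals of one another:

```python
# Refresh interval in milliseconds is just 1000 / refresh rate in Hz.
for hz in (144, 40, 35, 30):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms between refreshes")
# 144Hz -> 6.9 ms, 40Hz -> 25.0 ms, 35Hz -> 28.6 ms (~29ms), 30Hz -> 33.3 ms (~34ms)
```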

At first glance this shouldn’t have caused too much of a problem since V-Sync allows for several step-down ratios from 144 that occur below 40. 1:4 (36FPS), 1:5 (29FPS) and 1:6 (24FPS) are all possibilities which would have caused some stuttering as the panel / GPU handoff varied between them but nothing like what’s occurring. Instead, AMD is syncing to a drastically lower refresh rate in an effort to eliminate flickering. The end result is pretty abysmal performance whenever framerates dip below 40 but, according to AMD, the behavior can be further refined in future driver iterations. In addition, there could be other monitors with lower minimum refresh rates which open a broader window for FreeSync to operate in.
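Putting AMD's explanation together with the observed behavior, here's a speculative Python model of what this particular BenQ panel appears to be doing. The thresholds come from the review's measurements, but the logic itself is my reading of the driver's behavior, not confirmed code.

```python
import math

def xl2730z_displayed_fps(render_fps: float, min_hz: int = 40, max_hz: int = 144) -> float:
    """Speculative model: variable refresh inside the window, 40Hz V-Sync below it."""
    if render_fps >= max_hz:
        return max_hz                      # capped at the panel's ceiling
    if render_fps >= min_hz:
        return render_fps                  # variable refresh: 1:1 tracking
    # Below the window the driver falls back to V-Sync against a fixed 40Hz
    # refresh, so displayed rates snap to factors of 40 (40, 20, 13...).
    return min_hz / math.ceil(min_hz / render_fps)

for fps in (90, 45, 39, 25):
    print(f"{fps} FPS rendered -> {xl2730z_displayed_fps(fps):.0f} FPS displayed")
# 90 -> 90, 45 -> 45, 39 -> 20, 25 -> 20: the sudden drop to 20FPS described above
```

Under this model, slipping just one frame per second below the 40FPS floor instantly halves the displayed framerate, which matches the freefall seen in Unigine and GTA V.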


In order to visually compare this situation to G-SYNC, I ran the identical tests but modified settings so the GeForce card (in this case a GTX 980) was operating at even lower framerates than AMD's solution.

In Unigine we can see that AMD’s step-down process is a marked departure from NVIDIA’s implementation which provides a universally smooth output below 40FPS and even as low as 28FPS in some cases.


NVIDIA appears to have things figured out, since G-SYNC operates perfectly well below 40FPS. How this has been accomplished is a closely guarded secret, but harnessing full control over the monitor's scaler via an add-in module clearly has its benefits.

If anything the situation above shows why AMD so desperately needs FreeSync-compatible Crossfire drivers; with two cards working in sync to boost performance, the possibility of framerates dipping below the “zone” decreases by a substantial amount.
 
Gaming Observations


Gaming on FreeSync is a unique experience for AMD users since, after a year of looking enviously at G-SYNC capable monitors, they have something comparable. However, as I’ve discussed on the last few pages, DisplayPort Adaptive Sync has some interesting eccentricities that translate into a slightly different gaming experience than G-SYNC panels provide.

Now before I go on, some mention has to be made of BenQ's XL2730Z. It is an absolutely incredible monitor, one which is actually superior to ASUS' RoG Swift in many ways. While both may utilize the same TN-based panel, the Swift has been in my possession for nearly a year now and in all that time I was never able to customize it to my liking. Sure, it has plenty of options within its OSD and NVIDIA's control panel adds even more possibilities, but it seemed plagued by an odd iridescence, particularly in high contrast games like Borderlands: The Pre-Sequel and One Finger Death Punch (yes, I'm strangely addicted to that game). Meanwhile, I had the BenQ humming along at a pretty close approximation to "perfect" in under 20 minutes and as a result I ended up enjoying my time with it all that much more. This ain't no IPS display but its capabilities are far beyond those of older TN-based monitors.

So which games actually filled the countless hours I spent with FreeSync? Pretty much everything, actually. Strategy games like Homeworld: Remastered and Total War: Attila were touched upon after their respective release dates, while One Finger Death Punch keeps making a comeback in my collection since it's an excellent outlet for frustration. GTA V and Shadow of Mordor also factored heavily into the rotation.


First person shooters like Battlefield Hardline and Call of Duty Advanced Warfare arguably have the most to gain from FreeSync’s ability to eliminate stutter while also ironing out distracting screen tearing. Both issues were completely absent in my gaming sessions when FreeSync and V-Sync were paired up and I didn’t notice any debilitating input lag either.

One Finger Death Punch is a perfect example of a game that absolutely requires the best possible reaction times and it truly benefits from FreeSync's ability to operate independently of V-Sync. While FreeSync + V-Sync and even NVIDIA's G-SYNC delivered a perfectly "clean" image, I did notice input lag from time to time, particularly when fighting critical boss battles. That issue was completely eliminated by simply turning off V-Sync while letting FreeSync take care of some (but not all) tearing. AMD's adaptive sync implementation clearly has the edge here.


While you don’t need hair trigger reflexes for GTA V, it showed the best and worst that FreeSync has to offer. On one hand the lack of image artifacts and stuttering brings a whole new level of depth to the city of Los Santos. It is an absolute treat to see and I was very, very impressed with what AMD has been able to achieve without resorting to expensive scaler replacements.

However, due to the relatively high system demands, in order to play with FreeSync at its best you'll need to turn down some settings. GTA V is a game that's perfectly happy puttering along at 30 to 40FPS in nearly every scenario, which grants a unique opportunity for single-card or slightly lower-end systems to boost detail levels. Unfortunately, this is the aforementioned "zone" where FreeSync, in its purest form, has so many performance problems.


Shadow of Mordor ran into the same issues as GTA V. This is a title that feels perfectly content running at lower framerates and actually delivering that kind of performance is well within a single R9 290X’s capabilities. However, with V-Sync and FreeSync enabled for the best possible image quality, the game ran face first into AMD’s imposed 40Hz minimum and went stuttering along from there.

In both of these games, and likely with upcoming titles as well, a single R9 290X just can't provide sufficient performance at 1440P with higher detail levels enabled to ensure framerates continually remain above that critical 40FPS mark. I constantly found myself running into situations in both games where framerates would drop to 20 or so, rendering the whole affair unplayable.

Crossfire support would have certainly helped in this situation but, according to AMD, there's still no word on when that will be available. This leaves a good portion of AMD's most loyal customers without a way to effectively ensure optimal FreeSync performance.


FreeSync On (Left) / FreeSync Off (Right)

Alongside the obvious performance hiccups in lower framerate zones, I also noticed some ghosting in higher contrast, fast motion games. Oddly enough, the BenQ XL2730Z doesn’t exhibit any ghosting when FreeSync is turned off which points to some additional processing or voltage tuning going on behind the scenes. In most games this wasn’t perceptible in any way but One Finger Death Punch did end up showing off the problem. It was odd really; ghosting is the last thing one would expect a 144Hz TN panel to exhibit but there it was.

In the images above you'll notice the telltale after-image simply because AMD's own FreeSync test program provides a perfect breeding ground for ghosting to rear its head. High contrast between foreground and background, continually moving objects and no horizontal or vertical camera movement really highlight the faint but noticeable after-effects. This happened with and without V-Sync enabled, so the issue can be placed firmly in the lap of DisplayPort Adaptive Sync. Meanwhile, G-SYNC didn't exhibit any ghosting.

I've mainly been touching on the negatives here simply because they were so glaring when placed next to an otherwise impressive gaming experience. Provided you are cognizant of FreeSync's relatively broad functionality zone of 40FPS to 144FPS, it provides incredibly fluid images and will allow you to look at games in a whole new light. It is the perfect companion (rather than competitor) to G-SYNC since it provides 90% of NVIDIA's capabilities at a significantly lower cost. Truth be told, I was surprised by how well it replicated G-SYNC's high points despite the few areas where it fell short. Hopefully AMD will only improve it from here on.
 
Countless Gaming Hours Later... Parting Thoughts


Throughout my time with FreeSync I went through a rollercoaster of emotions; from hope to absolute elation to disappointment. For the most part I’d call it an enticing piece of technology which flat-out delivers on AMD’s numerous promises. FreeSync works and works really, really well when given the chance but there are still some areas that need ironing out before I can sit here and wholeheartedly recommend it.

With FreeSync broken down into two modes, choosing which one is right for you shouldn't be too hard. Enabling FreeSync and V-Sync grants the best possible results: buttery smooth framerates and a lack of any visual artifacts. It's awesome. With FreeSync on but V-Sync off, there's a bit less screen tearing but I didn't notice any major differences between this setting and running without any type of synchronization, though input latency in some games is improved. Having a choice is great, but all of the observations below are based on both V-Sync and FreeSync being enabled.

I've already made it more than evident that G-SYNC has been my go-to gaming technology for the better part of a year and whenever I switch to a standard monitor, the differences are like night and day. It has changed the way I look at games. That means I was genuinely convinced that switching to the FreeSync-equipped BenQ XL2730Z would be a downgrade in the visual experience department. Had that been the case I could have wrapped up this article in a week or so and locked away the monitor for good. Well, guess what? Three weeks and almost 200 hours of gaming later, I still struggle to point out the differences between G-SYNC and FreeSync under the right conditions. That's huge praise coming from someone who thought FreeSync might end up consigned to the same dustbin as TrueAudio and HD3D. The monitor is actually still front and center in my gaming setup since it was able to hit a color palette that was more pleasing to me than the RoG Swift's.

During my time with FreeSync, first and foremost in my mind were three questions. Does FreeSync provide notable improvements that can be seen by the naked eye? Does FreeSync offer a viable solution for gamers? Can FreeSync elicit the same visceral reaction that G-SYNC did when it was first launched? The answer to every one of those is a resounding “YES!”. FreeSync was a faithful companion in games like Homeworld: Remastered, GTA V, Shadow of Mordor, One Finger Death Punch, and Battlefield: Hardline.

Now I’m certainly not a professional gamer or someone who nitpicks every conceivable pixel but I’m convinced there’s a huge future in this technology. Stepping up to a VRR environment may not provide visually extreme differences but taking a step back to regular refresh rates is like a slap in the face once you’ve lived with adaptive sync for a few weeks. The screen tearing and stuttering become horribly noticeable to the point where you’ll be wondering how you ever lived through it before.

My experience with FreeSync was for the most part overwhelmingly positive, but only under those "right conditions" I mentioned above. Those conditions arise when FreeSync is used in a framerate zone that mirrors the panel's native minimum and maximum refresh rates. Go below the lower of those two points and things start getting ugly. The panel suddenly finds itself in a 40Hz V-Sync mode and framerates plummet down to 20FPS, leaving a massive dead zone between 20 and 40FPS.

This situation doesn't affect G-SYNC and is one of the major deficiencies of AMD's technology right now. Could this change in the future? With panels capable of wider refresh rate ranges and possible driver tweaks, there's certainly some room for hope. However, APUs, which struggle to achieve 40FPS under the best of circumstances, may find themselves better off without FreeSync in many games.

Beyond the minor ghosting and performance drop-offs under 40FPS, perhaps the largest issue with the current iteration of FreeSync is its complete lack of Crossfire support. Not only would Crossfire allow AMD's cards to safely operate above the lower 40FPS limit but the hybrid Dual Graphics version would provide a much-needed speedup for APUs as well. Unfortunately, after being pushed back from March to April to May, Crossfire support is still "coming" but, according to AMD's latest post on the matter, we might not see it anytime soon:

AMD fans have been outspoken about their love for the smooth gameplay of AMD FreeSync™ technology, and have understandably been excited about the prospect of using compatible monitors with the incredible performance of an AMD CrossFire™ configuration.

After vigorous QA testing, however, it’s now clear to us that support for AMD FreeSync™ monitors on a multi-GPU system is not quite ready for release.

As it is our ultimate goal to give AMD customers an ideal experience when using our products, we must announce a delay of the AMD Catalyst™ driver that would offer this support. We will continue to develop and test this solution in accordance with our stringent quality standards, and we will provide another update when it is ready for release.


Now, we understand that an inordinate amount of the driver team's time and effort is being put into next-generation products like Carrizo and the R9 300-series, but neither is an excuse to let paying customers with R9 295X2 or dual R9 290X cards wait months for the best possible performance with a potentially game-changing technology. In this respect, like it or not, NVIDIA keeps proving why a closed and tightly controlled in-house ecosystem is sometimes better than "open" initiatives when it comes to endpoint execution.

That brings me to the eternal FreeSync versus G-SYNC debate. I don’t find there’s a clear winner since both have their benefits and drawbacks. They don’t even compete with one another but rather feel like complementary technologies. If you find an AMD card fits your needs better than a GeForce product then FreeSync represents an incredible feature but certainly not an added value since, like G-SYNC, it requires the purchase of a new monitor.

Would either FreeSync or G-SYNC ultimately sway my decision towards the Radeon or GeForce brand? Absolutely not. Both are great technologies but there are other key elements in a GPU purchase (price, drivers, power consumption, acoustics and overall value to name a few) that should be pondered long before getting into the G-SYNC versus FreeSync debate.

I do have to put my own spin on some of FreeSync's finer points though. First of all, FreeSync features the exact same infinitesimal performance drop-off as G-SYNC (less than 3% in most cases). I wouldn’t exactly call this technology free either. While G-SYNC panels certainly cost more, FreeSync is being attached to most monitor vendors’ premium products which already go for a significant amount more than standard TN-based gaming panels. This isn’t AMD’s doing since FreeSync is being integrated without manufacturers adding more hardware but it does show how the “free” terminology should actually read “Less Expensive Than G-SYNC”.

While FreeSync is certainly not without its drawbacks, some of which are quite significant depending on your setup, I believe it has just as much (if not more) chance for success as G-SYNC. Nothing is stopping NVIDIA from using Adaptive Sync but it looks like there are certain protocols within the technology that need to be addressed before it can be considered a direct threat to their solution. With that being said, AMD's FreeSync (and by association DP Adaptive Sync) has done incredible things with a relatively immature standard and has certainly given AMD's users, and me, something to be excited about. I'm just not quite ready to pronounce a true winner in the image quality battle yet.
 