AMD’s FreeSync: A Long-Term Review
Date: May 10, 2015
Product Name: FreeSync
For the last three weeks or so, I’ve been living in a vacuum. Our lab received its AMD FreeSync monitor relatively late (on the day of launch, actually) and since then I’ve been avoiding everyone else’s opinions about the technology, its benefits and its potential drawbacks. Simply put, I needed to formulate an unbiased opinion based upon my game-time experiences rather than judging FreeSync against technological expectations.
Unlike many of our other reviews, you’ll be seeing my personal, somewhat biased opinions along with the usual litany of raw performance testing and a bit of technical jargon. Yes, it’s different, but the only way to truly experience and understand the benefits of technologies like FreeSync is from a first-hand perspective. This took a bit longer than I would have liked (the BenQ XL2730Z now has over 175 hours of use on it) but in the end I believe this will result in a somewhat unique perspective.
Before I go on, there’s a small admission I have to make. I’m a screaming fanboy of NVIDIA’s G-SYNC technology. An Acer XB280HK has been my primary gaming display for the better part of four months now and I have logged over 250 hours on it. So when AMD said they were sending along a FreeSync monitor, I approached it with a mix of excitement and trepidation.
We’ve already seen how AMD’s initiatives offer up a mixed bag with more recent technologies like Eyefinity and Mantle meeting with huge success while others like HD3D and TrueAudio ultimately failed to meet expectations. I simply didn’t want FreeSync to go down the latter path since, when taken at face value, it has so much to offer gamers.
In order to understand AMD’s FreeSync, some basic knowledge of VESA’s DisplayPort Adaptive-Sync and V-Sync in general is necessary. Gamers who want the best possible performance and minimal mouse latency typically play with V-Sync disabled, which allows framerates to run independently of the monitor’s refresh rate. This acts as a double-edged sword since screen tearing occurs when multiple frames are displayed onscreen during a single refresh cycle. The end result may be lightning-quick reaction times, but also a distracting onscreen image filled with artifacts since the display’s refreshes aren’t properly synchronized with the frames being delivered by the graphics card.
Meanwhile, those who care about achieving optimum image quality tend to enable V-Sync but that caps framerates at the monitor’s maximum refresh rate. It also increases mouse latency and introduces noticeable stuttering if the graphics card can’t keep up with the vertical refresh rates and buffers frames in preparation for the next monitor refresh. When this happens, the previous frame is repeated before the current frame is launched onto the screen.
Another problem with enabling V-Sync is a step-down effect which happens when the system is delivering framerates below the monitor’s native refresh rate. On a 60Hz monitor, that can lead to FPS jumping and stuttering between 60, 30, 20, 15 and other integer divisors of the refresh rate as the graphics card tries to synchronize its output with the display. Panels with higher refresh rates somewhat mitigate the performance capping issue but still suffer from stuttering.
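The step-down arithmetic can be sketched in a few lines of Python. This is a simplified model of double-buffered V-Sync, not anything from AMD’s or NVIDIA’s drivers: when a frame takes longer than one refresh interval to render, it waits for the next refresh, so the displayed rate snaps to the refresh rate divided by a whole number.

```python
import math

def vsync_effective_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective displayed framerate under double-buffered V-Sync.

    If the GPU can't finish a frame within one refresh interval,
    the frame is held until the next refresh, so the displayed
    rate snaps down to refresh_hz / n for some integer n.
    """
    if render_fps >= refresh_hz:
        return refresh_hz  # capped at the monitor's refresh rate
    # number of whole refresh intervals each frame occupies
    n = math.ceil(refresh_hz / render_fps)
    return refresh_hz / n

# On a 60Hz panel, a GPU rendering at 50 FPS snaps down to 30 FPS,
# and 25 FPS of rendering yields only 20 FPS onscreen.
print(vsync_effective_fps(60, 50))   # 30.0
print(vsync_effective_fps(60, 25))   # 20.0
print(vsync_effective_fps(60, 100))  # 60
```

This is why a card averaging "only a few frames short" of 60 FPS can feel dramatically worse with V-Sync on: the displayed rate falls all the way to 30.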
Adaptive Sync is a technology baked into the DisplayPort 1.2a protocol that is meant to eliminate the aforementioned tearing and stuttering by synchronizing the GPU and monitor so frames are displayed when ready through a framerate-aware variable monitor refresh rate. However, while DisplayPort Adaptive Sync is a mechanism to achieve better onscreen fluidity, it requires system-end support to function properly.
FreeSync is simply the driver-side facilitator which allows refresh rate information to be passed between the source (in this case an AMD graphics card) and the panel. There’s a handshake protocol which allows the monitor to tell the GPU the fastest and the slowest rates at which it’s ready to accept a frame. In effect this gives the GPU full knowledge of what’s happening without having to actually poll the monitor first, while the display works off of this information to vary its refresh rate accordingly.
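Conceptually, the handshake boils down to the monitor advertising a minimum and maximum refresh rate, with the display then matching whatever the GPU delivers inside that window. The sketch below is a deliberately simplified model of that idea (the helper names are my own, and the 40–144Hz window is just an example of a panel-specific range):

```python
from dataclasses import dataclass

@dataclass
class RefreshRange:
    """Variable refresh window a monitor might advertise via its EDID.

    Example values only; actual ranges are panel-specific.
    """
    min_hz: float
    max_hz: float

def scanout_rate(panel: RefreshRange, render_fps: float) -> float:
    """Simplified model of a variable-refresh display.

    Inside the advertised window the panel refreshes exactly when a
    frame is ready; outside it, the rate clamps to the window's edge,
    which is where tearing or stutter can reappear.
    """
    return max(panel.min_hz, min(panel.max_hz, render_fps))

panel = RefreshRange(min_hz=40, max_hz=144)
print(scanout_rate(panel, 90))   # 90  -> matched frame-for-frame
print(scanout_rate(panel, 200))  # 144 -> capped at the panel's max
print(scanout_rate(panel, 30))   # 40  -> fell below the VRR window
```

That clamping at the bottom edge matters later in this review: once framerates dip below the panel’s minimum, the display can no longer track the GPU and the usual artifacts return.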
Since all of this is accomplished within hardware, FreeSync operates independently of any game-level hiccups that may be encountered. In short, like G-SYNC, FreeSync should be compatible with every game in existence, regardless of the API (DirectX, Mantle, OpenGL, etc.) being used, since it doesn’t rely on driver profiles to work.
As this functionality runs hand in hand with the DisplayPort specification, costs are kept down since additional hardware isn’t needed. It should also allow FreeSync to be easily ported over to notebooks, a market segment which could seriously benefit from this technology.
Quite a few discussions have homed in on how FreeSync stacks up against NVIDIA’s competing G-SYNC since both technologies parallel one another in many ways. They claim to accomplish the same set of goals: smoothing out onscreen animations and eliminating the image artifacts normally associated with enabling or disabling V-Sync. It seems straightforward, right? Not so fast, because both companies’ methods vary quite a bit.
While AMD is harnessing Adaptive Sync’s benefits without the need for proprietary, expensive hardware, G-SYNC panels require an add-on module which replaces a monitor’s scaler. As a result FreeSync monitors are generally less expensive than their competitors since the necessary protocols are included directly in their EDID. Interestingly enough, this also means that any DP 1.2a-equipped monitor could be FreeSync compatible provided there’s a compatible firmware for it.
Raw cost of entry may be a major determining factor in the FreeSync versus G-SYNC battle but some of the other differences are more nuanced. Even though AMD’s technology can only run through DisplayPort, it also allows other connectivity like DVI, HDMI and even VGA to be built into supporting products, something NVIDIA doesn’t offer. This may not be a big deal for many gamers, but during my time with G-SYNC I missed the ability to use my notebook on a larger screen via its HDMI output.
A lot of what you see in the chart above is also pure marketing mumbo jumbo. For example, AMD’s FreeSync may support a wider refresh rate range than G-SYNC but you’ll always be limited in this respect by the panel itself, none of which even begin to approach the 9Hz to 240Hz claims. As I’ll explain a bit later, the so-called “performance penalty” needs to be taken with a grain of salt as well.
One of the main complaints leveled by gamers at V-SYNC is mouse lag. While I’m more averse to onscreen artifacts than a nearly imperceptible amount of input hesitation, I also prefer slower paced strategy games, think the current first person shooter genre is boring as hell…and I love rabbits. Even though I was once placed in the top 50 global Counter Strike players (little known fact alert!) these days I’m a good bit older and the reflexes are shot to hell from too much wine so I won’t cry bloody murder at a missed headshot.
With that being said, I have to applaud AMD for the way they’re handling V-Sync here. Whereas NVIDIA automatically enables vertical synchronization whenever G-SYNC is turned on, FreeSync can operate independently of the screen’s vertical synchronization locks. This allows for a gaming experience tailored to your liking. Want the best possible motion quality with typical input latency? Turn on V-Sync alongside FreeSync. Want to keep FreeSync’s ability to minimize tearing and improve latency? Simply turn off V-Sync in whatever game you’re playing but allow FreeSync to do its thing.
Within all of these potential benefits of FreeSync, there are some notable areas where it falls short as well. For starters, it boasts somewhat limited compatibility compared to G-SYNC. While AMD’s feature is limited to current-generation cards, GeForce products that support G-SYNC date back to the GTX 600-series days, with the GTX 650 Ti Boost Edition and everything faster able to communicate with certified monitors.
Perhaps the largest miss for FreeSync is its lack of Crossfire support at launch. Adaptive synchronization technologies require higher framerates to showcase their true potential and dips below the monitor’s vertical refresh rate window tend to cause the exact artifacts AMD is seeking to avoid (more on this later).
Luckily, AMD does natively support Virtual Super Resolution (VSR) alongside FreeSync, but without Crossfire, internally boosting rendering resolutions would be counterproductive. On the other hand, NVIDIA has always supported multi-card configurations with G-SYNC. G-SYNC also works alongside the Dynamic Super Resolution (DSR) supersampling setting, though NVIDIA’s current drivers don’t allow a combination of SLI, G-SYNC and DSR.
Activating FreeSync couldn’t be easier. Simply go into AMD’s Catalyst Control Center and turn it on within the display settings dialog area. Typically a pop-up will appear when Windows starts and you can check the settings by following the onscreen instructions.
My time with FreeSync wasn’t completely smooth though. At random intervals the warning above would present itself but from what I could gather, this didn’t negatively affect anything and it didn’t seem like things were going awry. I have a feeling it was a false positive so don’t panic.
When taken at face value, FreeSync looks like a bona fide competitor to G-SYNC but when push comes to shove both solutions’ goals are exactly the same: to provide a feature that will draw people to purchase a given graphics architecture. We have to remember that NVIDIA has a year-long lead on AMD but does that actually translate into a drastically different first-hand gaming experience? That’s what I’m going to explore in the upcoming pages while also endeavoring to explain a few more of FreeSync’s more intricate nuts and bolts.