The (Unproductive) Battle of FreeSync and G-Sync

By Michael Kerns, published April 01, 2016 at 1:00 pm

Stutter caused by V-Sync (which was made to fix screen tearing -- another problem) has been a consistent nuisance in PC gaming since the technology's inception. We've talked about how screen tearing and stutter interact here.

Despite the fact that FPS in games can fluctuate dramatically, monitors have long been stuck using a fixed refresh rate. Then nVidia’s G-Sync cropped up. G-Sync was the first technology to eliminate both stutter and screen tearing on desktop PCs by synchronizing the monitor’s refresh rate to the GPU’s fluctuating frame output. Quickly after nVidia showed off G-Sync, AMD released their competing technology: FreeSync. G-Sync and FreeSync are the only adaptive refresh rate technologies currently available to consumers at large.

FreeSync and G-Sync are currently “competing” to be the dominant technology in monitors. Our capitalist society encourages competition – in any form – as a benefit to consumers and a driver of better products. That’s normally the outcome of competition, too; but the “competition” between G-Sync and FreeSync does little to benefit the consumer, and instead creates vendor lock-in and makes for fractured development.

In this op-ed, we’ll talk about the G-Sync and FreeSync differences, market posturing, and why we think this particular type of technology should be more universal.

nVidia G-Sync vs. AMD FreeSync Technical Differences


FreeSync and G-Sync are implemented in vastly different forms and have different advantages, but both ultimately aim to solve screen tearing and stutter without sacrifice (V-Sync, it turns out, has major sacrifices). The primary technical differences between them are that G-Sync doesn’t have a restricted variable refresh rate range the way FreeSync does, and that FreeSync is generally cheaper. Each FreeSync monitor has a rated FreeSync range; outside of this range, FreeSync doesn’t work, which results in either screen tearing or stutter. In contrast, G-Sync monitors don’t have a minimum refresh rate below which G-Sync stops functioning – G-Sync works even below 20 FPS, although that’s practically just a fast slide show. Of course, both G-Sync and FreeSync are limited by the maximum refresh rate of the monitor.
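
To make the range limitation concrete, here’s a minimal sketch of the decision each approach effectively makes at a given frame rate. The 40-144Hz FreeSync range and the 144Hz panel cap are illustrative assumptions, not the ratings of any particular monitor:

```python
# Toy model of adaptive sync ranges. The 40-144Hz FreeSync range and the
# 144Hz panel cap are illustrative assumptions, not any monitor's spec.

def freesync_behavior(fps, range_min=40, range_max=144):
    """What a hypothetical FreeSync monitor does at a given frame rate."""
    if fps > range_max:
        return "above range: tearing, or V-Sync stutter"
    if fps < range_min:
        return "below range: tearing, or V-Sync stutter"
    return "adaptive sync active: refresh matches FPS"

def gsync_behavior(fps, panel_max=144):
    """G-Sync has no practical floor; the module handles low frame rates."""
    if fps > panel_max:
        return "capped at the panel's max refresh"
    return "adaptive sync active: refresh matches FPS, even below 20 FPS"

for fps in (15, 35, 90, 160):
    print(f"{fps:>3} FPS | FreeSync: {freesync_behavior(fps)}")
    print(f"        |   G-Sync: {gsync_behavior(fps)}")
```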

G-Sync may be superior in providing a more uniform experience and better at handling low-FPS situations, but monitors with FreeSync – reflecting its name – are cheaper than their G-Sync equivalents. This is in part because of how variable refresh rate is implemented on each. FreeSync is implemented via ASICs (in this case, the scalers used in monitors) that can be cheaply mass-produced by multiple companies – leading to healthy supply-side competition – as long as they are compliant with DisplayPort’s adaptive sync feature. In contrast, G-Sync is implemented through a field programmable gate array (FPGA) licensed to monitor manufacturers; this is a physical hardware component on the display. FPGAs are adaptable and quicker to market than ASICs, but they’re more expensive and, since nVidia is the only company capable of licensing G-Sync, there is little competition to help drive down the cost of implementing it.

G-Sync was developed by, is proprietary to, and licensed by nVidia, so AMD can’t use it unless nVidia changes its mind on letting its competitor use technologies it’s developed. And, of course, AMD would have to effectively admit defeat and begin adapting their technology to support a competitor’s product. Not going to happen.


AMD’s approach was instead to utilize a feature in the Embedded DisplayPort spec – which is used in laptops – that allows the refresh rate to vary with FPS in order to save power. They quickly helped VESA, the organization that develops DisplayPort, port this over to the normal DisplayPort standard so it could be used on desktops. FreeSync is AMD’s brand name for the software and hardware support that allows AMD GPUs to use DisplayPort’s adaptive sync feature. AMD also recently expanded what FreeSync does by adding official support for FreeSync over HDMI through the use of custom protocols. It is entirely possible to implement variable refresh rate via DisplayPort’s adaptive sync feature without calling it FreeSync or being required to get AMD’s permission.
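
For the curious, the core mechanism the adaptive sync feature standardizes is variable vertical blanking: rather than refreshing on a fixed clock, the display holds off on its next refresh until the GPU presents a frame, within the panel’s timing limits. Here’s a toy simulation of that idea – the 40-144Hz limits are assumptions for illustration:

```python
# Toy simulation of the variable-blanking idea behind DisplayPort's adaptive
# sync feature: the display extends its vertical blanking until the next
# frame arrives, bounded by the panel's timing limits (assumed 40-144Hz).

PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ  # fastest allowed refresh (~6.9 ms)
MAX_INTERVAL = 1.0 / PANEL_MIN_HZ  # longest the panel can hold a frame (25 ms)

def scanout_times(frame_ready_times):
    """Map times when the GPU finishes frames to times the panel refreshes."""
    refreshes, last = [], 0.0
    for t in frame_ready_times:
        start = max(t, last + MIN_INTERVAL)  # can't beat the panel's max rate
        while start - last > MAX_INTERVAL:   # can't hold blanking forever:
            last += MAX_INTERVAL             # panel re-shows the old frame
            refreshes.append((round(last, 4), "repeat"))
            start = max(t, last + MIN_INTERVAL)
        refreshes.append((round(start, 4), "new frame"))
        last = start
    return refreshes

# Frames arriving unevenly, including one long 42 ms gap (~24 FPS dip):
for event in scanout_times([0.012, 0.028, 0.041, 0.058, 0.100]):
    print(event)
```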

The Effects of G-Sync and FreeSync Competition

Competition often leads to better and cheaper products, but that effect is muted in the case of G-Sync vs. FreeSync. Their “competition” primarily leads to vendor lock-in, tying the monitor purchase to the GPU inside the PC. It would be impractical for a monitor to support both FreeSync and G-Sync, since it would require two scalers – and, more importantly, AMD GPUs only support FreeSync while nVidia GPUs only support G-Sync. This leaves consumers with fewer monitors to choose from (a halving of the pool, per GPU vendor) -- or a large pool, but with handicaps, since no one wants to pay extra for a monitor with a feature they can’t use. GPU upgrade choices also become instantly limited by monitors, which folks tend to hang onto for long periods of time. Take, for example, someone who buys a G-Sync monitor to go with their nVidia GPU: when it’s time to upgrade the GPU -- maybe one to three years from now -- they can only consider nVidia cards, lest a major monitor feature become unusable.

Furthermore, FreeSync and G-Sync fracture development in variable refresh rate technologies. G-Sync and DisplayPort’s adaptive sync feature (which FreeSync uses) cannot share development with each other. This means that, for a feature to exist in both the G-Sync and DisplayPort adaptive sync environments (and implementations based on the latter, like FreeSync), that feature must be developed and implemented twice. Suppose, for example, that nVidia added G-Sync support for partial screen refresh – where only certain parts of the monitor are refreshed, rather than an entirely new frame being drawn: VESA would have to do its own development to add the equivalent feature to DisplayPort, and then AMD would have to spend the time and money to support it in its drivers.

We see this fractured development as a hindrance to consumers.

What Should Happen

There’s no denying that G-Sync and FreeSync both have their own advantages. It would be easy to suggest that nVidia should drop G-Sync and support their own version of FreeSync – under a different name, for marketing reasons – using DisplayPort’s adaptive sync feature. The issue is that, for any technology using DisplayPort’s adaptive sync feature, the variable refresh range differs from panel to panel. Some ranges are as wide as 30-144Hz, while others are as narrow as 40-60Hz; for some monitors, it is nearly impossible to even find the rated range. Outside of this range, V-Sync or screen tearing must be tolerated -- and V-Sync creates input lag and stutter, which no gamer wants.

In contrast, G-Sync’s range is only limited by the maximum refresh rate of the panel, since G-Sync doubles the refresh rate when FPS drops low enough. G-Sync also produces a marginally more fluid experience in some situations, aided by its hardware implementation on the display. These advantages don’t change the fact that G-Sync monitors are generally more expensive than their FreeSync equivalents. AMD allows the FreeSync branding to be applied to monitors very liberally, since there are no real requirements for how large the FreeSync range must be. FreeSync also, just for the sake of education, isn’t properly “free”: there is a validation cost, though it is cheaper than G-Sync’s.
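
As a rough sketch of that doubling behavior: when the frame rate falls below what the panel can physically hold, the module re-scans the same frame enough times to keep the effective refresh rate in range. The 30Hz floor and 144Hz ceiling here are assumptions for illustration, not measured G-Sync parameters:

```python
# Sketch of the refresh-multiplication trick that lets G-Sync work below the
# panel's physical floor: the module redraws the last frame so the effective
# refresh rate stays in range. The 30-144Hz panel limits are assumptions.

PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

def effective_refresh(fps):
    """Return (panel_refresh_hz, scans_per_frame) for a given frame rate."""
    if fps > PANEL_MAX_HZ:
        return PANEL_MAX_HZ, 1       # capped at the panel's max rate
    scans = 1
    while fps * scans < PANEL_MIN_HZ:
        scans += 1                   # show the same frame one more time
    return fps * scans, scans

for fps in (12, 20, 45, 160):
    hz, n = effective_refresh(fps)
    print(f"{fps} FPS -> panel refreshes at {hz}Hz, each frame scanned {n}x")
```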

Instead of supporting FreeSync and G-Sync as they are now, nVidia and AMD should work together to utilize DisplayPort’s adaptive sync feature and support a single standard built on it – crazy idea, we know. We’d like to see such a standard granted to monitors that support DisplayPort’s adaptive sync feature with a sufficiently wide range – e.g., 24Hz to the maximum refresh rate of the panel, be it 60, 120, 144, or more.

Conclusion

I’ve spent a lot of words in this article criticizing both FreeSync and G-Sync for their problems, but both of them are still revolutionary technologies for PC gaming. They fix the screen tearing and stutter problems that have plagued PC gaming for years, and both are huge steps in the right direction. That being said, the current situation doesn’t create a true, end-all solution; instead, it fragments development and leads to vendor lock-in for consumers. AMD and nVidia should work together to support one standard that’s cheap, open, and has a wide variable refresh rate range. This would benefit consumers, companies, and the PC gaming industry in general by increasing the pool of competition.

For PC gamers looking to get an adaptive sync-capable monitor now (G-Sync or FreeSync), we wouldn’t suggest waiting for a new standard to emerge. Despite the fact that consumers and corporations would benefit in multiple ways from a singular standard supported by both major GPU vendors, we don’t expect either of them to abandon their respective technologies anytime soon.

- Michael "The Bear" Kerns.

Michael Kerns

Michael Kerns first found us when GN's Editor-in-Chief was tirelessly answering questions on reddit pertaining to a new product launch, likely after the Editor had stayed up all night writing the news post. Michael offered a tired Editor reprieve, taking over the role of questions-answerer-extraordinaire when it was most needed. These days, Michael can be found pulling his mechanical keyboard collection apart and building Frankenstein's Monster-like monsters of keyboards. Michael wrote the vast majority of our mechanical keyboard dictionary and is an expert in keyboards.
