G-SYNC 101: G-SYNC Ceiling vs. V-SYNC


Identical or Fraternal?

As described in G-SYNC 101: Range, G-SYNC doesn’t actually become double buffer V-SYNC above its range (nor does V-SYNC take over), but instead, G-SYNC mimics V-SYNC behavior when it can no longer adjust the refresh rate to the framerate. So, when G-SYNC hits or exceeds its ceiling, how close is it to behaving like standalone V-SYNC?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Pretty close. However, the G-SYNC numbers do show a reduction, mainly in the minimums and averages across refresh rates. Why? It comes down to how G-SYNC and V-SYNC behave whenever the framerate falls (even for a moment) below the maximum refresh rate. With double buffer V-SYNC, a fixed frame delivery window is missed, and the framerate is locked to half the refresh rate by a repeated frame, which maintains extra latency; G-SYNC instead adjusts the refresh rate to the framerate in that same instance, eliminating the added latency.
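To make the difference concrete, here is a minimal sketch (a toy timing model, not actual driver behavior; the refresh rate, render time, and function names are assumptions chosen purely for illustration) of when a finished frame reaches the screen once each frame takes slightly longer than one refresh interval:

```python
# Toy model only: fixed render time per frame, scanout duration and CPU/GPU
# pipelining ignored. Compares a fixed-refresh double buffer V-SYNC display
# with a variable refresh rate (G-SYNC-style) display.
import math

REFRESH_HZ = 144
R = 1000.0 / REFRESH_HZ   # refresh interval in ms (~6.94 ms at 144Hz)
F = 7.5                   # render time per frame in ms (just misses each window)

def simulate(vrr, frames=4):
    t = 0.0               # time the GPU becomes free
    rows = []
    for n in range(frames):
        done = t + F                          # frame n finishes rendering
        if vrr:
            shown = done                      # the display refreshes on demand
        else:
            shown = math.ceil(done / R) * R   # wait for the next fixed scanout
        rows.append((n, done, shown, shown - done))
        t = shown   # double buffer blocks the GPU until the flip; with VRR, shown == done
    return rows

for label, vrr in (("V-SYNC", False), ("G-SYNC", True)):
    for n, done, shown, wait in simulate(vrr):
        print(f"{label} frame {n}: rendered {done:6.2f} ms, "
              f"displayed {shown:6.2f} ms, buffered {wait:4.2f} ms")
```

In this toy run, double buffer V-SYNC displays a new frame only every other scanout (locking to half the refresh rate) and each frame sits buffered for several extra milliseconds, while the VRR case displays every frame the moment it is ready.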

As for “triple buffer” V-SYNC, while the subject won’t be covered in depth here (since G-SYNC is based on a double buffer), the name actually encompasses two entirely separate methods. The first can be considered “alt” triple buffer V-SYNC, and is the method featured in the majority of modern games. Unlike double buffer V-SYNC, it prevents the lock to half the refresh rate when the framerate falls below it, but in turn adds 1 frame of delay over double buffer V-SYNC when the framerate exceeds the refresh rate; if double buffer adds 2-6 frames of delay, for instance, this method would add 3-7 frames.
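To put rough numbers on that (illustrative arithmetic only, using the example figures above): at 144Hz one refresh interval is 1000/144 ≈ 6.9 ms, so 2-6 frames of delay works out to roughly 14-42 ms, while the 3-7 frames of “alt” triple buffer V-SYNC works out to roughly 21-49 ms.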

“True” triple buffer V-SYNC, like “alt,” prevents the lock to half the refresh rate, but unlike “alt,” can actually reduce V-SYNC latency when the framerate exceeds the refresh rate, since the newest completed frame is always the one displayed and older, never-shown frames are discarded. This “true” method is rarely used, and its availability, in part, can depend on the game engine’s API (OpenGL, DirectX, etc.).
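The distinction between the two methods can be sketched as a presentation policy. The following toy model (an illustration under stated assumptions, not any specific driver or API implementation) treats “alt” as a two-deep first-in-first-out queue of finished frames, and “true” as a newest-wins policy that discards stale, never-displayed frames, with the framerate well above the refresh rate:

```python
# Toy model only: "fifo" ~ "alt" triple buffering (frames presented in order,
# GPU stalls when both back buffers are full); "newest" ~ "true" triple
# buffering (most recent finished frame wins, stale frames are dropped).
REFRESH_HZ = 60
R = 1000.0 / REFRESH_HZ   # refresh interval, ~16.67 ms
F = 5.0                   # render time per frame, so the game runs at ~200 FPS

def simulate(policy, refreshes=5):
    pending = []          # completion times of finished frames waiting in back buffers
    t = 0.0               # time the GPU becomes free
    ages = []
    for k in range(1, refreshes + 1):
        scanout = k * R
        # Render until this scanout. With the FIFO ("alt") policy the GPU stalls
        # once both back buffers hold frames that have not been presented yet.
        while t + F <= scanout:
            if policy == "fifo" and len(pending) == 2:
                t = scanout               # blocked until presentation frees a buffer
                break
            t += F
            pending.append(t)
            if policy == "newest" and len(pending) > 2:
                pending.pop(0)            # overwrite the stale, never-shown frame
        if not pending:
            continue
        # Present one frame at this scanout.
        done = pending.pop(0) if policy == "fifo" else pending.pop()
        if policy == "newest":
            pending.clear()
        ages.append(scanout - done)       # how old the displayed frame is
    return ages

for policy in ("fifo", "newest"):
    print(policy, [f"{age:.1f} ms" for age in simulate(policy)])
```

In this run, the FIFO policy ends up displaying frames that are well over a refresh interval old, while the newest-wins policy displays a frame completed only a few milliseconds before each scanout, which is the mechanism behind the “true” method’s latency reduction above the refresh rate.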

A form of this “true” method is implemented by the DWM (Desktop Window Manager) for borderless and windowed mode, and by Fast Sync, both of which will be explained in more detail further on.

Suffice it to say, even at its worst, G-SYNC beats V-SYNC.



3751 Comments For “G-SYNC 101”

wttrbb (Member)

Hey man, thanks for the detailed guide! I’m sorry if this has been mentioned anywhere; I just couldn’t find it. What would your recommended settings for frame generation in games be? Is there anything to keep in mind when using it?
Kind regards

schustez (Member)

Does using Reflex On + Boost or ultra low latency always cap your FPS below the max refresh rate of your monitor when you can sustain a higher FPS, or only when you are GPU bound?

higorhigorhigor (Member)

In my tests with an AMD RX 6750 XT and an LG 180Hz IPS monitor, on both Linux and Windows 11, I noticed that when I cap the FPS at 60, for example, in scenarios that could deliver 160 uncapped, my GPU significantly reduces its clock speeds, and this creates instability in frame production, causing the refresh rate to fluctuate widely.

On Linux, in LACT (a program for managing GPUs on Linux), I created a profile for the specific game and activated a performance level option that keeps the clock speeds higher. This completely solved the problem that occurs when I limit the FPS far below what my GPU can produce.

On Windows, I haven’t found a similar option, but I also haven’t looked much since I’m not using Windows a lot. I’m commenting in case you weren’t aware of this; it might help other users who feel they have this VRR disengagement issue even when the FPS seems stable in RTSS.

COLEDED (Member)

Thanks for the detailed guide. Sorry if this is a mostly unrelated question; I ask because the power plan is mentioned in the conclusion section.

My default Windows “High performance” plan puts my minimum processor state at 0% for some reason.

Is it a good idea to just use Bitsum’s highest performance plan from ParkControl, which sets the minimum processor state to 100%, all of the time? I haven’t seen an increase in idle CPU power consumption or utilization after changing to this profile.

Does this setting actually change anything?

PODDAH (Member)

Before you read this, I’m sorry for wasting your time if this question has already been answered in the article or in the comments. I tried to read everything to the best of my ability and am still a bit confused, because my English is not the best.

Hey, I’m just writing this to make sure that I’m using the best settings. I have adaptive sync turned on in my monitor’s settings, which enables me to use G-Sync, and then I have G-Sync compatible enabled and V-Sync enabled in NCP, with the preferred refresh rate set to application-controlled. I tried checking the delay and everything in Fortnite, because it has a setting which lets me do that. This gives me the least amount of delay, and even if I change my preferred refresh rate to the highest available, it still gives pretty much the same delay. I also have my FPS cap in Fortnite set to 144 just in case. I tried other things, and they either give me screen tearing or more delay. I only have one question: is this good enough to get the least amount of delay without getting any screen tearing?
