G-SYNC 101: G-SYNC vs. V-SYNC OFF


Beyond the Limits of the Scanout

It’s already been established that single, tear-free frame delivery is limited by the scanout, and V-SYNC OFF can defeat it by allowing more than one frame scan per scanout. That said, how much of an input lag advantage can be had over G-SYNC, and how high must the framerate be sustained above the refresh rate to diminish tearing artifacts and justify the difference?
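The scanout limit above is simple arithmetic: a display scans out at most one full frame per refresh cycle, so the scanout window is 1000/Hz milliseconds. A minimal sketch (the refresh rates below are illustrative examples, not the article's measurements):

```python
# The scanout limit as arithmetic: one full top-to-bottom frame scan
# per refresh cycle, so the scanout window is 1000/Hz milliseconds.
def scanout_ms(refresh_hz: float) -> float:
    """Duration of one scanout (top-to-bottom frame scan) in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz}Hz -> {scanout_ms(hz):.2f}ms per scanout")
```

At 144Hz this works out to roughly 6.94ms, which is the window V-SYNC OFF can subdivide with multiple frame scans.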

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

Quite high. Counting first on-screen reactions, V-SYNC OFF already has a slight input lag advantage over G-SYNC at the same framerate (up to 1/2 frame, and more pronounced the lower the refresh rate), but it takes a considerable increase in framerate above the given refresh rate to widen the gap to significant levels. And while the reductions may look significant in bar chart form, even with framerates in excess of 3x the refresh rate, and when measured at middle screen (crosshair-level) only, V-SYNC OFF has a limited advantage over G-SYNC in practice. Most of that advantage lies in areas that, one could argue, are comparatively useless for the average player; for instance, a viewmodel’s wrist being updated 1-3ms faster with V-SYNC OFF.
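The "up to 1/2 frame" figure can be sketched as arithmetic (this is my illustrative reasoning, not the article's measured data): with V-SYNC OFF at the same framerate as the refresh rate, the tearline lands mid-scanout on average, so content below it appears up to half a refresh cycle sooner.

```python
# Illustrative arithmetic behind the "up to 1/2 frame" advantage; the
# numbers below are examples, not the article's measured results.
def max_vsync_off_advantage_ms(refresh_hz: float) -> float:
    """Upper bound on V-SYNC OFF's latency advantage over G-SYNC at the
    same framerate: the new frame's tearline lands mid-scanout on average,
    so content below it appears up to half a refresh cycle sooner."""
    return (1000.0 / refresh_hz) / 2

for hz in (60, 144, 240):
    print(f"{hz}Hz: up to ~{max_vsync_off_advantage_ms(hz):.2f}ms advantage")
```

Note how the bound shrinks as the refresh rate rises, consistent with the advantage being more pronounced at lower refresh rates.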

This is where the refresh rate/sustained framerate ratio factors in:

[Diagrams: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

As shown in the above diagrams, the true advantage comes when V-SYNC OFF can allow not just two, but multiple frame scans in a single scanout. Unlike syncing solutions, with V-SYNC OFF the frametime is not paced to the scanout, and a frame will begin scanning in as soon as it’s rendered, regardless of whether the previous frame scan is still in progress. At 144Hz with 1000 FPS, for instance, this means that with a sustained frametime of 1ms, the display updates nearly 7 times in a single scanout.
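The "nearly 7 times" figure falls out of the ratio directly. A minimal sketch (the 144Hz/1000 FPS figures are the article's example; the function itself is my own illustration):

```python
# Sketch of the frame-scans-per-scanout ratio described above; the
# 144Hz / 1000 FPS figures are the article's example.
def frame_scans_per_scanout(refresh_hz: float, fps: float) -> float:
    """How many freshly rendered frames begin scanning in during one
    top-to-bottom scanout when V-SYNC is OFF (frametime unpaced)."""
    scanout_ms = 1000.0 / refresh_hz   # one refresh cycle
    frametime_ms = 1000.0 / fps        # sustained render time per frame
    return scanout_ms / frametime_ms

print(frame_scans_per_scanout(144, 1000))  # ~6.9: "nearly 7" updates per scanout
```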

In fact, at 240Hz, first on-screen reactions became so fast at 1000 FPS that the inherent delay in my mouse and display became the bottleneck for minimum measurements.

So, for competitive players, V-SYNC OFF still reigns supreme in the input lag realm, especially if sustained framerates can exceed the refresh rate by 5x or more. However, while visible tearing artifacts are all but eliminated at these ratios on higher refresh rate displays, the tearing can instead manifest as microstutter, and thus, even at its best, V-SYNC OFF still can’t match the consistency of G-SYNC frame delivery.



3720 Comments For “G-SYNC 101”


olam
Member

Hi, thanks for your guide!

Is it recommended to disable G-SYNC on my other monitors which support G-SYNC?
Only my main monitor runs at 360Hz, while the other two run at 240Hz.

My actual issue is that my games don’t feel as smooth as they should.

Thank you in advance.

Hena
Member

Quick question: is it normal for a game like Final Fantasy XIV to have a laggy mouse cursor with G-SYNC + NVCP V-SYNC? It’s not the case when I deactivate G-SYNC, which is odd given that the game has an option for a hardware mouse cursor. Or is this just the intended behavior with G-SYNC?

user2422
Member

I’ve read below that you personally disable the Windows VRR option. Is there a specific reason for that? If I remember correctly, you said in an older comment that this setting is not directly related, but it doesn’t hurt to leave it enabled for any edge cases.

Jokerstarik
Member

Hello! Thanks for your article. Could I ask your opinion based on your experience?
I have a TCL C805 TV in Game Mode with VRR enabled. My RTX 5070 Ti runs with G-Sync on, and V-Sync is enabled in the driver. I use Frame Generation (×2 and ×3) to reach 138 FPS. Reflex limits it to 138 FPS automatically.

Here’s my issue: with DLSS 4 I get a stable 63 FPS, but when using ×2 generation I don’t reach 138 FPS. With ×3 I do, but Reflex seems to cap the base render to around 46 FPS—the rest is generated. I can see this when disabling generation: FPS locks at 44–46 and GPU load stays at 65–70%.

How can I remove this base render limit so the GPU renders around 60 real FPS, and Frame Generation raises it to 138 FPS?
(V-Sync, G-Sync/VRR, Reflex + Multi-Frame Generation enabled.)

kdog1998
Member

Regarding Low Latency Mode and Reflex, when exactly do I turn them on or off? I use RTSS and set an FPS cap of 65 in a game; my GPU hits that easily, so I can maintain smooth gameplay. Do I still need to have a Low Latency Mode enabled? Would I set it to On, Ultra, or neither? In another game, I either exceed or sometimes sit just below my max refresh rate; do I need to use either Reflex or Low Latency Mode there, if available?

I have also noticed something weird when it comes to FPS caps, and I’m not sure what causes it, whether it’s my monitor specifically or G-SYNC. For example, in God of War Ragnarok, I was able to hit 160 FPS uncapped, and when I rotated my camera it was buttery smooth. But if I added an FPS cap, even at 157 FPS, the camera panning was unsmooth and seemed excessively blurry. I tested this by going to a hard-to-run area and uncapping my FPS: at an uncapped 120 FPS, camera panning was smooth, but if I placed an FPS cap at 120 and panned, it was weird again. It only happens with an FPS cap in place. I’ve tried it with both Ultra and On for Low Latency Mode and both have the issue, but I have not tried it off completely. I hope this makes sense and doesn’t seem like rambling lol
