G-SYNC 101: G-SYNC Ceiling vs. FPS Limit


How Low Should You Go?

Blur Busters was the world’s first site to test G-SYNC in Preview of NVIDIA G-SYNC, Part #1 (Fluidity) using an ASUS VG248QE pre-installed with a G-SYNC upgrade kit. At the time, the consensus was that limiting the framerate to anywhere between 135 and 138 FPS at 144Hz was enough to avoid V-SYNC-level input lag.

However, much has changed since the first G-SYNC upgrade kit was released: the Minimum Refresh Range wasn’t in place, the V-SYNC toggle had yet to be exposed, G-SYNC did not support borderless or windowed mode, and there was even a small performance penalty on the Kepler architecture at the time (Maxwell and later corrected this).

My own testing in my Blur Busters Forum thread found that just 2 FPS below the refresh rate was enough to avoid the G-SYNC ceiling. However, now armed with improved testing methods and equipment, is this still the case, and does the required FPS limit change depending on the refresh rate?

[Charts: Blur Buster's G-SYNC 101: Input Latency & Optimal Settings]

As the results show, just 2 FPS below the refresh rate is indeed still enough to avoid the G-SYNC ceiling and prevent V-SYNC-level input lag, and this number does not change, regardless of the maximum refresh rate in use.

To leave no stone unturned, an “at” FPS limit, along with -1 FPS, -2 FPS, and finally -10 FPS limits, were tested to prove that no real improvements can be had even far below -2 FPS. In fact, limiting the FPS lower than needed can actually slightly increase input lag, especially at lower refresh rates, since frametimes increase as the sustained framerate drops, and frame delivery slows accordingly.
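To put rough numbers on that, here is a minimal Python sketch (not part of the article’s measurements, just the basic frametime arithmetic) comparing the per-frame delivery time of a -2 FPS cap against a -10 FPS cap at a few refresh rates:

```python
# Minimal sketch: frametime cost of capping further below the refresh rate than necessary.
# The refresh rates and cap offsets below are illustrative, not the article's test data.

def frametime_ms(fps: float) -> float:
    """Time to deliver one frame, in milliseconds, at a sustained framerate."""
    return 1000.0 / fps

for refresh in (60, 144, 240):
    near_cap = refresh - 2   # the -2 FPS limit shown to clear the G-SYNC ceiling
    far_cap = refresh - 10   # the -10 FPS limit tested "to leave no stone unturned"
    added = frametime_ms(far_cap) - frametime_ms(near_cap)
    print(f"{refresh}Hz: {frametime_ms(near_cap):.2f} ms/frame at {near_cap} FPS vs "
          f"{frametime_ms(far_cap):.2f} ms/frame at {far_cap} FPS (+{added:.2f} ms)")
```

At 60Hz the -10 FPS cap adds roughly 2.8 ms per frame over the -2 FPS cap, whereas at 240Hz the difference shrinks to about 0.15 ms, which is why over-limiting hurts most at lower refresh rates.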

As for the “perfect” number, going by the results, and taking into consideration variances in accuracy from FPS limiter to FPS limiter, along with differences in performance from system to system, a -3 FPS limit is the safest bet, and is my new recommendation. A lower FPS limit, at least for the purpose of avoiding the G-SYNC ceiling, will simply rob frames.
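For reference, that rule of thumb reduces to simple subtraction; the sketch below (a hypothetical helper, not an official tool) applies the -3 FPS margin to a few common refresh rates:

```python
# Minimal sketch of the -3 FPS rule of thumb; the function name and values are illustrative.

def recommended_fps_cap(refresh_hz: int, margin: int = 3) -> int:
    """Return an FPS limit 'margin' frames below the refresh rate to stay within the G-SYNC range."""
    return refresh_hz - margin

for hz in (60, 120, 144, 165, 240):
    print(f"{hz}Hz -> limit to {recommended_fps_cap(hz)} FPS")
```

So a 144Hz display would be limited to 141 FPS, and a 240Hz display to 237 FPS.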



3696 Comments For “G-SYNC 101”


tearxinnuan
Member

Thank you very much for your article and tutorial! I’ve set up the appropriate settings according to your article, but I still have some questions I’d like to ask!

First, my current settings are:
NVCP: G-SYNC + V-SYNC on, LLM off,
In Game: Reflex on + boost, V-SYNC off

I believe this setup is optimal for GSYNC usage. I don’t limit my frame rate using any external software or NVCP. When I enable Reflex in-game, it automatically caps my frame rate at 260 FPS (my monitor is 280Hz). I think relying solely on Reflex to limit my frame rate would be more straightforward than setting it separately, and perhaps also avoid conflicts and instability caused by multiple frame limits. Secondly, I’ve personally tested the games I play, and Reflex takes precedence over both the in-game and NVCP frame limits. That is, no matter how much I limit my frame rate, once Reflex is enabled, it caps it at 260 FPS.

I primarily play competitive games like Valve, APEX, and Overwatch, but I also occasionally play other single-player games. Since the competitive games I play all have Reflex, can I completely abandon all external frame limiting methods and rely solely on Reflex?

Also, regarding LLM in NVCP, should I set it on or off, or even set it to Ultra? I’m not sure if there are any advantages or disadvantages to turning LLM on, even though Reflex takes over a lot of the processing. There’s a lot of controversy online about LLM, and even NVIDIA officials claim that setting LLM to Ultra will minimize V-SYNC latency.

Looking forward to your answers!

dimacbka
Member

Hi. I really liked this article, but I have a couple of questions. I have a new PC that gets 800 FPS in CS2. How do I set up the G-SYNC + V-SYNC + Reflex bundle correctly? My monitor is 280Hz. I’m confused: do I need to limit frames via the NVIDIA panel? Yesterday I turned on “delay” at Ultra and Reflex + Boost, and in game the framerate was around 260, with the fps_max parameter set to 0.

mike-lesnik
Member

Hello, jorimt! My question is more about input delay than G-sync, but I decided to ask it here because I really like your style of response — simple and clear.
I don’t quite understand what role frametime plays in input delay. It is often written that frametime is the time needed to create a frame, but 60 frames of 16.6 ms each can be created by either an underloaded or an overloaded GPU. On screen, we see the same framerate and frametime in both cases, but the resulting input delay will be different…
That is, is frametime not “the time it took the system (CPU-OS-Engine-GPU) to create the frame,” but rather “the time the display allots to showing the frame before the next one appears”?

dpawelcz
Member

I’m having an awful time trying to get Street Fighter 6 feeling good on my Zephyrus G14 gaming laptop. It has a 120Hz OLED screen. I swear in game it doesn’t feel like it’s getting 120Hz, and it feels input laggy.
The game is locked at 60 FPS, and it feels as if it’s running at 60Hz. Outside the game I’ve confirmed I’m running at 120Hz on the display. I have G-SYNC ON and V-SYNC ON in the NVIDIA Control Panel. I’ve also noticed that no matter what, SF6 starts with V-SYNC on in its settings and I have to turn it off manually every time. I suspect that might be the issue.
Any tips would be greatly appreciated

anthony3192
Member

When I activate V-SYNC from the NVIDIA app per profile, i.e. game by game (as you advised me), some games like The Witcher, Wukong, and Fortnite are limited to 225 FPS (I have a 240Hz monitor), while other titles like Valorant and CoD have an unlocked framerate, so they are not limited. How come?
