G-SYNC 101: In-game vs. External FPS Limiters

Closer to the Source

Up until this point, an in-game framerate limiter has been used exclusively to test FPS-limited scenarios. However, in-game framerate limiters aren't available in every game. While they aren't required for games where the framerate can't meet or exceed the maximum refresh rate, if the system can sustain the framerate above the refresh rate and such an option isn't present, an external framerate limiter must be used instead to prevent V-SYNC-level input lag.

In-game framerate limiters, being at the game’s engine-level, are almost always free of additional latency, as they can regulate frames at the source. External framerate limiters, on the other hand, must intercept frames further down the rendering chain, which can result in delayed frame delivery and additional input latency; how much depends on the limiter and its implementation.
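To make the pacing idea concrete, here is a minimal, illustrative sketch of a sleep-based frame limiter in Python. This is not how RTSS is actually implemented (real limiters use far more precise busy-wait timing); it only shows the basic concept of delaying each frame to hold a fixed interval. The function name and parameters are hypothetical.

```python
import time

def run_capped(render_frame, fps_cap=142, frames=5):
    """Sleep-based frame limiter sketch (illustrative only; real
    limiters such as RTSS use far more precise timing methods)."""
    interval = 1.0 / fps_cap              # target time per frame
    next_deadline = time.perf_counter()
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # stand-in for the game's render work
        next_deadline += interval
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)         # wait out the rest of the frame interval
        frame_times.append(time.perf_counter() - start)
    return frame_times
```

An external limiter intervening like this, after the game has already prepared its frame, is why it can delay frame delivery relative to an engine-level cap, which throttles the game before the frame is generated.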

RTSS is a CPU-level FPS limiter, which is the closest an external method can get to the engine-level of an in-game limiter. In my initial input lag tests on my original thread, RTSS appeared to introduce no additional delay when used with G-SYNC. However, it was later discovered that disabling CS:GO's "Multicore Rendering" setting (which limits the game to a single CPU core) had caused the discrepancy; once the setting was re-enabled, RTSS introduced the expected 1 frame of delay.

Seeing as CS:GO still uses DX9 and is a native single-core performer, I opted to test the more modern "Overwatch" this time around, which uses DX11 and features native multi-threaded/multi-core support. Will RTSS behave the same way in a native multi-core game?

[Chart: RTSS input latency results (Blur Buster's G-SYNC 101: Input Latency & Optimal Settings)]

Yes, RTSS still introduces up to 1 frame of delay, regardless of the syncing method used, or lack thereof. To confirm that a -2 FPS limit was enough to avoid the G-SYNC ceiling, a -10 FPS limit was also tested, with no improvement. The V-SYNC scenario also shows that RTSS delay stacks with other types of delay, retaining FPS-limited V-SYNC's 1/2 to 1 frame of accumulative delay.

Next up is Nvidia’s FPS limiter, which can be accessed via the third-party “Nvidia Inspector.” Unlike RTSS, it is a driver-level limiter, one further step removed from engine-level. My original tests showed the Nvidia limiter introduced 2 frames of delay across V-SYNC OFF, V-SYNC, and G-SYNC scenarios.

[Chart: Nvidia FPS limiter input latency results (Blur Buster's G-SYNC 101: Input Latency & Optimal Settings)]

Yet again, the results for V-SYNC and V-SYNC OFF ("Use the 3D application setting" + in-game V-SYNC disabled) show that standard, out-of-the-box usage of both Nvidia's v1 and v2 FPS limiters introduces the expected 2 frames of delay. The limiter's impact on G-SYNC appears to be particularly unforgiving: a 2 to 3 1/2 frame delay, with maximums increasing at -2 FPS compared to -10 FPS. This suggests a -2 FPS limit with this limiter may not be enough to keep the framerate below the G-SYNC ceiling at all times, and the issue may be worsened by the Nvidia limiter's own frame pacing behavior interfering with G-SYNC functionality.

Needless to say, even if an in-game framerate limiter isn't available, RTSS introduces only up to 1 frame of delay, which is still preferable to the 2+ frame delay added by Nvidia's limiter with G-SYNC enabled, and a far superior alternative to the 2-6 frame delay added by uncapped G-SYNC.
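To put those "frames of delay" figures into absolute terms, they can be converted to milliseconds at a given refresh rate. A 144 Hz display is assumed here purely for illustration:

```python
def delay_ms(frames_of_delay, refresh_hz=144):
    """Convert frames of delay into milliseconds at a given refresh rate."""
    return frames_of_delay * 1000.0 / refresh_hz

# At 144 Hz, one frame lasts ~6.9 ms, so:
#   RTSS (up to 1 frame)            -> ~6.9 ms
#   Nvidia limiter (2 frames)       -> ~13.9 ms
#   uncapped G-SYNC (2 to 6 frames) -> ~13.9 to ~41.7 ms
```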

507 Comments For “G-SYNC 101”



Why do you think that basically no (to my understanding) cs:go pros use these settings or anything similar? Just regular settings with no capped FPS. Is it lack of knowledge or what do you think?


Hey jorimt,

Just a quick question after reading your guide on G-sync + V-sync and input lag…

So I understand that if G-sync is used in combination with V-sync, it would essentially be forcing the computer to deliver steady frame timing over the period of a second (so for 144 Hz they would all be 6.9 ms, versus the 144 frames averaging out to 6.9 ms). Also, from what I understand, all of this discussion is related to changing these settings within the NVIDIA control panel.

What does this mean with regards to in-game settings, say for a game like Apex Legends or Modern Warfare? Do we have to ENABLE V-sync within the individual games as well to see the benefits from the G-sync + V-sync settings in the Control Panel? Or is doing so not necessary, and might it cause other issues?


Hello, jorimt.

This guide just rocks. I have just one question about the new Nvidia driver setting "Max Frame Rate". From now on, I can set the maximum frame rate that the GPU will render a game at directly in the driver.

Is there a difference in latency between setting the max FPS with the new driver option and in RTSS?

Thank you!


New LG Tvs with G-Sync (Firmware 04.71.04):

1. Improvement
1) Featuring NVIDIA G-Sync Compatible.

2. Applicable model list


>And while each frame is still rendered in 16.6ms, and delivered in intervals of 60 per second on the higher refresh rate display, they are scanned in at a much faster 6.9ms per.

Could you explain this line a bit more? In a 240 Hz 60 FPS CAP example, for instance?
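The arithmetic in the quoted line generalizes directly: the interval between new frames is set by the FPS cap, while the scan-out time of each frame is set by the refresh rate. As a sketch for the 240 Hz / 60 FPS case asked about:

```python
def scanout_ms(refresh_hz):
    """Time for the display to scan one frame in, top to bottom."""
    return 1000.0 / refresh_hz

def frame_interval_ms(fps):
    """Time between new frames delivered by the game."""
    return 1000.0 / fps

# 240 Hz display with a 60 FPS cap: a new frame is rendered and
# delivered every ~16.7 ms, but each one is scanned onto the panel
# in only ~4.2 ms (vs. ~6.9 ms per scan on a 144 Hz display).
```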