G-SYNC 101: In-game vs. External FPS Limiters


Closer to the Source*

*As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVCP; labeled “Max Frame Rate,” it is a CPU-level FPS limiter, and as such, is comparable to the RTSS framerate limiter in both frametime performance and added delay. The Nvidia framerate limiting solutions tested below are legacy, and their results do not apply to the “Max Frame Rate” limiter.

Up until this point, an in-game framerate limiter has been used exclusively to test FPS-limited scenarios. However, in-game framerate limiters aren't available in every game. They aren't needed in games where the framerate can't meet or exceed the maximum refresh rate, but if the system can sustain the framerate above the refresh rate and no in-game option is present, an external framerate limiter must be used instead to prevent V-SYNC-level input lag.

In-game framerate limiters, operating at the game's engine level, are almost always free of additional latency, as they can regulate frames at the source. External framerate limiters, on the other hand, must intercept frames further down the rendering chain, which can result in delayed frame delivery and additional input latency; how much depends on the limiter and its implementation.
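To illustrate what "regulating frames at the source" means in principle, below is a minimal, purely illustrative sketch of an engine-level frame-pacing loop (real limiters live inside the engine's render loop, typically in C/C++; the function and parameter names here are my own). The loop holds each frame to a fixed frame-time budget by sleeping off leftover time against a running deadline, so no frame is generated faster than the cap allows:

```python
import time

def run_capped_loop(fps_cap, num_frames, render=lambda: None):
    """Illustrative engine-level limiter: pace frames to a fixed
    frame-time budget by sleeping off leftover time each iteration."""
    budget = 1.0 / fps_cap            # e.g. ~7.09 ms per frame at a 141 FPS cap
    frame_times = []
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        start = time.perf_counter()
        render()                      # stand-in for simulating/submitting a frame
        next_deadline += budget       # deadline-based pacing resists drift better
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)     # throttle at the source, before the next frame
        frame_times.append(time.perf_counter() - start)
    return frame_times

frame_times = run_capped_loop(fps_cap=141, num_frames=30)
avg_ms = 1000.0 * sum(frame_times) / len(frame_times)
print(f"average frame time: {avg_ms:.2f} ms")
```

Because the throttle runs before the next frame is even started, no extra buffered frame accumulates; an external limiter intercepting frames after submission cannot make that guarantee, which is where its added delay comes from.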

RTSS is a CPU-level FPS limiter, which is the closest an external method can get to the engine level of an in-game limiter. In my initial input lag tests for my original thread, RTSS appeared to introduce no additional delay when used with G-SYNC. However, it was later discovered that CS:GO's disabled "Multicore Rendering" setting, which restricts the game to a single CPU core, had caused the discrepancy; once the setting was enabled, RTSS introduced the expected 1 frame of delay.

Seeing as CS:GO still uses DX9 and is natively a single-core performer, I opted to test the more modern "Overwatch" this time around, which uses DX11 and features native multi-threaded/multi-core support. Will RTSS behave the same way in a native multi-core game?

[Charts: Blur Busters' G-SYNC 101: Input Latency & Optimal Settings]

Yes, RTSS still introduces up to 1 frame of delay, regardless of the syncing method (or lack thereof) used. To prove that a -2 FPS limit was enough to avoid the G-SYNC ceiling, a -10 FPS limit was tested, with no improvement. The V-SYNC scenario also shows that RTSS delay stacks with other types of delay, retaining FPS-limited V-SYNC's 1/2 to 1 frame of cumulative delay.
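For context on why a mere -2 FPS cap can suffice, the headroom it buys is the difference between the capped frame time and the refresh cycle. A quick back-of-the-envelope calculation (assuming a 144 Hz display for illustration; the function name is my own):

```python
def gsync_cap_headroom_ms(refresh_hz: float, fps_offset: float) -> float:
    """Extra frame time (ms) a -N FPS cap leaves per frame, i.e. how much
    slower each capped frame arrives than the display's refresh cycle."""
    capped_fps = refresh_hz - fps_offset
    return 1000.0 / capped_fps - 1000.0 / refresh_hz

# At 144 Hz, a -2 FPS cap (142 FPS) buys ~0.1 ms per frame,
# while a -10 FPS cap (134 FPS) buys ~0.5 ms.
print(f"-2 FPS:  {gsync_cap_headroom_ms(144, 2):.3f} ms")
print(f"-10 FPS: {gsync_cap_headroom_ms(144, 10):.3f} ms")
```

A limiter with consistent frame pacing never delivers a frame faster than its budget, so even ~0.1 ms of headroom keeps the framerate under the G-SYNC ceiling; a limiter with erratic pacing can still overshoot it, which is why the deeper -10 FPS cap was tested as a control.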

Next up is Nvidia’s FPS limiter, which can be accessed via the third-party “Nvidia Inspector.” Unlike RTSS, it is a driver-level limiter, one step further removed from the engine level. My original tests showed the Nvidia limiter introduced 2 frames of delay across V-SYNC OFF, V-SYNC, and G-SYNC scenarios.

[Charts: Blur Busters' G-SYNC 101: Input Latency & Optimal Settings]

Yet again, the results for V-SYNC and V-SYNC OFF (“Use the 3D application setting” + in-game V-SYNC disabled) show that standard, out-of-the-box usage of both Nvidia’s v1 and v2 FPS limiters introduces the expected 2 frames of delay. The limiter’s impact on G-SYNC appears to be particularly unforgiving: a 2 to 3 1/2 frame delay, with maximums increasing at -2 FPS compared to -10 FPS. This suggests a -2 FPS limit with this limiter may not be enough to keep the framerate below the G-SYNC ceiling at all times, and the issue may be worsened by the Nvidia limiter’s own frame pacing behavior interfering with G-SYNC functionality.

Needless to say, even if an in-game framerate limiter isn’t available, RTSS introduces only up to 1 frame of delay, which is still preferable to the 2+ frame delay added by Nvidia’s limiter with G-SYNC enabled, and a far superior alternative to the 2-6 frames of delay added by uncapped G-SYNC.



3062 Comments For “G-SYNC 101”


toby23
Member

If I have a 120 Hz monitor with G-Sync and can achieve 115 FPS average in a game, is there any negative to locking the framerate to 59 FPS with RTSS to lower power consumption and smooth out frametimes?
Running unlocked in MSFS results in the frametime jumping around all over the place, but locking to 59 FPS makes it steady.

PS Super article, thank you so much for keeping it updated.

Ryan Le
Member

In Sea of Thieves, I have V-Sync turned off, but there is also an option to set the buffering to either double or triple, and there’s no off option. I set it to double buffering, but do I still need to enable V-Sync in NVCP? Would the in-game double buffering option (with in-game V-Sync off) conflict with NVCP V-Sync since it’s also running on double buffering?

Sequinoz
Member

Hi Jorimt. A bit of a long comment here so I hope you don’t mind. Recently I bought a new G-Sync monitor (XG321UG) and noticed a peculiar G-Sync behaviour but I’m unsure if it’s abnormal or not.

To reproduce the behaviour, I ran the G-Sync Pendulum Demo application and manually changed the FPS from 60 to 50. The expected behaviour is a seamless framerate change with no noticeable stutter.

Instead, I noticed ~0.2 seconds of continuous stutter from the moment the FPS changed from 60 to 50. It’s almost as if the G-Sync module tries to “catch up” to the sudden change in FPS.

Changing from 60 to 55 did not seem to show the problem much (if at all) and changing from 50 to 60 showed more of a one-time frame “jump”. Setting the FPS to gradually change back and forth between 60 and 40 seems to also be normal. Notably, the problem is less noticeable at higher FPS.

I tried testing my old G-Sync monitor (PG27AQ) and changing the FPS from 60 to 50 only seemed to show one stutter/frame jump but was less noticeable and did not stutter for as long as ~0.2 seconds.

I’m wondering if the symptom I’m seeing on my new monitor is normal or if it’s an indication that the G-Sync module is faulty.

As a side note: I also turned on the display’s built-in refresh rate counter and whenever I change FPS from 60 to 50, the refresh rate would go 60 > 49 > 43 > 49 > 50. Changing FPS from 60 to 55 instead showed 60 > 56 > 53 > 55. I’m unsure though if the built-in refresh rate counter is 100% accurate.

Perhaps the dips below the targeted FPS are the reason for the 0.2 secs of stutter, which is more noticeable at lower targeted FPS and with a larger change in FPS. Again, I’m not fully sure about this and would like to know what you think.

Some more info that might be helpful~
GPU: RTX 3090
Driver Version: 551.46
Connection: DisplayPort (tried both cables that came with the old and new monitor but no difference)
Resolution: 3840 x 2160 Native
Refresh Rate: 144Hz Native
G-Sync Mode: On and Fullscreen Only. I heard the Pendulum Demo test overrides the G-Sync setting, in which case only the G-Sync option was used.

Indignified
Member

Hello, why do pro players of FPS games use uncapped FPS instead of these settings? Are there any benefits to using uncapped FPS?

Pyerimi
Guest

Also, I wanted to ask what to do in case an in-game FPS limiter has a bad implementation and introduces microstutter, like in this case: https://www.reddit.com/r/horizon/comments/i5p6io/pc_psa_do_not_use_the_ingame_fps_limiter_use_rtss/

Should I still use the in-game option, or RTSS?
