G-SYNC 101: Control Panel

G-SYNC Module

The G-SYNC module is a small chip that replaces the display’s standard internal scaler, and contains enough onboard memory to hold and process a single frame at a time.

The module exploits the vertical blanking interval (the span between the previous and next frame scan) to manipulate the display’s internal timings, performing G2G (gray-to-gray) overdrive calculations to prevent ghosting, and synchronizing the display’s refresh rate to the GPU’s render rate to eliminate tearing, along with the delayed frame delivery and adjoining stutter caused by traditional syncing methods.


The Blur Busters Test UFO motion test pattern below uses motion interpolation techniques to simulate the seamless framerate transitions G-SYNC provides within the refresh rate, when directly compared to standalone V-SYNC.

G-SYNC Activation

“Enable for full screen mode” (exclusive fullscreen functionality only) will automatically engage when a supported display is connected to the GPU. If G-SYNC behavior is suspect or non-functioning, untick the “Enable G-SYNC, G-SYNC Compatible” box, apply, re-tick, and apply.

G-SYNC Windowed Mode

“Enable for windowed and full screen mode” allows G-SYNC support for windowed and borderless windowed mode. This option was introduced in a 2015 driver update; by manipulating the DWM (Desktop Window Manager) framebuffer, it enables G-SYNC’s VRR (variable refresh rate) to synchronize to the focused window’s render rate. Unfocused windows remain at the desktop’s fixed refresh rate until focused on.

G-SYNC only functions on one window at a time, and thus any unfocused window that contains moving content will appear to stutter or slow down, which is why a variety of non-gaming applications (popular web browsers among them) include predefined Nvidia profiles that disable G-SYNC support.

Note: this setting may require a game or system restart after application; the “G-SYNC Indicator” (Nvidia Control Panel > Display > G-SYNC Indicator) can be enabled to verify it is working as intended.

G-SYNC Preferred Refresh Rate

“Highest available” automatically engages when G-SYNC is enabled, and overrides the in-game refresh rate selector (if present), defaulting to the highest supported refresh rate of the display. This is useful for games that don’t include a selector, and ensures the display’s native refresh rate is utilized.

“Application-controlled” adheres to the desktop’s current refresh rate, or defers control to games that contain a refresh rate selector.

Note: this setting only applies to games being run in exclusive fullscreen mode. For games being run in borderless or windowed mode, the desktop dictates the refresh rate.


G-SYNC (GPU Synchronization) works on the same principle as double buffer V-SYNC: the GPU renders frame A into buffer A and, upon completion, scans it to the display. Meanwhile, as buffer A finishes scanning its frame, the GPU renders frame B into buffer B and, upon completion, scans it to the display; repeat.
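The key consequence of this flow can be sketched as a toy timing model (a hypothetical simplification for illustration only, not NVIDIA’s actual implementation; the function name and frame times are invented): with VRR, each frame scans out as soon as it finishes rendering, but never faster than the display’s maximum refresh rate allows.

```python
# Toy VRR presentation model (illustrative only): the display waits
# for the GPU, rather than the GPU waiting for a fixed refresh tick.

def present_frames(render_times_ms, max_refresh_hz=144):
    """Return the scanout completion time of each frame, assuming each
    frame scans out when rendered, clamped to the display's minimum
    frame time (the fastest its max refresh rate permits)."""
    min_frametime = 1000.0 / max_refresh_hz
    scanout_times = []
    t = 0.0
    for render_ms in render_times_ms:
        # The display holds the previous frame until the next is ready,
        # but cannot refresh faster than its maximum rate.
        t += max(render_ms, min_frametime)
        scanout_times.append(t)
    return scanout_times

# Three frames rendered in 10 ms, 20 ms, and 5 ms (100, 50, 200 FPS):
# the 5 ms frame is clamped to ~6.94 ms (the 144Hz minimum frame time).
print(present_frames([10.0, 20.0, 5.0]))
```

Note that no frame in the model waits for a fixed refresh tick, which is why the stutter of traditional V-SYNC (frames held until the next interval) does not occur within the G-SYNC range.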

The primary difference between G-SYNC and V-SYNC is the method in which rendered frames are synchronized. With V-SYNC, the GPU’s render rate is synchronized to the fixed refresh rate of the display. With G-SYNC, the display’s VRR (variable refresh rate) is synchronized to the GPU’s render rate.

Upon its release, G-SYNC’s ability to fall back on fixed refresh rate V-SYNC behavior when exceeding the maximum refresh rate of the display was built-in and non-optional. A 2015 driver update later exposed the option.

This update led to recurring confusion, creating a misconception that G-SYNC and V-SYNC are entirely separate options. However, with G-SYNC enabled, the “Vertical sync” option in the control panel no longer acts as V-SYNC; it actually dictates two things: one, whether the G-SYNC module compensates for frametime variances output by the system, which prevents tearing at all times (G-SYNC + V-SYNC “Off” disables this behavior; see G-SYNC 101: Range), and two, whether G-SYNC falls back on fixed refresh rate V-SYNC behavior. If V-SYNC is “On,” G-SYNC will revert to V-SYNC behavior above its range; if V-SYNC is “Off,” G-SYNC will disable above its range, and tearing will begin display wide.

Within its range, G-SYNC is the only syncing method active, no matter the V-SYNC “On” or “Off” setting.
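The rules above can be condensed into a small decision sketch (illustrative pseudologic only, not driver code; the function name and return strings are invented for clarity):

```python
# Which syncing method governs frame delivery at a given framerate
# when G-SYNC is enabled, depending on the "Vertical sync" setting.

def active_sync(fps, max_refresh_hz, vsync_on):
    if fps <= max_refresh_hz:
        # Within the G-SYNC range, G-SYNC alone is active regardless
        # of the V-SYNC setting (though V-SYNC "Off" also disables
        # frametime-variance compensation, which can permit tearing).
        return "G-SYNC"
    # Above the range, the V-SYNC setting dictates the fallback:
    if vsync_on:
        return "V-SYNC fallback (capped at max refresh)"
    return "No sync (tearing)"

print(active_sync(120, 144, vsync_on=True))   # within range
print(active_sync(200, 144, vsync_on=True))   # above range, fallback
print(active_sync(200, 144, vsync_on=False))  # above range, tearing
```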

Currently, when G-SYNC is enabled, the control panel’s “Vertical sync” entry is automatically set to “Use the 3D application setting,” which defers V-SYNC fallback behavior and frametime compensation control to the in-game V-SYNC option. This can be manually overridden by changing the “Vertical sync” entry in the control panel to “Off,” “On,” or “Fast.”

215 Comments For “G-SYNC 101”



Hi, do you have 10 minutes to listen to my story?

I was wondering if you could explain a very persistent frametime-spiking issue I’ve recently run into with the console emulator Cemu that I am tearing my hair out over at this point.

My system is capable of reaching a constant 60-100 FPS (at 2560×1440 resolution, depending on the in-game region), but literally EVERY 10-15 seconds an annoying frametime spike occurs, even without loading shaders or rendering new objects, and I’m starting to get a headache from this.

It seems to be a well-known issue with the newer versions in the Cemu community, and it is assumed to be a driver problem with Nvidia cards. I would be totally OK with it if the developers eventually find the issue and fix it for good, if there ACTUALLY WEREN’T several people on Nvidia cards who do NOT suffer from this problem at all, running on different Intel CPU generations as well.
Of course I tried all kinds of different Cemu-, Windows-, and Nvidia-related settings (pre-rendered frames, threaded optimization, nearly every combination of settings); I have been trying to fix this for over a month now.
I also started swapping hardware parts to rule out damaged hardware, beginning with the CPU, which is probably the most important component for emulation (I tried two different 4790Ks OC’d @ 4.8 GHz; I chose this CPU because of its high single-core performance, even nowadays, and because of Win7, which I am still running). Tomorrow I’ll check another graphics card; at the moment I use a custom 1080 Ti @ 2050 MHz running via PCI-E 3 on x8 lanes.
With “real” games on Steam, I merely get frametime spiking when a new region is being rendered. By the way, Cemu has a frame profiler that shows how often spikes occur, and it’s WAY too often considering I’m using a system that can run games at 4K quite fluidly.
Nonetheless, some hardware parts are no longer the newest, and it’s not Windows 10 either, and I’m already starting to believe that is the issue, but if so, WHY THE ACTUAL HECK?? The 4790K @ 4.8 GHz can compete with a 7700K @ stock 4.2 GHz if I compare CPU-Z’s bench scores.
This is my build, by the way:

CPU: 4790K @ 4.8 GHz; always runs at max 70°C while running Cemu; 60-70% workload in Task Manager with HT, but Cemu isn’t optimized for Hyper-Threading; I still get between 60 and 100 FPS.
GPU: KFA2 GTX 1080 Ti @ 2050 MHz GPU clock; 6000 MHz memory clock; GPU-Z shows a workload of 60-70% at 4K resolution; I can keep it between 50-60°C.
RAM: 2x8 GB G.Skill TridentX DDR3 @ 2400 MHz, 1.65V DRAM
SSD: Samsung 960 Pro NVMe M.2 512 GB on PCI-E 3 using the other x8 lanes
MB: Asus Maximus VII Gene Z97
PSU: Corsair TX 750W
OS: Win 7 HP x64
Monitor: G-SYNC-capable WQHD 144Hz
Yes, the mainboard and the CPU aren’t the newest, but WHO CARES; I don’t want to upgrade to an OS I’m unfamiliar with, and Win7 is still an option.

Also, as you explained in your “What are Frametime Spikes?” section, Cemu probably falls under the category “inefficient game engine,” yes, but that doesn’t explain why several other guys whose demonstration videos I watched don’t have this issue to THAT extent; I mean, every 10-15 seconds is not once in 2 minutes.

There are actually things that reduced spiking; these are:
-running the game at stock resolution (1280×720)
-running the game at stock framerate (30 FPS)
-This point really annoys me: G-SYNC alone, compared to V-SYNC, SEEMS TO PROVOKE MORE SPIKING!! Unbelievable; I just got my hands on a 700€ G-SYNC monitor, and yes, it does do an amazing job, the well-known V-SYNC stutter vanished, BUT AT WHAT PRICE?? Actually increased frametime spiking!
Yes, if Cemu already has Nvidia issues, then this shouldn’t be such a surprise, but I hadn’t read a single thing before about G-SYNC limiting Cemu in that way, so I feel quite lonely with this, to be honest! Your advice to use G-SYNC + V-SYNC enabled may work for regular games, but for me with Cemu it
ACTUALLY carries the spiking to extremes; at 4K, not 5 seconds pass without a head-burning spike, despite running between 50-70 FPS at that resolution (resolution set to 3840×2160 in Cemu). Seriously, what’s the matter with all that?!
If there’s no difference with the other card (GTX 1060) I’ll test tomorrow, then it might be the mainboard itself, since that is the only used part I bought from eBay; I simply can’t find a logical reason why some people on a 6700K, 7700K, or 8700K, all with Nvidia, don’t have this issue at all.
My system is not incapable, or is it?

Other informative things I do:
-I run MSI Afterburner and RTSS to OC the graphics card; I read that having the “Power” option enabled in the monitoring graph may cause spiking, so I disabled it; no difference.
-Older versions of Cemu really HAD MUCH LESS spiking; I talked to the developer of Cemu, and he said that the newer versions have been reworked from scratch. Generally it is assumed to be a VRAM issue, but still, what about those other guys?
-The reason I don’t use older versions is that certain mods don’t work with them.
-I’ve created a RAM drive and run Cemu from it; despite having an SSD, I notice quicker response times, e.g., when opening menus in-game; no more than 10 out of 16 GB of RAM are in use while running Cemu.
-I don’t run any unnecessary background programs; there’s also no malware on my system (at least as far as I know).
-I run Norton; it is sometimes a huge resource devourer, but nothing serious.
-Latest graphics card driver, no yellow marks in Device Manager. I installed Windows with the help of Fernando’s Win-Raid forum; he provided (among others) the NVMe driver for my SSD, since Win7 doesn’t support this type at installation.
Other drivers were the Management Engine, Rapid Start Technology, chipset INF files (no drivers), and the memory controller for the SSD; the rest were the recommended drivers from Asus.
-C: drive (Windows) installed on the SSD with 150 GB (70 GB free); D: drive as a separate partition where the most important stuff is installed (100 GB free; I don’t bloat the C: drive).
-Proper .NET Framework, Windows updates, etc.
-Windows power management at full speed; no energy-saving options ticked in the BIOS (as far as I know); BIOS on the latest available version.
-Page file at the Windows default, but I never come close to running out of RAM anyway.
-The built-in frame profiler in Cemu shows a green amplitude every time one of these frametime spikes occurs, which stands for “render target management, FBO creation, texture creation”; yes, that sounds graphics-card-related, but could there be a connection to other hardware?

Sorry for the wall of text, by the way, but I am quite at my wit’s end trying to comprehend other, similar builds’ success, even though their owners probably don’t care half as much about PC details as I do, and I have seemingly checked all of the points you mention as possible causes of frametime spikes,
and will continue with the hardware part tomorrow.
If you have any suggestion that could even hint at a direction I’ve overlooked, PLEASE let me know.

Thank you for reading.


On the topic of FPS limiters: two tests have been done somewhat recently which found that RTSS provides more consistent frametimes than in-game limiters do (at the expense of 1 frame of input lag).
What’s your take on these?


What value should the frametime limit be set to in RTSS?


Amazingly detailed guide, thanks. Is there a difference between having V-SYNC “On” versus V-SYNC “Fast”?

Thanks heaps.


So changing the max pre-rendered frames did nothing for me, but the VR pre-rendered frames setting was set to 1 by default; should this be set to application-controlled instead?