How Does G-SYNC Fix Stutters?

G-SYNC is a variable refresh rate technology. The refresh rate changes while gaming!

Many people are surprised that a fluctuating framerate can be displayed without any visible stutters. Yet it is possible on a variable refresh rate display such as nVidia’s G-SYNC. These diagrams explain, from the human vision perspective, why nVidia G-SYNC eliminates erratic stutters when your eyes track moving objects on a monitor:

Without G-SYNC:

With G-SYNC:

As your eyes track a moving object at a constant speed, the positions of moving on-screen objects are exactly where they are supposed to be when using nVidia G-SYNC, at any frame rate fluctuating within G-SYNC’s range (e.g. 30fps to 144fps). You no longer see erratic stutters during variable-framerate motion.
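The difference between the two cases can be sketched numerically. The toy simulation below is purely illustrative (the speeds, timestamps, and function names are assumptions, not NVIDIA’s implementation): a fixed-Hz display delays each rendered frame to the next refresh boundary, while a variable refresh display presents it immediately, so object positions never diverge from a constant-velocity eye-tracking gaze.

```python
# Toy comparison (illustrative assumptions only, not NVIDIA's implementation):
# how far on-screen object positions diverge from a constant-velocity
# eye-tracking gaze, with and without variable refresh.
import math

SPEED = 1000.0  # object speed in pixels/second (assumed)

def gaze(t):
    # The eye tracks the moving object at constant velocity.
    return SPEED * t

def vrr_error(render_times):
    # Variable refresh: each frame is presented the moment it finishes
    # rendering, and its object position was computed for that same moment,
    # so the tracking error is always zero.
    return [abs(SPEED * t - gaze(t)) for t in render_times]

def fixed_hz_error(render_times, hz=60):
    # Fixed refresh: a frame finished at time t waits for the next refresh
    # boundary, but its object position was computed for time t.
    period = 1.0 / hz
    errors = []
    for t in render_times:
        present = math.ceil(t / period) * period  # next refresh tick
        errors.append(abs(SPEED * t - gaze(present)))
    return errors

render_times = [0.013, 0.030, 0.041, 0.059]  # erratic frame completion times
print(max(vrr_error(render_times)))       # 0.0 -- no stutter
print(max(fixed_hz_error(render_times)))  # several pixels of position error
```

The position error in the fixed-Hz case is exactly the erratic stutter shown in the first diagram; with variable refresh it vanishes regardless of how irregular the frame times are.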

There is, however, a very minor side effect: Variable motion blur.
– As framerates go down, your eyes perceive more motion blur. 
– As framerates go up, your eyes perceive less motion blur.
This effect is visible in the animations, where lower framerates look more blurry than higher framerates (on LCD displays).
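As a back-of-envelope illustration of this side effect (the numbers below are assumptions, not display measurements): on a sample-and-hold LCD, the perceived eye-tracked motion blur is roughly the object speed multiplied by the time each frame stays on screen.

```python
# Back-of-envelope illustration (assumed numbers, not display measurements):
# on a sample-and-hold LCD, perceived eye-tracked motion blur is roughly
# object speed multiplied by the time each frame stays on screen.

SPEED = 960.0  # object speed in pixels/second (assumed)

def blur_width_px(fps):
    # Frame visibility time is 1/fps on a sample-and-hold display.
    return SPEED / fps

print(blur_width_px(30))   # 32.0 -- low framerate, wide blur trail
print(blur_width_px(144))  # ~6.7 -- high framerate, narrow blur trail
```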

How To Get G-SYNC?

See List of G-SYNC Monitors. Also discuss G-SYNC in the Blur Busters Forums!

3 comments on “How Does G-SYNC Fix Stutters?”

  1. Chief Blur Buster says:

A reader wrote in asking how G-SYNC’s maximum refresh rate affects this: what happens when a game caps out at the maximum rate and tries to render more frames?

• G-SYNC does have a maximum refresh rate, so there’s a finite limit to how close the “dots” on the line can become in the third (last) diagram. This can cause divergences, because frame times then diverge slightly from real-world time.
• However, assuming framerates never cap out at G-SYNC’s maximum refresh rate, and the game engine calculates object positions correctly, the charts accurately explain the concept.
• The different charts are not necessarily all at the same refresh rate; they simply explain how on-screen object positions can diverge from eye-tracking positions. Technically, the first chart could refer to 60fps@60Hz, while the final G-SYNC chart is G-SYNC’d 144Hz.
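The cap described in the first bullet can be sketched as a hypothetical model (the interval math and names are assumptions, not G-SYNC internals): a 144 Hz panel cannot show a new frame sooner than ~1/144 s after the previous one, so frames rendered faster than that get presented later than the moment they were rendered for.

```python
# Hypothetical sketch of the maximum-refresh cap (assumed model, not G-SYNC
# internals): a 144 Hz panel cannot show a new frame sooner than ~1/144 s
# after the previous one, so frames that finish faster than that drift away
# from the times they were rendered for.

MIN_INTERVAL = 1.0 / 144  # minimum time between refreshes at 144 Hz

def presentation_times(render_done_times):
    shown = []
    last = float("-inf")
    for t in render_done_times:
        present = max(t, last + MIN_INTERVAL)  # wait out the panel minimum
        shown.append(present)
        last = present
    return shown

# Frames finishing every 4 ms (250 fps) drift further behind real time:
renders = [0.004 * i for i in range(1, 5)]
shows = presentation_times(renders)
drift = [s - r for s, r in zip(shows, renders)]  # grows frame after frame
```

The growing drift between render time and presentation time is the “slight divergence between frame time versus real world time” mentioned above.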
  2. Chief Blur Buster says:

    [Old discussion, 2013]

For games, however, you must have frame capture/generation times correspond to frame presentation times:
    frame captured/generated for T+1.3ms presented to human eye at T+1.3ms
    frame captured/generated for T+11.9ms presented to human eye at T+11.9ms
    frame captured/generated for T+21.7ms presented to human eye at T+21.7ms
    frame captured/generated for T+30.5ms presented to human eye at T+30.5ms
To eliminate erratic stutters, the object positions inside each frame must correspond to the presentation time of that frame. This only works for games, not for prerendered content (movies, etc).
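This rule can be sketched in a few lines, using the timestamps from the example above (the speed constant and function names are illustrative assumptions):

```python
# Each frame's object position is computed for that frame's presentation time,
# so what the eye sees at time t matches where the object should be at time t.

SPEED = 0.5  # pixels per millisecond (assumed object speed)

def object_position(t_ms):
    # Game simulation: where the moving object belongs at time t_ms.
    return SPEED * t_ms

# Presentation times from the example above (milliseconds):
present_times_ms = [1.3, 11.9, 21.7, 30.5]

# Stamp each frame with the position valid at its own presentation time:
frames = [(t, object_position(t)) for t in present_times_ms]
```

A movie cannot do this because its frames were captured for fixed, evenly spaced instants; a game can, because it chooses the simulation time of every frame it renders.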

As long as the intervals between frames are sufficiently small, and the number of frames sufficiently high, motion is already perceived as smooth. The key is that framerate changes occur rapidly enough that perceived smoothness averages out to the average framerate; thus 60-100fps averages out to look like smooth 80fps@80Hz motion. For on-the-fly rendered content (games), a random 60-100fps on a variable refresh rate display looks much better than 79fps@80Hz (one stutter per second).

The timing of frame rendering matches up with the timing of frame display. The object positions are correctly adjusted by virtue of the timing of the frames being in sync with refresh cycles. This preadjusts object positions to stay relative to the eye gaze point (of an eye tracking a moving object on screen), regardless of how the frame rate fluctuates, as long as it stays within the variable refresh rate range.
    random fluctuating framerate 25-35fps now looks as smooth as 30fps@30Hz
    random fluctuating framerate 55-75fps now looks as smooth as 65fps@65Hz
    random fluctuating framerate 80-140fps now looks as smooth as 110fps@110Hz

The side effect is simply variable motion blur. However, rapid minor modulations in the amount of motion blur are far less noticeable and less objectionable than even a single stutter. In cases where fluctuations are small (20% variance in frame timings), the variability in motion blur is no longer noticed. In cases where framerates are very high (e.g. varying framerates always above 100fps), the variances in motion blur are so rapid that they average out.

Therefore, whereas even a single stutter was formerly noticeable during VSYNC ON motion at high framerates (e.g. 143fps@144Hz, 1 frame drop per second), variances in refresh rate from ~100fps-144fps are apparently no longer noticeable with G-SYNC, assuming games adjust object positions in frames to keep in sync with frame presentation timings.

  3. engineer123 says:

    Hi Chief,
    excellent website and content, like it a lot. Kudos!
You wrote above: “G-SYNC does have a maximum refresh rate, so there’s a finite limit to how close the “dots” on the line can become in the third (last) diagram. This can cause divergences, because frame times then diverge slightly from real-world time.”

    Can you explain that “in detail”? I’d like to know what G-Sync does and especially how it looks for the viewer when – on a 144 Hz monitor – the generated FPS by the graphics card reach 145 and more (be it 160, 180, 240 whatever)?
    Does it cause tearing? Does G-Sync “automatically” switch to a VSync at 144 Hz (no tearing, but input lag and stutters)?

I’m especially interested in that point because the latest report about FreeSync at CES 2015 says something like “above 60 FPS on a 60Hz monitor FreeSync switches to VSync, while G-Sync does not behave the same way”.
