Preview of NVIDIA G-SYNC, Part #2 (Input Lag)

Written by Mark Rejhon

If you have not seen it yet, see G-SYNC Preview, Part #1 first! As I continued to test G-SYNC, I was faced with the question: what can we test that no other site has ever tested? Blur Busters is not the everyday review site, having adopted some unusual testing techniques and experiments (e.g. pursuit camera patterns, creating a strobe backlight). So, I decided to directly measure the input lag of the G-SYNC monitor.

What!? How Do We Measure G-SYNC Input Lag?

Measuring real-world input lag of G-SYNC is a very tricky endeavour. Fortunately, I devised a very accurate method of measuring real-world in-game input lag, using a high speed camera combined with a modified gaming mouse.


Wires from a modified Logitech G9x are attached to an LED, which illuminates within 1ms of pressing the mouse button. Using a consumer 1000fps camera, one can measure the time between the light from the LED and the light from the on-screen gunshot reaction, such as gun movement, a crosshairs flash, or a muzzle flash:

This is an example high speed video frame of a game scene.

The mouse LED turns on first, as the mouse button is pressed.

Finally, the gun reacts several milliseconds later.


With 1000fps video, one frame is one millisecond. Conveniently, one can calculate the difference in frame numbers between the LED illumination and the first gun reaction (whatever the first on-screen reaction is, such as a crosshairs animation, or the gun moving upwards right before the muzzle flash). This equals the whole-chain input lag in milliseconds, all the way from mouse button to pixels! To speed up the high speed analysis, I tested multiple video players to find which could do very fast forward/backward single-stepping. QuickTime on offline MOV files worked best (holding down the K key while using the J/L keys to step backwards/forwards), since YouTube unfortunately does not allow single-frame stepping. This allowed me to obtain the input lag measurements much more rapidly.
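The arithmetic behind the frame counting can be sketched in a few lines of Python (the frame numbers below are hypothetical, purely for illustration):

```python
# At 1000 fps, each video frame spans exactly 1 ms, so whole-chain
# input lag is simply the frame-number difference between the LED
# lighting up and the first on-screen gun reaction.
CAMERA_FPS = 1000

def lag_ms(led_frame: int, reaction_frame: int) -> float:
    """Whole-chain input lag in milliseconds."""
    return (reaction_frame - led_frame) * (1000.0 / CAMERA_FPS)

# Hypothetical example: LED lights at frame 1204, muzzle flash at frame 1276.
print(lag_ms(1204, 1276))  # 72 ms of button-to-pixels lag
```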

This makes it possible to honestly measure the whole chain without missing any G-SYNC influences (G-SYNC effect on hardware, drivers & game software), and compute latency differences between VSYNC OFF versus G-SYNC, in a real-world, video-game situation.

With this, I set out to measure input lag of VSYNC OFF versus G-SYNC, to see whether G-SYNC lives up to NVIDIA’s input lag claims.

Quick Intro To Input Lag

A long-time Blur Busters favorite is AnandTech’s Exploring Input Lag Inside And Out, which covers whole-chain input lag in detail, illustrating the button-to-pixels latency. This is the latency, in milliseconds, between a mouse button press, and light being emitted from pixels, seen by human eyes. This is above-and-beyond display lag, and includes several parts, such as mouse lag, CPU, GPU, frame transmission, and LCD pixel response.


Credit: AnandTech; diagrams of the whole input lag chain.

The important point is that while pixel response may be only 2ms, whole-chain input lag can reach several tens of milliseconds, even approaching a hundred milliseconds. Many factors can influence input lag, including a low game tick rate (e.g. Battlefield 4, which runs at a very low tick rate of 10 Hertz) as well as intentionally added latency (e.g. realistic in-game gunshot latency that resembles the slightly random latency of a physical gun).

Now that you understand the whole-chain button-to-pixels input lag, Blur Busters presents the world’s first publicly available millisecond-accurate whole-chain input lag measurements of G-SYNC, outside of NVIDIA.

System Specifications & Test Methodology

The test system has the following specifications:

OS: Windows 8.1 (with 1000Hz mouse fix)
CPU: Intel i7-3770K
GPU: GeForce GTX Titan (Driver 331.93)
RAM: 16GB (4 x 4GB) Mushkin 1600MHz DDR3
Mouse: Logitech G9x (modified)
Motherboard: ASUS P8Z77V-Pro

The Casio EX-ZR200 camera used has a 1000fps high speed video feature. Although low-resolution, it is known to provide 1 millisecond frame accuracy. A high-contrast spot in the game is recorded for measurement reliability. Multiple test passes are done to average out fluctuations, including CPU, GPU, timing variability, microstutter (early/late frames), and monitor variability. The screen center is measured (the crosshairs, the usual human focus), as it is widely known to accurately represent average screen input lag for all displays and all modes, including strobed, non-strobed, scanned, CRT, VSYNC OFF, and VSYNC ON. The lag measurement starts at LED illumination, accurate to 1ms. The lag measurement ends at the screen reaction, accurate to 1ms on modern TN LCDs, as the beginning of visibility in high speed video also represents the beginning of visibility to human eyes. Screen output timing variability (e.g. scan-out behavior) is also averaged out by doing multiple passes.

Total Input Lag of Battlefield 4

The game, Battlefield 4, is known to be extremely laggy, even on fast systems. Its low 10Hz tick rate adds a huge amount of input lag, and the game rarely caps out at a monitor’s full refresh rate. Battlefield 4 typically runs at frame rates that benefit immensely from G-SYNC’s elimination of erratic stutters and tearing.

I tested using maximum possible graphics settings, with AA set to 4X MSAA, running in the same location that consistently ran at 84 frames per second during VSYNC OFF and 82 frames per second during G-SYNC. Analysis of high speed videos (VSYNC OFF Run #1, Run #2, and G-SYNC Run #1, Run #2) yields this chart:


From pressing the mouse button to gunshot reaction (measured as number of 1000fps camera frames from mouse button LED illumination, to the frame with the first hint of muzzle flash), the input lag varied massively from trigger to trigger in Battlefield 4.
Note: Shot 11 and Shot 12 are missing from VSYNC OFF Run #2, as the gun started reloading. VSYNC OFF Run #2 Average is taken from 10 instead of 12 shots.

The VSYNC OFF averages of 72ms and 74ms are very similar to the G-SYNC averages of 77ms and 74ms respectively. The differences between the averages fall well below the noise floor of Battlefield 4’s high shot-to-shot variability, so Blur Busters considers the differences in averages statistically insignificant. During game play, we were unable to feel any input lag difference between VSYNC OFF and G-SYNC. This is good news: G-SYNC noticeably improved Battlefield 4 visual gameplay fluidity with no input lag compromise.
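As a sketch of that reasoning, here is roughly how one might compare a difference in run averages against the shot-to-shot spread; the per-shot numbers below are made up for illustration, not the actual measured data:

```python
import statistics

# Hypothetical per-shot lag measurements (ms) for two runs; the real
# Battlefield 4 data varied massively from shot to shot.
vsync_off = [61, 84, 70, 79, 65, 88, 72, 69, 77, 55]
gsync     = [66, 90, 74, 81, 70, 85, 78, 72, 80, 74]

diff_of_means = abs(statistics.mean(gsync) - statistics.mean(vsync_off))
shot_noise = statistics.stdev(vsync_off + gsync)

# If the difference in averages is small compared to the shot-to-shot
# spread, the difference is treated as statistically insignificant.
print(f"mean difference: {diff_of_means:.1f} ms")
print(f"shot-to-shot stdev: {shot_noise:.1f} ms")
```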

Total Input Lag of Crysis 3

From the famous “Will it run Crysis?“, the Crysis series has been a gold-standard benchmark game for mercilessly bringing even the most powerful systems to a crawl, especially at maxed-out graphics settings. Armed with a GeForce GTX Titan and running at Ultra settings with 4X MSAA, we stood in a location in the game that could not crack 47 frames per second during VSYNC OFF, and 45 frames per second during G-SYNC.

High speed video analysis (VSYNC OFF Run #1, Run #2, and G-SYNC Run #1, Run #2) yields the following chart:


In high speed video, input lag is measured from mouse button LED illumination, to when the crosshair first visibly “expands” (just before the muzzle flash).

It is obvious that even during bad low-framerate situations (<50fps), Crysis 3 has much lower total input lag than Battlefield 4. Input lag variability is almost equally massive, however, with routine 10-to-20ms variances from gunshot to gunshot. The difference between the average input lag of 53ms and 55ms for VSYNC OFF, versus averages of 59ms and 52ms for G-SYNC, is still well below the noise floor of the game’s large input lag variability.

Happily, we were again unable to detect any input lag degradation from using G-SYNC instead of VSYNC OFF. There were many situations where G-SYNC’s incredible ability to smooth the low 45fps frame rate actually felt better than a stuttery 75fps. This is a case where G-SYNC’s currently high price tag is justifiable, as Crysis 3 benefitted immensely from G-SYNC.

Total Input Lag of Counter Strike: Global Offensive

The older game, CS:GO, easily runs at 300 frames per second on a GeForce Titan, so it presents an excellent test case for maxing out the frame rate of a G-SYNC monitor. We were curious whether G-SYNC monitors start exhibiting input lag when frame rates hit the monitor’s maximum refresh rate. We got some rather unusual results, with some very bad news immediately followed by amazingly good news!

I ran the following tests at various different “fps_max” values:
Input Lag – CS:GO – fps_max=300 – VSYNC OFF #1
Input Lag – CS:GO – fps_max=300 – VSYNC OFF #2
Input Lag – CS:GO – fps_max=143 – VSYNC OFF #1
Input Lag – CS:GO – fps_max=143 – VSYNC OFF #2
Input Lag – CS:GO – fps_max=300 – GSYNC #1
Input Lag – CS:GO – fps_max=300 – GSYNC #2
Input Lag – CS:GO – fps_max=143 – GSYNC #1
Input Lag – CS:GO – fps_max=143 – GSYNC #2
Input Lag – CS:GO – fps_max=120 – GSYNC #1
Input Lag – CS:GO – fps_max=120 – GSYNC #2
Analysis of the high speed videos results in the following very interesting chart:


In the high speed videos, I measured from mouse button LED illumination, to first upwards movement of the gun (that occurs before the muzzle flash). For all numbers in milliseconds, see PDF of spreadsheet.

As a fast-twitch game with a fairly high tick rate (up to 128Hz, configurable), whole-chain input lag in CS:GO is very low compared to both Battlefield 4 and Crysis 3. Sometimes, total button-to-pixels input lag even went below 20ms, which is quite low!

At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. VSYNC OFF at 300fps versus 143fps showed fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag had spiked (40ms/39ms for the 300fps cap, 38ms/35ms for the 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the refresh rate limit. The behavior felt as if VSYNC ON had suddenly been turned on.

Then came the good news: as a last-ditch effort, I lowered fps_max further to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counter-Strike: GO! Except now there was no tearing and no stutter: the full benefits of G-SYNC without the lag of VSYNC ON.

Why is there less lag in CS:GO at 120fps than 143fps for G-SYNC?

We currently suspect that fps_max 143 frequently collides with the G-SYNC frame rate cap, possibly due to NVIDIA’s technique of polling the monitor to check whether it is ready for the next refresh. I have heard they are working on eliminating this polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

I did not test other fps_max settings such as fps_max 130 or fps_max 140, which might get closer to the G-SYNC cap without triggering the capped-out slowdown behavior. Normally, G-SYNC eliminates waiting for the monitor’s next refresh interval:

G-SYNC Not Capped Out:
Input Read -> Render Frame -> Display Refresh Immediately

When G-SYNC is capped out at its maximum refresh rate, the behavior is identical to VSYNC ON, where the game ends up waiting for the refresh:

G-SYNC Capped Out:
Input Read -> Render Frame -> Wait For Monitor Refresh Cycle -> Display Refresh
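The difference between the two pipelines can be illustrated with a toy timing model in Python (a simplification for intuition only, not NVIDIA’s actual implementation):

```python
REFRESH_INTERVAL_MS = 1000.0 / 144  # ~6.94 ms per refresh cycle at 144 Hz

def display_time(frame_ready_ms: float, prev_refresh_start_ms: float,
                 capped_out: bool) -> float:
    """When the new frame begins displaying, in this toy model."""
    if not capped_out:
        # G-SYNC not capped out: the refresh begins as soon as the
        # rendered frame is ready.
        return frame_ready_ms
    # Capped out: behaves like VSYNC ON -- the frame must wait for the
    # next refresh boundary after the previous refresh cycle began.
    next_refresh = prev_refresh_start_ms + REFRESH_INTERVAL_MS
    return max(frame_ready_ms, next_refresh)

# A frame finished 5 ms into the previous refresh cycle displays
# immediately when not capped out, but waits ~2 ms when capped out.
print(display_time(10.0, 5.0, capped_out=False))
print(display_time(10.0, 5.0, capped_out=True))
```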

This is still low-latency territory

Even when capped out, the total-chain input lag of 40ms is still extremely low for button-to-pixels latency. This includes the game engine, drivers, CPU, GPU, and cable lag, not just the display itself. Consider this: some old displays had more input lag than this in the display alone! (Especially HDTV displays and some older 60Hz VA monitors.)

In an extreme-case scenario, photodiode oscilloscope tests on a blank Direct3D buffer (alternating white/black) show a 2ms to 4ms latency between Direct3D Present() and the first LCD pixels illuminating at the top edge of the screen. This covers mostly cable transmission latency and pixel transition latency. All current models of ASUS/BENQ 120Hz and 144Hz monitors are capable of zero-buffered real-time scanout, resulting in sub-frame latencies (including in G-SYNC mode).

Game Developer Recommendations

It is highly recommended that game options include a fully adjustable frame-rate cap, with the ability to turn it on/off. The gold standard is the fps_max setting found in the Source Engine, which throttles a game’s frame rate to a specific maximum.

These frame rate limiters hugely benefit G-SYNC because the game then controls the timing of monitor refreshes. By allowing users to configure a frame rate cap somewhat below G-SYNC’s maximum refresh rate, the monitor can begin scanning out the refresh immediately after rendering, with no waiting for the blanking interval.
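A frame-rate limiter of the fps_max variety boils down to sleeping until a per-frame deadline. A minimal sketch of the idea (not the Source Engine’s actual implementation):

```python
import time

def run_capped(render_frame, fps_cap: float, num_frames: int) -> None:
    """Render frames, sleeping so the rate never exceeds fps_cap."""
    frame_budget = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        next_deadline += frame_budget
        # Sleep off whatever time remains in this frame's budget.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            # Rendering overran the budget; reset so we don't try to
            # "catch up" with a burst of unthrottled frames.
            next_deadline = time.perf_counter()
```

With a cap a few fps below the monitor’s maximum refresh rate, each frame arrives after the previous scanout has finished, which is exactly the condition that kept G-SYNC latency low in the tests above.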

What about LightBoost – Ultra Low Motion Blur?

The cat is now out of the bag via many sources and several news releases, so Blur Busters can now comment on the LightBoost sequel that is included in all G-SYNC monitors!

All G-SYNC monitors include a LightBoost sequel called ULMB, which stands for Ultra Low Motion Blur. It is activated by a button on the monitor, available during 85Hz/100Hz/120Hz modes, and eliminates most motion blur in the same manner as LightBoost. ULMB is a motion blur reduction strobe backlight (high speed video) that lowers persistence by flashing the backlight only on fully refreshed frames (almost completely bypassing GtG pixel transitions on LCD).

For those who do not yet fully understand strobing, the educational TestUFO animations, Black Frame Strobing Animation and Eye Tracking Motion Blur (Persistence) Animation, help explain the principle of modern strobe backlights in gaming monitors such as LightBoost, ULMB, Turbo240, or BENQ’s Blur Reduction. Also see Photos: 60Hz versus 120Hz versus LightBoost.
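The persistence principle behind those animations can be put into numbers: perceived eye-tracking motion blur is roughly the tracking speed multiplied by how long each frame stays lit. A simple illustrative calculation (the strobe length below is an assumed value, typical of LightBoost-style backlights):

```python
def motion_blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """Approximate eye-tracking motion blur width, in pixels.

    Blur is roughly proportional to how long each frame's light stays
    on screen (persistence) while the eye keeps moving.
    """
    return speed_px_per_sec * (persistence_ms / 1000.0)

# Sample-and-hold at 120 Hz (~8.3 ms persistence) versus an assumed
# ~1.4 ms strobe flash, tracking motion at 960 pixels/second:
print(motion_blur_px(960, 8.3))  # sample-and-hold blur width
print(motion_blur_px(960, 1.4))  # strobed backlight blur width
```

This is why a short strobe flash looks so much sharper than sample-and-hold: the blur width shrinks in direct proportion to persistence.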

More ULMB tests are coming in a future article. That said, here are preliminary Blur Busters impressions:

  • No Utility Needed
    You only need to press a button on the monitor to turn ULMB strobing ON/OFF. It is very easy to toggle at any time, even at the Desktop or while inside a game, as long as you are running in non-G-SYNC mode.
  • Color quality is significantly better than VG248QE LightBoost.
    There is no reddish or crimson tinting. Although the picture is not as colorful as some higher-end strobe backlight monitors (e.g. EIZO FG2421 with Turbo240 strobing), it is much better than VG248QE LightBoost. The contrast ratio is lower than non-LightBoost but slightly better than LightBoost. Brightness is the same as LightBoost=100%.
  • Colors are adjustable, and work in games.
    You can finally recalibrate “LightBoost” colors via the monitor itself, rather than via a utility. There is now a HOWTO: Adjust Colors on VG248QE G-SYNC.
  • Fewer LCD inversion and banding artifacts.
    There are fewer artifacts visible than on any other LightBoost monitor that I have seen. Even the TestUFO Flicker Test (full screen) looks much cleaner and more uniform.
  • Strobe Length Not Adjustable
    Many readers have asked if motion clarity can be adjusted (similar to LightBoost 10% versus 50% versus 100%). Unfortunately, persistence is not currently adjustable.
    Note: Blur Busters has recommended that NVIDIA add an adjustment (e.g. a DDC/CI command) for strobe length in future G-SYNC monitors, as a trade-off between brightness and motion clarity.
  • LCD Contrast Ratio Still Lower in Strobe Mode
    The contrast ratio remains approximately the same as LightBoost on the same panel. Currently, 27″ panels have less contrast ratio degradation in strobed mode than current 24″ 1ms panels. Some of us prefer the full contrast range, even at the expense of slightly more overdrive artifacts. One old LightBoost monitor (the ASUS VG278H, original H suffix, not HE suffix, when configured via OSD Contrast 90-94%) is among the few models able to achieve nearly 1000:1 contrast ratio in LightBoost mode, with some ghosting trade-offs. By contrast, the 24″ panel is only barely able to exceed a 500:1 contrast ratio. That said, VG248QE ULMB still looks much better than VG248QE LightBoost.
    Note: Blur Busters has asked NVIDIA to investigate whether the contrast ratio difference between strobed and non-strobed modes can be decreased in future monitors.

Should I use G-SYNC or ULMB?

Currently, G-SYNC and ULMB are a mutually exclusive choice: you cannot use both simultaneously (yet), since combining the two is a huge engineering challenge.

G-SYNC: Eliminates stutters, tearing and reduces lag, but not motion blur.
LightBoost/ULMB: Eliminates motion blur, but not stutters or tearing.

Motion-blur-eliminating strobe backlights (LightBoost or ULMB) always look best when the strobe rate matches the frame rate. Such strobe backlights tend to run only at high refresh rates, in order to avoid eyestrain-inducing 60Hz CRT-style flicker.

We found that G-SYNC looked nicer at the low frame rates experienced in both Battlefield 4 and Crysis 3, while ULMB looked very nice during Counter-Strike: GO. We have not yet done extensive input lag tests on ULMB, but preliminary checks show that it adds only approximately 4ms of (center/average) input lag compared to VSYNC OFF or well-capped G-SYNC. If you prefer the motion clarity of ULMB but do not want to fully lock the frame rate to the refresh rate, a frame rate cap close to the refresh rate (e.g. fps_max 118) works well as a latency compromise.

It is a personal preference whether to use G-SYNC or ULMB. As a rule of thumb:

G-SYNC: Enhances motion quality of lower & stuttery frame rates.
LightBoost/ULMB: Enhances motion quality of higher & consistent frame rates.

Yes, the G-SYNC upgrade kit includes ULMB. ULMB works in multiple monitor mode (much more easily than LightBoost), even though G-SYNC can only work on one monitor at a time. With current NVIDIA drivers, G-SYNC only works on the primary monitor.


As even the input lag in CS:GO was solvable, I found no perceptible input lag disadvantage to G-SYNC relative to VSYNC OFF, even in older Source Engine games, provided everything was configured correctly (NVIDIA Control Panel set to use G-SYNC, and the game configuration updated accordingly). G-SYNC gives the game player license to use higher graphics settings in the game, while keeping the gameplay smooth.

We are very glad that manufacturers are now paying serious attention to strobe backlights, as this has been the Blur Busters raison d’être ever since our original domain name in 2012, during the Arduino Scanning Backlight Project.

How To Get G-SYNC?

See List of G-SYNC Monitors.

68 comments on “Preview of NVIDIA G-SYNC, Part #2 (Input Lag)”

  1. Pingback: Measuring Input Lag of G-SYNC! – Preview Part #2 | Blur Busters

    • Chief Blur Buster says:

      I did a quick test against the BENQ XL2411T earlier (uses the same panel as the VG248QE), and could not see a lag difference in the photodiode oscilloscope test of the screen scanout (~3-4ms for the top edge of the screen, <1ms accuracy margin). So, VSYNC OFF on the XL2411T behaves no differently than non-G-SYNC VSYNC OFF on the modded VG248QE. That said, I will be publishing comparisons intermittently throughout 2014!

      I have been doing lag test development over the last 12 months related to my Arduino lag tester. So for all practical purposes, all BENQ/ASUS 120Hz/144Hz monitors, in their best refresh modes and non-LightBoost modes, have roughly identical input lag (give or take a millisecond).

      So there's no effect on lag in non-G-SYNC mode when doing the G-SYNC upgrade on a VG248QE.

      • Jabbadab says:

        More interesting would be to see how lag differs with normal VSYNC ON vs G-SYNC. And maybe even forced tb+vsync on from NVIDIA Inspector. And how lag differs between G-SYNC vs Adaptive VSYNC.

        Interesting test nonetheless thanks for that!

        • Chief Blur Buster says:

          Agreed. Eventually, we’ll do these tests in additional articles.

          These tests were time consuming, so I ran out of time to do VSYNC ON passes — but we already know VSYNC ON almost never has less input lag than VSYNC OFF. However, we definitely want to do more input lag tests, in additional situations where we are interested in seeing lag results (LightBoost, ULMB, new fps_max values, VSYNC ON, other games, etc).

          • Blue_Ninja0 says:

            How are those tests going? I would be very interested in comparing the values from this article to V-Sync values from various game engines, since you made such a perfect method to measure finger to eye delay.

  2. Arbaal says:

    What method did you use to come to the conclusion that the statistical dispersion of the first two G-SYNC tests is noise? Your conclusion seems flawed against your own data, without showing how you determined that the dispersion is indeed noise.

    • Chief Blur Buster says:

      This is not Journal of Physics (though we wish we could hire a good scientist!)

      With gunshot-to-gunshot latencies varying by more than 20ms, and the averages of the multiple runs varying by less than 2ms (e.g. (52+59)/2 = 55.5 versus (54+55)/2 = 54.5), none of us here who tried it could feel the difference amid all the shot-to-shot variation; subjectively, it falls far below the noise floor of the shot-to-shot variability.

      Shot-to-shot variability is caused by a huge number of factors: game code, game timing inaccuracies, aliasing effects between game tick rate and frame rate, aliasing effects between frame rate and non-G-SYNC refresh rate, frame-to-frame rendering time variability, aliasing effects between mouse poll rate and game tick rate, hardware interrupts, bus contention, cache misses, CPU thread scheduling, CPU variability, and dozens, hundreds, or thousands of other interactions that inject shot-to-shot latency variability, some at microsecond timescales and others at millisecond timescales, all acting simultaneously. To our eyes and feel, this simply ends up as “noise” from an end-user perspective. On a subjective basis, the shot-to-shot variability is visually noise, even if the variability is often algorithmic. The specific causes are not discussed in detail here, even though as a programmer, I am familiar with at least a portion of what causes variability in button-to-pixel latencies in games.

      All gaming review sites include a lot of subjective data, so it is very important to note that the conclusions in such reviews include subjective assessment (actual gameplay feel) in addition to the objective data. Although Blur Busters articles contain far more display science than the average mainstream review, this is still a review. Good if you view it as a review, flawed if you try to view it as a science paper.

      I would love to see impartial objective science done on this, including by aspiring scientists (yoohoo, you University students looking for thesis topics!) taking up this challenge and writing real scientific papers, perhaps even peer reviewed, on gaming-related behaviors more extensive than what we do: competitive gameplay, aiming behaviors, input lag, modern low-persistence techniques, tracking down what percentage each factor contributes to lag variability, etc. Publicly funded science tends to avoid frivolous topics such as gaming, so this industry is somewhat science-paper-poor at the moment.

  3. TheLeetFly says:

    Thank you very much for this test.
    Could you make a guess at the input-lag value in any COD game [just for my personal comparison]?
    Is the 4ms input lag of ULMB equal to LightBoost, or could you perform the test again with standard LightBoost?

    • Chief Blur Buster says:

      The latency of ULMB and LightBoost appears almost identical (less than 1ms difference), when I do the photodiode oscilloscope measurement.

      Measurements showed ULMB seemed to have a fraction of a millisecond less latency than LightBoost. Insignificant, but it came up as slightly less.

      Detailed latency measurements of multiple strobe backlights will be posted in a forthcoming Blur Busters article.

  4. krom_lg says:

    Mark, I have some questions.
    When G-SYNC and ULMB eventually work together, can this be accomplished by releasing new firmware, or would you have to buy a new monitor with G-SYNC 2?
    Why does NVIDIA not add the possibility of changing the lower refresh limit when ULMB is on, since flicker visibility changes from person to person?
    (BENQ Motion Blur Reduction works at 75Hz, while ULMB works at 85Hz and above.)
    What about G-SYNC (which starts working above 30fps) and 24p films; might they add this function in the future?
    Thanks for the reply.

  5. Atzenkeeper500 says:

    Hi there, I have 2 questions. I am using the BenQ XL2411T at the moment and preordered the new ASUS G-SYNC Edition. I play COD: Ghosts; this game has a frame limit at 90Hz, but in the in-game settings I can choose up to 144Hz. With a G-SYNC monitor, is it better to choose a limit of 120Hz or even 100Hz to stay below the monitor's maximum refresh rate?

    2nd question: in the NVIDIA graphics settings you can play around with a prerender limit at auto or 1-4. At the moment I chose 1. Do you have info on what the best prerender limit for a G-SYNC monitor is?

    Thank you. I am waiting so eagerly for this monitor; maybe next week they start selling it in Germany

    • Chief Blur Buster says:

      You want to use the highest possible frame rate cap that is at least several frames per second below the G-SYNC maximum rate, in order to prevent G-SYNC from being fully capped out. Testing each run took a lot of time, so I did not end up having time to test in-between frame caps (other than fps_max 120, 143 and 300).

      Technically, input latency should “fade in” as G-SYNC caps out, so hopefully future drivers can solve this behavior by allowing fps_max 144 to also have low latency. Even fps_max 150 should still have lower input lag than fps_max 300 using G-SYNC, since the scanout of the previous refresh cycle would be more finished 1/150sec later, rather than 1/300sec later. Theoretically, the driver only needs to wait a fraction of a millisecond and begin transmitting the next refresh cycle immediately after the previous refresh finishes. I believe the latency that occurred at fps_max 143 is a strange quirk, possibly caused by the G-SYNC polling algorithm used. I am hoping future drivers will solve this, so that I can use fps_max 144 without lag; it should be technically possible, in my opinion. It might even be theoretically possible to begin transmitting the next frame to the monitor before the display refresh is finished, possibly by utilizing some spare room in the 768MB of memory found on the G-SYNC board (to my knowledge, this is not currently being done, and it is not the purpose of the 768MB memory). Either way, I think this is a solvable issue, as there should only be a gradual latency fade-in when G-SYNC caps out at fps_max 143, 144, or 150, rather than an abrupt additional latency. I will likely contact NVIDIA directly to see what their comments are about this.

      For G-SYNC, it would probably be optimal to set the prerender limit as small as possible (e.g. 1).

  6. Chief Blur Buster says:

    Some people ask why I didn’t come up with alternative methods of measuring input lag. The whole chain method is really simple, easy, and deterministic:
    — human input: button press (measurement accurate to 1ms)
    — human output: display reacts (measurement accurate to 1ms)

    It’s simple, it’s deterministic, it’s honest, it’s real-world games.
    1. Run full chain test with VSYNC OFF
    2. Run full chain test with GSYNC
    3. Compute difference.

    It was necessary to do the whole chain:
    – to include whatever the display is doing (hardware-based GSYNC latency)
    – to include whatever the driver is doing (software-based GSYNC latencies)
    – to include whatever the game is reacting to GSYNC (software-based GSYNC latencies)

    This makes the full-chain method the most honest and least flawed way of measuring G-SYNC input lag in real-world situations (games).

  7. Jameson says:

    Great tests, my main game is cs:go, so it’s great to see you tested this one. I’d imagine i’ll get absolutely no gain from g-sync in this game. Of course, if i run fps_max at 143 fps i suffer tearing, but at 300fps i haven’t seen it tear even once. I can’t imagine g-sync being smoother than this, but I’m all for being pleasantly surprised.

    Did you notice g-sync as even smoother over true no v-sync 300fps in 144hz mode? Would be very interested to hear your thoughts. One test i might quickly run is a fraps frame time test; i suppose if each frame is very evenly timed, it kind of proves g-sync will offer little benefit in this scenario. I have no doubt g-sync is amazing in many other games, though, just cs:go is so optimized already.

  8. Pingback: G-Sync Input Lag Tested | Developed

  9. Pingback: [Sammelthread] Nvidia G-SYNC - Seite 10

  10. neel says:

    I love the led idea, will use it for my own latency tests (mostly racing sim).

    About video analysis tools: I use VirtualDub for my latency video analysis. You see the frame number, can use arrow keys for frame-by-frame stepping (and hold them for normal play), and can use a simple Avisynth script if you want the frame number displayed on the video directly (I haven't dug in, but I think we could make a script that would also add keys to press to automatically store starting and ending frame numbers, which would help speed up the process if you are currently writing these down by hand as I do). Avidemux should be an option too (compatible with more video formats than VirtualDub).

    As a side note, maybe you could avoid talking about "input latency", since what we are testing is input-to-output latency. Game developers always jump out of their chairs when players tell them about the “input latency” their game has :p

  11. DesktopMan says:

    “It is obviously that even during bad low-framerate situations (<50fps), Crysis 3 has a much lower total input lag than Battlefield 4."

    The way you are testing here does not allow you to draw any conclusions across different games. Measuring from mouse click to visual confirmation of a shot is not the same as measuring from mouse click to the engine actually performing the internal operations that are required to shoot, which is the actual input lag.

    The animation system itself might have a certain amount of latency which would be included in this way of measuring, but is not a part of input lag as the action might have already been processed by the engine. The time and order between bullet leaving the gun and the visual muzzle flash can vary wildly between engines, and even guns in the same engine.

    That being said, Crysis might have lower input lag compared to BF4 of course, but the numbers here can't be used for that conclusion.

    The LED method should only be used for measuring the latency difference in the same game with the same gun, but with different monitors.

  12. Pingback: Anonymous

  13. Berserker says:

    Measuring to the gun animation/muzzle flash isn’t the most accurate way to test input lag; some games have quite a big delay on those things, like Fallout 3. Measuring from mouse movement to the game drawing the next frame could be more accurate, but that may be harder, and mouse sensor latency could influence the result.

    • Chief Blur Buster says:

      Good points, but it actually doesn’t affect comparative analysis (see below). It also depends on the game: some games have artificial gun delays programmed in, but not all of them.

      With mouse movement, it is hard to capture a millisecond-accurate moment: in 1000fps video, the human hand moves imperceptibly over 1ms. However, an LED attached to the mouse via a very simple circuit can illuminate within a single millisecond.

      Even if a game has a built-in gunshot delay, this is still a useful method of comparing between configurations (at least when running several trials and averaging the results), since any other variance would still show up as a difference in the full-chain input lag measurements. So even with a gunshot delay, it is still by far the most accurate way to compare between modes and configurations (e.g. VSYNC OFF versus VSYNC ON versus G-SYNC). Several games also don’t have artificial gun delays (e.g. the Quake series, Team Fortress 2, Counter-Strike, etc).

      Now, for measuring “aiming latency”, it would definitely be more ideal to measure mouse movement lag. It would be nice to come up with a reliable (e.g. 1ms-accurate) way of measuring mouse movement latency for direct analysis, rather than the comparative analysis that is the purpose of this article. Comparative analysis is still accurate even with a built-in gunshot latency, and the gunshot latency is still part of the perceived latency (separate from “aiming latency”, which requires measuring mouse movement latency).

      We’d love to hear suggestions of accurate methods of measuring movement latency, that’s not too complicated, reasonably generic, inexpensive, and repeatable.
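      To illustrate the comparative idea, here is a minimal sketch of averaging trials per configuration; the helper name and every number are made up for illustration, not real measurements:

```python
from statistics import mean, stdev

def compare_configs(trials_ms):
    """Comparative analysis of full-chain lag trials.

    trials_ms maps a configuration name to a list of per-trial
    button-to-pixels measurements in milliseconds.  A fixed
    built-in gunshot delay shifts every configuration equally,
    so it cancels out when comparing the averages.
    """
    for config, samples in sorted(trials_ms.items()):
        print(f"{config}: mean {mean(samples):.1f}ms "
              f"(stdev {stdev(samples):.1f}ms, n={len(samples)})")

# Illustrative numbers only -- NOT real measurements.
compare_configs({
    "VSYNC OFF": [24, 27, 22, 25, 26],
    "G-SYNC":    [29, 31, 28, 30, 32],
    "VSYNC ON":  [55, 61, 58, 57, 60],
})
```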

  14. Pingback: Anonymous

  15. dhaine says:

    Hi Mark,

    Great post!

    I have a question: I thought that LightBoost 2 @ 10% had the lowest input lag ever, yet here you compare against classic VSYNC OFF. Was I wrong?
    How about comparing LightBoost 2 @ 10% input lag vs G-SYNC?

  16. Pingback: G-Sync Input lag Review

  17. Fabulist says:

    I cannot wait for side by side comparisons with Vsync, so we can determine the delay gap between Vsync, Gsync and no sync at all.

    Additionally, would it be possible for you to tell us what FPS limit can be used in Counter-Strike and other games so that no negative side effects from G-SYNC are produced? I see that 143 is still problematic while 120 is not. But what about 125, 130, 142, and so on?

    Thanks in advance!

    • Chief Blur Buster says:

      Since this article was written, several of us did several tests.

      fps_max 130 — works well
      fps_max 135 — works well
      fps_max 138 — works well
      fps_max 140 — slight hints of extra lag
      fps_max 142 — as bad as fps_max 143

      However, I am hoping that this is really just an NVIDIA driver issue (hopefully not unfixable game behaviour), and that fps_max 143 will work great with newer drivers. In early March, Blur Busters Forums will be adding a new “Input Lag” subforum area, so that people can test things out.

  18. Fabulist says:

    Thank you for letting me know, that is very valuable information. I wonder what Nvidia has to say about this – if this is hardware related we might need to wait for a G-sync v2.

    Does this not happen with Battlefield 4, for example, if you hit more than 135 FPS? I really do not understand whether this is FPS-related or CS:GO-related.

    I cannot wait to see further tests and your new section, keep up the good work!

      • Trip says:

        Maybe the difference in input lag is due to the tick rate of the game? Since 143fps is higher than the 128Hz tick rate, it could be that the engine tries to force different frame rates (and therefore refresh rates) to correct sync issues. I noticed similar behaviour in CoD4, where enabling VSYNC without capping the frame rate to just below the refresh rate resulted in severe input lag, since that engine can only run at specific framerates: those equal to 1000/x where x is a natural number (so 500, 250, etc). It could be that the Source engine does similar things when the tick rate is off, since game logic is only updated every 1000/128 ≈ 8ms, and maybe the same frame is therefore shown twice on the monitor, resulting in worse input lag in some instances. What also does not make sense is that the input lag for 300fps and 143fps is practically the same, even though one is twice the framerate; this also supports the claim. Maybe it is G-SYNC and I am just plain wrong, but it seems really suspicious that 120fps is fine and 143fps is not, with the tick rate right in between them.

        • devi says:

          The Source engine is able to run at any desired fps_max, with the caveat that tick frames take more CPU time due to physics simulations, meaning there is inherent variance in frame times and hence in input lag. This variance is on the order of a few milliseconds. However, you are correct that SHOTS are calculated only on tick frames, and in essence this adds a variability of 8ms or 16ms to lag measured this way.

          But to reiterate, using fps_max 143 will result in an average of 143fps on a very powerful system.
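          A minimal sketch of that tick quantization, assuming a click is only processed on the next 128Hz tick boundary (the helper name and example timings are hypothetical):

```python
TICKRATE = 128  # CS:GO ticks per second
TICK_MS = 1000.0 / TICKRATE  # ≈ 7.8ms between ticks

def shot_tick_delay(click_time_ms):
    """Extra delay before a shot is processed.

    Shots are only evaluated on tick frames, so a click waits
    for the NEXT tick boundary; the added delay is whatever
    time remains until that boundary (0 to ~7.8ms at 128 tick).
    """
    return (-click_time_ms) % TICK_MS

# A click exactly on a tick waits 0ms; a click partway through
# a tick interval waits the remainder of that interval.
print(round(shot_tick_delay(0.0), 2))
print(round(shot_tick_delay(4.0), 2))
```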

  19. admbr says:

    Registered and logged in just to say thank you!

    The tick rate in TF2 is 66.6666, so an fps_max of 138 (roughly double the tick rate) is just perfect 🙂 Great to know.

  20. Pingback: TITANFALL - Page 29

  21. Pingback: Print: Welche Artikel der PC Games Hardware 04/2014 haben euch gefallen?

    • Chief Blur Buster says:

      Yep. The wonderful folks at pcgameshardware liked this G-SYNC test so much that I gave them permission to reprint my article, with credit.

      Magazines can contact me for permission if they wish to reprint any of this site’s original articles.

  22. vasiln says:

    I was looking for input/display lag tests recently and couldn’t find anything very good. Now, on unrelated browsing, I find this: the first good set of input lag data I’ve seen! Great methodology, and really interesting numbers too, especially the fps-capped Counter-Strike results. I would love to see more tests under a variety of circumstances. In your shoes, I might build something homemade for testing: a high-contrast, immediate, full-screen response.

  23. Pingback: PC Games Hardware publishes Blur Busters GSYNC Input Lag Tests | Blur Busters

  24. blur says:

    Measuring lag across engines is definitely a hard problem (somebody pointed out the animation problem: detecting a gun animation depends on game design decisions).

    Ideally we would need the cooperation of engine writers to include support for something like this in their engines (with a constantly changing color code):

    Imagine if the gizmo could change the detected color on its own (and randomly) over time; then we could chart the measured latency over a large sequence of frames, do stats, detect latency hiccups, and so on.
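    A minimal sketch of what the stats side of that could look like, assuming we already had matched emitted/detected timestamps for each color change (every name and number here is hypothetical):

```python
import random
from statistics import mean

def latency_series(emitted_ms, detected_ms):
    """Pair each emitted color change with its detection time.

    emitted_ms[i] is when the engine switched to color i;
    detected_ms[i] is when the sensor saw that color.  The
    per-change latency series lets you chart hiccups over time.
    """
    return [d - e for e, d in zip(emitted_ms, detected_ms)]

# Simulated data: one color change per 60Hz frame, with a
# made-up 20-30ms display-chain latency on each change.
emitted = [i * 16.7 for i in range(10)]
detected = [e + random.uniform(20, 30) for e in emitted]
lags = latency_series(emitted, detected)
print(f"mean {mean(lags):.1f}ms, max {max(lags):.1f}ms")
```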

  25. erikiksaz says:

    I’m curious; I’m installing G-SYNC later on today (and I currently play Titanfall).

    I’ll only have the option of G-SYNC @ 60fps, or ULMB @ 120fps (but with frame drops realistically into the low 100s or high 80s/90s).

    I wonder which would feel better in that fast-paced FPS.

  26. mikesbaker says:

    I just did the G-SYNC upgrade to my monitor. To make sure games were running G-SYNC, I hit the ULMB button. In games like CS:GO, which do run G-SYNC, the monitor tells me that ULMB is not available in G-SYNC mode. Battlefield 4 allows ULMB to be turned on, which implies that G-SYNC is not running. Am I missing some easy step here, or have things just changed since you ran your test? Thanks.

      • mikesbaker says:

        Thank you for your response. I am aware of that; perhaps I should clarify what I mean. I have G-SYNC enabled. I have even gone so far as to go into the NVIDIA Control Panel and specifically set G-SYNC for both the regular and 64-bit BF4 exes, and I am not running in windowed mode. What I was expecting was for the monitor to tell me that ULMB is not available in G-SYNC mode; this is a really easy way to tell whether G-SYNC is actually running. Since ULMB could be turned on while playing BF4, clearly G-SYNC is not running.

        Setting up G-SYNC is not complicated at all; it works on all the rest of my games. So my question still stands: is there something simple I’m missing here? How do I get G-SYNC running on Battlefield 4? At this point I’m out of ideas.

      • mikesbaker says:

        Hit post too soon. Sorry for the double post. I meant to end with this.

        So the question is: are you able to run BF4 with G-SYNC? If so, did you do anything other than the regular steps to enable it?

  27. klyze says:

    I’m curious; I’m thinking of buying a G-SYNC monitor to play King of Fighters XIII due to its extremely low input lag. The thing is, that game needs to run at EXACTLY 60fps (because it’s a 2D fighting game).

    Can I cap the framerate at 60fps with a 3rd-party program (e.g. Afterburner) with G-SYNC on, without any problems?

  28. Fr0 says:

    I am strictly a competitive CS:GO player and have been finding it hard to decide between the ASUS VG248QE and the BenQ XL2411Z. I attain consistently high FPS (300+). I want an expert opinion on which one you recommend for CS:GO. Please get back to me when you can. Thanks.

  29. Pingback: Lohnt sich G-Sync Monitor mit AMD Graka? - Seite 7

  30. Pingback: ASUS PG278Q Mini Review - Seite 2

  31. Pingback: Asus Swift PG278Q - Seite 15

  32. Pingback: Asus ROG Swift PG278Q: WQHD-Monitor mit G-Sync im Test - das perfekte Gamer-LCD? - Seite 47

  33. WickedSoN says:

    Any update on doing this input lag test with the Afterburner or Precision-X framerate limiter/target? It would be interesting to see whether they work as well as, or better than, in-game fps_max-type commands. It would also be easier as a global setting for all games, right?

  34. Pingback: just grabbed a new monitor on sale by tiu_sam - Page 2 - TribalWar Forums

  35. ancalimon says:

    Is it not possible to have separate ULMB and G-SYNC settings for full-screen games and the Windows 10 desktop? I want my PG279 to activate ULMB on the Windows desktop and G-SYNC when a game is open.

    Or at least a desktop shortcut icon that can switch between ULMB and G-SYNC. How is that done?
