G-SYNC monitors support strobing at 85Hz and at 144Hz

UPDATE: This is old information, posted before G-SYNC was released. Blur Busters now has a G-SYNC monitor (see Our Preview of G-SYNC) with ULMB. 85Hz strobing is confirmed; however, 144Hz strobing is not available.

Good news for people who want “LightBoost” style strobing at other refresh rates, to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz) during low-persistence mode!

The web has revealed several clues about G-SYNC’s optional fixed-rate strobe mode:
(A) The G-SYNC upgrade datasheet has 85Hz added.
(B) AndyBNV suggested on NeoGAF that the low-persistence mode is superior to LightBoost.
(C) The YouTube video of John Carmack at the G-SYNC launch was very suggestive.
(D) Many articles mention 85Hz as a CRT frequency that stops flickering for many people.
(E) The pcper.com livestream suggests a very high fixed refresh in low-persistence mode.

Upon analysis, both 85Hz and 144Hz are available strobed modes with G-SYNC, in addition to existing common 100Hz and 120Hz strobed modes, like today’s LightBoost. More strobed modes (e.g. 60Hz) might be available. You heard it here first at Blur Busters.

About Chief Blur Buster

Head of Blur Busters.

18 comments on “G-SYNC monitors support strobing at 85Hz and at 144Hz”

    • Chief Blur Buster says:

      It’s very difficult to do variable-rate strobing without flicker. Motion blur elimination strobing must occur at exactly 1 strobe flash per refresh. And as the frame rate goes down to only 30fps, that would create very nasty flicker.

      However, it’s not impossible to do a creative variable-rate strobe algorithm that blends from a steady backlight (PWM-free or ultrahigh-frequency PWM) at lower refresh rates (<60Hz) all the way to full strobing at higher refresh rates (>100Hz), while maintaining the trailing-average brightness at all times over human vision timescales. This allows variable-rate strobing without visible flicker to most people.
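      One way to picture such a blend: treat the backlight drive as a weighted sum of a flat DC level and a square strobe pulse, with the weights chosen so the trailing-average brightness never changes. A minimal Python sketch of the idea (the 60Hz/100Hz blend thresholds and the 10% duty cycle are my own illustrative assumptions, not anything NVIDIA has announced):

```python
def blend_factor(refresh_hz, low=60.0, high=100.0):
    """0.0 = steady (PWM-free) backlight, 1.0 = full strobing.
    Ramps linearly between the assumed 60Hz/100Hz thresholds."""
    if refresh_hz <= low:
        return 0.0
    if refresh_hz >= high:
        return 1.0
    return (refresh_hz - low) / (high - low)

def backlight_level(t, refresh_hz, duty=0.10, target_brightness=0.10):
    """Instantaneous backlight drive at time t (seconds).

    Blends a flat DC level with a square strobe pulse of the given
    duty cycle, scaling each component so the trailing-average
    brightness stays equal to target_brightness at every blend point.
    """
    k = blend_factor(refresh_hz)
    period = 1.0 / refresh_hz
    phase = (t % period) / period
    pulse = 1.0 if phase < duty else 0.0   # square strobe, average = duty
    # DC component contributes (1-k)*target; strobe contributes k*target.
    dc = (1.0 - k) * target_brightness
    strobe = k * (target_brightness / duty) * pulse
    return dc + strobe
```

      At 50Hz this outputs a flat 0.10 level; at 144Hz it outputs a full on/off square strobe; in between it is a square pulse riding on a DC pedestal, always averaging 0.10.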

      • vomitme says:

        The decision to strobe or keep the backlight steady over a frame depends on how much time elapses until the next frame arrives. Thus, there is no easy way to frequently transition between strobing and steady backlighting without increased input lag.

        One crude way would be to strobe the backlight for every frame, and if the next frame doesn’t arrive before 1/(max refresh rate), put the backlight in steady mode until the next frame arrives. That would cause ugly motion blur artifacts and possibly residual flickering.

        Another issue with actively switching between strobing and steady backlighting is that the color could change significantly. See http://www.anandtech.com/show/6963/benq-xl2720t-gaming-monitor-reviewed/7

        • Chief Blur Buster says:

          G-SYNC monitors’ strobe mode would not have the same problems as LightBoost.

          1. G-SYNC’s optional low-persistence strobe mode is superior to LightBoost, without the same color degradation. 60Hz vs 120Hz vs 144Hz have different colors on old monitors, with 144Hz having worse color than 60Hz. Imagine the problem with variable refresh rates! G-SYNC fixes the color inconsistencies between all these modes through advanced color processing, and probably also fixes the color delta between strobed and non-strobed modes.

          2. There’s a new variable-rate strobing algorithm that doesn’t flicker. It’s a strobe-blending algorithm that slowly blends between PWM-free (at lower refresh rates) and full strobing (at higher refresh rates). Basically, the curve gradually emerges from the flat DC line until it becomes a squarewave, while maintaining at all times a strict constant trailing-average brightness over time periods above the flicker-detection threshold.

          3. I’ve indirectly determined experimentally that it is not necessary to know the time until the next frame. You can use the time since the previous frame to calculate the strobe curve. The motion-related side effects of assigning different strobe lengths (as long as they are short) to different frames are virtually unnoticeable to humans (see the 3rd paragraph here for my LB=0% vs LB=10% experiment).

          4. Strobe lengths don’t have to be assigned to the specific frames they “belong” to, for the reasons already explained here and here. The challenge is more in avoiding flicker, as you said. As it stands now, transitioning between LB=0% and LB=10% creates a momentary flicker and a color shift, but no other jarring changes. Assuming G-SYNC fixes the color differences between refresh rates (and presumably between strobed and non-strobed modes too), there would be no visible color/brightness changes. That leaves only minor (less than 1 pixel) motion blur changes caused by strobe-length changes, which are not noticeable the way color/flicker problems are, so you could even randomize the strobe lengths and it would still look reasonably acceptable. Thus, strobe lengths don’t have to be assigned to the correct frame, and you can simply use frame history, instead of future frames, to determine strobing.
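          To make the history-only idea concrete, here is a hypothetical controller that assigns each arriving frame a strobe pulse using nothing but the interval since the previous frame. The 1.4ms/2.4ms pulse range borrows LightBoost’s numbers purely for illustration; the mapping itself is my own sketch:

```python
class HistoryStrober:
    """Assigns each arriving frame a strobe pulse based only on the
    interval since the previous frame (frame history, no lookahead).
    The numeric pulse lengths are illustrative, not measured values."""

    def __init__(self, min_pulse=0.0014, max_pulse=0.0024):
        self.min_pulse = min_pulse
        self.max_pulse = max_pulse
        self.last_t = None

    def on_frame(self, t):
        """Return the strobe pulse length (seconds) for the frame at time t."""
        if self.last_t is None:
            self.last_t = t
            return self.max_pulse           # no history yet: safe bright default
        interval = t - self.last_t
        self.last_t = t
        # Map 1/144s ... 1/60s intervals onto short ... long pulses,
        # so slower frame rates get a longer (brighter) pulse to keep
        # trailing-average brightness roughly level.
        lo, hi = 1.0 / 144, 1.0 / 60
        frac = min(1.0, max(0.0, (interval - lo) / (hi - lo)))
        return self.min_pulse + frac * (self.max_pulse - self.min_pulse)
```

          Because it only ever looks backward, this adds no lookahead latency; the cost, as noted above, is that an occasional frame gets a pulse length that “belonged” to the previous frame.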

  1. vomitme says:

    I see… using the time since the previous frame could work if there is little frame-time jitter. If the frame times randomly spike, there would probably be flickering, though I don’t really have an idea of how bad it would be.

    As for fixing the color inconsistencies, there would need to be a lot of calibration, since presumably the backlight’s strobe length can change every frame. G-SYNC won’t magically fix the problem, and even assuming G-SYNC supports instantly switching between different calibrations for different backlight modes, it is still up to the manufacturers to provide the calibration data.

    In any case, combining variable refresh rates with a strobing backlight is much more involved than keeping them exclusive, and I would be amazed if NVIDIA and/or a monitor manufacturer were able to combine the two without significant issues.

    • Chief Blur Buster says:

      Strobe length doesn’t affect color on LightBoost LCDs. It only affects brightness.
      Example: Switch between LightBoost=10% (1.4ms) and LightBoost=100% (2.4ms) using ToastyX Strobelight’s Control+Alt+0 and Control+Alt+1 (strobe lengths confirmed by oscilloscope + photodiode). Also, modern LCDs (running PWM-free backlights) show consistent color under a high-speed camera, so a shorter strobe only dims, rather than distorts, the color. You may get minor interplay with inversion and other slight temporal effects, but those were engineering problems already solved when LCDs were made compatible with 3D shutter glasses.

      In addition, LightBoost has poorer color quality than 144Hz for two reasons: (1) it is calibrated for 3D glasses, and (2) the accelerated scanout, as documented in Display Corner at http://display-corner.epfl.ch/index.php/LightBoost — to artificially create the large blanking intervals needed for pixel-settling wait time before strobing, it partially buffers the input and then does an accelerated scanout to the panel (probably about 1/200th or 1/240th of a second).

      Keeping colors consistent during variable refresh rates was a much bigger engineering issue (60Hz vs 120Hz vs 144Hz all have different color quality), and NVIDIA has apparently already solved that harder color-fluctuation problem for variable refresh rates.

      If you intentionally engineer G-SYNC and strobing to work together, you can always keep the LCD scanout constant and the overdrive constant (use the same overdrive algorithm for strobed and non-strobed modes), so you charge the LCD pixels to roughly the same levels regardless of strobing. This keeps color calibration in strobed mode consistent with non-strobed mode. The problem you describe then becomes mostly moot, except for variances in overdrive artifacts. One remaining engineering issue is that overdrive tuned for strobing is less optimal for non-strobed mode (and vice versa), so you end up with a compromise overdrive algorithm.

      The problem could occur for OLEDs, though I think variable-refresh-rate OLEDs would be easier to keep color-consistent during a variable-rate strobing algorithm. Perhaps knowledge of the future frame might be necessary, but we could define a very short maximum wait (e.g. 1/100sec) if we decide our strobe cutoff is 100Hz and go PWM-free at 100Hz and below. Basically a tape delay of 1/100sec (a FIFO buffer of the incoming display data): if no frames arrive over a trailing 1/100sec period, transition to PWM-free mode at that point.
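      The tape-delay idea can be sketched as a small state machine: buffer frames briefly, strobe while frames keep arriving faster than the cutoff, and fall back to PWM-free once the trailing window goes empty. A hypothetical Python sketch (the 100Hz cutoff is the assumption stated above, not a product spec):

```python
from collections import deque

class TapeDelayStrober:
    """Sketch of the 1/100 sec 'tape delay': buffer incoming frames,
    strobe while frames arrive faster than the cutoff, and fall back
    to PWM-free (steady) backlight once the trailing window is empty."""

    def __init__(self, cutoff_hz=100.0):
        self.window = 1.0 / cutoff_hz      # 10 ms trailing window
        self.frames = deque()              # (arrival_time, frame) FIFO

    def push(self, t, frame):
        """A new frame arrived at time t; queue it for delayed display."""
        self.frames.append((t, frame))

    def mode(self, now, last_frame_time):
        """'strobe' while frames keep arriving within the cutoff window,
        'steady' (PWM-free) once the trailing window has gone empty."""
        return 'strobe' if (now - last_frame_time) <= self.window else 'steady'

    def pop_due(self, now):
        """Release frames that have aged past the tape delay."""
        out = []
        while self.frames and now - self.frames[0][0] >= self.window:
            out.append(self.frames.popleft()[1])
        return out
```

      The fixed 1/100sec delay is the input-lag price for never needing to know when the next frame will arrive.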

      EDIT: I’ve added new diagrams here:

      I think attempting this is a lot easier than doing low persistence with steady light output (flicker-free, no phosphor, no PWM, no strobing). To get 1ms of persistence that way, it would take 1000fps @ 1000Hz for a completely flicker-free display (zero flicker even under a high-speed camera).

      LightBoost already gets nearly that low via strobing. Otherwise we’re stuck with strobing or some form of per-pixel light brightening/dimming modulation (lasers, phosphor, strobing, plasma subfields, etc.) to achieve low persistence without ultrahigh native frame rates, since we’d need enormous GPU power for those frame rates.

      The pursuit of low persistence without flicker is why some televisions have “960Hz” simulation (via a combination of strobing and interpolation — they interpolate to 240fps, then use strobing/backlight scanning to do the rest). However, interpolation won’t work for computers, due to input lag. So we’re stuck with pure-strobing solutions for motion blur reduction until GPU power permits low-persistence sample-and-hold (not practical for a long time).

  2. Neo says:

    A latency-friendly interpolation method has been developed for rendering games.


    Also, one important innovation in the ITU-R Rec. BT.2020 UHDTV spec is a new side-channel signaling method. This can be used to send data for very quick, high-quality frame interpolation and super-resolution reconstruction. It’s a bit similar to the above technique.

    • Chief Blur Buster says:

      A motion vector buffer — very interesting concept for lookahead-free frame-rate interpolation. It may be an important piece of the puzzle for flicker-free low-persistence displays. Right now, it’s scientifically impossible to get low persistence (less than 1ms) without either flicker (strobing, phosphor, light modulation) or ultrahigh frame rates. Even an instant-responding 0ms OLED still has over 16ms of persistence (lots of motion blur) when run in non-strobed 60Hz mode.

      Getting from 30fps to 60fps isn’t scientifically interesting to me; just buy more GPU.
      But getting from 60fps to 1000fps is more interesting to Blur Busters — that’s one way to achieve low persistence (1ms) without needing light modulation (strobing/CRT/phosphor). Using interpolation to get to that level is probably more feasible than raw GPU horsepower.
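      The persistence arithmetic behind those numbers is simple: sample-and-hold persistence is one full frame duration, and perceived blur during eye tracking is roughly motion speed multiplied by persistence. A quick sketch (the 960 px/s speed is just an example figure):

```python
def persistence_ms(frame_rate_hz, strobe_fraction=1.0):
    """Persistence (frame visibility time) in ms. Sample-and-hold has
    strobe_fraction = 1.0; strobing shortens it proportionally."""
    return 1000.0 * strobe_fraction / frame_rate_hz

def motion_blur_px(speed_px_per_s, persistence_s):
    """Approximate blur trail length while the eye tracks the motion."""
    return speed_px_per_s * persistence_s

# Sample-and-hold at 60Hz: ~16.7ms persistence -> ~16px blur at 960 px/s.
blur_60 = motion_blur_px(960, persistence_ms(60) / 1000.0)
# Sample-and-hold at 1000fps: 1ms persistence -> ~1px blur at 960 px/s.
blur_1000 = motion_blur_px(960, persistence_ms(1000) / 1000.0)
```

      This is why 1000fps @ 1000Hz reaches roughly the same ~1ms persistence that strobing achieves today, but without flicker.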

      • Neo says:

        And as frame rates get higher, the quality should potentially rise and the power consumption drop
        (other than raw output speed), because of the smaller differences between frames that need to be computed.

        • Chief Blur Buster says:

          Hmm, I didn’t consider that average GPU power per interpolated frame may actually go down, as you add more interpolated frames between frames.

          That said, the displays will need to get there first. Not many ultrahigh-refresh-rate displays exist — some CRTs could do 240Hz in the old days, Vpixx sells a true 500Hz-capable monochrome projector, and there’s at least one true 1000Hz-capable monochrome scientific projector. These are a long way from hitting gaming desktops, and diminishing returns apply here — it’s just much simpler to eliminate motion blur using the strobing/flicker effect for now.

          However, we are excited to see anything that makes a simultaneously blurfree/flickerfree display (one that passes the TestUFO Panning Map Test without requiring strobing/flicker/light modulation).

          • Neo says:

            And at some point prediction could be very straightforward as a human shouldn’t be expected to have such twitchy responses, sorta like how Oculus Rift can get good head tracking based on the limits of human movement.

            I gather the current state of the art in LCD response rate is somewhere near 300Hz. Perhaps above 120 or 240fps, the interpolation could simply be a quick pan-and-scan or lens-perspective change for a relatively cheap enhancement. I assume those calculations are really quick and could max out the available refresh rate.

          • Chief Blur Buster says:

            True, at higher frame rates and refresh rates (e.g. 240fps @ 240Hz), even lookahead-based interpolation would have only a very minor input lag penalty (a 1/240sec lookahead for motion-vector computations would add only about 4ms of latency).
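            The latency arithmetic is just one refresh period per frame of lookahead. A quick sketch:

```python
def lookahead_latency_ms(refresh_hz, lookahead_frames=1):
    """Added input lag from interpolation that must wait for future frames:
    one refresh period per frame of lookahead."""
    return 1000.0 * lookahead_frames / refresh_hz

# One-frame lookahead costs ~16.7ms at 60Hz, but only ~4.2ms at 240Hz.
```

            So the same one-frame lookahead that is painful at 60Hz shrinks to a near-negligible penalty at 240Hz.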

            Sometimes input lag is not about feeling the lag or twitch response, but about beating your competitor in a game. As an example, 3ms of extra input lag can mean that equally matched opponents of 200ms-vs-200ms average reaction time on a specific computer setup effectively become 200ms-vs-203ms, because input lag mathematically adds to your reaction time. That difference can mean staying alive or dying when you both ‘shoot’ at each other at the same time in an online FPS game. Gaming skill can outweigh this, but it can tip the scales in evenly matched situations. (Milliseconds matter in a 100-meter race too; you want to be ahead by a millisecond or more.) That’s why major gaming competitions (eSports, ESEA) make sure to use identical computer monitors, so that every competitor is equally matched, leaving only raw skill to differentiate between them.

            Also, LCD response time is independent of refresh rate.
            You can have a response time longer than a refresh (remember the old 33ms+ LCDs that streaked frames across multiple 16ms refreshes).
            Viewing high-speed videos of LCDs is an interesting way to understand how response is separate from refresh rate. But yes, you do want faster response the higher you push the refresh rate, to keep the high refresh rates effective.
