[osg-users] Enabling Vsync gives dramatic increase in latency
Jan Ciger
jan.ciger at gmail.com
Mon Apr 27 06:34:20 PDT 2015
On Mon, Apr 27, 2015 at 3:20 PM, Björn Blissing <bjorn.blissing at vti.se>
wrote:
>
> Well, this is pretty much exactly my method. But instead of an
> oscilloscope, I sample the signal with an A/D capture card at 10 kHz.
>
> Here is the promised data about the Oculus Rift DK1 & DK2 (all values in
> milliseconds):
> Oculus Rift DK1 + Vsync Off   Min: 16.8   Max: 60.0   Avg: 37.6
> Oculus Rift DK1 + Vsync On    Min: 34.0   Max: 79.1   Avg: 48.1
> Oculus Rift DK2 + Vsync Off   Min: 17.0   Max: 65.0   Avg: 39.6
> Oculus Rift DK2 + Vsync On    Min: 24.0   Max: 70.1   Avg: 43.6
>
> I also tested the latency on a Philips GLine screen, which supports Nvidia
> G-Sync. I had to change GPUs to do this, since G-Sync requires DisplayPort
> 1.2, so I upgraded to a GTX 770 card.
>
> Philips GLine + 60 Hz + Vsync Off          Min: 2.8    Max: 23.0   Avg: 12.2
> Philips GLine + 60 Hz + Vsync On           Min: 37.0   Max: 63.1   Avg: 60.7
> Philips GLine + 60 Hz + G-Sync             Min: 58.9   Max: 63.1   Avg: 60.9
> Philips GLine + Native 144 Hz + Vsync On   Min: 22.9   Max: 26.1   Avg: 24.5
>
Interesting data, thanks for the experiment. I think G-Sync comes out worse
on average than plain Vsync because G-Sync is designed to provide a smooth,
jitter-free experience by locking the screen refresh to the card's scanout
rate (avoiding the "missed sync" problem), rather than to actually minimize
latency.
Coming back to the numbers: in your case it seems that there are about three
frames "in flight" on the GPU whenever Vsync/G-Sync is on, regardless of the
framerate. It would probably take someone familiar with the actual Nvidia
driver code to explain why it buffers that much; my bet is performance, i.e.
keeping the GPU pipeline from stalling.
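If the buffering is indeed the culprit, one workaround worth trying is to
cap the number of queued frames in the application itself with GL sync
objects, instead of leaving it to the driver. A minimal sketch assuming
plain OpenGL 3.2+ with a function loader already initialized; the names and
the frame cap are mine, this is not an OSG or Oculus API:

    #include <glad/glad.h>  // assumes glad (or a similar loader) is set up

    static const int kMaxFramesInFlight = 1;  // 1 = lowest latency
    static GLsync gFences[kMaxFramesInFlight] = { 0 };
    static int gFrame = 0;

    // Call before issuing a frame's draw calls: blocks until the frame
    // submitted kMaxFramesInFlight frames ago has finished on the GPU.
    void waitForGpu()
    {
        GLsync& fence = gFences[gFrame % kMaxFramesInFlight];
        if (fence)
        {
            glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                             1000000000ull);  // 1 s timeout, in nanoseconds
            glDeleteSync(fence);
            fence = 0;
        }
    }

    // Call right after the buffer swap: marks the point the GPU must reach
    // before we let another frame this far into the pipeline.
    void markFrameSubmitted()
    {
        gFences[gFrame % kMaxFramesInFlight] =
            glFenceSync(GL_SYNC_GPU_COMMANDS_FLUSHED, 0);
        ++gFrame;
    }

Capping at one frame in flight trades throughput for latency; whether it
actually helps depends on how the driver schedules its queue.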
J.