[osg-users] Enabling Vsync gives dramatic increase in latency

Björn Blissing bjorn.blissing at vti.se
Thu Apr 16 05:59:28 PDT 2015


I have made some experiments regarding rendering latency in OpenSceneGraph. I have written a small test program which just shows one quad. During program execution the quad changes color from black to white 20 times. A light-to-voltage sensor is attached to my monitor (an Eizo LCD running at 60 Hz) and connected to the computer.

The setup allows me to log the time from when I send a frame with the new quad color until the light sensor picks up the change on screen.
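For reference, reducing the logged per-frame samples to the min/max/mean figures below only needs a small helper. This is a sketch, assuming each sensor timestamp has already been paired with its matching frame-submit timestamp (the struct and function names here are illustrative, not from my actual logger):

```cpp
#include <algorithm>
#include <cassert>
#include <numeric>
#include <vector>

// Summary of latency samples, all in milliseconds.
struct LatencyStats {
    double min_ms;
    double max_ms;
    double mean_ms;
};

// Reduce a set of latency samples (ms) to min/max/mean.
LatencyStats summarize(const std::vector<double>& samples_ms) {
    assert(!samples_ms.empty());
    auto [lo, hi] = std::minmax_element(samples_ms.begin(), samples_ms.end());
    double sum = std::accumulate(samples_ms.begin(), samples_ms.end(), 0.0);
    return {*lo, *hi, sum / static_cast<double>(samples_ms.size())};
}
```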

The results surprised me a bit:

Running with VSync OFF: min = 6 ms, max = 21 ms, mean = 14 ms
Running with VSync ON: min = 81 ms, max = 96 ms, mean = 91 ms

So enabling VSync on my Nvidia card increased the mean latency by about 77 ms, almost 5 full screen refreshes.

Then I remembered that Robert said something about running OSG single threaded would give the lowest latency.

So I set my application to run as single threaded:

VSync ON, OSG single threaded mode: min = 43 ms, max = 64 ms, mean = 57 ms

Still a bit high, almost 3 full screen refreshes. 
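For completeness, the single-threaded run just uses the standard osgViewer threading-model switch; a minimal configuration sketch (the rest of the viewer setup is whatever your application already does):

```cpp
#include <osgViewer/Viewer>

// Run update/cull/draw on the calling thread, avoiding the extra
// frame of latency the threaded models can introduce.
void configureSingleThreaded(osgViewer::Viewer& viewer)
{
    viewer.setThreadingModel(osgViewer::ViewerBase::SingleThreaded);
}
```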

So I poked around the Nvidia settings with the help of Nvidia Inspector, which gives detailed control over the driver. The lowest latency I have been able to reach with VSync enabled comes from the following setting combination:

VSync = ON
Maximum prerendered frames = 1
Frame rate limiter = 60fps
Threaded optimization = Off

With these settings I have recorded the following latencies:

Custom settings: min = 17 ms, max = 42 ms, mean = 30 ms

The mean difference between VSync OFF and this custom setting is 16 ms, i.e. about 1 screen refresh (1000 ms / 60 ≈ 16.7 ms per refresh).

I guess it is hard to push the latency any lower. Or is there any other setting which could reduce the latency even further?

Tests have been run on a computer with Windows 7 64-bit, an Nvidia GTX 570 with driver 347.52, and OSG 3.2.1.

Best regards
