[osg-users] Enabling Vsync gives dramatic increase in latency

Björn Blissing bjorn.blissing at vti.se
Sun Apr 19 09:31:18 PDT 2015

Jan Ciger wrote:
> Depends on what you define as "production environment". Consumer GPUs have drivers optimized for games and there it is a common (and default) setting to have vsync off for the sake of the highest frame rate and lowest latencies. It is also a frequent recommendation to turn vsync off when troubleshooting framerate issues in games. It may start to change with the coming VR support where tearing is really horrible, though.

Well, if I may be nitpicky: Nvidia's default setting regarding VSync is to respect whatever setting the application prefers. You can alternatively force it on or off, run adaptive VSync (60/30 mode), or use G-Sync if your monitor supports it.

Jan Ciger wrote:
> Unfortunately the "pro" applications are a minority and Nvidia will tell you that you should use a Quadro with differently optimized drivers for "professional" applications.

I guess I need to petition my manager about getting a Quadro card installed in my machine.  ;)

Jan Ciger wrote:
> One thing you could try is to repeat the test with VSYNC on using a plain OpenGL test application, rendering the same scene. That will rule out any threading or buffering issues that OSG could
> theoretically add. 

I have actually made some initial efforts at writing a pure OpenGL 4 application with the same functionality. But I used freeglut to handle my windows, and for some reason I cannot force VSync on for this application. No matter what settings I use, I still get VSync off. I guess I will be forced to drop freeglut and use native Windows window handling instead.
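In case it helps anyone hitting the same wall: freeglut has no portable swap-interval API of its own, but on Windows the context created by glutCreateWindow is current immediately afterwards, so you can resolve and call the WGL_EXT_swap_control extension directly. A minimal sketch (my assumption being a Windows build and a driver that exposes the extension; the window title and callback are just placeholders):

```c
#include <windows.h>
#include <GL/freeglut.h>
#include <stdio.h>

/* wglSwapIntervalEXT comes from the WGL_EXT_swap_control extension;
 * it must be resolved at runtime, after a GL context is current. */
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

static void enable_vsync(void)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);   /* 1 = sync buffer swaps to vertical retrace */
    else
        fprintf(stderr, "WGL_EXT_swap_control not available\n");
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();           /* swap now waits for vblank if vsync is on */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("vsync test");  /* context is current after this call */
    enable_vsync();                  /* must come after context creation */
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

Note that a driver profile forcing VSync off will still override the swap interval requested here, so it is worth checking that the control panel is left at the application-controlled default.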

Jan Ciger wrote:
> My hunch is that with the default settings the driver simply buffers a few frames to make sure it can synchronize with the scanout and not block your program - typically what you would want in a game for a smooth framerate without judder.

Well, there actually is a driver setting called "Maximum pre-rendered frames", which again defaults to "Use the 3D application setting", but can be set manually to anywhere between 1 and 4. According to the information I have found, this setting should control the size of the context queue, and I use it in my custom settings above. But even so, I still cannot get latencies as low as with VSync off.

Jan Ciger wrote:
> It would be really great if you could post the results. I am fairly interested to see what you get. Do you have the latency tester from Oculus? It would be interesting to compare the results from that gizmo with the results you have too - it could help to verify that the measurement technique is the same.

Sorry, I do not have the official latency tester from Oculus. I had already built my own before they released theirs. But I certainly will post my results.

