[osg-users] Enabling Vsync gives dramatic increase in latency

Björn Blissing bjorn.blissing at vti.se
Fri Apr 17 09:10:48 PDT 2015


Jan Ciger wrote:
> 
> What happens is that sometimes your program "gets lucky" and tells the GPU to swap buffers just in time before the start of the next frame - then you have very little latency, because the change becomes visible almost immediately (modulo the input latency of the monitor mentioned above). On the other hand, sometimes you get unlucky and swap buffers right after the scanout of the framebuffer has started; then the GPU will hold your image until the next frame cycle - poof, one extra frame of latency ... And you can get everything in between these two extremes.


Yes, this is pretty similar to my theory, but as you say, it happens inside the GPU and not inside the screen (as I erroneously guessed).

Looking at my data from the VSync Off case supports this. Since I "render" at a much higher frame rate than the screen refresh rate, the process can be described as almost stochastic. In the best case I will swap just in time and get 0 ms latency plus the screen scanout time. In the worst case I will miss the frame and get 16 ms plus the screen scanout time. Most cases will fall somewhere in between, i.e. on average 8 ms plus the screen scanout time. This correlates nicely with my data: the minimum latency was 6 ms, so the screen scanout time is probably ~6 ms. The mean was 14 ms, which is 8 + 6 ms. And the maximum latency was 21 ms, which is slightly less than 16 + 6 ms.
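As a sanity check, here is a quick simulation of that model. It is only a minimal sketch: the 60 Hz refresh and the ~6 ms scanout are assumptions taken from the measurements above, and with VSync off the swap phase is treated as uniformly random over the frame period.

// Sketch of the stochastic latency model discussed above (assumed values:
// 60 Hz refresh -> ~16.7 ms frame period, ~6 ms scanout to the tested pixel).
// With VSync off the swap lands at a uniformly random phase of the frame,
// so latency = time until the next frame start plus the scanout time.
#include <algorithm>
#include <cstdio>
#include <random>

int main()
{
    const double framePeriodMs = 1000.0 / 60.0; // ~16.7 ms
    const double scanoutMs     = 6.0;           // assumed from the 6 ms minimum

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> phase(0.0, framePeriodMs);

    double minL = 1e9, maxL = 0.0, sum = 0.0;
    const int samples = 100000;
    for (int i = 0; i < samples; ++i)
    {
        const double latency = phase(rng) + scanoutMs;
        minL = std::min(minL, latency);
        maxL = std::max(maxL, latency);
        sum += latency;
    }
    std::printf("min %.1f ms, mean %.1f ms, max %.1f ms\n",
                minL, sum / samples, maxL);
    // Prints roughly: min 6.0 ms, mean 14.3 ms, max 22.7 ms,
    // which is close to the measured 6 / 14 / 21 ms.
}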

But my main point still stands: it is possible to record latencies close to the scanout time of the screen with VSync Off (albeit for very simple rendering).
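In case anyone wants to reproduce the setup: if I remember correctly, VSync can be requested per window through osg::GraphicsContext::Traits before the context is created (the driver settings may still override it). A rough, untested sketch; the window size and model file are just placeholders:

#include <osg/GraphicsContext>
#include <osg/Viewport>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // Describe the window; vsync = false requests VSync off
    // (subject to the driver honouring it).
    osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
    traits->x = 100;
    traits->y = 100;
    traits->width = 1280;
    traits->height = 720;
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->vsync = false;

    osg::ref_ptr<osg::GraphicsContext> gc =
        osg::GraphicsContext::createGraphicsContext(traits.get());

    osgViewer::Viewer viewer;
    viewer.getCamera()->setGraphicsContext(gc.get());
    viewer.getCamera()->setViewport(new osg::Viewport(0, 0, traits->width, traits->height));
    viewer.setSceneData(osgDB::readNodeFile("cow.osgt")); // placeholder model
    return viewer.run();
}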

/Björn

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=63460#63460