[osg-users] Enabling Vsync gives dramatic increase in latency

Jan Ciger jan.ciger at gmail.com
Fri Apr 17 01:36:47 PDT 2015


On Thu, Apr 16, 2015 at 7:43 PM, Björn Blissing <bjorn.blissing at vti.se>
wrote:

>
>
> That does not seem entirely correct: if you look at the values for running
> without Vsync, I have managed to get down to 4 ms with a mean of 14 ms. So I
> guess that my screen has a scan-out time of ~4 ms, and since I am rendering
> at ~3000 fps without Vsync I have somehow managed to send a frame just as
> the screen starts to scan out a new image.


That actually sounds odd, because the monitor will not refresh the image
faster than its fixed refresh rate. 4 ms would require a 250 Hz refresh, and I
am not aware of any commonly sold LCD that goes that fast. Even 120 Hz ones
are quite rare. Are you sure it is not an artefact of your measurement
method? I.e. you start your timer circuit from the PC when you send a
new image, but the light sensor is still registering the light from the
previous frame, so it triggers right away, giving you a false low reading
that essentially shows only how long it took the GPU to send out the frame.
If you aren't doing so already, you may have to arm the light trigger
only after it has seen "dark" before "light", to make sure that it is not
triggering on old data. An alternative method could be a simple R-C high-pass
filter circuit, which generates a pulse to stop your timer only on the
transition (from dark to light or vice versa) and ignores the steady level.
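
Something along these lines might work for the "dark before light" arming in
software (just a rough sketch: the sampleSensor callable and the thresholds
are placeholders for whatever your sensor interface looks like, and it
assumes the measurement protocol shows a dark frame before the bright test
frame):

#include <chrono>
#include <functional>

// Edge-triggered latency measurement: the stop condition is only armed after
// the sensor has reported "dark", so light left over from the previous frame
// cannot end the measurement early.
// 'sampleSensor' is a placeholder for whatever reads the photodiode
// (0.0 = dark, 1.0 = fully bright).
std::chrono::microseconds measureLatency(const std::function<double()>& sampleSensor,
                                         double darkThreshold = 0.1,
                                         double lightThreshold = 0.9)
{
    // Start timing at the moment the bright test frame is handed to the GPU.
    const auto start = std::chrono::steady_clock::now();

    // Phase 1: wait until the old image has actually gone dark.
    while (sampleSensor() > darkThreshold) { /* spin or sleep briefly */ }

    // Phase 2: wait for the dark-to-light transition of the new frame.
    while (sampleSensor() < lightThreshold) { /* spin or sleep briefly */ }

    const auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
}

The R-C high-pass filter would be the hardware equivalent of the same idea:
it only passes the transition, not the steady brightness.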

J.
