[osg-users] Enabling Vsync gives dramatic increase in latency
Jan Ciger
jan.ciger at gmail.com
Thu Apr 16 08:03:20 PDT 2015
Hello Björn,
On Thu, Apr 16, 2015 at 2:59 PM, Björn Blissing <bjorn.blissing at vti.se>
wrote:
> With these settings I have recorded the following latencies:
>
> Custom settings: min = 17ms, max = 42ms, mean = 30ms
>
> The mean difference between VSync OFF and this custom setting is 16 ms,
> i.e. about 1 screen refresh.
>
> I guess it is hard to push the latency any lower. Or is there any other
> setting which could reduce the latency even further?
>
I think this is pretty good already.
At 60 Hz one frame takes about 16.7 ms, so you cannot physically get under 17 ms -
the monitor will not be able to display the data even if you manage to send
it faster somehow. The 30 ms mean is about normal for a good monitor; it
just means one frame is being buffered somewhere - typically the GPU or
the monitor will do that for various reasons. With monitors/TVs this is
called input latency, and it comes from the image
decoding/processing/scaling/etc. the display does before it can physically
push the pixels to the LCD. There are TVs with 100 ms+ of input latency -
completely unusable for anything interactive (like game consoles). That's
why many modern TVs have a "game mode" where the fancy image
processing is turned down or off to minimize this latency.
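To make the arithmetic explicit: 17 ms is roughly one refresh interval
(1000/60 ≈ 16.7 ms), and the ~30 ms mean is that minimum plus roughly one
more buffered frame. If you want to check whether that extra frame is the
driver running ahead of the display, one option (just a sketch, not something
from this thread - the class name and wiring are my own) is a swap callback
that forces the CPU to wait for the GPU right after each swap, so the driver
cannot queue up additional frames:

    #include <osg/GL>
    #include <osg/GraphicsContext>
    #include <osgViewer/Viewer>

    // Hypothetical sketch: block until the GPU has finished right after the
    // buffer swap, so the driver cannot queue an additional frame. This
    // trades CPU/GPU parallelism (and possibly frame rate) for lower latency.
    class FinishAfterSwap : public osg::GraphicsContext::SwapCallback
    {
    public:
        virtual void swapBuffersImplementation(osg::GraphicsContext* gc)
        {
            gc->swapBuffersImplementation(); // perform the normal swap
            glFinish();                      // wait for the GPU to catch up
        }
    };

    // assumed usage:
    //   viewer.getCamera()->getGraphicsContext()->setSwapCallback(new FinishAfterSwap);

Whether this actually removes the buffered frame depends on the driver; on
some systems limiting the "pre-rendered frames" / flip queue size in the
driver control panel has a similar effect.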
The 42 ms maximum likely means you just missed the sync and had to wait, or
the OS did something in the background that delayed your process and caused
you to miss it. That would also explain why you get lower latency with VSYNC
OFF - in that case you don't care about the vsync event at all and just push
the data out as fast as you can, even at the expense of tearing.
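If you want to confirm that interpretation, a crude check (again just a
sketch, with the 60 Hz refresh period hard-coded as an assumption) is to time
the interval between consecutive swaps and flag frames that took well over
one refresh period:

    #include <chrono>
    #include <cstdio>

    // Call once per frame, right after the buffer swap. Frames whose
    // swap-to-swap interval is well above the ~16.7 ms refresh period
    // most likely missed a vsync and waited for the next one.
    void markSwap()
    {
        using clock = std::chrono::steady_clock;
        static clock::time_point last = clock::now();
        const clock::time_point now = clock::now();
        const double ms =
            std::chrono::duration<double, std::milli>(now - last).count();
        if (ms > 1.5 * 16.7) // assumes a 60 Hz display
            std::printf("missed vsync: %.1f ms between swaps\n", ms);
        last = now;
    }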
However, with all that said, the vsync logic in today's GPUs is very much
decoupled from the physical refresh of the monitor, or even from the GPU
actually starting to send the data on the wire. Thus the exact behaviour is
going to be extremely hw and driver dependent - the same application can have
wildly different latencies between machines or even between driver revisions.
That is one of the reasons why I don't see chasing extremely low latency in
VR applications with "crazy hacks" like time warping or late latching as very
productive - we are always going to be limited by the speed of the displays
and GPUs and by the black box nature of their drivers and hw. I am not sure
the extra complexity these things entail is worth the effort. But that is a
story for another day ...
Jan