[osg-users] Enabling Vsync gives dramatic increase in latency

David Heitbrink david-heitbrink at uiowa.edu
Sun Apr 19 16:28:22 PDT 2015

If I remember correctly from my conversations with people at NVidia, the least amount of latency you can get is 2-3 frames (I cannot remember the exact number). If you increase the maximum pre-rendered frames setting, this will increase further. Also remember your display will add some additional latency as well. Some displays have motion blur reduction and other features that can add to latency.

We did a similar test where I work, using Quadro cards + Quadro Sync, and we got end-to-end latency of around 75ms.

What we did for our test was trigger an A/D device to go from (I think) 0 to 5V, and we added a line in our fragment shader to override the output color and set it to white or black depending on the value of a uniform. We changed the uniform to 1 (to set it to white) and sent out our 5V signal at the same time. Both were set at the start of a frame. We used a photodiode to pick up the change from black to white, and hooked both it and our 5V signal to an oscilloscope.
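The shader-side override described above might look something like this (a minimal sketch, not the original code; the uniform name `latencyFlag` and the surrounding shader are assumptions):

```glsl
// Hypothetical uniform, set to 1 by the application at the same
// instant the 5V signal is sent out.
uniform int latencyFlag;

void main()
{
    // ... normal shading would go here ...

    // Override the final color: white when the flag is set, black
    // otherwise, so a photodiode on the screen can detect the frame
    // in which the change actually appears.
    gl_FragColor = (latencyFlag == 1) ? vec4(1.0) : vec4(0.0);
}
```

On the application side, with OSG the flag would typically be an osg::Uniform attached to the scene's StateSet and updated just before the frame is rendered; the oscilloscope then shows the time between the 5V edge and the photodiode edge, i.e. the end-to-end latency.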

I am not sure that Quadro cards have any better latency, however one nice feature is that you can use a Quadro Sync card + external house sync (i.e. you provide the 60Hz signal). This can make syncing the cards with external devices easier. If you have good sync between your graphics and your data generator, you can effectively reduce the latency, or at least keep it more consistent.

As for the 75ms of latency, I cannot at this point break down where all of it is coming from; we are using projectors, and we are going through some fiber converters to get from our video cards to the actual projectors.

More information about the osg-users mailing list