[osg-users] Enabling Vsync gives dramatic increase in latency

Björn Blissing bjorn.blissing at vti.se
Mon Apr 27 06:20:48 PDT 2015

d_a_heitbrink wrote:
> What we did for our test was trigger a A/D deviceto I think a go from 0 to 5v, and a we added a line in our fragment shader to over ride the color and set it to white, or black depending on a value of a uniform. We change the Uniform to 1 (to set it to white) and sent out our 5 v signal at the same time. Both were set at the start of a frame. We used a photo diode to pick up the change from black to white, and hooked that and our 5v signal to a oscilloscope. 
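The color-override trick described above can be sketched in a few lines of GLSL. This is only a minimal illustration of the idea, not the original shader; the uniform name `flashValue` is an assumption:

```glsl
// Minimal fragment shader sketch: force the output to solid black or
// white depending on a uniform (assumed name: flashValue).
uniform float flashValue; // 0.0 = black, 1.0 = white

void main()
{
    gl_FragColor = vec4(vec3(flashValue), 1.0);
}
```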

Well, this is pretty much exactly my method, but instead of an oscilloscope I sample the signal with an A/D capture card at 10 kHz.

Here is the promised data for the Oculus Rift DK1 & DK2 (latency in ms):
Oculus Rift DK1 + Vsync Off Min: 16.8 Max: 60.0 Avg: 37.6
Oculus Rift DK1 + Vsync On  Min: 34.0 Max: 79.1 Avg: 48.1
Oculus Rift DK2 + Vsync Off Min: 17.0 Max: 65.0 Avg: 39.6
Oculus Rift DK2 + Vsync On  Min: 24.0 Max: 70.1 Avg: 43.6

I also tested the latency on a Philips GLine screen, which supports Nvidia G-Sync. To do this I had to change GPU, since G-Sync requires DisplayPort 1.2, so I upgraded to a GTX 770 card.

Philips GLine + 60 Hz + Vsync Off        Min:  2.8 Max: 23.0 Avg: 12.2
Philips GLine + 60 Hz + Vsync On         Min: 37.0 Max: 63.1 Avg: 60.7
Philips GLine + 60 Hz + G-Sync           Min: 58.9 Max: 63.1 Avg: 60.9
Philips GLine + Native 144 Hz + Vsync On Min: 22.9 Max: 26.1 Avg: 24.5

Best regards
