[osg-users] how to interpret values shown by stats handler

Werner Modenbach Werner.Modenbach at texion.eu
Tue Jul 17 02:42:42 PDT 2018

Hi all,

I'm trying to optimize the display speed of my application by testing
various techniques.

In order to take exact measurements, I implemented a mechanism that lets
me trigger single frame() calls from the keyboard.
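
For reference, a minimal sketch of such a single-step loop (this is an illustration, not my actual code: it assumes a standard osgViewer setup, the scene file name is a placeholder, and it reads the key from stdin rather than from a GUI event handler):

```cpp
#include <cstdio>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

int main()
{
    osgViewer::Viewer viewer;
    // "scene.osgt" is a placeholder for whatever model is under test.
    viewer.setSceneData(osgDB::readNodeFile("scene.osgt"));
    viewer.addEventHandler(new osgViewer::StatsHandler);
    viewer.realize();

    // Instead of viewer.run(), render exactly one frame per keypress,
    // so each frame() call can be inspected in the stats display.
    while (!viewer.done())
    {
        if (std::getchar() == ' ')  // space + Enter steps one frame
            viewer.frame();
    }
    return 0;
}
```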

Curiously, the GPU times shown by the stats handler always drift over a
significant range before settling to a stable value. Does anybody know
whether the values are averaged over a number of frames, or is there
some other effect I'm not seeing at the moment? Or is this some kind of
caching effect?

Under these circumstances it's difficult to judge the effect of a
technique without some idea of what is causing the drift.

Thanks for any hints.

- Werner -
