[osg-users] Synchronizing with textures uploads.

Altin Gjata altingjataj at gmail.com
Thu Mar 8 03:21:11 PST 2018


Hi Robert. Thank you very much for your quick reply.

I use Intel's Threading Building Blocks to create a flow::graph that processes video frames with the OpenCV library. At the final stage I have a frame object that contains both the image to use as the background and the corresponding camera matrix.
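
To give a bit of context, the final stage produces roughly something like this (simplified; the names are just for illustration, not my real code):

Code:
#include <opencv2/core.hpp>
#include <osg/Matrixd>

// Illustrative only; the real class carries a few more members.
struct VideoFrame
{
    cv::Mat      image;         // processed background image (converted to BGRA)
    osg::Matrixd cameraMatrix;  // camera pose estimated from that same image
};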

Initially, I called ImageStream::setImage() from whichever worker thread had just processed a frame (I imagined that calling the method as early as possible would give the GPU time to upload the image in the background, and it seemed safe because the ffmpeg plugin also calls it from its own thread), and posted a message to a queue that the OSG thread read to update the camera matrix right before calling renderingTraversals(). But depending on the timing between the worker thread that called ImageStream::setImage() and the OSG main thread that updated the matrix, the new image and the new matrix might or might not take effect in the same frame.
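
Roughly, that first attempt looked like the following (heavily simplified; the globals, the names and the exact setImage() arguments are just for illustration, not my real code):

Code:
#include <deque>
#include <mutex>

#include <opencv2/core.hpp>
#include <osg/ImageStream>
#include <osg/MatrixTransform>

std::mutex               g_matrixMutex;
std::deque<osg::Matrixd> g_pendingMatrices;

// Called from whichever TBB worker finished a frame.
void onFrameReady(osg::ImageStream* stream, const cv::Mat& bgra, const osg::Matrixd& cameraMatrix)
{
    // Hand the pixels to OSG right away, hoping the upload can start early.
    // NB: with NO_DELETE the cv::Mat has to stay alive until the data is uploaded.
    stream->setImage(bgra.cols, bgra.rows, 1,
                     GL_RGBA8, GL_BGRA, GL_UNSIGNED_BYTE,
                     bgra.data, osg::Image::NO_DELETE);

    std::lock_guard<std::mutex> lock(g_matrixMutex);
    g_pendingMatrices.push_back(cameraMatrix);
}

// Called on the OSG thread just before renderingTraversals().
void applyPendingMatrix(osg::MatrixTransform* cameraTransform)
{
    std::lock_guard<std::mutex> lock(g_matrixMutex);
    if (!g_pendingMatrices.empty())
    {
        cameraTransform->setMatrix(g_pendingMatrices.back());
        g_pendingMatrices.clear();
    }
}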

Next, so as not to leave things to chance, I decided to do both, i.e. call setImage() and update the camera matrix, from within the OSG thread, just before renderingTraversals():


Code:
    advance(simulationTime);

    eventTraversal();
    updateTraversal();
    update_camera_and_image();   // setImage() + camera matrix, together on the OSG thread
    renderingTraversals();




The transform matrix takes effect that frame, but the updated texture only takes effect on the next frame. (I should say that the images are full HD and the computer doesn't have the best graphics hardware ever made. I even convert the images from BGR to BGRA in my processing pipeline, because that way they seem to load a bit faster.) So the 3D objects move one frame ahead of their real-world counterparts.
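
For the record, the conversion is just a cvtColor() call; my guess is that BGRA loads faster because GL_BGRA with GL_UNSIGNED_BYTE matches what many drivers expect, so no per-pixel repacking is needed during the upload:

Code:
#include <opencv2/imgproc.hpp>

// Convert a processed BGR frame to BGRA before handing it to OSG.
cv::Mat toBgra(const cv::Mat& bgr)
{
    cv::Mat bgra;
    cv::cvtColor(bgr, bgra, cv::COLOR_BGR2BGRA);
    return bgra;
}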

I have split these operations and tried placing them in all sorts of combinations between the calls above, but nothing seems to work. It appears that I need to render a frame just to get the texture uploaded for the next one.

And this leads to what I currently do: a very ugly hack, which I'll describe just to show the kind of frustration this has caused me.


Code:

// beginning of the rendering loop
imageStream->setImage(frame);
getSceneData()->setNodeMask(0); // avoid rendering the 3D scene
// signal GraphicsContext::SwapCallback not to swap the buffers
renderingTraversals(); // render just the (old) background, because this triggers the upload of the new video frame to the GPU
getSceneData()->setNodeMask(saved_value); // restore the 3D scene
update_camera_matrix();
// ... rest of the default OSG loop calls ...
eventTraversal();
updateTraversal();
// signal GraphicsContext::SwapCallback to allow swapping buffers again
renderingTraversals();




So I call renderingTraversals() twice per loop iteration: the first time just to render the background, which also triggers the upload of the new image for the second (normal) call to renderingTraversals(), where I render the complete scene. And I have to disable / enable buffer swapping through graphicsContext->setSwapCallback().
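
In case it helps to see it, the swap callback is roughly this (simplified; ConditionalSwap is just an illustrative name):

Code:
#include <osg/GraphicsContext>

// Swap callback that can be told to skip swapping for the "background only" pass.
class ConditionalSwap : public osg::GraphicsContext::SwapCallback
{
public:
    ConditionalSwap() : _enabled(true) {}

    void setEnabled(bool on) { _enabled = on; }

    virtual void swapBuffersImplementation(osg::GraphicsContext* gc)
    {
        // When disabled, the old front buffer simply stays on screen.
        if (_enabled) gc->swapBuffersImplementation();
    }

private:
    bool _enabled;
};

// Installed once with: graphicsContext->setSwapCallback(new ConditionalSwap);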

For applying the camera matrix changes I have also tried setting a NodeCallback on its transform node, but that seems to disable ON_DEMAND rendering, which I prefer for the moment.
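
For reference, what I tried was an update callback along these lines (latestCameraMatrix() is a stand-in for wherever the newest matrix would come from):

Code:
#include <osg/MatrixTransform>
#include <osg/NodeCallback>

osg::Matrixd latestCameraMatrix(); // hypothetical accessor for the newest camera matrix

// Update callback attached to the camera's transform node (illustrative).
// Note: having an update callback makes the viewer think the scene always
// needs redrawing, which is what defeats ON_DEMAND rendering.
class CameraMatrixUpdater : public osg::NodeCallback
{
public:
    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
    {
        osg::MatrixTransform* xform = static_cast<osg::MatrixTransform*>(node);
        xform->setMatrix(latestCameraMatrix());
        traverse(node, nv); // continue the update traversal
    }
};

// cameraTransform->setUpdateCallback(new CameraMatrixUpdater);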

You mentioned updating the texture; what method should I call, and when? The examples I've seen that play video only call ImageStream::setImage().

Thank you!

Cheers,
Altin

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=73041#73041

More information about the osg-users mailing list