[osg-users] osg apps on gpu cluster

Robert Osfield robert.osfield at gmail.com
Tue Oct 2 00:53:45 PDT 2018


Hi Nick & Per,

On Tue, 2 Oct 2018 at 06:12, Per Nordqvist <nordqvist at gmail.com> wrote:
> Nick and I are working to utilize as much of the GPUs as possible, either on a single machine or a cluster.
> Hardware is not yet decided, but let's assume Ubuntu 16+ and multiple modern Nvidia gaming cards, but still a single screen.

osgViewer has been written from the ground up to support multiple GPUs
on a single machine with a single application.

The basic concept is that the View's master Camera controls the overall
view, while a series of slave Cameras assigned to the View handle the
rendering for each graphics card/display.  The osgwindows example is
the simplest example of this in action.  A search for addSlave in the
OSG codebase will reveal lots of other examples of it in action - it
can be used for a wide range of tasks.
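As a rough illustration, here is a minimal sketch along the lines of
the osgwindows example.  The screen numbers, window sizes and
projection offsets are assumptions you would adapt to your own setup:

    #include <osg/Camera>
    #include <osg/GraphicsContext>
    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;

        osg::ref_ptr<osg::Node> scene = osgDB::readRefNodeFile("cow.osgt");
        viewer.setSceneData(scene);

        // one slave Camera per screen/GPU; two screens assumed here
        for (unsigned int screenNum = 0; screenNum < 2; ++screenNum)
        {
            osg::ref_ptr<osg::GraphicsContext::Traits> traits =
                new osg::GraphicsContext::Traits;
            traits->screenNum = screenNum;
            traits->x = 0; traits->y = 0;
            traits->width = 1280; traits->height = 1024;
            traits->windowDecoration = false;
            traits->doubleBuffer = true;

            osg::ref_ptr<osg::GraphicsContext> gc =
                osg::GraphicsContext::createGraphicsContext(traits.get());

            osg::ref_ptr<osg::Camera> camera = new osg::Camera;
            camera->setGraphicsContext(gc.get());
            camera->setViewport(
                new osg::Viewport(0, 0, traits->width, traits->height));

            // offset each slave's projection so the screens tile
            // horizontally across the master Camera's view
            double offset = (screenNum == 0) ? -1.0 : 1.0;
            viewer.addSlave(camera.get(),
                            osg::Matrix::translate(offset, 0.0, 0.0),
                            osg::Matrix());
        }

        return viewer.run();
    }

Each slave gets its own graphics context, so with one context per card
the draw traversals can run in parallel on the separate GPUs.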

Out of the box the OSG will default to the DrawThreadPerContext
ThreadingModel on modern machines.  You might find
CullDrawThreadPerContext more appropriate, and you could even try
CullThreadPerCameraDrawThreadPerContext if you have plenty of cores to
throw at it.
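
Selecting the model explicitly is a one-liner, typically done before
the viewer is realized - a sketch using the osgViewer enum names:

    #include <osgViewer/Viewer>

    osgViewer::Viewer viewer;
    // override the automatically selected threading model
    viewer.setThreadingModel(
        osgViewer::ViewerBase::CullThreadPerCameraDrawThreadPerContext);

You can also switch models without recompiling by setting the
OSG_THREADING environment variable.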

In 3.6.x you also have support for explicitly controlling Affinity so
you can lock various threads to particular cores.
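
A minimal sketch, assuming the 3.6.x OpenThreads::Affinity API (check
the OpenThreads headers for the exact overloads in your build):

    #include <OpenThreads/Affinity>
    #include <OpenThreads/Thread>

    // restrict the calling thread (for instance the viewer's main
    // thread) to cores 0 and 1; the core numbers are just an example
    OpenThreads::Affinity affinity;
    affinity.add(0);
    affinity.add(1);
    OpenThreads::SetProcessorAffinityOfCurrentThread(affinity);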

Another variable you could play with is the OS/desktop configuration:
you can set things up so that one single graphics context spans
multiple cards.
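
Once the desktop is configured that way (for example a single X screen
spanning both cards), the usual convenience call should pick it up - a
sketch:

    #include <osgViewer/Viewer>

    osgViewer::Viewer viewer;
    // creates one window per screen reported by the window system;
    // with a single spanning screen this yields a single context
    viewer.setUpViewAcrossAllScreens();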

Modern graphics cards are beasts, so you might well be able to handle
quite a few displays from just one card.

Robert.

