
I have an Ubuntu machine, and a command-line application written on OS X which renders something offscreen using FBOs. This is part of the code:

    this->systemProvider->setupContext(); // be careful with this one; we still need a way to identify whether a context has already been set up
    this->systemProvider->useContext();
    glewExperimental = GL_TRUE;
    glewInit();
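    // Note: glewInit() on a core-profile context is known to raise a spurious
    // GL_INVALID_ENUM; reading the error once here clears it.
    glGetError();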


    GLuint framebuffer, renderbuffer, depthRenderBuffer;

    GLuint imageWidth = _viewPortWidth,
            imageHeight = _viewPortHeight;

    //Set up an FBO with one renderbuffer attachment
    glGenFramebuffers(1, &framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

    glGenRenderbuffers(1, &renderbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, renderbuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB8, imageWidth, imageHeight); // use a sized format; unsized GL_RGB is not guaranteed to be color-renderable
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderbuffer);


    //Now bind a depth buffer to the FBO
    glGenRenderbuffers(1, &depthRenderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRenderBuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, _viewPortWidth, _viewPortHeight); // sized depth format
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderBuffer);
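
Before rendering into the FBO it is worth verifying that it is complete, and the result can then be read back with glReadPixels. A minimal sketch under the setup above (it assumes `<vector>` is included and that the scene is drawn between the check and the readback):

    // Verify that the FBO is usable before rendering into it.
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        std::cout << "framebuffer is not complete\n";
    }

    // ... draw the scene here ...

    // Read the rendered image back into client memory (3 bytes per RGB pixel).
    std::vector<unsigned char> pixels(imageWidth * imageHeight * 3);
    glPixelStorei(GL_PACK_ALIGNMENT, 1); // rows are tightly packed
    glReadBuffer(GL_COLOR_ATTACHMENT0);
    glReadPixels(0, 0, imageWidth, imageHeight, GL_RGB, GL_UNSIGNED_BYTE, pixels.data());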

The "system provider" is a C++ wrapper around OS X's NSOpenGLContext, which is used just to create a rendering context and making it current, without associating it with a window. All the rendering happens in the FBOs.

I am trying to use the same approach on Linux (Ubuntu) using GLX, but I am having a hard time, since I see that GLX requires a pixel buffer.

I am trying to follow this tutorial:

http://renderingpipeline.com/2012/05/windowless-opengl/

At the end it uses a pixel buffer to make the context current. I have heard that pixel buffers are deprecated and should be abandoned in favour of Framebuffer Objects; is that right? (I may be wrong about this.)

Does anyone have a better approach, or idea?

Well, yes, renderbuffers aren't of much use nowadays. – Bartek Banachewicz

Can you elaborate? What do you mean by renderbuffers? Do you have any better cross-platform alternative? – csotiriou

What could I possibly mean by "renderbuffer"? Your code is using them. – Bartek Banachewicz

I was hoping you would elaborate on the alternatives, and on why you think they are not very useful, since a quick Google search suggests that FBOs are one of the best approaches not only for offscreen rendering but also for render-to-texture effects. I would love to hear about your alternative. – csotiriou

@BartekBanachewicz I believe he is talking about pixel buffers, not renderbuffers. I'm trying to do the exact same thing using GLX on Ubuntu. The link the OP mentioned uses a pixel buffer to create the context. But pixel buffers are deprecated and FBOs are preferred. What he is asking about is what needs to be done if we want to use FBOs. – informer2000

1 Answer


I don't know if it's the best solution, but it surely works for me.

Binding the GLX extension functions to pointers that we can use:

    typedef GLXContext (*glXCreateContextAttribsARBProc)(Display*, GLXFBConfig, GLXContext, Bool, const int*);
    typedef Bool (*glXMakeContextCurrentARBProc)(Display*, GLXDrawable, GLXDrawable, GLXContext);
    static glXCreateContextAttribsARBProc glXCreateContextAttribsARB = NULL;
    static glXMakeContextCurrentARBProc   glXMakeContextCurrentARB   = NULL;

Our objects as class properties:

    Display *display;
    GLXPbuffer pbuffer;
    GLXContext openGLContext;

Setting up the context:

    glXCreateContextAttribsARB = (glXCreateContextAttribsARBProc) glXGetProcAddressARB( (const GLubyte *) "glXCreateContextAttribsARB" );
    glXMakeContextCurrentARB   = (glXMakeContextCurrentARBProc)   glXGetProcAddressARB( (const GLubyte *) "glXMakeContextCurrent");
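    // NB: "glXMakeContextCurrent" is the core GLX 1.3 entry point, so loading
    // it through glXGetProcAddressARB is optional; the usage code further down
    // simply calls glXMakeContextCurrent directly.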

    display = XOpenDisplay(NULL);
    if (display == NULL){
        std::cout  << "error getting the X display";
    }

    static int visualAttribs[] = {None};
    int numberOfFrameBufferConfigurations;
    GLXFBConfig *fbConfigs = glXChooseFBConfig(display, DefaultScreen(display), visualAttribs, &numberOfFrameBufferConfigurations);

    int context_attribs[] = {
        GLX_CONTEXT_MAJOR_VERSION_ARB ,3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 2,
        GLX_CONTEXT_FLAGS_ARB, GLX_CONTEXT_DEBUG_BIT_ARB,
        GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
        None
    };

    std::cout << "initialising context...";
    this->openGLContext = glXCreateContextAttribsARB(display, fbConfigs[0], 0, True, context_attribs);

    int pBufferAttribs[] = {
        GLX_PBUFFER_WIDTH, (int)this->initialWidth,
        GLX_PBUFFER_HEIGHT, (int)this->initialHeight,
        None
    };

    this->pbuffer = glXCreatePbuffer(display, fbConfigs[0], pBufferAttribs);
    XFree(fbConfigs);
    XSync(display, False);
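
One caveat (not part of the original answer): with an empty visualAttribs list, glXChooseFBConfig matches window-capable configs by default, and those are not guaranteed to also support pbuffers. A more defensive sketch would request a pbuffer-capable RGBA config explicitly, and it is also advisable to check that fbConfigs is non-NULL before indexing it:

    static int visualAttribs[] = {
        GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,   // config must be usable as a pbuffer
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        None
    };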

Using the context:

    if(!glXMakeContextCurrent(display, pbuffer, pbuffer, openGLContext)){
        std::cout << "error with context creation\n";
    }else{
        std::cout << "made a context the current context\n";
    }
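
When the renderer shuts down, the GLX objects should be released again. A minimal teardown sketch (not part of the original answer):

    glXMakeContextCurrent(display, None, None, NULL); // release the context
    glXDestroyContext(display, openGLContext);
    glXDestroyPbuffer(display, pbuffer);
    XCloseDisplay(display);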

After that, one can use FBOs normally, just as one would on any other occasion. To this day my question is strictly unanswered (whether there is a better alternative), so I am just offering the solution that worked for me. My confusion came from the fact that a GLX pbuffer is not used here the way pixel buffers are used for offscreen rendering: FBOs remain the preferred way to render offscreen, but on Linux a context still needs a GLX drawable to be made current against, and a pbuffer is how you get one without a window. Once the context is current, the FBO code I provided in the question works as expected, the same way it does on OS X.