
Thread: Pbuffers,GLX, linux, nVidia

  1. #1
    Member Contributor
    Join Date
    Jun 2001
    Location
    San Jose, CA
    Posts
    50

    Pbuffers,GLX, linux, nVidia

    Does anyone have a functioning example of pbuffers under GLX that works on Sun or Linux/nVidia 23.13?

  2. #2
    Junior Member Newbie
    Join Date
    Dec 2001
    Posts
    1

    Re: Pbuffers,GLX, linux, nVidia

    Hi,

    I have the same question. Have you already heard something?

  3. #3
    Member Contributor
    Join Date
    Jun 2001
    Location
    San Jose, CA
    Posts
    50

    Re: Pbuffers,GLX, linux, nVidia

    Sadly, no. It appears to be a taboo subject.

    I had no trouble using pbuffers under Windows (WGL), but GLX is proving trickier, and the complete absence of examples makes things harder.

    I have searched everywhere for a sample piece of code and have found none.

    I have put together the basic 20 lines of code that I think should do the job, and played around with it on Linux, but it doesn't seem to want to do the final copy from the pbuffer to the front buffer.

    I was hoping one of the more frequent contributors would be able to post what would seem like a pretty straightforward response with example code.


  4. #4
    V-man
    Guest

    Re: Pbuffers,GLX, linux, nVidia

    I have succeeded in creating a pbuffer on Windows, and I imagine the technique is similar on Linux.

    >>> from pbuffer to front buffer <<<

    The pbuffer and the visible buffer have different windows and different rendering contexts (RCs).

    How are you copying from one to the other?

    V-man

  5. #5
    Senior Member OpenGL Guru
    Join Date
    Mar 2001
    Posts
    2,411

    Re: Pbuffers,GLX, linux, nVidia

    Oh, no! You've mentioned the P-word! Quick, send in the men in black!
    "If you can't afford to do something right,
    you'd better make sure you can afford to do it wrong!"

  6. #6
    Member Contributor
    Join Date
    Jun 2001
    Location
    San Jose, CA
    Posts
    50

    Re: Pbuffers,GLX, linux, nVidia

    After spending a day playing around on the Linux/nVidia 23.13 setup, I decided to take my dirty little test app over to the Sun box. It worked almost right away (it still doesn't work on nVidia, though). Here it is.

    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <GL/gl.h>

    #define WIDTH 300
    #define HEIGHT 300

    void fnRedraw(void);

    Display *dpy;
    Window win;
    GLXContext PBufferCtx;
    GLXContext WinCtx;
    GLXPbuffer PBuffer;

    const int sbAttrib[] = {GLX_DOUBLEBUFFER, 0,
                            GLX_RED_SIZE, 1,
                            GLX_GREEN_SIZE, 1,
                            GLX_BLUE_SIZE, 1,
                            GLX_DEPTH_SIZE, 12,
                            None};
    const int pbAttrib[] = {GLX_PBUFFER_WIDTH, WIDTH,
                            GLX_PBUFFER_HEIGHT, HEIGHT,
                            GLX_PRESERVED_CONTENTS, True,
                            None};

    static Bool WaitForNotify(Display *d, XEvent *e, char *arg)
    {
        return (e->type == MapNotify) && (e->xmap.window == (Window)arg);
    }

    int main(int argc, char **argv)
    {
        GLXFBConfig *fbc;
        XVisualInfo *vi = NULL;
        Colormap cmap;
        XSetWindowAttributes swa;
        XEvent event;
        Bool bRedraw = False;
        int dummy;
        int nElements;

        if (!(dpy = XOpenDisplay(NULL)))
        {
            fprintf(stderr, "could not open display");
            exit(-1);
        }

        fprintf(stdout, "Info: GLX extensions: %s\n",
                glXQueryExtensionsString(dpy, DefaultScreen(dpy)));

        if (!glXQueryExtension(dpy, &dummy, &dummy))
        {
            fprintf(stderr, "Error: GLX extension not supported");
            exit(-1);
        }

        fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy), sbAttrib, &nElements);
        fprintf(stdout, "Info: number of FBConfigs: %d\n", nElements);
        if (nElements == 0)
        {
            fprintf(stderr, "Error: no valid framebuffer configurations found\n");
            exit(-1);
        }

        /*
         * For simplicity's sake, select the first config. It may not be the
         * right one, but for the purpose of an example it will suffice.
         */
        vi = glXGetVisualFromFBConfig(dpy, fbc[0]);

        if (!(WinCtx = glXCreateContext(dpy, vi,
                                        None, /* no sharing of display lists */
                                        True  /* direct rendering if possible */
                                        )))
        {
            fprintf(stderr, "Could not create rendering context\n");
            exit(-1);
        }

        PBuffer = glXCreatePbuffer(dpy, fbc[0], pbAttrib);
        PBufferCtx = glXCreateNewContext(dpy, fbc[0], GLX_RGBA_TYPE, 0, GL_TRUE);

        cmap = XCreateColormap(dpy, RootWindow(dpy, vi->screen), vi->visual, AllocNone);
        swa.colormap = cmap;
        swa.border_pixel = 0;
        swa.event_mask = ExposureMask | ButtonPressMask | StructureNotifyMask;
        win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, WIDTH, HEIGHT,
                            0, vi->depth, InputOutput, vi->visual,
                            CWBorderPixel | CWColormap | CWEventMask,
                            &swa);

        glXMakeContextCurrent(dpy, PBuffer, PBuffer, PBufferCtx);

        XMapWindow(dpy, win);
        XIfEvent(dpy, &event, WaitForNotify, (char *)win);

        glEnable(GL_DEPTH_TEST);
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0, 10.0);

        while (1)
        {
            do
            {
                XNextEvent(dpy, &event);
                switch (event.type)
                {
                case ConfigureNotify:
                    glViewport(0, 0, event.xconfigure.width, event.xconfigure.height);
                    bRedraw = True;
                    break;
                case Expose:
                    bRedraw = True;
                    break;
                }
            } while (XPending(dpy)); /* loop to compress events */

            if (bRedraw)
            {
                fnRedraw();
                bRedraw = False;
            }
        }
    }

    void fnRedraw(void)
    {
        static Bool bFirstPass = True;

        if (bFirstPass)
        {
            bFirstPass = False;
            glXMakeContextCurrent(dpy, PBuffer, PBuffer, PBufferCtx);
            glMatrixMode(GL_MODELVIEW);   /* switch to modelview matrix stack */
            glLoadIdentity();             /* reset modelview matrix to identity */
            glTranslatef(0.0, 0.0, -3.0); /* move camera back 3 units */
            glClearColor(1.0, 0.0, 0.0, 0.0);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            glBegin(GL_QUADS);
            {
                glColor3f(0.0, 0.7, 0.1);
                glVertex3f(-1.0,  1.0, 1.0);
                glVertex3f( 1.0,  1.0, 1.0);
                glVertex3f( 1.0, -1.0, 1.0);
                glVertex3f(-1.0, -1.0, 1.0);
            }
            glEnd();
            glReadBuffer(GL_FRONT);
        }
        /* read from the pbuffer, draw to the window */
        glXMakeContextCurrent(dpy, win, PBuffer, WinCtx);

        glCopyPixels(0, 0, WIDTH, HEIGHT, GL_COLOR);
        glFlush();
    }
    /*-------------*/
    to compile on Sun : gcc -o sample.exe sample.c -lGL -lX11

    on linux : gcc -o sample.exe sample.c -lGL

    Now can anyone tell me why it doesn't work on Linux/nVidia 23.13 and provide a fix?

  7. #7
    Member Contributor
    Join Date
    Dec 2001
    Location
    Berlin, Germany
    Posts
    63

    Re: Pbuffers,GLX, linux, nVidia

    The nVidia drivers still do not support GLX 1.3, and therefore do not support glXMakeContextCurrent(...). You can only use

    glXMakeCurrent(dpy, PBuffer, PBufferCtx);

    instead. Of course, without separate read and draw drawables it is not possible to accomplish the effect of the final glCopyPixels(...). However, copying the framebuffer into a texture, changing the context, and applying the texture might be better (faster) anyway.
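    A minimal sketch of the texture-copy approach described above. This is an assumption-laden illustration, not tested code: it assumes the two contexts were created to share texture objects (the share-context parameter of glXCreateNewContext/glXCreateContext) and a 256x256 pbuffer, since plain GL_TEXTURE_2D on hardware of that era requires power-of-two dimensions. The names fnRedrawViaTexture and PBUF_SIZE are mine, not from the sample above:

    ```c
    #include <GL/gl.h>
    #include <GL/glx.h>

    #define PBUF_SIZE 256 /* power-of-two, required for GL_TEXTURE_2D here */

    /* Globals as in the sample above; contexts assumed created with sharing. */
    extern Display *dpy;
    extern Window win;
    extern GLXPbuffer PBuffer;
    extern GLXContext PBufferCtx, WinCtx;

    void fnRedrawViaTexture(void)
    {
        static GLuint tex = 0;

        /* 1. Render into the pbuffer using only GLX 1.2's glXMakeCurrent(). */
        glXMakeCurrent(dpy, PBuffer, PBufferCtx);
        glClearColor(1.0, 0.0, 0.0, 0.0);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* ... draw the scene here ... */

        /* 2. Copy the pbuffer's color buffer into a texture object. */
        if (tex == 0)
            glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        /* default MIN_FILTER expects mipmaps, so switch to GL_LINEAR */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, PBUF_SIZE, PBUF_SIZE, 0);

        /* 3. Switch to the window context and draw a textured quad; the
         * texture is visible here only because the contexts share objects. */
        glXMakeCurrent(dpy, win, WinCtx);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, tex);
        glBegin(GL_QUADS);
            glTexCoord2f(0, 0); glVertex2f(-1, -1);
            glTexCoord2f(1, 0); glVertex2f( 1, -1);
            glTexCoord2f(1, 1); glVertex2f( 1,  1);
            glTexCoord2f(0, 1); glVertex2f(-1,  1);
        glEnd();
        glFlush();
    }
    ```

    Note that without context sharing the texture object created in the pbuffer context would not exist in the window context, so the sharing parameter is essential to this scheme.
    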

    Florian

  8. #8
    Junior Member Newbie
    Join Date
    Dec 2001
    Location
    Copenhagen, Denmark
    Posts
    27

    Re: Pbuffers,GLX, linux, nVidia

    While we're at it: I'm discussing GLAF development with its author by mail. I'm trying to get it to run on AGL (Mac OS).

    Does anyone know of an example of using Pbuffers on mac?

  9. #9
    Junior Member Newbie
    Join Date
    Dec 2001
    Location
    Paris, France
    Posts
    29

    Re: Pbuffers,GLX, linux, nVidia

    I posted a link on the Linux forum, but from here I can explain the flaw I mentioned to you: visual selection is not the same on a Sun machine as on a PC. This comes from the video hardware: machines that don't have a hardware text mode (Sparc, Mac, etc.) must start a video driver right from kernel boot, and X must later sit on top of this. The X driver sitting on the kernel video driver is called the 'framebuffer (FB) driver'.

    On a classical PC setup, you'll end up using glXChooseConfig() instead. See my code sample on the Linux forum.
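    For reference, the older GLX 1.2 path on a typical PC selects a visual directly with glXChooseVisual() rather than going through FBConfigs. A minimal sketch (the attribute list mirrors the sbAttrib array from the sample earlier in this thread; this is illustrative, not the poster's actual code):

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        /* single-buffered RGB visual with a depth buffer, as in sbAttrib */
        static int attribs[] = {GLX_RGBA,
                                GLX_RED_SIZE, 1,
                                GLX_GREEN_SIZE, 1,
                                GLX_BLUE_SIZE, 1,
                                GLX_DEPTH_SIZE, 12,
                                None};
        Display *dpy = XOpenDisplay(NULL);
        XVisualInfo *vi;

        if (!dpy)
        {
            fprintf(stderr, "could not open display\n");
            return 1;
        }
        vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi)
        {
            fprintf(stderr, "no matching visual\n");
            return 1;
        }
        printf("visual id 0x%lx, depth %d\n", vi->visualid, vi->depth);
        XFree(vi);
        XCloseDisplay(dpy);
        return 0;
    }
    ```
    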

  10. #10
    Member Contributor
    Join Date
    Jun 2001
    Location
    San Jose, CA
    Posts
    50

    Re: Pbuffers,GLX, linux, nVidia

    On Sun the sample works fine.
    On nVidia 23.13 with a GeForce3 Ti 500, the only function that doesn't work correctly is glXMakeContextCurrent(). This does make sense if nVidia's GLX implementation is not quite 1.3. I suppose it would have been better if the function simply weren't in the library; at least then the app wouldn't link, and the problem would be obvious.

    glXChooseFBConfig() returns a perfectly valid set of FBConfigs, and if the pbuffer-specific code is removed and glXMakeCurrent() substituted, the sample works fine under nVidia.

    I will now pursue the suggestion made by Florian, since the TEXTURE_RECTANGLE_NV approach is how I do it on NT, and I'll just wait for nVidia to release a newer, GLX 1.3 driver.

    When I get it working I'll be sure to post it.
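    For what it's worth, the GL_TEXTURE_RECTANGLE_NV target mentioned above differs from plain GL_TEXTURE_2D mainly in allowing non-power-of-two sizes (so the full 300x300 pbuffer can be copied) and in taking texture coordinates in pixels rather than [0,1]. A fragment sketch under those assumptions; the function names are mine, and the code assumes the extension is advertised and the appropriate context is current for each step:

    ```c
    #include <GL/gl.h>
    #include <GL/glx.h>

    /* Older gl.h may not define this; the enum comes from NV_texture_rectangle. */
    #ifndef GL_TEXTURE_RECTANGLE_NV
    #define GL_TEXTURE_RECTANGLE_NV 0x84F5
    #endif

    #define PB_W 300
    #define PB_H 300

    /* With the pbuffer context current: copy the read buffer into a
     * rectangle texture, non-power-of-two size and all. */
    void fnCopyToRectTexture(GLuint *ptex)
    {
        if (*ptex == 0)
            glGenTextures(1, ptex);
        glBindTexture(GL_TEXTURE_RECTANGLE_NV, *ptex);
        glTexParameteri(GL_TEXTURE_RECTANGLE_NV, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glCopyTexImage2D(GL_TEXTURE_RECTANGLE_NV, 0, GL_RGB, 0, 0, PB_W, PB_H, 0);
    }

    /* With the (sharing) window context current: texture coordinates are
     * in pixels, not normalized to [0,1]. */
    void fnDrawRectTexture(GLuint tex)
    {
        glEnable(GL_TEXTURE_RECTANGLE_NV);
        glBindTexture(GL_TEXTURE_RECTANGLE_NV, tex);
        glBegin(GL_QUADS);
            glTexCoord2f(0.0f,        0.0f);        glVertex2f(-1.0f, -1.0f);
            glTexCoord2f((float)PB_W, 0.0f);        glVertex2f( 1.0f, -1.0f);
            glTexCoord2f((float)PB_W, (float)PB_H); glVertex2f( 1.0f,  1.0f);
            glTexCoord2f(0.0f,        (float)PB_H); glVertex2f(-1.0f,  1.0f);
        glEnd();
    }
    ```
    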

