
Thread: 16-bit Signed Integer (GL_SHORT) Texture Upload

  1. #1

    16-bit Signed Integer (GL_SHORT) Texture Upload

    Hello,

    I am doing GPU-based volume rendering on 16-bit signed CT data. I use glTexImage3D to upload my 3D texture to the GPU like this:
    glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE16, width, height, depth, 0, GL_LUMINANCE, GL_SHORT, data);

    The problem is that all the negative values in the texture get clamped to zero. According to the following passage in the glTexImage3D documentation, this behavior is by design, but it is not the desired behavior in my application.

    ************************************************
    GL_LUMINANCE
    Each element is a single luminance value. The GL converts it to floating point, then assembles it into an RGBA element by replicating the luminance value three times for red, green, and blue and attaching 1 for alpha. Each component is then multiplied by the signed scale factor GL_c_SCALE, added to the signed bias GL_c_BIAS, and clamped to the range [0,1] (see glPixelTransfer).
    ************************************************

    The GL_NV_texture_shader extension supports signed internal formats, but for luminance it only offers GL_SIGNED_LUMINANCE8_NV. There is no GL_SIGNED_LUMINANCE16_NV.

    I could work around this issue by setting

    glPixelTransferf(GL_RED_SCALE, 0.5f);
    glPixelTransferf(GL_RED_BIAS, 0.5f);

    and converting the values back in my shader. This works, but I would lose performance (and possibly precision?).
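    To convince myself that the scale/bias trick is at least lossless in principle, here is a small standalone sketch of the arithmetic. It assumes the classic fixed-point rule f = (2c + 1)/(2^16 - 1) that the GL uses to convert GL_SHORT data to float, and 16-bit quantization for GL_LUMINANCE16 storage; the function names are mine:

    ```c
    #include <assert.h>
    #include <math.h>
    #include <stdio.h>

    /* Simulates: GL_SHORT -> float conversion, then GL_RED_SCALE = 0.5,
     * GL_RED_BIAS = 0.5, then quantization into a GL_LUMINANCE16 texel. */
    static double encode(short c) {
        double f = (2.0 * c + 1.0) / 65535.0;        /* GL's signed-short normalization */
        double biased = f * 0.5 + 0.5;               /* pixel transfer: scale, then bias */
        return floor(biased * 65535.0 + 0.5) / 65535.0; /* store as 16-bit luminance */
    }

    /* Simulates the shader-side recovery: undo bias/scale, then invert
     * the GL_SHORT normalization back to the original integer. */
    static short decode(double t) {
        double f = (t - 0.5) * 2.0;
        double c = (f * 65535.0 - 1.0) / 2.0;
        return (short)floor(c + 0.5);
    }

    int main(void) {
        const short samples[] = { -32768, -1024, -1, 0, 1, 1024, 32767 };
        for (int i = 0; i < 7; ++i)
            assert(decode(encode(samples[i])) == samples[i]);
        printf("round trip exact\n");
        return 0;
    }
    ```

    Since (f * 0.5 + 0.5) * 65535 works out to exactly c + 32768 for every signed short c, the 16-bit quantization introduces no error at all; the cost is purely the extra shader arithmetic, not precision.
    
    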

    Does anybody know a better solution to this problem? Thanks a lot.

    -Clarence

  2. #2

    Re: 16-bit Signed Integer (GL_SHORT) Texture Upload

    Looks like EXT_texture_integer is what I need, as it supports the GL_LUMINANCE16I_EXT internal format. I'll experiment with it tonight and see whether it solves my problem.
    If you have experience with this kind of application, feel free to post your comments here. Thanks.
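    For reference, a sketch of what the integer-texture path might look like. This assumes EXT_texture_integer on the upload side and EXT_gpu_shader4 on the shader side (isampler3D, texelFetch3D); the hounsfield windowing constants are made up for illustration:

    ```glsl
    // Upload side (C), bypassing pixel transfer scale/bias/clamp entirely:
    //   glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE16I_EXT, width, height, depth, 0,
    //                GL_LUMINANCE_INTEGER_EXT, GL_SHORT, data);
    // Integer textures cannot be filtered, so use GL_NEAREST and fetch texels directly.

    #version 120
    #extension GL_EXT_gpu_shader4 : require

    uniform isampler3D volume;   // signed integer sampler
    uniform ivec3 volumeSize;

    void main()
    {
        ivec3 coord = ivec3(gl_TexCoord[0].xyz * vec3(volumeSize));
        int hu = texelFetch3D(volume, coord, 0).r;   // raw signed 16-bit value
        // Hypothetical CT window: map [-1024, 3072] HU into [0, 1] for display.
        float shade = clamp((float(hu) + 1024.0) / 4096.0, 0.0, 1.0);
        gl_FragColor = vec4(vec3(shade), 1.0);
    }
    ```

    The trade-off versus the scale/bias workaround is that you lose hardware trilinear filtering (integer textures are GL_NEAREST only), so any interpolation has to be done manually in the shader.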

  3. #3

    Re: 16-bit Signed Integer (GL_SHORT) Texture Upload

    If you want only a single-channel 16-bit integer texture, you can also use the GL_ALPHA16I_EXT internal format, with GL_ALPHA_INTEGER_EXT as the format parameter and GL_SHORT as the type parameter of glTexImage2D.

    glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA16I_EXT, width, height, 0, GL_ALPHA_INTEGER_EXT, GL_SHORT, pixels);
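    One caveat with the alpha format: when sampled, an ALPHA texture places the value in the alpha channel (red, green, and blue read as zero), so the shader-side fetch should read .a rather than .r. A sketch, again assuming EXT_gpu_shader4 and a made-up normalization:

    ```glsl
    #version 120
    #extension GL_EXT_gpu_shader4 : require

    uniform isampler2D slice;   // bound to the GL_ALPHA16I_EXT texture

    void main()
    {
        int v = texelFetch2D(slice, ivec2(gl_FragCoord.xy), 0).a; // value lives in alpha
        gl_FragColor = vec4(float(v) / 32767.0); // hypothetical mapping for display
    }
    ```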

  4. #4

    Re: 16-bit Signed Integer (GL_SHORT) Texture Upload

    Thanks for the suggestion.
    What is the advantage of using GL_ALPHA rather than GL_LUMINANCE? Does it give faster upload speed because the GL doesn't have to replicate the value into the red, green, and blue components?

