Thread: GL_INTENSITY16

  1. #1
    Junior Member (Newbie)
    Join Date: Feb 2004
    Posts: 11

    GL_INTENSITY16

    Hello,

    There seems to be a problem using glTexImage2D() with GL_INTENSITY16. Basically I have a 2D texture that holds only a single value per texel, which I'd like to texture onto a polygon. The values are 16-bit unsigned ints, so I am using GLuint and GL_UNSIGNED_SHORT for texturing.

    When I store the texture as RGBA instead, copying the single value into R, G and B and setting A = 1, the texturing works fine. Please let me know if I am doing anything wrong, and whether the use of GL_INTENSITY16 is appropriate here.
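
    For reference, here's a minimal sketch of the single-channel upload I mean (untested; the 64x64 size is made up, and I'm assuming GL_LUMINANCE as the client-side format):

        /* Minimal sketch of the single-channel upload (size and data are
           placeholders). GL_UNSIGNED_SHORT means the client data is GLushort. */
        GLushort pixels[64 * 64];                 /* one 16-bit value per texel */
        /* ... fill pixels ... */
        glPixelStorei(GL_UNPACK_ALIGNMENT, 2);    /* rows of 16-bit values */
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_INTENSITY16,              /* requested internal format */
                     64, 64, 0,
                     GL_LUMINANCE,                /* single-channel client format */
                     GL_UNSIGNED_SHORT, pixels);
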
    Thanks,
    aj

  2. #2
    Super Moderator (OpenGL Guru) imported_dorbie
    Join Date: Jul 2000
    Location: Bay Area, CA, USA
    Posts: 3,966

    Re: GL_INTENSITY16

    Yes, use this:

    internalformat: GL_LUMINANCE16
    format: GL_LUMINANCE
    type: GL_UNSIGNED_SHORT

    You could also try varying the internal format to one of the INTENSITY* formats, depending on what you want the fragment alpha to do. Remember that your graphics card may not support 16-bit textures, in which case you may end up with a white texture. So if it doesn't work, try GL_LUMINANCE12 or GL_LUMINANCE8 for the internalformat parameter.
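
    Put together, that call might look something like this (a sketch; width, height and data are placeholders):

        /* Sketch of the suggested upload: request a 16-bit luminance texture. */
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_LUMINANCE16,        /* requested internal precision */
                     width, height, 0,
                     GL_LUMINANCE,          /* one channel of client data */
                     GL_UNSIGNED_SHORT,     /* client data is GLushort */
                     data);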

    Use the proxy texture mechanism to determine what is supported at your required resolution at runtime.
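
    A rough sketch of such a proxy check (width and height are placeholders); if the request can't be honoured, the proxy's width reads back as zero:

        /* Probe support without allocating a real texture. */
        GLint proxyWidth = 0, lumBits = 0;
        glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_LUMINANCE16,
                     width, height, 0,
                     GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
        glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                                 GL_TEXTURE_WIDTH, &proxyWidth);
        glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                                 GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
        if (proxyWidth == 0) {
            /* GL_LUMINANCE16 rejected at this size: retry the proxy call
               with GL_LUMINANCE12 or GL_LUMINANCE8. */
        }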

  3. #3
    Senior Member (OpenGL Pro)
    Join Date: Feb 2002
    Location: Bonn, Germany
    Posts: 1,633

    Re: GL_INTENSITY16

    dorbie,
    sized internal formats aren't enforced. A proper implementation would not fail just because it doesn't support the exact requested resolution; it would instead convert to one of the internal formats it does support.
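
    For what it's worth, after a real glTexImage2D call you can query what was actually allocated, e.g. (a sketch):

        /* After a real glTexImage2D, ask what the driver actually chose. */
        GLint actualFormat = 0, lumBits = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_INTERNAL_FORMAT, &actualFormat);
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
        /* actualFormat is the internal format in use (e.g. GL_LUMINANCE16 or
           GL_LUMINANCE8); lumBits is the per-component resolution granted. */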

    [This message has been edited by zeckensack (edited 02-19-2004).]
