Page 1 of 2 12 LastLast
Results 1 to 10 of 12

Thread: 16-bit textures...

  1. #1
    Junior Member Newbie
    Join Date
    Nov 2000
    Posts
    9

    16-bit textures...

    This is my first post, but here goes:

    I'm having a problem with 16-bit textures arranged in the 5551 RGBA bit format. I have used a 24-bit version of the texture, a simple RAW file, and all worked well. However, when I converted the file to a 16-bit texture, I could not seem to get OpenGL to render it. I know the 16-bit texture is fine; the file size checks out. Here's what I'm doing:

    void* LoadRAW(char* pTexture, int nNumber, int nDepth)
    {
    FILE* pFile = fopen(pTexture, "br");
    char* pData = new char[128*128*2];
    fread(pData, sizeof(char), 128*128*2, pFile);

    fclose(pFile);

    glGenTextures(1, &texture[nNumber]);
    glBindTexture(GL_TEXTURE_2D, texture[nNumber]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGB5_A1, GL_UNSIGNED_BYTE, pData);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    return pData;
    }

    Forget the return value; it isn't used. This code isn't clean, but you get the idea. I believe the problem is in the parameters passed to glTexImage2D, but I'm not sure. My screen resolution is set to 16-bit, so that is not the problem. If anyone can help...please.

  2. #2
    Senior Member Regular Contributor
    Join Date
    Jul 2000
    Location
    Nice, France
    Posts
    201

    Re: 16-bit textures...

    Originally posted by jtwoods:
    FILE* pFile = fopen(pTexture, "br");
    I have never used fopen with "br" as the second parameter, only "rb", and I've never seen anything in my help file suggesting the order could be reversed. So I was wondering...
    I rebuilt my texture loader function with "br" and... it doesn't work any more.
    So was it just a typo in your post, or is your code actually like this?

    Hope this helps.

    Moz

  3. #3
    Junior Member Newbie
    Join Date
    Nov 2000
    Posts
    9

    Re: 16-bit textures...

    No, as I was posting the topic I was playing around with the code... I thought I changed it back to the original before I pasted, but I guess I didn't catch that part. In the official code it is "rb"; "br" causes an exception error. So to answer your question: the fopen does use "rb", and the file is loaded properly.

  4. #4
    Senior Member Regular Contributor
    Join Date
    Jul 2000
    Location
    Augsburg, Germany
    Posts
    334

    Re: 16-bit textures...

    The problem seems to be the

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGB5_A1, GL_UNSIGNED_BYTE, pData);

    The second GL_RGB5_A1 is not allowed. The first one is correct, but for the second you probably need GL_RGB. The first format parameter tells how the data is stored in memory. I'm not really sure what the second one means, but as I understood the Red Book, it means the type of display you have. So GL_RGB could be fine.

    ->
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGB, GL_UNSIGNED_BYTE, pData);


    Kilam.

  5. #5
    Junior Member Newbie
    Join Date
    Nov 2000
    Posts
    9

    Re: 16-bit textures...

    I just tried your suggestion, and it didn't work... a driver error. I'm wondering if it's some sort of problem specific to a Radeon/OpenGL/16-bit texture combination... if anyone has any idea how to make OpenGL recognize 16-bit textures...

  6. #6
    Junior Member Newbie
    Join Date
    Nov 2000
    Posts
    9

    Re: 16-bit textures...

    I'm not sure about this, but I'm gonna take a stab at my own problem: could it have something to do with pixel packing alignment? glPixelStorei() seems to change something about pixel storage alignment, but I'm not sure. Could it be that I have to use this or a similar function to make OpenGL realize the color components of my textures are not individual bytes, but bits within a single 16-bit short value? Please reply, thanks!

  7. #7
    Senior Member Regular Contributor
    Join Date
    Apr 2000
    Location
    Redlands, CA, USA
    Posts
    233

    Re: 16-bit textures...

    > Could it be that I have to use this or a similar function to make OpenGL realize my color components of my textures are not individual bytes, but bits within a short (16-bit) value.

    Only in OpenGL 1.2 or with GL_EXT_packed_pixels extension.

  8. #8
    Member Newbie
    Join Date
    Oct 2000
    Posts
    32

    Re: 16-bit textures...

    The first GL_RGB5_A1 is an internal image format, and the second is the pixel format of the submitted image.

    If you use GL_RGB for the second, you will get a driver error because it will try to read 3*w*h bytes, which is more than the data you have submitted.

    It may have something to do with the supported input texture formats, so check that out.

    Also, you can try using an RGBA texture as input and keeping the internal format as GL_RGB5_A1. The driver will keep a 5551 version of the texture object itself, so you can delete the original RGBA data after the upload. The remaining concern is the dithering quality the driver uses when converting the format.

  9. #9
    Senior Member OpenGL Guru zed's Avatar
    Join Date
    Nov 2010
    Posts
    2,466

    Re: 16-bit textures...

    Kilam was right, this ain't allowed:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGB5_A1, GL_UNSIGNED_BYTE, pData);

    use this instead:
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, 128, 128, 0, GL_RGBA, GL_UNSIGNED_BYTE, pData);

    only four things can go in the image format: GL_RGB, GL_RGBA, GL_LUMINANCE, GL_LUMINANCE_ALPHA

    BTW, I wrote a program yesterday to enumerate all the internal texture formats and see what the driver actually gives you.
    I'll post it up here today: http://members.nbci.com/myBollux
    I've just gotta make a webpage.

  10. #10
    Senior Member OpenGL Guru zed's Avatar
    Join Date
    Nov 2010
    Posts
    2,466

    Re: 16-bit textures...

    Reading your original question again (it helps):
    you might have to alter the data when you load it in, i.e. convert the 16-bit texture data to 24/32-bit before passing it to glTexImage2D(...).
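    That conversion could be sketched like this -- assuming the RAW file stores red in the high five bits and alpha in bit 0, and noting that Expand5551ToRGBA8 is a made-up name, not from the original loader:

```c
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

/* Expand packed 5551 texels into 8-bit RGBA so the result can be
   uploaded with GL_RGBA / GL_UNSIGNED_BYTE. Caller frees the buffer. */
uint8_t* Expand5551ToRGBA8(const uint16_t* src, size_t texels)
{
    uint8_t* dst = malloc(texels * 4);
    if (!dst) return NULL;
    for (size_t i = 0; i < texels; ++i) {
        uint16_t p = src[i];
        uint8_t r = (p >> 11) & 0x1F;
        uint8_t g = (p >> 6)  & 0x1F;
        uint8_t b = (p >> 1)  & 0x1F;
        /* Replicate the top bits to spread 5-bit values over 0..255. */
        dst[i * 4 + 0] = (uint8_t)((r << 3) | (r >> 2));
        dst[i * 4 + 1] = (uint8_t)((g << 3) | (g >> 2));
        dst[i * 4 + 2] = (uint8_t)((b << 3) | (b >> 2));
        dst[i * 4 + 3] = (p & 1) ? 255 : 0;
    }
    return dst;
}
```

    For a 128x128 texture that would be Expand5551ToRGBA8((const uint16_t*)pData, 128 * 128), followed by the glTexImage2D call with GL_RGBA / GL_UNSIGNED_BYTE.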


