RGB Internal Texture formats in SC standard

I will be implementing the OpenGL SC standard for an Intel graphics card, but the 1.0/1.0.1 SC standards state that the internal format for RGB must use 3 bytes (assuming it means RGB8, with 8 bits per component, i.e. 3 bytes per texel).

The Intel cards only store textures at 1, 2, or 4 bytes per texel. The only RGB layout they support is RGB565 (R5G6B5). When you use RGB8, it is stored as RGBA8.
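For illustration, here is a minimal sketch (plain C, not from the spec or any driver) of the lossy repacking the hardware forces if an 8-bit-per-component RGB image has to land in the 16-bit R5G6B5 layout:

```c
#include <stdint.h>
#include <stddef.h>

/* Pack tightly packed 8-bit RGB texels into 16-bit R5G6B5.
 * Keeps the high bits of each component; no dithering. */
static void rgb888_to_rgb565(const uint8_t *src, uint16_t *dst, size_t texels)
{
    for (size_t i = 0; i < texels; ++i) {
        uint16_t r = src[3 * i + 0] >> 3;  /* top 5 bits of red   */
        uint16_t g = src[3 * i + 1] >> 2;  /* top 6 bits of green */
        uint16_t b = src[3 * i + 2] >> 3;  /* top 5 bits of blue  */
        dst[i] = (uint16_t)((r << 11) | (g << 5) | b);
    }
}
```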

It is fairly common for cards to support only the RGB565 format internally, so I was wondering whether the standard was intentionally going against the grain. For now, I will have to list this as an area where I am not in compliance with the specification, but I suspect many implementations will be unable to comply here due to hardware restrictions.

Also, the 1.0.1 standard still references Table 3.1 for internal formats, but it should really reference Table 3.2, since that table has been renumbered.

Thanks,
Nick

*Note that this is only in the planning phase. I am going to try to make the code compliant, but I am making no claims of compliance. To do that, I would of course have to go through the conformance testing specified on the Khronos website.

You're right; I believe you're pretty much forced to do format conversions to comply with the specification. What you really want to avoid is doing multiple conversions, and sometimes that requires application developers to be aware of the issue. Some hardware supports certain texture compression formats and exposes the additional formats through extensions. Regardless, the optimal texture format tends to be platform-specific.
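As a sketch of the convert-once idea (the names here are hypothetical, not from the SC spec or any driver): decide the stored layout when the texture is specified, so the conversion cost is paid exactly once at upload time rather than repeatedly afterwards.

```c
/* Hypothetical mapping from OpenGL SC internal formats to the layouts
 * this hardware can actually store. The conversion to the chosen layout
 * happens once, when the application specifies the texture. */
typedef enum { HW_RGB565, HW_RGBA8888 } hw_layout;

static hw_layout choose_hw_layout(unsigned internal_format)
{
    switch (internal_format) {
    case 0x1907: /* GL_RGB: no native 3-byte layout, so pick a stored one */
        return HW_RGBA8888;  /* or HW_RGB565 if the precision loss is acceptable */
    case 0x1908: /* GL_RGBA */
        return HW_RGBA8888;
    default:
        return HW_RGBA8888;  /* fallback for this sketch */
    }
}
```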

It seems to me you should be able to use the RGBA8 hardware format to implement the OpenGL SC “RGB” internal format. Can you not just discard the alpha after lookup?
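A minimal sketch of the upload-side expansion that would go with that, assuming tightly packed 8-bit RGB input: pad each texel to RGBA8 with alpha forced to 255, so a lookup always returns alpha = 1.0 and the padding channel can simply be ignored.

```c
#include <stdint.h>
#include <stddef.h>

/* Expand tightly packed RGB8 texels to RGBA8, forcing alpha to 255.
 * The stored alpha is then always 1.0, so sampling behaves like an
 * opaque RGB texture and the extra channel is effectively discarded. */
static void rgb8_to_rgba8(const uint8_t *src, uint8_t *dst, size_t texels)
{
    for (size_t i = 0; i < texels; ++i) {
        dst[4 * i + 0] = src[3 * i + 0];
        dst[4 * i + 1] = src[3 * i + 1];
        dst[4 * i + 2] = src[3 * i + 2];
        dst[4 * i + 3] = 0xFF;  /* opaque: matches GL's implied alpha of 1 */
    }
}
```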