I found the secret of OpenGL textures. Am I right?

I have found that many images cannot be used as textures correctly. So I tried and tried, and found a tip:

It is the following:
The width and height of the image must each be a power of 2, such as 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, or 1024. If it is larger than 1024, OpenGL 1.1 will go wrong.
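
For example, a quick way to check a size in code (just a small sketch; the helper names are my own, not from any library):

// Check whether a size is a power of two, and round an
// arbitrary size up to the next power of two.
int isPowerOfTwo(int x)
{
    return (x > 0) && ((x & (x - 1)) == 0);
}

int nextPowerOfTwo(int x)
{
    int p = 1;
    while (p < x)
        p <<= 1;
    return p;
}

// e.g. isPowerOfTwo(100) == 0, nextPowerOfTwo(100) == 128

So an image like 100x50 would have to be scaled or padded up to something like 128x64 before you use it as a texture.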

I hope my find will help you, friends.

Chinese people believe in friendship. If we all share the tips we find instead of hiding them deep in our hearts, the road to becoming an OpenGL expert will be broad and straight.

Yes, you are right, they must be a power of 2.

I’m sorry to say but… That’s written everywhere!


>>That’s written everywhere!

[In the tone of the pet detective, Ace Ventura:] R-e-a-l-l-y~~~~~~?!

Well, but… er… that is so very… awkward~~ It embarrasses me so much. Where is there a crack in the ground? Show me and I will crawl into it…

OK, maybe somebody will say: 1,300,000,000 people will be ashamed of your stupid action. ;(

I just made a quick search in the OpenGL specification document. There is no place I can find where it says a texture’s dimensions have to be a power of two. I think this is a driver limitation, and not something OpenGL requires. I have heard NVidia is planning support for non-power-of-two sizes, but I think that should be classified as a rumor, i.e. it might very well be wrong.

So, theoretically, textures CAN have any size.

Better look again, Bob, it is written in the spec. I just checked. It clearly states that each texture dimension will be of the form 2^n + 2*b, where 2^n is the extent of the source image in a given dimension and b is the number of texture border pixels. As most people don’t use texture borders, that leaves just the requirement that the image be 2^n. This means a 2D texture will be 2^n x 2^m. However, I also vaguely remember hearing about arbitrary texture sizes being supported by an extension.


>1024 won’t necessarily end up in an error. The maximum texture size in either direction is driver dependent, and you can get it with glGetIntegerv, I think.
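
Something like this should do it (a small sketch, assuming you already have a current GL context):

// Ask the driver for the largest texture dimension it supports.
GLint maxSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
// maxSize is now e.g. 256, 1024 or 2048, depending on the card/driver.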

DFrey: Ok, whatever. As I said, it was just a quick search. I did read the section about glTexImage2D, where I expected to find it, but couldn’t see anything about it there.

By the way, in the MSVC6 documentation it says it has to be a power of two. But that documentation is not the very best you can find, and I don’t completely trust it as a source of facts (the OpenGL part, that is). I was reading the official 1.2.1 specification, the .PDF you can download from this site.

And in case you are right (which I believe you are), arbitrary texture sizes as an extension sounds reasonable.

Yes, that’s the very pdf file I was reading too. Page 118 (pg 130 of pdf file). I pretty much ignore the MSDN OpenGL documentation. It has caused me one too many headaches.


Code snip, might help …

// Ensure that the texture can be accommodated by the
// hardware accelerator
GLsizei width = m_ImageData->width();
GLsizei height = m_ImageData->height();

glTexImage2D(
    GL_PROXY_TEXTURE_2D,
    0,
    m_ImageData->dataFormat(),
    width,
    height,
    0,
    m_ImageData->dataFormat(),
    m_ImageData->dataType(),
    NULL
);

// If the proxy allocation failed, the queried width comes back as 0.
glGetTexLevelParameteriv(
    GL_PROXY_TEXTURE_2D,
    0,
    GL_TEXTURE_WIDTH,
    &width
);

if (width == 0)
{
    MString msg(
        "Texture size is larger than that supported by ",
        "hardware accelerator or texture is sized improperly."
    );
    msg << "Texture name was " << name();
    throw MOglException(msg, M_TRACEPOINT);
}
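
A note on the snippet: the GL_PROXY_TEXTURE_2D call is a "dry run"; it only checks whether the driver can accommodate a texture of that size and format, and no texture data is created. Once the check passes, you still make the real glTexImage2D call with GL_TEXTURE_2D and the actual pixel data to upload the image.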

DFrey: Ok, I’ve seen it now too.

I don’t care what you argue; the only thing that matters is: if you don’t use sizes that are powers of 2, you will not get a correct result.

Maybe you like talking about the things above, but I think it is a waste of time. Please shed more light on the useful techniques, OK?

It seems that I started a little discussion here…

Well, I think I’m a bit of a newbie to OpenGL and I had already read that many times… that’s why I posted that message…

Anyway, it was NOT my intention to make fun of you, Suvcon.
Sorry!

It is in fact very, very common to be fighting with texture mapping, trying to load your 100x50 texture, just to find out hours, days, etc. later that textures must be a power of two in size…

That’s some kind of programmer syndrome: start coding without first reading the documentation (I suffer from this chronically).

Bob:
By the way, in the MSVC6 documentation it says it has to be a power of two.

Uh, really???

from the VC6 help:

width
The width of the texture image. Must be 2n + 2(border) for some integer n.

height
The height of the texture image. Must be 2m + 2(border) for some integer m.

You can check it in the online MSDN library:
glTexImage2D

==========================================
Never mind. I understand what you mean now.

I am a fresh hand at OpenGL too; maybe we can teach each other. You are welcome!
my mailbox: suvcon_cn@sina.com

This is what my documentation says about glTexImage2D. I looked at the helpfile shipped with MSVC6.

[i]
width
The width of the texture image. Must be 2^n + 2(border) for some integer n.

height
The height of the texture image. Must be 2^m + 2(border) for some integer m.
[/i]

But there you can see why I don’t trust these documents very much. Different statements about the same thing, in different places, from the same company.

Bob:

2^n or 2n… that is the question… but just try it in code. You cannot use a 100x50 texture directly, but 64x128 works,
so I think it should be 2^n.

Practice gives us the truth.
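
If you want to see it for yourself, here is a quick test (just a sketch; it assumes a current GL context and the usual <GL/gl.h> and <stdio.h> includes; under plain OpenGL 1.1 a non-power-of-two width or height should give GL_INVALID_VALUE):

GLubyte dummy[128 * 128 * 3];   // big enough for either test image

// 100x50: not powers of two, expect GL_INVALID_VALUE (0x0501)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 100, 50, 0,
             GL_RGB, GL_UNSIGNED_BYTE, dummy);
printf("100x50 -> error 0x%x\n", glGetError());

// 64x128: powers of two, expect GL_NO_ERROR (0)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 64, 128, 0,
             GL_RGB, GL_UNSIGNED_BYTE, dummy);
printf("64x128 -> error 0x%x\n", glGetError());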

>>Practice gives us the truth.

Oh, that is so correct

Anyway, whether it’s supposed to be 2^n or 2*n is not really a question. 2^n is definitely the correct one. I have known textures have to be a power of two for a long time (no offense if you misunderstood it).

But now that you know, I hope you never forget it, which I have done a few times, because it can cause some pain in the a$$ and a waste of time.

Ok, from this power-of-2 business I’m taking it that textures must be

2, 4, 8, 16, 32, 64, 128, 256… and so on

in both directions, but not necessarily the same in both x and y.

Ok, I’ve found, at least with Nvidia hardware, that the texture can have sizes of

2^n + 2^m

giving me texture sizes of

say 6x12 (2+4),(8+4)
or 80x96 (64+16),(64+32)

I’m not sure what the deal here is, but I was always under the impression that textures did have to be 2^n. Whereas a texture of size 7x13 will fail; as you can see, you can’t write those figures as a sum of two powers of two no matter how you try.

[b]

Ok, I’ve found, at least with Nvidia hardware, that the texture can have sizes of

2^n + 2^m

giving me texture sizes of

say 6x12 (2+4),(8+4)
or 80x96 (64+16),(64+32)

[/b]
Are you certain of that? I’ve never noticed that. That would also seem to require an extension, since the OpenGL 1.2 spec clearly indicates that the dimensions of a 2D texture will be 2^n+2b x 2^m+2b. 2*b is just the size added due to border texels. That leaves the requirement that the source image be 2^n x 2^m.