
Thread: Problem with displaying textures

  1. #1

    Question Problem with displaying textures

    Hi all,
    This is my first post here

Even though I've done this multiple times before, with my new engine I have some problems displaying textures...
My entire code is here, in case the error doesn't come from where I expect...

    Well, here is what I did in my fragment shader (the interesting part):

    Code :
layout(location = 8) uniform int textureAmount;
uniform sampler2D tex[8];
void main() {
    int i = 0;
    outColor = lighting();
    while (i < textureAmount) {
        outColor *= texture(tex[i], fUVCoord);
    }
}

    For texture loading I use DevIL, but because nothing was displayed I switched to a hard-coded texture to be sure it's not a loading problem... Here is my code:

    Code :
void GRand::Texture::load() noexcept {
    if (_textureId) { // delete any previous texture before regenerating
        glDeleteTextures(1, &_textureId);
        _textureId = 0;
    }
    glGenTextures(1, &_textureId);
    glBindTexture(GL_TEXTURE_2D, _textureId);
    if (_filename.size() == 0) { // hard-coded texture part....
        std::cout << "\e[33;1mno texture file name, using the default one\e[0m" << std::endl;
        float pixels[] = {
            .7f, .7f, .7f,   .7f, .0f, .0f,
            .7f, .0f, .0f,   .7f, .7f, .7f
        };
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 2, 2, 0, GL_RGB, GL_FLOAT, pixels);
    } else {
        std::cout << "loading file: " << _filename << std::endl;
        if (!ilLoadImage(_filename.c_str())) {
            std::cout << "\e[31;1mfailed to load: " << _filename << "\e[0m" << std::endl;
        }
        ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ilGetInteger(IL_IMAGE_WIDTH), ilGetInteger(IL_IMAGE_HEIGHT), 0, GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());
    }
    GLenum errCode;
    if ((errCode = glGetError()) != GL_NO_ERROR) {
        std::cout << _glErrorToString[errCode - GL_INVALID_ENUM] << std::endl;
    }
    _loaded = true;
}

    my bind function is pretty simple:
    Code :
inline void bind(GLenum textureUnit_) const noexcept {
    glActiveTexture(textureUnit_); // select the unit first; without this every texture lands on whichever unit is currently active
    glBindTexture(GL_TEXTURE_2D, _textureId);
}

    And to finish, here is how I use my textures:

    Code :
void GRand::Material::use() const noexcept {
    unsigned int i = 0;
    for (decltype(_textures)::value_type t : _textures) { // _textures is a std::vector<const Texture*>
        t->bind(GL_TEXTURE0 + i);
        glUniform1i(glGetUniformLocation(_shaderProgram, _StexStringArray_[i]), i); // each sampler gets its own unit index
        ++i;
    }
    _uTextureAmount.upload(); // _uTextureAmount mirrors the textureAmount uniform in my fragment shader; upload it to synchronize
}

    I don't think this is a UV-mapping problem, because when I put this in my fragment shader:
    outColor = vec4(fUVCoord, 0, 1);
    I can see my UV map rendered as expected:

    If you want to download the code to test things, note that it only works on Linux right now and compiles only with Clang. The only dependencies are libjpeg, libpng, libtiff, libGLU, libXrandr, libXi, libGL, libpthread, libX11 and libXxf86vm; I've included the rest in the repo. (Note: you will need SSE, SSE2 and SSE3.)

  2. #2
    Senior Member · Frequent Contributor · Join Date: Mar 2009 · Karachi, Pakistan
    There is an infinite loop in your fragment shader: `i` is never incremented. Increment i inside the while loop, or do something like this
    Code :
    while(i++ < textureAmount) { ... }

    Other than that, I would suggest starting with a single texture. See if you get something with this simple fragment shader
    Code :
void main() {
    outColor = texture(tex[0], fUVCoord);
}

    Quote Originally Posted by NeWinn
    Hi all,
    Code :
    layout(location = 8) uniform int textureAmount;
    uniform sampler2D tex[8];
    void main() {
        int i = 0;
        outColor = lighting();
        while (i < textureAmount) {
            outColor *= texture(tex[i], fUVCoord);
        }
    }

  3. #3
    Thanks for replying, mobeen.

    I've already tried "outColor = texture(tex[0], fUVCoord);" in my fragment shader, and it makes my Suzanne all black.
    To be honest, I've tried a lot of things in my shader to figure out where this bug comes from, and that's when I introduced this (really dumb) infinite loop

  4. #4
    After 3 weeks with this bug, I've found the cause.

    It wasn't in the code I posted before. It was actually my architecture itself.
    I do all graphics work in a dedicated thread, but my texture code was not executed in that thread, so it tried to load and generate textures before my window (and GL context) was even created...

    Sorry for this dumb question =/

