Thread: glGetError() issue

  1. #1
    Junior Member Newbie
    Join Date
    Jun 2016
    Posts
    9

    glGetError() issue

    I have written a function that evaluates the result of glGetError() and prints the corresponding message:
    Code :
    #include <iostream>
    #include <cstdlib>	// for system()
    using std::cout;
    using std::endl;
    
    void error(GLenum e)
    {
    	switch (e)
    	{
    	case GL_INVALID_ENUM:
    		cout << "Error: GL_INVALID_ENUM" << endl;
    		system("pause");
    		break;
    	case GL_INVALID_VALUE:
    		cout << "Error: GL_INVALID_VALUE" << endl;
    		system("pause");
    		break;
    	case GL_INVALID_OPERATION:
    		cout << "Error: GL_INVALID_OPERATION" << endl;
    		system("pause");
    		break;
    	case GL_INVALID_FRAMEBUFFER_OPERATION:
    		cout << "Error: GL_INVALID_FRAMEBUFFER_OPERATION" << endl;
    		system("pause");
    		break;
    	case GL_OUT_OF_MEMORY:
    		cout << "Error: GL_OUT_OF_MEMORY" << endl;
    		system("pause");
    		break;
    	case GL_NO_ERROR:
    		cout << "No error reported" << endl;
    		break;
    	default:
    		cout << "Unknown error" << endl;
    		break;
    	}
    }
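    A variant of the function above that returns the message instead of printing it can be unit-tested without a GL context. This is only a sketch: the enum is defined locally (the hex values are the standard ones from gl.h) so the snippet compiles without GL headers, and `errorString` is a hypothetical name, not part of any GL API.

    ```cpp
    #include <cassert>
    #include <string>

    // Standard OpenGL error-code values (as defined in gl.h), declared
    // locally so this compiles without GL headers.
    enum : unsigned {
        ERR_NO_ERROR                      = 0,
        ERR_INVALID_ENUM                  = 0x0500,
        ERR_INVALID_VALUE                 = 0x0501,
        ERR_INVALID_OPERATION             = 0x0502,
        ERR_OUT_OF_MEMORY                 = 0x0505,
        ERR_INVALID_FRAMEBUFFER_OPERATION = 0x0506,
    };

    // Pure mapping from error code to message; the caller decides how to
    // report it (print, log, pause, ...), which keeps the mapping testable.
    std::string errorString(unsigned e)
    {
        switch (e) {
        case ERR_NO_ERROR:                      return "No error reported";
        case ERR_INVALID_ENUM:                  return "Error: GL_INVALID_ENUM";
        case ERR_INVALID_VALUE:                 return "Error: GL_INVALID_VALUE";
        case ERR_INVALID_OPERATION:             return "Error: GL_INVALID_OPERATION";
        case ERR_INVALID_FRAMEBUFFER_OPERATION: return "Error: GL_INVALID_FRAMEBUFFER_OPERATION";
        case ERR_OUT_OF_MEMORY:                 return "Error: GL_OUT_OF_MEMORY";
        default:                                return "Unknown error";
        }
    }
    ```

    Separating the string lookup from the reporting also means the same function can feed a debug log instead of cout.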

    However, EVERY time I call it:
    Code :
    error(glGetError());
    It outputs "Error: GL_INVALID_ENUM", even if the program works correctly.

    Here is a very simple program that draws a blue rectangle, and it works like a charm; however, adding a call to error(glGetError()) prints "Error: GL_INVALID_ENUM".

    What is going on here?

  2. #2
    Senior Member Regular Contributor
    Join Date
    May 2016
    Posts
    477
    I assume you get that error while setting up the shader program:
    https://www.khronos.org/opengles/sdk...derInfoLog.xml
    Try passing a regular variable as the length return value:
    Code :
    GLsizei loglength;
    char vsLog[512] = { 0 };
    glGetShaderInfoLog(vs, 512, &loglength, vsLog);

    GL_INVALID_ENUM essentially means you have passed an invalid enumerated value to a GL function.

    I'm using a similar function:
    Code :
    void CheckForGLError()
    {
    	GLenum error;
    	while ((error = glGetError()) != GL_NO_ERROR)
    	{
    		std::cout << "ERROR: \t";
    		if (error == GL_INVALID_ENUM)
    			std::cout << "GL_INVALID_ENUM";
    		if (error == GL_INVALID_VALUE)
    			std::cout << "GL_INVALID_VALUE";
    		if (error == GL_INVALID_OPERATION)
    			std::cout << "GL_INVALID_OPERATION";
    		if (error == GL_INVALID_FRAMEBUFFER_OPERATION)
    			std::cout << "GL_INVALID_FRAMEBUFFER_OPERATION";
    		if (error == GL_OUT_OF_MEMORY)
    			std::cout << "GL_OUT_OF_MEMORY";
    		if (error == GL_STACK_UNDERFLOW)
    			std::cout << "GL_STACK_UNDERFLOW";
    		if (error == GL_STACK_OVERFLOW)
    			std::cout << "GL_STACK_OVERFLOW";
    		std::cout << (char)7 << std::endl;		/* ASCII BEL: beep */
    		std::cin.get();
    	}
    }

    You can do the same info-log query for your program object (to check for any link errors):
    https://www.khronos.org/opengles/sdk...ramInfoLog.xml
    Code :
    std::string ProgramInfoLog(unsigned int program)
    {
    	if (glIsProgram(program))
    	{
    		GLsizei logsize;
    		char infolog[1024] = { 0 };
    		glGetProgramInfoLog(program, 1024, &logsize, infolog);
     
    		return std::string(infolog);
    	}
     
    	return "invalid program";
    }

  3. #3
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    3,101
    Quote Originally Posted by syntax_error
    Here is a very simple program that draws a blue rectangle and it works like a charm, however adding a call to error(glGetError()) prints "Error: GL_INVALID_ENUM".

    What is going on here?
    glewInit() will generate this error if used with a core profile context. It calls glGetString(GL_EXTENSIONS) to get the extension list, but this will generate GL_INVALID_ENUM with the core profile, where glGetStringi() must be used instead. This is why you need to set glewExperimental when using a core profile context: detecting extensions fails and it assumes that no extensions are supported.

    In order to check whether a particular section of code generated an error, it is necessary to clear any outstanding errors beforehand. Otherwise, you can't distinguish between errors generated by that code and errors generated prior to it.

    This requires calling glGetError() in a loop until it returns GL_NO_ERROR. OpenGL implementations can maintain multiple error flags, and each call to glGetError() will report (and clear) at most one error flag, only returning GL_NO_ERROR when all error flags are clear.
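    The one-flag-per-call behaviour can be sketched with a stub; the queue and fakeGetError below are hypothetical stand-ins for driver state and glGetError(), purely to illustrate why a single call is not enough to clear everything:

    ```cpp
    #include <cassert>
    #include <queue>

    static const unsigned SIM_NO_ERROR = 0;
    static std::queue<unsigned> g_errorFlags;   // stand-in for the driver's error flags

    // Stand-in for glGetError(): reports (and clears) at most one flag per call.
    unsigned fakeGetError()
    {
        if (g_errorFlags.empty())
            return SIM_NO_ERROR;
        unsigned e = g_errorFlags.front();
        g_errorFlags.pop();
        return e;
    }

    // Drain every pending flag; returns how many errors were discarded.
    // This is the loop you need before checking a specific code section.
    int clearErrors()
    {
        int drained = 0;
        while (fakeGetError() != SIM_NO_ERROR)
            ++drained;
        return drained;
    }
    ```

    With a real context you would use the same while-loop shape around glGetError() itself, once before the section under test and once after.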

  4. #4
    Junior Member Newbie
    Join Date
    Jun 2016
    Posts
    9
    Quote Originally Posted by GClements
    This requires calling glGetError() in a loop until it returns GL_NO_ERROR.
    Thanks, it works!

    Code :
    while (glGetError() != GL_NO_ERROR)
        ;	// keep draining until every error flag is clear

