
Thread: glTranslate - wrong behaviour?

  1. #1
    Junior Member Newbie
    Join Date
    Sep 2008
    Posts
    12

    glTranslate - wrong behaviour?

    I stumbled upon a problem with how calls to glTranslate(...) are handled
    on NVIDIA graphics cards.
    It seems to be a bug/feature tied directly to NVIDIA adapters,
    since I could not reproduce the error on ATI or Intel hardware in any way.
    I am not sure whether this is the right place for such a matter, so it would be
    nice if somebody could confirm that or point me elsewhere.
    (It takes some effort to explain the problem thoroughly, so I would rather not
    spread the issue here if it's clearly the wrong place.
    I tried the NVIDIA forums, but those don't seem to be the best place either.)

    Thanks for your time

  2. #2
    Super Moderator OpenGL Lord
    Join Date
    Dec 2003
    Location
    Grenoble - France
    Posts
    5,574

    Re: glTranslate - wrong behaviour?

    This is the first time I have heard of an implementation problem with glTranslate.
    All previous problems were due to error(s) between the keyboard and the chair

  3. #3
    Senior Member OpenGL Pro Zengar's Avatar
    Join Date
    Sep 2001
    Location
    Germany
    Posts
    1,931

    Re: glTranslate - wrong behaviour?

    I very much doubt that glTranslate itself would cause any problems; that part of the driver probably hasn't changed in the last five years. You can submit bugs to NVIDIA through the NVIDIA developer website. Anyway, if you describe your problem here, we will probably be able to find a solution.

  4. #4
    Junior Member Newbie
    Join Date
    Sep 2008
    Posts
    12

    Re: glTranslate - wrong behaviour?

    Well, it is not exactly a problem with glTranslate itself; it seems to be a problem with how the card reacts to it.
    So let's give it a try.

    I am not a native speaker, so if something sounds fishy, consider that it may be a communication problem.

    I ran into this effect while drawing a "data table" of two-dimensionally arranged fields in the x-z plane, using JOGL.
    To sum it up first:
    calling glTranslate(1.0, 0.0, 0.0) 10,000 times, followed
    by glTranslate(-10000.0, 0.0, 0.0), does not place the "focus" back at the center (at least not on any of my NVIDIA cards).

    My first thoughts were "unclean matrices" or floating-point errors, e.g. doing a rotation and then not being able to do 100% accurate translations afterwards because of non-representable values.
    So I ran a few tests using clean matrices and values that are exactly representable in IEEE 754.
    The error remained.
    I was initially developing on a GeForce 7600 GS with a very old driver from 2006; unfortunately, updating the drivers did not change the behaviour.
    After that I tried switching hardware, which partially "solved" the problem.
    I tried a few ATI cards (about 4 or 5; I don't remember the exact hardware but can get the info if necessary) and some Intel chips (i855, i945), plus some more NVIDIA cards (6200, GeForce Go, ...). The effect occurred on all the NVIDIA cards.

    So what did I do?
    To keep things clean I took "lesson 4" from NeHe and put some very basic code into it:
    1) glTranslate(0.0, 0.0, -4.0) to get a better view of the center.
    2) Draw a simple object (glutSolidSphere, ...) at the center.
    3) Repeat a translation along, e.g., the x-axis with a step size of 1.0 for 10,000 or more repetitions.
    4) Translate "back" in one step.
    5) Draw the same object again.
    To be sure, I implemented this both in Java using JOGL and in C++,
    with the same results.
    On all NVIDIA cards I have tested so far, the second object is not placed at the center. It drifts along the translation axis, and the drift increases as the distance of the back-and-forth translation increases.
    Additionally, this effect is directly influenced by the viewport:
    resizing or stretching the window (which triggers "reshape" followed by a reset of the viewport) makes the second object "jump" along the axis.
    On all ATI and Intel cards this does not happen;
    both objects are placed exactly at the center, no matter how far the back-and-forth translation goes.
    I should mention that the effect only occurs if the two translations are done with a different number of steps:
    glTranslate(10000, ...) followed by glTranslate(-10000, ...)
    results in correct placement.
    (The same holds if the back translation is done with as many steps as the forth translation.)
    I made a little test program that shows the drifting object and the modelview matrix, which appears to be correct.
    One can clearly see the effects of the translation, and the matrix is exactly the same right before drawing either of the two objects.

    It doesn't look like I can upload any code here, so maybe I can get hold of some webspace so you can re-check.
    That is, unless there is a simple one-line explanation for this matter and for how it will be solved.


  5. #5
    V-man
    Guest

    Re: glTranslate - wrong behaviour?

    calling glTranslate(1.0, 0.0, 0.0) 10,000 times, followed
    by glTranslate(-10000.0, 0.0, 0.0), does not place the "focus" back at the center (at least not on any of my NVIDIA cards)
    I would say that's normal.
    When you call glRotate, glTranslate, and the other matrix operations, the driver uses the FPU and SSE to do the matrix computation. The SSE unit probably introduces more precision issues than the FPU.

    If you want to avoid that, use glPushMatrix() and glPopMatrix(), or do your own matrix math and upload it with glLoadMatrixf()

  6. #6
    Super Moderator OpenGL Lord
    Join Date
    Dec 2003
    Location
    Grenoble - France
    Posts
    5,574

    Re: glTranslate - wrong behaviour?

    I think I recall there is an option in the NVIDIA Control Panel to enable/disable the use of CPU-optimized instructions. You might want to check.

  7. #7
    Junior Member Newbie
    Join Date
    Sep 2008
    Posts
    12

    Re: glTranslate - wrong behaviour?

    Well, it's not about "avoiding" the problem. (I've already found a few ways, by changing how I translate [e.g. skipping immediate mode].) Basically, this behaviour breaks the rules of group theory, since the inverse element no longer works.
    So if the translations used in OpenGL are provided by some external mathematical system, I cannot rely on "follow the laws of matrices and it works".

    So I am very interested in how to "disable" this effect.
    But it seems I need to catch up a bit on that topic.
    Could you explain a bit more about what the NVIDIA cards do and <the others> don't?

  8. #8
    Senior Member OpenGL Pro Zengar's Avatar
    Join Date
    Sep 2001
    Location
    Germany
    Posts
    1,931

    Re: glTranslate - wrong behaviour?

    Numerical errors will always be present if you use "too many" matrices; computers are not ideal computing devices, as you learn in numerics. I have no idea why it only affects NVIDIA cards; that will be driver-dependent. To avoid this problem you can implement your own matrix tracking that deals with precision errors.

  9. #9
    Junior Member Newbie
    Join Date
    Sep 2008
    Posts
    12

    Re: glTranslate - wrong behaviour?

    I already addressed those errors. I also cannot see any reason why only NVIDIA cards would be affected by numerical errors (i.e. by values that cannot be stored exactly in floating-point types).
    Besides, adding up 1.0 in a loop should not produce numerical errors, since it is a value that is exactly representable in IEEE 754.
    And as I mentioned before, I am not using "too many" matrices.
    There is no rotation or any transformation other than the ones I mentioned in my description.

    I check the matrix three times:
    1) right before drawing the first sphere
    2) right after the translation loop
    3) right before drawing the second sphere
    In states 1) and 3) the modelview matrix is identical; in state 2) there is exactly one different value, namely the loop translation of e.g. 10,000 units (exactly as it should be).
    Those matrices look exactly the same on every graphics card and show no differences, but the rendered image does.
    Additionally, changing the viewport changes the drift of the second object (on NVIDIA cards) but does not affect the matrices.
    So the error seems to lie outside the scope of OpenGL:
    the matrix says the sphere was drawn at the center,
    but the image proves otherwise.
    This would imply a distortion on the graphics card itself, as if NVIDIA cards were unable to add up those numbers while every other vendor's card can.

    Maybe you could try this yourself, so we can be sure we are talking about the same thing.
    I uploaded my C++ test program (yay, the free webspace from my ISP is finally useful):
    http://www.muckelzwerg.de/test/
    (Use '+' and '-' to increase and decrease the translation distance; it starts at 0.)


  10. #10
    Senior Member Regular Contributor CatDog's Avatar
    Join Date
    Mar 2006
    Location
    Germany
    Posts
    226

    Re: glTranslate - wrong behaviour?

    Tested on GF7950GX2: the sphere makes some strange movements and leaves the view at around 23000.

    CatDog


