
Thread: Depth Buffer and hardware problems

  1. #1
    Junior Member Newbie
    Join Date
    Mar 2000
    Location
    bellevue, wa, usa
    Posts
    2

    Depth Buffer and hardware problems

    I don't know what I'm doing wrong, but it sure must be big. When I render polys to the screen, the depth buffer seems to screw up: sections of polys appear behind others when they should be in front, and vice versa. I've tried running in software emulation mode, and while it was painfully slow, there were no jaggies or misplaced poly sections (odd that it works in software yet not in hardware). I've also run it on various computers with different graphics cards (with the latest drivers), and the depth buffering looks horrible on all of them. When I disable the depth buffer, rendering is fine (granted, nothing is depth buffered, but there are no more jaggies or sections of polys appearing in front of others; it's all or nothing). So my question is: what am I doing wrong, and how can I fix this? I'm out of control!

    Oh yeah, one more thing: it works on one computer in hardware acceleration mode, but they did some funky stuff to that machine, and I don't think that should be necessary to get depth buffering working, especially considering I've run other GL apps on the other computers and they work just fine.

    Thanks for your help and time,
    Tim

  2. #2
    Senior Member Regular Contributor
    Join Date
    Feb 2000
    Location
    milano, italy
    Posts
    282

    Re: Depth Buffer and hardware problems

    Seems like a precision/distribution problem.

    The Microsoft GDI generic (software) implementation uses a depth buffer with 32 bits of precision.
    The NVIDIA TNT card uses 24 bits instead; this can cause some z-fighting.
    I ran test applications on the Matrox G200 card, which can use either a 24-bit or a 32-bit depth buffer, and with the lower setting a lot of z-fighting occurred.

    A basic rule with depth buffering is to push the near clipping plane as far from the eye as you can, and to keep the far/near distance ratio as small as possible.

    This optimizes the distribution of depth values across the buffer: depth precision is not linear, and most of it is concentrated close to the near plane.

    Try adjusting the near and far clipping plane distances that you pass to gluPerspective().
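    To see why the near plane matters so much, here is a small sketch (not from the original thread; the scene distances are hypothetical) that evaluates the standard perspective depth mapping, d = f·(z − n) / (z·(f − n)), for two choices of near plane. It shows how a too-close near plane burns almost the whole depth range on the first few units of the scene:

    ```c
    #include <stdio.h>

    /* Window-space depth produced by a standard perspective projection
       (the kind gluPerspective sets up): d = f*(z - n) / (z * (f - n)),
       where z is the positive eye-space distance, n the near plane,
       and f the far plane. d runs from 0 at z = n to 1 at z = f. */
    static double window_depth(double z, double n, double f)
    {
        return f * (z - n) / (z * (f - n));
    }

    int main(void)
    {
        /* Hypothetical scene: geometry lives between 10 and 1000 units away. */
        double z = 10.0;

        double d_tiny_near = window_depth(z, 0.1, 1000.0); /* near plane too close */
        double d_sane_near = window_depth(z, 1.0, 1000.0); /* near plane pushed back */

        printf("near=0.1: depth at z=10 is %f\n", d_tiny_near);
        printf("near=1.0: depth at z=10 is %f\n", d_sane_near);

        /* With near = 0.1, about 99%% of the depth range is already used up
           by the time we reach z = 10; with near = 1.0 it is about 90%%,
           leaving noticeably more precision for the rest of the scene. */
        return 0;
    }
    ```

    With a 16- or 24-bit depth buffer, the difference between those two distributions is exactly the kind of thing that shows up as z-fighting in hardware but not in a 32-bit software depth buffer.
    
    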

    Dolo/\/\ightY

  3. #3
    Junior Member Newbie
    Join Date
    Mar 2000
    Location
    bellevue, wa, usa
    Posts
    2

    Re: Depth Buffer and hardware problems

    Most excellent! That was the problem exactly. The near clip plane fixed it all; I just needed to move it back a bit. You rule! Thanks for your help, I was quite worried about how I was going to fix that.

