
Thread: vertical retrace, linux, OpenGL, SGI230, nvidia

  1. #1
    Junior Member Newbie
    Join Date
    Oct 2000
    Location
    Vic, Australia
    Posts
    3

    vertical retrace, linux, OpenGL, SGI230, nvidia

    Hi everyone,
    I am a second-year Ph.D. student studying visual perception at Deakin
    University, Melbourne, Australia. I am investigating, through a series of
    computer-based and applied experiments, how people learn to judge
    time-to-contact with approaching objects; for example, how people learn
    to time a successful catch of an approaching tennis ball.

    Over the last year I have taught myself (there is no one at Deakin
    University familiar with OpenGL) how to program in C and create the
    stimuli for my experiments using the OpenGL API. I have written a simple
    program that displays a sphere translating along the z-axis towards the
    observer (camera) for a given number of frames. Each frame is
    synchronized to the vertical refresh using export GL_SYNC_TO_VBLANK=1.

    My problem is this: the experiments I am running are extremely
    time-critical. I need to know exactly how long the sphere translates for,
    so that I can calculate its velocity and determine exactly when it would
    "collide" with the observer.

    I have been told by users of other graphics APIs and operating systems
    that the best way to get accurate timing is to use the vertical retrace:
    that is, time the animation by counting vertical retraces, so that in my
    sphere example, if the retrace runs at 60 Hz and I want the ball to
    translate for one second, I write the translation to run for 60
    frames/retraces.
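
    A minimal sketch of that frame-counting idea, assuming buffer swaps are
    synchronized to the vertical retrace (sync-to-vblank enabled) so that
    each swap returns once per refresh; step_sphere() and draw_sphere() are
    hypothetical placeholders for the existing animation code:

        #include <GL/glx.h>

        extern void step_sphere(double dt);  /* hypothetical: advance the sphere     */
        extern void draw_sphere(void);       /* hypothetical: render the current frame */

        void animate_for_one_second(Display *dpy, GLXDrawable win)
        {
            const int refresh_hz = 60;       /* nominal refresh rate                  */
            int frame;

            for (frame = 0; frame < refresh_hz; frame++) {  /* 60 frames ~ 1 second   */
                step_sphere(1.0 / refresh_hz);  /* one frame's worth of motion        */
                draw_sphere();
                glXSwapBuffers(dpy, win);    /* blocks until vblank when sync is on   */
            }
        }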

    Would anyone have any code or information they could share about how to
    determine the actual refresh rate while my program is running, or how to
    specify a particular frame rate (glXSwapIntervalSGI() does not appear to
    be supported)? And/or how to time the translation of the sphere so that
    it is synchronized with VBLANK and swaps buffers for a given number of
    retraces?

    I am using an SGI 230 workstation (Intel processor), Red Hat Linux 6.2,
    and NVIDIA's GeForce 256 DDR. None of the OpenGL extensions [e.g.,
    glXWaitVideoSyncSGI(), GLX_SGI_swap_control] that might have addressed
    this issue appear to be supported.

    Any replies would be greatly appreciated, as I'm really stuck on this
    one.
    Simon


    PS: sorry about my naive description of vertical retrace issues; I'm
    really a novice at this stuff.

  2. #2
    Senior Member Regular Contributor
    Join Date
    Sep 2000
    Location
    Vancouver BC Canada
    Posts
    417

    Re: vertical retrace, linux, OpenGL, SGI230, nvidia

    Sorry, I don't know anything about vertical retrace, but I do know that one way to time things nicely is to use, well, a timer, and the SDL library has a very nice one. Check out www.libsdl.org .
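
    For example, a minimal sketch using SDL's millisecond timer
    (SDL_GetTicks() returns milliseconds elapsed since SDL_Init()); the
    SDL_Delay() call just stands in for the real animation loop:

        #include <stdio.h>
        #include "SDL.h"

        int main(void)
        {
            Uint32 start, elapsed;

            if (SDL_Init(SDL_INIT_TIMER) < 0)
                return 1;

            start = SDL_GetTicks();

            SDL_Delay(1000);                    /* stand-in for the animation loop */

            elapsed = SDL_GetTicks() - start;   /* elapsed wall-clock time in ms   */
            printf("elapsed: %u ms\n", (unsigned)elapsed);

            SDL_Quit();
            return 0;
        }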

    Meantime I'll try and figure out vsync in X.

  3. #3
    Junior Member Newbie
    Join Date
    Feb 2001
    Location
    Chapel Hill, NC, USA
    Posts
    4

    Re: vertical retrace, linux, OpenGL, SGI230, nvidia

    If you are using GLUT, you can just use glutGet(GLUT_ELAPSED_TIME) to find the current time in milliseconds since the program started.
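
    A rough sketch of that approach; anim_start and the display callback are
    illustrative fragments rather than a complete program:

        #include <GL/glut.h>

        static int anim_start;                        /* ms since glutInit()       */

        static void start_animation(void)
        {
            anim_start = glutGet(GLUT_ELAPSED_TIME);  /* record when motion begins */
        }

        static void display(void)
        {
            int elapsed = glutGet(GLUT_ELAPSED_TIME) - anim_start;  /* ms so far   */
            /* position the sphere from 'elapsed' rather than from a frame count  */
        }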

    I found that the proprietary NVIDIA GLX for Linux has the function glXGetVideoSyncSGI(), which returns the video frame count, incremented at each vblank. It also has glXWaitVideoSyncSGI(), but it hogs the CPU until the vblank is reached. Not surprising, since these are not advertised in the GLX extensions string and are not really ready yet.
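
    A hedged sketch of counting retraces with those two calls; the prototypes
    below are the usual GLX_SGI_video_sync ones, and since the extension is
    not advertised you may have to declare them yourself and rely on the
    NVIDIA libGL exporting the symbols (an assumption):

        #include <GL/glx.h>

        extern int glXGetVideoSyncSGI(unsigned int *count);
        extern int glXWaitVideoSyncSGI(int divisor, int remainder,
                                       unsigned int *count);

        /* Number of vertical retraces elapsed since 'start' was sampled. */
        unsigned int retraces_since(unsigned int start)
        {
            unsigned int now;
            glXGetVideoSyncSGI(&now);
            return now - start;
        }

        /* Blocks (busy-waits, as noted above) until the next retrace. */
        void wait_for_next_retrace(void)
        {
            unsigned int count;
            glXWaitVideoSyncSGI(1, 0, &count);
        }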

    Joe Krahn

  4. #4
    Guest

    Re: vertical retrace, linux, OpenGL, SGI230, nvidia

    First of all, I don't think using the vertical retrace is the best way to
    implement timed code, because the retrace rate depends on your video
    mode, your monitor, and your video board.
    There are plenty of ways to implement real timing functions.


    Get the current time with the appropriate ANSI C function, or use an
    equivalent such as:
    - SDL_GetTicks();
    - glutGet(GLUT_ELAPSED_TIME), as joekrahn said.

    Or use a timer callback function (see the sketch below):
    - glutTimerFunc(number of ms before the call, function to call, value to pass to the function).
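
    A minimal sketch of the glutTimerFunc() approach; the timer is one-shot,
    so the callback re-registers itself to keep a steady tick (16 ms is
    roughly 60 Hz and is just an illustrative value):

        #include <GL/glut.h>

        static void tick(int value)
        {
            /* advance the sphere here, then ask GLUT to redraw */
            glutPostRedisplay();
            glutTimerFunc(16, tick, value);   /* re-arm the one-shot timer */
        }

        /* somewhere after glutInit() and window creation: */
        /*     glutTimerFunc(16, tick, 0);                  */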

  5. #5
    Senior Member Regular Contributor
    Join Date
    Feb 2001
    Location
    Montréal, QC, Canada
    Posts
    304

    Re: vertical retrace, linux, OpenGL, SGI230, nvidia

    The vertical sync is never exactly 60 Hz.
    If you need high accuracy, this won't work properly.

    You should use a high-accuracy interpolated timer, such as the one in the Linux kernel: it interpolates time between mainboard timer interrupts quite well.
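
    A sketch of one common way to read that clock from user space, via
    gettimeofday() (an assumption about what the post refers to; it returns
    the kernel's interpolated, microsecond-resolution time):

        #include <stdio.h>
        #include <sys/time.h>

        /* Current time in seconds, with microsecond resolution. */
        static double now_seconds(void)
        {
            struct timeval tv;
            gettimeofday(&tv, NULL);
            return tv.tv_sec + tv.tv_usec / 1000000.0;
        }

        int main(void)
        {
            double start = now_seconds();
            /* ... run the animation ... */
            printf("elapsed: %f s\n", now_seconds() - start);
            return 0;
        }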

