FPS???

Hello!

I’m having trouble figuring out FPS. I do it like this:
I have global variables:
LARGE_INTEGER freq, time;

float framestart,frameend,frametime,fps;

In init I do:
QueryPerformanceFrequency(&freq);
and get the time with this function:
float precisetime ()
{
QueryPerformanceCounter(&time);
return (float)time.QuadPart/(float)freq.QuadPart;
}

In glutIdleFunc I do:
framestart=precisetime();
draw_scene();
frameend=precisetime();
frametime=frameend-framestart;
fps=1/frametime;

Now, why is frametime always 0? (OK, I render only one cube, but hell, even THAT takes some time.) What am I doing wrong?

10x for help!

try fps=1.0f/frametime;

Thanks… but I tried that already - no dice. Like I said, the problem is that frametime is always 0… so how am I supposed to rely on it to calculate moves for cubes etc.? (with moveby=movepersecond*frametime)


Why? Because those 64-bit values need the precision of a double, not a float. When you cast them to a float, you are throwing away up to 29 bits of precision!
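
To see it, here’s a quick self-contained snippet (mine, not from the thread):

#include <stdio.h>

int main(void)
{
float framestart = 3600.0f; /* counter/frequency after an hour of uptime */
float frameend = 3600.0f + 0.0001f; /* one 0.1 ms frame later */
/* a float's 24-bit mantissa can't represent 3600.0001, so frameend
rounds back to 3600.0 and the difference is exactly zero */
printf("%g\n", frameend - framestart); /* prints 0 */
return 0;
}

The fix is to do the division (and the subtraction) in doubles. Something like this (just a sketch, reusing your freq global):

double precisetime()
{
LARGE_INTEGER now;
QueryPerformanceCounter(&now); /* 64-bit tick count */
/* a double keeps a 53-bit mantissa, so two nearby timestamps
stay distinguishable after the division */
return (double)now.QuadPart / (double)freq.QuadPart;
}

Make framestart, frameend and frametime doubles as well; only the final fps needs to stay a float.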


Thank you thank you thank you thank you thank you…I knew I should first get a hang of C++…damn. Really, thanks!

I know why I’m always running on a millisecond base… However, I can’t believe that the loss of precision alone will drop it to zero. When the time gets really small, it will be rounded off to zero by the cast, I think. Shouldn’t that lead to an exception (or does the FPU just ignore that)?
And if not, floats suffice for a really large number of fps. There’s something strange in his precisetime function, I think. Gonna look at my code.

Originally posted by Michael Steinberg:
Shouldn’t that lead to an exception (or does the FPU just ignore that)?

Dividing by zero is ignored because during app initialization (in the glut32.dll module, I think) the divide-by-zero exception is turned off.
It is done because many OpenGL floating-point operations (like x/w) would otherwise cause an exception.
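
You can flip that mask yourself with the MSVC runtime if you’re curious. A sketch (the _controlfp call is plain CRT, nothing GLUT-specific):

#include <float.h>

int main(void)
{
float zero = 0.0f;
/* unmask divide-by-zero: a float division by zero would now
raise a structured exception instead of returning infinity */
_controlfp(0, _EM_ZERODIVIDE);
/* float boom = 1.0f / zero; -- would trap here */
/* mask it again (the default, and apparently what glut32.dll
relies on): the division silently yields +INF */
_controlfp(_EM_ZERODIVIDE, _EM_ZERODIVIDE);
float quiet = 1.0f / zero; /* +INF, no exception */
(void)quiet;
return 0;
}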

I did mine a little differently. For starters, I used GetTickCount() rather than the high-precision counter you guys are using. I see no need for high-precision timing just for a simple FPS counter. GetTickCount() returns a DWORD value in milliseconds. My code is this:

// declared once, outside the render loop:
DWORD currenttime, starttime = GetTickCount();
int frames = 0, fps = 0;

currenttime = GetTickCount();
frames++;
if ((currenttime - starttime) >= 1000) //after 1 second
{
fps = frames; //set fps = # of frames after 1 second
starttime = currenttime; //reset the timer
frames = 0; //set frame counter back to zero
}
…draw the scene…


On some computers, the counter you query with GetTickCount has very poor accuracy. The result was that an app of mine that used it raised a “division by 0” exception when I did 1.0f/frametime. At best, if you expect that and test frametime before the division, you won’t get any framerate at all.
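
The test I mean looks something like this (a sketch; lasttick and update_fps are names of my own):

/* needs <windows.h>; fps is a global as in the other posts */
static DWORD lasttick = 0;

void update_fps(void)
{
DWORD now = GetTickCount();
DWORD dt = now - lasttick; /* milliseconds; legitimately 0 between fast frames */
lasttick = now;
if (dt > 0) /* GetTickCount often advances in ~10-15 ms steps */
fps = 1000.0f / (float)dt;
/* else keep the previous fps instead of dividing by zero */
}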

QueryPerformanceCounter returns zero if no performance counter is available. Maybe you don’t have a performance counter…?

Why are you returning the result of the performance counter as a floating-point value at all (float OR double)? The performance counter returns a 64-bit integer, not a floating-point value. But I’ve had my own set of problems with timing, so perhaps I’m missing why you’re using floats. There’s got to be something I’m not getting, because on a really fast system I actually get slightly slower response from my objects than on a slower system. When I multiply the velocities and deltas of my objects by the time elapsed, it doesn’t seem to scale properly.
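
The kind of update I mean, simplified (cube and xvel are just stand-in names):

struct { float x, xvel; } cube = { 0.0f, 2.5f }; /* 2.5 units per second */

void move_cube(float dt) /* dt = seconds since the previous frame */
{
/* same distance per second at any frame rate -- in theory */
cube.x += cube.xvel * dt;
}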

I’m not 100% sure how the performance counter works, but I think it is a 64-bit counter that counts the number of clock cycles since the last reset. The performance frequency is the actual clock speed of your processor.

The reason you would return a float is that when you divide the counter by the frequency, you get the number of seconds since reset. And I’m pretty sure you want to represent seconds as floats and not integers. Integers would give great time resolution… not.

Originally posted by Bob:
[b]I think it is a 64-bit counter that counts the number of clock cycles since the last reset. The performance frequency is the actual clock speed of your processor.[/b]

No, that’s the RDTSC instruction. The performance counter returns the tick count of a standard clock running at around 1.19 MHz, if I remember right. With RDTSC and the performance counter you can calculate the MHz of the CPU, btw…
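
For example (a sketch using MSVC’s __rdtsc() intrinsic; inline asm would do the same job):

#include <windows.h>
#include <intrin.h>
#include <stdio.h>

int main(void)
{
LARGE_INTEGER f, t0, t1;
unsigned __int64 c0, c1;
double seconds;
QueryPerformanceFrequency(&f);
c0 = __rdtsc(); /* CPU cycle counter */
QueryPerformanceCounter(&t0); /* independent low-frequency timer */
Sleep(500); /* let both counters run for half a second */
c1 = __rdtsc();
QueryPerformanceCounter(&t1);
seconds = (double)(t1.QuadPart - t0.QuadPart) / (double)f.QuadPart;
printf("CPU: about %.0f MHz\n", (double)(c1 - c0) / seconds / 1e6);
return 0;
}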

Originally posted by mtrooper:
[b]Now, why is frametime always 0? (OK, I render only one cube, but hell, even THAT takes some time.) What am I doing wrong?[/b]

I’d let the framestart variable be static and store the frameend in it instead. A little more accurate… remember that if you only draw a cube, you’ll just pass it to the driver and return, and the graphics card will work in parallel… so you’ll get back almost immediately.
I’d also use __int64 instead of floats all the way, except in the final conversion to fps.
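
Roughly like this (a sketch reusing mtrooper’s globals; the other names are mine):

void idle(void)
{
static __int64 lastcount = 0; /* counter value at the end of the previous frame */
LARGE_INTEGER now;
__int64 ticks;

draw_scene();

QueryPerformanceCounter(&now);
ticks = now.QuadPart - lastcount; /* whole frame, including time outside draw_scene */
lastcount = now.QuadPart;

if (ticks > 0)
fps = (float)((double)freq.QuadPart / (double)ticks); /* integer math until this final conversion */
}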

OK, maybe I was wrong then. But why does QueryPerformanceFrequency return 734 MHz for me? I have two 733 MHz PIII processors.

I have tried to query the counter and divide the result by the frequency, and it actually counts seconds.

Hmm… that’s strange. Just tested, and it returns 1193180 for me, 1.19 MHz…

And yeah, of course you get seconds, you’d get that regardless of the timer resolution … cycles / (cycles / second) = seconds …