X Optimizations

Hi folks,

I am a beginner in the 3D world.
I currently display 3D images on a PC with a GeForce MX running Linux. Unfortunately, the application runs on a UNIX server; I use X Window to display it on the PC. The PC and the server are linked by a TCP/IP network.

Rendering performance is very slow. I would like to know how I can improve it, either by using another protocol (supported by X) or, depending on the complexity, by optimizing the X/GLX commands (moving to larger packets…).

Your experience or ideas would help me a lot. Please tell me what I can do.

Thanks in advance,

Alex

Hello,

Well, there are several issues. Does the server (on the Linux end) use h/w accelerated OpenGL? Does the client (the Windows end) know about OpenGL, and is THAT accelerated?

The worst-case scenario is that your Linux box isn't OpenGL accelerated and your Windows box doesn't know about OpenGL, so GLX will s/w rasterise the display, pack the bitmap, and send it to the Windows box.

The ideal solution is to have your Windows client understand GLX, so the Linux server just (just?) needs to send down OpenGL state changes. If your Windows end is h/w accelerated, then it can take these OpenGL commands and rasterise them in h/w on the display.
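To see which case you are actually in, it helps to check what GLX reports at run time. The following is just a rough sketch of mine (not from the original posts): it creates a GLX context and prints whether rendering is direct and which renderer string the display side reports. Over a remote X connection glXIsDirect() will normally say "no" (indirect GLX protocol); GL_RENDERER then tells you whether the display-side X server rasterises in hardware or falls back to a software renderer.

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>

int main(void)
{
    /* Opens the display named in $DISPLAY, e.g. the remote PC's X server. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable GLX visual\n"); return 1; }

    /* Ask for a direct context; GLX silently falls back to indirect
     * when the application and the display are on different machines. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);

    /* A tiny window with a colormap matching the GLX visual, so the
     * context can be made current for the glGetString() queries. */
    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                   vi->visual, AllocNone);
    swa.border_pixel = 0;
    Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 64, 64,
                               0, vi->depth, InputOutput, vi->visual,
                               CWColormap | CWBorderPixel, &swa);
    glXMakeCurrent(dpy, win, ctx);

    printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "yes" : "no");
    printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));

    glXMakeCurrent(dpy, None, NULL);
    glXDestroyContext(dpy, ctx);
    XDestroyWindow(dpy, win);
    XCloseDisplay(dpy);
    return 0;
}

(Compile with something like cc check_glx.c -o check_glx -lGL -lX11 and run it with DISPLAY pointing at the remote PC.)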

You might also need to look at client-side state, but that will require some reprogramming to get the client/server model working (check out glPushClientAttrib()); see the sketch below.
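On the "larger packets" point: the biggest win on the GLX protocol side is usually batching geometry with client-side vertex arrays instead of per-vertex immediate-mode calls, so the wire carries a few large requests rather than thousands of tiny ones. A hedged sketch (my example; the function name and parameters are made up for illustration):

#include <GL/gl.h>

/* Draws an indexed triangle mesh from client-side arrays.
 * glPushClientAttrib()/glPopClientAttrib() save and restore the
 * client-side array state around the draw, as mentioned above. */
static void draw_mesh(const GLfloat *verts,          /* packed x,y,z triples */
                      const GLuint  *indices,        /* triangle index list  */
                      GLsizei        n_indices)
{
    glPushClientAttrib(GL_CLIENT_VERTEX_ARRAY_BIT);  /* save client array state */

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);

    /* One call, one large batch of data, instead of one
     * glVertex3f() call per index in immediate mode. */
    glDrawElements(GL_TRIANGLES, n_indices, GL_UNSIGNED_INT, indices);

    glPopClientAttrib();                             /* restore previous state */
}

For static geometry, display lists (glNewList()/glCallList()) go further still with indirect rendering, since the list is stored on the display side and only the glCallList() call crosses the network each frame.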

How does one go about getting a GLX-compliant Windows-side X session, then? NFI.

cheers
John