glBindAttribLocation with Cg

How can I use glBindAttribLocation with Cg? The function expects a GLuint program handle, not a CGprogram.

You generally specify the attribute slot inside the shader instead.

(this is Cg-style GLSL)


attribute vec4 inPos : ATTR0;
attribute vec3 inN : ATTR1;
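
On the application side you then feed those fixed slots by index with glVertexAttribPointer, so glBindAttribLocation is never needed. A minimal sketch in C (the array names and vertex count are illustrative, and the client-side arrays are assumed to be tightly packed):

const GLuint POS_SLOT    = 0;  /* matches ATTR0 */
const GLuint NORMAL_SLOT = 1;  /* matches ATTR1 */

glEnableVertexAttribArray(POS_SLOT);
glEnableVertexAttribArray(NORMAL_SLOT);

glVertexAttribPointer(POS_SLOT,    4, GL_FLOAT, GL_FALSE, 0, positions);
glVertexAttribPointer(NORMAL_SLOT, 3, GL_FLOAT, GL_FALSE, 0, normals);

glDrawArrays(GL_TRIANGLES, 0, vertexCount);

glDisableVertexAttribArray(NORMAL_SLOT);
glDisableVertexAttribArray(POS_SLOT);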

Wow, I didn’t expect that simply adding these ATTRx semantics would work. Now I have something like this:


struct VS_INPUT
{
	float4 position: POSITION: ATTR0;
	float3 normal: NORMAL: ATTR2;
	float2 texCoord0: TEXCOORD0: ATTR8;
};

And this works nicely for OGL and D3D :)
Thanks
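
For reference, those slot indices don’t have to be hardcoded on the application side; they can be queried back from the Cg runtime once the program is loaded. A rough sketch, assuming the vertex entry point takes a VS_INPUT parameter named IN (the exact parameter path depends on your entry function signature):

CGparameter posParam = cgGetNamedParameter(vertexProgram, "IN.position");
CGparameter nrmParam = cgGetNamedParameter(vertexProgram, "IN.normal");

/* For the GL profiles these should come back as 0 and 2 (ATTR0/ATTR2). */
GLuint posSlot = (GLuint)cgGetParameterResourceIndex(posParam);
GLuint nrmSlot = (GLuint)cgGetParameterResourceIndex(nrmParam);

glEnableVertexAttribArray(posSlot);
glVertexAttribPointer(posSlot, 4, GL_FLOAT, GL_FALSE, 0, positions);
/* ...and the same pattern for the normal and texcoord slots. */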

Oh no… I thought this would work for D3D too, yet the VS30 profile doesn’t recognize the ATTR* semantics… :( Does anyone know a way to make this work without using the preprocessor? I need one Cg shader that works for the VS30, PS30 and ARB/GLSL profiles.

Cg can be problematic with the GLSL profile, AFAIK (from what I’ve read on this board and from the Cg->GLSL code I’ve seen cgc.exe produce).
But for the DX, ARB and NVx0 profiles you’re better off using the :POSITION, :NORMAL, :TEXCOORDx semantics instead of ATTRx. The downside is that you’ll have to use glVertexPointer/glNormalPointer/etc. instead of glVertexAttribPointer.
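
For example, something along these lines on the application side (a sketch; the array names are illustrative):

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glClientActiveTexture(GL_TEXTURE0);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glVertexPointer(4, GL_FLOAT, 0, positions);   /* feeds :POSITION  */
glNormalPointer(GL_FLOAT, 0, normals);        /* feeds :NORMAL    */
glTexCoordPointer(2, GL_FLOAT, 0, texCoords); /* feeds :TEXCOORD0 */

glDrawArrays(GL_TRIANGLES, 0, vertexCount);

glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_NORMAL_ARRAY);
glDisableClientState(GL_VERTEX_ARRAY);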

Cg can be a mess with ATI cards under OpenGL, but at least it works nicely on those cards under D3D.

Yes… I thought Cg would be a good idea for shaders, but now I see that the OGL+Cg+ATI combination is hard to handle. In my future engine I’ll definitely use GLSL and HLSL explicitly instead of Cg. For now I’ll stick with glVertexPointer/glNormalPointer/etc. I’ve made a small test application and so far it works on both NVIDIA and ATI (at least on the two GPUs I’ve tested). I hope no new problems show up when I switch from the ARB profiles to GLSL…
Again, thanks for your contribution.