Hello,
I'm really confused. OK, here's a test case with two kernels:
col := a floating-point buffer, alloc size = 4 * sizeof(cl_float) * cnt;
buf := a uchar buffer, alloc size = 3 * sizeof(cl_uchar) * cnt;
// first, run this kernel
__kernel void initBuf ( __global float3 *col )
{
const size_t index = get_global_id (0);
col[index] = 0.126f * 100.0f;
}
// second, run this kernel with the col buffer filled by initBuf
__kernel void convertToUChar ( __constant float3 *col, __global uchar *buf )
{
const size_t index = get_global_id (0);
float3 c = col[index];
buf[index * 3 + 0] = convert_uchar_rte (c.x);
buf[index * 3 + 1] = convert_uchar_rte (c.y);
buf[index * 3 + 2] = convert_uchar_rte (c.z);
}
The result is a uchar buffer whose values are all 0:
buf[0] == 0, buf[1] == 0, …
I'm using an HP nw8240 with the AMD APP SDK 2.4.
Another example:
convert_uchar_rte(1.99f * 100.0f) --> 100
convert_uchar_rte(199.0f)         --> 199
Where is my mistake?