## How to do "General Computation" on GPU

Hello everyone,

I'm experimenting with moving computation from the CPU to the GPU. Take matrix multiplication as an example: A x B is easy to compute with C code running on the CPU, but how can I dispatch that work to the GPU and read the result back?
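For reference, the CPU version I have in mind is just the straightforward triple loop (a sketch; `mat4_mul` and the row-major layout are my own choices):

```c
#include <stddef.h>

/* Plain CPU 4x4 matrix multiply: C = A * B, all row-major float[16]. */
static void mat4_mul(const float A[16], const float B[16], float C[16])
{
    for (size_t i = 0; i < 4; ++i) {
        for (size_t j = 0; j < 4; ++j) {
            float sum = 0.0f;
            for (size_t k = 0; k < 4; ++k)
                sum += A[i * 4 + k] * B[k * 4 + j];
            C[i * 4 + j] = sum;
        }
    }
}
```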

The environment:
HTC Dev Phone 1, with Qualcomm MSM7201A chipset (GPU: Adreno 130)
OpenGL ES 1.1

I've tried writing a simple program to let the GPU do some work, shown below:

Part of the code:

```c
EGLDisplay m_eglDisplay = eglGetDisplay(EGL_DEFAULT_DISPLAY);
if (m_eglDisplay == EGL_NO_DISPLAY || eglGetError() != EGL_SUCCESS) {
    return -1;
}

EGLint major, minor;
if (eglInitialize(m_eglDisplay, &major, &minor) == EGL_FALSE || eglGetError() != EGL_SUCCESS) {
    return -1;
}

GLfixed mantissa[16];
GLint exponent[16];
GLfloat matrix[16] = { 3, 3, 8, 8,
                       5, 7, 5, 7,
                       1, 2, 2, 2,
                       4, 4, 4, 2 };
GLbitfield status;

glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(matrix);   /* load the 4x4 matrix as the current matrix */
status = glQueryMatrixxOES(mantissa, exponent);
```

The code above loads a 4x4 matrix as the current matrix; glQueryMatrixxOES() should then write its components into mantissa and exponent. After the program completes, I get these values:

```
status = 54

mantissa = {
    -1342110592.000000, 0.000000, -1342109440.000000, -1342171008.000000,
    0.000000, -1342110592.000000, -1342110592.000000, -1094222464.000000,
    -1094222464.000000, -1094222208.000000, 0.000000, 0.000000,
    0.000000, 0.000000, 0.000000, 0.000000
}
```
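For what it's worth, the OES_query_matrix extension defines each returned component as mantissa[i] * 2^exponent[i], with mantissa in 16.16 fixed point (GLfixed), and status as a bitfield where bit i is set for each invalid component (54 = 0b110110, i.e. components 1, 2, 4 and 5 flagged invalid). A single component can be decoded back to a float like this (a sketch; `decode_component` is my own name):

```c
#include <math.h>

/* Decode one glQueryMatrixxOES result pair: mantissa is 16.16 fixed
 * point, exponent is an unbiased power of two, so the value is
 * (mantissa / 65536) * 2^exponent. */
static float decode_component(int mantissa, int exponent)
{
    return ldexpf((float)mantissa / 65536.0f, exponent);
}
```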

I want to know how to use the GPU as a general-purpose computing unit, the way a CPU can be used. Should I keep trying OpenGL ES? Any suggestions would be welcome!

Thanks.