So here is my situation:
I am trying to draw an image, pixel by pixel, on a 320x480 pixel iPhone screen.
I’m doing this using glDrawArrays(). I create a vertex array containing an x,y pair for every pixel on the screen (i.e., 320 × 480 = 153,600 pairs). Then I create a second array that defines a color for each of those pixels (again 153,600 entries).
This works well.
Now, let’s say I need to draw a 1024x480-pixel image on the same screen. I can no longer do a one-to-one mapping between image pixels and screen pixels. Does OpenGL ES have some way to squeeze (scale) this data down to fit the iPhone screen?
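To illustrate the kind of squeeze I mean: a nearest-neighbour horizontal scale would map each of the 320 destination columns back to one of the 1024 source columns, something like this (a hypothetical CPU-side sketch of the arithmetic, not an OpenGL call — the function name is mine):

```c
#define SRC_W 1024
#define DST_W 320

/* Nearest-neighbour mapping: which source column feeds destination
   column dst_x. Integer division truncates, picking the nearest-left
   sample; every destination column covers SRC_W / DST_W = 3.2 source
   columns, so most source data is simply skipped. */
static int src_column(int dst_x) {
    return dst_x * SRC_W / DST_W;
}
```

I’m wondering whether OpenGL ES can do this scaling for me (presumably with better filtering than this) rather than my resampling the image on the CPU.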