Texture or point rendering to create effects in a drawing app?

I am curious how to approach an app I'm building (iOS). It's a drawing app with dynamic animated effects that run on the user's drawings. Initially the drawn paths were Bézier curves, but this made certain operations harder (erasing, for example), so I want to switch to a bitmap-style approach: when the user draws, the points under their finger are turned on, so the drawing is essentially a grid of color values. I also want to run effect algorithms every frame (for example a dynamic blur, computed by convolving a kernel over this grid).

Dropping down to the OpenGL layer, what is the best way to maintain this screen-sized bitmap so that I can efficiently run calculations on its values and render it to the screen? My initial thought was a 2D texture backed by a data array in memory rather than a BMP or PNG file, but I have also heard of point rendering, so I'm asking how to proceed and do this efficiently. I'm aware that on iOS I could use higher-level frameworks to work with bitmap contexts, but I also want to improve my OpenGL skills and potentially add Z-dimension features later.
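
Roughly what I had in mind, as a sketch only (assuming OpenGL ES 2.0 and plain C; the canvas size and names here are placeholders I made up):

```c
#include <OpenGLES/ES2/gl.h>

/* Hypothetical screen-sized backing store: one RGBA byte quad per "pixel". */
#define CANVAS_W 320
#define CANVAS_H 480

static GLubyte canvas[CANVAS_W * CANVAS_H * 4];   /* CPU-side grid of color values */
static GLuint  canvasTex;

/* Create a texture whose contents come from the in-memory array, not a PNG/BMP. */
static void createCanvasTexture(void)
{
    glGenTextures(1, &canvasTex);
    glBindTexture(GL_TEXTURE_2D, canvasTex);
    /* NEAREST filtering keeps the "turn this pixel on" look crisp. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    /* Upload the whole CPU-side array as the initial texture contents. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, CANVAS_W, CANVAS_H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, canvas);
}
```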

I would think what you need is a framebuffer object (FBO), which lets you render into a texture. You can run effects such as blur directly on the GPU using shaders, and if you need to read the image back you can use glReadPixels.
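
A minimal sketch of what I mean, assuming OpenGL ES 2.0 in C (the handle names are placeholders and most error handling is omitted):

```c
#include <OpenGLES/ES2/gl.h>

/* Hypothetical handles; the size would match your canvas texture. */
static GLuint fbo, targetTex;

/* Attach a texture to an FBO so ordinary draw calls (including your blur
   shader passes) render into the texture instead of the screen. */
static void createRenderTarget(int w, int h)
{
    glGenTextures(1, &targetTex);
    glBindTexture(GL_TEXTURE_2D, targetTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);      /* no initial data needed */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, targetTex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        /* handle the error */
    }
    /* Rebind your on-screen framebuffer afterwards (on iOS the on-screen
       target is itself an FBO provided by your GL view, not 0). */
}

/* Read the rendered image back to the CPU when you need it
   (this stalls the pipeline, so avoid doing it every frame). */
static void readBack(int w, int h, GLubyte *dst /* w * h * 4 bytes */)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, dst);
}
```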

Any reading resources on this? (With OpenGL I find it difficult to go from instructions to an implementation just by reading the docs.) Also, why couldn't I just use a normal texture? And although I will try to do as much processing as possible in shaders, a good deal of the functionality is turning individual "pixels" on and off, with their corresponding colors, in response to touch interactions.
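
For example, on a touch event I'm doing roughly this (just a sketch, reusing the canvasTex / CANVAS_W / CANVAS_H names from my earlier snippet, and assuming the touch location has already been converted to texel coordinates):

```c
#include <OpenGLES/ES2/gl.h>

/* Hypothetical: stamp a small square of texels "on" under the touch point.
   (tx, ty) are assumed to already be in texel coordinates. If you keep a
   CPU-side canvas array as well, you would update it here too. */
static void stampAtTexel(int tx, int ty, GLubyte r, GLubyte g, GLubyte b)
{
    enum { BRUSH = 8 };                       /* brush size in texels */
    GLubyte patch[BRUSH * BRUSH * 4];

    /* Keep the stamped region inside the texture bounds. */
    if (tx < 0) tx = 0;
    if (ty < 0) ty = 0;
    if (tx > CANVAS_W - BRUSH) tx = CANVAS_W - BRUSH;
    if (ty > CANVAS_H - BRUSH) ty = CANVAS_H - BRUSH;

    for (int i = 0; i < BRUSH * BRUSH; ++i) {
        patch[i * 4 + 0] = r;
        patch[i * 4 + 1] = g;
        patch[i * 4 + 2] = b;
        patch[i * 4 + 3] = 255;               /* fully opaque: this "pixel" is on */
    }

    glBindTexture(GL_TEXTURE_2D, canvasTex);
    /* Upload only the touched region instead of re-uploading the whole grid. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, tx, ty, BRUSH, BRUSH,
                    GL_RGBA, GL_UNSIGNED_BYTE, patch);
}
```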