
Thread: Rendering to memory

  1. #1
    Junior Member Newbie
    Join Date
    Jan 2005
    Location
    UK (southwest)
    Posts
    6

    Rendering to memory

    Hi.
    I am writing a DLL extension for a piece of 2D game coding software (called Fenix). My extension is a 3D engine using OpenGL. At the moment I have it working by setting OpenGL up to take over the window. This of course disables Fenix's normal software drawing functions, which I don't want to do.
    However, Fenix has a buffer I could draw to (named "background"), which consists of an array of 2-byte (16-bit) pixels.
    So, does anyone know how to set OpenGL up to render to this?
    I have heard of the possibility of attaching a Windows HBITMAP to a device context, but I am not sure whether an HBITMAP can be set up with the correct 16-bit format (without modifying "background"), or whether that is what I need anyway...
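    Something like this is what I had in mind, if it helps (completely untested sketch; the 5-6-5 masks are a guess at the "background" format):

        /* Untested sketch: create a 16-bit DIB section and select it into a
           memory DC, giving a device context whose pixels live in memory we
           can reach. The 5-6-5 masks are an assumption about Fenix's format. */
        #include <windows.h>
        #include <stdlib.h>

        HDC CreateMemoryDC16(int width, int height, void **pixels)
        {
            BITMAPINFO *bmi;
            HBITMAP bitmap;
            HDC memDC;
            DWORD *masks;

            /* BITMAPINFO with room for the three BI_BITFIELDS colour masks */
            bmi = (BITMAPINFO *)calloc(1, sizeof(BITMAPINFOHEADER) + 3 * sizeof(DWORD));
            bmi->bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
            bmi->bmiHeader.biWidth       = width;
            bmi->bmiHeader.biHeight      = -height;  /* negative = top-down rows */
            bmi->bmiHeader.biPlanes      = 1;
            bmi->bmiHeader.biBitCount    = 16;
            bmi->bmiHeader.biCompression = BI_BITFIELDS;

            masks = (DWORD *)bmi->bmiColors;
            masks[0] = 0xF800;  /* red   (assumed 5-6-5 layout) */
            masks[1] = 0x07E0;  /* green */
            masks[2] = 0x001F;  /* blue  */

            memDC  = CreateCompatibleDC(NULL);
            bitmap = CreateDIBSection(memDC, bmi, DIB_RGB_COLORS, pixels, NULL, 0);
            SelectObject(memDC, bitmap);
            free(bmi);
            return memDC;  /* *pixels now points at the 16-bit pixel array */
        }

    I believe OpenGL would then need a pixel format with PFD_DRAW_TO_BITMAP set to draw into such a bitmap, though I'm not sure whether that is the right approach at all.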
    Can anyone help?
    Thanks
    Dan

  2. #2
    Senior Member OpenGL Pro
    Join Date
    May 2000
    Location
    Naarn, Austria
    Posts
    1,102

    Re: Rendering to memory

    Directly rendering to system memory is almost always a bad idea, because you will get software rendering that way, so it will be extremely slow as soon as you start doing anything more advanced than drawing a few untextured triangles.

    You can create a pbuffer context, or use the new (not yet officially available) EXT_framebuffer_object extension, to render to an offscreen buffer and then read the result back with glReadPixels. But this will be slow too... not as slow as software rendering, but it's very likely still too slow for realtime rendering.
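    For illustration, the framebuffer_object route looks roughly like this (untested sketch; it assumes a GL context exists, the EXT entry points have been loaded, e.g. via wglGetProcAddress, and the sizes are placeholders):

        /* Render into an offscreen FBO, then read the pixels back as
           16-bit 5-6-5 values. The readback is the part that stalls. */
        #include <GL/gl.h>
        #include <GL/glext.h>

        #define W 256
        #define H 256

        void render_offscreen_and_read_back(unsigned short *out /* W*H pixels */)
        {
            GLuint fbo, tex;

            /* colour buffer: a texture to render into (non-mipmapped filter
               so the FBO attachment is complete with only level 0) */
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, W, H, 0,
                         GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);

            glGenFramebuffersEXT(1, &fbo);
            glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
            glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                                      GL_TEXTURE_2D, tex, 0);

            glViewport(0, 0, W, H);
            /* ... draw the scene here ... */

            glReadPixels(0, 0, W, H, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, out);

            glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
            glDeleteFramebuffersEXT(1, &fbo);
            glDeleteTextures(1, &tex);
        }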

    Modern graphics cards are not designed for reading data back: a readback stalls the pipeline, destroying parallelism, and the bus transfer rate is much lower in that direction. I'm afraid there is no fast solution for rendering to a memory buffer in realtime, so the best approach for your specific problem is what you already have: just hijack the window.

  3. #3
    Junior Member Newbie
    Join Date
    Jan 2005
    Location
    UK (southwest)
    Posts
    6

    Re: Rendering to memory

    Thanks.....
    I think I'm going to have to examine Fenix's rendering code some more... I'm not really sure how SDL (which Fenix uses) sends data to the screen, or how the screen is interfaced with through Windows in general. I wonder if it's possible to have the two share the same device context, with one rendering on top of the other, and a transparent colour letting the other show through?
    ...or something like that
    Dan

  4. #4
    Senior Member OpenGL Pro
    Join Date
    May 2000
    Location
    Naarn, Austria
    Posts
    1,102

    Re: Rendering to memory

    With SDL it should be relatively easy to do this the other way round, that is, render the 2D data to an offscreen buffer and use this in the 3D renderer.

    SDL uses a structure called SDL_Surface for both on-screen and offscreen rendering, so you could just change Fenix's setup code so that it creates an offscreen surface instead of a window.

    Then you create your own window for OpenGL rendering, and each time Fenix renders a frame you upload the contents of the offscreen surface to a texture. You can then use that texture anywhere you want in your 3D drawing routine...
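    Roughly like this (untested sketch; I'm assuming a 16-bit 5-6-5 surface, and the sizes and masks are placeholders):

        /* Create an offscreen SDL surface for the 2D renderer, then each
           frame upload its pixels into a texture for the 3D pass.
           Assumes offscreen->pitch == W * 2 (tightly packed rows). */
        #include <SDL.h>
        #include <GL/gl.h>

        #define W 256
        #define H 256

        static SDL_Surface *offscreen;
        static GLuint tex;

        void init_offscreen(void)
        {
            /* the surface the 2D renderer draws into instead of a window */
            offscreen = SDL_CreateRGBSurface(SDL_SWSURFACE, W, H, 16,
                                             0xF800, 0x07E0, 0x001F, 0);
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, W, H, 0,
                         GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
        }

        void upload_frame(void)  /* call after the 2D renderer finishes a frame */
        {
            if (SDL_MUSTLOCK(offscreen)) SDL_LockSurface(offscreen);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, W, H,
                            GL_RGB, GL_UNSIGNED_SHORT_5_6_5, offscreen->pixels);
            if (SDL_MUSTLOCK(offscreen)) SDL_UnlockSurface(offscreen);
            /* now draw a textured quad with this texture anywhere in the scene */
        }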

  5. #5
    Junior Member Newbie
    Join Date
    Jan 2005
    Location
    UK (southwest)
    Posts
    6

    Re: Rendering to memory

    Using it as a texture does sound like a nicer idea. Thanks.
    I can't directly change Fenix's setup code; I can only call my DLL functions after it has started.

    DLLs are given access to the SDL_Surface that Fenix creates (using SDL_SetVideoMode) through a pointer called "screen".
    I have tried pointing "screen" at a new surface made with SDL_CreateRGBSurface: after that, neither the software functions nor my OpenGL display draws anything. (I have also tried setting it to NULL: it crashes...)
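    For reference, the replacement attempt looked roughly like this (from memory; "screen" is the Fenix-provided pointer, and the 5-6-5 masks are my guess at its format):

        SDL_Surface *replacement =
            SDL_CreateRGBSurface(SDL_SWSURFACE, screen->w, screen->h, 16,
                                 0xF800, 0x07E0, 0x001F, 0);
        screen = replacement;  /* after this, neither path draws anything */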

    In all cases, my OpenGL rendering works as long as I don't display any software sprites in Fenix.

    So... does this mean some part of the old "screen" (or the new one) is still rendering to the window?
    Do you have any advice on creating an offscreen surface?
    Dan

