
Thread: Using integer types to specify geometry

  1. #1
    Junior Member
    Join Date
    Mar 2011

    Using integer types to specify geometry

    I have a lot of dense geometry to render.

    Since the geometry is already chunked in a tile-like organization, I don't need the full scale of a 32-bit float for my vertices. In some cases I could get by with a 16-bit short, or even an 8-bit unsigned byte (!). I would of course need to adjust the transformation matrices I'm using.
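A minimal sketch of what that could look like (the function and names are mine, not from any library): quantize tile-local float coordinates into an `Int16Array`, and remember the scale/offset so it can be folded back into the model matrix, so the shader still sees the original units.

```javascript
// Quantize tile-local float coordinates into 16-bit signed integers.
// The tile spans [min, max] on each axis; map that range onto
// [-32767, 32767] and keep scale/offset to undo it in the matrix.
function quantizeTile(positions, min, max) {
  var scale = (max - min) / 65534;     // units per integer step
  var offset = min + 32767 * scale;    // value that maps to integer 0
  var out = new Int16Array(positions.length);
  for (var i = 0; i < positions.length; i++) {
    out[i] = Math.round((positions[i] - offset) / scale);
  }
  return { data: out, scale: scale, offset: offset };
}

// To render, bake the dequantization into the model matrix:
//   model' = model * translate(offset) * scale(scale)
// so the vertex shader itself is unchanged.

var q = quantizeTile(new Float32Array([0.0, 50.0, 100.0]), 0.0, 100.0);
var back = q.data[1] * q.scale + q.offset;  // dequantize the middle vertex
```

With a tile only a few hundred units across, the worst-case rounding error here is half an integer step, which is far below one pixel for typical tile sizes.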

    My questions are:
    1) Is this supported at all?
    2) Is it a "good idea" in general? Perhaps such usages are atypical, and are not handled in an optimized code path in the driver/GPU?

The idea is of course to save GPU memory. In addition, since the server could output a much more efficient binary encoding (I'm using binary XHR to load data), it would save network bandwidth as well. Third, since I need to retain this data on the JavaScript side as well (to do window queries and iterate over the geometry), it would save JavaScript VM memory too.
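That JS-side reuse falls out naturally, assuming the server writes the vertices as little-endian int16 triples: the `ArrayBuffer` from an XHR with `responseType = 'arraybuffer'` can be viewed directly as an `Int16Array` with no parse step, and the same typed array can be handed to `gl.bufferData` and kept for window queries. A sketch (the payload is fabricated here for illustration):

```javascript
// Decode a binary payload of int16 vertex triples. In the real app the
// ArrayBuffer would come from xhr.response (responseType 'arraybuffer').
// Note: an Int16Array view uses platform byte order, which is
// little-endian on effectively all WebGL-capable hardware.
function decodeVertices(arrayBuffer) {
  // A typed-array view costs no copy; the same view can back both
  // gl.bufferData(gl.ARRAY_BUFFER, verts, gl.STATIC_DRAW) and
  // JavaScript-side window queries.
  return new Int16Array(arrayBuffer);
}

// Fake a 2-vertex payload the way the server might emit it.
var payload = new ArrayBuffer(12);   // 2 vertices * 3 shorts * 2 bytes
var writer = new DataView(payload);
[100, -200, 300, 400, 500, -600].forEach(function (v, i) {
  writer.setInt16(i * 2, v, true);   // true = little-endian
});

var verts = decodeVertices(payload);
```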

    Comments welcome.

  2. #2
    Senior Member
    Join Date
    May 2010

    Re: Using integer types to specify geometry

Yes, you can do it - and on most GPUs it'll help.

The trade-off is the savings in memory and bus bandwidth from sending less data, versus the extra time it takes the GPU to convert the data to floating point for use in the vertex shader.
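A rough sketch of the savings side, assuming 3-component positions and the usual practice of keeping the stride 4-byte aligned (the `vertexAttribPointer` call is standard WebGL; the helper name is mine):

```javascript
// Per-vertex position storage, 3 components each:
var floatBytes = 3 * 4;       // FLOAT                 -> 12 bytes
var shortBytes = 3 * 2 + 2;   // SHORT + 2 pad bytes   ->  8 bytes
var byteBytes  = 3 * 1 + 1;   // UNSIGNED_BYTE + 1 pad ->  4 bytes

// Binding a short attribute; the GPU converts to float before the
// vertex shader runs. normalized = false leaves the integer values
// as-is (you rescale via the matrix); normalized = true would map
// the type's range onto [-1, 1] instead.
function bindShortPositions(gl, attribLoc) {
  gl.vertexAttribPointer(attribLoc, 3, gl.SHORT, false, 8, 0);
  gl.enableVertexAttribArray(attribLoc);
}
```

So shorts cut position storage by a third and bytes by two thirds, before the same factor is applied again to the wire format and the JS-side copy.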

    I have found this to be a net win in desktop OpenGL - but on things like phones...I honestly don't know.

    I think you should do it.

