
Thread: using unsigned short in glsl compute shader

  1. #1 · Member Contributor · Join Date: Jul 2018 · Posts: 63

    Hi

    Is there a way to use unsigned shorts in a GLSL compute shader? I noticed that the supported data types do not mention unsigned short, only 32-bit types such as int, float, and unsigned int.

    I have a buffer of unsigned shorts that I read into the compute shader, and I also write back to this buffer. If unsigned short is not supported, then I will have to either
    a) convert the unsigned shorts to floats before sending them to the compute shader, or
    b) strip out the extra bits and pack two bytes' worth of data per 32-bit word inside the compute shader (not sure if this is the best approach; see the sketch after this list).
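
    For approach (b), here is a minimal sketch of the packing idea, assuming the unsigned shorts are bound as an SSBO of packed uints, two 16-bit values per word. The binding point, block name, and the increment in main() are placeholders for illustration:

    Code:
        #version 430
        layout(local_size_x = 64) in;

        // Hypothetical SSBO: the unsigned-short data viewed as packed uints,
        // two 16-bit values per 32-bit word.
        layout(std430, binding = 0) buffer Data {
            uint packed16[];
        };

        // Read the 16-bit value at a logical ushort index.
        uint loadU16(uint index) {
            // Low half for even indices, high half for odd ones.
            return bitfieldExtract(packed16[index >> 1], int((index & 1u) << 4), 16);
        }

        // Write the 16-bit value at a logical ushort index.
        // NOTE: this is a read-modify-write of a shared 32-bit word, so it is
        // only safe if each invocation owns both halves of that word.
        void storeU16(uint index, uint value) {
            uint word = packed16[index >> 1];
            packed16[index >> 1] = bitfieldInsert(word, value, int((index & 1u) << 4), 16);
        }

        void main() {
            // Each invocation owns one packed word (ushort indices 2*id and
            // 2*id + 1), so no two invocations write the same 32-bit word.
            uint base = gl_GlobalInvocationID.x * 2u;
            storeU16(base,      (loadU16(base)      + 1u) & 0xFFFFu);
            storeU16(base + 1u, (loadU16(base + 1u) + 1u) & 0xFFFFu);
        }

    This keeps the buffer at two bytes per element on the GL side; the cost is the extra bitfield work on every load and store.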

    Any suggestions from previous experiences?

    thanks

  2. #2 · Junior Member Newbie · Join Date: Jan 2013 · Posts: 11
    The GL_NV_gpu_shader5 extension (an NVIDIA vendor extension) added support for a full set of 8-, 16-, 32-, and 64-bit scalar and vector data types, including uint16_t, for all shader stages.
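
    With the extension enabled in the shader, 16-bit types can be declared directly. A minimal sketch, assuming an NVIDIA driver that also accepts 16-bit types inside an std430 buffer block (the binding point and the operation are placeholders, and since this is a vendor extension none of this is guaranteed by core GL):

    Code:
        #version 430
        #extension GL_NV_gpu_shader5 : require

        layout(local_size_x = 64) in;

        // Hypothetical buffer block using the extension's 16-bit type.
        layout(std430, binding = 0) buffer Data {
            uint16_t values[];
        };

        void main() {
            uint i = gl_GlobalInvocationID.x;
            // Arithmetic can be done in 32 bits and narrowed back to 16.
            values[i] = uint16_t(uint(values[i]) + 1u);
        }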
