
Thread: 10 bit rendering

  1. #1
    Junior Member Newbie
    Join Date
    Jun 2016
    Posts
    3

    10 bit rendering

    What is the best way to render 10-bit video using OpenGL?

    I want to read raw 10-bit video (P010 fourcc, for example) from a file and then produce true 10-bit output on a 10-bit display device using OpenGL.

    What graphics card can I use for this task?

    I tried the sample code from this document "www.nvidia.ru/docs/IO/40049/TB-04701-001_v02_new.pdf" with my GTX 980 Ti card, but the rendering fails with the message "No 10bpc WGL_ARB_pixel_formats found!" (10-bit mode was enabled in the NVIDIA Control Panel).

    I tried the samples from here "www.nvidia.com/object/quadro-product-literature.html" (the 30-bit color sample code), but the same problem occurred (8-bit works fine).

    As I understand it, the approach described in these documents works for Quadro cards but not for other cards. Is that true? Is there any way to produce 10-bit output using GTX graphics cards?

    Is there any way to render a true 10-bit image without using the WGL_ARB_pixel_format extension?
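
    For reference, this is the P010 layout I am assuming, sketched in C (width and height are placeholders): 16-bit little-endian words with the 10 significant bits in the high bits, a full-resolution Y plane followed by a half-height plane of interleaved UV samples.

    Code :
        /* Sketch: read one raw P010 frame from a file.
         * Assumed layout: 16-bit words, 10 significant bits in the high bits,
         * full-resolution Y plane followed by a half-height interleaved UV plane. */
        #include <stdio.h>
        #include <stdint.h>
        #include <stdlib.h>

        typedef struct {
            uint16_t *y;    /* width x height luma samples               */
            uint16_t *uv;   /* width x (height/2) interleaved UV samples */
        } P010Frame;

        static int read_p010_frame(FILE *f, int width, int height, P010Frame *frame)
        {
            size_t ySamples  = (size_t)width * height;
            size_t uvSamples = (size_t)width * (height / 2);

            frame->y  = (uint16_t *)malloc(ySamples  * sizeof(uint16_t));
            frame->uv = (uint16_t *)malloc(uvSamples * sizeof(uint16_t));
            if (frame->y == NULL || frame->uv == NULL)
                return 0;

            if (fread(frame->y,  sizeof(uint16_t), ySamples,  f) != ySamples ||
                fread(frame->uv, sizeof(uint16_t), uvSamples, f) != uvSamples)
                return 0;

            return 1;
        }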

  2. #2
    Senior Member OpenGL Lord
    Join Date
    Mar 2015
    Posts
    6,671
    No 10bpc WGL_ARB_pixel_formats found!
    What emitted that error? It sounds like something a toolkit such as FreeGLUT or GLFW would emit. So, what was it?

  3. #3
    Junior Member Newbie
    Join Date
    Jun 2016
    Posts
    3
    Sorry for the late reply.

    The problem is that the wglChoosePixelFormat method returns zero as the number of compatible formats. The code is:
    Code :
        // Find the 10bpc ARB pixelformat
        wglGetExtensionsString = (PFNWGLGETEXTENSIONSSTRINGARBPROC) wglGetProcAddress("wglGetExtensionsStringARB");
        if (wglGetExtensionsString == NULL)
        {
            printf("ERROR: Unable to get wglGetExtensionsStringARB function pointer!\n");
            goto Cleanup;
        }
     
        const char *szWglExtensions = wglGetExtensionsString(dummyDC);
        if (strstr(szWglExtensions, " WGL_ARB_pixel_format ") == NULL) 
        {
            printf("ERROR: WGL_ARB_pixel_format not supported!\n");
            goto Cleanup;
        }
     
        wglGetPixelFormatAttribiv = (PFNWGLGETPIXELFORMATATTRIBIVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribivARB");
        wglGetPixelFormatAttribfv = (PFNWGLGETPIXELFORMATATTRIBFVARBPROC) wglGetProcAddress("wglGetPixelFormatAttribfvARB");
        wglChoosePixelFormat = (PFNWGLCHOOSEPIXELFORMATARBPROC) wglGetProcAddress("wglChoosePixelFormatARB");
     
        if ((wglGetPixelFormatAttribfv == NULL) || (wglGetPixelFormatAttribiv == NULL) || (wglChoosePixelFormat == NULL))
        {
            goto Cleanup;
        }
     
        int attribsDesired[] = {
            WGL_DRAW_TO_WINDOW_ARB, 1,
            WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
            WGL_RED_BITS_ARB, 10,
            WGL_GREEN_BITS_ARB, 10,
            WGL_BLUE_BITS_ARB, 10,
            WGL_ALPHA_BITS_ARB, 2,
            WGL_DOUBLE_BUFFER_ARB, 1,
            0,0
        };
     
        UINT nMatchingFormats;
        if (!wglChoosePixelFormat(dummyDC, attribsDesired, NULL, 1, &idx30bit, &nMatchingFormats))
        {
            printf("ERROR: wglChoosePixelFormat failed!\n");
            goto Cleanup;
        }
     
        if (nMatchingFormats == 0)
        {
            printf("ERROR: No 10bpc WGL_ARB_pixel_formats found!\n");
            goto Cleanup;
        }

    This is the method recommended by NVIDIA and AMD (the AMD code is a bit different, but essentially the same). The documents I referred to in my post are 7 years old.
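
    For reference, if wglChoosePixelFormat did report a matching format, it would then be applied to the real window roughly like this (my own sketch, not copied from the NVIDIA document; hWnd is the application window and idx30bit comes from the call above):

    Code :
        HDC hDC = GetDC(hWnd);

        // Fill a PIXELFORMATDESCRIPTOR for the chosen format and set it on the DC
        PIXELFORMATDESCRIPTOR pfd;
        DescribePixelFormat(hDC, idx30bit, sizeof(PIXELFORMATDESCRIPTOR), &pfd);
        if (!SetPixelFormat(hDC, idx30bit, &pfd))
        {
            printf("ERROR: SetPixelFormat failed!\n");
            goto Cleanup;
        }

        // Create and activate an OpenGL context on the 10bpc pixel format
        HGLRC hRC = wglCreateContext(hDC);
        wglMakeCurrent(hDC, hRC);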

    Is there any modern way to produce 10-bit output using OpenGL?

    Is there any way to display a 10-bit YUV image using OpenGL?

  4. #4
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    3,100
    Quote Originally Posted by timman View Post
    Is there any modern way to produce 10-bit output using OpenGL?
    You need hardware which supports 10-bit output.

    Quote Originally Posted by timman View Post
    Is there any way to display a 10-bit YUV image using OpenGL?
    The bit depth of the source data has no bearing upon the bit depth of the output.

    If the hardware only supports 8-bit output, you can either convert 10-bit YUV to 8-bit RGB, or convert 10-bit YUV to 10-bit RGB (rendering to a suitable texture attached to a framebuffer object) then convert 10-bit RGB to 8-bit RGB as a post-process (optionally using dithering or error-diffusion).
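
    A minimal sketch of the second option (width, height and the final fullscreen-quad pass are assumed): render the conversion into a GL_RGB10_A2 texture attached to a framebuffer object, then quantise to 8 bits in a last pass with an ordered dither.

    Code :
        // Sketch: 10-bit intermediate render target attached to an FBO
        GLuint tex10, fbo;
        glGenTextures(1, &tex10);
        glBindTexture(GL_TEXTURE_2D, tex10);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex10, 0);
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            printf("ERROR: framebuffer incomplete!\n");

        // 1. Render the YUV->RGB conversion into this FBO at 10-bit precision.
        // 2. Bind the default (8-bit) framebuffer and draw a fullscreen quad that
        //    samples tex10 with a dithering fragment shader such as:
        static const char *ditherFrag =
            "#version 330 core\n"
            "uniform sampler2D uImage;   /* the GL_RGB10_A2 intermediate image */\n"
            "in vec2 vTexCoord;\n"
            "out vec4 fragColor;\n"
            "const float bayer[16] = float[16](\n"
            "     0.0,  8.0,  2.0, 10.0,\n"
            "    12.0,  4.0, 14.0,  6.0,\n"
            "     3.0, 11.0,  1.0,  9.0,\n"
            "    15.0,  7.0, 13.0,  5.0);\n"
            "void main() {\n"
            "    vec3 c = texture(uImage, vTexCoord).rgb;\n"
            "    ivec2 p = ivec2(gl_FragCoord.xy) % 4;\n"
            "    float t = (bayer[p.y * 4 + p.x] + 0.5) / 16.0;\n"
            "    /* quantise to 8 bits; the threshold spreads the rounding error */\n"
            "    fragColor = vec4(floor(c * 255.0 + t) / 255.0, 1.0);\n"
            "}\n";

    The YUV-to-RGB step itself can be done in a fragment shader that samples the two P010 planes uploaded as, for example, GL_R16 and GL_RG16 textures.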

  5. #5
    Junior Member Newbie
    Join Date
    Jun 2016
    Posts
    3
    Thanks for the fast reply!

    I have the necessary hardware, of course. I can see the difference between 8-bit and 10-bit on my 10-bit display, using madVR (Mad Video Renderer) for example. Now I'm trying to write my own code that will render a 10-bit picture using OpenGL. I googled for a few days, and the only solution I found was the WGL_ARB_pixel_format extension (the 2008-09 NVIDIA and AMD recommendation documents). I compiled the samples and tried to run them, and ran into the problem I described above (wglChoosePixelFormat returns zero as the number of compatible formats).

    I'm new to OpenGL, but I suppose there should be another way (one that does not use the WGL_ARB_pixel_format extension).

    My task is to open raw 10-bit video (or an image -- it doesn't matter) and display it on a 10-bit display as true 10-bit (with minimal difference from the original, if possible).

    So, do you know any way to solve my problem? Or maybe you can advise me on manuals I should read?

    I've already written code for the 8-bit case (that was a simple task), so I'm mainly interested in the details of the 10-bit initialization.

  6. #6
    Senior Member OpenGL Guru
    Join Date
    Oct 2004
    Posts
    4,649
    Quote Originally Posted by timman View Post
    I'm new to OpenGL ... My task is to open raw 10-bit video (or an image ...) and display it on a 10-bit display as true 10-bit ...

    So, do you know any way to solve my problem? Or maybe you can advise me on manuals I should read?
    Apparently some nVidia Quadro GPUs support this over DisplayPort or dual-link DVI. For details and code snippets for GL context setup, see:

    * 30-Bit Color Technology for NVIDIA Quadro

    Supposedly some AMD consumer and professional line GPUs do as well:

    * AMD’s 10-bit Video Output Technology

    A little more on nVidia's 30-bit display support from the nVidia Linux driver README file:

    Quote Originally Posted by /usr/share/doc/NVIDIA_GLX-1.0/README.txt
    ______________________________________________________________________________

    Chapter 31. Configuring Depth 30 Displays
    ______________________________________________________________________________

    This driver release supports X screens with screen depths of 30 bits per pixel
    (10 bits per color component). This provides about 1 billion possible colors,
    allowing for higher color precision and smoother gradients.

    When displaying a depth 30 image, the color data may be dithered to lower bit
    depths, depending on the capabilities of the display device and how it is
    connected to the GPU. Some devices connected via analog VGA or DisplayPort can
    display the full 10 bit range of colors. Devices connected via DVI or HDMI, as
    well as laptop internal panels connected via LVDS, will be dithered to 8 or 6
    bits per pixel.

    To work reliably, depth 30 requires X.Org 7.3 or higher and pixman 0.11.6 or
    higher.

    In addition to the above software requirements, many X applications and
    toolkits do not understand depth 30 visuals as of this writing. Some programs
    may work correctly, some may work but display incorrect colors, and some may
    simply fail to run. In particular, many OpenGL applications request 8 bits of
    alpha when searching for FBConfigs. Since depth 30 visuals have only 2 bits of
    alpha, no suitable FBConfigs will be found and such applications will fail to
    start.
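
    For anyone doing this on Linux, the alpha-size pitfall mentioned at the end of that chapter looks roughly like this in code (a sketch that only counts matching depth-30 FBConfigs; no window or context is created):

    Code :
        /* Sketch: count GLX FBConfigs that offer 10 bits per colour component.
         * Note GLX_ALPHA_SIZE is 2, not 8 -- asking for 8 would filter out
         * every depth-30 config, as the README above explains. */
        #include <stdio.h>
        #include <X11/Xlib.h>
        #include <GL/glx.h>

        int main(void)
        {
            Display *dpy = XOpenDisplay(NULL);
            if (dpy == NULL)
                return 1;

            int attribs[] = {
                GLX_RENDER_TYPE,   GLX_RGBA_BIT,
                GLX_DRAWABLE_TYPE, GLX_WINDOW_BIT,
                GLX_RED_SIZE,      10,
                GLX_GREEN_SIZE,    10,
                GLX_BLUE_SIZE,     10,
                GLX_ALPHA_SIZE,    2,
                GLX_DOUBLEBUFFER,  True,
                None
            };

            int count = 0;
            GLXFBConfig *configs =
                glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
            printf("%d matching depth-30 FBConfigs\n", count);

            if (configs != NULL)
                XFree(configs);
            XCloseDisplay(dpy);
            return 0;
        }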
