
Thread: Setting depth buffer (z buffer) size (16bit / 24bit)

  1. #1
    Join Date: Mar 2010
    Posts: 73

    Setting depth buffer (z buffer) size (16bit / 24bit)

    Hello all,

    I have a GeForce 8500 GT running Linux with driver version 173.14.12. I would like to change the size of the depth buffer (z buffer) from 24-bit to 16-bit, but I cannot find any option for this, neither in the driver's xorg.conf options nor in the nvidia-settings GUI tool.

    Is this supported by the Linux driver? The Windows driver lets me do that.

  2. #2
    Join Date: May 2008
    Posts: 1,149

    Setting depth buffer (z buffer) size (16bit / 24bit)

    GLUT on its own will not solve this: it only lets you indicate that you want a depth buffer at all, and gives you no control over its size.

    To control the size, you need to use a platform-specific interface such as AGL, GLX, or WGL. Of course, you also need graphics hardware and a driver that actually support the depth size you want.
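
    As an illustration, here is a minimal GLX sketch that requests a 16-bit depth buffer explicitly (assuming X11 headers and libGL are available). Note that GLX_DEPTH_SIZE is a minimum, so the driver may still hand back a deeper visual:

    Code:
    #include <GL/glx.h>
    #include <X11/Xlib.h>
    #include <cstdio>

    int main()
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        // Ask for an RGBA, double-buffered visual with at least 16 depth bits.
        int attribs[] = {
            GLX_RGBA,
            GLX_DOUBLEBUFFER,
            GLX_DEPTH_SIZE, 16,
            None
        };

        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) {
            std::fprintf(stderr, "No matching visual\n");
            return 1;
        }

        // Check what the driver actually granted; it may round up to 24 bits.
        int depthBits = 0;
        glXGetConfig(dpy, vi, GLX_DEPTH_SIZE, &depthBits);
        std::printf("Depth buffer size: %d bits\n", depthBits);

        XFree(vi);
        XCloseDisplay(dpy);
        return 0;
    }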

  3. #3
    Join Date: May 2008
    Posts: 1,196

    Setting depth buffer (z buffer) size (16bit / 24bit)

    I am trying to do some sanity debugging while writing shaders. It would help a lot if I could control the value the depth buffer is cleared to, so I can test against it from inside my shader. In OpenGL this is done with glClearDepth as far as I can tell; I am not sure about D3D. I assume the right place for it would be the Viewport class. Maybe the interface would look like this:

    Code:
    /// @param value  the value to initialize all pixels in the depth buffer with
    void Viewport::setDepthClear(double value);
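
    A minimal sketch of how such a method could forward to OpenGL. The Viewport class and the m_depthClear member are just the hypothetical interface from above; glClearDepth and glClear are the actual GL calls:

    Code:
    #include <GL/gl.h>

    // Hypothetical Viewport class built around the interface above.
    class Viewport {
    public:
        void setDepthClear(double value);
    private:
        double m_depthClear = 1.0;   // GL's default depth clear value
    };

    void Viewport::setDepthClear(double value)
    {
        m_depthClear = value;        // GL clamps the value to [0, 1]
        glClearDepth(value);         // used by the next glClear(GL_DEPTH_BUFFER_BIT)
    }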

  4. #4
    Join Date: May 2008
    Posts: 1,467

    Setting depth buffer (z buffer) size (16bit / 24bit)

    You used to be able to just pick the OpenGL ES template and enable the depth buffer by changing a 0 to a 1 in the EAGLView.m file:

    Code:
    #define USE_DEPTH_BUFFER 0   // template default: depth buffer disabled
    #define USE_DEPTH_BUFFER 1   // change to 1 to enable the depth buffer
    When iPhone OS 3.0 came out, the OpenGL ES template was rewritten to support OpenGL ES 2.0, and that simple way of enabling the depth buffer went away.
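
    With the ES 2.0 style template you have to create and attach the depth renderbuffer yourself. A rough sketch of the idea (the backingWidth/backingHeight values here are placeholders for the drawable size the view already knows); GL_DEPTH_COMPONENT16 is what gives you a 16-bit depth buffer:

    Code:
    #include <OpenGLES/ES2/gl.h>   // iOS header; use <GLES2/gl2.h> on other platforms

    // Placeholder drawable size; in the real view this comes from the CAEAGLLayer.
    static GLint backingWidth = 320, backingHeight = 480;

    GLuint createDepthBuffer()
    {
        GLuint depthRenderbuffer = 0;
        glGenRenderbuffers(1, &depthRenderbuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);

        // 16-bit depth storage matching the color buffer's dimensions.
        glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                              backingWidth, backingHeight);

        // Attach it to the currently bound framebuffer.
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                  GL_RENDERBUFFER, depthRenderbuffer);
        return depthRenderbuffer;
    }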

  5. #5
    Join Date: May 2008
    Posts: 1,149

    Setting depth buffer (z buffer) size (16bit / 24bit)

    Turning on the depth buffer with QGLFormat was not a problem, and I have read through the documentation on several sites. The thing I cannot find documented anywhere is how to set the number of depth bits to use. The equivalent GTK code would be as follows:


    Code:
    int attrlist[] =
    {
        GDK_GL_RGBA,
        GDK_GL_DOUBLEBUFFER,
        GDK_GL_DEPTH_SIZE, 16,
        GDK_GL_NONE
    };

    if ((glarea = gtk_gl_area_new(attrlist)) == NULL)
    {
        g_print("Error creating GtkGLArea!\n");
        return NULL;
    }