Strange Issue With AMD R7 and R9 Drivers - Maybe Depth Buffer Problem

Hi!
I have my own 3D real-time engine, and it uses OpenGL to render 3D scenes.
A strange bug is happening with how OpenGL handles depth buffering after the 4.3 implementation (it seems the depth buffer no longer exists…).

The strange thing is this:

I tested it with two graphics cards:

AMD R7 260X card
AMD R9 280X card

On both, using the original driver that came with the card, the objects are rendered nicely (OpenGL 4.3 - Image 01).

But with every driver update after that version, without any change to the engine, the objects strangely start to be rendered without Z-buffer testing, I think… (Image 02).

I have been in contact with AMD, but it seems they have no idea about this problem; maybe it is caused by a bug in OpenGL and not in the AMD driver itself.

Can anyone here help me with this problem?
It has now been more than 6 months that I have been trying to solve this issue, and I simply don't want to reinstall the original drivers every time the drivers are updated (I want my system up to date without problems :wink:)

Any help will be much appreciated.

Kind Regards.

If you suspect depth issues, try posting some code related to how you are allocating a depth buffer and setting up depth state. Or better yet, post a short standalone test program which illustrates the problem.
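
For reference, the sort of thing I mean looks roughly like this (just a sketch, not your engine's actual code; `fboWidth`/`fboHeight` are placeholders, and the default framebuffer's depth buffer is instead requested from the windowing toolkit when the context is created):

    // Give an off-screen render target (FBO) a depth attachment
    GLuint depthRbo;
    glGenRenderbuffers(1, &depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, fboWidth, fboHeight);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRbo);

    // Typical depth state before drawing
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);      // allow depth writes
    glDepthFunc(GL_LESS);      // pass fragments closer than what is stored
    glClearDepth(1.0);         // value written by glClear
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);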

Hi Dark Photon,

That would be too much code to post, so I will only include the main OpenGL code:

Rendering code:


glDrawElements(GL_TRIANGLES,QtdIndices, GL_UNSIGNED_INT, 0);

Buffer binding code:


    glBindBuffer(GL_ARRAY_BUFFER, Id[0]);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, Id[1]);

    quintptr offset = 0;

    int vertexLocation = pShader->attributeLocation("a_position");
    pShader->enableAttributeArray(vertexLocation);
    glVertexAttribPointer(vertexLocation       ,
                                3                    ,
                                GL_FLOAT             ,
                                GL_FALSE             ,
                                sizeof(VertexData)   ,
                                (const void *)offset);


    offset += sizeof(CIMTVetor);

    int texcoordLocation = pShader->attributeLocation("a_texcoord");
    pShader->enableAttributeArray(texcoordLocation);
    glVertexAttribPointer(texcoordLocation,
                               2               ,
                               GL_FLOAT        ,
                               GL_FALSE        ,
                               sizeof(VertexData),
                               (const void *)offset);

    offset += sizeof(CIMTVetor2);

    int normalcoordLocation = pShader->attributeLocation("a_normal");
    pShader->enableAttributeArray(normalcoordLocation);
    glVertexAttribPointer(normalcoordLocation,
                                3                  ,
                                GL_FLOAT           ,
                                GL_FALSE           ,
                                sizeof(VertexData) ,
                                (const void *)offset);

    offset += sizeof(CIMTVetor);

    int smoothcoordLocation = pShader->attributeLocation("a_smooth");
    pShader->enableAttributeArray(smoothcoordLocation);
    glVertexAttribPointer(smoothcoordLocation,
                                3                  ,
                                GL_FLOAT           ,
                                GL_FALSE           ,
                                sizeof(VertexData) ,
                                (const void *)offset);


    offset += sizeof(CIMTVetor);

    int tancoordLocation = pShader->attributeLocation("a_tangente");
    pShader->enableAttributeArray(tancoordLocation);
    glVertexAttribPointer(tancoordLocation,
                                3                  ,
                                GL_FLOAT           ,
                                GL_FALSE           ,
                                sizeof(VertexData) ,
                                (const void *)offset);


    offset += sizeof(CIMTVetor);

    int bitancoordLocation = pShader->attributeLocation("a_bitangente");
    pShader->enableAttributeArray(bitancoordLocation);
    glVertexAttribPointer(bitancoordLocation,
                                3                  ,
                                GL_FLOAT           ,
                                GL_FALSE           ,
                                sizeof(VertexData) ,
                                (const void *)offset);

Buffer initialization code:


    glGenBuffers(3, vboIds);
    glGenQueries(1, qryIds);

    glBindBuffer(GL_ARRAY_BUFFER, vboIds[0]);
    glBufferData(GL_ARRAY_BUFFER,
                      QtdVertices * sizeof(VertexData),
                      VertexBuffer,
                      GL_STATIC_DRAW);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIds[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                      QtdIndices * sizeof(GLuint),
                      VertexIndices,
                      GL_STATIC_DRAW);

OpenGL states used during rendering:


    glClear     ( GL_COLOR_BUFFER_BIT  |
                       GL_DEPTH_BUFFER_BIT  |
                       GL_ACCUM_BUFFER_BIT  );
    glEnable     (GL_DEPTH_TEST    );
    glEnable     (GL_CULL_FACE     );
    glCullFace   (GL_BACK);
    glPolygonMode(GL_FRONT,GL_FILL);
    glEnable     (GL_TEXTURE_2D    );

I don't believe it is a problem in the code, since this is a multi-platform engine and it works nicely on iOS, Android, Windows, Mac and Linux machines.

I cannot provide an executable because I would need to generate an installation package (I use Qt and a lot of prerequisites), but the engine is available on GitHub as Insane Engine 3D and you can request access to it (you will have less work getting the engine from there and running it than I would have generating the package).

The strangest thing about this issue is that it only happens on AMD R7 and R9 graphics cards, and only with the driver updates that ship OpenGL 4.4 and 4.5.
If I install the original driver (with OpenGL 4.3), it works nicely.

For example, I installed my R9 280X in my machine last week and the system worked nicely; yesterday the graphics card driver was updated and the system started to show the depth issue…

On the same day I tested the same code on my students' machines (I am a 3D OpenGL programming professor at a university), and on all the other machines the code works beautifully (NVIDIA and IBM graphics cards, and some Mac and Linux machines), so the problem only happens on AMD graphics cards with updated drivers (if I use the original driver that came with the card, the system works too).

This issue affects all my main 3D objects: they are rendered well, but without “depth”… (In Image 2 we see a box being rendered in front of the mountain, but this box is located behind the mountain and should not be visible…).

Any help with this will be much appreciated. I don't want to change my graphics card, since I like the AMD cards, and I want to keep my system updated… (Currently I have to block driver updates to avoid this problem.)

AMD does not give any feedback about this issue either… :wink:

Kind Regards.

Well, you didn’t really provide much detail about what you’re doing with depth. You’re clearing the depth buffer (if present) and enabling depth test, but you didn’t show what your depth writemask is, what your depth clear value is, what your depth comparison function is, how you are allocating depth in the render target you’re using, whether you’re actually verifying that the render target itself “does” have depth, whether you’re checking for GL errors, etc.
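
If you want to dump that state, a sketch along these lines (assuming a current GL context) will tell you what the driver actually has set; with Qt, the default framebuffer's depth size ultimately comes from the QSurfaceFormat used to create the context:

    GLboolean depthMask;  glGetBooleanv(GL_DEPTH_WRITEMASK,   &depthMask);
    GLint     depthFunc;  glGetIntegerv(GL_DEPTH_FUNC,        &depthFunc);
    GLfloat   clearVal;   glGetFloatv  (GL_DEPTH_CLEAR_VALUE, &clearVal);

    // Depth bits of the default framebuffer (0 means there is no depth buffer at all);
    // use GL_DEPTH_ATTACHMENT instead of GL_DEPTH if an FBO is currently bound.
    GLint depthBits = 0;
    glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH,
                                          GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE,
                                          &depthBits);

    GLenum err = glGetError();   // non-zero means some earlier call was rejected
    printf("mask=%d func=0x%X clear=%f depthBits=%d err=0x%X\n",
           depthMask, depthFunc, clearVal, depthBits, err);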

So the best recommendation I can make is to write up a short test program that renders with a depth buffer and see if it works properly with the new drivers. If not, you've got the perfect short repro case to post here as a sanity check and to mail to ATI to get them to fix their drivers (if in fact it's a driver bug). If it does work, then you've got some work to do figuring out what the key difference is between your short test program and your full-up application that's apparently "breaking" depth test behavior.

Hi Dark Photon,

Thanks so much for your response.

About:
“Well, you didn’t really provide much detail about what you’re doing with depth. You’re clearing the depth buffer (if present) and enabling depth test, but you didn’t show what your depth writemask is, what your depth clear value is, what your depth comparison function is, how you are allocating depth in the render target you’re using, whether you’re actually verifying that the render target itself “does” have depth, whether you’re checking for GL errors, etc.”

I didn't provide any detail because I don't set a depth writemask, I don't specify a depth comparison function, and I don't allocate any depth buffer myself… I use the default values. I think it is a depth problem because the results I'm getting on the screen look like a depth problem.
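
Just so it is written down, this is effectively all the depth-related state the engine relies on, i.e. the GL defaults spelled out (a sketch of the spec defaults, not actual engine code):

    // These are the OpenGL default values the engine currently depends on
    glDepthMask(GL_TRUE);    // default writemask: depth writes enabled
    glDepthFunc(GL_LESS);    // default comparison function
    glClearDepth(1.0);       // default depth clear value
    // The depth buffer itself is whatever the windowing system / Qt context
    // provides by default; the engine does not request a specific size.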

I will post a compiled executable with all the DLLs so that you can run it on your desktop.

Again, I don't believe it is an error in the code, since it works fine on many other machines. Do you have any contact to forward this problem to ATI?

Sadly, I believe I will need to change to a different graphics card vendor, since everything works on all the others, and tell the users of the engine that we do not recommend AMD R7 and R9 cards because of this problem.

Again, thanks so much for your help.

That’s OK. Don’t worry about it. With source, it could help you get an answer. An executable alone is a black box.

For your own testing, you could read back the depth buffer and ensure that it is being set as you expect it to.
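
Something along these lines would do it (a sketch; assumes the default framebuffer has a depth buffer and that `width`/`height` are the drawable size):

    // Read the depth buffer back as floats in [0, 1]
    std::vector<GLfloat> depth(width * height);
    glReadPixels(0, 0, width, height, GL_DEPTH_COMPONENT, GL_FLOAT, depth.data());
    // Right after glClear every value should equal the clear depth (1.0 by default);
    // after drawing, covered pixels should be smaller. If the values never change,
    // either depth writes are off or there is no depth buffer to write to.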
