Hi there,
I’ve been working on an OpenGL ES 2.0 project for some time now, and recently ported it to Windows. That raised some problems along the way which might be useful for somebody else to read about, and it has left me with a few questions too.
At first I was running it on an NVidia GPU. Since my graphical application performs shadow mapping, I needed to create a depth texture that could be rendered to. On ES 2.0 platforms this requires the GL_OES_depth_texture extension; on PC it seems to require GL_ARB_depth_texture instead. With that one change, my application ran fine under Windows.
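In case it’s useful to anyone doing the same port, here is roughly what that setup looks like. This is just a sketch; the function name and parameters are placeholders rather than my actual code:

/* Create a depth texture and attach it as the FBO's depth attachment.
   GL_DEPTH_COMPONENT as both internal format and format is what
   GL_OES_depth_texture specifies, and desktop GL accepts it too. */
GLuint createShadowDepthTexture(GLuint fbo, GLsizei size)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, size, size,
                 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, tex, 0);
    return tex;
}

The creation code itself could stay the same on both platforms; the only difference was which extension string to check for.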
Some time later I had to run it on an ATI Radeon GPU, and two main problems arose. Firstly, the framebuffer with the depth texture attached was never reported complete: the driver objected to me ignoring the draw buffer while providing no colour attachment. I was under the impression that the GL_ARB_ES2_compatibility extension would make this unnecessary (the extension was advertised on both my NVidia and ATI GPUs). Updating the device drivers on the ATI machine seemed to rectify this, so I assume it was a driver bug.
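For anyone else who hits an incomplete depth-only FBO before a driver update saves them: as far as I know, the conventional explicit fix on desktop GL is to opt out of colour output entirely. Note that these two calls don’t exist in ES 2.0, which is presumably why I had never needed them before (shadowFBO is a placeholder name):

glBindFramebuffer(GL_FRAMEBUFFER, shadowFBO);
glDrawBuffer(GL_NONE);  /* no colour buffer will be written */
glReadBuffer(GL_NONE);  /* no colour buffer will be read */
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* still incomplete: the draw buffer wasn't the problem */
}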
I still had another problem though: my shadow-mapping shaders weren’t compiling. Firstly, the line
#extension GL_EXT_shadow_samplers : require
wasn’t working, simply because the extension didn’t exist. Fair enough: the extension isn’t in the extension list of either the NVidia or the ATI GPU… but then how is it that the NVidia GPU was fine with this?
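In hindsight I should have been gating the #extension line on a runtime check rather than assuming. A naive sketch of what I mean (strstr can false-positive when one extension name is a prefix of another, but it shows the idea):

#include <string.h>

/* Returns non-zero if the named extension appears in the extension string. */
int hasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

On both of my machines, hasExtension("GL_EXT_shadow_samplers") would report the extension as absent, which is what makes the NVidia behaviour so puzzling.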
With that requirement no longer fulfilled, the line
mediump float shadow = shadow2DProjEXT(s_shadowTex, v_shadowCoord);
failed because it couldn’t resolve the function shadow2DProjEXT.
Of course, the obvious solution was to no longer require the extension and instead use the z-component of the vec4 returned by shadow2DProj(). But I’m still at a loss as to why the NVidia GPU was fine with my use of ES 2.0 extensions and functions. Maybe the NVidia driver implements GL_ARB_ES2_compatibility more thoroughly? More importantly: is it the NVidia GPU that is behaving improperly, or the ATI GPU?
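In case it helps anyone else making the same port, the fallback ends up looking roughly like the following. Only s_shadowTex and v_shadowCoord come from my real shader; the rest is illustrative, and I’ve dropped the precision qualifier since plain desktop GLSL 1.10/1.20 doesn’t accept them:

static const char *shadowFragSrc =
    "uniform sampler2DShadow s_shadowTex;\n"
    "varying vec4 v_shadowCoord;\n"
    "void main() {\n"
    "    /* On desktop, shadow2DProj performs the projective depth\n"
    "       comparison itself and returns the result in a vec4. */\n"
    "    float shadow = shadow2DProj(s_shadowTex, v_shadowCoord).z;\n"
    "    gl_FragColor = vec4(vec3(shadow), 1.0);\n"
    "}\n";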
Thanks for any light shed!