View Full Version : ES2 extensions under Nvidia and ATI GPUs



mrdodo
07-02-2015, 03:34 AM
Hi there,

I've been working on an OpenGL ES 2.0 project for some time now, and recently ported it to Windows. This raised some problems along the way that might be useful for somebody else to read about, and it has left me with a few questions too.

At first I was running it on an NVidia GPU. Since my graphical application performs shadow mapping, I needed to create a depth texture that could be rendered to. On ES 2.0 platforms this requires the GL_OES_depth_texture extension; on PC it seems to require GL_ARB_depth_texture instead. With this simple change, my application ran fine under Windows.
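For anyone doing a similar port, the runtime check can be sketched like this (the helper name is my own; in real code you would pass it the string from glGetString(GL_EXTENSIONS), then test for GL_OES_depth_texture on ES or GL_ARB_depth_texture on desktop). Note that a naive strstr is not enough, since one extension name can be a prefix of another:

```c
#include <string.h>

/* Token-safe extension lookup (helper name is my own). A plain
 * strstr() would wrongly match "GL_ARB_depth_texture" inside a
 * longer name such as "GL_ARB_depth_texture_cube_map". */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extensions) || (p[-1] == ' ');
        int ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;  /* partial match; keep scanning */
    }
    return 0;
}
```

(On a core-profile desktop context you would instead enumerate extensions one by one with glGetStringi, but for a compatibility or ES context the single-string check above applies.)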

Some time later I had to run it on an ATI Radeon GPU, and two main problems arose. Firstly, the depth-texture framebuffer was never reaching completeness, because the driver didn't like that I was ignoring the draw buffer. I was under the impression that the GL_ARB_ES2_compatibility extension would make this unnecessary (the extension was available on both my NVidia and ATI GPUs). Updating the device drivers on the ATI machine seemed to rectify this; I assume it was a driver bug.
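For reference, the depth-only framebuffer setup in question looks roughly like the following (a desktop-GL sketch, not runnable without a context; the 1024x1024 resolution is arbitrary). On ES 2.0 there is no glDrawBuffer at all, which is why a strict desktop driver may want to be told explicitly that no color buffer is involved:

```c
GLuint fbo, depthTex;

glGenTextures(1, &depthTex);
glBindTexture(GL_TEXTURE_2D, depthTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, 1024, 1024, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, depthTex, 0);

/* Explicitly declare that no color buffer is drawn or read; without
 * this, a strict driver can report the FBO incomplete because the
 * default draw buffer points at a missing color attachment. */
glDrawBuffer(GL_NONE);
glReadBuffer(GL_NONE);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle the error */
}
```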

I still had another problem though: my shadow mapping shaders weren't compiling. Firstly the line

#extension GL_EXT_shadow_samplers : require
wasn't working, simply because the extension didn't exist. This is a good point: the extension isn't in the list of extensions for either the NVidia or ATI GPUs... but then how is it that the NVidia GPU was fine with this?

With this requirement no longer fulfilled, of course, the line

mediump float shadow = shadow2DProjEXT(s_shadowTex, v_shadowCoord);
failed because it couldn't resolve the function shadow2DProjEXT.

Of course, the obvious solution was to no longer require the extension and instead use the z-component of the vec4 returned by shadow2DProj(-). But I'm still at a loss as to why the NVidia GPU was fine with my use of ES 2.0 extensions and functions. Maybe the NVidia driver implements GL_ARB_ES2_compatibility more thoroughly? More importantly, I want to know: is it the NVidia GPU that is behaving improperly, or the ATI one?

Thanks for any light shed!

Alfonse Reinheart
07-02-2015, 07:07 AM
how is it that the NVidia GPU was fine with this?

... this is kind of a grey area. Well, that's assuming that you had a `#version 100` at the top of your shader, which would indicate that you want OpenGL ES 2.0's shading language.

I say it's a grey area because the specification is not clear on what extensions are allowed. It says, "Shaders that specify #version 100 will be treated as targeting version 1.00 of the OpenGL ES Shading Language, which is a strict subset of version 1.50."

This could be taken to mean "the compiler will convert GLSLES 1.00 shaders into their GLSL 1.50 equivalents". In that case, NVIDIA's behavior is wrong.

However, if the goal of ES2_compatibility is to allow you to take a GLSLES 1.00 shader and use it in desktop GL without modifications, then certainly any extensions that the shader references would be OpenGL ES extensions, not desktop GL extensions (which would not be legal GLSLES 1.00 for any implementation).

Moreover, the extension specification says nothing about how to go about compiling GLSLES 1.00. It simply says that it uses that shading language. Well, that shading language also has its own set of extensions. So one could logically conclude that #extension declarations refer to ES extensions.

But the specification is not clear on this point. And it certainly provides no way to ask for ES extensions that are available to the ES GLSL compiler. So which extensions are available would be a pure guess.

So there are justifications for either behavior, with no clear direction as to which should be required.

mrdodo
07-02-2015, 09:03 AM
Thanks for your response. It was interesting to hear that it likely comes down to a difference in interpretation.

For the benefit of anybody in a similar situation, for now I have opted to declare
#extension GL_EXT_shadow_samplers : enable
instead of
#extension GL_EXT_shadow_samplers : require
This merely causes a warning, rather than a compile error, if the extension isn't available. Later in the shader I can then use #ifdef GL_EXT_shadow_samplers to decide which function to call:

#ifdef GL_EXT_shadow_samplers
mediump float shadow = shadow2DProjEXT(s_shadowTex, v_shadowCoord);
#else
mediump float shadow = shadow2DProj(s_shadowTex, v_shadowCoord).z;
#endif