View Full Version : Porting OpenGL to GLESv2

12-20-2017, 12:34 PM
I'm trying to port an OpenGL program to GLESv2. The program uses the following code to render a texture to the default framebuffer (it also fails if I render to an FBO, which likewise works fine under OpenGL).

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glVertexAttribPointer(bgra_texcoords, 2, GL_FLOAT, GL_FALSE, 0, display_texcoords);

glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glBindTexture(GL_TEXTURE_2D, inst->texture);
glUniform1i(bgra_texture, 0);
glViewport(inst->x, root_surface->h - (inst->y + inst->h), inst->w, inst->h);

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glBindBuffer(GL_ARRAY_BUFFER, 0);

This works fine with OpenGL but it fails on glDrawArrays() under GLESv2 with GL_INVALID_VALUE. I read this question: https://stackoverflow.com/questions/24702349/gldrawelements-throw-gl-invalid-value-error which is very similar to my problem but I can't figure out how to apply the solution to my code since I'm not using VertexArray and I'm very new to GL.
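One common cause of GL_INVALID_VALUE in this situation (and, if I read it right, the root of the linked question) is an attribute location of -1: glGetAttribLocation() returns -1 when an attribute is inactive, optimized out, or misspelled, and -1 wraps around to a GLuint far above GL_MAX_VERTEX_ATTRIBS, so every call taking that index fails. A hedged sketch of the check; the attribute names here are made up, substitute your real ones:

```c
/* Sketch only -- "program", "position" and "texcoord" are assumptions;
 * a current EGL/GLES2 context is required. */
GLint pos_loc = glGetAttribLocation(program, "position");
GLint tex_loc = glGetAttribLocation(program, "texcoord");
if (pos_loc < 0 || tex_loc < 0) {
    /* -1 means the attribute is inactive (optimized out) or the name
     * is wrong; using it as a GLuint index sets GL_INVALID_VALUE. */
    fprintf(stderr, "missing vertex attribute\n");
}
glEnableVertexAttribArray((GLuint)pos_loc);
glEnableVertexAttribArray((GLuint)tex_loc);
```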

inst->texture is a texture uploaded with glTexImage2D(). I created the vertex_buffer right after initializing EGL and compiling the shaders:

glGenBuffers(1, &vertex_buffer);
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
    vertices, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glVertexAttribPointer(bgra_pos, 2, GL_FLOAT, GL_FALSE, 0, 0);
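For reference, a hedged sketch of a draw sequence that is valid in core GLES2 (which has no VAOs, so attribute pointer state must be set with the right buffer bound): glVertexAttribPointer() captures whatever GL_ARRAY_BUFFER is bound at call time, so bind the VBO first and pass a byte offset for buffered attributes; a client-side pointer like display_texcoords is only legal while no buffer is bound. Variable names mirror the question's code:

```c
/* Buffered attribute: bind the VBO, then the last argument is a byte
 * offset into it, not a pointer. */
glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
glVertexAttribPointer(bgra_pos, 2, GL_FLOAT, GL_FALSE, 0, (void *)0);
glEnableVertexAttribArray(bgra_pos);

/* Client-side attribute: unbind GL_ARRAY_BUFFER first, then the last
 * argument is a real pointer into application memory. */
glBindBuffer(GL_ARRAY_BUFFER, 0);
glVertexAttribPointer(bgra_texcoords, 2, GL_FLOAT, GL_FALSE, 0,
                      display_texcoords);
glEnableVertexAttribArray(bgra_texcoords);

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
```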

Edit: You can look at the whole source file here: https://github.com/fernando-rodriguez/mediabox/blob/bc4135d9568b2c5b4e8f39ac63ded2cb66023bcd/src/lib/ui/video-opengl.c. The file is a video "driver" for a compositor; all it does is create 2D surfaces and render them to the screen. If there is anything wrong with the question or I am missing something, please post a comment so I can fix it. Thanks.

Dark Photon
12-20-2017, 06:07 PM
I didn't pick through all your code, but here are a few thoughts.

I suspect your avbox_glTexImage2D() wrapper should actually call glTexImage2D on the GLES side.
Are you actually calling glCheckFramebufferStatus()? The #if guard before it looks backwards: it says "#ifdef NDEBUG", but this is more commonly written "#ifndef NDEBUG" so that the check is only compiled into debug builds.
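To illustrate the guard: NDEBUG is by convention defined in release builds, so debug-only validation belongs under #ifndef NDEBUG. A sketch (the CHECK_FBO macro name is made up, and the GL calls need a current context):

```c
#ifndef NDEBUG   /* compiled in only when NDEBUG is NOT defined, i.e. debug builds */
#  define CHECK_FBO()                                              \
    do {                                                           \
        GLenum st = glCheckFramebufferStatus(GL_FRAMEBUFFER);      \
        if (st != GL_FRAMEBUFFER_COMPLETE)                         \
            fprintf(stderr, "%s:%d: FBO incomplete (0x%04x)\n",    \
                    __FILE__, __LINE__, st);                       \
    } while (0)
#else
#  define CHECK_FBO() ((void)0)   /* stripped from release builds */
#endif
```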
If you haven't already, read up on the color-renderable internal formats in the GLES 2.0 spec (https://www.khronos.org/registry/OpenGL/specs/es/2.0/es_full_spec_2.0.pdf) (and here (https://stackoverflow.com/questions/18688057/which-opengl-es-2-0-texture-formats-are-color-depth-or-stencil-renderable)). I think you'll be surprised at how constrained the default set is without extensions. Also check the internal format you're assigning to inst->texture. It's not one of those, right? Then check your extension string to see the full set your implementation supports.
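For that last step, note that the GL_EXTENSIONS string is space-separated and some extension names are prefixes of others, so match whole tokens rather than calling strstr() directly. A hedged sketch of such a helper (the function name is made up; GL_OES_rgb8_rgba8 is one extension that makes RGBA8 color-renderable):

```c
#include <string.h>

/* Return 1 if "name" appears as a whole space-separated token in
 * "ext_list" (the string returned by glGetString(GL_EXTENSIONS)). */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_list || p[-1] == ' ');
        int ends = (p[len] == ' ' || p[len] == '\0');
        if (starts && ends)
            return 1;
        p += len;
    }
    return 0;
}

/* Usage (needs a current context):
 *   const char *exts = (const char *)glGetString(GL_EXTENSIONS);
 *   if (!has_extension(exts, "GL_OES_rgb8_rgba8")) { ... fall back ... }
 */
```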