
View Full Version : Bugs with apple's 2.0 opengl software renderer?



tommyg
09-20-2016, 05:12 AM
Hi!
I have a few issues that I've experienced using OpenGL - are these bugs? I'm using Code::Blocks and GLUT with Xcode 2.5 on OS X 10.4 (actually also on a Hackintosh - if that makes any difference), which uses Apple's 2.0 software renderer. I don't have any newer hardware to test on at the minute.

glGet-ting GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS gives 16, but when I glGet GL_ACTIVE_TEXTURE after glActiveTexture-ing above the 8th one, it will still be set to the 8th one.

glShaderSource with length set to the length of the chars passed will give GL_INVALID_VALUE, but setting it to null works fine...

Are these known issues? Am I misunderstanding something? Is there somewhere that catalogues issues per OS, if they are?
I'm also having issues with pixels read back from textures being different from what was put in, but I haven't narrowed down the exact issue yet.
Cheers!

GClements
09-20-2016, 10:21 AM
glGet-ting GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS gives 16, but when I glGet GL_ACTIVE_TEXTURE after glActiveTexture-ing above the 8th one, it will still be set to the 8th one.

That would appear to be a bug.



glShaderSource with length set to the length of the chars passed will give GL_INVALID_VALUE, but setting it to null works fine...

The documented errors for glShaderSource() are:


GL_INVALID_VALUE is generated if shader is not a value generated by OpenGL.
GL_INVALID_OPERATION is generated if shader is not a shader object.
GL_INVALID_VALUE is generated if count is less than 0.

Have you confirmed that the error was actually generated by glShaderSource(), and not left over from a previous command? I.e. glGetError() was called immediately prior to the glShaderSource() call and returned GL_NO_ERROR?
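
For example, something along these lines (with someGLCall() standing in for whichever call is under suspicion) pins the error down to a single command:

// Drain anything left over from earlier commands.
while (glGetError() != GL_NO_ERROR)
    ;

someGLCall(); // the call under suspicion

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("someGLCall() itself generated 0x%04X\n", err);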

tommyg
09-21-2016, 04:25 AM
Hello! Thanks for the reply!
Yes, errors were cleared first. I've since looked again, and it was my error: incorrectly including the null terminator when counting the length of the chars is what caused the GL_INVALID_VALUE... nothing seemed to care on Windows!
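
For anyone hitting the same thing, the difference boils down to this (assuming a shader object and a null-terminated source string, here just called shader and source):

const GLchar *src = source;
GLint len = (GLint)strlen(source);      // character count only, no '\0'

glShaderSource(shader, 1, &src, &len);  // fine
glShaderSource(shader, 1, &src, NULL);  // also fine - treated as null-terminated

// Counting the '\0' as well (strlen(source) + 1) is what produced the
// GL_INVALID_VALUE on this renderer.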

Don't think there's any place that the error could be mine with the active texture thing though...
Still haven't figured out the reading-back-pixels-differently thing either...

arekkusu
09-21-2016, 01:37 PM
Don't think there's any place that the error could be mine with the active texture thing though...

Show your code. For example:

GLint maxCombinedUnits, activeUnit;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &maxCombinedUnits);
printf("%s %s: %d\n", glGetString(GL_RENDERER), glGetString(GL_VERSION), maxCombinedUnits);
glGetIntegerv(GL_ACTIVE_TEXTURE, &activeUnit);
printf("before: %04X\n", activeUnit);
glActiveTexture(GL_TEXTURE0 + 10);
glGetIntegerv(GL_ACTIVE_TEXTURE, &activeUnit);
printf("after: %04X\n", activeUnit);
assert(GL_NONE == glGetError());
works as expected on OSX 10.9:

Apple Software Renderer 2.1 APPLE-9.6.5: 16
before: 84C0
after: 84CA

If you're confusing (http://www.nvidia.com/object/General_FAQ.html#t6) glActiveTexture with glClientActiveTexture, then of course it won't work.
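
For reference, the two select different pieces of state - roughly this (tex and texCoords below are just placeholders):

// Server-side selector: affects texture binding and per-unit texture state.
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, tex);

// Client-side selector: only affects which unit the fixed-function
// texture coordinate array (glTexCoordPointer) refers to.
glClientActiveTexture(GL_TEXTURE2);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, texCoords);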

tommyg
09-21-2016, 03:44 PM
Cheers! No, I don't think I'm getting that wrong. How about this?



#include <OpenGL/gl.h>
#include <OpenGL/glext.h>
#include <GLUT/glut.h>
//-framework GLUT

#include <cstdio>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("textureunit test");

    GLint numoftextureunitsint;
    glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &numoftextureunitsint);

    printf("Vendor: - %s, Renderer - %s, Version - %s, reckons there are %d texture units", glGetString(GL_VENDOR), glGetString(GL_RENDERER), glGetString(GL_VERSION), numoftextureunitsint);

    for (unsigned int i = 0; i < (unsigned int)numoftextureunitsint; i++)
    {
        // Select each unit in turn, then read the selector straight back.
        glActiveTexture(GL_TEXTURE0 + i);
        GLint realactivetexture;
        glGetIntegerv(GL_ACTIVE_TEXTURE, &realactivetexture);
        if (realactivetexture - GL_TEXTURE0 != (int)i)
        {
            printf(", but I think it is lying, as I've found there to be %d, and trying to set it higher gives GL error %d.", i, glGetError());
            break;
        }
        GLenum err = glGetError();
        if (err)
        {
            printf("!!! SURPRISE GL ERROR!!!!! %d !!!!!", err);
        }
    }
    return 0;
}


gives

Vendor: - Apple Computer, Inc., Renderer - Apple Software Renderer, Version - 2.0 APPLE, reckons there are 16 texture units, but I think it is lying, as I've found there to be 8, and trying to set it higher gives GL error 1280.

for me..

arekkusu
09-21-2016, 06:06 PM
Yeah, that looks broken.

Update to whichever newest OSX version your Hackintosh can run...?

tommyg
09-23-2016, 06:55 AM
Cheers! I'm happy knowing that the bug isn't in my code, and that there is a way around it for any implementation with the same issue.
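
In case it helps anyone else, the workaround amounts to something like this - trust GL_ACTIVE_TEXTURE rather than the advertised maximum:

GLint advertised, usable = 0;
glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &advertised);

for (GLint i = 0; i < advertised; i++)
{
    glActiveTexture(GL_TEXTURE0 + i);

    GLint active;
    glGetIntegerv(GL_ACTIVE_TEXTURE, &active);
    if (active - GL_TEXTURE0 != i || glGetError() != GL_NO_ERROR)
        break;                      // the selector didn't actually move

    usable = i + 1;
}
glActiveTexture(GL_TEXTURE0);       // leave the selector somewhere sensible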

I'm still looking into the other issue, using glTexImage2D and glGetTexImage - I'm getting different data back from what I put in. Will post again when I've had more of a look.
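
In the meantime, here's a minimal round-trip sketch; one common cause of mismatches is the default pixel-store row alignment of 4, which bites whenever a row of GL_RGB/GL_UNSIGNED_BYTE data isn't a multiple of 4 bytes:

GLubyte in[3 * 3 * 3];                      // 3x3 RGB image: 9-byte rows
for (int i = 0; i < 27; i++) in[i] = (GLubyte)i;

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);      // both default to 4
glPixelStorei(GL_PACK_ALIGNMENT, 1);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 3, 3, 0, GL_RGB, GL_UNSIGNED_BYTE, in);

GLubyte out[27];
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, out);

for (int i = 0; i < 27; i++)
    if (in[i] != out[i])
        printf("mismatch at %d: put %d, got %d\n", i, in[i], out[i]);

With GL_RGB8 and matching format/type the bytes should come back exactly; if they still don't, that points at the renderer again.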