So I’m asking people over at Java-Gaming to try out my game, and one of them has run into a problem I can’t solve.
Here are his specs:
OPEN_GL
---FB stencil Bits: 8
---FB depth Bits: 0
---FB Red Bits: 0
---FB Green Bits: 0
---FB Blue Bits: 0
---FB Alpha Bits: 0
---Version: 3.3.14008 Core Profile/Debug Context 21.19.137.1
---SL Version: 4.50
---glRenderer: ATI Technologies Inc., Radeon (TM) RX 480 Graphics
And at this code snippet he gets an OpenGL error:
System.out.println("---FB stencil Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE));
System.out.println("---FB depth Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE));
System.out.println("---FB Red Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE));
System.out.println("---FB Green Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE));
System.out.println("---FB Blue Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE));
System.out.println("---FB Alpha Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE));
System.out.println("---Version: " + glGetString(GL_VERSION));
System.out.println("---SL Version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));
System.out.println("---glRenderer: " + glGetString(GL_VENDOR) + ", " + glGetString(GL_RENDERER));
System.out.println();
if (glGetInteger(GL_MAJOR_VERSION) < 3 || (glGetInteger(GL_MAJOR_VERSION) == 3 && glGetInteger(GL_MINOR_VERSION) < 3))
throw new IllegalStateException("No opengl 3.3 support!");
checkErrors();
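As an aside, the 3.3 check above can be factored into a plain helper that is testable without a GL context (the name atLeast is my own, not from LWJGL):

```java
public class GlVersionCheck {
    // True if version (major, minor) is at least (reqMajor, reqMinor).
    static boolean atLeast(int major, int minor, int reqMajor, int reqMinor) {
        return major > reqMajor || (major == reqMajor && minor >= reqMinor);
    }

    public static void main(String[] args) {
        // Equivalent to the inline check: throw below OpenGL 3.3.
        if (!atLeast(3, 3, 3, 3))
            throw new IllegalStateException("No OpenGL 3.3 support!");
        System.out.println(atLeast(4, 5, 3, 3)); // true  (my 4.5 context)
        System.out.println(atLeast(3, 1, 3, 3)); // false (a 3.1 context would fail)
    }
}
```

In the real code the first two arguments would come from glGetInteger(GL_MAJOR_VERSION) and glGetInteger(GL_MINOR_VERSION), queried once each instead of three times.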
And here is checkErrors():
static void checkErrors() {
    switch (glGetError()) {
        case GL_NO_ERROR: break;
        case GL_INVALID_ENUM: throw new RuntimeException("GLerr: invalid enum");
        case GL_INVALID_VALUE: throw new RuntimeException("GLerr: invalid value");
        case GL_INVALID_OPERATION: throw new RuntimeException("GLerr: invalid operation");
        case GL_STACK_OVERFLOW: throw new RuntimeException("GLerr: stack overflow");
        case GL_STACK_UNDERFLOW: throw new RuntimeException("GLerr: stack underflow");
        case GL_OUT_OF_MEMORY: throw new RuntimeException("GLerr: out of memory");
        case GL_INVALID_FRAMEBUFFER_OPERATION: throw new RuntimeException("GLerr: invalid FB operation");
        default: throw new RuntimeException("GLerr: unknown error code"); // don't swallow anything unexpected
    }
}
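One thing I noticed while writing this: glGetError pops only one error flag per call, and several flags can be set at once, so checkErrors() as written reports at most one pending error. A loop-based sketch of what I mean (the IntSupplier stands in for ::glGetError so it runs without a context; the names and message format are my own):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.IntSupplier;

public class GlErrorDrain {
    static final int GL_NO_ERROR = 0;
    static final int GL_INVALID_ENUM = 0x0500;
    static final int GL_INVALID_VALUE = 0x0501;

    // Drain every pending error flag, then throw once listing them all.
    static void checkErrors(IntSupplier getError) {
        StringBuilder sb = new StringBuilder();
        int err;
        while ((err = getError.getAsInt()) != GL_NO_ERROR) {
            if (sb.length() > 0) sb.append(", ");
            sb.append("0x").append(Integer.toHexString(err));
        }
        if (sb.length() > 0)
            throw new RuntimeException("GLerr: " + sb);
    }

    public static void main(String[] args) {
        // Simulate a context with two pending errors.
        Queue<Integer> pending = new ArrayDeque<>(
                java.util.List.of(GL_INVALID_ENUM, GL_INVALID_VALUE));
        IntSupplier fakeGetError = () -> pending.isEmpty() ? GL_NO_ERROR : pending.poll();
        try {
            checkErrors(fakeGetError);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // GLerr: 0x500, 0x501
        }
    }
}
```

In the real code you would pass GL11::glGetError as the supplier.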
He gets the GL_INVALID_ENUM.
Now, my question has two parts. First, why am I not getting the same error with my specs:
---FB stencil Bits: 8
---FB depth Bits: 24
---FB Red Bits: 8
---FB Green Bits: 8
---FB Blue Bits: 8
---FB Alpha Bits: 8
---Version: 4.5.0 NVIDIA 369.09
---SL Version: 4.50 NVIDIA
---glRenderer: NVIDIA Corporation, GeForce GTX 660M/PCIe/SSE2
And second, why is he getting the error?
I have a hunch that something in the code is deprecated, but I’ve googled everything and can’t find it… And, if so, is there anything I can specify so that my machine also flags deprecated usage in a 3.3 core profile? (There is probably more of this that I don’t know about in other parts of the source code.)