OpenGL invalid enum

So I’m asking people over at java gaming to try out my game and one of them has encountered a problem I can’t solve.

Here are his specs:
OPEN_GL
—FB stencil Bits: 8
—FB depth Bits: 0
—FB Red Bits: 0
—FB Green Bits: 0
—FB Blue Bits: 0
—FB Alpha Bits: 0
—Version: 3.3.14008 Core Profile/Debug Context 21.19.137.1
—SL Version: 4.50
—glRenderer: ATI Technologies Inc., Radeon ™ RX 480 Graphics

At this code snippet he gets an OpenGL error:

		 System.out.println("---FB stencil Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE));
		 System.out.println("---FB depth Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_STENCIL, GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE));
		 System.out.println("---FB Red Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE));
		 System.out.println("---FB Green Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE));
		 System.out.println("---FB Blue Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE));
		 System.out.println("---FB Alpha Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT, GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE));
		 System.out.println("---Version: " + glGetString(GL_VERSION));
		 System.out.println("---SL Version: " + glGetString(GL_SHADING_LANGUAGE_VERSION));
		 System.out.println("---glRenderer: " + glGetString(GL_VENDOR) + ", " + glGetString(GL_RENDERER)); 
		 System.out.println();
		 if (glGetInteger(GL_MAJOR_VERSION) < 3 || (glGetInteger(GL_MAJOR_VERSION) == 3 && glGetInteger(GL_MINOR_VERSION) < 3))
			 throw new IllegalStateException("No opengl 3.3 support!");
		 checkErrors();

And at checkErrors():

	static void checkErrors(){
		switch(glGetError()){
		case GL_NO_ERROR: break;
		case GL_INVALID_ENUM: throw new RuntimeException("GLerr: invalid enum");
		case GL_INVALID_VALUE: throw new RuntimeException("GLerr: invalid value");
		case GL_INVALID_OPERATION: throw new RuntimeException("GLerr: invalid operation");
		case GL_STACK_OVERFLOW: throw new RuntimeException("GLerr: stack overflow");
		case GL_STACK_UNDERFLOW: throw new RuntimeException("GLerr: stack underflow");
		case GL_OUT_OF_MEMORY: throw new RuntimeException("GLerr: out of memory");
		case GL_INVALID_FRAMEBUFFER_OPERATION: throw new RuntimeException("GLerr: invalid FB operation");
		}
	}
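One thing worth noting about checkErrors(): glGetError() returns and clears only a single error flag per call, and drivers can queue several, so a single switch like the one above can miss errors. A minimal sketch of a drain-style checker, with the GL error codes written out as plain constants and glGetError passed in as an IntSupplier so the loop itself can be exercised without a GL context (in real LWJGL code you would pass GL11::glGetError):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntSupplier;

class GlErrors {
    // Error codes as defined by the OpenGL specification.
    static final int GL_NO_ERROR          = 0;
    static final int GL_INVALID_ENUM      = 0x0500;
    static final int GL_INVALID_VALUE     = 0x0501;
    static final int GL_INVALID_OPERATION = 0x0502;

    // Calls glGetError until the queue is empty and returns every code seen.
    static List<Integer> drainErrors(IntSupplier glGetError) {
        List<Integer> errors = new ArrayList<>();
        int err;
        while ((err = glGetError.getAsInt()) != GL_NO_ERROR) {
            errors.add(err);
        }
        return errors;
    }
}
```

With LWJGL you would call drainErrors(GL11::glGetError) after a suspect block and throw if the returned list is non-empty.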

He gets the GL_INVALID_ENUM.

Now, my question has two parts. First, why am I not getting the same error with my specs:

—FB stencil Bits: 8
—FB depth Bits: 24
—FB Red Bits: 8
—FB Green Bits: 8
—FB Blue Bits: 8
—FB Alpha Bits: 8
—Version: 4.5.0 NVIDIA 369.09
—SL Version: 4.50 NVIDIA
—glRenderer: NVIDIA Corporation, GeForce GTX 660M/PCIe/SSE2

And second, why is he getting the error?

I have a hunch that something in the code is deprecated, but I’ve googled everything and can’t find it… And if so, is there anything I can specify so that my computer also flags deprecated usage, as of core 3.3? (Perhaps there is a lot more of this that I don’t know about in other parts of the source code.)

Knowing that a specific error was generated is meaningless unless you know which command generated it.

If you call glGetError() twice, and it returns GL_NO_ERROR for the first call and some other value for the second call, you know that some command between the two calls generated the error. Calling glGetError() at intermediate points will allow you to narrow down the source.

Alternatively, if you are using OpenGL 4.3 or later or have the KHR_debug extension, you can use the glDebugMessage* functions to obtain errors without needing to call glGetError().

[QUOTE=GClements;1284371]Knowing that a specific error was generated is meaningless unless you know which command generated it.

If you call glGetError() twice, and it returns GL_NO_ERROR for the first call and some other value for the second call, you know that some command between the two calls generated the error. Calling glGetError() at intermediate points will allow you to narrow down the source.

Alternatively, if you are using OpenGL 4.3 or later or have the KHR_debug extension, you can use the glDebugMessage* functions to obtain errors without needing to call glGetError().[/QUOTE]

Good point. A checkErrors() is called before the code snippet (I should have mentioned that), so the error occurs within the snippet. But since the error doesn’t occur on my PC, I can’t perform an error check and single out the command. I’m thinking it’s glGetInteger(GL_MAJOR_VERSION/GL_MINOR_VERSION), since the other queries print the expected results. I’m really looking for confirmation of this. My googling has led me to other people having problems with this function as well, but nothing resembling my particular case. And as far as I know, it’s in the core profile.
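If glGetInteger(GL_MAJOR_VERSION) really is the culprit on some drivers, one defensive fallback is to parse the leading major.minor out of the GL_VERSION string instead, which works on every GL version. A rough sketch (parseGlVersion and atLeast are my own helper names, not LWJGL functions):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class GlVersion {
    // Extracts the leading "major.minor" from a GL_VERSION string such as
    // "3.3.14008 Core Profile/Debug Context 21.19.137.1" or "4.5.0 NVIDIA 369.09".
    static int[] parseGlVersion(String version) {
        Matcher m = Pattern.compile("^(\\d+)\\.(\\d+)").matcher(version.trim());
        if (!m.find())
            throw new IllegalStateException("Unparsable GL_VERSION: " + version);
        return new int[] { Integer.parseInt(m.group(1)), Integer.parseInt(m.group(2)) };
    }

    // True if the reported version is >= major.minor.
    static boolean atLeast(String version, int major, int minor) {
        int[] v = parseGlVersion(version);
        return v[0] > major || (v[0] == major && v[1] >= minor);
    }
}
```

In the snippet above, the version check would then become something like: if (!GlVersion.atLeast(glGetString(GL_VERSION), 3, 3)) throw new IllegalStateException("No opengl 3.3 support!");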

For the depth-bits query, the attachment parameter should be GL_DEPTH, not GL_STENCIL.

For the colour-bits queries, the attachment parameter should probably be GL_FRONT_LEFT; GL_FRONT isn’t valid here. According to the glGetFramebufferAttachmentParameter reference page:

If the default framebuffer is bound to target, attachment must be one of GL_FRONT_LEFT, GL_FRONT_RIGHT, GL_BACK_LEFT, GL_BACK_RIGHT, GL_DEPTH or GL_STENCIL, identifying the corresponding buffer.
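If that’s the cause, the queries would become something like this (untested sketch, assuming the default framebuffer is bound; it needs an active GL context, so it’s shown here only for illustration):

```java
// Depth size must be queried via the GL_DEPTH attachment, and colour sizes via
// GL_FRONT_LEFT: GL_FRONT is not a valid attachment for the default framebuffer.
System.out.println("---FB stencil Bits: " + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_STENCIL,    GL_FRAMEBUFFER_ATTACHMENT_STENCIL_SIZE));
System.out.println("---FB depth Bits: "   + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_DEPTH,      GL_FRAMEBUFFER_ATTACHMENT_DEPTH_SIZE));
System.out.println("---FB Red Bits: "     + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT_LEFT, GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE));
System.out.println("---FB Green Bits: "   + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT_LEFT, GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE));
System.out.println("---FB Blue Bits: "    + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT_LEFT, GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE));
System.out.println("---FB Alpha Bits: "   + glGetFramebufferAttachmentParameteri(GL_FRAMEBUFFER, GL_FRONT_LEFT, GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE));
```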

[QUOTE=GClements;1284376]Here, the attachment parameter should be GL_DEPTH.

For these queries, the attachment parameter should probably be GL_FRONT_LEFT. GL_FRONT isn’t valid here. According to the glGetFramebufferAttachmentParameter reference page:[/QUOTE]

Thank you for that nice piece of the puzzle. It’s probably correct, though I can’t try it out. After some googling I found that different GPUs may or may not accept code like the above; Nvidia is known to be very lenient. And there is no “strict” mode to enable, unfortunately. All you can do is write correct code from the beginning or test it on different hardware. :frowning:
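For what it’s worth, the closest thing to a “strict” mode is to request a debug context, which asks the driver to validate more aggressively and report problems through the debug-message callback (your tester’s AMD version string even shows “Debug Context”). With LWJGL 3 and GLFW it looks roughly like this; a sketch only, since it needs the usual LWJGL/GLFW setup around it:

```java
// Before creating the window: ask for a 3.3 core debug context.
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE);

// After GL.createCapabilities(): route driver messages to stderr.
// GLUtil.setupDebugMessageCallback is an LWJGL helper that uses GL 4.3 or the
// KHR_debug/ARB_debug_output extensions, whichever is available.
Callback debugCallback = GLUtil.setupDebugMessageCallback();
```

This won’t make a lenient driver reject the same enums a strict one does, but it does surface errors and warnings at the call site instead of waiting for the next glGetError.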