Problem with stencil buffer in Win32 Console

Hi everyone,

I am a new member of the forum seeking help from the OpenGL seniors out there :smiley:

Currently, I am doing an assignment on shadow drawing using the stencil buffer. The algorithm is fine, but when I wrote it in a Win32 Console project, the stencil buffer seemed to be disabled somehow, even though I called glEnable(GL_STENCIL_TEST), so no shadow was rendered. When I use the framework from the popular NeHe tutorials - he uses the Win32 API to create a real windowed application - there is a function with an argument that controls the stencil buffer (0 = disable, 1 = enable), and then my code runs and the shadow is displayed.

I want to ask what the real cause of this issue is, and whether I can enable the stencil buffer manually in a console environment.

Sorry for my bad English, and thank you all.

You must activate the stencil capabilities of your window.

On Win32 you must set the pixel format to support it, with something like this:


PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR),  // size of the format descriptor
    1,                              // version
    PFD_SUPPORT_OPENGL |
    PFD_DRAW_TO_WINDOW |
    PFD_DOUBLEBUFFER,               // properties
    PFD_TYPE_RGBA,                  // color mode
    16,                             // color bits
    0, 0, 0, 0, 0, 0,               // color channel parameters
    0, 0,                           // alpha parameters
    0, 0, 0, 0, 0,                  // accumulation buffer parameters
    24,                             // depth bits       <=== Depth Buffer should be 24
    8,                              // stencil bits     <=== STENCIL SHOULD BE 8
    0,                              // number of auxiliary buffers
    0,                              // ignored (obsolete)
    0,                              // reserved
    0,                              // ignored (obsolete)
    0,                              // transparency color
    0                               // ignored (obsolete)
};
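To actually apply it, the usual sequence is ChoosePixelFormat followed by SetPixelFormat on your window's DC, before the context is created. A rough sketch (hdc is assumed to be your window's device context):

int pf = ChoosePixelFormat(hdc, &pfd);   // find the closest matching format
if (pf == 0 || !SetPixelFormat(hdc, pf, &pfd))
    return 0;                            // handle the error however you like
HGLRC hrc = wglCreateContext(hdc);       // context now includes the stencil buffer
wglMakeCurrent(hdc, hrc);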

Hope that helps.

Make sure you pick a pixel format with a stencil buffer.
See what glGetIntegerv(GL_STENCIL_BITS, &value); returns.
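For example, after making the context current (a minimal check; the printf is just for illustration):

GLint stencilBits = 0;
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
printf("stencil bits: %d\n", stencilBits);   // 0 means the context has no stencil buffer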

Thanks for the help, guys, I really appreciate it.
@_arts: Your code sets up a Windows application, which is pretty similar to NeHe's I think. Do you know how to achieve the same effect in a console environment, where you do not use such Win32 API functions?

@kyle_: When I use your function to print out the value, I see a bunch of zeros. My initial call is glClearStencil(1.0f). Is that normal?

If glGetIntegerv with GL_STENCIL_BITS returns zero, it means you are on a pixel format without a stencil buffer.

Also, how do you even create your context if you don't use the WinAPI???

For that you need a handle to a window (well, a DC) on which SetPixelFormat was called with the id of a pixel format that does have stencil bits - which is what _arts said.

Whether you use a Win32 console app or a Win32 app with a WinMain function, you must specify the pixel format, and it is done the same way in both cases.
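In other words, a console-subsystem program can still create a (possibly hidden) window with the Win32 API purely to host the GL context; the pixel format is then set on that window's DC exactly as above. A rough sketch, assuming you link against opengl32.lib, gdi32.lib and user32.lib (the class name and window size are arbitrary):

#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

int main(void)
{
    // Register a trivial window class; the window exists only to host the context.
    WNDCLASSA wc = {0};
    wc.lpfnWndProc   = DefWindowProcA;
    wc.hInstance     = GetModuleHandleA(NULL);
    wc.lpszClassName = "GLStencilWindow";
    RegisterClassA(&wc);

    HWND hwnd = CreateWindowA("GLStencilWindow", "Stencil test",
                              WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                              0, 0, 640, 480, NULL, NULL, wc.hInstance, NULL);
    HDC hdc = GetDC(hwnd);

    // Same idea as the descriptor above: 24-bit depth, 8-bit stencil.
    PIXELFORMATDESCRIPTOR pfd = {
        sizeof(PIXELFORMATDESCRIPTOR), 1,
        PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        32,                            // color bits
        0, 0, 0, 0, 0, 0, 0, 0,        // color/alpha channel layout (unused)
        0, 0, 0, 0, 0,                 // accumulation buffer (unused)
        24,                            // depth bits
        8,                             // stencil bits
        0, PFD_MAIN_PLANE, 0, 0, 0, 0
    };
    int pf = ChoosePixelFormat(hdc, &pfd);
    SetPixelFormat(hdc, pf, &pfd);

    HGLRC hrc = wglCreateContext(hdc);
    wglMakeCurrent(hdc, hrc);

    GLint stencilBits = 0;
    glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
    printf("stencil bits: %d\n", stencilBits);   // should now print 8

    // ... your shadow rendering / message loop goes here ...
    return 0;
}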

It sounds to me as though you’re using a framework like GLUT or SDL to create your OpenGL context. Can you confirm if this is so before we continue?
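If it does turn out to be GLUT, the usual fix is to request a stencil buffer in the display mode before creating the window; a minimal sketch, assuming classic GLUT/freeglut:

#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    // GLUT_STENCIL asks GLUT to pick a pixel format with stencil bits.
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_STENCIL);
    glutCreateWindow("stencil test");
    // ... register callbacks and enter glutMainLoop() as usual ...
    return 0;
}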