Problem with display mode switching

Hi there.
I’ve got serious problems with my code that switches the screen resolution. The first time I start the program, nothing unusual happens. But every time I exit the second session, the program hangs at the ChangeDisplaySettings( NULL, 0 ) call.
I tried to set the resolution manually afterwards, and the Windows dialog hangs as well.
Does anybody have an idea?
I could post some code as well, if that would help.

(Sorry, I usually don’t like posting the same question on both boards, but I badly need help, since I would lose a whole weekend and school is keeping me busy.)

Well, the NeHe guy says something that seems strange to me.

Why should one have to switch back to the normal graphics mode before destroying the window?

In my app, I do it the other way round, and my mouse gets corrupted and I can’t get any other resolution until a restart. I’ve got the newest Detonator on a TNT.

Matt, have you got any idea why? I’ll fix it anyway now; it’s only a question.

I don’t have any idea. Remember, though, that changing color depths is generally not reliable, and that in general, to be totally safe, you should always tear down OpenGL completely before changing the resolution or color depth.

  • Matt
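In that spirit, a full teardown before the mode change would look roughly like the sketch below (placeholder globals, error checking omitted): context first, then window, then display mode.

/* Sketch: tear down OpenGL completely, then restore the desktop mode.
   hglrc, hdc, hwnd and hinst are assumed application globals. */
wglMakeCurrent( NULL, NULL );             // detach the GL context
wglDeleteContext( hglrc );                // destroy the GL context
ReleaseDC( hwnd, hdc );                   // give back the window DC
DestroyWindow( hwnd );                    // destroy the window
UnregisterClass( "my_class", hinst );     // drop the window class
ChangeDisplaySettings( NULL, 0 );         // restore the desktop mode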

Doh, that’s really crazy!!!
What I am doing is this:

On startup:
Build a list of supported graphics modes with EnumDisplaySettings. I save them to a list, and afterwards I only use the saved mode numbers to switch to them.

Detach OpenGL from the window
Destroy the window
Change the graphics mode
Create the window
Attach OpenGL

I’m doing that in any case, just to be sure it will work if at all possible.
Even after I cut-and-pasted the NeHe code, it hangs every time I run it.

Maybe I’m getting it all wrong with this Windows stuff. How do you destroy a window? I simply use DestroyWindow.

It’s really crap, since I can only try something once and then have to restart Windows. Oh man…

Here’s some code, maybe it can help:

/*
 * Builds a list of all valid display modes.
 */
void win_makeup_modes( void )
{
    display_mode_t *new_mode;
    DEVMODE test_mode;
    int mode_num = 0;

    con_printf( "\nfetching all valid display modes..." );

    while( EnumDisplaySettings( NULL, mode_num, &test_mode ) )
    {
        // keep only modes the driver claims it can actually switch to
        if( ChangeDisplaySettings( &test_mode, CDS_TEST ) == DISP_CHANGE_SUCCESSFUL ) {
            new_mode = (display_mode_t*) malloc( sizeof( display_mode_t ) );

            new_mode->height   = test_mode.dmPelsHeight;
            new_mode->width    = test_mode.dmPelsWidth;
            new_mode->bpp      = test_mode.dmBitsPerPel;
            new_mode->mode_num = mode_num;

            // push onto the front of the list
            new_mode->next = root_display;
            root_display   = new_mode;
        }

        mode_num++;
    }

    con_printf( " found %d modes", mode_num );
    return;
}

void win_create_glwindow( void )
{
    WNDCLASSEX window_class;
    int width, height, bpp, fullscreen;

    get_data_from_var( current_disp_settings, &width, &height, &bpp, &fullscreen );

    // register the window class
    app_instance = GetModuleHandle( NULL );
    memset( &window_class, 0, sizeof(WNDCLASSEX) );
    window_class.style         = CS_OWNDC | CS_HREDRAW | CS_VREDRAW;   // window styles
    window_class.lpfnWndProc   = win_winproc;                          // window procedure
    window_class.cbClsExtra    = 0;
    window_class.cbWndExtra    = 0;
    window_class.hInstance     = app_instance;
    window_class.hIcon         = LoadIcon( app_instance, NULL );       // load the icon from the resource file
    window_class.hCursor       = LoadCursor( NULL, IDC_ARROW );        // load the mouse cursor
    window_class.hbrBackground = NULL;                                 // background brush
    window_class.lpszClassName = "win_class";                          // class name
    window_class.cbSize        = sizeof( WNDCLASSEX );

    if( !RegisterClassEx( &window_class ) ) {   // register the main window class
        MessageBox( NULL, "Error: Unable to register window class", "Error", MB_OK );
        return;
    }

    if( fullscreen ) {
        window_handle = CreateWindow(
            "win_class",                // class name
            "ConsoleNT - fullscreen",   // title
            WS_POPUPWINDOW | WS_CLIPCHILDREN | WS_CLIPSIBLINGS,
            0, 0,                       // position of the upper-left corner
            width, height,              // width, height
            NULL,                       // no parent
            NULL,                       // no menu
            app_instance,               // instance
            NULL
        );
    } else {
        window_handle = CreateWindow(
            "win_class",                // class name
            "ConsoleNT - windowed",     // title
            WS_OVERLAPPEDWINDOW | WS_CLIPCHILDREN | WS_CLIPSIBLINGS,
            0, 0,                       // position of the upper-left corner
            width+8, height+26,         // width, height (plus borders)
            NULL,                       // no parent
            NULL,                       // no menu
            app_instance,               // instance
            NULL
        );
    }

    if( !window_handle ) MessageBox( NULL, "Error!", "Error", MB_OK );

    PIXELFORMATDESCRIPTOR desired_format = {
        sizeof(PIXELFORMATDESCRIPTOR),   // size of this pfd
        1,                               // version number
        PFD_DRAW_TO_WINDOW |
        PFD_SUPPORT_OPENGL |             // support OpenGL
        PFD_DOUBLEBUFFER,                // double buffered
        PFD_TYPE_RGBA,                   // RGBA type
        bpp,                             // color depth
        0, 0, 0, 0, 0, 0,                // color bits ignored
        0,                               // no alpha buffer
        0,                               // shift bit ignored
        0,                               // no accumulation buffer
        0, 0, 0, 0,                      // accum bits ignored
        16,                              // 16-bit z-buffer
        0,                               // no stencil buffer
        0,                               // no auxiliary buffer
        PFD_MAIN_PLANE,                  // main layer
        0, 0, 0                          // reserved, layer masks ignored
    };

    int pixel_format;
    window_context = GetDC( window_handle );

    pixel_format = ChoosePixelFormat( window_context, &desired_format );
    if( !pixel_format ) {
        con_printf( "\nUnable to find matching pixel format" );
        return;
    }

    SetPixelFormat( window_context, pixel_format, &desired_format );
    opengl_context = wglCreateContext( window_context );

    if( !opengl_context ) {
        con_printf( "\nwglCreateContext() failed" );
    }
    if( !wglMakeCurrent( window_context, opengl_context ) ) {
        con_printf( "\nwglMakeCurrent() failed" );
    }

    ShowWindow( window_handle, SW_SHOW );
    SetForegroundWindow( window_handle );
    SetFocus( window_handle );
}

void win_destroy_glwindow( void )
{
    DestroyWindow( window_handle );
    UnregisterClass( "win_class", app_instance );

    wglMakeCurrent( NULL, NULL );
    wglDeleteContext( opengl_context );
    ReleaseDC( window_handle, window_context );

    DestroyWindow( window_handle );
    UnregisterClass( "win_class", app_instance );
}

And this is what I do when changing the resolution:

  win_destroy_glwindow();
  
  EnumDisplaySettings( NULL, new_disp->mode_num, &dev_mode );
  dev_mode.dmFields = DM_BITSPERPEL | DM_PELSWIDTH | DM_PELSHEIGHT;

  result = ChangeDisplaySettings( &dev_mode, CDS_FULLSCREEN );
  if( result != DISP_CHANGE_SUCCESSFUL ) return 0;
  win_create_glwindow();

I appreciate any help!!!


void win_destroy_glwindow( void )
{
    DestroyWindow( window_handle );
    UnregisterClass( "win_class", app_instance );
    wglMakeCurrent( NULL, NULL );
    wglDeleteContext( opengl_context );
    ReleaseDC( window_handle, window_context );
    DestroyWindow( window_handle );
    UnregisterClass( "win_class", app_instance );
}

You are destroying the window and unregistering the window class twice in the same function. Also, when you use CS_OWNDC, I think UnregisterClass() automatically destroys the class DC for you, so you don’t have to call ReleaseDC().
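A cleaned-up version along those lines might look like this (a sketch reusing the same names; whether the ReleaseDC() call is needed at all with CS_OWNDC is the open question):

void win_destroy_glwindow( void )
{
    wglMakeCurrent( NULL, NULL );                  // detach the context first
    wglDeleteContext( opengl_context );
    ReleaseDC( window_handle, window_context );    // possibly redundant with CS_OWNDC
    DestroyWindow( window_handle );
    UnregisterClass( "win_class", app_instance );
}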

Thanks! I’m gonna try that out!

Though, when I do a ChangeDisplaySettings call that does nothing, i.e. the new mode is the same as the old one, it exits without an error. So I think it has something to do with how I change the graphics mode.

That little bug of destroying the window twice in win_destroy_glwindow is just a typo.

Try changing the CDS_FULLSCREEN flag to zero. Also, make sure that when you call DestroyWindow you DON’T process and dispatch any new messages to the window procedure, since your window handle will be dead at that point. Since I can’t see your main loop, I can’t be sure what you’re doing.

This is my main loop:

while(true)
{
    if(PeekMessage(&msg, hWnd, NULL, NULL, PM_NOREMOVE))
    {
        // Message available
        GetMessage(&msg, hWnd, 0, 0);

        if(msg.message == WM_QUIT)
        {
            m_Destroy();
            break;
        }
        else
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }
    else
    {
        // Draw scene
        if(!m_RenderScene())
        {
            ShowCursor(true);
            wul_ShowError("Rendering failed...", __FILE__, __LINE__);
            PostMessage(hWnd, WM_CLOSE, NULL, NULL);
        }
    }
}

In m_Destroy() I do this:

void COpenGL::m_Destroy()
{
    HGLRC hGlrc = wglGetCurrentContext();
    HWND hWnd = FindWindow(m_strWndClassname.c_str(), m_strWndname.c_str());

    if(NULL != hGlrc)
    {
        wglMakeCurrent(NULL, NULL);
        wglDeleteContext(hGlrc);
    }

    // Destroy the window and unregister the window class that this window belongs to
    DestroyWindow(hWnd);
    UnregisterClass(m_strWndClassname.c_str(), GetModuleHandle(NULL));
    ShowCursor(true);
}

These are the steps that happen, in order:

1) I respond to the WM_CLOSE message that the system sends when the window is closed by clicking on the X in the upper-right corner of the title bar.
2) I then call PostQuitMessage(0), which posts a WM_QUIT message to the message queue.
3) I retrieve the WM_QUIT message and call my m_Destroy function. Then I break out of the main loop and exit the app.
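
The window-procedure side of steps 1 and 2 would look roughly like this (just a sketch; WndProc stands in for whatever your registered window procedure is called):

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch(msg)
    {
    case WM_CLOSE:              // step 1: the window is being closed
        PostQuitMessage(0);     // step 2: queue WM_QUIT for the main loop
        return 0;
    default:
        return DefWindowProc(hWnd, msg, wParam, lParam);
    }
}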

Hope this helps. I agree completely that it’s sometimes too difficult to get things right, because there are too many different ways to do the same thing and no two people will have written the code the same way. That’s why I’m posting mine here for you to see.

I also remembered something else. Notice that in my PeekMessage(&msg, hWnd, NULL, NULL, PM_NOREMOVE) call I specify the hWnd explicitly. I could have passed NULL instead, in which case I would be retrieving messages for any window belonging to the calling thread. Instead I decided to listen only to the messages posted to my OpenGL window. I mention this because it might cause some problems too, in weird situations.


Hi,

I don’t know if this will help, but I noticed that, in the code you posted, you never do the following:

From MSDN:

“Before calling EnumDisplaySettings, set the dmSize member to sizeof(DEVMODE), and set the dmDriverExtra member to indicate the size, in bytes, of the additional space available to receive private driver data.”

In my experience, not doing such initializations leads to weird bugs…
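
For the enumeration loop posted above, that would mean something like this before the loop (a minimal sketch):

DEVMODE test_mode;

memset( &test_mode, 0, sizeof(DEVMODE) );
test_mode.dmSize        = sizeof(DEVMODE);    // required before EnumDisplaySettings
test_mode.dmDriverExtra = 0;                  // no private driver data expected

while( EnumDisplaySettings( NULL, mode_num, &test_mode ) )
{
    /* ... as before ... */
}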

Nicolas: Thanks!!!
That seems to be the thing, I think, because that is the only thing I can see that is actually buggy. But if so, why do I still get the correct screen settings? Well, Windows is a bit weird sometimes, so I’ll try it!
See ya!

Well, that was not the problem. There was a bug somewhere, but I didn’t find it.
So I did what everyone would do and simply recoded that stuff. It works really nicely now. Thanks for all of your support.

I’ve got another question now, sorry.
I’ve got some virtual “windows”, with position and size given.
So I do a:

glViewport( x,y, width, height );

after that:

glOrtho( 0, width, 0, height, -1, 1 );

Shouldn’t I get pixel-precise lines like this:

glTranslatef( 0.5, 0.5, 0 );

glBegin( GL_LINES );
glVertex2i( 0, width-1 );
glVertex2i( height-1, width-1 );
glEnd();

I end up not getting cool-looking shaded buttons, because the lines aren’t as long as they should be. What am I doing wrong?
Thanks!

Your lines are actually starting at the corners of pixels in that case. You’ll be at the whim of floating-point inaccuracies.

You should add 0.5 to the coordinates as appropriate.

To draw a 1-pixel point at the pixel (x,y), you draw a point at (x+0.5,y+0.5).

To draw a 1-pixel-wide line between (x1,y) and (x2,y) that is inclusive of both x1 and x2, you draw a line from (x1,y+0.5) to (x2+1,y+0.5).

To draw a rectangle from (x1,y1) to (x2,y2) that is inclusive of both corner points, you draw a rectangle from (x1,y1) to (x2+1,y2+1).

To start a bitmap or DrawPixels rectangle at the pixel (x,y), set your raster pos to (x+0.5,y+0.5).

Think about the way primitives get sampled (center of pixel rectangle) and this will all make sense.

This is why glVertex2i and glRasterPos2i are evil.
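
As a sketch of those rules under a glOrtho( 0, width, 0, height, -1, 1 ) projection (the helper names here are made up):

/* Draw a single pixel at (x,y): sample exactly at the pixel center. */
void draw_pixel( int x, int y )
{
    glBegin( GL_POINTS );
    glVertex2f( x + 0.5f, y + 0.5f );
    glEnd();
}

/* Draw a 1-pixel-wide horizontal line covering pixels x1..x2 inclusive. */
void draw_hline( int x1, int x2, int y )
{
    glBegin( GL_LINES );
    glVertex2f( (float)x1,        y + 0.5f );
    glVertex2f( (float)x2 + 1.0f, y + 0.5f );
    glEnd();
}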

  • Matt

Uhmm, that sounded hard…
Oh, I used that glTranslatef( 0.5, 0.5, 0 ), Matt!
Shouldn’t that do the trick of moving the vertices to the pixel centers?

Oh well, it wasn’t hard, sorry…
Uhmm, if I follow you correctly, I need to include the complete starting and ending pixels of the line in order to draw it correctly. I always thought that if a line starts in the center of a pixel (which is what I was actually doing), that pixel will also be drawn, and that the same applies to the ending. Why that difference between lines and points?

Pixel-exact line drawing (AA lines or non-AA lines, either way), by the way, is considered a rather high-end workstation feature. Be warned.

Imagine the line as a quad. You’re starting the quad at the center of the pixel. So the sample is on the edge of the primitive, and whether you’ll draw it or not will be essentially random based on precision issues.

This is not a perfect analogy because lines aren’t quads (though AA lines are very similar to quads), but it should work.

I strongly urge anyone trying to get pixel-exact lines across all HW (especially consumer HW) to actually use a quad in the app instead. And make sure the centers of the pixels you want fall entirely inside the quad, ideally by a half-pixel.
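
For example, a 1-pixel-high horizontal “line” drawn as a quad might look like this (a sketch, same ortho setup as above):

/* Cover pixels x1..x2 inclusive on row y with a quad; the pixel
   centers (x+0.5, y+0.5) then fall a half-pixel inside every edge. */
void draw_hline_as_quad( int x1, int x2, int y )
{
    glRectf( (float)x1, (float)y, (float)x2 + 1.0f, (float)y + 1.0f );
}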

  • Matt

Also, OpenGL-compliant line hardware can include the start or end, but not both. Consider a line strip. If it included both, you’d get a double hit. 1-pixel-wide lines are not allowed to have double hits.

For more information, you will have to read the OGL spec. I am not an expert on the aliased line rasterization rules by any means. I do know the rules for points and triangles and for AA lines, though.

  • Matt

Thanks Matt.
I think I’m gonna use textured quads, where the texels fall exactly onto the pixels, if you know what I mean. It’s also easier to change the look of my in-game windows…
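
Something like this is what I have in mind: a quad whose texels map 1:1 onto the pixels under the same glOrtho setup (a sketch; assumes a w×h texture with GL_NEAREST filtering):

/* Draw a w-by-h texture so each texel lands on exactly one pixel,
   with the lower-left texel at pixel (x,y). */
void draw_texture_quad( GLuint tex, int x, int y, int w, int h )
{
    glBindTexture( GL_TEXTURE_2D, tex );
    glBegin( GL_QUADS );
    glTexCoord2f( 0.0f, 0.0f ); glVertex2f( (float)x,       (float)y       );
    glTexCoord2f( 1.0f, 0.0f ); glVertex2f( (float)(x + w), (float)y       );
    glTexCoord2f( 1.0f, 1.0f ); glVertex2f( (float)(x + w), (float)(y + h) );
    glTexCoord2f( 0.0f, 1.0f ); glVertex2f( (float)x,       (float)(y + h) );
    glEnd();
}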