SetPixelFormat always returns false?

Hello everyone! I am teaching myself OpenGL and trying to get a very basic, beginner OpenGL application up and running. However, I am running into a problem: the function SetPixelFormat always returns FALSE. Here is my code (the specific code in question is in the function GameInitialize):



//-----------------------------------------------------------------
// Include Files
//-----------------------------------------------------------------
#include "GameClient.h"

//-----------------------------------------------------------------
// Game Engine Functions
//-----------------------------------------------------------------
BOOL GameInitialize(HINSTANCE hInstance)
{
  // Create the game engine
  // NOTE: _pGame is a global variable from GameClient.h
  _pGame = new GameEngine(hInstance, TEXT("Game Skeleton"),
                          TEXT("Game Skeleton"));
  if (_pGame == NULL)
  {
	return FALSE;
  }
  
  // Set the frame rate
  _pGame->SetFrameRate(15);

  // Perform OpenGL initialization
  HDC   hDC;
  HWND  hWindow = _pGame->GetWindow();
  hDC = GetDC(hWindow);

  // Set up the Pixel Format
  PIXELFORMATDESCRIPTOR pfd;
  memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
  pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
  pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
  pfd.nVersion = 1;                 // Version
  pfd.iPixelType = PFD_TYPE_RGBA;   // Color type
  pfd.cColorBits = 32;              // Desired color depth
  pfd.cDepthBits = 24;              // Depth buffer
  pfd.iLayerType = PFD_MAIN_PLANE;  // Main layer

  // Choose the best matching pixel format; returns an index
  int pixelFormat = -1234;
  pixelFormat = ChoosePixelFormat(hDC, &pfd);

  // Test the pixel format
  if (pixelFormat == -1234)
  {
    MessageBox(NULL, TEXT("OpenGL could not select a pixel format."),
               TEXT("An error occurred"), MB_ICONERROR | MB_OK);
  }

  // Set the pixel format on the device context and test
  if (!SetPixelFormat(hDC, pixelFormat, &pfd))
  {
    MessageBox(NULL, TEXT("A pixel format could not be set."),
               TEXT("An error occurred"), MB_ICONERROR | MB_OK);
  }

  // Set up a Rendering Context.
  // Set the version we want; in this case, version 3.0.
  int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 0,
    0 }; // Zero signifies the end of the array

  // Create a temporary context so that we can get a pointer to the function we want
  HGLRC tempContext = wglCreateContext(hDC);

  // Make the temporary context current
  wglMakeCurrent(hDC, tempContext);

  // Grab the function pointer we want
  PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB;
  wglCreateContextAttribsARB =
    (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

  // Test versions. If this is null, then 3.0 is not supported.
  if (!wglCreateContextAttribsARB)
  {
    MessageBox(NULL, TEXT("OpenGL 3.0 is not supported"),
               TEXT("An error occurred"), MB_ICONERROR | MB_OK);

    // Because OpenGL 3.0 is not supported, wglCreateContextAttribsARB cannot be used
    // to create our context. We instead will have to use wglCreateContext. This means
    // that our temporary context is now our main context.

    // Deselect the temporary context
    wglMakeCurrent(hDC, NULL);

    // Create the new context
    // NOTE: renderContext is a global HGLRC (presumably declared in GameClient.h)
    renderContext = wglCreateContext(hDC);
  }
  else
  {
    // Create a new OpenGL 3.0 context
    renderContext = wglCreateContextAttribsARB(hDC, 0, attribs);

    // Deselect the temporary context
    wglMakeCurrent(hDC, NULL);

    // Delete the temporary context
    wglDeleteContext(tempContext);
  }

  // Check the Rendering Context
  if (!renderContext)
  {
    MessageBox(NULL, TEXT("Failed to create a Rendering Context"),
               TEXT("An error occurred"), MB_ICONERROR | MB_OK);
  }

  // Make the new context active and test
  if (!wglMakeCurrent(hDC, renderContext))
  {
    MessageBox(NULL, TEXT("Rendering Context could not be made active."),
               TEXT("An error occurred"), MB_ICONERROR | MB_OK);
  }

  // OpenGL drawing initialization
  // Enable depth testing
  glEnable(GL_DEPTH_TEST);

  // Set the background color
  glClearColor(0.75f, 0.5f, 0.5f, 0.5f);

  glMatrixMode(GL_PROJECTION);  // Select the projection matrix
  glLoadIdentity();

  glMatrixMode(GL_MODELVIEW);   // Select the modelview matrix
  glLoadIdentity();

  ReleaseDC(hWindow, hDC);

  return TRUE;
}

void GameStart(HWND hWindow)
{
  // Seed the random number generator
  srand(GetTickCount());

}

void GameDeactivate(HWND hWindow)
{
  HDC   hDC;
  RECT  rect;

  // Draw deactivation text on the game screen
  GetClientRect(hWindow, &rect);
  hDC = GetDC(hWindow);
  DrawText(hDC, TEXT("Deactivated!"), -1, &rect,
    DT_SINGLELINE | DT_CENTER | DT_VCENTER);
  ReleaseDC(hWindow, hDC);
}

void GamePaint(HDC hDC)
{

}

void GameCycle()
{
  HDC   hDC;
  HWND  hWindow = _pGame->GetWindow();

  hDC = GetDC(hWindow);

  // OpenGL drawing
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glLoadIdentity();

  glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex3f(-0.5f, -0.5f, -2.0f);
    glColor3f(1.0f, 1.0f, 0.0f);
    glVertex3f(0.5f, -0.5f, -2.0f);
    glColor3f(0.0f, 0.0f, 1.0f);
    glVertex3f(0.0f, 0.5f, -2.0f);
  glEnd();

  //SwapBuffers(hDC);
  ReleaseDC(hWindow, hDC);

}

void GameEnd()
{
  // Cleanup the game engine
  delete _pGame;
}

void GameActivate(HWND hWindow)
{
  HDC   hDC;
  RECT  rect;

  // Draw activation text on the game screen
  GetClientRect(hWindow, &rect);
  hDC = GetDC(hWindow);
  DrawText(hDC, TEXT("Activated!"), -1, &rect,
    DT_SINGLELINE | DT_CENTER | DT_VCENTER);


  ReleaseDC(hWindow, hDC);
}


I know that this is the problem because I am getting message boxes popping up stating that the pixel format could not be set, then that OpenGL 3.0 is not supported, then that a rendering context could not be created, and finally that the rendering context could not be made active. I feel like these all stem from the fact that I can’t set a pixel format.

I know that the right way to do things is to adjust the values in the Pixel Format Descriptor, since each card supports different things. However, this code is failing on a machine with an Nvidia GTX 470, while I get no error messages when I run it on a laptop with integrated graphics. That leads me to believe that my problem doesn’t stem from the settings I have selected in my PFD - and besides, I thought that ChoosePixelFormat was supposed to do a little bit of the leg work and select an appropriate format anyway.

(By the way, although this code triggers none of my error message boxes when run on the laptop, the OpenGL part of the project does not work on either machine. I simply get a blank white screen.)

Does anyone have any suggestions about what could be going wrong? I’m trying to learn to do this the right way, so absolutely any information would be appreciated. I’m following along with the book “Beginning OpenGL Game Programming, Second Edition” by Luke Benstead, and this is pretty much exactly how he shows to set up OpenGL.

if (pixelFormat == -1234)

What is this and where does it come from? If ChoosePixelFormat fails to find one, it returns zero, not -1234.
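
A more robust check tests for zero and pulls the actual error code with GetLastError. A minimal sketch, reusing the hDC and pfd from your code (the message formatting here is just one way to do it):

pixelFormat = ChoosePixelFormat(hDC, &pfd);
if (pixelFormat == 0) // zero means no suitable format was found
{
    // GetLastError gives the Win32 error code behind the failure
    DWORD err = GetLastError();
    TCHAR msg[64];
    wsprintf(msg, TEXT("ChoosePixelFormat failed, error %lu"), err);
    MessageBox(NULL, msg, TEXT("An error occurred"), MB_ICONERROR | MB_OK);
    return FALSE;
}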

Ok. I didn’t know whether the function would return 0 if it couldn’t choose a good format, so I initially set pixelFormat to a number I knew it wouldn’t use. Then, after calling ChoosePixelFormat, if the variable was still that number, I knew the function hadn’t changed it at all. Now that I know it returns zero, I can change the check and make this much better. Thanks!

EDIT: I made the changes and now test pixelFormat to see if it is zero. The program’s behavior hasn’t changed, though. My message boxes are popping up stating that the pixel format could not be set, that OpenGL 3.0 is not supported, and that a rendering context could neither be created nor made active.

Update: I’ve tried to play with the settings in my Pixel Format Descriptor, but I still only get false back from SetPixelFormat. Here is the new pfd I’m using:


// Set up the Pixel Format
PIXELFORMATDESCRIPTOR pfd;
//memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
ZeroMemory(&pfd, sizeof(pfd));
pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.nVersion = 1;                 // Version
pfd.iPixelType = PFD_TYPE_RGBA;   // Color type
pfd.cColorBits = 24;              // Desired color depth
pfd.cDepthBits = 16;              // Depth buffer
pfd.iLayerType = PFD_MAIN_PLANE;  // Main layer

Still no dice. The program is able to choose a format, but it cannot actually set it. What could be causing this?

Since your iPixelType is RGBA, shouldn’t cColorBits be 32 and cDepthBits 24?


pfd.cColorBits = 32; //Desired Color Depth
pfd.cDepthBits = 24; //Depth Buffer

That’s what I originally had it listed as, but it still cannot set the pixel format even with those numbers.

What you specify in your PFD shouldn’t matter (so far as this problem is concerned) as ChoosePixelFormat should always give a best match rather than needing to be absolutely constrained to an exact match.

What are you getting back for the value of pixelFormat from ChoosePixelFormat? And have you tried putting it into DescribePixelFormat to see what values and flags it has? This would be a useful exercise as it would give you info on the format that SetPixelFormat is failing on.
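
For the single format you got back, that check might look something like this (hDC and pixelFormat as in your code; a fresh descriptor so your requested pfd isn’t overwritten):

PIXELFORMATDESCRIPTOR actual;
ZeroMemory(&actual, sizeof(actual));

// Ask the driver what it really offers at this index;
// DescribePixelFormat returns 0 on failure
if (DescribePixelFormat(hDC, pixelFormat, sizeof(actual), &actual))
{
    // Compare these against what you requested:
    // actual.dwFlags, actual.iPixelType,
    // actual.cColorBits, actual.cDepthBits, ...
}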

It’s also possible to loop through the pixelformats available on your machine and confirm that your driver actually does expose valid formats. Note here that pixel format numbers are 1-based, not 0-based, so code might look something like this:

PIXELFORMATDESCRIPTOR pfd;

for (int pfnum = 1; ; pfnum++)
{
   // DescribePixelFormat returns 0 when the format number is past
   // the last one the driver exposes, i.e. when it fails
   if (!DescribePixelFormat (hdc, pfnum, sizeof (PIXELFORMATDESCRIPTOR), &pfd)) break;

   // store or output info relating to this pixelformat
}

Thank you for your reply! I didn’t know about DescribePixelFormat, but I’ll use it to get more information on my problem.

I’m interested in seeing what the results are because, if my understanding is correct, ChoosePixelFormat does the leg work of finding a pixel format that will be good on my machine. It would seem, then, that if it is successful, feeding that result into SetPixelFormat would be sure to work. I guess what I’m saying is that it surprises me that the failure is happening in SetPixelFormat and not ChoosePixelFormat. But that could be because I don’t completely know what I’m doing yet.

Ok. I found out that the index of the pixel format that ChoosePixelFormat returns is 7. I also did a little bit of digging, and found out that the cColorBits of pixelFormat 7 is 32, and the cDepthBits is 24, exactly as I specified in my code.

Doesn’t this mean that my drivers are, indeed, exposing formats to be used?

Check the flags; these will be a bitwise mask of the values given here: PIXELFORMATDESCRIPTOR (wingdi.h) - Win32 apps | Microsoft Learn

So you need code like:

if (pfd.dwFlags & PFD_FLAG_NAME_BLAH) // do something

Check 'em all and let us know what you get for 7.
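
If you want to dump them all in one go, a table-driven check is an easy way. Something along these lines, dropped in after DescribePixelFormat has filled pfd for format 7 (the PFD_* names are the standard ones from wingdi.h; printf needs <stdio.h>):

static const struct { DWORD flag; const char *name; } pfdFlags[] = {
    { PFD_DRAW_TO_WINDOW,      "PFD_DRAW_TO_WINDOW" },
    { PFD_DRAW_TO_BITMAP,      "PFD_DRAW_TO_BITMAP" },
    { PFD_SUPPORT_GDI,         "PFD_SUPPORT_GDI" },
    { PFD_SUPPORT_OPENGL,      "PFD_SUPPORT_OPENGL" },
    { PFD_DOUBLEBUFFER,        "PFD_DOUBLEBUFFER" },
    { PFD_STEREO,              "PFD_STEREO" },
    { PFD_GENERIC_FORMAT,      "PFD_GENERIC_FORMAT" },
    { PFD_GENERIC_ACCELERATED, "PFD_GENERIC_ACCELERATED" },
};

// Print the name of every flag this format carries
for (int i = 0; i < (int)(sizeof(pfdFlags) / sizeof(pfdFlags[0])); i++)
{
    if (pfd.dwFlags & pfdFlags[i].flag)
        printf("%s\n", pfdFlags[i].name);
}

In particular, PFD_GENERIC_FORMAT set without PFD_GENERIC_ACCELERATED would mean you’re being handed Microsoft’s software implementation rather than a format from the Nvidia driver.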

Also remember that you can only set the pixel format once for a window, so if you are calling SetPixelFormat multiple times on the same window, it will fail (see the MSDN SetPixelFormat page).
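
You can detect that case up front with GetPixelFormat, for example:

// GetPixelFormat returns the index of the format currently selected
// into the DC, or zero if none has been set yet
if (GetPixelFormat(hDC) != 0)
{
    // A format has already been set on this window; SetPixelFormat
    // will fail, and the only way to change it is to recreate the window
}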

Hi everyone! Sorry for the delayed reply, but I was away on vacation.

Anyway, I found the solution to my problem when I got back home. I updated the drivers on my graphics card, and all of a sudden my pixel format problems vanished! So if anyone is running into this problem in the future, try a driver refresh / update.

Thanks for the help everyone!