Can't get my "red screen" showing

My goal was to make a simple scene that displays nothing but a red background. Instead, for some reason, it’s showing me a black scene. What am I doing wrong here? I’ll post the entire code since I don’t know what sections to look for.

#define WIN32_LEAN_AND_MEAN
 
#include <windows.h>
#include <windowsx.h>
#include "winmain.h"
#include <gl/gl.h>
#include <gl/glu.h>
#include <gl/glaux.h>
 
// Macros
#define KEYDOWN(vk_code) ((GetAsyncKeyState(vk_code) & 0x8000) ? 1 : 0)
#define KEYUP(vk_code) ((GetAsyncKeyState(vk_code) & 0x8000) ? 0 : 1)
#define MsgError(msg) { MessageBox(NULL, msg, "Error", MB_OK | MB_ICONERROR); }
#define ALIGN32 __declspec(align(32))

// Globals
const char *g_szClassName = "WNDCLASS1";
const char *g_szWinTitle = "OpenGL Application";
const int WIDTH = 800;
const int HEIGHT = 600;
HWND g_hWnd = NULL;
HINSTANCE g_hInstance;
HGLRC hRC = NULL;
bool active = TRUE, fullscreen = TRUE;

// Prototypes 
void Game_Main();
void Game_Destroy();
void GetInput(); 

// Resize & Initialize the OpenGL Scene
GLvoid ReSizeGLScene(GLsizei w, GLsizei h) {
	if (h == 0) 
		h = 1;
	glViewport(0, 0, w, h);
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluPerspective(45.0f, (GLfloat)w/(GLfloat)h, 0.1f, 100.0f);
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
}

// Init OpenGL Scene Here
int InitGL(GLvoid) {
	glShadeModel(GL_SMOOTH);							// enable smooth shading
	glClearColor(1.0f, 0.0f, 0.0f, 0.5f);				// red background
	glClearDepth(1.0f);									// depth buffer setup
	glEnable(GL_DEPTH_TEST);							// enable depth testing
	glDepthFunc(GL_LEQUAL);								// what kind of testing to do
	glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);	// really nice perspective calculations
	return TRUE;
}

int DrawGLScene(GLvoid) {
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glLoadIdentity();
	return TRUE;
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int nCmdShow) {
	HWND hWnd = NULL;
	MSG msg;

	// Build the parent window class
	WNDCLASSEX wc; 
	wc.cbClsExtra = 0;
	wc.cbWndExtra = 0;
	wc.cbSize = sizeof(WNDCLASSEX);
	wc.hbrBackground = (HBRUSH) (GetStockObject(BLACK_BRUSH));
	wc.hCursor = LoadCursor(NULL, IDC_ARROW);
	wc.hIcon = LoadIcon(NULL, IDI_APPLICATION);
	wc.hIconSm = LoadIcon(NULL, IDI_APPLICATION);
	wc.hInstance = hInstance; 
	wc.lpfnWndProc = WndProc;
	wc.lpszClassName = g_szClassName;
	wc.lpszMenuName = NULL;
	wc.style = CS_VREDRAW | CS_HREDRAW | CS_DBLCLKS | CS_OWNDC;

	// Register the parent window class
	if (!RegisterClassEx(&wc)) {
		MsgError("Registering the parent window class failed!");
		return 0;
	}

	// Create the parent window 
	hWnd = CreateWindowEx(NULL, g_szClassName, g_szWinTitle, WS_OVERLAPPEDWINDOW, 
		CW_USEDEFAULT, CW_USEDEFAULT, WIDTH, HEIGHT, NULL, NULL, hInstance, NULL);

	if (hWnd == NULL) {
		MsgError("Creating the parent window failed.");
		return 0;
	}

	// Save globals
	g_hWnd = hWnd;
	g_hInstance = hInstance;

	// Center the parent window
	int xPos = (GetSystemMetrics(SM_CXSCREEN) - WIDTH) / 2;
	int yPos = (GetSystemMetrics(SM_CYSCREEN) - HEIGHT) / 2;
	SetWindowPos(g_hWnd, NULL, xPos, yPos, WIDTH, HEIGHT, SWP_NOZORDER | SWP_NOSIZE);

	// Show & Update Window
	ShowWindow(hWnd, nCmdShow);
	UpdateWindow(hWnd);

	InitGL();
	ReSizeGLScene(WIDTH, HEIGHT);
	
	// Message Loop
	while (TRUE) {
		if (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE)) {
			if (msg.message == WM_QUIT)
				break;

			TranslateMessage(&msg);
			DispatchMessage(&msg);
		}

		// Game logic
		Game_Main();
	} 

	Game_Destroy();

	return (int) msg.wParam;
}
  
// Main game loop
void Game_Main() {
	GetInput();
	DrawGLScene();
} 

// Release objects and resources here
void Game_Destroy() {
}

// Get user input
void GetInput() {
	if (KEYDOWN(VK_ESCAPE))
		SendMessage(g_hWnd, WM_CLOSE, 0, 0);
} 

You are not initializing the OpenGL context. See the NeHe tutorial, specifically the body of the CreateGLWindow() function. If you initialize the context in double-buffered mode, you will also have to call SwapBuffers() after each frame is rendered.
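In outline, the missing piece looks something like this – a bare-bones sketch of what CreateGLWindow() sets up, with all error checking omitted (variable names here are just for illustration):

HDC hDC = GetDC(hWnd);                        // the window's device context

PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;

int pf = ChoosePixelFormat(hDC, &pfd);        // find a matching pixel format
SetPixelFormat(hDC, pf, &pfd);                // apply it to the window's DC

hRC = wglCreateContext(hDC);                  // create the OpenGL rendering context
wglMakeCurrent(hDC, hRC);                     // make it current before any gl* calls

// ...and since PFD_DOUBLEBUFFER is set, at the end of each frame:
SwapBuffers(hDC);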

You are setting your clear color to red:
glClearColor(1.0f, 0.0f, 0.0f, 0.5f);

The parameters go red, green, blue, and alpha.

To set it to black, try setting R, G, and B all to 0.0f, like so:

glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
Cheers!

Sorry, I misread your post. Never mind my last post.

Note: That NeHe tutorial has an error. Instead of the following line:
memset(&dmScreenSettings,0,sizeof(dmScreenSettings));
you should call the function EnumDisplaySettings() to get the current settings of the system. If you set all the values to 0, you use all the default values, and so you set the refresh rate of your monitor to 60Hz. But if you use the function EnumDisplaySettings(), you fill the DEVMODE struct with the current settings. We spent several days discovering this error :frowning: (I hope that someone reports this error to NeHe :wink: )
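In other words, roughly like this (just a sketch of the idea):

DEVMODE dmScreenSettings = {0};
dmScreenSettings.dmSize = sizeof(dmScreenSettings);
// Fill the struct with the monitor's CURRENT settings instead of zeros...
EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dmScreenSettings);
// ...then override only what the game needs to change.
dmScreenSettings.dmPelsWidth  = WIDTH;
dmScreenSettings.dmPelsHeight = HEIGHT;
dmScreenSettings.dmBitsPerPel = 32;
// DM_DISPLAYFREQUENCY keeps the refresh rate EnumDisplaySettings() filled in.
dmScreenSettings.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT |
                            DM_BITSPERPEL | DM_DISPLAYFREQUENCY;
ChangeDisplaySettings(&dmScreenSettings, CDS_FULLSCREEN);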
-Ehsan-

But if you use the function EnumDisplaySettings(), you fill the DEVMODE struct with the current settings.

There is one problem with using EnumDisplaySettings(). If Windows has incorrect information about the monitor’s parameters, this function may report frequencies which, while supported by the card, are not supported by the monitor. So 60Hz is a safe bet rather than an error.

The minimum refresh rate reported by Windows is 60Hz. So with the function EnumDisplaySettings(), the refresh rate of our monitor is at least 60Hz.
So with:
1) memset(): monitor refresh rate = 60Hz
2) EnumDisplaySettings(): monitor refresh rate >= 60Hz

EnumDisplaySettings() gets the current graphics settings. If the user has set the monitor refresh rate to 75Hz, the maximum refresh rate in full-screen mode is 75Hz. If the current refresh rate is 60Hz, the maximum refresh rate in the game is 60Hz. But memset() always sets the monitor refresh rate to 60Hz.
60Hz is not a suitable refresh rate for a commercial game.
-Ehsan-

“My goal was to make a simple scene that displays nothing but a red background.”

That’s a lot of code for just clearing the framebuffer.
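For comparison, a GLUT version of the same red screen fits in a dozen or so lines (an untested sketch):

#include <GL/glut.h>

void display(void) {
	glClear(GL_COLOR_BUFFER_BIT);   // clear to the red set below
	glutSwapBuffers();
}

int main(int argc, char **argv) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
	glutInitWindowSize(800, 600);
	glutCreateWindow("Red screen");
	glClearColor(1.0f, 0.0f, 0.0f, 1.0f);   // red clear color
	glutDisplayFunc(display);
	glutMainLoop();
	return 0;
}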

“60Hz is not a suitable refresh rate for a commercial game.”

Ehsan, 60Hz is the ideal refresh rate for commercial games – you should probably get your facts straight. In fact, I can’t really make sense of your post at all, other than that it has something to do with monitor frequency.

I know from personal experience that 60Hz is definitely not the ideal refresh rate. 85Hz is the recommended rate for viewing on a CRT computer monitor. 60Hz has been known to cause severe eye strain and headaches.
Personally, I hate games that force 60Hz.

Realize that on an LCD monitor, you can’t tell 60Hz from 85Hz.


EnumDisplaySettings() gets the current graphics settings. If the user set the monitor refresh rate to 75Hz, maximum refresh rate in full screen mode is 75Hz.

As long as the full-screen mode has the same (or lower) resolution as the mode you called that function in, which may not always be true.

“85Hz is the recommended rate for viewing on a CRT computer monitor.”

Recommended by whom? I’m aware of no such recommendation. It is true that some people prefer higher refresh rates on CRTs (including myself), but I’ve never noticed the difference in a game, and I consider myself very sensitive to flicker – I can actually see the flicker at 60Hz on a white screen.

Aside from the eyestrain you speak of (primarily due to prolonged reading), there’s no reason at all to actually render in excess of 60Hz – the eye simply won’t notice the difference, and that’s time better spent on other things, such as physics and AI.

“Realize that on an LCD monitor, you can’t tell 60Hz from 85Hz.”

That’s nonsense.

I think you’re confusing framerate and refresh rate. A higher refresh rate does not cost performance.

About the eye not noticing the difference above 60 FPS: I know people who can tell the difference between 90 and 100 FPS (I myself can’t, but they demonstrated it to me, as I didn’t believe them either).

Recommended by whom?
For example:
http://en.wikipedia.org/wiki/Refresh_rate

“I think you’re confusing framerate and refresh rate.”

No, I was making the distinction. There’s no need to render in excess of 60Hz – ergo there’s no need to refresh in excess of 60Hz – in a game. True, there may be some discomfort for some people with big CRTs, but I’d have to see the math to believe that 90-100Hz makes a difference – that probably has more to do with perceived smoothness of mouse motion or something to do with darting the view around quickly (QuakeCon alumni testimonials, and the like).

“A higher refresh rate does not cost performance.”

I’m not so sure about that. I think it does actually cost (GPU) performance, but I’ll have to defer the details to a future that has them in front of me.

Originally posted by <TinWhisker>:
“A higher refresh rate does not cost performance.”

I’m not so sure about that. I think it does actually cost (GPU) performance, but I’ll have to defer the details to a future that has them in front of me.
No, it does not. How could it? The refresh rate only concerns the monitor, apart from the fact that your own code may have to wait before sending the image. Nevertheless, I don’t see how it could cost GPU performance, since most of the time it has simply nothing to do with it. In some other situations you use extensions to do that synchronization, but the GPU, from my point of view, is idling, not over-consuming. What do you do when you have to wait? Wait… or do other work.
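For what it’s worth, the kind of extension I mean is WGL_EXT_swap_control, loaded roughly like this (a sketch; it assumes a context is already current):

// Enable vsync if the driver exposes WGL_EXT_swap_control.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);
PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
	(PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
if (wglSwapIntervalEXT)
	wglSwapIntervalEXT(1);   // wait for one vertical retrace per SwapBuffers()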

How could it?

By consuming part of the memory bandwidth that could otherwise be used for other work. At 1600x1200, 32-bit color, and 100Hz, the card’s output circuits need to read approximately 732MB per second. For current cards this is not too important, but it was in the past.
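(Spelling that out: 1600 × 1200 pixels × 4 bytes per pixel × 100Hz = 768,000,000 bytes per second, or roughly 732 MiB/s.)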

“Realize that on an LCD monitor, you can’t tell 60Hz from 85Hz.”

That’s nonsense.
Well, for my part, I have never seen an LCD screen able to do more than 60Hz.

“Well for my part I have never seen any LCD screen able to do more than 60 Hz.”

I have. In fact, most of the new (good) LCDs range anywhere from 60Hz to around 75Hz or so. But what does refresh rate have to do with flicker on LCDs, anyway? Both active and passive LCD technologies maintain their “pixel state” between refreshes. Response time is the sticking point with LCDs, and 8-16ms is considered sufficient to avoid most blurring and ghosting artifacts, although not entirely – it turns out that more than just response time contributes to this. Flicker, on the other hand, shouldn’t be an issue at all with LCDs.

“Nevertheless I don’t see how it could cost GPU performance since in most of the time it has simply nothing to do with it.”

With a CRT, the electron beam scans out the display line by line, exciting the phosphor-coated screen as it goes. After the beam passes over a pixel, the pixel begins its return to its normal state (black), hence the flicker with CRTs – a problem made worse by larger screen sizes, since there’s more screen for the beam to cover in the same amount of time. No state is permanently retained in the phosphor, so the controlling mechanism must pull from video memory to refresh the display frequently enough to fool the eye into thinking the image is persistent. I’m not clear on the exact details of this operation, but suffice it to say that it’s not free.

This is also why flicker is not really noticeable unless there’s a high degree of contrast, such as when reading black text on a white background. The constant oscillation between black and white is very noticeable. But in the typically dark and colorful worlds of games, this really isn’t the case, hence it’s far less objectionable.

TVs, while CRTs themselves, don’t suffer as badly as computer monitors, even at 60Hz – for reasons that are not completely known to me (something to do with the phosphor process, if I had to guess). Ever gotten eyestrain from playing a console game on your TV?

CRTs are on their way out, permanently. And that being the case, I can see no reason why refresh rates would be an issue in the future – not where image quality is concerned, anyway.

Can you try calling InitGL() earlier? Maybe before ShowWindow()?

Just a guess, because I had the same problem in GLUT when the call to init() was not made early enough.