Matrox G400 initialisation

I’m having trouble getting a Matrox G400 to initialise properly to do OpenGL rendering. The same code works fine on Voodoo 1 (!) (when set to 16 bit), TNT, and GeForce cards, but the Matrox card falls over in SetPixelFormat() using the latest drivers (on NT4 SP6a). Here’s a snippet of the init code:

// Initialise DirectDraw:
if (DirectDrawCreate(NULL, &lpdd, NULL) != DD_OK)
{
	printf("Failed to create directdraw

");
return -1;
}

// Step 1: set cooperative level so that we get exclusive access to the whole
// screen.
if (lpdd->SetCooperativeLevel(hWnd, DDSCL_FULLSCREEN | DDSCL_EXCLUSIVE | DDSCL_ALLOWREBOOT) != DD_OK)
{
	printf("Failed to set cooperative level

");
Destroy();
return -1;
}

// Step 2: set the display mode to 640x480x32
if (lpdd->SetDisplayMode(
		640,
		480,
		32
	) != DD_OK)
{
	printf("Failed to set screen mode

");
Destroy();
return -1;
}

PIXELFORMATDESCRIPTOR pfd = { 
	sizeof(PIXELFORMATDESCRIPTOR),   // size of this pfd 
	1,                     // version number 
	PFD_DRAW_TO_WINDOW |   // support window 
	PFD_SUPPORT_OPENGL |   // support OpenGL 
	PFD_DEPTH_DONTCARE |      // any depth buffer is acceptable
	PFD_GENERIC_ACCELERATED | // accelerated generic (MCD) formats are acceptable
	PFD_DOUBLEBUFFER,      // double buffered 
	PFD_TYPE_RGBA,         // RGBA type 
	24,                    // 24-bit color depth 
	0, 0, 0, 0, 0, 0,      // color bits ignored 
	0,                     // no alpha buffer 
	0,                     // shift bit ignored 
	0,                     // no accumulation buffer 
	0, 0, 0, 0,            // accum bits ignored 
	0,                     // NO z-buffer 
	1,                     // 1-bit stencil buffer 
	0,                     // no auxiliary buffer 
	PFD_MAIN_PLANE,        // main layer
	0,                     // reserved 
	0, 0, 0                // layer masks ignored
};
int  iPixelFormat;  

hdc = GetWindowDC(hWnd);  

// get the best available match of pixel format for the device context  
iPixelFormat = ChoosePixelFormat(hdc, &pfd);
if (iPixelFormat == 0) {
	printf("Failed to choose pixel format.

");
Destroy();
return -1;
}

printf("Pixel format is %d

", iPixelFormat);

// make that the pixel format of the device context 
if (SetPixelFormat(hdc, iPixelFormat, &pfd) == FALSE) {
	printf("Failed to set pixel format

");
Destroy();
return -1;
}

printf("Created display

");

// Hide the mouse
ShowCursor(FALSE);

return 1;
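
(That’s where the snippet ends; once SetPixelFormat succeeds, the rest of the init is just the usual wgl boilerplate - roughly like this, a sketch rather than the exact code:)

// Create and activate an OpenGL rendering context on the same DC.
HGLRC hglrc = wglCreateContext(hdc);
if (hglrc == NULL || !wglMakeCurrent(hdc, hglrc))
{
	printf("Failed to create GL context\n");
	Destroy();
	return -1;
}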

Try using 16 or 32 for color depth instead of 24. I remember having trouble with some cards / drivers when using 24 bit.

OK, tried 32, and it didn’t like it. Tried 16 and finally it gets past the initialisation, but displays totally unrecognisable textures. (My test program displays a string 0123456789ABCDEFH…Z which follows the mouse; on the G400, I get a bizarre embossed grey “Z” shape which follows the mouse in place of the 0, and nothing else).

No OpenGL errors are being reported so glTexImage2D would appear to be doing its business correctly.
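
(For what it’s worth, the check is just the standard glGetError drain after the upload - something like:)

// Drain the GL error queue right after glTexImage2D; any GL_INVALID_* value
// here would point at a bad format/size combination rather than a driver bug.
GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
	printf("GL error after glTexImage2D: 0x%x\n", err);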

It’s a little difficult to post the demo program because most of it’s in Java and quite large…

Try to do the initialization without DirectDraw. Do it with only OpenGL calls instead.

Ah, the DirectDraw stuff’s kinda important, in order to force the display nicely into the screen mode I want it in (fullscreen 640x480x32 - this is for a game, y’see). Is there a better way to do this? I specifically don’t want to render to a window or have anything to do with window management in any way (I use DirectInput for keyboard and mouse too).

Hi there!

Hum, well I haven’t had the pleasure of running into a G400 yet, but I think it could be the stencil flag causing your trouble.

Try again in 640x480x32 without the stencil flag.

Regards,

LG

Will have a go.

A rule of thumb: don’t mix two APIs.

Using one API to do one thing, then all of a sudden switching to another API to take over (setting the video mode with DirectX and rendering with OpenGL) is baaad programming.

Bob

Use this to change to fullscreen rendering in OpenGL:
ChangeDisplaySettings(&devmode, CDS_FULLSCREEN);

Tried it. Basically no change, except that when the process abruptly crashes now, I don’t get the display settings restored (DirectDraw was kind enough to do that properly).

The call to SetPixelFormat causes the process to exit without error codes or access violations - bang! Gone. I’ve got a valid pixel format number, so the ICD is presumably matching this OK.

(So far all the tweaks I’ve done I’ve tested on nvidia cards and they still work)

Interesting side effect: when set to 24 bit, SetPixelFormat doesn’t crash and the application runs. But then I get the corrupted texture problem.

Do the Matrox drivers take a copy of the texture pointed to by glTexImage2D, or do they just retain the pointer? I wonder what would happen if I freed the texture memory after a call to glTexImage2D. Is this specified anywhere in the OpenGL specs?

Coco (a MIA contributor) told me that OpenGL keeps a copy of the texture pixels in system RAM, and uses the card’s RAM as a cache.

I made tests on a G200, a TNT and an NV10, and found the same predicted behaviour.
Also, since the G200 is from Matrox, I think this is the same for the G400.

I even tried to go above the RAM of the G200 card I have at work (8 megs), and the texture appeared correctly every time… just a bit slow sometimes, due to bus/AGP transfer.

In short, use a system RAM buffer just for the glTexImage() call, then free it.
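
(Something like this - a sketch, assuming width, height and texId are set up elsewhere, and LoadPixels() stands in for however you fill the buffer:)

// Upload from a temporary system-RAM buffer; the driver keeps its own copy,
// so the buffer can be freed straight after the call.
unsigned char *pixels = (unsigned char *)malloc(width * height * 4);
LoadPixels(pixels);   // hypothetical helper - fill the buffer however you like

glBindTexture(GL_TEXTURE_2D, texId);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

free(pixels);         // safe: GL has taken its copy by now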

Dolo//\ightY

Thanks dmy, that’s what I thought was the case.

Back to the original problem though: how about a different tactic - eliminate a dodgy setup. Has anybody got a teeny OpenGL demo source to hand that they know works on the Matrox G400, that I could try? (One that sets the res to 640x480x16 fullscreen.)

Hi there!

Hm, hum. I think the G400 driver is playing tricks on us.

Well, as far as I can see you switched into 32 bits per pixel mode and specified an RGBA pixel format, so the bit depth in the pfd should be 32 bits per pixel as well, with 24 bits for RGB and 8 bits for an alpha or stencil buffer.

Did you ever try this?

PIXELFORMATDESCRIPTOR pfd = {
sizeof(PIXELFORMATDESCRIPTOR), // size of this pfd
1, // version number
PFD_DRAW_TO_WINDOW | // support window
PFD_SUPPORT_OPENGL | // support OpenGL
PFD_DEPTH_DONTCARE |
PFD_GENERIC_ACCELERATED |
PFD_DOUBLEBUFFER, // double buffered
PFD_TYPE_RGBA, // RGBA type
32, // 32-bit color depth
0, 0, 0, 0, 0, 0, // color bits ignored
8, // 8-bit alpha buffer
0, // shift bit ignored
0, // no accumulation buffer
0, 0, 0, 0, // accum bits ignored
0, // NO z-buffer
0, // no stencil buffer
0, // no auxiliary buffer
PFD_MAIN_PLANE, // main layer
0, // reserved
0, 0, 0 // layer masks ignored
};

???

I think the problem is in this command.
As I said before, I don’t have a G400 (but the name Matrox still rings a bell, since I had hair-tearing nights due to their so-called “GDI acceleration”, but that’s another story).

Hm,hm that works perfectly fine on my TNT…

Well, you probably have to play around with that.

Two other things. First, you said that the texture is corrupted. What do you mean by this? Does the texture have a funny “tint” like purple or green? In that case we have an RGB to BGR problem (Windows works with BGR pixels, though I never figured out why they use BGR. Maybe it only means (B)ill (G)ates (R)ules.)
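
(If that’s what it is, the usual fix is either to swap the channels yourself or, if the EXT_bgra extension is available, to upload with GL_BGR_EXT - a sketch, assuming width, height and pixels describe a Windows-style BGR bitmap:)

// Option 1: let GL do the swizzle (requires the EXT_bgra extension).
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_BGR_EXT, GL_UNSIGNED_BYTE, pixels);

// Option 2: swap red and blue in place, then upload as plain GL_RGB.
for (int i = 0; i < width * height * 3; i += 3)
{
	unsigned char t = pixels[i];
	pixels[i]     = pixels[i + 2];
	pixels[i + 2] = t;
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);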

Anyway, the second thing I want to mention is the following. If you switch the display mode via ChangeDisplaySettings(…), the change will affect GDI in general, not only your application (sorry Bob, but that’s what I would call bad programming). As a side effect, the “normal” GDI settings can’t be restored if your app crashes, causing your customers to freak out (been there, done that, got the scars).
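
(If you do use ChangeDisplaySettings, at least restore from every exit path you control; passing NULL reverts to the settings stored in the registry - a sketch:)

// Call this from every shutdown path (and from a crash handler if you have one).
void RestoreDisplayMode(void)
{
	ChangeDisplaySettings(NULL, 0);   // NULL = revert to the registry settings
}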

Looks like a good point to stop the endless post, sorry guys and girls.

With regards,

LG

And may the vector be with you!

Hello again…

lgrosshennig, you said using ChangeDisplaySettings will affect GDI in general, and that this is bad programming. The function works this way, and there is nothing you can do about it, right? What I meant by bad programming was flipping between two completely different APIs.

Bob

Careful you don’t get into a flame war here Bob; you should know them’s fightin’ words.

Forget about bad programming; that’s neither here nor there as it’s only your humble opinion. Nor do I think there is such a thing as an unrelated API - merely an incompatible one - and these aren’t supposed to be incompatible. The DirectX API works perfectly well with OpenGL under Windows and is in fact better than the ChangeDisplaySettings call (which is also an API unrelated to OpenGL), as it cleans up after itself properly and doesn’t interfere with everything else.

The only people that could be accused of bad programming are Matrox, because their drivers don’t conform to the API contract. I know this because simple things crash them or produce unexpected results on the G400, yet work reliably on all the other consumer cards I’ve tested so far.

Back to the matter in hand: I’ll be testing that RGBA mode and see if that does the trick.

Now chill

Hi there!

First things first! Bob, I apologize if I offended you; I didn’t mean to.

But I was squabbled to death (by customers and my ex-boss) when I used the devmode trick, until my brain popped out, so I just can’t recommend this method to others.
It may be OK if you are writing a game or demo, but I wasn’t.

PEACE

LG

May the vector be with you!

Hi again …

I have a G400 too, and I know its OpenGL driver sucks… however, I have a working OpenGL app which initializes with ChangeDisplaySettings(&devmode, CDS_FULLSCREEN);
I could send it to you if you want to.

Felt like saying a few things in this thread.
First, I’m not offended, so you don’t have to apologize.
Second, ChangeDisplaySettings may very well belong to a non-OpenGL API (if it does, it would mean I just said I am a bad programmer), but it’s a standard function in Win32; DirectX is not.

If DirectX works together with OpenGL, there’s no reason not to use it. I just think you shouldn’t mix APIs like that.

What would the world look like if everyone had the same thoughts?

Bob

How’d you do the wacky animated smiley???

Cas