GeForce and Vertex Arrays

I’ve had a number of people tell me that my demo doesn’t work on GeForce cards, so I had no choice but to go and buy a GeForce and debug it :/. The following code was causing the problem:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGB);
.
.
glEnableClientState(GL_NORMAL_ARRAY);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_COLOR_ARRAY);
.
.
glVertexPointer(3,GL_FLOAT,0,mesh_vertices);
glNormalPointer(GL_FLOAT,0,mesh_normals);
glColorPointer(3,GL_UNSIGNED_BYTE,0,mesh_colours);
.
.
glDrawElements(GL_QUAD_STRIP,i4,GL_UNSIGNED_INT,mesh_indices[i]+i5);

Notice anything intrinsically wrong with this code?

The program was hanging as soon as anything was drawn with glDrawElements.

To cut a long story short, the problem was specifically to do with the colour array and the fact that each vertex required 3 bytes for colour, not 4. As soon as I defined the colour pointer to be a float it worked, but of course it slowed my app down. In the end I got round it by sticking with unsigned bytes for colour and using RGBA mode, which maintains 4-byte alignment. This doesn’t just apply to the colour array: the same happens if you use GL_SHORT for the normal array. 2 bytes * 3 components = 6, 6/4 = 1.5 = crash.

Now, as far as I know, everyone with a mainstream card apart from the GeForce has been able to run my app, so this is specific to GeForce cards and probably down to the way they have implemented HW T&L. Just thought I should let you guys know.

As an aside, does anyone know if/how you can remove the DOS box in GLUT apps? Also, can GLUT apps be run fullscreen, as opposed to maximised? Thanks.

On the subject of hiding the DOS box, I tried the following:

HWND console = FindWindow("ConsoleWindowClass", argv[0]);
if (console == NULL)
    cout << "Couldn't get console." << endl;
else if (!ShowWindow(console, SW_HIDE))
    cout << "Couldn't hide window." << endl;

Which worked fine when I stepped through the code (using VC++ 5) with F5, but as soon as I ran it at “full speed”, the FindWindow() function failed. However, I think this is the way to go.
As for the full screen thing, I remember reading some code to do this, but for the life of me, I can’t remember where!
IIRC, you create a window style based on the required resolution, then call some DirectX function. You might want to try Gamasutra or similar site.
It’s not much help, but I know it’s possible.

He Who Still Plays Mode 13h Games.

-John

You can also avoid showing a DOS box in the first place. In the (MSVC6) Project/Settings/Link/General/ProjectOptions box, specify the options:

/subsystem:windows /entry:mainCRTStartup

(GLUT projects use /subsystem:console by default so you’ll probably have to change this.)

This approach has the advantage that it doesn’t affect your source code, so you’re still portable.

Thanks,
I tried that but when I compile I get
unresolved external symbol _mainCRTstartup

so I created the function _mainCRTstartup and got it to call main but it crashes.

I suspect the reason it doesn’t work at full speed is that you look for the window before it actually exists. Try making that code run a second or so AFTER the window is created (do something else before you hide it)… unless there’s (surprise) a bug in something other than your code. Perhaps something written by, say, a company whose name starts with M.

Oh, and GLUT CAN run full screen… I HAD some source for a proggy that did that till my main box’s HD crashed; maybe tomorrow I can D/L it again and find the stuff that does it.

[This message has been edited by Kaeto (edited 05-19-2000).]

Found the zip on a CD I’d burned. The function is

(drum roll)

glutFullScreen(); (big surprise)

Tada! (at least, that’s what his code does… I don’t use glut, but he just sets up a window and then calls that function). If you can’t get it to work, I can try to find the URL I got this from.

Yeah, that works fine. Using that in conjunction with glutReshapeWindow and glutPositionWindow, I can toggle between windowed and full screen. Thanks.