glNormalPointer isn't doing normal things!

To be specific, glNormalPointer doesn't appear to be working: objects that have normals associated with them render dark, as if they weren't being lit at all.

Here's some code and screenshots of the issue.

Init function:

GLint normalData[]=
{
0,0,0, 
0,0,0, 
0,0,0,
0,0,0,
0,0,0,
0,0,0,
};

GLuint texcoords[] ={
1,1,
1,0,
0,0,
0,0,
0,1,
1,1,
};

GLuint BufferName[3];

GLfloat PositionData[]=
{
	-1.0f,-1.0f,0.0f,
	 1.0f,-1.0f,0.0f,
	 1.0f, 1.0f,0.0f,
	 1.0f, 1.0f,0.0f,
	-1.0f, 1.0f,0.0f,
	-1.0f,-1.0f,0.0f,
};

	glViewport(0,0,SCREEN_WIDTH,SCREEN_HEIGHT);						// Reset The Current Viewport
	glMatrixMode(GL_PROJECTION);						// Select The Projection Matrix
	glLoadIdentity();									// Reset The Projection Matrix
	// Calculate The Aspect Ratio Of The Window
	gluPerspective(45.0f,(GLfloat)640/(GLfloat)480,0.1f,1000.0f);
	glMatrixMode(GL_MODELVIEW);							// Select The Modelview Matrix
	glLoadIdentity();

	glEnable(GL_TEXTURE_2D);						// Enable Texture Mapping
	glShadeModel(GL_SMOOTH);						// Enable Smooth Shading
	glClearColor(0.278f, 0.666f, 1.0f, 0.5f);					// Background
	glClearDepth(1.0f);							// Depth Buffer Setup
	glEnable(GL_DEPTH_TEST);						// Enables Depth Testing
	glDepthFunc(GL_LEQUAL);							// The Type Of Depth Testing To Do
	GLfloat LightAmbient[]= { 0.5f, 0.5f, 0.5f, 1.0f }; 				// Ambient Light Values ( NEW )
	GLfloat LightDiffuse[]= { 1.0f, 1.0f, 1.0f, 1.0f };				 // Diffuse Light Values ( NEW )
	GLfloat LightPosition[]= { 0.0f, 0.0f, 2.0f, 1.0f };				 // Light Position ( NEW )


	glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);			// Really Nice Perspective Calculations
	glLightfv(GL_LIGHT1, GL_AMBIENT, LightAmbient);				// Setup The Ambient Light
	glLightfv(GL_LIGHT1, GL_DIFFUSE, LightDiffuse);				// Setup The Diffuse Light
	glLightfv(GL_LIGHT1, GL_POSITION, LightPosition);			// Position The Light
	glEnable(GL_LIGHT1);							// Enable Light One
	glEnable(GL_LIGHTING);
	glEnable(GL_POLYGON_SMOOTH);

	glBindBuffer(GL_ARRAY_BUFFER, BufferName[3]);
	glBufferData(GL_ARRAY_BUFFER, sizeof(GLint)*3*6, normalData, GL_STREAM_DRAW);
	glNormalPointer(GL_INT, 0, 0);

	glBindBuffer(GL_ARRAY_BUFFER, BufferName[1]);
	glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*18, PositionData, GL_STREAM_DRAW);
	glVertexPointer(3, GL_FLOAT, 0, 0);

	glBindBuffer(GL_ARRAY_BUFFER, BufferName[2]);
	glBufferData(GL_ARRAY_BUFFER, sizeof(GLint)*12, texcoords, GL_STREAM_DRAW);
	glTexCoordPointer(2, GL_INT, 0, 0);

And then we have the draw function:

	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);	// Clear The Screen And The Depth Buffer
	glLoadIdentity();
	// bunch of stuff for moving the camera around

	glEnableClientState(GL_VERTEX_ARRAY);
	glEnableClientState(GL_NORMAL_ARRAY);
	glEnableClientState(GL_TEXTURE_COORD_ARRAY);

	glDrawArrays(GL_TRIANGLES, 0, 6);

	glDisableClientState(GL_VERTEX_ARRAY);
	glDisableClientState(GL_NORMAL_ARRAY);
	glDisableClientState(GL_TEXTURE_COORD_ARRAY);

	SDL_GL_SwapBuffers();

This screenshot is of the object using the code above:
http://i.imgur.com/zAomz.png
Note: I originally thought the error was that the normals were pointing away from the camera, but I reversed them and still got the same output.

Screenshot of the object when not specifying a normals array (it defaults to a normal of z = 1, I believe):
http://i.imgur.com/sHTbs.png

Screenshot of the object with lighting turned off:
http://i.imgur.com/sHTbs.png

So, any idea what's wrong here? Is it something wrong with my code, or something else?

Your normals are all zeros. Normals need to be unit vectors.
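
For a flat quad facing the camera, the normals could look something like this (just a sketch, assuming your geometry stays in the XY plane as in PositionData):

GLfloat normalData[]=
{
0.0f, 0.0f, 1.0f,	// one unit-length normal per vertex,
0.0f, 0.0f, 1.0f,	// all pointing toward the viewer (+Z)
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
0.0f, 0.0f, 1.0f,
};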

Also, you never generate buffer objects with glGenBuffers. You should do that.
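
Something along these lines, before any of the glBindBuffer/glBufferData calls (a sketch; note also that BufferName[3] in your init code indexes past the end of a 3-element array, so the valid indices are 0, 1 and 2):

GLuint BufferName[3];
glGenBuffers(3, BufferName);			// ask GL for 3 buffer object names
glBindBuffer(GL_ARRAY_BUFFER, BufferName[0]);	// now the binds refer to real buffer objects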

Oops. Well, during debugging I did try different values for the normals, at least once in each direction. How would I even make my buffers without using glGenBuffers?

I mean that you didn’t use glGenBuffers. Or if you did, you didn’t post that code. Did you call it after GL was initialized?
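
In other words, every GL call (glGenBuffers included) has to happen after the context exists. Since you're calling SDL_GL_SwapBuffers, I assume SDL 1.2, so roughly this order (a sketch; your actual window setup will differ):

SDL_Init(SDL_INIT_VIDEO);
SDL_SetVideoMode(SCREEN_WIDTH, SCREEN_HEIGHT, 32, SDL_OPENGL);	// creates the GL context
// ... the init code from above, including glGenBuffers(3, BufferName), goes here ...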

Ah yes, it was in a part of the code I had clipped out.

glGenBuffers(3, BufferName); is right before the second half of the init code, before the buffer binding.

So technically it is called after OpenGL was initialized; is that a problem?

I notice that you use GLint to store your normals. Is that intended?

Ahhhhh, changing the array to float fixed it. Thank you!

You should change your glNormalPointer call as well.
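
With integer types, glNormalPointer treats the components as normalized integers (the full GLint range is mapped linearly to [-1, 1]), so an int value of 1 ends up as essentially 0.0 and the surface receives no diffuse light. With the data changed to GLfloat, the setup would look roughly like this (a sketch; I'm using BufferName[0] for the normals here, since index 3 is out of range for a 3-element array):

glBindBuffer(GL_ARRAY_BUFFER, BufferName[0]);	// buffer holding the normals
glBufferData(GL_ARRAY_BUFFER, sizeof(GLfloat)*3*6, normalData, GL_STREAM_DRAW);
glNormalPointer(GL_FLOAT, 0, 0);		// type must match the data in the buffer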