Strange lighting

Ok, I’m fairly new to OGL, but I’ve done alright so far. I’m developing some software that represents tables of data in 3D graph format. The code is fairly basic, and nothing is terribly complex about the graphs.
In developing it I’ve gotten nearly everything to work; however, I seem to have some lighting issues that manifest themselves on two machines here.
They both have different ATI cards, though I did not enable acceleration in the pixel format descriptor (enabling it doesn’t fix the problem either). I’m not sure how that really works, as the OpenGL DLL still seems to be an ATI-supplied file.
Otherwise, I have it working on two machines with Nvidia cards and on a machine with no hardware acceleration at all.
Basically, the problem is that on the ATI machines the scene appears to lack the positional light I have set up, while the scene works on all the other machines I’ve tested. If you move the camera position to exactly where the light is positioned, the scene looks correct. Removing or altering the specular components of the material and the light doesn’t seem to have any effect.
It appears almost as if the normals of every vertex are equal, since rotating the object changes the intensity of all points equally (even though the points all have different normals).
I’ve traced through the code and I can see that I am setting the normals how I want, but they just don’t seem to take effect unless I am in line with the light.
Has anyone experienced this, or does anyone have any guesses on what I can check?

Here is my lighting initialization

      GLfloat lightPos[]={4.0f, -2.0f, 4.0f, 0.0f};    	         // set light position
      glLightfv(GL_LIGHT0, GL_POSITION, lightPos);

      GLfloat diffuseLight[]={0.8f, 0.8f, 0.8f, 1.0f};    	     // set diffuse light parameters
      glLightfv(GL_LIGHT0, GL_DIFFUSE, diffuseLight);

      GLfloat ambientLight[]={0.2f, 0.2f, 0.2f, 1.0f};    	     // set ambient light parameters
      glLightfv(GL_LIGHT0, GL_AMBIENT, ambientLight);

      GLfloat specularLight[]={0.6f, 0.6f, 0.6f, 1.0f};          // set specular light parameters
      glLightfv(GL_LIGHT0, GL_SPECULAR, specularLight);
      glLightModelfv(GL_LIGHT_MODEL_AMBIENT, ambientLight);	     // set light model

      GLfloat specularReflection[]={0.8f, 0.8f, 0.8f, 1.0f};     // set specularity
      glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, specularReflection);
      glMateriali(GL_FRONT_AND_BACK, GL_SHININESS, 128);
      glMaterialfv(GL_FRONT_AND_BACK, GL_DIFFUSE, diffuseLight);
      glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT, ambientLight);

      glEnable(GL_LIGHT0);                         	             // activate light0
      glEnable(GL_LIGHTING);                       	              // enable lighting
      glEnable(GL_NORMALIZE);

Here is the basic code that is looped through to build my display list

  if(_GenerateNormal(u, v, w, normal))
    glNormal3fv(normal);
  refuColor = _GetCellColor(*(zIter + 1));
  fvuColor[0] = GetRValue(refuColor)/(float)0xFF;
  fvuColor[1] = GetGValue(refuColor)/(float)0xFF;
  fvuColor[2] = GetBValue(refuColor)/(float)0xFF;
  glColor3fv(fvuColor);
  glVertex3fv(u);

_GenerateNormal, by the way, returns true if lighting is enabled and populates my normal array with a non-unit-length normal; I let OpenGL handle the conversion to unit length via GL_NORMALIZE.
Wow, that got long.
Anyway, any help at all would be appreciated. Thanks!

It looks OK when you position the eye at the light? What matrix is on the modelview when you position the light?

This is critical to the light position, but it should affect all implementations the same way.

Loading the identity onto the modelview before you position the light will specify the light location relative to the eye. Loading the viewing matrix prior to positioning the light will position the light relative to the world origin.

Here is my camera positioning/aiming code

  glPushMatrix();
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(35.0f, m_dAspectRatio, m_dClipNear, m_dClipFar);
    ReAimCamera();
    glMatrixMode(GL_MODELVIEW);
  glPopMatrix();

and shortly after that, I set up the lights like I showed earlier. So I haven’t loaded the identity, and I should still be in modelview mode.
It seems to be an ATI specific problem, but I can’t find anything referencing that.
Can you see anything wrong with my specular lighting or material?
I’ll try to get some screenshots later so I can better illustrate what I’m talking about.
So, to clear just one thing up about the pixel format descriptor: does setting it with PFD_GENERIC_ACCELERATED enable hardware acceleration? And is PFD_GENERIC_FORMAT software rendering?
What I can’t figure out is that, as far as I can tell, I’m not going beyond OpenGL 1.1, which should be uniformly supported everywhere, right? So why does this problem only show up on ATI, so far?
Thanks!

Ok. Here are the screenshots
Sorry it’s so small, but Yahoo is being dumb.
I merged 3 shots together; going from top to bottom is basically what you see as you move past the light source on just 2 of the 7 machines I’ve tested it on. Those two have ATI cards. Two others have Nvidia cards, two of the remaining have no hardware acceleration at all, and the last machine I don’t really know, but it’s a laptop, so if it is accelerated, it’s likely not very advanced.
Anyway, on those machines the graph looks like the middle picture all the time (though reflections change as the object is rotated).
Thanks.

PFD_GENERIC_FORMAT is software rendering…
P.S. you can upload pictures here: imageshack

I recommend www.photobucket.com, best for your screenshot needs! :slight_smile:

Ok, so I’ve tracked the problem down to GL_NORMALIZE.

When I calculate my own unit-length normals (I was letting OpenGL do it for me because I have a lot of scaling going on), things look correct. It seems there is some incompatibility between ATI cards and GL_NORMALIZE, or at least with the way I am enabling it. Anyone have any ideas? Thanks!

Ok, so I just wrote a Normalize function that converts my normals to unit length, so I sidestepped that problem, but if anyone still knows why GL_NORMALIZE doesn’t work on ATI cards, please let me know.
…So now, the new problem with ATI cards: glColorMaterial seems to act strangely. I have some non-light-affected text I want to display, so I set this

  glEnable(GL_COLOR_MATERIAL);
...
  GLfloat fvWhite[] = {1.0f, 1.0f, 1.0f};
...
  glColorMaterial(GL_FRONT, GL_EMISSION);
  glColor3fv(fvWhite);

to make it always be as bright white as possible.
However it seems to just make the text’s color change as its position changes. Then later on when my actual data is drawn I set

  glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);

and then proceed to set the color based on the Z value (I have an algorithm which works for this). The problem is, what I draw at this point is ALL white with no shading. It’s almost as if these two color materials are opposite each other. I know they are not, though, because if I just remove the first glColorMaterial call with GL_EMISSION, the table data fixes itself, but the text is still broken. I’m also pushing and popping the GL_LIGHTING_BIT attribute between these. Anyone have any ideas? Thanks!
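One GL behavior that might explain this: a material property set through color tracking keeps its last value after glColorMaterial switches modes, so the white emission from the text pass could still be stuck on the material when the data is drawn. A sketch of a possible reset (an untested guess, not a confirmed fix):

```c
/* Untested guess: emission set via color tracking persists after the
 * glColorMaterial mode changes, so clear it before drawing lit data. */
GLfloat black[] = {0.0f, 0.0f, 0.0f, 1.0f};
glMaterialfv(GL_FRONT, GL_EMISSION, black);        // reset any stuck emission
glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE); // then track ambient/diffuse
```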