Passing a cubemap as a parameter to my fragment shader

I am extremely confused about how to pass a cube map texture into my fragment program.

Here is the code I used to create the cube map.

glEnableClientState(GL_TEXTURE_CUBE_MAP);

		glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
		glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
		glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
		glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
		glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGBA, envmap->right->w, 
			envmap->right->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->right->pix);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_X, 0, GL_RGBA, envmap->left->w, 
			envmap->left->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->left->pix);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_Y, 0, GL_RGBA, envmap->top->w, 
			envmap->top->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->top->pix);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_Y, 0, GL_RGBA, envmap->front->w, 
			envmap->front->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->bottom->pix);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_Z, 0, GL_RGBA, envmap->front->w, 
			envmap->front->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->back->pix);

		glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_Z, 0, GL_RGBA, envmap->front->w, 
			envmap->front->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->front->pix);
		
		
		glBindTexture(GL_TEXTURE_CUBE_MAP, 1);
		glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
		glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

		glEnable(GL_TEXTURE_CUBE_MAP);

Then, where I do the per-frame initialization for my shader, I have the following code:

fragmentCubeMap = cgGetNamedParameter(fragmentProgram, "cubemap");
cgGLSetTextureParameter(fragmentCubeMap, 1);
cgGLEnableTextureParameter(fragmentCubeMap);

where I need to replace ‘1’ in the call to cgGLSetTextureParameter() with whatever value will give me the cube map I’m building in the previous block of code.
The function header for my shader then looks like this:

PixelDataOutput FragmentMain(VertexDataOutput pdi, 
	uniform float blueHue, uniform float3 lightDirv, uniform float3 ppcC, uniform float blendfactor,
	uniform float canvasFlag, uniform samplerCUBE cubemap) {

where cubemap is, of course, my samplerCUBE parameter.

Right now, when I run what I have, the samplerCUBE is giving me some default data: each face of the cube is just a different solid color, and it has nothing to do with the images I actually specified in the glTexImage2D() calls in the first block of code. I suspect that the magic happens somewhere near glBindTexture(). I’m clearly in over my head here, but I need to get this working somehow, so any help would be greatly appreciated.

  1. This is an invalid call:

glEnableClientState(GL_TEXTURE_CUBE_MAP)

glEnableClientState() only accepts client-side array enums (GL_VERTEX_ARRAY and friends), so this call raises GL_INVALID_ENUM. I suggest that you use glGetError() to detect such problems.
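glGetError() only returns a numeric enum, so a small helper that names the codes makes debugging easier. This helper is not part of OpenGL, just a convenience sketch; the hex values are the standard GL error constants:

```c
/* Hypothetical helper: turn a glGetError() code into a readable name.
   The hex values are the standard OpenGL error enums. */
const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case 0x0000: return "GL_NO_ERROR";
    case 0x0500: return "GL_INVALID_ENUM";      /* what the bad glEnableClientState call raises */
    case 0x0501: return "GL_INVALID_VALUE";
    case 0x0502: return "GL_INVALID_OPERATION";
    case 0x0503: return "GL_STACK_OVERFLOW";
    case 0x0504: return "GL_STACK_UNDERFLOW";
    case 0x0505: return "GL_OUT_OF_MEMORY";
    default:     return "unknown error";
    }
}

/* Usage, after any suspect GL call:
       GLenum err = glGetError();
       if (err != GL_NO_ERROR)
           fprintf(stderr, "GL error: %s\n", gl_error_name(err));
*/
```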

  2. I suggest that you use glGenTextures() to get a texture ID, then call glBindTexture() with that ID, and only then set up your texture with glTexParameteri() and glTexImage2D(). In your code the bind happens after the uploads, so the parameters and face images go to the default texture object, not to the one you later bind.
  3. There is no need to call glTexEnvi(). It does nothing when you are using shaders.
  4. There is no need to call glEnable(GL_TEXTURE_CUBE_MAP). It also does nothing when you are using shaders.
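Putting these points together, the setup from the question could be restructured roughly like this. It is a sketch only (it needs a live GL context and the Cg runtime to run): the envmap fields and the fragmentCubeMap parameter are taken from the question, and the face-to-target mapping is kept as posted. Note that the original glTexImage2D() calls mixed front’s width/height with other faces’ pixel data, which looks like a copy-paste slip; here each face uses its own image throughout.

```c
GLuint cubemapID;

glGenTextures(1, &cubemapID);                   /* get a fresh texture ID (point 2)        */
glBindTexture(GL_TEXTURE_CUBE_MAP, cubemapID);  /* bind it BEFORE any setup calls          */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);          /* pixel-store state, set before uploads   */

glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* Each face uses its own image's dimensions and pixels. */
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGBA, envmap->right->w,
             envmap->right->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->right->pix);
glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_X, 0, GL_RGBA, envmap->left->w,
             envmap->left->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->left->pix);
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_Y, 0, GL_RGBA, envmap->top->w,
             envmap->top->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->top->pix);
glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_Y, 0, GL_RGBA, envmap->bottom->w,
             envmap->bottom->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->bottom->pix);
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_Z, 0, GL_RGBA, envmap->back->w,
             envmap->back->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->back->pix);
glTexImage2D(GL_TEXTURE_CUBE_MAP_NEGATIVE_Z, 0, GL_RGBA, envmap->front->w,
             envmap->front->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, envmap->front->pix);

/* Then hand the generated ID, not a hard-coded 1, to Cg.
   cgGLEnableTextureParameter takes care of binding it to a texture unit. */
cgGLSetTextureParameter(fragmentCubeMap, cubemapID);
cgGLEnableTextureParameter(fragmentCubeMap);
```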