mapping C++ types to OpenGL types

Hi,
I’m writing code that is meant to be very portable and safe.
The problem I’ve run into is that I see no safe way of matching C++ types with OpenGL types. For example, in C++ I store triangle indices as unsigned long integers. When I want to pass an array of them to an OpenGL drawing function, I need to supply an OpenGL type identifier (e.g. GL_UNSIGNED_INT). How can I be sure that the size of the OpenGL type GL_UNSIGNED_INT is equal to the size of a C++ unsigned long int?
I cannot simply do sizeof(GL_UNSIGNED_INT), because GL_UNSIGNED_INT is just a numeric constant, not a typedef.
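
To make it concrete, this is roughly the kind of call I mean (a simplified sketch, not my actual code; I’m assuming C++11 for vector::data):

#include <GL/gl.h>
#include <vector>

// Simplified sketch of the situation: indices live in C++ as unsigned long int,
// but glDrawElements only takes a GLenum type identifier for them.
std::vector<unsigned long int> triangleIndices;

void drawMesh()
{
    glDrawElements(GL_TRIANGLES,
                   static_cast<GLsizei>(triangleIndices.size()),
                   GL_UNSIGNED_INT,    // is this guaranteed to match unsigned long int?
                   triangleIndices.data());
}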

GLuint?

I completely forgot about the GL typedefs.
So is this test enough?
sizeof(GLuint) == sizeof(unsigned long int)
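
If the idea is right, I would rather make it a compile-time check than a runtime test, something like this (assuming a C++11 compiler):

#include <GL/gl.h>

// Compile-time version of the same test (C++11 static_assert): the build fails
// on any platform where GLuint and unsigned long int differ in size.
static_assert(sizeof(GLuint) == sizeof(unsigned long int),
              "GLuint and unsigned long int have different sizes on this platform");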

GLuint is not unsigned long int.
As its name says, GLuint is unsigned int.

You can cast between them as long as the values fit. But in this case you should simply use the unsigned int type (i.e. GLuint) for your indices; it can hold far larger values than you will ever need for indices in 3D graphics.
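
For example, a minimal sketch of that suggestion (buffer setup omitted, names are just illustrative):

#include <GL/gl.h>
#include <vector>

// Store indices as GLuint and always pair them with GL_UNSIGNED_INT, so the
// C++ element type and the GL type identifier cannot drift apart.
std::vector<GLuint> indices;

void drawTriangles()
{
    glDrawElements(GL_TRIANGLES,
                   static_cast<GLsizei>(indices.size()),
                   GL_UNSIGNED_INT,    // matches GLuint
                   indices.data());
}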

In addition, long is not portable: its size differs between 32-bit and 64-bit architectures, and even between different 64-bit platforms:

see “Table 1. 32-bit and 64-bit data model” at http://www.ibm.com/developerworks/library/l-port64.html
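
A quick way to see this on your own targets (just a sketch; the numbers depend on the platform’s data model):

#include <GL/gl.h>
#include <cstdio>

// On ILP32 and LLP64 (64-bit Windows) long is 32 bits, on LP64 (most 64-bit
// Linux/Unix systems) it is 64 bits, while GLuint is 32 bits on all of them.
int main()
{
    std::printf("sizeof(unsigned int)      = %u\n", (unsigned) sizeof(unsigned int));
    std::printf("sizeof(unsigned long int) = %u\n", (unsigned) sizeof(unsigned long int));
    std::printf("sizeof(GLuint)            = %u\n", (unsigned) sizeof(GLuint));
    return 0;
}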