What is the correct way to get the OpenGL version on an OpenGL core context?

I am using:
Code :
GLint major = 0;
GLint minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);
glGetIntegerv(GL_MINOR_VERSION, &minor);

but both calls set a GL_INVALID_ENUM error.

The glGetIntegerv reference page says both tokens are accepted. What gives?

This is on Mac OS X 10.9, with an NVIDIA 650M and a core context.
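
In case it matters, this is the kind of string-parsing fallback I could use instead (just a sketch, assuming a current core context and the OS X gl3.h header): the GL_VERSION string always starts with "<major>.<minor>", so sscanf can pull the numbers out. I'd still prefer to know why the integer queries fail, though.
Code :
/* Sketch of a fallback: parse glGetString(GL_VERSION) instead of using
 * the integer queries. Assumes a current core-profile context already
 * exists (context creation omitted). */
#include <stdio.h>
#include <OpenGL/gl3.h>   /* core-profile header on OS X */

static void get_gl_version_from_string(int *major, int *minor)
{
    const char *version = (const char *)glGetString(GL_VERSION);
    *major = 0;
    *minor = 0;
    if (version != NULL)
        sscanf(version, "%d.%d", major, minor);  /* e.g. "4.1 ..." -> 4, 1 */
}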