View Full Version : GL2 core VS extensions
Going straight to the point, I finally have some time (and need) to play a little with GLSL.
Now, I see GL2 core is a bit nicer with GLhandle being a GLuint and shorter function names...
I am used to doing everything through extensions and I seldom use core functions because of the naming.
I would like to fetch the GL2 core pointers when asked for ARB_shader_objects and such.
This means for example, I'll fetch CreateShader instead of CreateShaderObjectARB.
Are there any long-lasting problems you could warn me about?
10-29-2005, 06:16 AM
What do you mean when you say "fetch GL2 core pointers when asked for ARB_shader_objects and such"?
Who is doing the asking? Are you writing a wrapper for OpenGL?
I also find it strange that you don't like to use core features because of the naming. How so? Most of the time they just drop the extension part of the entry point.
Also note that shader programs and shader objects are destroyed differently in GL2 core.
See here for a complete listing of changes:
You can call your function pointer variable whatever you'd like, e.g. name the pointer to glCreateShaderObjectARB simply CS, and so on. But for uniforms and the like you'll meet many functions with many suffixes, so you'll have to take that into account.
Finally, I don't really understand what you'd really like to know. In theory GL 2 core does not rely much on ARB/EXT extensions (this is the case under Linux, but doesn't seem to be the case under W32), so you can freely and directly call the functions (and I think without the ARB/EXT suffixes, don't remember :) without having to get their addresses.
Is that what you meant ?
Up to now, being on win32, I always used extensions only, keeping core GL at standard 1.1 as a reminder that that's the state of the actual library implementation on Windows. You know, everybody uses this, that and the other extension.
When I fetched, for example, GL 1.4 functionality, I didn't actually fetch core names: instead I fetched extension names and pointers. This has worked pretty smoothly up to now.
The problem is that I don't much like the function names provided by ARB_shader_objects and similar extensions, and I would like to use the GL2 names. GL2, however, manages shaders slightly differently: to a certain degree, (GL20) != (GL15 + shader_extensions).
This disoriented me quite a bit, because there would have been no way to get those functionalities without some care.
I actually understand my habit is bogus: I shall stop thinking in terms of the single extensions and go straight for core names.
Forgive me, the problem was just that I had no way to map, for example, the two CreateShader(GLenum) and CreateProgram(void) to CreateShaderObjectARB(GLenum) in any immediate way like before (MultiDrawElementsEXT --> MultiDrawElements comes to mind). I see now I shall not rely on this anymore.
The thing I was thinking (before) was to somehow use 'core like names'. For example I may have called CreateProgram as CreateProgramARB. The problem here is that I had no proof about how the driver would have liked this, in the sense that I would have requested for "CreateProgram" handle in a branch to really fetch ARB_fragment_shader. I began thinking about the possibility for a driver to support ARB_shader_object but no GL2 so the query would have failed.
I want to repeat it: I am now pretty sure this is really bogus and the solution is to simply live happily with core function names - this issue arose from the (broken) way I manage OpenGL versioning. It's time to make it work for real (I am not going to spend time on compatibility issues anyway, so I lose nothing).
Thank you anyway for your feedback.
Search for NVIDIA_opengl_2.0_support.pdf:
it outlines the changes that GLSL underwent when it became part of the GL core.
Powered by vBulletin® Version 4.2.2 Copyright © 2015 vBulletin Solutions, Inc. All rights reserved.