There are a few things going wrong and I don't know why. I don't have the code in front of me, but here's the gist of it.
glGetString(GL_EXTENSIONS) only returns 3 or 4 extensions, but the same call in another program returns dozens of extensions. I'm using the same glext.h file in both programs. Other than that, I do everything the way you have it.
Originally posted by lucidmm: glGetString(GL_EXTENSIONS) only returns 3 or 4 extensions, but the same call in another program returns dozens of extensions. I'm using the same glext.h file in both programs. Other than that, I do everything the way you have it.
Whatever version of glext.h you have isn't important, because it has nothing to do with the number of extensions you get back from a call to glGetString(GL_EXTENSIONS). The header only declares entry points and tokens; the extension string itself comes from the driver.
A possible reason is that one program is hardware accelerated and the other is running in software. Use glGetString(GL_VENDOR) to check.
Also make sure you have a valid OpenGL rendering context current before calling glGetString.
Right, make sure you are not rendering via the MS software OpenGL implementation. If you see these strings, you're in software:
GL_VENDOR string is: Microsoft Corporation
GL_RENDERER string is: GDI Generic
You can’t get destination alpha in a 16-bit color mode, so if you request it, it’ll probably boot you into software mode. People reported exactly the same problem to me when I recently released a per-pixel lighting demo on my site (www.delphi3d.net).
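For reference, this is the kind of pixel format request that triggers the fallback: asking for alpha bits while the desktop is 16-bit. A sketch of the relevant PIXELFORMATDESCRIPTOR setup (Win32 config fragment; field values other than the commented ones are just typical defaults):

```c
#include <windows.h>

/* Requesting destination alpha: cAlphaBits > 0 only works together with
 * 32-bit color. On a 16-bit desktop this combination typically falls back
 * to the GDI Generic software renderer. */
PIXELFORMATDESCRIPTOR pfd = {
    sizeof(PIXELFORMATDESCRIPTOR), 1,
    PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
    PFD_TYPE_RGBA,
    32,                 /* cColorBits: ask for 32-bit color ...           */
    0, 0, 0, 0, 0, 0,   /* red/green/blue bits and shifts: don't care     */
    8, 0,               /* cAlphaBits: ... so destination alpha can work  */
    0, 0, 0, 0, 0,      /* no accumulation buffer                         */
    16,                 /* cDepthBits: 16-bit depth buffer                */
    0, 0, PFD_MAIN_PLANE, 0, 0, 0, 0
};
```

After ChoosePixelFormat/SetPixelFormat, it's worth reading the format back with DescribePixelFormat and checking whether PFD_GENERIC_FORMAT is set in dwFlags before blaming the driver.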
I don’t get it. This is the 21st century, for cryin’ out loud! Why are so many people still running their displays in dark-age color depths?
You don’t have to go fullscreen; you can just change the bit depth using ChangeDisplaySettings, then create your window.
I know it’s rude, but sod 'em.
I’m the other way round: if I download a demo, I can’t help but sigh when my monitor clicks as it changes my resolution, especially if I’ve got lots of apps running at the same time, since the mode-switch time becomes very annoying (and if I’m lucky, they won’t have messed up all my icon positions).