Simple Shader Setup. Does it exist?

Hi! I’m new here. Nice to meet you. I really hate to join a forum by asking a question, but I’m at my wit’s end.

Situation: As part of my thesis work, I need to implement a program that can initialize and switch between several different shader sets applied to the same set of geometry.

Experience Level: I know the basics of, and some advanced, C++. I know enough GDI to make my own menu system. I’ve used OpenGL several times before, but always in the context of GLUT. I have a good understanding of the graphics pipeline.

Problems: I need to work with shaders, and for the life of me I can’t find tutorials or example code that show a simple way of initializing shaders without an additional extension library. From what I’ve gathered, GLSL is now part of OpenGL’s core, but nearly 100% of the tutorials out there use either GLee or GLEW. GLEW isn’t really an option for me, because I need to run this project on a number of machines, some of which do not have GLEW and cannot have it installed (i.e., school computers). I thought GLee would be the magic bullet, since it can be compiled into the source with the included .c and .h files. However, all my efforts to do so result in hundreds of errors when compiling GLee.c (something to do with the order of #include <windows.h>; shouldn’t that be taken care of? Either way, I wasn’t able to solve this problem on my own. I can duplicate the errors if you want).

I don’t plan on using any extensions, and as long as I have OpenGL 2.0 I should be able to just use GLSL, right? How? Ideally, I would like to see the source for the bare minimum to get a shader online. Don’t even need the shaders themselves.

Details: OpenGL 2.1 installed. Visual Studio 2005 is my typical development environment.

  1. Don’t even need the shaders themselves.
    How do you want to use shaders if you don’t need them???

  2. I would like to see the source for the bare minimum …
    The source of a vertex/fragment shader that does the minimum possible???

All you need is to obtain the function pointers first so you can create, bind and link shader programs in GLSL. When you load a shader from a file you can bind it so that the old fixed-function pipeline is replaced by your vertex/fragment shaders. It is not that difficult if you understand the concept.

simple sample:
http://www.codesampler.com/oglsrc/oglsrc_10.htm#ogl_glslang_simple_vs2ps
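
Once the entry points are loaded, the whole core-GL sequence looks like this. A minimal sketch, error checking omitted; vsSource and fsSource are placeholders for your own shader text:

extern const GLchar *vsSource, *fsSource;   // your shader source strings

GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vsSource, NULL);
glCompileShader(vs);

GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(fs, 1, &fsSource, NULL);
glCompileShader(fs);

GLuint program = glCreateProgram();
glAttachShader(program, vs);
glAttachShader(program, fs);
glLinkProgram(program);

glUseProgram(program);   // your shaders now replace the fixed-function pipeline
// ... draw ...
glUseProgram(0);         // back to the fixed-function pipeline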

I found the Lighthouse tutorials very helpful…

http://www.lighthouse3d.com/opengl/glsl/index.php?intro

Also The Orange Book, and its associated web site…

http://www.3dshaders.com/

If someone is getting lots of errors when using an external library like GLEW, it’s either because he hasn’t set some preprocessor macro or because he is including the library headers incorrectly (a mismatch in calling convention).

I’m using the __cdecl calling convention and I have GLEW_STATIC added to my preprocessor definitions. Both are set in the project properties under C/C++ (“Preprocessor” and “Advanced”).

http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=247660#Post247660

If you don’t want to use GLEW you have to obtain the function pointers yourself using wglGetProcAddress().*
But GLEW saves you from this boring task.

If you want to use GLEW you don’t need to install it on the target machine; just copy glew32.dll* into the program folder. If you can copy your program, you can copy the DLL. :slight_smile:

*I assume you’re working on a Windows machine.

@_NK47

  1. Because I already know how to make shaders for the most part, just not how to integrate them without extensions.
  2. I would like to see the source of everything needed to import GLSL shaders without extensions. I can handle making the shaders myself.

I know this isn’t supposed to be that hard, but with absolutely no examples that don’t use extensions, I have nothing to work from.

@scratt
All of these examples use extensions. :frowning:

@Ilian Dinev
I’m trying to avoid extensions completely. That includes making them myself. GLSL is supposed to be part of OpenGL 2.0 core. I shouldn’t need extensions.

@Rosaria Leonardi
Do you know of an example I could look at explaining what you are talking about with wglGetProcAddress?

@Everyone
I’ve been looking at examples all over the internet and in books for the past three days or so. I’ve found none that don’t use extensions.

Actually the tutorial from Lighthouse 3D shows how to use GLSL via core and ARB_shader_objects.

The tutorial covers both the ARB extensions and OpenGL 2.0 versions.

Color coding is used to help the reader distinguish between them: the ARB stuff is presented in grey, and the OpenGL 2.0 stuff in orange.

Maybe the unclear part is this: on Windows the driver exposes only the OpenGL 1.1 functions, and every other function must be obtained from the driver using wglGetProcAddress(). (See the MSDN page on wglGetProcAddress for further instructions.)

For example, to use glCreateShader you have to define a typedef for the function pointer; then you can declare a function pointer and get the entry point with wglGetProcAddress:

// Typedef matching glCreateShader’s signature: GLuint return, one GLenum argument
typedef GLuint (APIENTRY * PFNGLCREATESHADERPROC)(GLenum);

// Declare the function pointer
PFNGLCREATESHADERPROC glCreateShader;

// Fetch the entry point from the driver
glCreateShader = (PFNGLCREATESHADERPROC)wglGetProcAddress("glCreateShader");

// now you can call glCreateShader :)

You have to do that for every function you want to use… did I already tell you that this is very boring? :stuck_out_tongue:
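
If the repetition bothers you, a small macro can do the stringizing for you. Just a sketch of the idea (LOAD_GL_FUNC and loadEntryPoints are made-up names; the PFN… typedefs come from <GL/glext.h>):

// Fetch one entry point: the macro stringizes the name for wglGetProcAddress
#define LOAD_GL_FUNC(type, name) \
    name = (type)wglGetProcAddress(#name)

PFNGLCREATESHADERPROC  glCreateShader;
PFNGLCREATEPROGRAMPROC glCreateProgram;

void loadEntryPoints()
{
    LOAD_GL_FUNC(PFNGLCREATESHADERPROC,  glCreateShader);
    LOAD_GL_FUNC(PFNGLCREATEPROGRAMPROC, glCreateProgram);
    // ...and so on, once per GL 2.0 function you call
}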

Use GLEW and include the library in your project… it’s a lot better.

You can’t use shaders without extensions. All the samples given here use shaders in the normal way (with extensions, obviously). It’s either shaders with extensions or fixed function without.

Shaders: custom-written programs that run on the GPU.

No, you don’t need any extensions; just check for GL_VERSION_2_0. All you’ll have to do is query for the new function pointers, like Rosario wrote.
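
One caveat: GL_VERSION_2_0 in the headers only tells you what you compiled against. At run time you parse what the driver reports; a sketch, assuming a context is already current (haveOpenGL20 is a made-up name):

#include <stdio.h>

bool haveOpenGL20()
{
    // The driver reports its version as a string such as "2.1.0 ..."
    const char *version = (const char *)glGetString(GL_VERSION);
    int major = 0, minor = 0;
    sscanf(version, "%d.%d", &major, &minor);
    return major >= 2;
}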

If you want to keep your code cross-platform (wgl* functions are Windows-specific), use this little snippet of code:

#ifdef __linux__
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glx.h>
#include <GL/glext.h>
#define glGetProcAddress(n) glXGetProcAddressARB((GLubyte *) n)
#endif

#ifdef _WIN32
#include <windows.h>
#include <GL/gl.h>
#include <GL/glu.h>
#include <GL/glext.h>
#define glGetProcAddress(n) wglGetProcAddress(n)
#endif

#ifdef __APPLE__
#include <OpenGL/gl.h>
#include <OpenGL/glu.h>
#endif

And use glGetProcAddress instead.
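
For example, the same line then loads an entry point on both Windows and Linux (the PFNGLCREATESHADERPROC typedef comes from <GL/glext.h>):

// Same loading code on either platform, thanks to the macro above
PFNGLCREATESHADERPROC glCreateShader =
    (PFNGLCREATESHADERPROC)glGetProcAddress("glCreateShader");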

HTH.

Ah ha! That’s the thing I didn’t understand. With that in mind, I went with GLee.

I have one more question. I had to change a couple of things to make GLee work. There’s code in GLee.h that looks like this:

#ifdef WIN32
	#define WIN32_LEAN_AND_MEAN
	#include <windows.h>
	#include <GL/gl.h>
#elif defined(__APPLE__) || defined(__APPLE_CC__)
    #define GL_GLEXT_LEGACY
	#include <OpenGL/gl.h>
#else // GLX
	#define __glext_h_  /* prevent glext.h from being included  */
	#define __glxext_h_ /* prevent glxext.h from being included */
	#define GLX_GLXEXT_PROTOTYPES
	#include <GL/gl.h>
	#include <GL/glx.h>
#endif

However, even though this code looks like it should work on its own, WIN32 is apparently not defined on my Windows install. If I change the check to “_WIN32”, it works. Is WIN32 the wrong macro to test when compiling on Windows? Would the fact that I’m running XP on an Intel Mac change your answer?

The only other thing necessary was to make sure the main file with my code had the following headers:

#include <windows.h>
#include <stdio.h>
#include "GLee.h"
#include <GL/glut.h>

Thanks for bearing with me!

I’m always confused by those defines, too. I also remember seeing plain WIN32 used in some code I’ve read. Since the compiler itself only predefines _WIN32 (plain WIN32 usually comes from the project settings), I use this:

#if (defined(__WIN32__) || defined(_WIN32))
    #define WIN32
#endif

And then you can go on using WIN32 only.
