I am trying to compile a very simple vertex shader, but I get an error even though I don’t see anything wrong in the shader itself. This is the error that I get:
It really amazes me just how few C++ programmers know something as simple as how to read a text file with iostreams. It’s really not that difficult, and the fact that you were reading it line-by-line should have been a tip-off that you were doing it wrong.
Really, have you ever seen a programming language where line-by-line was the way you had to read a text file?
In any case, your code is wrong because std::getline doesn’t actually put the newline character in the string. So you’re mashing every statement onto a single line, which never works: the first // comment or preprocessor directive swallows everything after it.
Thanks for the reply, but now I have a different problem. Sometimes the shader will compile, but other times it won’t compile and I’ll get seemingly random shader compilation errors.
Here is my new code that loads and compiles the vertex shader:
That’s my fault for linking you to buggy code; I should have looked at the first example more closely. I have since fixed it over on Stack Overflow.
Though the memory leak you wrote in when you copy-and-pasted the buggy code is your fault. I was kinda hoping that you’d use one of the other alternatives, rather than the one that directly allocates memory for no adequately explained reason.
There’s still something wrong with my code, but I can’t see what. I am compiling a fragment shader underneath the vertex shader. Sometimes the vertex shader doesn’t compile, sometimes the fragment shader doesn’t compile, sometimes neither compiles, and sometimes they both compile. This is my code:
std::ifstream file ("VertexShader.glsl", std::ios::in|std::ios::binary|std::ios::ate);
if (file.is_open())
{
    long size;
    file.seekg(0, std::ios::end);
    size = file.tellg();
    char *contents = new char [size];
    const char *contentsConst = new char [size];
    file.seekg (0, std::ios::beg);
    file.read (contents, size);
    file.close();
    contentsConst = contents;

    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &contentsConst, NULL);
    glCompileShader(vertexShader);

    GLint status;
    glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &status);
    if (status == GL_TRUE) {
        std::cout << "The vertex shader has been compiled" << std::endl;
    } else {
        std::cout << "Something went wrong" << std::endl;
    }

    char buffer[512];
    glGetShaderInfoLog(vertexShader, 512, NULL, buffer);
    delete [] contents;
}
std::ifstream fileF ("FragmentShader.glsl", std::ios::in|std::ios::binary|std::ios::ate);
if (fileF.is_open())
{
    long size;
    fileF.seekg(0, std::ios::end);
    size = fileF.tellg();
    char *contents = new char [size];
    const char *contentsConst = new char [size];
    fileF.seekg (0, std::ios::beg);
    fileF.read (contents, size);
    fileF.close();
    contentsConst = contents;

    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &contentsConst, NULL);
    glCompileShader(fragmentShader);

    GLint status;
    glGetShaderiv(fragmentShader, GL_COMPILE_STATUS, &status);
    if (status == GL_TRUE) {
        std::cout << "The fragment shader has been compiled" << std::endl;
    } else {
        std::cout << "Something went wrong" << std::endl;
    }

    char buffer[512];
    glGetShaderInfoLog(fragmentShader, 512, NULL, buffer);
    delete [] contents;
}
I don’t see any problems with it. I apologize if my code looks weird, as my “main” language is Objective-C.
Have you attempted to, you know, debug the code at all? Did you look at the string you got to see if it was actually your file? And if Objective-C is your “main language”, why are you using iostreams for file loading at all?
While probably less efficient than the linked version, using a stringbuf is simpler and more robust (e.g. it works with streams which don’t support seeking, such as pipes and sockets):
You’re passing NULL for the last parameter (length) of glShaderSource(), which means that the strings must be null terminated, but I don’t see a null terminator being added anywhere.
As you know the length of the string, I’d suggest copying it to a GLint and passing a pointer to that as the last argument.
I’ve checked to see if my C string is my file… and it is. What I meant by Objective-C being my “main” language is that I’ve been programming with Objective-C for years, while I’ve only been using C++ recently. I’m using iostreams to read the file because Objective-C doesn’t play nice with glfw (my window management system). I’m not currently concerned with performance, so I’ve left the leak alone for now.
Passing in the size doesn’t seem to fix the problem. I did this to get the size of the C String for the final parameter:
GLint sizeOfContents = sizeof(contentsConst);
I passed this in as a pointer for the final parameter. (&sizeOfContents)
So I’m still not sure why this doesn’t work. What’s odd to me is that sometimes the shaders will compile and other times the shaders won’t compile.
I’ll try GClements’ method of loading the shaders and see if that makes a difference.
[QUOTE=Blakeasd;1253122]Passing in the size doesn’t seem to fix the problem. I did this to get the size of the C String for the final parameter:
GLint sizeOfContents = sizeof(contentsConst);
[/QUOTE]
That’s wrong. sizeof(contentsConst) will be the size of a pointer (typically 4 or 8), not the size of the array which it points to. The size of the string can be obtained from the variable “size” which you used to allocate the array.
Alternatively, you could null-terminate the string by allocating size+1 bytes and setting contents[size]='\0'.
If you do neither, glShaderSource() won’t know where the string ends, and it will try to parse whatever follows it in memory.
It worked! Thanks so much for the help GClements and Alfonse! I obtained the length of the C string (not the pointer this time) like this:
GLint arraySize = (GLint)size;
and passed a pointer to it as the final parameter, like so: &arraySize