It could be the pixel alignment. I think by default the driver expects each row of texel data to be aligned on a word (4-byte) boundary. Your data is RGB, so you'd need to specify byte alignment with something like:
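```c
/* The default GL_UNPACK_ALIGNMENT is 4, so 3-byte RGB texels can
 * leave the driver reading past the end of each row. Tell GL the
 * client-side data is tightly packed (byte-aligned): */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
```

called before the glTexImage1D upload.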
How are you assigning texture coordinates? Are you clamping them to the texture edge? The man page for glTexImage1D says the OpenGL specification only guarantees support for textures at least 64 texels wide. You might try replicating each color 16 times.
void glTexImage1D(
    GLenum target,
    GLint level,
    GLint internalFormat,
    GLsizei width,
    GLint border,
    GLenum format,
    GLenum type,
    const GLvoid * data
);
width
    Specifies the width of the texture image, including the border if any. If the GL version does not support non-power-of-two sizes, this value must be 2^n + 2(border) for some integer n. All implementations support texture images that are at least 64 texels wide. The height of the 1D texture image is 1.
Thanks, I just read that in the man page, but it's talking about texels, not pixels.
I don't think this means my 1D texture (line) has to be 64 pixels wide, but I'm going to try that anyway, as nothing else seems to be working.
I might end up having to use a 2D image texture after all, which I was hoping to avoid since it complicates everything.
Wild guess, but I always suspect this when I see black output from a texture (only partially black output suggests it's not the cause, but anyway…).
Could you try setting this for the texture:
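(Guessing at what was meant here: the default GL_TEXTURE_MIN_FILTER is GL_NEAREST_MIPMAP_LINEAR, so a texture with no mipmap levels uploaded is "incomplete" and samples as black.)

```c
/* Switch to non-mipmap filters so the texture is complete
 * even when only level 0 has been uploaded: */
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```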
It’s possible there’s some bug in the FLTK GL windowing code.
Maybe the video card on this laptop is the problem.
So when I get the chance I'll compile it on my other computer with the better video card. (This will be a while, since I'm away from home and can only use the laptop for now.)
I’ve started to create a GTK+ version of the program to see if this changes anything.
When I get time I’ll try a simple test with GLX only.