Problem with 1D textures on newer NVIDIA graphics processors

Hi,

We have been developing an application in VB6 (graphics library vbogl.tlb, OpenGL 1.2) for some years.
Among other things, we use a 1D texture.
Everything runs fine on all types of ATI graphics processors and on NVIDIA up to the GeForce4, but on the GeForce5 (FX) and above (i.e. since they became OpenGL 2.0 compatible…), our 1D_texture is not represented.

The initialization code for the texture is shown below. Maybe the problem is related to an incorrect parameter setup, or to an incompatibility between OpenGL 1.2 1D textures and OpenGL 2.0?

Call glBindTexture(GL_TEXTURE_1D, 1)
Call glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE)
Call glTexParameterf(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
Call glTexParameterf(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
Call glTexParameterf(GL_TEXTURE_1D, GL_TEXTURE_WRAP_S, GL_CLAMP)

Call glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, texres_1D, 0, GL_RGB, GL_UNSIGNED_BYTE, tex1D(1, 1))

Has anyone already experienced the same problem?

Thank you.

> our 1D_texture is not represented

What does that mean?

Do you get any GL Errors?
I wouldn’t use GL_CLAMP; you normally want GL_CLAMP_TO_EDGE (not that it would matter with nearest filtering).
Is texres_1D a power-of-two?
Do you have your pixelstore unpack alignment set correctly? Should be 1 for RGB.
You should use GL_RGB8 as internalFormat.

The polygon remains empty, as if no texture were applied, and without any GL errors.

I checked all other parameters you mentioned. They seem to be what they should be… (texres_1D is 256, and pixelstore unpack alignment is indeed set to 1)
GL_RGB8 doesn’t change anything

It should be noted that vbogl.tlb is a bit outdated, and has a few issues, including some missing/incorrect constants.

I have an updated and cleaner version that fixes these issues and includes all constants up to OpenGL 2.0 (including GL_CLAMP_TO_EDGE, which the old TLB does not declare).
Note that it is not fully OpenGL 2.0 compliant, because it lacks the extension entry points (VB6 does not support function pointers). You can still use those extensions, but you’ll have to route the calls through a C++ DLL. I actually had one in development, but I haven’t worked on it for years.

Anyway, here’s the updated Type Library:

http://home.planet.nl/~buijs512/_temp/opengltlb.zip

OpenGL 1.1 functionality sits under the “GL” module and all later extensions under the “GLEXT” module (see Object Browser).

Thanks a lot to remdul for the library. However, using it doesn’t change anything.

Actually, our application is “pure” OpenGL 1.2, so we expected it to run under OpenGL 2.0 as well. My main problem is thus to identify what changed in NVIDIA hardware or drivers that prevents the texture from loading correctly…
I already asked NVIDIA, but so far without an answer. So I was hoping that someone here might have run into a similar problem…

To get to your problem:

If I remember correctly, I once used 1D textures with this type library on an NVIDIA card, and it did work. However, I don’t remember which card or driver it was.

[edit]
I’m actually not so sure it worked. I may have worked around it with a 2D texture.
At any rate, a Visual Basic TLB is passive: it just passes the calls through to opengl32.dll and the driver. Your problem should therefore also occur in a C++ program. You may want to verify this.
[/edit]

  1. Make sure that if you still declare any GL constants manually, you read them back and check that they are correct. I believe hexadecimal literals are the usual culprit: in VB6, &H812F is parsed as a 16-bit Integer and wraps to -32465; you need the Long suffix (&H812F&) to get the intended 33071.

  2. How do you create the 1D texture? Are you sure the texture name and data are valid?

Keep in mind that VB arrays are zero-based by default, and that Dim specifies the upper bound rather than the element count. I.e. Dim data(4) As Byte ranges from 0 to 4, and thus actually has five elements. This is important if you use 2D arrays for images.

  3. I sometimes pass image data like this:

glTexImage2D GL_TEXTURE_2D, 0, 3, .width, .height, 0, GL_RGB, GL_UNSIGNED_BYTE, ByVal VarPtr(pixels(0))

Note the ByVal VarPtr part. I don’t exactly remember the reasoning, but it fixed some issue (presumably because it passes the raw address of the first pixel, instead of letting VB marshal the array argument).

My problem is solved.

I simply replaced glTexEnvf with glTexEnvi, and now everything works on all cards.

Great. :)

I hadn’t noticed that error in your code either. Makes sense that it did not work. :)
