Problem with textures with different graphic cards



Limdor
04-13-2011, 03:02 AM
Hi,

I want to compute a colour for every face of a mesh. I save these colours into a texture and then fetch them in the GLSL shader.
When I try this, I get different behaviour on different graphics cards: on some the result is correct, on others it is wrong.
The texture coordinates are the same in both cases. Any idea?

OpenGL code creating the texture:

// Recreate the texture that stores one colour per face
glDeleteTextures(1, &mPolygonalSaliencyTexture);
glGenTextures(1, &mPolygonalSaliencyTexture);
glBindTexture(GL_TEXTURE_2D, mPolygonalSaliencyTexture);
// Clamp, and use nearest filtering so each face maps to exactly one texel
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// Upload the per-face colours from a float array
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, mPolygonalTexturesSize, mPolygonalTexturesSize, 0, GL_RGB, GL_FLOAT, &floatColors[0]);
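For reference, a minimal sketch of how the source array is presumably packed (mPolygonalTexturesSize is from the snippet above; faceColors and numFaces are hypothetical names for the per-face colour data):

// Row-major packing: face i lands in cell (i % size, i / size),
// which is the mapping the shader below reconstructs.
std::vector<float> floatColors(mPolygonalTexturesSize * mPolygonalTexturesSize * 3, 0.0f);
for (int i = 0; i < numFaces; ++i)
{
    floatColors[3 * i + 0] = faceColors[i].r; // hypothetical per-face colour source
    floatColors[3 * i + 1] = faceColors[i].g;
    floatColors[3 * i + 2] = faceColors[i].b;
}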

GLSL code to get the values:

#extension GL_EXT_gpu_shader4 : enable // needed for gl_PrimitiveID and integer % in GLSL 1.20

uniform sampler2D saliencyTexture;

...

// Map the face index to its cell in the size x size texture:
// x = index % size, y = index / size, then normalize to [0,1]
vec2 textCoord = vec2( ((gl_PrimitiveID + offset) % polygonalTexturesSize), ((gl_PrimitiveID + offset) / polygonalTexturesSize) );
textCoord /= float(polygonalTexturesSize);

returnColor = vec4(texture(saliencyTexture, textCoord).xyz, 1.0f);

YarUnderoaker
04-13-2011, 06:33 AM
Looks right.

A couple of comments:
GL_CLAMP is deprecated, and
in glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, ...) the unsized GL_RGB is a deprecated token for the texture internal format.
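A minimal sketch of the non-deprecated equivalents (GL_RGB32F is one possible sized internal format, chosen here as an assumption because the data is uploaded as GL_FLOAT):

// GL_CLAMP_TO_EDGE replaces the deprecated GL_CLAMP
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// A sized internal format states the storage explicitly
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, mPolygonalTexturesSize, mPolygonalTexturesSize, 0, GL_RGB, GL_FLOAT, &floatColors[0]);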

Alfonse Reinheart
04-13-2011, 02:56 PM
"deprecated token for texture internal format"

That's not true. It's still legal to use unsized internal formats. It's not a good idea of course, but it isn't removed.

Limdor
04-13-2011, 11:21 PM
I've changed the deprecated GL_CLAMP to GL_CLAMP_TO_EDGE, but it behaves the same. Any other ideas?

YarUnderoaker
04-14-2011, 12:00 AM
Can you show a screenshot of the wrong result?

Limdor
04-14-2011, 12:56 AM
Yes, of course. I've attached two images, one with the correct result and one with the wrong result.

BionicBytes
04-14-2011, 01:37 AM
Can you tell us what those graphics cards are and their OpenGL versions as reported by GL?
e.g. NVIDIA GeForce 8600M, OpenGL 3.3 compatibility profile.

p.s. My money's on one being an Intel card.

Limdor
04-14-2011, 01:56 AM
I've attached the files generated by glewinfo (GLEW 1.5.8) with the information about the graphics cards and the profiles.

BionicBytes
04-14-2011, 02:12 AM
Can you just paste the info directly rather than expecting everyone to follow these links? That way, all the information relating to this post is kept here and not held somewhere else on the internet, where it may get taken down and deleted in time.

We're not after that much information, only the graphics card and OpenGL version!

Also, more importantly, the links you refer to come back as HTTP 403 Forbidden.

Limdor
04-14-2011, 02:24 AM
Sorry, I don't know why they don't work; I used this site's File Manager application to upload the files.

Computer 1:

Intel Core i7 965
6.00 GB RAM
Windows 7 64-bit, Service Pack 1

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce GTX 465/PCI/SSE2 from NVIDIA Corporation
OpenGL version 4.1.0 is supported

The result is wrong

Computer 2:

Intel Core i5 M430
4.00 GB RAM
Windows 7 64-bit, Service Pack 1

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a ATI Mobility Radeon HD 5400 Series from ATI Technologies Inc.
OpenGL version 4.1.10362 Compatibility Profile Context is supported

The result is correct

Computer 3:

Intel Core 2 Duo T9300
4.00 GB RAM
Windows 7 64-bit, Service Pack 1

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce 8600M GT/PCI/SSE2 from NVIDIA Corporation
OpenGL version 3.3.0 is supported

The result is correct

BionicBytes
04-14-2011, 02:31 AM
So, two NVIDIA cards are showing different results.
Do they both have the latest drivers installed?
It's a shame the output you posted does not give the OpenGL version string actually used by the program; it only reports what is supported, as opposed to what is actually being used.

Limdor
04-14-2011, 02:35 AM
How can I output the OpenGL string used?

About the drivers of the two NVIDIA cards: the i7 machine uses driver 266.58 and the Core 2 Duo uses driver 260.99.
I will try installing driver 260.99 on the i7 to see if it works.

Alfonse Reinheart
04-14-2011, 02:42 AM
"How can I output the OpenGL string used?"

It's a string. A char*. You print it. Probably with "printf" or, if you're really into C++, with "cout".

Limdor
04-14-2011, 02:44 AM
Hahaha, sorry, I meant to ask how I get the string. xD

BionicBytes
04-14-2011, 02:44 AM
Getting the driver levels may help (or make matters worse!), but it may reveal the nature of the problem (a driver bug?).

I meant that you should write down and post here the OpenGL version as output by glGetString(GL_VERSION).

When my application runs, it always writes the following from the created context to a text file:
GL_VENDOR, GL_RENDERER, GL_VERSION, GL_SHADING_LANGUAGE_VERSION, and GL_EXTENSIONS
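A minimal sketch of that kind of logging, assuming GLEW for the GL headers (as used elsewhere in this thread) and an already-current context; printGLInfo is a hypothetical helper name:

#include <cstdio>
#include <GL/glew.h>

// Print the strings describing the context actually created.
// Call this after the context has been made current.
void printGLInfo()
{
    printf("GL_VENDOR: %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION: %s\n", (const char*)glGetString(GL_VERSION));
    printf("GL_SHADING_LANGUAGE_VERSION: %s\n", (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));
}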

V-man
04-14-2011, 05:49 AM
It is probably a driver bug. It is worth updating the video drivers and trying again if your drivers are old.

Limdor
04-15-2011, 04:18 AM
Hi guys, I've tried other machines and it seems that the driver is not the problem. Below is the information from GL_VENDOR, GL_RENDERER, GL_VERSION and GL_SHADING_LANGUAGE_VERSION.

Any ideas? Please help me!

Computer 1:

Intel Core i7 965
6.00 GB RAM
Windows 7 64-bit, Service Pack 1
Driver version: 266.58

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce GTX 465/PCI/SSE2 from NVIDIA Corporation
OpenGL version 4.1.0 is supported

GL_VENDOR: NVIDIA Corporation
GL_RENDERER: GeForce GTX 465/PCI/SSE2
GL_VERSION: 2.1.2
GL_SHADING_LANGUAGE_VERSION: 1.20 NVIDIA via Cg compiler

The result is wrong


Computer 2:

Intel Core i5 M430
4.00 GB RAM
Windows 7 64-bit, Service Pack 1
Driver version: 10.12

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a ATI Mobility Radeon HD 5400 Series from ATI Technologies Inc.
OpenGL version 4.1.10362 Compatibility Profile Context is supported

GL_VENDOR: ATI Technologies Inc.
GL_RENDERER: ATI Mobility Radeon HD 5400 Series
GL_VERSION: 4.1.10362 Compatibility Profile Context
GL_SHADING_LANGUAGE_VERSION: 4.10

The result is correct


Computer 3:

Intel Core 2 Duo T9300
4.00 GB RAM
Windows 7 64-bit, Service Pack 1
Driver version: 260.99

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce 8600M GT/PCI/SSE2 from NVIDIA Corporation
OpenGL version 3.3.0 is supported

GL_VENDOR: NVIDIA Corporation
GL_RENDERER: GeForce 8600M GT/PCI/SSE2
GL_VERSION: 2.1.2
GL_SHADING_LANGUAGE_VERSION: 1.20 NVIDIA via Cg compiler

The result is correct


Computer 4:

Intel Core 2 Quad Q6600
4.00 GB RAM
Windows 7 64-bit, Service Pack 1
Driver version: 260.99

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce GTX 285/PCI/SSE2 from NVIDIA Corporation
OpenGL version 3.3.0 is supported

GL_VENDOR: NVIDIA Corporation
GL_RENDERER: GeForce GTX 285/PCI/SSE2
GL_VERSION: 2.1.2
GL_SHADING_LANGUAGE_VERSION: 1.20 NVIDIA via Cg compiler

The result is wrong


Computer 5:

Intel Core 2 Duo P8400
4.00 GB RAM
Windows 7 64-bit, Service Pack 1
Driver version: 266.58

GLEW version 1.5.8
Reporting capabilities of pixelformat 1
Running on a GeForce 9600M GT/PCI/SSE2 from NVIDIA Corporation
OpenGL version 3.3.0 is supported

GL_VENDOR: NVIDIA Corporation
GL_RENDERER: GeForce 9600M GT/PCI/SSE2
GL_VERSION: 2.1.2
GL_SHADING_LANGUAGE_VERSION: 1.20 NVIDIA via Cg compiler

The result is wrong

Limdor
06-16-2011, 07:57 AM
Finally, I think I've solved the problem. I was sampling the value at the corner of the texel, not at its center, so with GL_NEAREST the graphics card had to choose a nearest texel among several at exactly the same distance, and different cards chose differently.
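A minimal sketch of that fix, using the same names as the shader above: adding 0.5 before normalizing moves the lookup to the texel center, so GL_NEAREST has a unique closest texel.

// Sample at the texel center instead of its corner:
// texel i has its center at (i + 0.5) / size.
vec2 textCoord = vec2( float((gl_PrimitiveID + offset) % polygonalTexturesSize) + 0.5,
                       float((gl_PrimitiveID + offset) / polygonalTexturesSize) + 0.5 );
textCoord /= float(polygonalTexturesSize);
returnColor = vec4(texture(saliencyTexture, textCoord).xyz, 1.0);

On GLSL 1.30 or later, texelFetch(saliencyTexture, ivec2(x, y), 0) would avoid the normalization, and the ambiguity, entirely.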