You don’t “apply” fragment shaders to things; you render with a fragment shader. So the rendering operations that generate depth values are where your fragment shader needs to be active.
My fragment shader is there to dump depth values to the color channels; it doesn’t generate depth.
I didn’t say that your fragment shader would generate the depth values. I said that your “rendering operations” would.
gl_FragCoord.z is generated by the rasterizer when you render an object.
If all you have is a depth buffer with depth values and you want to convert that into a color buffer (with the linearized values written to the color channels), the fragment shader you posted won’t do that, since it never fetches the depth values from the depth texture that was previously rendered into.
But something is strange… as you can see, my Image2D uses GL_UNSIGNED_BYTE.
When I use glReadPixels, the RGB values are floats. Is that due to the conversion, or is it because I’m not using my FBO at all?
Maybe my question is not clear; if you want me to clarify any point of my problem, I’d be happy to.
[QUOTE=tkostas;1279546]But something is strange… as you can see, my Image2D uses GL_UNSIGNED_BYTE.
When I use glReadPixels, the RGB values are floats. Is that due to the conversion, or is it because I’m not using my FBO at all?[/QUOTE]
The internal (GPU-side) and external (CPU-side) formats don’t have to be the same; the implementation will convert them as required.
glReadPixels() will return the data in whatever format you request. If the type parameter is GL_UNSIGNED_BYTE, you’ll get unsigned bytes, if the type parameter is GL_FLOAT, you’ll get floats. The texture’s internal format doesn’t matter (beyond the fact that you’ll lose information if you request the data in a format that has either less precision or a smaller range than the internal format).
[QUOTE=GClements;1279548]The internal (GPU-side) and external (CPU-side) formats don’t have to be the same; the implementation will convert them as required.
glReadPixels() will return the data in whatever format you request. If the type parameter is GL_UNSIGNED_BYTE, you’ll get unsigned bytes, if the type parameter is GL_FLOAT, you’ll get floats. The texture’s internal format doesn’t matter (beyond the fact that you’ll lose information if you request the data in a format that has either less precision or a smaller range than the internal format).[/QUOTE]
You were absolutely right. I finally managed to transfer my texture from OpenGL to CUDA, and the values seem to be the right ones.