
View Full Version : Texture borders on GL_RGBA32F textures (ATI bug most likely)



Pentagram
12-28-2005, 11:57 PM
Hello,

I got a few textures created with the following setup:


glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, width, height, 1, GL_RGBA, GL_UNSIGNED_BYTE, NULL );

(width and height are 64 for my current tests)

This crashes with a null-pointer dereference somewhere in the drivers. Removing the texture border (i.e. changing the border argument from 1 to 0) stops the crash. (But obviously no borders then :D )
I'm rendering to these textures using FBOs, but it isn't the rendering to them that causes the crash; it's binding them to a texture unit and reading from them.
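For reference, a minimal sketch of the two setups, assuming the crash really is tied to the border argument (GL_RGBA32F_ARB is from ARB_texture_float; texture object creation and FBO setup are elided):

/* Crashes in the driver when the texture is later sampled: */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64,
             1 /* border */, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

/* Workaround: no border, clamp to edge instead */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64,
             0 /* no border */, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

(Clamp-to-edge doesn't reproduce border-texel semantics exactly, but it is the usual substitute when borders are unsupported or broken.)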

Has anyone had similar problems, and knows some kind of workaround for it?

Charles

Mars_999
12-29-2005, 02:06 AM
Ah, you have an fp32 texture type but you pass GL_UNSIGNED_BYTE for the type parameter? That should be GL_FLOAT.

sqrt[-1]
12-29-2005, 03:34 AM
Originally posted by Mars_999:
Ah, you have an fp32 texture type but you pass GL_UNSIGNED_BYTE for the type parameter? That should be GL_FLOAT.

Uh, you are aware that that parameter only describes the supplied pixel data, right? So you can use whatever format you like (and considering he is passing NULL for the pixel data, it does not matter at all, as there is no data to convert).

Mars_999
12-29-2005, 02:00 PM
Originally posted by sqrt[-1]:
Uh, you are aware that that parameter only describes the supplied pixel data, right? So you can use whatever format you like (and considering he is passing NULL for the pixel data, it does not matter at all, as there is no data to convert).

Ok, note to self. I still pair the types up, just in case down the road I change the code to upload data instead...
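To illustrate sqrt[-1]'s point: the format/type pair only describes the client-side array, so it matters once data is actually uploaded. A sketch, assuming float RGBA data (pixels is a hypothetical name):

float pixels[64 * 64 * 4]; /* client-side RGBA float data */
/* ... fill pixels ... */

/* internal format stays GL_RGBA32F_ARB; format/type now describe pixels */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB, 64, 64, 0,
             GL_RGBA, GL_FLOAT, pixels);

With a NULL data pointer, as in the original post, the driver allocates storage without converting anything, so GL_UNSIGNED_BYTE there is harmless.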