blitting cube textures?



Zarniwoop
07-18-2010, 12:09 AM
If I have a non-compressed cubemap texture, and I want to make it a compressed cubemap texture, I suppose I have to blit it to the compressed internal format, right?

How to do that is very unclear to me from reading the extension specs: EXT_framebuffer_blit contains no reference to "cube", and EXT_framebuffer_object contains no reference to "blit".

So how do I blit a cube texture to another cube texture? Is it legal to use glFramebufferTexture2DEXT(GL_READ_FRAMEBUFFER_EXT, ...,GL_TEXTURE_CUBE_MAP_POSITIVE_X,...) to select the source face to blit from, similar to how I normally use it with GL_FRAMEBUFFER_EXT to select the draw face? Or can I blit the whole cube at once by selecting GL_TEXTURE_2D instead of a cube face? Or how?

thanks in advance.

kyle_
07-18-2010, 01:21 AM
If I have a non-compressed cubemap texture, and I want to make it a compressed cubemap texture, I suppose I have to blit it to the compressed internal format, right?

Nope. Compressed formats are not renderable, so Blit won't do much here. If you want to compress the texture, you need to call glTexImage2D with a compressed internal format; then you can glGetCompressedTexImage to get the compressed content back.
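A sketch of what that looks like in C, assuming the source pixels are already in client memory (`pixels`, `w`, `h` are placeholders) and the driver exposes EXT_texture_compression_s3tc:

```c
GLint isCompressed = 0, size = 0;

/* Upload with a compressed internal format; the driver compresses on upload. */
glBindTexture(GL_TEXTURE_CUBE_MAP, tex);
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
             GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
             w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);

/* Optionally read the compressed blocks back (e.g. for caching on disk). */
glGetTexLevelParameteriv(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
                         GL_TEXTURE_COMPRESSED, &isCompressed);
glGetTexLevelParameteriv(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
                         GL_TEXTURE_COMPRESSED_IMAGE_SIZE, &size);
void *blocks = malloc(size);
glGetCompressedTexImage(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, blocks);
```

Repeat per face and per mip level as needed.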




How to do that is very unclear to me based on a reading of the extension specs. EXT_framebuffer_blit contains not one reference to "cube", and EXT_framebuffer_object contains not one reference to "blit".
You want an FBO with attachments that are cube faces.
Blit doesn't care much about the actual texture; it just needs a correctly set-up FBO.
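In other words, you attach one face at a time and blit face by face; there is no one-call whole-cube blit. A sketch, assuming `fboRead`/`fboDraw` and the two cube textures already exist, and that the destination format is renderable (so not compressed, per the point above):

```c
/* Attach the source face to the read FBO. */
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, fboRead);
glFramebufferTexture2DEXT(GL_READ_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_CUBE_MAP_POSITIVE_X, srcTex, 0);

/* Attach the matching destination face to the draw FBO. */
glBindFramebufferEXT(GL_DRAW_FRAMEBUFFER_EXT, fboDraw);
glFramebufferTexture2DEXT(GL_DRAW_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_CUBE_MAP_POSITIVE_X, dstTex, 0);

/* Blit the face; repeat for the other five faces. */
glBlitFramebufferEXT(0, 0, size, size, 0, 0, size, size,
                     GL_COLOR_BUFFER_BIT, GL_NEAREST);
```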

Zarniwoop
07-18-2010, 01:41 AM
Thanks Kyle!


Nope. Compressed formats are not renderable, thus Blit wont do much here. If you want to compress the texture, you need to glTexImage2D with compressed internal format...
Ouch! The problem is that my data never existed outside the gfx card, because I generated it by rendering to a texture. To use glTexImage2D I'd have to read it back across the PCIe bus, then re-submit it across the PCIe bus again as the compressed internal format. That's gonna be really slow.

Or do I miss some better approach? Can I easily :) compress a texture that's only living on the card to another texture that's only living on the card?

kyle_
07-18-2010, 03:08 AM
Read back to a PBO, then glTexImage2D from that PBO.
That way you stay on the card the whole time.
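A sketch of that PBO route, assuming `pbo` is big enough for one RGB face of `size` x `size` pixels and the source face is attached to `fbo`:

```c
/* Read the rendered face into the PBO (last arg is an offset, not a pointer). */
glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, pbo);
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, fbo);
glReadPixels(0, 0, size, size, GL_RGB, GL_UNSIGNED_BYTE, 0);
glBindBufferARB(GL_PIXEL_PACK_BUFFER_ARB, 0);

/* Re-specify the texture from the PBO with a compressed internal format. */
glBindBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB, pbo);
glBindTexture(GL_TEXTURE_CUBE_MAP, dstTex);
glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
             GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
             size, size, 0, GL_RGB, GL_UNSIGNED_BYTE, 0);
glBindBufferARB(GL_PIXEL_UNPACK_BUFFER_ARB, 0);
```

One caveat: many drivers do the actual DXT compression on the CPU, so this avoids an explicit round trip in your code but not necessarily an internal one in the driver.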

Zarniwoop
07-18-2010, 07:02 PM
Thank you again Kyle. I wasn't able to get glTexImage2D to work because it wants a host memory pointer, but I looked a little further and found this: "glCopyTexImage2D"

Which reads from the GL_READ_FRAMEBUFFER and copies into the texture you have bound. I got that to work - or at least it isn't throwing any glErrors and everything looks OK, so I assume it's using compression now - I'll verify ASAP. Oddly, glCopyTex*SUB*Image2D on the full buffer didn't work, but the non-sub-image version did, even though it should have been copying the same data, so alignment shouldn't be the issue.

A few other questions:

(1) If glCopyTexImage2D can do it, why can't glBlitFrameBuffer do it? In both cases my data isn't leaving the card.

(2) If I have a texture that just needs one component, so I'm storing it in a luminance texture, is it better to use the luminance texture, or a compressed RGB texture?

thanks so much for your help. You gave me enough info to get it working :cool:.

*Edit* I spoke too soon - there are no more GL errors, but the textures are quite corrupted after doing the glCopyTexImage2D. Still looking...

*Edit2* Got it! I was binding the destination texture of glCopyTexImage2D to a framebuffer, which I apparently shouldn't have done. I removed that, and now it seems to work for real.
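For anyone landing here later, the working sequence reads roughly like this in C (a sketch: `fbo`, `dstTex` and `size` are placeholders, the source face is attached to `fbo` as color attachment 0, and the destination texture must NOT also be attached to the bound framebuffer, per *Edit2*):

```c
/* Select the source: the face rendered into the read FBO. */
glBindFramebufferEXT(GL_READ_FRAMEBUFFER_EXT, fbo);
glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);

/* Copy into the bound cube texture with a compressed internal format. */
glBindTexture(GL_TEXTURE_CUBE_MAP, dstTex);
glCopyTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
                 GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 0, 0, size, size, 0);

/* Confirm the driver actually stored it compressed. */
GLint isCompressed = 0;
glGetTexLevelParameteriv(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0,
                         GL_TEXTURE_COMPRESSED, &isCompressed);
```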