PDA

View Full Version : I want better cubemap edges



mogumbo
04-04-2005, 02:43 PM
It is a known issue that each side of a cubemap is treated as a separate texture when filtering and mipmapping, causing a hard edge to appear in reflections and other effects. Also, there are a lot of nice lighting tricks that could be achieved with shaders using mipmap-biased cubemap lookups, but these hard edges become even more apparent when artificially raising the mipmap level.

Has anyone out there given any thought to (or done any work on)
a) graphics hardware that interpolates between sides on cubemap lookups, or
b) a cubemap mipmap extension that averages edge pixels together at each mipmap level in order to hide the edges (this would cause duplicate pixels at the edges, but that would be a lot more difficult to notice than a hard edge), or
c) any other fix that I haven't thought of?
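[A minimal sketch of what option (b) amounts to for a single edge, assuming RGBA8 faces stored as flat arrays; the function name and the simplified adjacency (faceA's right column meeting faceB's left column, no flips) are illustrative, not from any real API. A full implementation would handle all 12 cube edges with their proper orientations, at every mipmap level:]

```c
#include <stdint.h>

/* Option (b) sketch: make the shared edge of two cube faces identical by
 * averaging the corresponding edge texels and writing the result back to
 * both faces. This duplicates texels along the edge, but hides the hard
 * seam. 'size' is the face width in texels; faces are RGBA8. */
static void average_shared_edge(uint8_t *faceA, uint8_t *faceB, int size)
{
    for (int y = 0; y < size; ++y) {
        for (int c = 0; c < 4; ++c) {                              /* RGBA channels */
            uint8_t *a = &faceA[(y * size + (size - 1)) * 4 + c];  /* A's right column */
            uint8_t *b = &faceB[(y * size + 0) * 4 + c];           /* B's left column  */
            uint8_t avg = (uint8_t)(((int)*a + (int)*b + 1) / 2);  /* rounded average  */
            *a = avg;                                              /* duplicate on     */
            *b = avg;                                              /* both faces       */
        }
    }
}
```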

idr
04-04-2005, 03:24 PM
Originally posted by mogumbo:
a) graphics hardware that interpolates between sides on cubemap lookups, or

I believe the newer 3dlabs boards do this.


Originally posted by mogumbo:
b) a cubemap mipmap extension that averages edge pixels together at each mipmap level in order to hide the edges (this would cause duplicate pixels at the edges, but that would be a lot more difficult to notice than a hard edge)

This is basically a texture border. Bet you never thought that would be useful, huh? I'm not sure if anyone has hardware-accelerated texture borders on cubemap faces, but I know that NVIDIA cards accelerate texture borders on "regular" 2D textures, so it might be worth a shot.

Stephen_H
04-04-2005, 05:02 PM
How are cubemap seams handled when doing omnidirectional shadow-buffer shadows?

Jeff Russell
04-04-2005, 06:04 PM
John Carmack mentioned this problem a few months ago, and also mentioned that future hardware should fix it. (http://www.armadilloaerospace.com/n.x/johnc/recent%20updates/archive?news_id=290)

If you need to do it right now, and border texels don't cut it, you might be able to do your own mipmap generation at load time, where you manually sample the other faces and feed those results into the border texels at the various mipmap LODs. Wouldn't really work for dynamic cube maps, though.
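[The load-time idea above can be sketched roughly as follows for one edge, assuming flat RGBA8 arrays and the simplest adjacency case (this face's top border taken straight from the neighbor's bottom row, no flip); the function name and layout are hypothetical. A full implementation fills all four borders of all six faces, with the correct flips/rotations per edge, for every mip level:]

```c
#include <stdint.h>
#include <string.h>

/* After building a mip level yourself, copy the edge texels of the
 * adjacent face into this face's 1-texel border, so filtering across the
 * edge reads matching data. 'size' is the face width at this mip level;
 * the bordered image is (size + 2) texels wide, RGBA8. */
static void fill_top_border(uint8_t *bordered_face, const uint8_t *neighbor, int size)
{
    int bw = size + 2;                                                /* bordered width */
    for (int x = 0; x < size; ++x) {
        const uint8_t *src = &neighbor[((size - 1) * size + x) * 4];  /* neighbor's bottom row */
        uint8_t *dst = &bordered_face[(0 * bw + (x + 1)) * 4];        /* this face's top border */
        memcpy(dst, src, 4);                                          /* one RGBA texel */
    }
}
```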

P.S. - Ames, IA? Are you somebody I know?

jwatte
04-04-2005, 07:12 PM
I believe the latest hardware supports cube map edge texels, and next generation stuff will likely be able to filter between the actual maps. Thus, if you upload border texels for your cube map faces (on all MIP levels) most of the problem is likely to go away.

spasi
04-05-2005, 12:24 AM
A useful tool for (static) cube maps:

ATI CubeMapGen (http://www.ati.com/developer/cubemapgen/index.html)

mogumbo
04-06-2005, 08:26 AM
Thanks for the tips everyone. And yes, Jeff, I'm Terry. We met at the coffee shop a couple weeks ago.

simongreen
04-06-2005, 10:10 AM
NVIDIA supports borders on cubemaps. This solves the problem you're talking about, but does use quite a lot of additional memory because of the way cubemaps are arranged in memory.

The code to fill in the borders from the adjacent faces is a bit tricky, but Cass has already written it for you:
http://cvs1.nvidia.com/DEMOS/OpenGL/src/shared/cubemap_borders.cpp

mmp
04-18-2005, 04:26 PM
Originally posted by simongreen:
NVIDIA supports borders on cubemaps. This solves the problem you're talking about, but does use quite a lot of additional memory because of the way cubemaps are arranged in memory.

The code to fill in the borders from the adjacent faces is a bit tricky, but Cass has already written it for you:
http://cvs1.nvidia.com/DEMOS/OpenGL/src/shared/cubemap_borders.cpp

There's a subtle bug in that code: note that there are two places that set the texels on the bottom of face 5, but none that set the top (i.e. bimg[5][B_BOT_ROW(i)+c] is assigned to twice).

One of those should presumably be B_TOP_ROW instead. (Not sure which; trying to figure out cube map addressing always makes my head hurt.)

-matt

arekkusu
04-19-2005, 07:06 AM
Originally posted by idr:
I know that NVIDIA cards accelerate texture borders on "regular" 2D textures

On what specific hardware? Just curious, I thought this caused a software fallback.


Originally posted by simongreen:
NVIDIA supports borders on cubemaps.

Are they accelerated? On what specific hardware?

idr
04-20-2005, 08:01 AM
Cass or Mark would know for sure, but, AFAIK, texture borders are hardware accelerated at least as far back as the original GeForce. They may even be accelerated on the TNT family, but I'm not sure.