Faking an atmosphere on a 3D planet model in perspective projection

I’ve been working on this for weeks and have tried a lot of things, so hopefully someone here has a clever solution. I have a planet model (a sphere) and an atmosphere model (a hemisphere). I draw the planet, then the atmosphere at 1.1 times the scale of the planet and rotated to face the camera. The atmosphere model’s UVs are planar mapped and it uses a texture with alpha that looks similar to a radial gradient: http://eightvirtues.com/misc/atmosphere.png. I’ve also tried using a quad drawn at the center of the planet instead of a hemisphere model.

The problem is that as the camera gets closer to the planet, the perspective projection causes the edges of the atmosphere model to become occluded, effectively cutting off the edges of the atmosphere texture. I tried to work around this by progressively shrinking the atmosphere texture’s texture space based on the camera’s distance from the planet. This almost works, but the rate at which the texture scales doesn’t match the rate at which the edges of the atmosphere model are occluded. I’ve tried scaling linearly, applying square and cube root functions to the distance, and compressing the result between minimum and maximum scales, but no amount of tweaking gives wholly satisfactory results. When using a quad instead of a hemisphere there is no occlusion, so I suspect the problem there is that the quad, being flat rather than spherical, is distorted differently by the perspective projection than the planet model is.

Is there a proper way to do this, or is this the most ass-backward way possible to create this type of effect? I don’t know how to use shaders at all, but if it turns out there’s no other way to do it I’d very much be open to assistance in creating such a shader.

I draw the planet, then the atmosphere at 1.1 times the scale of the planet and rotated to face the camera.

Why? Why not just draw the planet and atmosphere at the same size? There’s also no need to draw a hemisphere that faces the camera; use the exact same sphere as the planet, using the exact same transformation.

This is just standard multipass. Indeed, there’s no reason to even use multipass; just use multitexture with the sphere, using a different set of texture coordinates for your atmosphere texture.

Hi Alfonse. The problem with doing it that way is that a planet’s atmosphere from space has a soft edge that is above the surface of the planet. Here’s a photo of the real thing, which is what I based my atmosphere texture on:

I do use your technique for rendering clouds and cloud shadows and it works extremely well, but the atmosphere extends beyond them.

I was just thinking about the physics behind why an atmosphere looks that way from space (essentially, how much air you’re looking through along the vector from your eye to each point on the atmosphere) and thought of a possible solution emulating that. What if I rotated the atmosphere’s matrix to face the camera, then drew a series of quads (say, 100) starting behind the planet and ending in front of it? Each quad would have a texture of an opaque circle on a transparent background and a low-alpha material setting. The scale of each quad would follow the diameter of the slice of the planet it intersects. Each quad would read Z depth (so the planet occludes it) but not write it (so the quads don’t occlude each other) and use additive alpha blending (Gl.BlendFunc(Gl.SRC_ALPHA, Gl.ONE)). With enough quads that should closely resemble reality, and the perspective distortion at close range may become irrelevant because the quad density approximates the volumetric nature of a real atmosphere. I just need to figure out the function that scales each quad to fit the planet’s diameter at a given position along its Z axis.

Do you think that would work, and any idea what the calculation would be to scale the quad? The scale would be zero at the back of the planet, 1 at the middle, and zero again at the front, but instead of interpolating linearly it would follow the curve of the sphere. Maybe that should be in another thread. :slight_smile:

Assuming that you want to do this without using shaders, my first attempt would be to re-purpose the OpenGL lighting calculations. Place a light source at the viewpoint, so the diffuse component is based upon the angle between the view direction and the normal. Use a diffuse colour with an alpha much larger than one, e.g. (0,0,0,10); clamping then forces the resulting alpha to 1.0 over most of the sphere, falling off to zero only at the very edges (points whose normals are tangential to the view). Then render the surface and a solid blue sphere with source factors of GL_DST_ALPHA and GL_ONE_MINUS_DST_ALPHA respectively.
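To see the numbers behind that clamp, here's a plain-math sketch (Python, not GL code; `edge_alpha` is just an illustrative name) of what the pipeline computes when the light sits at the eye and the diffuse alpha is 10:

```python
def edge_alpha(n_dot_v, diffuse_alpha=10.0):
    """Alpha after OpenGL's diffuse lighting and final clamp:
    clamp(diffuse_alpha * max(N.V, 0), 0, 1)."""
    return min(max(diffuse_alpha * max(n_dot_v, 0.0), 0.0), 1.0)

# Facing the viewer: fully opaque; grazing the silhouette: fades out.
print(edge_alpha(1.0))   # 1.0
print(edge_alpha(0.05))  # 0.5
print(edge_alpha(0.0))   # 0.0
```

Only surface points where N·V drops below 0.1 (the outer ~10% of the silhouette here) get a partial alpha, which is what produces the thin soft rim.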

If that isn’t good enough (e.g. because you don’t have enough control over the shape of the fall-off), consider doing something similar to what you’re doing at present, but with a separate texture for the blue tint at the edges. The texture would be a linear gradient, with most of the texture entirely transparent. You can change the fall-off either by modifying the texture coordinates dynamically, or using a texture matrix.

But ultimately, you’re likely to get much better results using shaders.

Hi GClements. That is an interesting technique. I’ll keep that in my tool box as it could be used for any number of useful things.

Last night I implemented my idea of using multiple quads with an opaque circle texture scaled to the width of the sphere at its point of intersection and I’ll be damned if it doesn’t look lovely. I’m currently using 200 layers/quads covering the front 60% of the planet (the rear 40% won’t be seen anyway since the layers all face the camera and are occluded by the planet). What’s really crazy is that it looks realistic when close to the planet, even from its surface with the camera “inside” the atmosphere. I was so excited after two weeks of pain that I leapt from my chair and clapped.

Of course, once you’re “finished” you quickly realize there’s “one more thing” to be done. In this case, the lighting. Since the atmosphere is composed of hundreds of quads facing the camera, it isn’t lit correctly; it should be shaded as though it were a sphere, just like the planet. The first thing I tried was setting the vertex normals of each quad using the vector from the center of the planet to each vertex’s position. At first glance the results looked like I’d succeeded, but four vertices do not make for accurate spherical shading. I’m about to try using a quad strip (10x10 quads, perhaps) instead of a single quad per atmosphere layer, setting the vertex normals as before. I’ll let everyone know how it goes, and once it’s working I’ll post the source code here. I’d really hate for someone else to have to suffer through getting such a common effect working properly.

Alright. It appears the best way to draw an atmosphere layer is to use a triangle fan to create an untextured circle and use the material settings exclusively for color and alpha (as opposed to a textured quad, or a triangle strip with stitching and a material). I’ve modified my code to draw a 200-layer “sandwich” of triangle fans through the front 60% of the planet model, setting the normal of each vertex of each triangle fan relative to the planet center. Note that in my coordinate system Z is up/down/elevation and -Y goes into the screen, which is a bit unusual but what I’m more comfortable with. I also moved the code into a display list for efficiency, so the atmosphere dimensions are 1x1x1 until scaled by the procedure drawing the planet.

The only remaining issue is that despite setting the normals of the triangle fans as if they were spherical, the shading only affects about a quarter of each fan, with a hard cutoff. Here’s a screenshot:

At this point I think I’m overlooking something relatively simple. Does anything about my logic or implementation jump out as boneheaded? Interestingly, this is the same problem I had when I was setting the vertex normals of quads (rather than triangle fans) for the atmosphere layers, so I don’t think the cause is anything new. Here’s the code for the procedure drawing the atmosphere layers:

Public Sub AtmosphereCreateLists()

  ' Create atmosphere display lists.

  ' General declarations.
  Dim ScalePosition As Single       ' Y coordinate of current atmosphere layer.
  Dim ScaleRadius As Single         ' Half scale of current atmosphere layer.
  Dim Radius As Single              ' Radius of atmosphere precalculated for efficiency.
  Dim Alpha As Single               ' Alpha value of each atmosphere layer.
  Dim Layers As Single              ' Number of atmosphere layers to render.
  Dim LayerStep As Single           ' Step value of atmosphere layer For...Next loop.
  Dim LayerCoverage As Single       ' Percentage of planet atmosphere layers will cover (0 - 1).
  Dim Outer As New Single[2]        ' Coordinates of outer vertex when drawing triangle fan.
  Dim Triangles As Single           ' Number of triangles triangle fan is composed of.
  Dim Angle As Single               ' Current angle of triangle fan vertex.
  Dim AngleStep As Single           ' Number of degrees Angle is incremented by for triangle fan.
  Dim Counter As Short

  ' Assign initial values to variables.
  Radius = 0.5
  Layers = 200
  LayerCoverage = 0.6
  LayerStep = LayerCoverage / Layers
  Alpha = 1 / Layers
  Triangles = 64
  AngleStep = 360 / Triangles

  ' Create display list.
  AtmosphereList[0] = Gl.GenLists(1)
  Gl.NewList(AtmosphereList[0], Gl.COMPILE)

  ' Set up render parameters.
  DepthReadOnly
  Gl.BlendFunc(Gl.SRC_ALPHA, Gl.ONE)
  Gl.BindTexture(Gl.TEXTURE_2D, 0)
  'MaterialFull([0, 0, 0, 0], [255 / 255, 185 / 255, 165 / 255, Alpha], [0, 0, 0, 0])
  MaterialFull([0, 0, 0, 0], [1, 1, 1, Alpha], [0, 0, 0, 0])

  ' Cycle through each atmosphere layer.
  For ScalePosition = 1 - LayerCoverage To 1 Step LayerStep
    ' Calculate current atmosphere layer's scale radius. This is the circle
    ' equation Sqr(Radius ^ 2 - (ScalePosition - Radius) ^ 2) expanded.
    ScaleRadius = Sqr((2 * ScalePosition * Radius) - (ScalePosition ^ 2))
    ' Check if scale is positive.
    If ScaleRadius > 0 Then
      ' Reset Angle.
      Angle = 0
      ' Draw center vertex.
      Gl.Begin(Gl.TRIANGLE_FAN)
      Gl.Normal3fv(NormalLineSegment([0, Radius, 0], [0, ScalePosition - Radius, 0]))
      Gl.Vertex3f(0, ScalePosition - Radius, 0)
      ' Draw outer vertices.
      For Counter = 0 To Triangles
        ' Calculate vertex position (X, Z).
        Outer = TranslateInDirection(0, 0, ScaleRadius, Angle)
        ' Draw vertex.
        Gl.Normal3fv(NormalLineSegment([0, Radius, 0], [Outer[0], ScalePosition - Radius, - Outer[1]]))
        Gl.Vertex3f(Outer[0], ScalePosition - Radius, Outer[1])
        ' Increment angle.
        Angle -= AngleStep
      Next
      Gl.End()
    Endif
  Next

  ' Restore previous settings.
  DepthReadWrite
  Gl.BlendFunc(Gl.SRC_ALPHA, Gl.ONE_MINUS_SRC_ALPHA)

  ' Finish display list.
  Gl.EndList()

End
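As a sanity check on what those Gl.Normal3fv calls are meant to produce: the spherical normal at any layer vertex is just the normalized direction from the planet's center to that vertex. Here's that calculation in isolation (a Python sketch with illustrative names, assuming the planet is centered at the origin of the atmosphere's model space):

```python
import math

def sphere_normal(center, vertex):
    """Unit normal at a point on (or near) a sphere: the normalized
    direction from the sphere's center to the vertex."""
    d = [v - c for v, c in zip(vertex, center)]
    mag = math.sqrt(sum(x * x for x in d))
    return [x / mag for x in d] if mag else d

# A vertex on a radius-5 sphere centered at the origin.
print(sphere_normal([0.0, 0.0, 0.0], [0.0, 3.0, 4.0]))  # [0.0, 0.6, 0.8]
```

If the normals the Gambas code feeds to OpenGL don't match this center-to-vertex direction for every vertex, the shading will break along whichever axis is off.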

Here’s the code for the procedure calculating the vertex normal from two points:

Public Function NormalLineSegment(p1 As Single[], p2 As Single[]) As Single[]

  ' Calculate and return normal of specified points.

  ' General declarations.
  Dim N As New Single[3]
  Dim Magnitude As Single

  ' Calculate direction vector from p1 to p2 (Z negated to match my coordinate system).
  N[0] = p2[0] - p1[0]
  N[1] = p2[1] - p1[1]
  N[2] = - (p2[2] - p1[2])

  ' Normalize normal.
  Magnitude = Sqr(N[0] ^ 2 + N[1] ^ 2 + N[2] ^ 2)
  If Magnitude <> 0 Then
    N[0] /= Magnitude
    N[1] /= Magnitude
    N[2] /= Magnitude
  Endif

  Return N

End

And lastly, here’s the code for calculating the X/Y coordinates of a point translated from an origin by a given distance at an arbitrary angle (this one plainly works, but I like contributing when I can, so enjoy if you need it):

Public Function TranslateInDirection(StartX As Single, StartY As Single, Distance As Single, Degrees As Single) As Single[]

  ' Translate specified coordinates by specified distance at specified angle in degrees.

  ' General declarations.
  Dim NewXY As New Single[2]

  NewXY[0] = StartX + Distance * Cos(Rad(Degrees))
  NewXY[1] = StartY + Distance * Sin(Rad(Degrees))

  ' Return new coordinates.
  Return NewXY

End
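The same polar-to-Cartesian step translated to Python, in case anyone wants to sanity-check the trig outside the engine (0 degrees = +X, counter-clockwise, matching the cos/sin convention above):

```python
import math

def translate_in_direction(start_x, start_y, distance, degrees):
    """Move (start_x, start_y) by distance along an angle in degrees."""
    radians = math.radians(degrees)
    return (start_x + distance * math.cos(radians),
            start_y + distance * math.sin(radians))

print(translate_in_direction(0.0, 0.0, 1.0, 90.0))  # ~(0.0, 1.0)
```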