Flat Planets

I often envy those who can accurately estimate the time their tasks will take, particularly the seemingly small ones. Experience has taught me that it’s these ‘simple’ tasks that are often the most deceptive. What appears to be a 15-minute fix can unexpectedly turn into a bizarre edge case, transforming a supposed one-line code change into a day-long hunt for answers on the Wayback Machine. And yet, even with all that experience, the task that led me down the deepest rabbit hole, surprisingly, was texturing a sphere.


The sky can serve as an excellent narrative device. Movie directors and game developers understand this well, often using it for exposition. What better way to convey the sense of being on a far-off planet than by presenting an alien sky, complete with unfamiliar-looking moons and planets hanging over the horizon?

I’m taking a similar approach in my project. The skybox I’m working on will feature an animated moon and a gas giant, adding some extra visual flair.

Both planets will spin, and in addition, the gas giant will feature moving atmospheric currents. Normally, these motions are too subtle to be seen, but I’ll accelerate them for greater visual impact. As for the artistic direction, I’m going for a semi-realistic style, inspired by the sci-fi art of the ’70s and ’80s.

Planet’s Surface

While the gas giant’s surface texture will be generated in real time using a pixel shader and a render-to-texture approach, the moon’s texture will be created in Substance Designer. This allows experimenting with various looks by simply changing the color palette and the material seed.

The animated gas giant, while visually captivating, presents its own set of challenges. Rendering the texture every frame will impact performance, and the rendering cost quadruples each time the texture resolution is doubled.

I want one solution that works consistently, regardless of whether the input is a static texture created in Substance Designer or dynamically rendered in Unreal as described in Flowfields.

This leaves two options for storing the planet surface – a square or a rectangular texture. Although a cubemap was considered initially, its potential performance impact – consuming six times the resources of its square-texture counterpart – makes it impractical. Furthermore, Substance Designer doesn’t support cubemap generation.

UV Sphere Problems

With texture choices narrowed down to square or rectangle, the next step is mapping them onto a sphere. The UV sphere is an obvious choice here, as it is easy to align its texture coordinates with a 2×1 rectangular texture.

Given that only one side of the sphere is visible at a time, there is no need for a full, unique rectangular texture – a tiling square one can be used instead.

Jagged Edges

Using a UV sphere might be the easiest option, but unfortunately, in this case, it is not the best one. Firstly, the outline is visibly angular – it doesn’t have to be a perfect circle, but it should at least be smooth.

When the triangle count is not an issue, this problem can be solved by using a sphere with more subdivisions. Alternatively, the edge can be masked using a pixel shader.

Texture Usage

Most of the sphere’s surface is viewed at an angle, leading to sampling from lower mipmaps. This is particularly problematic for dynamic textures, where each pixel is rendered per frame. Most of the generated image gets wasted on areas where it is barely visible.

While increasing tiling can mitigate this issue, it introduces another problem.


To maintain a continuous pattern across the whole sphere, the tiling can only be increased in whole numbers. However, even doubling the initial tiling makes the repetition noticeable.

Fractional values, on the other hand, create a seam running from the north to the south pole.
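The whole-number constraint is easy to verify numerically: the sampled coordinate is the fractional part of u × tiling, and the two sides of the wrap (u = 0 and u = 1) only agree when the tiling is an integer. A quick Python sketch of the arithmetic (illustrative only; the names are mine):

```python
def sampled_coord(u, tiling):
    """Texture coordinate actually sampled after tiling and wrap-around."""
    return (u * tiling) % 1.0

# At the wrap, the horizontal coordinate jumps from u = 1 back to u = 0.
# Whole-number tiling: both sides sample the same texel column, so no seam.
seam_integer = abs(sampled_coord(1.0, 2.0) - sampled_coord(0.0, 2.0))
# Fractional tiling: the two sides disagree by half a texture - a visible seam.
seam_fractional = abs(sampled_coord(1.0, 2.5) - sampled_coord(0.0, 2.5))
```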

Polar Distortion

Another problem is texture pinching at the poles. Not only is the pinching visually jarring, but mapping onto triangles generates additional distortion, similar to the PSX’s ‘floaty’ texturing.

A common workaround in game development has been to conceal these pinched areas with a textured patch.


Finally, when considering the UV sphere for a planetary model, the inability to depict atmospheric halos becomes a limitation. Since the mesh only allows drawing the surface, an additional model is required for the halo, complicating the setup.

The UV sphere may seem like a feasible solution, but the laundry list of necessary modifications and creative hacks turns it into a Frankenstein’s monster. It is still a valid choice for modeling a planet, but for my application – the skybox – I was considering another option.

Flat Disk

Opting for an actual spherical mesh to represent the planet is primarily for the ease of texture mapping. In the case of a skybox, however, there’s no need for a complex 3D mesh – it will be observed from a distance and from one position. A basic disk will work just as well. The main issue, texture mapping, can be effectively managed through a pixel shader. This technique also has the added benefit of allowing the atmospheric halo to be drawn on the same mesh.

The disk mesh I’ll use is a straightforward filled circular polygon. The key feature to note is that its texture coordinates have their origin exactly at the center of the disk. If we assume that the planet has a radius of 1, then the UV coordinates have to extend beyond that to make some space for rendering the atmosphere.

Pixel Shader

There are several features of the 3D UV sphere mesh I want to recreate in the shader. Obviously, the shader version of a planet has to be visually indistinguishable from the mesh one. It also has to be free of all the mesh problems listed before. The shader creation process will be broken down into several steps.

  • Surface Equation: Without the triangle mesh, the surface has to be defined in analytical terms.
  • Texture Mapping: UV coordinates have to be defined as a function of position on a sphere. This includes scaling and TBN calculation.
  • Orientation: The shader must allow for the tilting of the planet’s axis.
  • Rotation: The planet should be able to spin around its axis.

Surface Equation

One important note on coordinates: I’ll be using a left-handed, Y-up system, as it is most consistent with DirectX. I will also use the DirectX normal map format. While Unreal Engine also uses a left-handed system, Z is the up-direction. It is a good practice to double- or triple-check orientations and the format of normal maps used.

The surface equation maps a 2D position on a disc plane to a corresponding 3D position on a spherical surface. In simpler terms – it is a function that draws a ball.

The first step is figuring out if the pixel lies on the surface of a sphere. To create a bounding circle mask, verify if the length of the UV vector is less than the sphere’s radius:

float CircleMask(float2 uv, float radius) {
    return length(uv) < radius ? 1.0 : 0.0;
}

It’s important to note that this approach will function correctly only when the UV coordinates are aligned with the center of the mesh.

The next step is to create the surface vector – the local position on the surface of the sphere. Luckily, both the x and y components of the UV coordinates are the same as the corresponding surface vector components. The only thing left to do is to reconstruct the z component.

float3 ReconstructSurface(float2 uv) {
    // Assumes uv lies inside the unit circle (enforced by CircleMask).
    float zSquared = 1.0 - dot(uv, uv);
    float z = sqrt(zSquared);
    return float3(uv, z);
}
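The reconstruction is just the Pythagorean theorem on the unit sphere: z = √(1 − x² − y²). A CPU-side check in Python (mirroring the HLSL above):

```python
import math

def reconstruct_surface(u, v):
    """Recover the z component of a point on the unit sphere from its xy (UV) part."""
    z_squared = 1.0 - (u * u + v * v)
    return (u, v, math.sqrt(z_squared))

# A 3-4-5 triangle scaled to the unit sphere: x = 0.6 implies z = 0.8.
point = reconstruct_surface(0.6, 0.0)
```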

Texture Mapping

Mapping the square texture onto the spherical surface can be broken down into three operations:

  1. Wrapping the texture around a cylinder
  2. Repeating the same for the y axis
  3. Deforming the result into a circle

The process of wrapping might seem counterintuitive. The x coordinate of the texture will be proportional to the angle at which the texture wraps around the cylinder. To find this angle, it is enough to calculate arcsine(x). Then the angle has to be remapped from [-π/2, π/2] to [0, 1].
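The remap can be checked numerically – arcsine covers [-π/2, π/2], so dividing by π and adding 0.5 lands exactly in [0, 1]. A small Python check (function name is mine):

```python
import math

def wrap_u(x):
    """Horizontal texture coordinate from the x component on the cylinder."""
    return math.asin(x) / math.pi + 0.5  # remap [-pi/2, pi/2] to [0, 1]

left, middle, right = wrap_u(-1.0), wrap_u(0.0), wrap_u(1.0)
```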

To form a circle, it is necessary to define the circular curve as a distance from the y axis:

The ‘local width’ of the sphere, or the generatrix, is then used to pinch the coordinates at the poles. This is achieved by dividing the x component of the surface position by it.

float2 GenerateSphericalUV(float3 position) {
    float width = sqrt(1.0 - position.y * position.y);
    float generatrixX = position.x / width * sign(position.z);
    float2 generatrix = float2(generatrixX, position.y);
    float2 uv = asin(generatrix) / 3.14159 + float2(0.5, 0.5);
    return uv;
}
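A CPU-side port of the same math makes it easy to sanity-check a few known points – for instance, the point facing the camera should land exactly in the middle of the texture. A Python sketch (valid off the poles, where width is non-zero):

```python
import math

def generate_spherical_uv(px, py, pz):
    """Python port of the HLSL mapping above, for sanity checks."""
    width = math.sqrt(1.0 - py * py)
    generatrix_x = px / width * math.copysign(1.0, pz)
    u = math.asin(generatrix_x) / math.pi + 0.5
    v = math.asin(py) / math.pi + 0.5
    return (u, v)

center = generate_spherical_uv(0.0, 0.0, 1.0)  # the point facing the camera
```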


Orientation

To make the planet look more natural, we need to adjust the orientation of its axis, specifically the pitch and roll. Planets in sci-fi art are usually tilted, as it creates a more dynamic composition and allows showing ice caps on the poles.

The yaw, which corresponds to the planet’s spinning motion, will be dealt with in a separate step to avoid the seam issue mentioned when discussing UV spheres.

In the conventional approach, a mesh sphere is typically reoriented or transformed using a matrix. In our case, where each pixel defines the surface position, we can use a simplified 3⨯3 matrix limited to rotation – eliminating the need for a full set of transformations.

Building such a matrix is pretty straightforward: identify the ‘right’, ‘up’, and ‘forward’ vectors of the rotated sphere and organize them as columns of the matrix.

The Material editor doesn’t support matrices as a data format, but Unreal offers a workaround:

When the rotations are limited to just pitch and roll the matrix looks like this:

In HLSL, a 3⨯3 matrix can be represented using float3x3:

float3x3 CreateRotationMatrix(float pitch, float roll) {
    float cosPitch = cos(pitch);
    float sinPitch = sin(pitch);
    float cosRoll = cos(roll);
    float sinRoll = sin(roll);

    return float3x3(
        cosRoll, -sinRoll * cosPitch, sinRoll * sinPitch,
        sinRoll, cosRoll * cosPitch, -cosRoll * sinPitch,
        0.0, sinPitch, cosPitch);
}
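The matrix entries are easy to verify on the CPU: pitching by 90° should tip the sphere’s up axis (0, 1, 0) onto the forward axis (0, 0, 1). A Python check with the same entries (row-major, helper names are mine):

```python
import math

def rotation_matrix(pitch, roll):
    """Row-major 3x3 with the same entries as the HLSL float3x3 above."""
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [[cr, -sr * cp, sr * sp],
            [sr,  cr * cp, -cr * sp],
            [0.0, sp,       cp]]

def mul(m, v):
    """Multiply a 3x3 matrix by a column vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# Pitching 90 degrees tips the up axis onto the forward axis.
tipped = mul(rotation_matrix(math.pi / 2, 0.0), [0.0, 1.0, 0.0])
```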


Before adding the spinning motion, there is one problem that needs to be addressed – UV seams. Right now, the texture is mapped in such a way that the seams are aligned with the edges of the texture, and are thus invisible. But if we change the scale, the discontinuity becomes obvious. It is not a problem that can be easily solved, but it can be partially masked by moving the seam to the back side of the planet. This requires changes in the mapping function:

float2 GenerateSphericalUV(float3 position, float scale) {
    float width = sqrt(1.0 - position.y * position.y);
    // Mirroring with sign(position.z) keeps the seam centered on the far side.
    float generatrixX = position.x / width * sign(position.z);
    float2 generatrix = float2(generatrixX, position.y);
    float2 uv = asin(generatrix) / 3.14159 + float2(0.5, 0.5);
    return uv * scale;
}


Rotation

With all the prerequisites in place, adding the spinning motion is straightforward, even if counterintuitive. To keep the seam in a well-masked position, we will not rotate the sphere’s surface, but pan the surface texture along the x axis.

float2 GenerateSphericalUV(float3 position, float scale, float spin) {
    float width = sqrt(1.0 - position.y * position.y);
    float generatrixX = position.x / width * sign(position.z);
    float2 generatrix = float2(generatrixX, position.y);
    float2 uv = asin(generatrix) / 3.14159 + float2(0.5, 0.5);
    // Panning instead of rotating keeps the seam on the far side.
    return uv * scale + float2(spin, 0.0);
}

float2 sphericalUV = GenerateSphericalUV(position, scale, time * speed);


Shading

For the ball to appear truly round, shading is crucial. Typically, it would be handled by the engine, but in the case of a planet, it is not that easy. A planet consists of two distinct layers – the surface and the atmosphere. Typical materials describe only one medium, so we have to define how both layers are lit and blend them together manually.


Regardless of the chosen shading model, a surface normal must be provided. Fortunately, it is already there – for a sphere of radius 1, the surface position before rotation is equal to the surface normal. That is enough to perform basic shading.


I want to replicate the aesthetics of sci-fi book covers – a stylized take on NASA photos. Realism isn’t my primary concern, so I’ll be using the simplest Lambertian model, where the lighting is equal to the dot product of the surface normal and the light vector.

float LambertianLight(float3 normal, float3 lightDirection) {
    float NdotL = max(dot(normal, lightDirection), 0.0);
    return NdotL;
}
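The behavior is easy to confirm on the CPU: a surface facing the light gets full brightness, and a surface facing away is clamped to zero instead of going negative. The same math in Python:

```python
def lambertian_light(normal, light_direction):
    """N dot L clamped to zero, as in the HLSL above (both vectors unit length)."""
    ndotl = sum(n * l for n, l in zip(normal, light_direction))
    return max(ndotl, 0.0)

lit = lambertian_light((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))    # facing the light
dark = lambertian_light((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))  # facing away
```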

Bump Mapping

A simple normal might suffice for cases where the planet appears smooth, such as gas giants. However, planets with rocky surfaces are often covered with geological formations like mountains, ridges, or craters. These are best simulated using normal maps.

To use normal maps, we require a complete set of vectors: Tangent, Bitangent, and Normal. This set comprises three mutually perpendicular unit vectors aligned with the UV mapping. These vectors can be thought of as a ‘UV local space’.

In the case of a sphere, these calculations need to be done per pixel. It’s convenient to store the resulting vectors as columns of the TBN matrix, which can then be used to transform the normal from the normal map into world-space coordinates.
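One common construction for a sphere (my assumption, not necessarily the article’s exact one) takes the normal to be the surface position, picks a tangent perpendicular to both the normal and the spin axis, and completes the basis with a cross product. It degenerates at the poles, which the polar patches cover anyway. A Python sketch; the key invariant is that a flat normal-map texel (0, 0, 1) must return the surface normal unchanged:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def tbn_transform(surface_normal, map_sample):
    """Build a per-pixel TBN basis for the sphere and bring a normal-map
    sample into world space. Degenerate at the poles (n parallel to y)."""
    n = surface_normal
    t = normalize((n[2], 0.0, -n[0]))  # perpendicular to n and to the y axis
    b = cross(n, t)                    # completes the orthonormal basis
    return tuple(t[i] * map_sample[0] + b[i] * map_sample[1] + n[i] * map_sample[2]
                 for i in range(3))

n = normalize((0.6, 0.0, 0.8))
flat = tbn_transform(n, (0.0, 0.0, 1.0))  # a flat normal-map texel
```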

The shaded surface and the corresponding normal map:

UV Discontinuities

When discussing the generation of UV coordinates in the pixel shader, one critical topic to address is discontinuities. Take, for instance, the case of the back seam – the horizontal component of the UV should wrap perfectly as it is equal to 0.0 and 1.0 on the borders. However, instead of a seamless transition, there is a line of blocky artifacts along the seam.

This line corresponds to the DDX of the UV. DDX and DDY measure the rate of change of the UV along the respective screen-space axes. They are fed into the texture sampler and used to determine which mipmap to use. Low values of the UV derivative correspond to a high-resolution mipmap, and vice versa.

There’s a ‘jump’ in values between the left and right sides of the seam, causing the DDX to be high. Consequently, the texture is sampled from the lowest mipmap, resulting in the gray color.
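The sampler’s mip selection can be sketched on the CPU with the usual rule – mip level ≈ log2 of the largest gradient expressed in texels (a simplification of what hardware does; real samplers also handle anisotropy and clamping). A one-texel-per-pixel gradient picks mip 0, while the seam’s near-full-UV jump picks the lowest mip:

```python
import math

def mip_level(duv_dx, duv_dy, texture_size):
    """Approximate mip selection from screen-space UV derivatives (rho > 0)."""
    rho = max(math.hypot(*duv_dx), math.hypot(*duv_dy)) * texture_size
    return max(0.0, math.log2(rho))

normal_pixel = mip_level((1 / 256, 0.0), (0.0, 1 / 256), 256)  # ~1 texel per pixel
seam_pixel = mip_level((1.0, 0.0), (0.0, 1 / 256), 256)        # UV jumps across the seam
```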

In the case of a planet, it won’t be a significant issue, as it will be covered with a polar patch anyway. However, in other visually jarring scenarios, the DDX and DDY of the seam have to be manually corrected and fed into the sampler.

The presented ‘fix’ is merely for illustrative purposes. Typically, formulas for proper values of DDX and DDY have to be derived manually to match the mapping accurately.

Polar Patches

The last problem that needs addressing is the distortion at the poles. As mentioned before, the tried and tested solution is to just cover them. It seems like a hack, but in the case of a planet or a moon it makes a lot of sense. Polar areas are often covered with ice and visually distinct from the rest of the planet, so a patch of different texture not only hides the problem but also adds visual interest.

The easiest way to add a polar patch would be to map it in the plane perpendicular to the axis of rotation and rotate the resulting coordinates:

float2 PolarPatchMapping(float3 position, float scale, float spin) {
    // Map in the plane perpendicular to the rotation axis (the xz plane)...
    float2 uv = position.xz * scale;
    // ...and rotate the coordinates so the patch follows the spin.
    float sinSpin, cosSpin;
    sincos(spin, sinSpin, cosSpin);
    return float2(uv.x * cosSpin - uv.y * sinSpin,
                  uv.x * sinSpin + uv.y * cosSpin) + float2(0.5, 0.5);
}


Atmosphere

The planet’s surface doesn’t occupy the entire area of the disk, leaving enough space to draw the atmospheric halo. Drawn with the same shader, it can be seamlessly merged with the atmosphere drawn over the planet.

There are countless whitepapers discussing atmosphere rendering. Most are physically based and rely on raymarching to deliver the most realistic image possible. I will use none of them. There is a nice trick from Quake I want to try.

In both Quake 1 and 2, lighting was achieved using mostly lightmaps, a technique popular at the time and still used until recently for static light scenarios. In this approach, computationally expensive lighting calculations are precomputed during development and stored as a separate texture covering all surfaces in the level.

However, since lightmaps only work with static lighting, id Software had to devise a solution for dynamic objects like monsters. They sampled the lightmap below a dynamic object and used it to modulate the object’s color. While this technique wasn’t physically accurate, it was good enough to make the monster blend in with its surroundings.

In a similar manner, I’ll use the planet’s surface to approximate the lighting of the atmospheric halo. It is enough to extend the surface normal before bump mapping to calculate the halo’s brightness.

To further enhance the effect, the brightness of each channel can be remapped differently. This is a simple yet effective method for simulating the absorption of different light wavelengths.
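One simple way to do the per-channel remap (my assumption – the article doesn’t give a formula) is a different exponent per channel, so that, for example, red dies out sooner than blue in dim regions, mimicking wavelength-dependent absorption:

```python
def remap_atmosphere_color(brightness, exponents=(1.8, 1.3, 1.0)):
    """Per-channel power curve: a higher exponent makes that channel
    fall off faster as the surface darkens. Exponents are illustrative."""
    return tuple(brightness ** e for e in exponents)

# Near the terminator, the remapped color shifts toward blue.
terminator = remap_atmosphere_color(0.25)
```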

The visibility of the atmosphere is more pronounced at oblique angles. This effect can be replicated by remapping the Z component of the surface normal. Additionally, to recreate the halo’s fading effect, the distance from the sphere’s surface must be calculated.
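Both effects can be sketched with simple remaps (an illustration of the idea, not the article’s exact formulas): over the surface, atmosphere opacity grows as the normal’s Z component falls toward the silhouette, and past the radius the halo fades with distance from the sphere:

```python
import math

def atmosphere_intensity(dist_from_center, normal_z, falloff=8.0):
    """Stronger atmosphere at oblique angles; exponential halo fade past the limb.
    The falloff constant is an arbitrary, tweakable choice."""
    if dist_from_center <= 1.0:
        return 1.0 - abs(normal_z)  # oblique-angle remap over the surface
    return math.exp(-(dist_from_center - 1.0) * falloff)  # halo outside the disk

face_on = atmosphere_intensity(0.0, 1.0)  # center of the disk: atmosphere thinnest
limb = atmosphere_intensity(1.0, 0.0)     # silhouette: atmosphere strongest
halo = atmosphere_intensity(1.2, 0.0)     # fading halo outside the planet
```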

Finally, both the surface and atmosphere need to be alpha-blended together.


Although it took more steps than anticipated, the result works great. The planet is a perfect sphere, and the mapping allows for all the texture and shader manipulations I wanted. I can create a single Substance Designer graph generating the surface texture, and the result doesn’t require any further post-processing. The same goes for the animated texture – I can use a smaller size to save performance, then adjust the tiling as needed.