# Realtime Image Based Lighting using Spherical Harmonics

When I generated the images for the last blog post, I had the strong feeling that the rendering quality didn’t do the whole thing justice. The IGES importer, repair functionality and tessellation work so nicely now that it’s a shame to have a fancy rim model and only use one standard directional light. This brought me to something that had been on my “I want to implement this” list for a long time: image based lighting.

The source for this technique is the SIGGRAPH 2001 paper “An Efficient Representation for Irradiance Environment Maps” by Ravi Ramamoorthi and Pat Hanrahan. To sum it up very briefly, the main idea (there are multiple applications of Spherical Harmonics mentioned in there) is to use an environment map and – similar to how reflections are done by looking up the texel that corresponds to the direction of the reflection vector – use the surface normal as a lookup to figure out what the lighting from that direction is. However, instead of taking just one texel into account, one wants to simulate ambient light coming from all directions, each contributing in varying degree to the diffuse lighting term of any particular position on a surface.

The key observation in the paper is that with regard to diffuse lighting, individual pixels (high frequency of change) do not matter as much as the general tone (low frequency of change). As an example, a single white pixel in an environment map will not matter much if all other pixels are a dark red, since that pixel contributes only a fraction of the light received by the surface. These low frequencies can be encoded very efficiently in a handful of coefficients (3×9 to be exact) with only a negligible amount of error. This is done using Spherical Harmonics (see “Spherical Harmonic Lighting: The Gritty Details” by Robin Green for an excellent in-depth discussion). Similar to how Fourier analysis transforms a signal into the frequency domain (e.g. a recorded WAV file is transformed into the information about which frequencies produce the sound) or how JPEG compression works, Spherical Harmonics are just another way of expressing a two-dimensional function, which in our case is “how much light comes from that direction”.

The nice thing is that you don’t even need to concern yourself with Spherical Harmonics too much. The paper contains a couple of simple equations (equation set 3) which are easy to apply … the thing that was a bit tricky to grasp was what to do with them exactly, or what they mean. I spent a couple of hours reading through various material on Spherical Harmonics and implementing generic Spherical Harmonics, only to find out that once I got the concept, I wouldn’t have needed most of it. Which is why this post isn’t the in-depth explanation that you might find on other sites; it’s just a high-level, no-math explanation so you can figure out which pieces you need to achieve the goal of image based lighting.

The aim is to convert an environment/reflection map into an irradiance map (how much light comes from any particular direction). By the way, the cube maps used for the images in the post can be downloaded from Humus’ website. While modern rendering engines use real-time algorithms to create and update irradiance maps, I was aiming for something simpler, an offline pre-processing step. The iPad app should only be required to do the actual texture lookup during rendering.

The desired result actually looks as if it were simply a blurred version of the environment map, but using a Gaussian blur will not produce the correct results. What one does instead is:

1. Down-sample the input texture to a smaller size for better performance. In my case, I use 64×64 input textures to create 64×64 irradiance maps. Since this map is only used for diffuse lighting, only the low frequencies are needed and those don’t need much resolution to be reproduced accurately.
2. Sample the environment map: For each pixel in the input cube map, calculate the directional vector from the origin to the center of the pixel and use the equations in the paper (equation set 3) to calculate the 9 Spherical Harmonics coefficients for that direction. Then multiply them by the RGB channels separately to produce 27 values for that one pixel (one set of 9 for red, one set of 9 for green and one set of 9 for blue). Add those values up over all pixels on all sides and then multiply the sums by 4 times Pi divided by the number of pixels. This gives you the 9 coefficients for each of the 3 channels which represent the weighted average over all pixels. Believe it or not, those coefficients are all you need to create the irradiance map.
3. When sampling is done, create an empty cube map. For each pixel on each side, again create the directional vector and calculate the Spherical Harmonics coefficients. Multiply each coefficient for the current pixel with the respective averaged coefficient from step 2 and sum the products (pixel direction coefficient L0,0 times averaged red coefficient L0,0, plus pixel direction coefficient L1,-1 times averaged red coefficient L1,-1, and so on for all red coefficients to get the result of the red channel). Make sure to clamp the values to the valid range (e.g. 0–255 for a pixel’s channel value).

That’s actually it! Replace the ambient and diffuse terms in your lighting equation with the value looked up from the irradiance map times the diffuse color of the object, and add the normal specular term (doing specular image based lighting is a bit more involved, take a look at “Real Shading in Unreal Engine 4” by Brian Karis). What is described above is only the simplest approach to get some form of image based lighting; there are various extensions, and it is of course a far cry from proper physically based rendering. But it produces nice results, especially for the small amount of work…

Keep in mind that this is a very dumbed down approach and explanation. For example, the corner pixels of a cube map face should contribute less than the center pixels, as they cover a smaller solid angle when seen from the origin. But hopefully this gets you motivated to try it yourself.

I encourage you to dig through the referenced literature! The point, though, is that if your render system is capable of doing cube map reflections, it is very easy to add image based lighting in this limited form. The whole pre-processing step is roughly 200 lines of code, and rendering only needs a couple of lines added to your shader!

## Source Code

The following is taken from an initial implementation that used lambda expressions to get/set the texture values. I’ve changed the code a bit since that version, but it’s great for illustrating how simply the whole thing can be done.

### SphericalHarmonics.h

```cpp
#ifndef SphericalHarmonics_h
#define SphericalHarmonics_h

#include <functional>
#include <memory>
#include <vector>

#include "Vector.h"

class SphericalHarmonics
{
public:
    SphericalHarmonics();
    virtual ~SphericalHarmonics();

    enum class CubeMapSides
    {
        Top,
        Bottom,
        Back,
        Forward,
        Left,
        Right,
    };

    struct Sample
    {
        CubeMapSides cubeMapSide;
        size_t x;
        size_t y;
        double coefficients[9];
    };

    static std::shared_ptr<SphericalHarmonics> createForCubeMap(size_t const resolutionX, size_t const resolutionY);
    void sampleCubeMap(std::function<Vector3(CubeMapSides const, size_t const, size_t const)> func);
    void evaluateCubeMap(std::function<void(CubeMapSides const, size_t const, size_t const, Vector3 const)> func) const;

private:
    Vector3 _coefficients[9];
    std::vector<Sample> _samples;
};

#endif /* SphericalHarmonics_h */
```

### SphericalHarmonics.cpp

```cpp
#include <cmath>

#include "SphericalHarmonics.h"

SphericalHarmonics::SphericalHarmonics()
{
}

SphericalHarmonics::~SphericalHarmonics()
{
}

std::shared_ptr<SphericalHarmonics> SphericalHarmonics::createForCubeMap(size_t const resolutionX, size_t const resolutionY)
{
    auto result = std::make_shared<SphericalHarmonics>();
    for ( size_t side = 0; side < 6; ++side )
    {
        for ( size_t y = 0; y < resolutionY; ++y )
        {
            for ( size_t x = 0; x < resolutionX; ++x )
            {
                // Calculate directional vector (for the forward side)
                // TODO: Correct direction to sample center of texel.
                Vector3 direction((float)x / (float)(resolutionX - 1) * 2.0f - 1.0f, (float)y / (float)(resolutionY - 1) * 2.0f - 1.0f, -1.0f);
                direction.normalize();
                // Rotate to match cubemap side
                switch (static_cast<CubeMapSides>(side))
                {
                case CubeMapSides::Back:
                {
                    direction.x *= -1.0f;
                    direction.z *= -1.0f;
                } break;
                case CubeMapSides::Left:
                {
                    float const temp = direction.x;
                    direction.x = direction.z;
                    direction.z = -temp;
                } break;
                case CubeMapSides::Right:
                {
                    float const temp = direction.x;
                    direction.x = -direction.z;
                    direction.z = temp;
                } break;
                case CubeMapSides::Top:
                {
                    float const temp = direction.y;
                    direction.y = -direction.z;
                    direction.z = temp;
                } break;
                case CubeMapSides::Bottom:
                {
                    float const temp = direction.y;
                    direction.y = direction.z;
                    direction.z = -temp;
                } break;
                default:
                    break;
                }
                // Evaluate the nine basis functions (equation set 3)
                Sample sample;
                sample.cubeMapSide = static_cast<CubeMapSides>(side);
                sample.x = x;
                sample.y = y;
                sample.coefficients[0] = 0.282095;
                sample.coefficients[1] = 0.488603 * direction.y;
                sample.coefficients[2] = 0.488603 * direction.z;
                sample.coefficients[3] = 0.488603 * direction.x;
                sample.coefficients[4] = 1.092548 * direction.x * direction.y;
                sample.coefficients[5] = 1.092548 * direction.y * direction.z;
                sample.coefficients[6] = 0.315392 * (3.0 * direction.z * direction.z - 1.0);
                sample.coefficients[7] = 1.092548 * direction.x * direction.z;
                sample.coefficients[8] = 0.546274 * (direction.x * direction.x - direction.y * direction.y);
                result->_samples.push_back(sample);
            }
        }
    }
    return result;
}

void SphericalHarmonics::sampleCubeMap(std::function<Vector3(CubeMapSides const, size_t const, size_t const)> func)
{
    for ( size_t i = 0; i < 9; ++i )
    {
        _coefficients[i] = Vector3(0.0, 0.0, 0.0);
    }
    for ( auto const & sample : _samples )
    {
        auto const value = func(sample.cubeMapSide, sample.x, sample.y);
        for ( size_t i = 0; i < 9; ++i )
        {
            _coefficients[i] += value * sample.coefficients[i];
        }
    }
    double const factor = 4.0 * M_PI / static_cast<double>(_samples.size());
    for ( size_t i = 0; i < 9; ++i )
    {
        _coefficients[i] *= factor;
    }
}

void SphericalHarmonics::evaluateCubeMap(std::function<void(CubeMapSides const, size_t const, size_t const, Vector3 const)> func) const
{
    for ( auto const & sample : _samples )
    {
        Vector3 value(0.0, 0.0, 0.0);
        for ( size_t i = 0; i < 9; ++i )
        {
            value += sample.coefficients[i] * _coefficients[i];
        }
        func(sample.cubeMapSide, sample.x, sample.y, value);
    }
}
```