To build this monolithic project with 13 different scenes, several systems made up of reusable and composable components were developed within React Three Fiber:
- Deferred Rendering & Outlines
- Composable Materials
- Composable Particle System
- Scene Transition System
The article begins with an overview of the concept art and early collaboration behind the project, then moves into dedicated sections that explain each system in detail. These sections describe the decisions behind Deferred Rendering and Outlines, the structure of the Composable Materials system, the logic behind the Composable Particle System, and the approach used for transitions between scenes.
Brief Intro & Concept Art

Kehan came to me directly, knowing me through a friend of a friend. He had a vision for the project and had already engaged Lin to illustrate several scenes. I told him the team I wanted, and we expanded into a full group of freelancers. Fabian joined as a shader developer, Nando as a creative, Henry as a 3D artist, Daisy as a producer, and HappyShip joined once Henry went on vacation.

Lin's illustrations had such a distinctive and inspiring art style that translating them into 3D became an incredibly fun and exciting process. The team spent countless days and nights discussing how to bring the project to life, with a constant stream of new ideas and shared references; my bookmarks folder for the project now holds more than 50 links. It was a pleasure and a privilege to work with such a passionate and talented team.

1. Deferred Rendering & Outlines

A key feature of the art style is the use of colored outlines. After extensive research, we found three main ways to achieve this:
- Edge detection based on depth and normals
- Inverse hull
- Edge detection based on material IDs
We chose the first method for two reasons: with inverse hull, moving the camera closer to or farther from an object makes the outline width appear thicker or thinner, and edge detection based on material IDs would not work well with the particle-based clouds.
Normals

To use deferred rendering in Three.js, we set the count option on WebGLRenderTarget, which gives the render target that many textures; each texture serves as a G-Buffer. For each G-Buffer, we can define the texture type and format to reduce memory usage.
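As a sketch, the multi-target setup might look like this (the attachment count, names, and formats here are illustrative, not the project's exact configuration):

import * as THREE from 'three';

// One render target with three attachments: color, normals, outline color.
const gBuffer = new THREE.WebGLRenderTarget(width, height, {
  count: 3,
  depthBuffer: true,
});
gBuffer.textures[0].name = 'color';
gBuffer.textures[1].name = 'normal';
gBuffer.textures[1].type = THREE.HalfFloatType; // cheaper than full float
gBuffer.textures[2].name = 'outlineColor';      // default RGBA/unsigned byte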
In our case, we used a G-Buffer for storing normals. We applied a memory optimization technique called octahedron normal vector encoding, which allows normals to be encoded into fewer bits at the cost of additional encoding and decoding time.
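For reference, a standard octahedral encode/decode pair looks something like this (a common formulation, not necessarily the project's exact code):

// Octahedral normal encoding: maps a unit vector to two [0, 1] values.
vec2 octWrap(vec2 v) {
  return (1.0 - abs(v.yx)) * vec2(v.x >= 0.0 ? 1.0 : -1.0,
                                  v.y >= 0.0 ? 1.0 : -1.0);
}

vec2 encodeNormal(vec3 n) {
  n /= abs(n.x) + abs(n.y) + abs(n.z);      // project onto the octahedron
  n.xy = n.z >= 0.0 ? n.xy : octWrap(n.xy); // fold the lower hemisphere
  return n.xy * 0.5 + 0.5;
}

vec3 decodeNormal(vec2 f) {
  f = f * 2.0 - 1.0;
  vec3 n = vec3(f.xy, 1.0 - abs(f.x) - abs(f.y));
  float t = max(-n.z, 0.0);
  n.xy += vec2(n.x >= 0.0 ? -t : t, n.y >= 0.0 ? -t : t);
  return normalize(n);
}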
Outline Colors

We also wanted different colored outlines for different objects, so we used an additional G-Buffer. Because we were only using a small number of colors, one optimization could have been to use a color lookup texture, reducing the G-Buffer to just a few bits. However, we kept things simple and easier to adjust by using the full RGB range.
Outlines

Once the G-Buffers are prepared, a convolution filter is applied to the depth and normal data to detect edges. We then apply the color from the outline color G-Buffer to those edges. Resources such as Moebius Style Post Processing by Maxime Heckel and Outline Styled Material by Visual Tech Art were immensely helpful.
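Putting it together, the edge pass might look roughly like this (uniform names and thresholds are assumptions, and a cross-shaped kernel stands in for whatever convolution the project actually used):

precision highp float;

uniform sampler2D uDepth;        // linearized depth
uniform sampler2D uNormal;       // decoded world normals
uniform sampler2D uOutlineColor; // per-object outline colors
uniform sampler2D uScene;        // shaded scene color
uniform vec2 uTexelSize;

varying vec2 vUv;

// Accumulate depth and normal differences against one neighboring texel.
void addNeighbor(vec2 offset, float dC, vec3 nC,
                 inout float depthDiff, inout float normalDiff) {
  depthDiff  += abs(texture2D(uDepth, vUv + offset).r - dC);
  normalDiff += distance(texture2D(uNormal, vUv + offset).rgb, nC);
}

void main() {
  float dC = texture2D(uDepth, vUv).r;
  vec3 nC = texture2D(uNormal, vUv).rgb;

  float depthDiff = 0.0;
  float normalDiff = 0.0;
  addNeighbor(vec2( uTexelSize.x, 0.0), dC, nC, depthDiff, normalDiff);
  addNeighbor(vec2(-uTexelSize.x, 0.0), dC, nC, depthDiff, normalDiff);
  addNeighbor(vec2(0.0,  uTexelSize.y), dC, nC, depthDiff, normalDiff);
  addNeighbor(vec2(0.0, -uTexelSize.y), dC, nC, depthDiff, normalDiff);

  // Thresholds are scene-dependent placeholders.
  float isEdge = clamp(step(0.02, depthDiff) + step(0.4, normalDiff), 0.0, 1.0);
  vec3 outline = texture2D(uOutlineColor, vUv).rgb;
  vec3 scene = texture2D(uScene, vUv).rgb;
  gl_FragColor = vec4(mix(scene, outline, isEdge), 1.0);
}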
Gotchas
One issue with using count in Three.js WebGLRenderTarget is that all core materials, such as MeshBasicMaterial, will no longer render by default. A value must be assigned to each additional G-Buffer location for the material to appear again. To avoid polluting the buffer with unwanted data, we can simply set it to itself.
layout(location = 1) out vec4 gNormal;

void main() {
  // Writing the output to itself satisfies the MRT layout requirement
  // without putting meaningful data into the normal G-Buffer.
  gNormal = gNormal;
}
2. Composable Materials
Since this project includes many scenes with numerous objects using different materials, I wanted to create a system that encapsulates a piece of shader functionality, along with any data and logic it requires, into a component. These components could then be combined to form a material. React and JSX make this kind of composability straightforward, resulting in a fast and intuitive developer experience.
Note: this project was developed in early 2024, before TSL was introduced. Things could be done differently today.
GBufferMaterial
The core of the system is the GBufferMaterial component. It is essentially a ShaderMaterial with helpful uniforms and pre-calculated values, along with insertion points that modules can use to add additional shader code on top.
uniform float uTime;
/// insert

void main() {
  vec2 st = vUv;
  /// insert
}
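Under the hood, filling those insertion points can be as simple as string replacement. A minimal sketch, assuming each module contributes setup and main snippets (the function and field names here are hypothetical):

function composeFragmentShader(template, modules) {
  const setup = modules.map((m) => m.fragmentShader?.setup ?? '').join('\n');
  const main = modules.map((m) => m.fragmentShader?.main ?? '').join('\n');
  // String replace only touches the first match, so chaining two calls
  // fills the marker outside main() first, then the one inside it.
  return template
    .replace('/// insert', setup)
    .replace('/// insert', main);
}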
MaterialModule
A large array of reusable modules, along with several custom one-off modules, was created for this project. The most basic of these is MaterialModuleColor.
import { forwardRef, useEffect, useImperativeHandle } from 'react';
// useColor and useMaterialModule are project-internal hooks.

export const MaterialModuleColor = forwardRef(({ color, blend = '' }, ref) => {
  // COLOR
  const _color = useColor(color);

  const { material } = useMaterialModule({
    name: 'MaterialModuleColor',
    uniforms: {
      uColor: { value: _color },
    },
    fragmentShader: {
      setup: /*glsl*/ `
        uniform vec3 uColor;
      `,
      main: /*glsl*/ `
        pc_fragColor.rgb ${blend}= uColor;
      `,
    },
  });

  useEffect(() => {
    material.uniforms.uColor.value = _color;
  }, [_color]);

  useImperativeHandle(ref, () => _color, [_color]);

  return <></>;
});
It simply adds a uColor uniform and writes it to the output color.
Use Case
For example, the material for the monolith is composed from a stack of these modules, as sketched below.
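A hypothetical composition based on the modules listed below (the props and module order are illustrative, not the project's actual values):

<mesh geometry={monolithGeometry}>
  <GBufferMaterial>
    <MaterialModuleNormal />
    <MaterialModuleOutline color="#1e1e5a" />
    <MaterialModuleUVMap texture={monolithUVTexture} />
    <MaterialModuleAnimatedGradient />
    <MaterialModuleUVOriginal />
    <MaterialModuleMap texture={monolithTexture} />
    <MaterialModuleFlowMap texture={flowMap} />
    <MaterialModuleFlowMapColor color="#ffd27f" />
  </GBufferMaterial>
</mesh>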
All of these are generic modules that were reused across many different meshes throughout the site.
- MaterialModuleNormal: encodes and writes the world normal to the normal G-Buffer
- MaterialModuleOutline: writes the outline color to the outlineColor G-Buffer
- MaterialModuleUVMap: sets the current st value based on the provided texture (affecting later modules that use st)
- MaterialModuleGradient: draws a gradient color
- MaterialModuleAnimatedGradient: draws an animated gradient
- MaterialModuleBrightness: brightens the output color
- MaterialModuleUVOriginal: resets st to the original UVs
- MaterialModuleMap: draws a texture
- MaterialModuleFlowMap: adds the flow map texture to the uniforms
- MaterialModuleFlowMapColor: adds a color based on where the flow map is activated
Modules that affected the vertex shaders were also created, such as:
- MaterialModuleWind: moves vertices to create a wind effect, used for trees, shrubs, etc.
- MaterialModuleDistort: distorts vertices, used for the planets
With this system, complex shader functionality, such as wind, is encapsulated into a reusable and manageable component. It can then be combined with other vertex and fragment shader modules to create a wide variety of materials with ease.
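As an illustration, the vertex side of a wind module might boil down to something like this standalone shader (uniform names and the sway formula are assumptions, not the project's exact code):

uniform float uTime;
uniform float uWindStrength;

void main() {
  vec3 pos = position;
  // Sway vertices more the higher they sit, approximating wind on foliage.
  float sway = sin(uTime * 2.0 + position.x * 0.5) * uWindStrength;
  pos.x += sway * smoothstep(0.0, 1.0, uv.y);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}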
3. Composable Particle System
Similarly, the idea of making things composable and reusable is extended to the ParticleSystem.
ParticleSystem
This is the core ParticleSystem component. Because it is written for WebGL, it includes logic to calculate position, velocity, rotation, and life data using the ping-pong rendering method. Additional features include prewarming, the ability to start and stop the particle system naturally (allowing remaining particles to finish their lifetime), and a burst mode that ultimately wasn't used.
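A minimal sketch of the ping-pong idea, assuming one particle per texel and illustrative names throughout (the simulation material's fragment shader would be a shader like the position shader shown below):

import * as THREE from 'three';

const SIZE = 256; // one particle per texel
function makeTarget() {
  return new THREE.WebGLRenderTarget(SIZE, SIZE, {
    type: THREE.FloatType,
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    depthBuffer: false,
  });
}

const targets = [makeTarget(), makeTarget()];
let read = 0;

function step(renderer, simScene, simCamera, simMaterial, delta) {
  const write = 1 - read;
  simMaterial.uniforms.texturePosition.value = targets[read].texture;
  simMaterial.uniforms.uDelta.value = delta;
  renderer.setRenderTarget(targets[write]); // write into the other target
  renderer.render(simScene, simCamera);     // fullscreen quad runs the sim shader
  renderer.setRenderTarget(null);
  read = write; // this frame's output becomes next frame's input
}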
Just like the GBufferMaterial, the position, rotation, and life shaders contain insertion points for modules to use. For example:
// currVelocity and needsReset come from earlier parts of the full shader.
void main() {
  vec4 currPosition = texture2D(texturePosition, uv);
  vec4 nextPosition = currPosition;

  if (needsReset) {
    /// insert
  }

  /// insert
  nextPosition += currVelocity * uDelta;
  /// insert

  gl_FragColor = vec4(nextPosition);
}
The system supports two modes: points or instanced meshes.
ParticleSystemModule
The system is inspired by Unity, with modules that define the emission shape as well as modules that affect position, velocity, rotation, and scale.
Emission modules
For example, the EmissionPlane module allows us to set particle starting positions based on the size and position of a plane.
The EmissionSphere module allows us to set the particle starting positions on the surface of a sphere.
The most powerful module is the EmissionShape module. This allows us to pass in a geometry, and it calculates the starting positions using MeshSurfaceSampler.
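For instance, sampling starting positions from an arbitrary geometry with MeshSurfaceSampler might look like this (the helper name and data layout are hypothetical):

import * as THREE from 'three';
import { MeshSurfaceSampler } from 'three/addons/math/MeshSurfaceSampler.js';

function sampleStartPositions(mesh, count) {
  const sampler = new MeshSurfaceSampler(mesh).build();
  const data = new Float32Array(count * 4);
  const position = new THREE.Vector3();
  for (let i = 0; i < count; i++) {
    sampler.sample(position);
    // xyz: spawn position on the surface, w: per-particle random seed
    data.set([position.x, position.y, position.z, Math.random()], i * 4);
  }
  return data; // uploaded into the position data texture
}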
Position, Velocity, Rotation, and Scale modules
Other commonly used modules include:
- VelocityAddDirection
- VelocityAddOverTime
- VelocityAddNoise
- PositionAddMouse: adds to the position based on the mouse position, and can push or pull particles away from or toward the mouse (sketched after this list)
- PositionSetSpline: sets a spline path for the particles to follow and ignores velocity
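To give a flavor of what these modules inject into the position shader shown earlier, a PositionAddMouse-style snippet might look like this (the uniform names are assumptions):

uniform vec3 uMouse;          // mouse position projected into world space
uniform float uMouseRadius;
uniform float uMouseStrength; // positive pushes particles away, negative pulls

/// ...inserted after the velocity integration step:
vec3 fromMouse = nextPosition.xyz - uMouse;
float falloff = 1.0 - smoothstep(0.0, uMouseRadius, length(fromMouse));
nextPosition.xyz += normalize(fromMouse) * falloff * uMouseStrength * uDelta;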
Asteroids Use Case
For example, this is the asteroid belt:
The particles are emitted from a small sphere, then follow a spline path with a random rotation.
It also works with the GBufferMaterial, allowing us to shade it using the same modules. This is how the mouseover flow map is applied to this particle system: the same material module used for the monolith is also used here.
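Put together, the asteroid belt setup might look roughly like this (the props, and any module not named above such as RotationAddRandom, are illustrative):

<ParticleSystem count={512} mode="instancedMesh" geometry={asteroidGeometry}>
  <EmissionSphere radius={0.5} />
  <PositionSetSpline spline={beltSpline} />
  <RotationAddRandom />
  <GBufferMaterial>
    <MaterialModuleNormal />
    <MaterialModuleOutline />
    <MaterialModuleFlowMap texture={flowMap} />
    <MaterialModuleFlowMapColor color="#ffd27f" />
  </GBufferMaterial>
</ParticleSystem>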
Leaves Use Case
4. Scene Transition System
Because of the large number of scenes and the variety of transitions we wanted to create, we built another system specifically for scene transitions. Every transition in the project uses this system, including:
- solar system > planet: wipe up
- planet > bone: zoom blur
- history > tablet: mask
- tablet > fall: mask
- fall > overview: zoom blur
- desert > swamp: radial
- winter > forest: sphere
- world > ending: mask
First, we draw scene A with deferred rendering, including depth and normals. Then we do the same for scene B.
Next, we use a fullscreen triangle with a material responsible for mixing the two scenes. We created four materials to support all of our transitions:
- MaterialTransitionMix
- MaterialTransitionZoom
- MaterialTransitionRadialPosition
- MaterialTransitionRaymarched
The simplest of these is MaterialTransitionMix, but it is also quite powerful. It takes the scene A texture, scene B texture, and an additional grayscale mix texture, then blends them based on a progress value from 0 to 1.
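A minimal sketch of such a mix material's fragment shader (the uniform names and the soft edge width are assumptions):

uniform sampler2D uSceneA;
uniform sampler2D uSceneB;
uniform sampler2D uMixMap;  // grayscale threshold texture
uniform float uProgress;    // 0 = scene A, 1 = scene B
varying vec2 vUv;

void main() {
  float threshold = texture2D(uMixMap, vUv).r;
  // Texels whose threshold has been passed by the progress show scene B,
  // with a small smoothstep band to soften the edge.
  float m = smoothstep(threshold - 0.05, threshold + 0.05, uProgress);
  gl_FragColor = mix(texture2D(uSceneA, vUv), texture2D(uSceneB, vUv), m);
}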
For the solar system to planet transition, the mix texture is generated at runtime using a rectangle that moves upward.
For the history to tablet transition, the mix texture is also generated at runtime by rendering the same tablet scene in a special mask mode that outputs the tablet in a black-to-white range.
The tablet to fall transition, as well as the world to ending transition, were handled the same way, using mix textures generated at runtime.
Deferred Rendering, made composable
Using the same insertion technique as the composable material and particle systems, the deferred rendering workflow was made composable as well.
By the end of the project, we had created the following modules for our Deferred Rendering system:
- DeferredOutline
- DeferredLighting
- DeferredChromaticAberration
- DeferredAtmosphere: most visible in the desert intro
- DeferredColorCorrect
- DeferredMenuFilter
Use Case
For example, the solar system scene stacked several of these modules, as sketched below.
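A hypothetical composition (the wrapper component name, module set, and order are illustrative, not the scene's exact configuration):

<DeferredRendering>
  <DeferredLighting />
  <DeferredOutline />
  <DeferredChromaticAberration />
  <DeferredColorCorrect />
</DeferredRendering>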
Final thoughts
These systems help make development faster by being encapsulated, composable, and reusable.
This means features can be added and tested in isolation. No more giant material files with too many uniforms and hundreds of lines of GLSL. Fixing a specific feature no longer requires copying code across multiple materials. Any JS logic needed for a shader is tightly coupled with the snippet of vertex or fragment code that uses it.
And of course, because all of this is built with React, we get hot reloading. Being able to modify a specific shader for a specific scene and see the results instantly makes the workflow more fun, enjoyable, and productive.
