Particle systems in games with threejs and tricks to make them look good

Published
8 min read

Particles Are Just Textured Quads

Particle systems drive fire, smoke, sparks, rain, dust, magic. They look complex. They are not. A particle system is a flat image drawn over and over with small variations. That is it.

What A Particle Actually Is

A particle is a quad. Quad means a rectangle made of two triangles.

The quad has a texture on it. Usually small. Usually with transparent edges.

The quad always faces the camera. So when you move the camera around, the quad rotates to face you. This is called a billboard. Like a highway billboard that turns to stay readable from the road.

That is the whole thing. A particle is one textured billboard.

Now imagine 10 000 of them. Different positions. Different sizes. Different colors. Some fading out. Some rotating. That is a particle system.
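In code, a particle is nothing more than a small bag of state. A minimal sketch in plain JavaScript; the field names here are illustrative, not from any library:

```javascript
// One particle: plain data, nothing more.
function createParticle() {
  return {
    position: { x: 0, y: 0, z: 0 },  // world space
    velocity: { x: 0, y: 1, z: 0 },  // units per second
    size: 1,                         // world or pixel units, depending on renderer
    color: { r: 1, g: 0.8, b: 0.2 }, // tint applied to the texture
    alpha: 1,                        // opacity, usually fades over life
    rotation: 0,                     // spin of the billboard quad
    age: 0,                          // seconds alive
    lifetime: 2                      // seconds until death
  };
}
```

Everything a particle system does is reading and writing these few fields.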

What A Particle System Is

A particle system does three jobs. Emit. Simulate. Render.

Three jobs. That is the whole system.

Emit

You pick a shape for the emitter. Point. Box. Sphere. Cone. Mesh surface. New particles spawn from that shape.

You pick a rate. 100 per second. Bursts of 50 at a time. Whatever fits the effect.

Initial state is usually random within bounds. Velocity random in a cone. Size random in a range. Color maybe random. Life random.

Random with bounds is what makes particles look organic instead of robotic. Pure random is noise. Pure fixed values are robotic. The middle is life.
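A point emitter with random-within-bounds state, sketched in plain JavaScript. The cone math and all the ranges are illustrative, not from any library:

```javascript
function randRange(min, max) {
  return min + Math.random() * (max - min);
}

// Random unit direction inside a cone around +Y, with the given half-angle (radians).
function randomConeDirection(halfAngle) {
  const theta = Math.random() * 2 * Math.PI; // angle around the axis
  const phi = Math.random() * halfAngle;     // tilt away from the axis
  return {
    x: Math.sin(phi) * Math.cos(theta),
    y: Math.cos(phi),
    z: Math.sin(phi) * Math.sin(theta)
  };
}

function emitParticle() {
  const dir = randomConeDirection(Math.PI / 8); // narrow upward cone
  const speed = randRange(2, 4);
  return {
    position: { x: 0, y: 0, z: 0 },            // point emitter at the origin
    velocity: { x: dir.x * speed, y: dir.y * speed, z: dir.z * speed },
    size: randRange(0.5, 1.5),
    age: 0,
    lifetime: randRange(1, 3)
  };
}
```

Every number is random, but every number has bounds. That middle is what reads as organic.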

Simulate

Every frame you loop over every living particle. You update.

  • Position. Add velocity times deltaTime. DeltaTime means the time since the last frame, so your particle moves the same distance regardless of frame rate.

  • Velocity. Apply gravity, drag, wind, any force you want.

  • Age. Increase by deltaTime.

  • If age exceeds lifetime, kill the particle.

  • Attributes over lifetime. Size, color, opacity often follow a curve across the particle's life.

That last part is where the magic lives. A fire particle is yellow at birth. Red in the middle. Black smoke at death. Driven by a curve with three key points.
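The whole simulate step, sketched in plain JavaScript. Field names are illustrative; a real system would pack this into typed arrays rather than objects:

```javascript
const GRAVITY = -9.8;

function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Advance every living particle by one frame. dt is seconds since last frame.
function simulate(particles, dt) {
  for (let i = particles.length - 1; i >= 0; i--) {
    const p = particles[i];

    p.age += dt;
    if (p.age >= p.lifetime) {
      particles.splice(i, 1);        // kill: remove from the live list
      continue;
    }

    p.velocity.y += GRAVITY * dt;    // force: gravity
    p.position.x += p.velocity.x * dt;
    p.position.y += p.velocity.y * dt;
    p.position.z += p.velocity.z * dt;

    const t = p.age / p.lifetime;    // 0 at birth, 1 at death
    p.alpha = lerp(1, 0, t);         // attribute over lifetime: fade out
  }
}
```

The fade here is a straight line; the fire example would swap `lerp` for a three-point color curve.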

Render

Each living particle becomes a quad. Each quad gets the texture. Each quad gets tinted by the particle's color. Each quad scales to the particle's size.

Draw them all. Move on to the next frame.

Blend Modes. The Most Important Choice

A particle is transparent where its texture is transparent. How the transparent parts combine with what is behind them matters a lot. This is the blend mode.

Two blend modes matter for games. Additive and alpha.

Additive Blending

Additive means the particle's color is added to whatever is behind it. The math is final = src + dst. Source is the particle color. Destination is what is already there.

Black plus anything is that anything. So black reads as fully transparent with no alpha channel needed.

Bright colors stacked on bright colors push toward white. Fire looks like fire because the overlaps get brighter, not darker.

Use additive for fire. Sparks. Magic. Lightning. Lasers. Anything that emits light.

Alpha Blending

Alpha means the particle has an alpha channel. Alpha is a fourth value per pixel that says how opaque that pixel is. 0 is fully transparent. 1 is fully opaque.

The math is final = src.rgb * src.a + dst.rgb * (1 - src.a). Mix between particle color and scene color, weighted by alpha.

Alpha blended particles can be any color including black. Overlaps look like overlaps. No brightness push.

Use alpha for smoke. Fog. Dust. Clouds. Rain. Anything solid-ish that does not emit light.
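Both blend equations, applied by hand to one pixel in plain JavaScript, to make the difference concrete:

```javascript
// src is the particle's color (with alpha), dst is what is already in the
// framebuffer. All values normalized to [0, 1].
function additiveBlend(src, dst) {
  // final = src + dst, clamped the way the framebuffer clamps
  return {
    r: Math.min(1, src.r + dst.r),
    g: Math.min(1, src.g + dst.g),
    b: Math.min(1, src.b + dst.b)
  };
}

function alphaBlend(src, dst) {
  // final = src.rgb * src.a + dst.rgb * (1 - src.a)
  return {
    r: src.r * src.a + dst.r * (1 - src.a),
    g: src.g * src.a + dst.g * (1 - src.a),
    b: src.b * src.a + dst.b * (1 - src.a)
  };
}
```

Feed black into `additiveBlend` and the destination comes back untouched, which is why additive textures need no alpha channel. Stack bright colors and the sum climbs toward white.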

Why This Matters

Pick the wrong blend mode and your fire looks like grey paper. Your smoke looks like fire. Particle feel lives in the blend mode.

The Sorting Problem

Alpha blended particles come with a painful catch.

If you draw a near particle first, then a far particle behind it, one of two things goes wrong. If particles write depth, the far one fails the depth test and disappears. If they do not, the far one blends on top of the near one and the layering looks inverted. Either way looks broken.

Solution. Sort particles by distance to the camera every frame. Draw furthest first. Nearest last. This is called back to front sorting.

Additive particles do not need sorting. The math is commutative. a + b + c = c + b + a. Order does not change the pile of brightness.

In a game with tens of thousands of alpha particles the sort has to happen every frame. On the CPU that chokes above a few thousand particles. On the GPU you use a bitonic sort, a parallel sort algorithm that fits GPU threads well, and the cost stays flat.
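The CPU version of the sort is a one-line comparator, sketched here in plain JavaScript:

```javascript
// Order alpha blended particles so the furthest from the camera draws first.
function sortBackToFront(particles, cameraPos) {
  const dist2 = (p) => {
    const dx = p.position.x - cameraPos.x;
    const dy = p.position.y - cameraPos.y;
    const dz = p.position.z - cameraPos.z;
    return dx * dx + dy * dy + dz * dz; // squared distance, no sqrt needed
  };
  particles.sort((a, b) => dist2(b) - dist2(a)); // furthest first
}
```

Squared distance is enough because sorting only cares about order, so the square root is skipped.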

The Intersection Problem

Here is where particle systems win or lose.

A particle quad moves through the world. Sometimes the quad passes through solid geometry. A smoke quad crosses the floor. A fire quad crosses a wall.

The GPU's depth test cuts the quad at the intersection. You see a hard straight line where the quad meets the surface. Looks like a flat sticker pressed into the wall. Reveals that the quad is flat. Kills the effect.

This is the single biggest tell that particles are fake.

Soft Particles. The Fix.

Soft particles solve the intersection problem. You fade the particle's alpha toward zero as it gets close to the geometry behind it. No more hard cut. Smooth fade instead.

The illusion. A particle cannot pass through a wall anymore because it fades out before it touches the wall.

How It Works

In the particle's fragment shader you need two values.

  1. Particle depth. The distance from the camera to this particle pixel.

  2. Scene depth. The distance from the camera to whatever solid geometry is behind this same pixel, read from the depth buffer.

Subtract the two. The result is how far the particle sits in front of the wall.

If the distance is big, keep full alpha. If the distance is small, fade the alpha toward zero. A smoothstep function handles the curve. Smoothstep is a function that smoothly ramps from 0 to 1 between two edges with an S shape, no sharp transition.

delta = sceneDepth - particleDepth
softAlpha = smoothstep(0, falloffRange, delta)
finalAlpha = particle.alpha * softAlpha

falloffRange is a tunable number. Small values mean the fade happens right at the wall. Big values mean particles fade out even when kind of far from the wall. You tune per effect.
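The same fade written out as plain functions. In the real thing this lives in the fragment shader; the math is identical, JavaScript here just makes it easy to poke at:

```javascript
// Smoothly ramps from 0 to 1 as x moves from edge0 to edge1, with an S shape.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(1, Math.max(0, (x - edge0) / (edge1 - edge0)));
  return t * t * (3 - 2 * t);
}

// Fade the particle's alpha as it approaches the geometry behind it.
function softAlpha(particleAlpha, sceneDepth, particleDepth, falloffRange) {
  const delta = sceneDepth - particleDepth; // how far in front of the wall
  return particleAlpha * smoothstep(0, falloffRange, delta);
}
```

At `delta >= falloffRange` the particle is untouched; at `delta = 0` it is invisible.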

The Depth Buffer Catch

Reading from the depth buffer does not give you real world distance. It gives you a non-linear value between 0 and 1 optimized for precision near the camera. You have to linearize it. Convert back to actual view space distance using the camera's near and far planes.

The formula is algebra. Not scary. Three.js ships a helper; in raw WebGPU you do it yourself.

Once you have linear depth, the math above just works.
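For reference, the linearization for a standard perspective projection, taking the raw [0, 1] depth buffer value and the camera's near and far planes. This is the same algebra as the `perspectiveDepthToViewZ` shader helper three.js ships:

```javascript
// depthSample: raw [0, 1] value read from the depth buffer.
// Returns positive distance in front of the camera, in world units.
function linearizeDepth(depthSample, near, far) {
  // View space z is negative in front of the camera, so flip the sign.
  const viewZ = (near * far) / ((far - near) * depthSample - far);
  return -viewZ;
}
```

A quick sanity check: a sample of 0 comes back as the near plane distance, a sample of 1 as the far plane distance.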

Particles In Three.js

Three.js gives you two paths.

THREE.Points

A built-in. Each particle is a point primitive, a GPU feature where one vertex automatically becomes a square sprite in screen space.

Fast. Simple. No quad geometry to manage. You set positions in a buffer and Three.js draws them.

The downsides. Size is in pixel units, not world units, which is awkward for 3D effects that need to feel scaled. You get one size per particle. And you cannot use arbitrary geometry, only square sprites.
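A minimal THREE.Points setup might look like this. It is a sketch, not a complete program: `particleTexture` and `scene` are assumed to exist elsewhere in your app.

```javascript
import * as THREE from 'three';

// One vertex per particle, one draw call.
const COUNT = 1000;
const positions = new Float32Array(COUNT * 3);
for (let i = 0; i < COUNT; i++) {
  positions[i * 3 + 0] = (Math.random() - 0.5) * 10;
  positions[i * 3 + 1] = (Math.random() - 0.5) * 10;
  positions[i * 3 + 2] = (Math.random() - 0.5) * 10;
}

const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));

const material = new THREE.PointsMaterial({
  size: 8,                          // screen-space size, one value for all points
  map: particleTexture,             // small texture with transparent edges
  transparent: true,
  blending: THREE.AdditiveBlending, // fire, sparks, magic
  depthWrite: false                 // transparent particles should not write depth
});

scene.add(new THREE.Points(geometry, material));
```

Note `depthWrite: false`: writing depth from transparent quads is what causes the broken layering described in the sorting section.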

Instanced Quad Meshes

You build a quad geometry once. You instance it, drawing the same quad thousands of times with per instance data for position, size, color, rotation. Each instance is a real quad in world space.

More work to set up. More control. Works for arbitrary shapes, ribbons, trails, mesh particles. Better for real VFX systems.
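One way the instanced path looks with `THREE.InstancedMesh`, heavily simplified. Again a sketch: `particleTexture`, `camera` and `scene` are assumed to exist elsewhere.

```javascript
import * as THREE from 'three';

// One quad geometry, drawn COUNT times with a per-instance transform.
const COUNT = 10000;
const quad = new THREE.PlaneGeometry(1, 1);
const material = new THREE.MeshBasicMaterial({
  map: particleTexture,
  transparent: true,
  depthWrite: false
});
const mesh = new THREE.InstancedMesh(quad, material, COUNT);

const dummy = new THREE.Object3D();
for (let i = 0; i < COUNT; i++) {
  dummy.position.set(
    (Math.random() - 0.5) * 10,
    (Math.random() - 0.5) * 10,
    (Math.random() - 0.5) * 10
  );
  dummy.quaternion.copy(camera.quaternion); // billboard: face the camera
  dummy.scale.setScalar(0.5 + Math.random());
  dummy.updateMatrix();
  mesh.setMatrixAt(i, dummy.matrix);
}
mesh.instanceMatrix.needsUpdate = true;
scene.add(mesh);
```

The quaternion copy has to be redone every frame the camera moves; a real VFX system pushes the billboard math into the vertex shader instead.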

WebGPU Changes Everything

Traditional Three.js particle code was CPU driven. JavaScript looped over every particle every frame, updated its position, uploaded fresh data to the GPU. Fine below 1000 particles. Painful above 2000. Dead above 5000.

WebGPU compute shaders change the story. Compute shaders are GPU programs that run in parallel over buffers of data. Move the particle simulation into a compute shader and JavaScript never touches per particle state again. The GPU advances 100 000 particles each frame in one dispatch with no perf drop.

The bottleneck shifts from simulation to sorting. And sorting moves to a bitonic compute pass too.

This is the path a real WebGPU VFX library takes. Compute shader simulation. Instanced quad rendering. GPU sort for alpha blending. Soft particles via depth buffer fade.