This week we looked at full-screen shader effects and render targets. We explored rendering a scene to a frame buffer in memory and feeding that frame buffer into another draw call to perform effects on every rendered pixel.

These concepts show up in a lot of engines and frameworks. Unity calls render targets RenderTextures, and Three.js calls them WebGLRenderTarget (they have a nice tutorial over here).
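As a rough sketch of that two-pass idea in Three.js (the names renderTarget, screenScene, and the grayscale effect are placeholders I'm using for illustration, not something from class):

import * as THREE from 'three';

// Pass 1 draws the scene into this texture instead of the screen.
const renderTarget = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);

// Pass 2 is a full-screen quad whose material reads that texture.
const screenScene = new THREE.Scene();
const screenCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
const screenMaterial = new THREE.ShaderMaterial({
  uniforms: { tDiffuse: { value: renderTarget.texture } },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0);
    }`,
  fragmentShader: `
    uniform sampler2D tDiffuse;
    varying vec2 vUv;
    void main() {
      vec4 color = texture2D(tDiffuse, vUv);
      // the "effect": runs once for every rendered pixel (simple grayscale here)
      float g = dot(color.rgb, vec3(0.299, 0.587, 0.114));
      gl_FragColor = vec4(vec3(g), 1.0);
    }`,
});
screenScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), screenMaterial));

function renderFrame(renderer, scene, camera) {
  renderer.setRenderTarget(renderTarget); // render the scene into the frame buffer
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);         // then render the full-screen quad to the screen
  renderer.render(screenScene, screenCamera);
}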

The approach of chaining renders like this is common enough that you will also find it built into a lot of engines -- in Three.js it's the EffectComposer; in Unity it depends on which render pipeline you use, but the Post Processing Stack is the closest. Post-processing effects do a lot of work to polish your games and make them look "finished", and they pair very well with the particle systems and procedural generation we've already looked at.
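Here is a minimal sketch of that chaining with Three.js's EffectComposer, assuming a renderer, scene, and camera already exist (the RGB-shift pass is just an arbitrary example of a full-screen pass):

import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/addons/postprocessing/ShaderPass.js';
import { RGBShiftShader } from 'three/addons/shaders/RGBShiftShader.js';

const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));  // draw the scene into a render target
composer.addPass(new ShaderPass(RGBShiftShader)); // then run a full-screen shader over the result

function animate() {
  requestAnimationFrame(animate);
  composer.render(); // takes the place of renderer.render(scene, camera)
}
animate();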

The effect we started in class was bloom, which is a very common post-processing effect in games (Three.js, Unity). Other post-processing effects are also available, such as those listed as examples in the Three.js EffectComposer documentation and in this old blog post of mine.
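In Three.js, bloom ships as UnrealBloomPass and drops into the same composer chain from the sketch above (the strength/radius/threshold numbers here are just starting points to tweak):

import * as THREE from 'three';
import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';

const bloomPass = new UnrealBloomPass(
  new THREE.Vector2(window.innerWidth, window.innerHeight), // resolution
  1.5,  // strength
  0.4,  // radius
  0.85  // threshold: only pixels brighter than this will bloom
);
composer.addPass(bloomPass); // composer from the EffectComposer sketch above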

We also looked a bit at easing functions on easings.net. When trying to "sculpt" values, in addition to playing around in Desmos, it is common to go "shopping" for a graph that is shaped the way you want. Again, this is common enough that you will find easing functions bundled into libraries like this one.
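As a sketch, an easing function is just a function from a normalized time t (0 at the start, 1 at the end) to an eased progress value. This is easeInOutCubic as listed on easings.net, with made-up startX/endX values as a usage example:

// t runs from 0 to 1; the return value is the eased progress, also 0 to 1.
function easeInOutCubic(t) {
  return t < 0.5
    ? 4.0 * t * t * t
    : 1.0 - Math.pow(-2.0 * t + 2.0, 3.0) / 2.0;
}

// Typical use: blend between two values with the eased parameter.
const startX = 0, endX = 100;        // example endpoints
const elapsed = 0.5, duration = 2.0; // example timing (seconds)
const x = startX + (endX - startX) * easeInOutCubic(elapsed / duration);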

One easing function we looked at is built into GLSL -- smoothstep. The syntax is:

smoothstep(edge0, edge1, x);

Where edge0 is the lower edge of the transition, edge1 is the upper edge, and x is the value being remapped.

The result is always a value between 0 and 1! If x is less than edge0 the result is 0; if x is greater than edge1 the result is 1; and if x is in between, the result is a smooth blend between 0 and 1, following this shape:
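Written out in JavaScript (just a sketch of what the built-in computes), smoothstep remaps x into the 0..1 range between the two edges, clamps it, and then applies the cubic 3t^2 - 2t^3:

// Equivalent of GLSL's smoothstep(edge0, edge1, x).
function smoothstep(edge0, edge1, x) {
  // Remap x so that edge0 maps to 0 and edge1 maps to 1, clamped to [0, 1].
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0.0), 1.0);
  // Cubic curve 3t^2 - 2t^3: flat (zero slope) at both ends.
  return t * t * (3.0 - 2.0 * t);
}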