Code

Code from this week is up at this Google Drive folder!

Coordinate Space Transformations

We had to juggle between world space, where our deform controls live, and local space, where the mesh vertices are defined. That's one of the things the Transform Component is designed to do -- particularly the TransformPoint and InverseTransformPoint methods. Coordinate system mismatches are tricky because they do not produce error messages -- just weird, broken behavior -- so always keep in mind what coordinate space your vertices are in and make sure everything matches up!
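Here's a minimal sketch of that pattern -- the class and field names are illustrative, not the exact script from class:

```csharp
using UnityEngine;

// Sketch of the world/local juggling described above.
public class DeformSpaceExample : MonoBehaviour
{
    public Transform deformControl; // a control object positioned in world space
    Mesh mesh;
    Vector3[] vertices;

    void Start()
    {
        mesh = GetComponent<MeshFilter>().mesh;
        vertices = mesh.vertices; // mesh vertices are defined in local space
    }

    void Update()
    {
        // World -> local: bring the control into the same space as the vertices.
        Vector3 controlLocal = transform.InverseTransformPoint(deformControl.position);

        for (int i = 0; i < vertices.Length; i++)
        {
            float distance = Vector3.Distance(vertices[i], controlLocal);
            // ... deform vertices[i] based on distance, then write back to the mesh ...
        }

        // Local -> world is the inverse operation:
        //   Vector3 worldPos = transform.TransformPoint(someLocalPosition);
    }
}
```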

Animation Curves

This week introduced one of Unity's most powerful features -- Animation Curves. The name is misleading -- you can use them for a lot more than animations. Think of them as an interface for drawing an arbitrary curve in the editor that you can evaluate at runtime with the Evaluate method. They can replace the pile of Sin/Cos function calls we've been using all semester. Make use of them!
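A minimal sketch of the idea -- the curve and field names here are made up, but the AnimationCurve and Evaluate calls are the real API:

```csharp
using UnityEngine;

// An AnimationCurve drawn in the inspector replaces a hand-written Sin/Cos wobble.
public class CurveWobble : MonoBehaviour
{
    public AnimationCurve heightOverTime = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
    public float cycleLength = 2f; // seconds per loop

    void Update()
    {
        // Evaluate takes a time (x axis) and returns the curve's value (y axis).
        float t = Mathf.Repeat(Time.time, cycleLength) / cycleLength;
        float height = heightOverTime.Evaluate(t);
        transform.position = new Vector3(transform.position.x, height, transform.position.z);
    }
}
```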

Visual Effects Graph

Our focus was the Visual Effects Graph -- Unity's approach to GPU-powered particle systems. Remember: it has to be installed from the Package Manager by navigating to Window > Package Manager, selecting Unity Registry, searching for Visual Effect Graph, and clicking Install.

It is a lot like Shader Graph but different in a few key ways.

Properties and Attributes

VFX Graphs are controlled by properties and attributes, which appear in the panel in the top-left corner by default and can be revealed by pressing the <*> button. Properties are like Shader Graph properties -- they act like uniforms to the underlying shaders. Think of them as the external interface to the visual effect. Unity will expose an inspector widget so you can provide values for your properties. Property values are the same for all particles.
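You can also drive exposed properties from script at runtime. A minimal sketch, assuming a hypothetical exposed float property named "SpawnRate" -- use whatever names appear in your own graph's property panel:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Drives an exposed VFX Graph property from gameplay code.
public class VfxPropertyDriver : MonoBehaviour
{
    public VisualEffect effect;

    void Update()
    {
        // Same value for every particle, just like a shader uniform.
        // "SpawnRate" is a hypothetical property name for this example.
        if (effect.HasFloat("SpawnRate"))
            effect.SetFloat("SpawnRate", Mathf.PingPong(Time.time, 10f));
    }
}
```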

Attributes are more like the attributes we worked with in the first half of the semester -- the arrays of numbers that our shaders looped over to produce our visuals. They provide the per-particle information, which is another way of saying that attribute values are unique to each particle. Unity provides a lot of built-in attributes that are common to most particle systems -- each particle needs a position, size, age, lifetime, etc. You can make your own if you need to track extra information per particle, or just use the built-in ones.

Vertical Processing Flow

VFX Graphs are different from Shader Graphs in that they have a vertical aspect to them. The nodes that connect vertically make up the Processing Flow. This controls the lifecycle of the particles, and you can think of them as the setup, update, and draw "functions" of the particle system. Unity calls these blocks "contexts" and they generally stick to the following sequence:

  1. Start with a Spawn Context, which represents the event of a particle spawning
  2. Connect to an Initialize Context, which runs once and sets up the particles' initial attribute values
  3. Connect to an Update Context, which runs every frame and can update attribute values. Note that even if you do not modify any attribute values, you should still have an Update Context for the particle system to work correctly.
  4. Connect to an Output Context, which controls how the particle is rendered each frame. There are a few different output contexts, and they offer different ways of rendering your particles.

Contexts can have blocks placed inside them that run in order, top to bottom. Click on a context and -- while your cursor is hovering over it -- press space to pull up a popup menu of available blocks. For example, the screenshot above shows an Initialize Context with a block that sets the position to a location on a Cube mesh and a block that sets the lifetime to a random value.

You can have as many of these vertical blocks as your effect needs, limited only by your GPU. You can also connect the contexts in interesting ways -- experiment! The node editor will not let you make an invalid connection.

Horizontal Value Flow

In addition to specifying the lifecycle of your particles vertically, VFX Graph lets you specify how data flows into your blocks' parameters horizontally. Values can begin as properties, as above, or attributes, as below, and the values can be transformed by other nodes until they connect to a socket on a block, feeding that value into that stage of the particle's lifecycle.

This way you can connect your particle systems with values coming in from the rest of your game via properties or with values computed earlier in the pipeline via attributes. This part of VFX Graph is most like Shader Graph.

Connecting to Shader Graph

Speaking of which -- any Shader Graph that has Support VFX Graph checked can be used in a VFX Graph output node! You can even control the Shader Graph's properties from VFX Graph, which gives you full control over how the particles move and change over time and how they are rendered!

The Shader Graph we made in class exposes a "Fade" property that is being driven by the Age over Lifetime of the VFX Graph.

Events

Finally -- VFX Graphs can be given named event nodes that you can trigger from your C# code with SendEvent. Without events, the particle system will start automatically and loop forever. The simplest arrangement is to control when spawning starts and stops, but other, more advanced integrations are possible too.
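A minimal sketch of the start/stop arrangement -- the event names here are hypothetical, so match them to whatever you typed into the Event nodes in your graph:

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Triggers named VFX Graph events from C#.
public class VfxEventTrigger : MonoBehaviour
{
    public VisualEffect effect;

    void Update()
    {
        // "OnBurstStart" / "OnBurstStop" are example event names, not built-ins.
        if (Input.GetKeyDown(KeyCode.Space))
            effect.SendEvent("OnBurstStart");
        if (Input.GetKeyUp(KeyCode.Space))
            effect.SendEvent("OnBurstStop");
    }
}
```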

The "Built-In" Particle System

In version 4.0 Unity introduced a particle system that they called "Shuriken" at the time but now just call the "Built-In Particle System". We did not look at it this week, but it is good to know about. Unity has a page comparing the Built-In Particle System and the Visual Effects Graph, and their summary of when you would pick one over the other is worth a read.

Resources

An amazing thing about VFX Graph is that it has been around long enough that there is a wealth of resources for it online. Armed with a conceptual understanding of the system, you should be able to turn these videos into working effects! The first video was the main basis for the fire we did in class, and the others are also informative.