Hi everyone, I’m Richard, a junior games VFX artist. Almost a year ago now I made the decision to switch from environment art to real-time visual effects, and this blog is a chronicle of that journey. I started it back in September to log my attempts at making better effects. This year I’m aiming to get more courageous and more extravagant. So, from now until I inevitably forget this blog exists and stop updating it, let’s get weird.
For this season I’m going to be living in the Unity engine. To kick things off, I wanted to take the chance to write up my transition from UE4 to Unity and hopefully guide others on how to find the checkboxes that will break a project. Last week I demonstrated some of the techniques that helped me get acquainted with the more technical side of the engine: things like using code to trigger animations and particle systems, and setting up Animators. This week we’re going to look at the artistic tools that build those effects.
What I’m looking at this week is:
· Quick reminder of the demo effect
· Unity’s Shader Graph
· The Particle System
· Post Processing Stack
A quick reminder of the demo effect
This is the effect I’ll be using to demonstrate Unity's features:
It’s the Poundland equivalent of the legendary hideout-opening effect in Sea of Thieves.
It’s made up of 10 emitters, 8 materials, 4 shaders, 2 scripts and 1 animated mesh.
Unity’s Shader Graph
The first thing that needs highlighting about Unity’s shader system is the naming convention for materials. In Unity, what would be a material in UE4 is called a shader, and what would be considered a material instance is just a material. Unity’s Shader Graph is barebones compared to UE4’s material editor, but a lot of the maths-based problems that get solved in the material graph translate to Shader Graph reasonably well.
For my effect I needed 4 separate unlit shaders:
· A basic Particle Shader
· A shader that dissolves in and fades out
· A shader that fades in from the sides
· A shader with UV distortion
I opted to make my own basic shader rather than use the inbuilt ones so I could change the colour through a material property rather than through the particle system. I’m still working in the pipeline I wrote about in the Infinity War blog last year, and I find it easier to edit colours in the material than in particle editors. The setup for this shader is light.
The main difference between this and a UE4 setup is that we’re using a Vertex Color node instead of a Particle Color node. The struct pins also aren’t already split on the Vertex Color node, so I need to split them manually. Other than that, it’s mostly the same: the particle colour gets multiplied by a texture and then a separate colour property, and the alpha gets multiplied by the texture’s alpha. The Master node is much lighter than UE4’s master node, including only three of the main properties: colour, alpha and position (world position offset). It also includes a property UE4 doesn’t have: alpha clip threshold. Alpha clip dictates when a pixel gets made invisible based on its alpha. Right now it’s set to 0, so anything that’s black gets culled. If I raise that number to 0.5, all the grey pixels get culled too.
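Written out as plain code instead of nodes, the pixel maths of this basic shader comes out roughly like the sketch below. The names are mine, and the alpha clip comparison is simplified to match the description above:

```csharp
using System;

// A plain-code sketch of the basic particle shader's pixel maths:
// vertex colour * texture * tint for RGB, vertex alpha * texture alpha
// for A, then the alpha clip threshold culls low-alpha pixels.
static class BasicParticleShader
{
    // Returns (r, g, b, a) for one pixel; a == 0 means the pixel is culled.
    public static (float r, float g, float b, float a) Shade(
        (float r, float g, float b, float a) vertexColour,
        (float r, float g, float b, float a) tex,
        (float r, float g, float b) tint,
        float alphaClipThreshold)
    {
        float r = vertexColour.r * tex.r * tint.r;
        float g = vertexColour.g * tex.g * tint.g;
        float b = vertexColour.b * tex.b * tint.b;
        float a = vertexColour.a * tex.a;
        // At a threshold of 0, black (alpha 0) pixels are culled;
        // raise it to 0.5 and the grey ones go too.
        if (a <= alphaClipThreshold) a = 0f;
        return (r, g, b, a);
    }
}
```

In the graph this is just four multiplies and the Master node's alpha clip input doing the `if` for you.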
A shader that dissolves in and fades out was a difficult challenge to solve, because Unity doesn’t have an equivalent of UE4’s dynamic parameter node.
The solution I came up with was to use the channels of the Vertex Color node to drive my custom parameters. The green channel controls the intensity of the colour, which is what makes the skull flare up in the earlier gif. The red channel controls the strength of the noise texture: when R is 0, the noise texture isn’t affected by the smoothstep and the black areas of the texture are untouched. The R and A channels rising together from 0 to 1 is what gives the skull the impression of dissolving in. At the end of the particle’s lifetime, R stays at 1, allowing the A channel to drop to 0 and cleanly fade the skull out.
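As a rough translation out of the graph, the alpha side of that dissolve looks something like this. The exact smoothstep edge values are my own guesses; the important part is that R sweeps the edge across the noise texture and A does the final fade:

```csharp
using System;

// Sketch of the dissolve shader's alpha maths. The vertex colour
// channels are repurposed as parameters: R drives the smoothstep edge
// over the noise texture, A fades the whole result. (In the real
// shader G also scales the colour intensity.) Edge values are guesses.
static class DissolveShader
{
    static float Smoothstep(float edge0, float edge1, float x)
    {
        float t = Math.Clamp((x - edge0) / (edge1 - edge0), 0f, 1f);
        return t * t * (3f - 2f * t);
    }

    public static float Alpha(float noise, float r, float a)
    {
        // As R rises from 0 to 1, the edge sweeps down through the noise,
        // revealing more of the skull; A then fades everything in or out.
        float dissolve = Smoothstep(1f - r, (1f - r) + 0.25f, noise);
        return dissolve * a;
    }
}
```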
The fade-in-from-the-sides shader follows a similar setup to the basic particle shader, but it uses the alpha-dissolve technique of one-minusing and then subtracting a gradient to, well… fade in from the sides.
This is one I could probably make much more efficient if I took another look at it, mostly by baking the gradient maths into a texture. For now, it works just fine.
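For reference, the one-minus-and-subtract trick reduces to a couple of lines, assuming a gradient texture whose values run 0 to 1 across the mesh and a progress value animated over the particle's lifetime (both names, and the softness constant, are mine):

```csharp
using System;

// Sketch of the fade-in-from-the-sides alpha maths: one-minus the
// progress value, subtract it from a 0..1 gradient, and the pixels
// where the gradient has overtaken (1 - progress) are revealed first.
// The 0.1 softness divisor is an arbitrary choice of mine.
static class SideFadeShader
{
    public static float Alpha(float texAlpha, float gradient, float progress)
    {
        float reveal = Math.Clamp((gradient - (1f - progress)) / 0.1f, 0f, 1f);
        return texAlpha * reveal;
    }
}
```

Baking this into a texture instead would trade the per-pixel maths for a single extra sample.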
Finally, the UV distortion shader. It looks scary, but a lot of the elements are the same as a standard UV distortion shader.
The core of the shader is the noise texture copied three separate times at three different sizes and speeds. Rather than a Panner node, Unity uses a Tiling and Offset node, which handles both the scale and the directional movement of the texture.
This then gets multiplied by a property I labelled distortion strength and added to the UV coordinates. The rest of the shader is just polish: adding a gradient to slightly fade out the tops of the flames, using a Power node to give the texture a hot colour and a warm colour. Just small changes compounding together into a better-looking whole.
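In code terms, the Tiling and Offset node plus the distortion step amount to very little. How the three noise samples get combined is my own assumption about the graph, as are the names:

```csharp
using System;

// Sketch of the UV distortion maths. Tiling and Offset is just
// uv * tiling + offset; scrolling comes from feeding time * speed
// in as the offset. The averaged-noise combine is an assumption.
static class UvDistortion
{
    public static (float u, float v) TileAndOffset(
        (float u, float v) uv, (float u, float v) tiling, (float u, float v) offset)
        => (uv.u * tiling.u + offset.u, uv.v * tiling.v + offset.v);

    public static (float u, float v) Distort(
        (float u, float v) uv, float noiseA, float noiseB, float noiseC, float strength)
    {
        // Three noise samples (taken at different tilings and speeds)
        // averaged, scaled by distortion strength, pushed into the UVs.
        float n = (noiseA + noiseB + noiseC) / 3f;
        return (uv.u + n * strength, uv.v + n * strength);
    }
}
```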
To turn these four shaders from piles of spaghetti into functioning materials I can use on my effects, I right-click the shader, hover over Create and select Material, and Unity builds a material with that shader already selected as the parent. It’s exactly the same as making material instances.
The Particle System
Shuriken is the basic particle system I used for the effect. It’s not the best; there are way better ones available on the Unity Asset Store, hell, there are better ones already in Unity, but I think there’s something special about it. To me it’s like the old Intuos graphics tablet buried in my closet: it’s got fewer features than almost everything else, but it’s the first particle system I ever used, and I get a sense of freedom and nostalgia from it that I don’t have with UE4. Taking Niagara and Shader Graph out of the question and looking only at how UE4’s Cascade compares with Unity’s Shuriken, it’s a bit like comparing an Italian sports car to an American muscle car. Cascade is precise; it has the speed and the power, but it’s easy to get tied down editing decimals that in the grand scheme of things don’t make much difference.
With Shuriken it’s different. There aren’t as many numbers to edit; it’s all about the curves, and not like Cascade’s curves, which are still heavily numbers-based. The points aren’t typed in, they get pulled around and bent. The system is slower to get where you want the effect to go, but it’s quicker to look at in the scene. It’s not a file that gets made in the content browser but a system built directly into the world. Everything is right there from the start; there’s no ferreting around in drop-downs for separate modules. It’s a case of selecting what you want and driving away.
For the effect I made use of 8 emitters all stacked in a hierarchy. Realistically only one parent is needed, and the whole effect plays out in the previewer, but I wanted a little more structure, just to remind myself what I’m looking at.
Six of these emitters emit a single mesh particle. Their setups are similar; the changes come when the material being used requires a separate touch or different timings. For example, the flames don’t rely on any particle-based logic. They only need a size over lifetime to make them rise and a colour over lifetime to fade them away. The skull, on the other hand, only needed a colour over lifetime, but it needed a pretty gradient to make everything work correctly.
The core area of the particle system is where the initial values are dictated: the initial size, initial lifetime, colour, delay, and so on. It’s also where you configure the simulation space, as in whether the direction changes when the emitter is rotated; even the maximum number of particles that can be emitted is changed in this area.
It’s the menu where an effect is given life. The drop-downs on the panel control the animation of the particles: how they move, whether they grow. Do they go straight forward, or do they follow a noisy path? Do they collide? Do they birth sub-emitters?
It doesn’t have any of the fancier features Cascade does. I can’t scale a colour, for example, only use a gradient over a particle’s lifetime, and there are no dynamic parameters or vector fields. If I want extra features, I have to program them myself, either in the shader, like I did with the skull, or in the form of new modules. This is where Unity’s programmer-first workflow works in its favour. I’ve spent a lot of time in Cascade, but I couldn’t tell you how to add custom modules to it. In Shuriken, it’s easy.
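In Unity, a custom module is usually just a component that pulls the live particles out with ParticleSystem.GetParticles, tweaks them, and writes them back with SetParticles. Stripped of the engine types, that pattern is a plain loop over a particle array; the struct and the attractor behaviour below are entirely my own illustration, not Unity's API:

```csharp
using System;

// Stand-in for the GetParticles / SetParticles loop a custom Shuriken
// module runs each frame. Particle and the attractor behaviour are
// illustrative only; in Unity you'd operate on ParticleSystem.Particle.
struct Particle
{
    public float X, Y, Z;
    public float Vx, Vy, Vz;
}

static class AttractorModule
{
    // Pull every particle towards a target point, then integrate.
    public static void Apply(Particle[] particles,
        (float x, float y, float z) target, float pull, float dt)
    {
        for (int i = 0; i < particles.Length; i++)
        {
            particles[i].Vx += (target.x - particles[i].X) * pull * dt;
            particles[i].Vy += (target.y - particles[i].Y) * pull * dt;
            particles[i].Vz += (target.z - particles[i].Z) * pull * dt;
            particles[i].X += particles[i].Vx * dt;
            particles[i].Y += particles[i].Vy * dt;
            particles[i].Z += particles[i].Vz * dt;
        }
    }
}
```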
Shuriken has the same Add Component button that everything else in Unity has; there’s no special way to add new stuff. Cascade’s good. It’s precise. If I follow a tutorial for it, I’ll always get the exact same result because of that precision. With Shuriken, I’ll always get something a little bit different, because there’s no one way of doing things.
This section is meant as just an introduction to the particle editor, because while the particles I’m using look pretty, there isn’t that much going on with them, so instead of talking about the colour over lifetime module eight times I thought I’d be better off writing a little comparison. In the next few blogs I’ll be diving deeper and deeper into the system as I start building more complex effects. Until then, for more information, there are some amazing video introductions out there by Brackeys and Sirhaian’Arts.
Setting up the Post Processing Stack
The post processing stack is a key tool in a VFX artist’s arsenal. It is vital for helping an artist make their effects look like they’re from another engine. In my scene I’m only using ambient occlusion and bloom, and it’s made my scene go from looking like this:
It’s a brilliant feature and easy to implement. It’s also the reverse of my Shuriken/Cascade argument. I prefer Shuriken because all the modules are already laid out for me; UE4’s post processing also has all its modules laid out, only there are so many that it’s confusing to use. The Unity post processing stack is much cleaner: the modules sit in a roll-out menu, and because the stack is mounted on a camera component, things can be changed on the fly through C# code.
Sykoo did a great video explaining how to set up the post processing stack and what each of the modules does, so I highly recommend checking it out.
Unity doesn’t have a correct way of doing things. There are a lot of technical hoops an artist has to jump through to use it properly, but there’s also a lot of freedom in the toolset. There’s a lot less decimal wrangling; most of my time is spent moving points on a graph rather than inputting numbers. That’s an incredibly exciting prospect for someone who likes to drag things across the screen and see pretty pictures update in real time. It’s very easy to trot in and sketch out some cool new ideas, even if those ideas are then going to be polished in UE4. It’s the perfect engine for rapid prototyping.
Hopefully, whether you’re a brand-new RT VFX artist or you’ve been doing it for several years, there was something here that helped you out. If you find any creative uses for the things I’ve talked about here, please email me a gif at RichardVFXfanmail@gmail.com or tweet me @stokes_richard. I’d love to see what you can create.
Please check back soon. Next time I’ll be pushing Shader Graph further and trying to make something similar to Into the Spider-Verse.