Showing particles in a screen-space overlay canvas

At King we make use of Unity in a lot of prototyping projects. Over the last few weeks I have been working in a team building a prototype to take through the greenlight process. This is the process a game goes through at King when it is trying to get the backing to move into production and eventually get released. Therefore most of my time at the moment has been spent working in Unity focusing on fast iteration over scalability and optimisation. Here is a short piece sharing a problem I came across and the solution I came up with whilst using Unity to hack together a prototype!

Recently we had to come up with a way of showing particles on top of a canvas set to render as an overlay in a Unity UI system. Anyone with any experience of creating a UI in Unity might immediately be thinking: why not just set the render mode to Screen Space – Camera and move on with your life?

Well, there are several reasons you may want to use an overlay canvas instead of screen space. Maybe you will be editing your camera’s settings in game and don’t want this to affect your Canvas. Or maybe you started your prototype three months ago and don’t want to go through the hassle of reorganising your entire UI for the sake of a one-second particle effect.

Who knows?

The point of this post is not to argue for everyone to use the overlay render mode (if anything it hints at the opposite!) instead it is to share a little trick I used to get particles displaying in front of my UI in an overlay canvas.

For those who are unfamiliar with the different render modes of canvases in Unity you can read up on them here: https://docs.unity3d.com/Manual/UICanvas.html. The gist of it is that overlay canvases are extremely easy to work with and allow you to scale your canvas with the screen size and not be affected by camera settings, but they do not consider depth and don’t allow you to render your UI behind objects in your scene, or behind particles.

This left me in a bit of a predicament when an artist asked for a particle effect in a rather complex area of our UI, all of which was being rendered in a canvas set to render as an overlay. I initially tried just banging a particle system into our current canvas and changing the render mode to Screen Space – Camera. Unfortunately, due to the short-sightedness of prototype development, that didn’t go down very well. With a lot of our scaling and anchor points becoming messed up, it was apparent that it would be a lot of work to reorganise this canvas to work with a new render mode.

Instead I decided to make use of render textures. Render textures in Unity are textures that a camera can render to directly. So, if you want a real-time video of an area of your game appearing in your UI, you can set up a camera to render to a new render texture, then display that texture within your UI on a RawImage component.
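As a rough illustration of that wiring (in my project this was all configured in the inspector, and the names here are hypothetical), the whole setup boils down to pointing a camera and a RawImage at the same texture:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the render-texture wiring described above.
// In practice every field here can simply be set up in the inspector.
public class ParticleFeed : MonoBehaviour
{
    public Camera particleCamera; // camera looking at the particle system
    public RawImage targetImage;  // RawImage in the overlay canvas

    void Awake()
    {
        // Create a render texture matching the screen, then point the
        // camera's output and the RawImage's input at the same texture.
        var texture = new RenderTexture(Screen.width, Screen.height, 16);
        particleCamera.targetTexture = texture;
        targetImage.texture = texture;
    }
}
```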

For my use case I needed to have a camera setup looking at a particle system, and rendering to a render texture, which I could then display in my overlay canvas. Problem nearly solved!

You can see the setup of my particle camera below, along with the RawImage component to which I am rendering; the particle camera renders to my target texture, “Player Render Texture.”

[Image: ParticlesInAnOverlayCanvas2 – particle camera and RawImage setup]

The only other thing left to contend with was that we wanted this effect to happen in different places and at different times, depending on where a player tapped. I created a ParticlesPlayer MonoBehaviour, a simple class that looked like this:

public class ParticlesPlayer : MonoBehaviour
{
  public ParticleSystem particles;

  // Emit a one-off burst of particles.
  public void ShowParticles(int count = 10)
  {
    particles.Emit(count);
  }

  // Loop the system for a continuous effect.
  public void StartContinuousEmission()
  {
    var main = particles.main;
    main.loop = true;
    particles.Play();
  }

  public void StopEmission()
  {
    var main = particles.main;
    main.loop = false;
    particles.Stop();
  }
}

This would allow me to toggle my particles on and off if I needed them, or emit a certain amount for a given effect. This component was placed on the parent of the particle system at which my camera was pointing. To keep it simple I saved this as one prefab, which can be seen here:

[Image: ParticlesInAnOverlayCanvas3 – particle system prefab with ParticlesPlayer component]

Next I needed some way of moving my particle system to a desired position. One approach would be to stretch the render texture so it covers the entire UI canvas, then translate the on-screen position where we want the particles to appear into a position for the particle system in front of the camera. Another approach would be to keep the particle system’s position in front of the camera constant, and instead move the RawImage component around on the canvas. Because I only needed one simple particle effect, I took the latter approach. I created a ParticlesDisplayer component which would move my render texture around as needed:

public class ParticlesDisplayer : MonoBehaviour
{
  public RectTransform imageTransform;

  // Re-centre the RawImage on the canvas.
  public void ResetPosition()
  {
    imageTransform.anchoredPosition = Vector2.zero;
  }

  // Move the RawImage to a given screen position.
  public void MoveToPosition(Vector3 pos)
  {
    imageTransform.position = pos;
  }
}
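On an overlay canvas a RectTransform’s position is already expressed in screen space, so a tap position can be forwarded straight through. A hypothetical caller might look like this:

```csharp
using UnityEngine;

// Hypothetical example: on an overlay canvas, screen coordinates map
// directly onto UI positions, so a tap can be forwarded as-is.
public class TapToParticles : MonoBehaviour
{
    public ParticlesDisplayer displayer; // assumed assigned in the inspector

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Input.mousePosition is in screen space, which matches the
            // overlay canvas coordinate space.
            displayer.MoveToPosition(Input.mousePosition);
        }
    }
}
```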

So now I have what I need: one component to control playing our particles, and another to ensure they appear in the correct place. But I’m still not 100% happy with this solution.

Currently, if I want to display a particle effect in my overlay canvas I need references to my ParticlesPlayer and ParticlesDisplayer objects. Personally, I’m not a fan of this. I don’t think it’s ideal to hold references to objects you will only need occasionally, or to require a direct reference to objects that will be used by several different classes. There are several opportunities for error here: null references because you forgot to assign the reference in the editor, or instantiated objects that want to display particle effects losing their references to the player and displayer. Obviously, you can fix both of these problems relatively easily, but I decided to use a more foolproof approach.

I created a static class OverlayParticles. In this class I would hold references to my particle player and displayer:

public static class OverlayParticles
{
  private static ParticlesPlayer player;
  private static ParticlesDisplayer displayer;

  // Lazily find the scene objects the first time they are needed.
  public static void InitializeCheck()
  {
    if (player == null)
    {
      player = GameObject.FindObjectOfType<ParticlesPlayer>();
    }

    if (displayer == null)
    {
      displayer = GameObject.FindObjectOfType<ParticlesDisplayer>();
    }
  }

  public static void ShowParticles(int count)
  {
    InitializeCheck();

    displayer.ResetPosition();
    player.ShowParticles(count);
  }

  public static void ShowParticles(int count, Vector3 position)
  {
    InitializeCheck();

    displayer.MoveToPosition(position);
    player.ShowParticles(count);
  }
}

I can then just make calls via this static class from anywhere in my code to show my particles:

OverlayParticles.ShowParticles(100, somePosition);

This to me is much nicer than holding a reference to an object: it cuts down on the chance of null-reference errors, cuts down on the amount of repeated code (in our case two calls and two references have turned into one line!), and improves readability.

In conclusion, I think I came up with a pretty decent workaround for displaying particles on top of an Overlay Canvas. There are a couple of areas where I would like to expand this in the future: possibly toggling on and off the camera and render texture when they are not being used to be more performant. I’d also like to consider allowing multiple particle emitters and having multiple effects happen at the same time.
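As a sketch of that first improvement (names hypothetical), the particle camera could simply be disabled whenever no particles are alive, so the render texture stops being redrawn every frame:

```csharp
using UnityEngine;

// Hypothetical sketch: disable the particle camera while nothing is
// emitting, so the render texture is not updated unnecessarily.
public class ParticleCameraToggle : MonoBehaviour
{
    public Camera particleCamera;     // assumed assigned in the inspector
    public ParticleSystem particles;

    void Update()
    {
        // IsAlive(true) reports whether this system or any of its
        // children still have live particles.
        particleCamera.enabled = particles.IsAlive(true);
    }
}
```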

Though I think there is an argument that if your UI is that dependent on particles, you should consider using an alternative render mode rather than overlay for your Canvas.

All in all, I was pleased with this as a quick solution to get particles appearing in a place they’re not meant to appear. Please let me know what you think!

You can find an example project showing how all this works here:

https://github.com/ljackso/ParticlesInAnOverlayCanvas

Luke Jackson

About Luke Jackson

I’m a Game Developer from London. I first joined King as an intern back in the summer of 2013, I then came back permanently in 2015 after finishing my degree in computer science. I have worked on several games during my time at King, some released, some not! Most recently I have been spending my time working in teams prototyping new game ideas. This has led me to spend a lot of time working in Unity and thinking about how best to prototype and validate new game ideas rapidly. I’m passionate about programming, a big fan of agile development, and a game lover.
