Generating a cartoon-style waterfall stream

By: Aaron Ligthart

Table of contents

  1. Introduction
  2. Simulating Liquids
    1. Ray-marching and liquid blobs
    2. Time to focus, a new goal
  3. Mesh generation
    1. Generating a mesh along a line
    2. Creating a 2D slice editor
    3. Adjusting the UVs and normals
  4. Shaders
    1. Toon shading
    2. Creating small waves
    3. Adding foam edges
    4. The illusion of falling water
    5. Edge foam distortion
  5. End result
  6. Sources

1. Introduction

In a time before physically based rendering was a thing in Unity, I always felt limited as a game artist by the standard shaders I had to work with. Now, as a game programmer, I like to tackle technical and visual tasks, but I never had the time to dive deeper into shaders.

At the beginning of the game-lab project I was faced with some cool challenges during the boot camps. I liked messing around with mesh generation and the technical challenges of dungeon generation, and I really enjoyed writing my very first shader without the use of Shader Graph.

With so many interests I found it difficult to decide what to research.
At first I wanted this study to be about special effects, but that would be too bland and general. So I decided to learn how I could generate realistically moving water that could turn into ice and gasses, combining both shaders and mesh generation. But I soon found out how challenging that would be.

In this study I will take you through my adventure of learning about water generation, redefining scopes, creating meshes and writing HLSL shaders.


2. Simulating Liquids

Illustration 1: Castle Crashers 3D render

My goal for this study was to learn how I could simulate water going from a liquid to a gas or a solid.
The inspiration for this came from a 3D render by Chahin (2019). The way the gasses and water splatters were made looked like a form of metaballs, so the first step was to figure out how to create these shapes.


2.1 Ray-marching and liquid blobs

One way these blobs could be made was through mesh generation, but an easier way to get the same result is ray marching. As explained by J. Wong, ray marching is an algorithm that sends out view rays and steps them forward based on distances returned by mathematical functions (Wong, 2016).
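As a rough illustration of the idea (a minimal sketch, not the implementation from the sources above), a sphere can be described by a signed distance function and the ray stepped forward by the returned distance until it hits a surface:

// Signed distance from point p to the surface of a sphere.
float sdSphere(float3 p, float3 centre, float radius)
{
    return length(p - centre) - radius;
}

// March a ray from rayOrigin along rayDir until it hits the sphere or gives up.
float RayMarch(float3 rayOrigin, float3 rayDir)
{
    float travelled = 0;
    for (int i = 0; i < 64; i++)
    {
        float3 p = rayOrigin + rayDir * travelled;
        float dist = sdSphere(p, float3(0, 0, 0), 1.0);
        if (dist < 0.001) break;     // close enough: we hit the surface
        travelled += dist;           // it is safe to step this far
        if (travelled > 100) break;  // ray escaped the scene
    }
    return travelled;
}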

Using these functions, shapes can be created and later coloured through a shader. Following along with a video by Lague (2019) I was able to recreate a metaball effect, which I could use to simulate coloured water droplets moving around each other, as seen in video 1.
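The blending that makes separate blobs merge into one another typically comes from a smooth minimum between their distance functions; a minimal sketch of the commonly used polynomial version (the parameter k controls how strongly the blobs merge):

// Smooth minimum of two distances: blends the shapes instead of taking a hard union.
float smin(float a, float b, float k)
{
    float h = saturate(0.5 + 0.5 * (b - a) / k);
    return lerp(b, a, h) - k * h * (1.0 - h);
}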

Video 1. First meta ball test

This looked cool! But to create realistically moving water I needed more than just these 3 shapes, so I decided to stress test the system to see how much it could render. As seen in video 2 I added 20 balls and added physics to see how well they would react to each other. The simulation started off fluidly, but soon my frame rate dropped from 60 to 10 fps: multiple balls were close to each other, and they were all affecting each other. In the end the ray marcher took more than half a minute to render a new frame and eventually crashed Unity.

Video 2: second meta ball test with physics.

2.2 Time to focus, a new goal

With the metaball experiment falling through I had to look for another way to simulate the water. After some research I came across some articles about simulating water in the Unreal Engine and the mathematics behind NVIDIA's render tests. But I soon realised that the amount of maths and calculations needed to make my original goal work would not fit within my scope and timeframe. So I had to take a step back, set a new goal and focus on creating something else.
I still wanted to do something with generating meshes and making fluids move, so I decided on a new goal:

How to generate a cartoon style waterfall stream.

I needed a clear goal so I could focus and make up for some lost time, so I created two art boards to point me in the right direction: a collection of good examples of what I'd like to recreate, and a collection of bad examples I wanted to avoid.

The waterfall will consist of four parts:

  1. The water stream
  2. The foam on top
  3. The foam at the bottom
  4. The water surface at the bottom

3. Mesh Generation

With the main goal of creating a water stream I had to decide whether I wanted to model a waterfall myself, or generate the stream so I had more control over the shape and the number of vertices.
In the boot camp the concept of loft shapes was introduced. To generate my water stream I had to find a way to define a path along which I could extrude a 2D slice to create a 3D model.

3.1 Generating a Mesh along a line

I created a path for my stream by using cubic Bézier curves.
A cubic Bézier curve is a curve defined by four control points: it starts at the first point, ends at the last, and is pulled towards the two points in between, as shown in illustration 4 (CartouCHe, 2012).

Illustration 4: Cubic Bézier curve and its 4 control points (CartouCHe, 2012)
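For reference, the position that the generator evaluates through repeated lerps (De Casteljau's algorithm, see getBezierPoint below) is the same curve as the standard Bernstein form:

B(t) = (1 − t)^3 P0 + 3(1 − t)^2 t P1 + 3(1 − t) t^2 P2 + t^3 P3,   with 0 ≤ t ≤ 1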

Through an introduction to mesh generation by F. Holmer (2019) I was able to figure out how to write a mesh generator. I created a scriptable object called mesh2D in which I could enter mesh data. This mesh data then gets used by the mesh generator to draw triangles between all the different points.

By dividing the curve into segments I was able to create low-poly as well as high-poly versions of my mesh, which will come in handy later if I have to optimise my vertex shader by adding LODs.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEditor;
using System.Linq;

[RequireComponent(typeof(MeshFilter))]
public class GeomitryGenerator : MonoBehaviour
{
    [SerializeField] ShapeGenerator shape2D;
    [Range(0, 64)]
    [SerializeField] int segmentCount = 8;

    [SerializeField] Transform[] controlPoints = new Transform[4];
    Vector3 GetPosition(int i) => controlPoints[i].position;
    Vector3 GetScale(int i) => controlPoints[i].localScale;


    Mesh mesh;

    private void Awake()
    {
        mesh = new Mesh();
        mesh.name = "Segment";
        GetComponent<MeshFilter>().sharedMesh = mesh;
    }

    private void Update()
    {
        GenerateMesh();
    }

    void GenerateMesh()
    {
        mesh.Clear();

        float vSpan = shape2D.CalculateVSpan();
        List<Vector3> vertices = new List<Vector3>();
        List<Vector3> normals = new List<Vector3>();
        List<Vector2> uvs = new List<Vector2>();


        for (int segment = 0; segment < segmentCount; segment++)
        {
            float t = segment / (segmentCount - 1f);
            OrientedPoint orientedpoint = getBezierPoint(t);

            for (int i = 0; i < shape2D.vertices.Count; i++)
            {
                vertices.Add(orientedpoint.LocalToWorldPosition(Vector3.Scale(shape2D.vertices[i].point, new Vector3(orientedpoint.scale.z, orientedpoint.scale.y, orientedpoint.scale.x)))); // component-wise scale (Vector3 * Vector3 is not defined in Unity)
                normals.Add(orientedpoint.LocalToWorldVector(shape2D.vertices[i].normal));
                uvs.Add(new Vector2(shape2D.vertices[i].u, t * GetLength() / vSpan));


            }
        }

        List<int> triangleIndices = new List<int>();
        for (int segment = 0; segment < segmentCount-1; segment++)
        {
            int rootIndex = segment * shape2D.VertexCount;
            int rootIndexNext = (segment+1) * shape2D.VertexCount;

            for (int line = 0; line < shape2D.LineCount; line+=2)
            {
                int lineIndexA = shape2D.lineIndices[line];
                int lineIndexB = shape2D.lineIndices[line+1];
                int currentA = rootIndex + lineIndexA;
                int currentB = rootIndex + lineIndexB;
                int nextA = rootIndexNext + lineIndexA;
                int nextB = rootIndexNext + lineIndexB;

                triangleIndices.Add(currentA);
                triangleIndices.Add(nextA);
                triangleIndices.Add(nextB);
                triangleIndices.Add(currentA);
                triangleIndices.Add(nextB);
                triangleIndices.Add(currentB);

            }

        }

        mesh.SetVertices(vertices);
        mesh.SetTriangles(triangleIndices,0);
        mesh.SetNormals(normals);
        mesh.SetUVs(0,uvs);

    }


    public void OnDrawGizmos()
    {
        for (int i = 0; i < 4; i++)
        {
            Gizmos.DrawSphere(GetPosition(i), 0.1f);
        }

        Handles.DrawBezier(
            GetPosition(0), 
            GetPosition(3), 
            GetPosition(1), 
            GetPosition(2), 
            Color.blue, EditorGUIUtility.whiteTexture, 1f
            );

        Gizmos.color = Color.green;

    
        Gizmos.color = Color.white;


    }

    OrientedPoint getBezierPoint(float t) {
        Vector3 p0 = GetPosition(0);
        Vector3 p1 = GetPosition(1);
        Vector3 p2 = GetPosition(2);
        Vector3 p3 = GetPosition(3);

        Vector3 pA = Vector3.Lerp(p0, p1, t);
        Vector3 pB = Vector3.Lerp(p1, p2, t);
        Vector3 pC = Vector3.Lerp(p2, p3, t);

        Vector3 pD = Vector3.Lerp(pA, pB, t);
        Vector3 pE = Vector3.Lerp(pB, pC, t);

        Vector3 s0 = GetScale(0);
        Vector3 s1 = GetScale(1);
        Vector3 s2 = GetScale(2);
        Vector3 s3 = GetScale(3);

        Vector3 sA = Vector3.Lerp(s0, s1, t);
        Vector3 sB = Vector3.Lerp(s1, s2, t);
        Vector3 sC = Vector3.Lerp(s2, s3, t);

        Vector3 sD = Vector3.Lerp(sA, sB, t);
        Vector3 sE = Vector3.Lerp(sB, sC, t);

        Vector3 position = Vector3.Lerp(pD, pE, t);
        Vector3 scale = Vector3.Lerp(sD, sE, t);

        Vector3 tangent =  (pE - pD).normalized;

        return new OrientedPoint(position, tangent, scale);

    }

    float GetLength(int precision = 8)
    {
        Vector3[] points = new Vector3[precision];

        for (int i = 0; i < precision; i++)
        {
            float t = i / (precision - 1f); // float divisor to avoid integer division
            points[i] = getBezierPoint(t).position;
        }

        float distance = 0;
        for (int i = 0; i < precision-1; i++)
        {
            Vector3 a = points[i];
            Vector3 b = points[i+1];
            distance += Vector3.Distance(a, b);
        }
        return distance;
    }

   

}

3.2 Creating a 2D slice editor

With the mesh shape defined and a path to extrude it along, it was time to play around with the shapes a bit. But with the way it was previously built I had to manually insert points, calculate the normal direction and work out where the U coordinates should go. This was doable for some simple shapes, but it would take too much time with more complex shapes and wouldn't be maintainable in the long run.
So I created a tool that lets me define points on an axis and automatically calculates the normals.

My 2D shape editing tool, which lets you create and edit points in the XY plane so the shape can be used as a slice in the mesh generator

With the SetPointAndNormals method I calculate the new x and y position of a point, but also the normals between two pairs of points.
Each point in the shape has 2 vertices, so I can create a hard edge between the 2 triangles. I had to write it in 3 parts because in the editor you can also adjust the points by dragging them around; if you move the first or last point in the list, it has to make sure it grabs the beginning or the end of the shape again.

void SetPointAndNormals(int index, Vector3 position)
{
    // Each point is stored twice, so move both vertices to the new position.
    shapeGenerator.vertices[index - 1].point = position;
    shapeGenerator.vertices[index].point = position;

    // Part 1: recalculate the normal of the segment before the moved point.
    if (index - 2 > 0)
    {
        float dx = shapeGenerator.vertices[index - 1].point.x - shapeGenerator.vertices[index - 2].point.x;
        float dy = shapeGenerator.vertices[index - 1].point.y - shapeGenerator.vertices[index - 2].point.y;
        Vector2 normal = new Vector2(-dy, dx);
        shapeGenerator.vertices[index - 1].normal = normal;
        shapeGenerator.vertices[index - 2].normal = normal;
    }

    // Part 2: recalculate the normal of the segment after the moved point.
    if (index != shapeGenerator.vertices.Count - 1)
    {
        float dx = shapeGenerator.vertices[index + 1].point.x - shapeGenerator.vertices[index].point.x;
        float dy = shapeGenerator.vertices[index + 1].point.y - shapeGenerator.vertices[index].point.y;
        Vector2 normal = new Vector2(-dy, dx);
        shapeGenerator.vertices[index + 1].normal = normal;
        shapeGenerator.vertices[index].normal = normal;
    }
    // Part 3: for a closed shape, the last point connects back to the first one.
    else if (shapeGenerator.closeShape && index == shapeGenerator.vertices.Count - 1)
    {
        float dx = shapeGenerator.vertices[0].point.x - shapeGenerator.vertices[index].point.x;
        float dy = shapeGenerator.vertices[0].point.y - shapeGenerator.vertices[index].point.y;
        Vector2 normal = new Vector2(-dy, dx);
        shapeGenerator.vertices[0].normal = normal;
        shapeGenerator.vertices[index].normal = normal;
    }
}

3.3 Adjusting the UVs and normals

I got stuck on 3 issues while calculating the UVs and normals of the mesh.
First, the U coordinates were spread evenly per point instead of by distance, which caused most of the coordinates to bunch up around the corners instead of being distributed uniformly; the same applied to the V coordinates along the Bézier curve, so the texture would not fit around the object perfectly. The third problem was that when I added lighting, the edges of the shape were clearly visible, because the normals were not smoothed between 2 connected points.

By measuring the cumulative distance along both the curve's length and its width (the 2D slice), I was able to distribute the UV coordinates more uniformly.

public void SetUData()
{
    // First pass: store the cumulative distance along the slice for each vertex.
    float totalDistance = 0;
    for (int i = 0; i < shapeGenerator.vertices.Count; i++)
    {
        shapeGenerator.vertices[i].u = totalDistance;
        if (i == shapeGenerator.vertices.Count - 1)
        {
            totalDistance += Vector2.Distance(shapeGenerator.vertices[i].point, shapeGenerator.vertices[0].point);
        }
        else
        {
            totalDistance += Vector2.Distance(shapeGenerator.vertices[i].point, shapeGenerator.vertices[i + 1].point);
        }
    }

    // Second pass: normalise the distances into 0..1 U coordinates.
    for (int j = 0; j < shapeGenerator.vertices.Count; j++)
    {
        shapeGenerator.vertices[j].u = j != 0 ? shapeGenerator.vertices[j].u / totalDistance : 1;
        Debug.Log(shapeGenerator.vertices[j].u);
    }
}
By using the length of the 2D shape and the length of the Bézier curve, I was able to fit the texture around the 3D shape


By then I really had to get started on writing my shaders, so I decided to let these hard edges work in my favour when creating the toon shader.


4. Shaders

With around a week and a half left for my research I had to focus on how I wanted my waterfall / stream to look.

4.1 Toon Shading

I created a simple shader that uses a gradient to display colour on an object. I used another greyscale texture with a hard edge to create a hard shadow. By using the ramp function I cut off the lighter pixels towards the edge of the texture to make it look cel shaded.

Shader "Toon/Litvertex" {
	Properties{
		_Color("Main Color", Color) = (0.5,0.5,0.5,1)
		_MainTex("Base (RGB)", 2D) = "white" {}
		_Ramp("Toon Ramp (RGB)", 2D) = "gray" {}
	}

	SubShader{
		Tags { "RenderType" = "Opaque" }
		LOD 200

		CGPROGRAM
		#pragma surface surf ToonRamp vertex:vert addshadow

		sampler2D _Ramp;

		#pragma lighting ToonRamp exclude_path:prepass
		inline half4 LightingToonRamp(SurfaceOutput s, half3 lightDir, half atten)
		{
			#ifndef USING_DIRECTIONAL_LIGHT
			lightDir = normalize(lightDir);
			#endif

			half d = dot(s.Normal, lightDir)*0.5 + 0.5;
			half3 ramp = tex2D(_Ramp, float2(d,d)).rgb;

			half4 c;
			c.rgb = s.Albedo * _LightColor0.rgb * ramp * (atten * 2);
			c.a = 0;
			return c;
		}

		sampler2D _MainTex;
		float4 _Color;

		struct Input {
			float2 uv_MainTex : TEXCOORD0;
		};

		void vert(inout appdata_full v, out Input o)
		{
			UNITY_INITIALIZE_OUTPUT(Input, o);
		}

		void surf(Input IN, inout SurfaceOutput o) {
			half4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
			o.Albedo = c.rgb;
			o.Alpha = c.a;
		}
	ENDCG

	}
Fallback "Diffuse"
}
First version of the waterfall

I added my cartoon shader to my waterfall mesh and noticed that the texture repeated along the U coordinates. I ended up fixing that by setting the tiling to -0.5, but I should look into how I could clamp the texture between a minimum and a maximum so that this happens automatically.
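A minimal sketch of what that clamping could look like in the surf function, assuming the U coordinate simply needs to stay within the 0–1 range instead of being compensated by the tiling value:

		void surf(Input IN, inout SurfaceOutput o) {
			float2 uv = IN.uv_MainTex;
			uv.x = saturate(uv.x);                  // clamp U to [0, 1] so the texture cannot repeat
			half4 c = tex2D(_MainTex, uv) * _Color;
			o.Albedo = c.rgb;
			o.Alpha = c.a;
		}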

4.2 Creating small waves

With the waterfall in place I needed a plane for the water to fall into.
I created a vertex/surface shader based on the previous toon shader and the lessons I learned from reading up on some tips by Minionsart (n.d.). I moved the y position of the vertices based on a simple sine wave.
I found the water to be too shallow and flat, so I needed a way to create more depth. I ended up sampling the main texture in world space instead of through the UV coordinates, which made the water dark blue when it was low and light blue when it had reached the top of a wave.

Then, as a final step, I added a texture on top of the waves and made it move slightly, to make it look as if there was foam on the water.


Video 4: Simple waves with depth based colors

	struct Input {
		float2 uv_MainTex ;
		float2 uv_Foam;
		float3 worldPos;
		float4 screenPos;
	};

	// #pragma instancing_options assumeuniformscaling
	UNITY_INSTANCING_BUFFER_START(Props)
		// put more per-instance properties here
	UNITY_INSTANCING_BUFFER_END(Props)

	void vert(inout appdata_full v, out Input o)
	{
		UNITY_INITIALIZE_OUTPUT(Input, o);
		v.vertex.y += sin(_Time.z * _WaveSpeed + (v.vertex.x * (v.vertex.z/2) * _WaveAmount))* _WaveHeight;

	}

	void surf(Input IN, inout SurfaceOutput o) {


		half4 color = tex2D(_MainTex, IN.worldPos.xy) * _Color; // sample in world space: the y position gives the depth-based colour
		half4 foam = tex2D(_Foam, IN.uv_Foam) * _FoamColor;


		
		color += foam;
		o.Albedo = color;
		o.Alpha = color.a;
	}

4.3 Adding foam edges

The next thing that needed to be done was to add some foam to the edges of the objects in the water. This ended up being harder than I imagined, but I remembered doing something similar during the boot camp. So I used the eye depth provided by Unity to get some foam edges around the objects!

Video 5: Depth-based foam at the edge of other objects

		float4 projCoords = UNITY_PROJ_COORD(IN.screenPos);
		float rawZ = SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, projCoords);
		float sceneZ = LinearEyeDepth(rawZ);
		float surfaceZ = IN.eyeDepth;
		float foamEdge = 1 - ((sceneZ - surfaceZ) / _FoamAmount);
		foamEdge = saturate(foamEdge);

		color += foamEdge * _FoamColor;
		color += foam;
		o.Albedo = color;
		o.Alpha = color.a;
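One note: IN.eyeDepth is not part of the Input struct shown earlier. A minimal sketch of how it could be filled in, assuming an extra eyeDepth field and Unity's COMPUTE_EYEDEPTH macro in the vertex function:

	struct Input {
		float2 uv_MainTex;
		float2 uv_Foam;
		float3 worldPos;
		float4 screenPos;
		float eyeDepth;                // view-space depth of the water surface
	};

	void vert(inout appdata_full v, out Input o)
	{
		UNITY_INITIALIZE_OUTPUT(Input, o);
		v.vertex.y += sin(_Time.z * _WaveSpeed + (v.vertex.x * (v.vertex.z / 2) * _WaveAmount)) * _WaveHeight;
		COMPUTE_EYEDEPTH(o.eyeDepth);  // store the displaced vertex depth for the surf function
	}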

4.4 The illusion of falling water

Originally I wanted to use Voronoi noise to make it seem like foam was falling off the waterfall. I looked up how Unity's Shader Graph generates its Voronoi noise and tried to replicate it in my toon shader, but it ended up being too heavy to calculate. So instead I chose to have my foam texture move along the length of the waterfall: by tiling the foam around 0.2, moving it slightly along the U axis and rapidly along the V axis, it created the effect of random streams of water falling down.
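A minimal sketch of that scrolling, assuming hypothetical _FoamSpeedU and _FoamSpeedV properties (the actual property names in my shader differ):

		// Scroll the foam texture: a slight drift along U and a fast scroll along V (down the stream).
		float2 foamUV = IN.uv_Foam;
		foamUV.x += _Time.y * _FoamSpeedU;
		foamUV.y += _Time.y * _FoamSpeedV;
		half4 foam = tex2D(_Foam, foamUV) * _FoamColor;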

4.5 Edge foam distortion

The last thing I had to do was to add some foam distortion at the top of the waterfall and splashes at the bottom.

I started by adding a single white strip at the top of the waterfall. This already looked better, but it was a bit too static, so I needed to add some distortion to simulate how rapidly the water falls down at that point.

A classmate of mine (Min, 2020) had provided the class with some free noise textures we could use while writing our shaders, so I ended up using a black-and-white texture to distort the white strip. After trying out different textures and asking for some feedback I ended up with the current effect.
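A minimal sketch of that kind of distortion, assuming hypothetical _NoiseTex and _DistortionStrength properties (again, the names in my shader differ):

		// Offset the strip's V coordinate with a scrolling greyscale noise texture to break up the straight edge.
		float2 noiseUV = IN.uv_MainTex + float2(0, _Time.y * 0.5);
		float noise = tex2D(_NoiseTex, noiseUV).r;
		float2 stripUV = IN.uv_MainTex;
		stripUV.y += (noise - 0.5) * _DistortionStrength;
		half4 strip = tex2D(_MainTex, stripUV);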

Video 6: Top foam distortion in action

At the end the only thing left for me was to create some splashes at the bottom. I was short on time, so I had to look for a quick way to create a cool splash effect. I ended up using a particle effect that spawns meshes from an ellipse shape. By adding the depth calculations of the waving water shader I was able to add some foam to the edges of the particle meshes.
These particles don't really fit the style I had in mind. One way to improve them would be to add a dissolve effect, so it would look as if they were bubbling or bursting open.
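The usual way to get such a dissolve is to sample a noise texture and discard pixels below an animated cutoff; a minimal sketch, assuming hypothetical _NoiseTex and _Cutoff properties:

		// Dissolve: discard pixels where the noise value falls below the cutoff.
		float noise = tex2D(_NoiseTex, IN.uv_MainTex).r;
		clip(noise - _Cutoff);   // animate _Cutoff from 0 to 1 over the particle's lifetime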


5. End result

The end result of my waterfall shader

In the end I am happy with the result I was able to create. Having a tool that let me create my own mesh shapes really helped with creating, testing and adjusting the shape of the water.

Having said that, there are also some parts of the shader I don't really like yet.
I think the edge foam doesn't look good enough, because you can clearly notice the difference between the 2D texture and the edges of a 3D object. I could improve that by using a render texture and a top-down camera to get the depth and apply it as a 2D texture on top of my water.

Another aspect of my waterfall I'm really missing is the ripples and the splashes. I did put particle effects on my to-don't list, but I think having some 2D water splashes coming from the contact point would really improve the overall effect.

I did learn a lot about writing shaders during this course. I feel more confident picking up shader-related tasks and issues now that I know how they work.

My next step is to learn how to write GLSL shaders for WebGL, so I can learn how to face optimization issues.


6. Sources

CartouCHe. (2012, January 26). Cubic Bézier Curve [Illustration]. E-Cartouche.Ch. http://www.e-cartouche.ch/content_reg/cartouche/graphics/en/html/Curves_learningObject2.html

Chahin, M. (2019, October 7). Castle crashers [Illustration]. Blendernation.Com. https://www.blendernation.com/headers/castle-crashers/

Holmer, F. (2019, August 20). Procedural Geometry – An Improvised Live Course [Video]. YouTube. https://www.youtube.com/watch?v=6xs0Saff940

Min, J. (2020, September 14). Collection of noise textures [Google Drive folder]. https://drive.google.com/drive/folders/1QW0JA_ChHudqvXHhqPRfm-PM3jtz1JLG?usp=sharing

Lague, S. (2019, April 5). SebLague/Ray-Marching. GitHub. https://github.com/SebLague/Ray-Marching

Minionsart. (n.d.). Minionsart game-art tips. Retrieved October 26, 2020, from https://minionsart.github.io/tutorials/

Wong, J. (2016, July 15). Ray Marching and Signed Distance Functions. Zero Wind :: Jamie Wong. http://jamie-wong.com/2016/07/15/ray-marching-signed-distance-functions/
