Ralph de Horde
Table of contents
- Introduction
- Goals and scope
- Research volumetric clouds
- Noise generation
- Volume marching
- Conclusion & Reflection
1. Introduction
For a Game Lab assignment I was walking around in the virtual world of Red Dead Redemption 2, looking for the graphics optimizations that were used. While exploring the world I of course could not miss the amazing volumetric clouds and weather system they have implemented. I am still amazed at how they managed to make virtual clouds look just like real-world clouds. That's when I knew I wanted to do research and create my own volumetric clouds.
Of course Red Dead Redemption 2 is not the only game that has figured out how to create realistic-looking volumetric clouds. Horizon Zero Dawn also has an amazing cloud system, which its developers published online (Schneider & Vos, 2015). This publication helped me understand and develop my own clouds.
2. Goals and scope
For this project I wanted to try and create volumetric clouds just like in a triple-A game. A complete weather system was out of the question, because that would go beyond the scope of just volumetric clouds. I simply wanted to achieve realistic-looking clouds.
My personal goals for this project were to gain deeper insight into shader development, so that I can apply it in my own games and projects, and to learn how volumetric clouds work and how they can look so realistic without compromising too much performance.
Initially I wanted to randomly generate clouds that are rounded on top and bottom, with shadows on the clouds. After multiple weeks of progress and research I managed to render a volume that gets carved out by multiple layers of random noise. I would have liked to get shadows working, but this was too time-consuming and perhaps too complex for the given time frame.
3. Research volumetric clouds
Before making my own volumetric clouds I needed to understand how clouds are currently rendered and what volumetric clouds can do that other techniques cannot.
One way to render clouds is to place a dome over the game world that displays a texture with clouds (Mukhina & Bezgodov, 2015, p. 699). This is very fast: with the implementation from Mukhina and Bezgodov it took at most 2 ms on a desktop with an average graphics card.
Then there are cloud impostors. These are not real clouds; just like regular impostors they use different levels of detail to display realistic-looking clouds. Like the dome technique, they only use 2D textures.
Finally there are volumetric clouds: clouds you can fly through, actual volumes with shadows and other features. These look almost like real clouds. With everyone striving for the most realistic games, volumetric clouds are definitely the most realistic option, so implementing them gives a game a boost in visual quality.
A research thesis (Haggström, 2018) shows that 3-dimensional clouds can be rendered on consumer hardware. This implementation uses both 3D and 2D textures together with mathematical functions to create realistic clouds. The main technique used to render these clouds is called ray marching: a ray is shot into the volume and steps through it, keeping track of the density it encounters and calculating the lighting.
My implementation tries to follow the research from Haggström (2018). In this implementation three textures are used to form the shape of the clouds:
- A 2D weather map, which stores in its RGBA channels where clouds can form.
- A 3D shape map, which stores the general shape of the cloud. Its red channel contains layered Perlin-Worley noise, its blue channel layered Perlin noise, and its green and alpha channels higher-frequency layered Worley noise.
- A lower-resolution 3D detail map, which stores the smaller details that change how the clouds look at the edges. This texture only uses the RGB channels, all of which contain layered Worley noise.
After generating all these textures, an algorithm called ray marching is used to sample the density of the cloud.
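To illustrate how these textures could be combined into a single density value, here is a small CPU-side Python sketch. The remap-based mixing follows the general approach from Haggström (2018), but the channel weights, the detail factor of 0.2, and the `coverage` parameter (assumed to be in (0, 1]) are illustrative assumptions, not the exact values from the thesis.

```python
def remap(v, lo, hi, new_lo, new_hi):
    """Map v from the range [lo, hi] to [new_lo, new_hi]."""
    return new_lo + (v - lo) * (new_hi - new_lo) / (hi - lo)

def cloud_density(shape_rgba, detail_rgb, coverage):
    """Combine one sample of the shape map and detail map into a density.

    shape_rgba: (r, g, b, a) from the 3D shape map (r = Perlin-Worley base).
    detail_rgb: (r, g, b) from the lower-resolution 3D detail map.
    coverage:   global coverage in (0, 1]; higher means more cloud.
    """
    r, g, b, a = shape_rgba
    # Fold the secondary noise channels into one fBm value
    # (weights 0.625/0.25/0.125 are an assumed octave falloff).
    fbm = g * 0.625 + b * 0.25 + a * 0.125
    # Use the fBm value to carve the low-frequency base shape.
    base = remap(r, fbm - 1.0, 1.0, 0.0, 1.0)
    # Apply global coverage: low coverage erases most of the cloud.
    base = max(remap(base, 1.0 - coverage, 1.0, 0.0, 1.0), 0.0)
    # Erode the cloud edges with the high-frequency detail noise.
    dr, dg, db = detail_rgb
    detail_fbm = dr * 0.625 + dg * 0.25 + db * 0.125
    density = remap(base, detail_fbm * 0.2, 1.0, 0.0, 1.0)
    return max(density, 0.0)
```

In the real shader this runs per ray-march sample on the GPU; the sketch only shows how the channels feed into one scalar density.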
4.1 Noise generation
Initially I used the CPU to generate the 3D noise textures. This worked, but with a 3D texture of 128x128x128 pixels it got really slow, so it had to change to something faster. While doing some research I found a video by Sebastian Lague (2019, 03:15-05:21) where he also renders volumetric clouds; he uses a compute shader to process all pixels in parallel. So I went looking for an optimized way to generate random noise in a compute shader. I found a GLSL implementation of Worley noise (Gonzalez Vivo, 2015), converted it to HLSL, and got it working.
An optimized way to render Worley noise on the GPU in HLSL: the texture is divided into a grid, a single random feature point is placed in every grid cell, and for each voxel the distances to the points in the neighbouring cells are compared.
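The grid trick described above can be sketched in plain Python. This is not my HLSL port, just the same idea on the CPU: one deterministic pseudo-random point per cell, and only the 3x3x3 neighbourhood needs checking to find the nearest point. The sine-based hash is a stand-in for the `random()` from The Book of Shaders; edge tiling is omitted.

```python
import math

def hash3(ix, iy, iz):
    """Deterministic pseudo-random point in [0,1)^3 for a grid cell."""
    def h(n):
        x = math.sin(n) * 43758.5453123
        return x - math.floor(x)
    n = ix * 127.1 + iy * 311.7 + iz * 74.7
    return (h(n), h(n * 1.3), h(n * 2.1))

def worley(x, y, z, cells):
    """Worley noise: distance to the nearest feature point.

    The unit cube is divided into cells^3 grid cells, each holding one
    random feature point, so only 27 cells have to be searched."""
    px, py, pz = x * cells, y * cells, z * cells
    ix, iy, iz = int(px), int(py), int(pz)
    min_dist = 1e9
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                cx, cy, cz = ix + dx, iy + dy, iz + dz
                fx, fy, fz = hash3(cx, cy, cz)
                # Feature point position in grid space.
                qx, qy, qz = cx + fx, cy + fy, cz + fz
                d = math.sqrt((px - qx) ** 2 + (py - qy) ** 2 + (pz - qz) ** 2)
                min_dist = min(min_dist, d)
    return min_dist

# Clouds typically use inverted Worley so cell centres are bright:
v = 1.0 - min(worley(0.5, 0.5, 0.5, 8), 1.0)
```

In the compute shader version, every thread evaluates one voxel of the 128x128x128 texture, which is what makes the GPU path so much faster than the CPU loop.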
4.2 Volume marching
After I had the noise textures I could start using them to render the 3D volume of the clouds. Most of this is done in the fragment shader, because for every pixel I need to check whether a cloud is in view. A single ray keeps track of how much density it comes across; if the density is greater than 0 it has encountered a cloud, and the opacity of the background changes. For every step of the ray through the cloud, another ray is shot toward the sun, which also keeps track of the density it encounters. If that sun ray accumulates a lot of density, the position it started from must be in shadow, which darkens the final color of the cloud. This algorithm can track where the cloud is dense and how much light falls onto it. Unfortunately I couldn't get the shadow rays working, so my prototype does not have shadows.
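The marching loop described above can be sketched on the CPU like this, including the sun ray that I could not get working in my shader. The step counts, step size, and absorption value are hypothetical parameters, and the exponential falloff is a Beer-Lambert-style attenuation rather than my exact shader code.

```python
import math

def march(density_at, origin, direction, sun_dir,
          steps=64, step=0.1, light_steps=6, absorption=1.5):
    """Sketch of the ray-march loop: returns (opacity, light) for one ray.

    density_at(x, y, z) samples the cloud density field.
    direction and sun_dir are assumed to be normalized.
    """
    transmittance = 1.0   # how much background still shows through
    light_energy = 0.0    # how much sunlight reaches the eye
    p = list(origin)
    for _ in range(steps):
        d = density_at(*p)
        if d > 0.0:
            # Secondary ray toward the sun: lots of density means shadow.
            sun_density = 0.0
            q = list(p)
            for _ in range(light_steps):
                q = [q[i] + sun_dir[i] * step for i in range(3)]
                sun_density += density_at(*q) * step
            light = math.exp(-sun_density * absorption)
            light_energy += light * d * step * transmittance
            # Attenuate the view ray through this dense sample.
            transmittance *= math.exp(-d * absorption * step)
            if transmittance < 0.01:  # early exit: effectively opaque
                break
        p = [p[i] + direction[i] * step for i in range(3)]
    return 1.0 - transmittance, light_energy
```

The returned opacity blends the cloud over the background, and the light energy would drive the darker-in-shadow coloring described above.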
After all this coding, my final clouds can be tweaked through the noise generation. On the camera I can change the density, scale, global coverage, and several other values.
5. Conclusion & Reflection
This project has been very interesting, and I learned a lot: noise generation, compute shaders, more about shaders in general, more Unity features, ray marching, and of course a lot about how volumetric clouds work and can be implemented in a real-world project.
The time I had for the R&D was only 4.5 weeks (excluding the resit), which was definitely not enough to finish a complete volumetric cloud system. There are many things I would want to finish if I had a bigger timespan for this project.
For example, shadows in the clouds are in my opinion mandatory if they are used in a game, but given how complex volumetric rendering is, I think the time we were given just wasn't quite enough to get them working. Besides shadows there are other features such as Beer's law, the Henyey-Greenstein phase function, in- and out-scattering, a weather map (to control where clouds can spawn), and performance optimization. In short, a whole lot of other complex features that I could implement in a future project.
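Of these missing features, the Henyey-Greenstein phase function is compact enough to sketch. It describes how much light scatters toward the viewer at a given angle; the anisotropy parameter g lies in (-1, 1), where g = 0 is isotropic scattering and positive g scatters forward (the function itself is standard, only its use here is a sketch, not code from my project).

```python
import math

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function.

    cos_theta: cosine of the angle between the light and view directions.
    g:         anisotropy in (-1, 1); 0 = isotropic, >0 = forward scattering.
    """
    g2 = g * g
    return (1.0 - g2) / (4.0 * math.pi * (1.0 + g2 - 2.0 * g * cos_theta) ** 1.5)
```

In a cloud shader this factor would multiply the sunlight contribution at each ray-march sample, brightening clouds when looking toward the sun.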
Gonzalez Vivo, P. (2015). The Book of Shaders. Retrieved January 19, 2021, from https://thebookofshaders.com/12/
Haggström, F. (2018). Real-time rendering of volumetric clouds. Retrieved from http://www.diva-portal.org/smash/get/diva2:1223894/FULLTEXT01.pdf
Mukhina, K., & Bezgodov, A. (2015). The Method for Real-time Cloud Rendering. Procedia Computer Science, 66, 697–704. Retrieved from https://www.sciencedirect.com/science/article/pii/S1877050915034286
Schneider, A., & Vos, N. (2015). The real-time volumetric cloudscapes of Horizon Zero Dawn [Presentation slides]. Retrieved from http://advances.realtimerendering.com/s2015/The%20Real-time%20Volumetric%20Cloudscapes%20of%20Horizon%20-%20Zero%20Dawn%20-%20ARTR.pdf
Sebastian Lague. (2019, October 7). Coding Adventure: Clouds [Video file]. Retrieved from https://www.youtube.com/watch?v=4QOcCGI6xOU