Terrain
In video games, terrain rendering is normally achieved with splat maps, combined with various techniques for building terrain meshes that support level of detail (LOD). LOD is crucial in games with a free-roaming camera, because it significantly improves performance. It isn't as relevant to our game, since the camera sits at a fixed angle and we always see the terrain at the same LOD. Splat mapping makes it possible to blend several textures in layers: the splat map stores the weights of the various textures, while the terrain shader blends specific textures (e.g., dirt, sand, grass) based on these weights.
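To make the weighting concrete, here is a minimal sketch of splat-map blending (illustrative Python, not our actual shader; the terrain shader does the equivalent per pixel on the GPU):

```python
import numpy as np

def blend_terrain(splat, layers):
    """Blend terrain layers per pixel using splat-map weights.

    splat:  (H, W, N) array; channel i holds the weight of layer i.
    layers: (N, H, W, 3) array of RGB textures (dirt, sand, grass, ...).
    """
    # Normalize weights so each pixel's layer weights sum to 1.
    weights = splat / np.clip(splat.sum(axis=-1, keepdims=True), 1e-6, None)
    # Weighted sum of the layer colors: the core of a splat-map shader.
    return np.einsum('hwn,nhwc->hwc', weights, layers)

# Tiny example: a 2x2 terrain blending two layers (grass and dirt).
grass = np.full((2, 2, 3), [0.2, 0.6, 0.1])
dirt = np.full((2, 2, 3), [0.4, 0.3, 0.2])
splat = np.zeros((2, 2, 2))
splat[..., 0] = [[1.0, 0.5], [0.5, 0.0]]   # grass weight
splat[..., 1] = 1.0 - splat[..., 0]        # dirt weight
print(blend_terrain(splat, np.stack([grass, dirt])))
```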
In Kingmaker, our terrain was atypical from a game dev perspective, because we wanted the world to look hand-painted. To achieve this, our artists added hand-painted details on top of the splat-mapped terrain. The ground in open locations and the flooring in dungeons were standard models created by our artists in Maya. Importing them into our engine was a real experiment on our part: we had to cut them up into small sections in order to optimize culling. We developed a special tool to do this, which took over 180 hours of work, plus another few weeks of tweaking and debugging.
Terrain in Pathfinder: Kingmaker
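As for the slicing mentioned above, the tool's internals are a topic of their own, but the core idea can be sketched simply: bin each triangle into a grid cell, so that every cell becomes a separately cullable section. (Illustrative Python; the function name and the centroid-binning strategy are assumptions for the example, not the tool's actual algorithm.)

```python
from collections import defaultdict

def slice_mesh_into_cells(vertices, triangles, cell_size):
    """Bin triangles into square grid cells by centroid (XZ plane),
    so each cell can be frustum-culled as a separate section."""
    cells = defaultdict(list)
    for tri in triangles:
        cx = sum(vertices[i][0] for i in tri) / 3.0
        cz = sum(vertices[i][2] for i in tri) / 3.0
        key = (int(cx // cell_size), int(cz // cell_size))
        cells[key].append(tri)
    return dict(cells)

# Example: two triangles landing in different 10x10 m cells.
verts = [(0, 0, 0), (1, 0, 0), (0, 0, 1),
         (25, 0, 25), (26, 0, 25), (25, 0, 26)]
tris = [(0, 1, 2), (3, 4, 5)]
print(slice_mesh_into_cells(verts, tris, cell_size=10.0))
```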
Besides this, the textures for these terrains had to be large in order to preserve as much detail as possible from our artists’ work. They were sent to the engine in their final form, as we couldn’t slice them into the layers that they really should have been composed of.
In Wrath of the Righteous, we switched to a new terrain pipeline. Our artists now create terrain meshes inside the game engine. Instead of huge baked textures, we now use splat maps and a set of small layered textures. And whenever we need to add extra detail, we use decals (we’ll get on to those later). This saves our artists time when working on terrains and textures in Maya, and also eliminates the need to slice up meshes and textures during import.
Terrain in Pathfinder: Wrath of the Righteous
Foliage
Grass, and the process of creating it, is a separate, interesting topic. When grass is animated well, it brings game locations to life, transforming them from a pile of static models into a naturalistic landscape. I talked previously about how grass worked in Kingmaker and how it interacted with characters (https://www.kickstarter.com/projects/owlcatgames/pathfinder-kingmaker/posts/2077236). But what I didn't reveal was how our artists create grass and how our grass animation works.
In Kingmaker, we copied the mesh for a single blade of grass multiple times to create a patch (approx. 2x2 meters). A location such as the Ruined Watchtower could contain several hundred of these patches. We also created different grass sets for different graphics settings, which significantly increased the memory required to load locations, as well as the overall build size. In terms of CPU–GPU interaction, this looked like normal mesh rendering, where one patch of grass = one draw call. In other words, rendering grass meant generating hundreds of draw calls, placing that burden on the CPU and the culling system.
Ruined Watchtower location: grass highlighted in green
In Wrath of the Righteous, we've developed an Indirect Rendering System to handle grass. This system was designed to shift most of the load onto the GPU. Essentially, we send a single blade of grass to the GPU along with data specifying where that blade needs to be rendered. The GPU uses this data to carry out culling and renders only the grass that is visible in the frame. On the CPU side, the whole process consists of one compute-shader dispatch for culling and a single draw call. On top of this, we no longer need to store copies of the meshes, only their positions in a given location, which has cut the amount of memory required for grass by 6–8 times!
GPU culling debug
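The compute shader itself is out of scope here, but the logic is easy to sketch: a culling pass filters the instance positions down to the visible set, and a single (indirect) draw then renders the one blade mesh at each surviving position. Illustrative Python, with the frustum test simplified to a box check:

```python
import numpy as np

def cull_instances(positions, view_min, view_max):
    """Stand-in for the GPU culling pass: keep only instance
    positions inside an axis-aligned view volume."""
    inside = np.all((positions >= view_min) & (positions <= view_max), axis=1)
    return positions[inside]

def draw_indirect(blade_mesh, visible_positions):
    """Stand-in for the single indirect draw call: one mesh,
    instanced at every visible position."""
    print(f"1 draw call, {len(visible_positions)} instances of {blade_mesh}")

# We store blade positions only, never copies of the blade mesh.
positions = np.random.uniform(-50, 50, size=(100_000, 3))
visible = cull_instances(positions,
                         np.array([-10.0, -10.0, -10.0]),
                         np.array([10.0, 10.0, 10.0]))
draw_indirect("grass_blade", visible)
```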
Our grass animation also got a major update, and this was driven by something that happened during the development of Kingmaker. On that project, we created our grass animation using animation curves that were sent to a vertex shader. Everything worked as it was supposed to and we adopted this technology into our dev pipeline. We even finished a few dozen scenes. But then we realized that there was an error in the system, and fixing this error forced us to redo our content. This was a big problem. While I was working on a fix, I started thinking about how to modify the system so we could avoid similar errors in the future.
During preproduction on Wrath of the Righteous, I researched grass animation technologies and developed a simplified physics engine based on a Verlet solver for simulating real springs. Now we won’t have to configure any more animation curves. The grass simulation runs in a compute shader on the GPU and doesn’t impact the CPU.
Foliage physics debug
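For the curious, here is roughly what one Verlet spring step looks like for a single blade tip (a deliberately simplified scalar sketch; the constants and the wind model are illustrative, and the real simulation runs per blade in a compute shader):

```python
def verlet_spring_step(pos, prev_pos, rest_pos, wind, dt,
                       stiffness=40.0, damping=0.02):
    """One Verlet integration step for a grass-blade tip.

    The spring pulls the tip back toward its rest position;
    wind is an external force; damping bleeds off velocity.
    """
    # Spring force toward the rest pose, plus wind.
    accel = stiffness * (rest_pos - pos) + wind
    # Verlet: next position from the current and previous positions.
    next_pos = pos + (1.0 - damping) * (pos - prev_pos) + accel * dt * dt
    return next_pos, pos  # new (pos, prev_pos) pair

# Example: a tip starting at rest, pushed by a gust for 60 frames.
pos, prev = 0.0, 0.0
for _ in range(60):
    pos, prev = verlet_spring_step(pos, prev, rest_pos=0.0,
                                   wind=5.0, dt=1 / 60)
print(f"tip offset after gust: {pos:.4f}")
```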
Water
We've continued to work on our water rendering since the release of Kingmaker. We were pleased with what we achieved on that game, but there's always room for improvement. In Wrath of the Righteous, we've been paying attention to a few things in particular. The first and most noticeable issue was stutter. Previously, our flowing-water animation was based on looping two textures that alternated from one to the other. The changeover between the textures was noticeable, however, and we struggled to solve the issue. Our second problem flowed (no pun intended) from the first one: disguising the stutter caused the textures to stretch.
Another problem was that waves on the water didn’t change with the direction of flow.
Stutter, stretched textures, waves not turning in direction of flow
Once again, preproduction for WotR gave me a chance to research possible solutions for these problems. At the time of writing, our complete water shader is still in development, but the majority of our animated textures are already working:
Waves correctly oriented in direction of flow, no stutter or stretching
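I won't present the shader as final while it's still in development, but one well-known technique that delivers exactly these properties is flow mapping: sample the wave texture twice, scrolled along a per-pixel flow vector with a half-cycle phase offset, and crossfade so that each sample resets only while its weight is zero. The seam is never visible, and the waves travel in the flow direction instead of stretching. A toy sketch of the idea:

```python
import math

def flow_map_sample(sample_tex, uv, flow, time, cycle=1.0):
    """Flow mapping: two phase-offset samples scrolled along the
    flow vector, crossfaded so each resets while fully faded out."""
    phase0 = math.fmod(time / cycle, 1.0)
    phase1 = math.fmod(time / cycle + 0.5, 1.0)
    uv0 = (uv[0] - flow[0] * phase0, uv[1] - flow[1] * phase0)
    uv1 = (uv[0] - flow[0] * phase1, uv[1] - flow[1] * phase1)
    # Weight hits 0 exactly when that layer's phase wraps around,
    # so the reset is invisible.
    w0 = abs(phase0 - 0.5) * 2.0
    return sample_tex(uv0) * (1.0 - w0) + sample_tex(uv1) * w0

# Example with a toy procedural "wave texture".
waves = lambda uv: 0.5 + 0.5 * math.sin(20.0 * (uv[0] + uv[1]))
print(flow_map_sample(waves, uv=(0.3, 0.7), flow=(0.1, 0.0), time=1.25))
```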
FX
Another area that we’re continuing to improve and develop is our FX system.
For Kingmaker, we developed a technology called ParticleSnap, which allowed us to stick particles to the bones of characters. Because ParticleSnap was designed as a universal tool, our artists were able to stick particles not just to characters, but to other objects or areas. This is how we created our new-style AoE effects: here, ParticleSnap attaches particles not to creatures but to the space itself, sticking them to specific points. This also allows us to create an AoE that follows dips and curves in the terrain. All of this is underpinned by the same particle system, so rendering relatively complex effects still doesn't cost much in terms of performance.
AoE made using ParticleSnap: Thorny Entanglement
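To illustrate the space-snapping idea (a deliberately simplified sketch, not ParticleSnap itself; the heightfield and function names are invented for the example): pinning each particle to the sampled surface height under it is what lets the AoE hug the ground.

```python
import math

def terrain_height(x, z):
    """Toy heightfield standing in for a sampled location surface."""
    return 0.3 * math.sin(x) + 0.2 * math.cos(z)

def snap_particles_to_terrain(particles):
    """Pin each particle's Y to the surface beneath it, so an AoE
    effect follows dips and curves in the ground."""
    return [(x, terrain_height(x, z), z) for (x, y, z) in particles]

# A ring of AoE particles around a cast point at the origin.
ring = [(math.cos(a), 0.0, math.sin(a))
        for a in (i * 2 * math.pi / 8 for i in range(8))]
for p in snap_particles_to_terrain(ring):
    print(f"({p[0]:+.2f}, {p[1]:+.2f}, {p[2]:+.2f})")
```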
One of the unresolved issues in Kingmaker was distortion for effects. We were able to refract an image, but we couldn't create several layers of that refraction. To make things worse, our refraction effects wiped out all other effects behind them. For example, we weren't able to create a hot-air effect, because all the effects behind it would simply have disappeared. This meant that our artists were extremely constrained in how they could use distortion in effects. Our new renderer, based on the SRP (Scriptable Render Pipeline), allowed us to customize our frame pipeline so that we could mix a variety of refracting layers together and lift these restrictions on our artists:
Full-screen multi-layer distortion: Death Realm effect
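Conceptually, mixing refracting layers means accumulating their UV offsets (or compositing them layer by layer) rather than doing a single screen-grab refraction that discards everything behind it. A toy 1-D sketch of the accumulation idea (illustrative only; the real pipeline works on full frames inside the SRP):

```python
import numpy as np

def apply_distortion_layers(scene, offset_layers):
    """Composite several refracting layers by accumulating their
    UV offsets before sampling the scene, so overlapping layers
    don't wipe each other out. 1-D 'screen' for simplicity."""
    n = len(scene)
    coords = np.arange(n, dtype=float)
    for offsets in offset_layers:          # e.g. heat haze, then ripples
        coords = coords + offsets
    coords = np.clip(np.round(coords).astype(int), 0, n - 1)
    return scene[coords]

scene = np.linspace(0.0, 1.0, 8)           # gradient standing in for the frame
haze = np.array([0, 1, 1, 0, 0, 0, 0, 0])  # hot-air wobble
ripple = np.array([0, 0, 0, 0, -1, -1, 0, 0])
print(apply_distortion_layers(scene, [haze, ripple]))
```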
Another new tool was created by our Lead FX Artist Victor Demishev: a fire constructor that enables us to "set fire" to an entire location in a couple of hours, without having to produce new FX content.
Fire constructor
Decals
Decals are another important addition to our updated graphics, and I previously described how they worked in Kingmaker. The main problem was mipmapping. When a decal is projected onto the screen, the decal's UVs are calculated from the scene depth. Variations in depth (especially at the edges of objects) produce variations in the UVs. And if there is a variation in the UVs, the graphics card thinks that it needs to use a less detailed mipmap. The greater the variation, the less detailed the mipmap used for that pixel.
Decal artifacts
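To see why this happens in numbers: the hardware picks the mip level from how fast the UVs change between neighboring pixels, so a depth discontinuity under a single pixel quad inflates that derivative and forces a very blurry mip. A simplified sketch of the mip-level math:

```python
import math

def mip_level(uv_a, uv_b, tex_size):
    """Mip level the GPU would pick from the UV change between
    two neighboring pixels (simplified one-derivative version)."""
    du = (uv_b[0] - uv_a[0]) * tex_size
    dv = (uv_b[1] - uv_a[1]) * tex_size
    return max(0.0, 0.5 * math.log2(du * du + dv * dv))

# Smooth surface: neighboring pixels land on nearby decal UVs.
print(mip_level((0.500, 0.500), (0.501, 0.500), tex_size=1024))  # ~mip 0
# Depth edge: the reconstructed UV jumps, so the derivative explodes
# and the hardware drops to a very blurry mip for that pixel.
print(mip_level((0.500, 0.500), (0.900, 0.500), tex_size=1024))  # ~mip 8.7
```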
I did some in-depth research into this topic, enough to fill a whole article of its own, but I don’t want to overwhelm you with technical details. The gist is that I managed to correct this problem in a way that avoided any significant impact on performance.
Fixed decal shader
We've still got a lot of work ahead of us! But we can already see how much simpler our content creation pipeline has become. You can really sense the change within our team, too. We don't have to spend days redoing a location if we decide to move a single tree. We don't have to rack our brains trying to place effects so that distortion doesn't intersect on screen. We don't have to check every terrain to make sure it was sliced correctly during import. We now have a new graphics debug system, which makes it much easier to understand what's going on in a specific pixel. And we have also integrated a new system for shadows, post-processing, HBAO, SMAA (and a whole load of other acronyms) into our new renderer. Let us know in the comments if you want to see more news and details from the technical side of Pathfinder: Wrath of the Righteous.
To arms!
Owlcats.