About


◦ Joachim Lindkvist is currently based in Stockholm, Sweden.
◦ Junior Technical Artist at Starbreeze.
◦ Formerly Technical Artist on AER at Forgotten Key.
◦ Graduate of both The Game Assembly and GSCEPT.

Professional Experience

◦ OVERKILL’s The Walking Dead: Technical Artist, Starbreeze
◦ AER Memories of Old: Technical Artist, Forgotten Key

Other Experience

◦ Made some 3D art for Primal Carnage.
◦ Started out making personal mods and maps for fun with games like Half-Life and Morrowind.

Contact Me

Email me at contact@joachimlindkvist.com

 

Making effects for AER

Introduction

In this article I want to talk about a few of the effects I worked on for AER.

Specifically, I will talk about the sky and the fog, the water shader and water effects, and how I used ray marching and 1D mapping as opposed to classic UV-mapping.

I will expand on some of the decisions and options that I considered and explain some of the techniques and how they work.

The Sky of AER

Sky flying showcase

Sky editor showcase

There are many ways of rendering skies in games. A classic technique, which is still one of the best, is to do it with images and map these images to a skybox (or sometimes a skydome). The humble skybox has been used for a long time and consists of one or several images. Sometimes they are rendered in 3D software, but they can also be painted directly by the artist. It is not uncommon to represent a sky using multiple skyboxes and cross-fade between them. The main benefit of using images for skyboxes is that they can support a lot of detail, like clouds or landscape. However, you also have to consider the drawbacks, such as the extra memory usage.

The night sky in AER was made with a skybox, but most of the sky was rendered procedurally. One of our goals for AER was a dynamic sky that transitions beautifully and seamlessly as the player moves through it. This is one of the reasons why texturing the entire sky with images was not a suitable solution and why I made a procedural sky. The procedural sky is not rendered from images but from a collection of parameters and properties that determine what the sky looks like.


Example of procedural sky in AER.


The same sky with different settings.

Properties of a Procedural Sky

A procedural sky can have a lot of properties, some more important than others. Each zone of the game gets its own set of properties. The properties are first mapped to textures; these textures are then transformed into world space and scaled to fit the size of the entire world map of the game.

The property textures projected over the world map.

Not all of the properties are texture mapped across the entire world, as that would be too expensive performance-wise, but a few of the most important ones are.

We sample these textures with the player's world position (the player's position relative to the center of the world) and view direction. The technique we use for this is typically called volume ray marching or volume ray casting (“Volume ray casting,” Wikipedia).
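To make the world-space mapping concrete, here is a minimal CPU-side Python sketch of that kind of lookup. This is an illustration only, not code from AER; the world bounds, the tiny stand-in texture and all names are assumptions.

# Illustrative sketch: map a world position into a property texture that
# has been scaled to cover the whole world map. All values are stand-ins.

def world_to_uv(world_x, world_z, world_min, world_size):
    """Normalize a world-space XZ position into 0..1 texture coordinates."""
    u = (world_x - world_min[0]) / world_size[0]
    v = (world_z - world_min[1]) / world_size[1]
    # Clamp so positions outside the map still sample the border.
    return min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)

def sample_property(texture, u, v):
    """Nearest-neighbor sample of a property texture stored as rows."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A tiny stand-in "fog density" texture covering a 1024 x 1024 unit world.
fog_density = [[0.1, 0.3],
               [0.6, 0.9]]
u, v = world_to_uv(200.0, -350.0, world_min=(-512.0, -512.0),
                   world_size=(1024.0, 1024.0))
print(sample_property(fog_density, u, v))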

What is volume ray marching?

In some cases the player might have two or more different zones in view at the same time, and the sky should reflect that. In real life we would simply see the transition between them take place; to achieve the same thing in the game we use ray marching.

Ray marching diagram.

Ray marching is a technique computed in a shader. Our ray marcher for the fog breaks down into these simplified steps:

  1. Ray casting. A ray is sent from the view once for each pixel of the screen.
  2. Sampling. Several texture samples are taken at different distances along the ray.
  3. Compositing. Each sample is composited into the final fog color.

In other words, we take several steps through the fog texture along a path, taking a sample of the fog properties at each step. We take these samples along the view vector for every pixel of the screen. To compute the final sky/fog color we then composite all of the samples we gathered. Compositing is simple in our case: we take the average value of all the samples.
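To make those three steps concrete, here is a minimal CPU-side Python sketch of a fog ray marcher of this kind. It is not the actual AER shader; the density function, step count and distances are stand-ins, and only the averaging composite matches the description above.

# Sketch of the three ray-marching steps: cast a ray, sample along it,
# composite the samples. All functions and constants are illustrative.

def fog_density(x, y, z):
    """Stand-in for sampling the world-mapped fog properties."""
    return max(0.0, 1.0 - y / 100.0)  # denser fog near the ground

def march_fog(origin, direction, steps=16, max_distance=500.0):
    # 1. Ray casting: in the real shader this runs once per screen pixel;
    # here we march a single ray from the camera.
    samples = []
    for i in range(steps):
        # 2. Sampling: take a sample at a fixed interval along the ray.
        t = (i + 0.5) / steps * max_distance
        x = origin[0] + direction[0] * t
        y = origin[1] + direction[1] * t
        z = origin[2] + direction[2] * t
        samples.append(fog_density(x, y, z))
    # 3. Compositing: in our case simply the average of all the samples.
    return sum(samples) / len(samples)

print(march_fog(origin=(0.0, 50.0, 0.0), direction=(0.0, 0.1, 0.995)))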

Aurora editor showcase


Aurora

The aurora is another example of ray marching; although it looks very different from the fog, it works on the same principles. The key difference with the aurora is the “compositing step”, the calculation that spits out the final color. The effect still breaks down into the same basic steps: ray casting, sampling and compositing.
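As a purely hypothetical illustration of how only the compositing step needs to change, the averaging used for the fog could be swapped for something like an additive, brightness-weighted blend:

# Hypothetical aurora composite (not the AER implementation): an additive
# blend that emphasizes the brightest samples instead of averaging them.

def composite_aurora(samples):
    total = 0.0
    for s in samples:
        total += s * s  # squaring favors bright bands over faint haze
    return min(total, 1.0)

print(composite_aurora([0.1, 0.8, 0.3]))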

What is 1D Mapping?

Water falls

Water in AER

Rift editor showcase

The interesting thing about the water of AER is the techniques used to map the procedural wave animations and textures. Typically when we texture 3D objects we use a technique called UV-mapping: the process of projecting a 2D image onto a 3D model.

When a 3D artist creates a 3D model they typically also create a UV-mapping for that specific model before texturing. A UV-map is a 2D coordinate system, because the textures we use are 2D images most of the time. Typically every 3D model has UV coordinates stored in it so we can apply the textures.


The water in AER is no exception and uses UV-mapping for the water streams. But there are parts of the water effect that use only a 1D mapping, as opposed to UV-mapping. Along the water's edge the waves are only 1D mapped, and this mapping is also stored in the 3D model. So the question is: how is this data stored in a 3D model? Luckily both OpenGL and DirectX let us store and use a bunch of data per vertex, like UV coordinates, normals, colors and positions.

Water showcase

Rift Tear

Rift

For the 1D water waves we store the mapping inside the vertex colors. The benefit of this is that vertex colors are editable in any 3D package, just like UV coordinates are. Instead of using the UV editor in Maya you would use the vertex paint tool. For game productions of larger scale, though, I would recommend looking for an automated solution or building purpose-built tools for the job. I hardly scratched the surface of what could be done as far as tools go, but I think the first thing to try would be tools for baking a 1D mapping into a texture, or possibly into the vertices.

Even though the wave mapping is 1D (per vertex), the result we get is a 2D effect, because the vertex colors are interpolated over the triangles. The actual geometry therefore becomes very important: the effect depends directly on the geometry of the model. This works perfectly with the polygonal art direction of AER and was the main reason to actually use this approach, which is otherwise quite cumbersome, especially without purpose-built tools.
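Here is a small Python sketch of that idea, assuming the red vertex-color channel holds the 1D coordinate. The hardware interpolates that value across each triangle, which the sketch imitates with barycentric weights; the wave function itself is a stand-in.

import math

def interpolate_1d(coords, bary):
    """coords: the 1D value stored at each of the triangle's three
    vertices (e.g. in the red vertex-color channel); bary: barycentric
    weights of a point inside the triangle."""
    return coords[0] * bary[0] + coords[1] * bary[1] + coords[2] * bary[2]

def wave_height(s, time):
    """Stand-in wave lookup along the 1D water-edge coordinate."""
    return 0.5 + 0.5 * math.sin(s * 6.2831 + time)

edge_coords = (0.0, 0.25, 0.5)   # 1D coordinates along the water's edge
bary = (0.2, 0.5, 0.3)           # a point inside the triangle
s = interpolate_1d(edge_coords, bary)
print(wave_height(s, time=1.0))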

We also used 1D mapping in the same way for a dissolve effect, again stored in the vertex colors, for both magical and water effects.
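The dissolve follows the same pattern; a hedged sketch, assuming the interpolated vertex value controls how early each part of the mesh disappears:

# Hypothetical dissolve test (illustrative names): fragments whose
# interpolated vertex value falls below the animated threshold vanish.

def dissolved(vertex_value, dissolve_amount):
    return vertex_value < dissolve_amount

# As dissolve_amount animates from 0 to 1, more of the mesh disappears.
for amount in (0.0, 0.5, 1.0):
    print(amount, dissolved(0.4, amount))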

That’s all folks, thanks for reading!

References

“Volume ray casting.” Wikipedia, https://en.wikipedia.org/wiki/Volume_ray_casting

Lighthouse


Lighthouse: The Adventure of Tiny McBoots is a student game made by a group called Polygore at The Game Assembly. I was a technical artist on Lighthouse, primarily responsible for visual effects such as water, particles and clouds. I also made a shader for the heads-up display, the lightbulb and the health meters.

 

Game Trailer

Let’s Play

Development Footage

Although I am writing this many months after the project was finished, I still have some video footage I recorded during development, and I thought it would be interesting to take a look at it. Keep in mind this footage does not represent the final quality, but a work-in-progress snapshot taken during development.

Water Shader

Flow mapping w/ vertex color

Early in development we knew we wanted water, so I implemented a flow shader. This video shows my early tests using the vertex painting tool in the Unreal Editor to paint the water geometry. The vertex colors are used as the flow vector data in the shader. The final water shader has more details, such as foam and leaves, which are also painted onto the surface in the editor. While I was primarily responsible for the shader, the displacement was implemented by fellow technical artist Sven Lind.

The displacement was externally driven by a blueprint so that it could interact with physics objects.
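To illustrate the core of the technique, here is a minimal CPU-side Python sketch of flow mapping as described: the vertex colors provide a flow vector, the UVs are pushed along it over time, and two phase-shifted samples are cross-faded so the scrolling can loop without a visible reset. This is an assumption-laden sketch, not the Lighthouse shader.

# Flow-mapping sketch. The texture lookup, flow vector and constants are
# stand-ins; the two-phase cross-fade is the essence of the technique.

def frac(x):
    return x - int(x)

def flow_uv(uv, flow, time):
    """Offset the UVs along the flow vector by the fractional time."""
    t = frac(time)
    return (uv[0] + flow[0] * t, uv[1] + flow[1] * t)

def flow_sample(sample_fn, uv, flow, time):
    # Two samples half a cycle apart...
    uv_a = flow_uv(uv, flow, time)
    uv_b = flow_uv(uv, flow, time + 0.5)
    # ...cross-faded with a triangle wave so each hides the other's reset.
    blend = abs(frac(time) * 2.0 - 1.0)
    return sample_fn(uv_a) * (1.0 - blend) + sample_fn(uv_b) * blend

water_tex = lambda uv: frac(uv[0] * 4.0)    # fake ripple pattern
flow = (0.6 * 2.0 - 1.0, 0.8 * 2.0 - 1.0)   # vertex color (0.6, 0.8) remapped to -1..1
print(flow_sample(water_tex, uv=(0.25, 0.5), flow=flow, time=1.7))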

Clouds

Another major responsibility of mine was the background sky. Although we briefly discussed it, in the end we chose to create only one sky for the entire project. The main reasoning was that the game would be too short to warrant more than one sky, which turned out to be true: the final game has only six levels, including the tutorial. The benefit of this decision was that I didn't have to spend much time supporting a dynamic sky system.

Cloud with flow-mapping

Clouds Billboards

In an effort to make a cloudy grey sky more interesting, I invested time in looking at two ways of animating the sky. The first option was a particle system. This would work nicely since I knew our camera would have a static orientation; in other words, I didn't have to worry about the clouds looking flat or weird even though they were billboards. However, I found that covering the sky with only billboards wasn't good enough, and it also created more overdraw than I was comfortable with. So I looked at another option: flow-mapping. To complete the sky I baked a cloudy cubemap and animated it with flow-mapping. Luckily I had already implemented flow-mapping in the water shader, so this did not take much time.

Tools & Scripts

OpenGL Text Editor


Walfred is an OpenGL-based C++ text editor and GUI project I am currently working on. The goal is for it to become a simple, fast, data-oriented GUI library.

  • Work in progress
  • OpenGL & C++
  • Currently using STB_TrueType

Experimental First Person 3D Editor

I created this as an experiment in building a more immersive tool for creating 3D environments directly in first person. Written in C++ with OpenGL and Bullet Physics.

  • Experiment
  • OpenGL & C++
  • Bullet Physics

Cubemap Toolset

This script was created to help render cubemaps in Maya. I made it during the game projects at The Game Assembly.

  • Render 6 cameras from origin
  • Combine Images into Horizontal Cross
  • Layout Images for NVIDIA .dds format
  • Python, PyMEL, Maya
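As a rough idea of what the camera setup in such a script looks like, here is a hedged PyMEL sketch (not the original tool) that creates the six axis-aligned, 90-degree cameras a cubemap render needs:

# Sketch: create six cameras at the origin, one per cube face.
# Rotations assume Maya's default camera, which looks down -Z.
import pymel.core as pm

FACE_ROTATIONS = {
    'posX': (0, -90, 0),
    'negX': (0, 90, 0),
    'posY': (90, 0, 0),
    'negY': (-90, 0, 0),
    'posZ': (0, 180, 0),
    'negZ': (0, 0, 0),
}

def create_cubemap_cameras():
    cameras = []
    for face, rot in FACE_ROTATIONS.items():
        cam_transform, cam_shape = pm.camera()
        cam_transform.rename('cubemap_' + face)
        # Each cube face covers exactly a 90-degree square frustum.
        pm.camera(cam_shape, edit=True, horizontalFieldOfView=90)
        pm.xform(cam_transform, translation=(0, 0, 0), rotation=rot)
        cameras.append(cam_transform)
    return cameras

create_cubemap_cameras()

Rendering each camera and assembling the cross layout would then go through whatever renderer and image tools the project uses.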

Maya Level Exporter

At The Game Assembly we used Maya as a level editor for one project, and this script exported the levels to XML files, which were loaded directly by the game engine.

  • Exports Maya scene to XML
  • Exports all transforms: position, rotation, scale
  • Exports user-defined attributes
  • Python, PyMEL, Maya
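A hedged PyMEL sketch of what such an exporter can look like (illustrative, not the original script): walk the scene transforms and write their position, rotation, scale and user-defined attributes into an XML tree.

# Sketch of a Maya-to-XML level exporter. Element names and the output
# path are assumptions for illustration.
import xml.etree.ElementTree as ET
import pymel.core as pm

def export_level(path):
    root = ET.Element('level')
    for node in pm.ls(type='transform'):
        entry = ET.SubElement(root, 'object', name=node.name())
        t = pm.xform(node, query=True, worldSpace=True, translation=True)
        r = pm.xform(node, query=True, worldSpace=True, rotation=True)
        s = pm.xform(node, query=True, relative=True, scale=True)
        ET.SubElement(entry, 'position', x=str(t[0]), y=str(t[1]), z=str(t[2]))
        ET.SubElement(entry, 'rotation', x=str(r[0]), y=str(r[1]), z=str(r[2]))
        ET.SubElement(entry, 'scale', x=str(s[0]), y=str(s[1]), z=str(s[2]))
        # User-defined attributes ride along as name/value pairs.
        for attr in node.listAttr(userDefined=True):
            ET.SubElement(entry, 'attribute',
                          name=attr.attrName(), value=str(attr.get()))
    ET.ElementTree(root).write(path)

export_level('level01.xml')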

Texture Batch Exporter

I made this export script so that the artists could quickly batch export several textures from Photoshop. This was also created while I was at The Game Assembly.

  • Export group to texture
  • Combine & shuffle texture channels
  • Batch save multiple textures
  • Saves to DDS format
  • JavaScript, Photoshop
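The original tool is a Photoshop JavaScript, but the channel shuffle at its heart can be sketched in Python with Pillow (an assumption for illustration; the real exporter works inside Photoshop and saves DDS):

# Channel-shuffle sketch: pack a grayscale mask into the alpha channel
# of a color texture. File names are placeholders.
from PIL import Image

def pack_mask_into_alpha(color_path, mask_path, out_path):
    color = Image.open(color_path).convert('RGB')
    mask = Image.open(mask_path).convert('L')
    r, g, b = color.split()
    Image.merge('RGBA', (r, g, b, mask)).save(out_path)

pack_mask_into_alpha('diffuse.png', 'opacity.png', 'diffuse_packed.png')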

3D Art

Primal Carnage


Ammo model, Primal Carnage, 2010.

Prod model, Primal Carnage, 2010.

Vending machine model, Primal Carnage, 2010.

Personal


Chair and piano, personal work, 2013.


© 2010 - 2019 by Joachim Lindkvist. All rights reserved. No parts of the artwork or text presented on this website may be reproduced, copied or used under any circumstances, unless stated otherwise.