ANDERSON - Development Update #1

 

Artistic endeavors in ANDERSON

We are progressing smoothly towards the Alpha, with the majority of prototyping wrapping up in the next couple of weeks.

We want to share the thoughts of our Lead 3D Artist, Karl Schecht, as he brings the world of ANDERSON from concept to reality.


Hello! My name is Karl Schecht. I am the sole artist behind the environment of ANDERSON. I’m really excited to talk about the process of creating the environment, and some of the tools I’ve used. Let’s dive in!

Tools

These days, my workflow centers on two programs for creating convincing environments, the first of which is Blender. This fantastic, slightly archaic, but completely free and open-source software shares many of the features of the Autodesk programs, but is far more lightweight and can even run off of a flash drive. The crux, the crème de la crème, the secret (whatever you want to call it) of Blender lies in its keyboard shortcuts. Once you have those down, Blender's speed is unmatched. High-poly modeling, low-poly modeling, and UV unwrapping are the three processes I use most.

The second most important piece of software is Substance Painter, though that wasn't always the case. A few years ago, one of my greatest breakthroughs as an artist was discovering high-to-low-poly texture baking, which unlocked the ability to produce high-quality, beautiful game assets. From then on, I used a combination of xNormal and Photoshop to get good results, but something was still missing. It wasn't until I landed a full-time studio job in Los Angeles as a technical artist that my coworker and good friend Joey Dhindsa introduced me to the wonderful world of Substance. I had tried Substance Painter before, but I was turned off by its complexity and never really picked it up again until my time on the West Coast. Since then, I've come to use Substance almost as much as Blender on a daily basis. The texture baking in Substance is powerful and easy, but the real cream of the crop, at least for me, is the mask generators. That, coupled with a robust triplanar fill mapping system, makes Substance a top-notch program in my book.

Onto Unity

When it comes to creating environments in Unity (or any game engine, for that matter), I'm faced with making a lot of decisions early on about how I will approach lighting, as it tends to be one of the heaviest hitters in both overall look and performance. In fact, I tend to focus on setting up a fairly detailed first lighting pass before I move on to detailing the environment; from there, I make further lighting passes as the assets progress. I knew immediately that I would want at least a mixed lighting setup in conjunction with a baked, box-projected reflection probe, mainly for performance reasons, since Anderson is a VR title. This setup provides fairly realistic results without having to rely on approximated lighting from light probes.


(Early screenshot of the Anderson room, with base textures and a basic lighting setup.)

Realism is a primary goal in most environments I make, as I feel it bridges the gap between the real and the virtual just a little more. A major factor in achieving that realism is heavy use of Quixel Megascans textures. These textures are constructed using photogrammetry and converted into seamless textures available for download. Quixel also offers photoscanned 3D models, which I employed in Anderson as well.

Optimization!

Creating Anderson's overall look hasn't been too challenging; it has actually been the most fun part of my work on this game so far. The real challenge was getting it to run comfortably in VR. With non-VR games, a constant frame rate is usually not a hard requirement, and frame rates as low as 30 FPS are acceptable so long as they stay fairly consistent. All of that changes dramatically in VR. Due to the inherent technical nature of VR, optimization is absolutely critical: you're now dealing with stereo rendering, meaning the scene is rendered once per eye, and the game has to run at a MINIMUM of 90 FPS. Any lower, and the user will quickly experience headaches, and even nausea. To deliver the best possible experience, I used three major optimization techniques that have so far proven very effective.
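To put that target in concrete terms, the frame-time budget is just the reciprocal of the frame rate. A quick back-of-the-envelope sketch (plain arithmetic, not engine code):

```python
# Frame-time budget in milliseconds: budget = 1000 / fps.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 90):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 90 FPS the whole frame (culling, physics, and both eyes' rendering) has to fit in roughly 11 ms, versus about 33 ms for a 30 FPS flat-screen game, which is why every technique below matters.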

LODs

The first and possibly the most potent optimization technique I employed is LODs, or Level of Detail models. The idea is simple: keep multiple versions of the same 3D model and render one depending on the model's distance from the player camera. Take this tree, for instance. The base model is about 5,000 triangles. When I move 20 meters away from it, however, the base model culls out and is replaced by the first LOD, which is about 1,800 triangles. Move even further away and LOD 2 takes over, and so on. See this GIF for a visual reference.

Note the LOD indicator beneath the tree, and how the shape of the tree changes due to the model being lower poly.
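The selection logic behind LODs can be sketched in a few lines. This is a toy model, not Unity's implementation; the 5,000 and 1,800 triangle counts echo the tree example, while the 40-meter threshold and LOD2 count are invented for illustration:

```python
# Each entry: (distance at which this LOD takes over, name, triangle count).
LODS = [
    (0.0, "LOD0", 5000),   # full-detail base model
    (20.0, "LOD1", 1800),  # swaps in 20 m from the camera
    (40.0, "LOD2", 600),   # hypothetical third level
]

def select_lod(distance: float):
    # Walk the list and keep the last LOD whose threshold we've passed.
    chosen = LODS[0]
    for entry in LODS:
        if distance >= entry[0]:
            chosen = entry
    return chosen

print(select_lod(5.0)[1])   # LOD0
print(select_lod(25.0)[1])  # LOD1
```

In Unity the same thresholds live on a LOD Group component, expressed as screen-size percentages rather than raw distances, but the effect is the same: fewer triangles the further away the object is.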

Occlusion Culling

The second technique I used, which has proven extremely helpful, is occlusion culling (documentation: https://docs.unity3d.com/560/Documentation/Manual/OcclusionCulling.html).

The idea of this technique is to shut off rendering of objects that are occluded by other objects. For example, in the outdoor area of Anderson, when you enter a building, the environment outside (trees, terrain, rocks, etc.) gets culled out, a.k.a. "turned off," until it is no longer occluded by the walls. See the GIF here for an example of what I'm talking about:

Note how the scene is broken up into chunks that begin to shut off as objects leave the camera's frustum, and when I enter the lighthouse.
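Conceptually, the baked occlusion data acts like a lookup table: for each region the camera can occupy, only a precomputed set of objects can possibly be visible. A heavily simplified sketch of that idea (the region and object names are invented; Unity's real system works on baked cells and portals rather than named rooms):

```python
# Toy potentially-visible-set (PVS): for each camera region, the only
# objects that can ever be seen from it. Everything else is skipped.
PVS = {
    "outdoors": {"trees", "rocks", "terrain", "lighthouse_exterior"},
    "lighthouse": {"lighthouse_interior", "stairs"},
}

def visible_objects(camera_region: str) -> set:
    # Objects outside this region's set are culled ("turned off").
    return PVS.get(camera_region, set())

print(visible_objects("lighthouse"))
```

The expensive visibility analysis happens offline at bake time; at runtime the renderer only pays for a cheap lookup, which is what makes the technique such a good fit for VR budgets.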

Single Pass Rendering

I won't get too technical with this (here's the documentation if you're interested: https://docs.unity3d.com/Manual/SinglePassStereoRendering.html), but it was also a major help in keeping the frame rate consistently high. Single-pass rendering reduces much of the legwork in stereo rendering by having Unity render certain components of a scene once instead of twice.
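As a rough mental model, you can think of it as sharing per-frame work between the two eyes instead of repeating it. The numbers below are arbitrary illustrative units, not measurements from Unity:

```python
# Deliberately simplified cost model of stereo rendering.
CULL = 2.0      # scene traversal / culling
DRAW = 4.0      # draw-call submission
PROJECT = 0.5   # cheap per-eye projection and viewport setup

def multi_pass_cost() -> float:
    # Naive stereo: every stage repeated for the left and right eye.
    return 2 * (CULL + DRAW + PROJECT)

def single_pass_cost() -> float:
    # Single pass: traversal and submission shared, only the
    # per-eye projection work is still doubled.
    return CULL + DRAW + 2 * PROJECT

print(multi_pass_cost(), single_pass_cost())  # 13.0 7.0
```

The real savings depend on the scene, but the shape of the win is the same: the shared work dominates, so rendering "once instead of twice" buys back a large slice of that 11 ms frame budget.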

Thank you for reading! I tried to keep this as short as possible and to only go over the things I felt were most important about my adventures in creating the environment for Anderson: Remastered. I hope you enjoyed reading :). Feel free to email me questions at any time, and take a look at my ArtStation for other work I've done. Bye!

karlschecht@gmail.com

karlschecht.artstation.com

-Karl Schecht

Art Lead