Like many people, I felt a great deal of existential dread at the rise of AI technology. I’ve spent years cultivating a set of skills and an expertise in a number of fields and disciplines. One weekend, I decided to spend some time with these tools (namely Midjourney) to see how I could use AI as a tool to enhance[^1] what I might be able to make on my own.

This project was always intended to be an experiment more than a full game. Still, several ideas came up while working on it that I would love to expand into a full game someday.

# Picking a setting

I’m a fan of brutalist architecture and am fascinated by its application through the latter half of the 20th century, particularly in Eastern Europe as an aggressively utilitarian response to the needs of the USSR. If I had to guess, I think the appeal has something to do with the juxtaposition of the promise of a “utopian society” with a design language that is so starkly dystopian.

I decided to create a BioShock-style game that takes place in an underground Soviet-style city. Given where the generative models were at the time, I felt they could be a good fit for creating world details (namely propaganda posters) at a scale and quality I couldn’t have achieved if I were to do it all by hand.

One thing I discovered was that AI was particularly good for creating “mood board” style pieces that conveyed a tone or offered different color palettes to draw from. This struck a nice balance: the AI provided a starting point without over-directing me.

# Modeling the environment

To kick things off, I started experimenting with different approaches to creating the entirely interior environment. I ran a number of initial experiments using boolean modifiers but found the end result too messy and difficult to work with.

Geometry nodes had recently been added to Blender. I had seen various posts floating around on Twitter and Reddit showcasing what people were doing with them and thought they might be a good fit for what I was trying to do. Within an hour I had the start of a solution that I would continue to extend over the course of the project.

I could edit basic mesh objects (left) however I wanted and my node setup would produce a single mesh (right) almost instantly.

My node setup allowed me to add meshes to one collection; it would union them together, invert the result, and then apply planar decimation. Eventually I added support for a second collection whose meshes were subtracted from the result.
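Geometry nodes are visual, so there’s no code to show directly, but the same pipeline can be approximated with Blender’s Python API as a modifier stack. Here’s a minimal sketch, assuming an “Additive” collection; the collection name, and flipping normals as the “invert” step, are my illustrative stand-ins, not the actual node setup:

```python
import bpy

# Sketch: union every mesh in an "Additive" collection into one shell,
# flip it inside-out, and dissolve coplanar faces. This approximates the
# union -> invert -> planar-decimate steps of the geometry-node setup.
additive = bpy.data.collections["Additive"]  # hypothetical collection name
meshes = [o for o in additive.objects if o.type == 'MESH']

# Start from a copy of the first mesh and boolean-union the rest into it.
base = meshes[0].copy()
base.data = meshes[0].data.copy()
bpy.context.scene.collection.objects.link(base)
for other in meshes[1:]:
    mod = base.modifiers.new(name="Union", type='BOOLEAN')
    mod.operation = 'UNION'
    mod.object = other

# Planar decimation dissolves the coplanar faces the booleans leave behind.
dec = base.modifiers.new(name="Planar", type='DECIMATE')
dec.decimate_type = 'DISSOLVE'  # "Planar" mode in the UI
dec.angle_limit = 0.0873        # ~5 degrees

# Apply the stack, then flip normals as one simple way to "invert" the
# solid union into an interior space.
bpy.context.view_layer.objects.active = base
for mod in list(base.modifiers):
    bpy.ops.object.modifier_apply(modifier=mod.name)
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.flip_normals()
bpy.ops.object.mode_set(mode='OBJECT')
```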

# Locking down an aesthetic

I decided to give the game a “modernized low fidelity” look by using PBR texture sets but with nearest-neighbor filtering. I also used a much lower texel density, resulting in more obvious pixelation. At this point I was modeling the world in “chunks”, each with its own 4k PBR texture set.

The PBR effect really comes through on shiny surfaces.

To further enhance the look, I decided not to use repeating textures for the walls, floors, and so on. Instead, I allowed myself to mask and texture each chunk individually as its own Substance Painter project. This made it possible to add unique details to each chunk without decals, all within a single texture set.

This style looks significantly better in motion.

# Integration of AI generated imagery

My decisions around texturing gave me a large amount of leeway in integrating the AI-generated imagery into the world, especially because the AI models at the time still struggled with coherent details, particularly faces and hands when there was more than one person in an image.

In-game result (top) with the AI images used (bottom).

To my surprise, Midjourney’s --tile flag was reasonably good at creating patterned textures that felt like they could have existed in the office of a Soviet bureaucrat. You can see them used in various parts of the screenshots, like the walls and carpet.

The mural was created by using the black & white image as a height map.
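Deriving surface relief from a black & white image is a standard trick; in case the idea is unfamiliar, here is a minimal NumPy/Pillow sketch of turning such a height map into a normal map. The filenames and strength value are illustrative, and the actual mural was displaced inside my texturing tools rather than with a script:

```python
import numpy as np
from PIL import Image

# Sketch: derive a tangent-space normal map from a grayscale height map.
# Filenames and strength are illustrative.
height = np.asarray(Image.open("mural_bw.png").convert("L"), dtype=np.float32) / 255.0

strength = 4.0                # exaggerates the relief
dy, dx = np.gradient(height)  # slope of the height field per pixel
normal = np.dstack((-dx * strength, -dy * strength, np.ones_like(height)))
normal /= np.linalg.norm(normal, axis=2, keepdims=True)

# Pack from [-1, 1] into the usual [0, 255] RGB normal-map encoding.
rgb = ((normal * 0.5 + 0.5) * 255).astype(np.uint8)
Image.fromarray(rgb).save("mural_normal.png")
```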

# Putting together a player character

I really enjoy view models and their animations. I prefer to create view models the classic (and correct) way: building a proper first-person rig with a camera parented to the head bone. When texturing non-environmental objects, I would work with a 2k texture set and then downscale it to either 128x128 or 256x256 to achieve the pixelated look.

Details like the stitching on the gloves were largely lost, and some details (like the wrinkles) had to be scaled up to come through in the final version.
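The downscale step itself is trivial; here is a sketch of the equivalent operation with Pillow, with illustrative filenames and sizes. An area-average downscale keeps only bold shapes (which is exactly why the stitching vanished and the wrinkles needed exaggerating), while the crisp pixel edges come from nearest-neighbor filtering at render time:

```python
from PIL import Image

# Sketch: author at 2k, then crush down to the target resolution.
# Fine detail like stitching averages away; only bold shapes survive.
hi = Image.open("gloves_albedo_2k.png")           # hypothetical 2048x2048 source
lo = hi.resize((128, 128), Image.Resampling.BOX)  # area-average downscale
lo.save("gloves_albedo_128.png")                  # sampled with nearest filtering in-engine
```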

It wouldn’t be a first-person “shooter” if my character didn’t have a weapon. I’m personally a fan of the Kriss Vector’s form factor and decided to use it as a basis. Since the world I was creating was essentially an alternate-reality version of sometime between the late ’60s and early ’90s, I looked at several Cold War weapons to get a feel for the design language of the time.

The result is essentially a Kriss Vector as if it had been made from pressed steel with wood furniture.

# Fleshing out the story

I had created a loose story involving tears in reality caused by a type of “quantum fungus” discovered within the underground city[^2]. Anyone who consumed it could bring their deepest desires into reality, but only for a few minutes.

Over-consumption eventually results in the fungus overtaking the person, turning them into a permanent rift to some alternate reality. Once this happens, the spores continue to spread, altering more of the world around them.

In areas where the fungus hasn’t taken over yet, safe zones have been established under strict security. You take on the role of a “retriever”, someone tasked with going into the infected areas to retrieve items, relay messages, and sometimes even people. However, you can only stay in the infected areas for a short period before you begin to unravel across different realities.

(Unfinished) The commandant is one of the NPCs in the safe zone you could interact with for missions. The character’s face was generated using AI.

# What crosses over from the other side?

Next, I needed something that would be a threat to the player. It stands to reason that alternate realities could have all manner of horrific things wandering around. Years ago, when I was trying to learn ZBrush, I had created a monster inspired by Silent Hill and a cheek retractor.

It was a design I was particularly proud of and thought it would be a good candidate for the kinds of creatures that might wander into this game’s reality.

# How long did this all take?

Although I had set out to spend only a weekend on it, I wound up spending closer to two weeks in total creating everything. The biggest slowdown came from how sluggish some of the Substance Painter files became: the level chunks grew rather large, and the texture resolution allowed a lot of custom detail to be added to each chunk.

# My brief exploration into “mega texture” solutions

The software slowdown, along with the organizational overhead of pre-planning each level chunk, led me to explore ways I could alter the process of creating a world without tiling textures.

The approach I considered was a custom Blender add-on that would allow me to texture items individually (and at full resolution) and place them in a scene. I could also use various shader-node techniques to add dynamic alterations, place decals, and so on. Blender would then attempt to bake and output a single massive texture set for the entire scene.
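Blender’s bake can be driven from Python. Below is a heavily simplified sketch of the bake step, assuming the Cycles engine and that every object already has its UVs packed into a unique region of the shared atlas (which is the hard part the real add-on would have had to manage); all names are hypothetical:

```python
import bpy

# Sketch: bake every selected mesh's material color into one shared atlas.
# Assumes Cycles and that each mesh's UVs are already packed into a
# unique region of the atlas.
atlas = bpy.data.images.new("scene_atlas", width=8192, height=8192)

for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    for slot in obj.material_slots:
        nodes = slot.material.node_tree.nodes
        # Baking writes into each material's active Image Texture node.
        tex = nodes.new('ShaderNodeTexImage')
        tex.image = atlas
        nodes.active = tex

# use_clear=False so each object adds to the atlas instead of wiping it.
bpy.ops.object.bake(type='DIFFUSE', pass_filter={'COLOR'}, use_clear=False)

atlas.filepath_raw = "//scene_atlas.png"
atlas.file_format = 'PNG'
atlas.save()
```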

Then it just became a matter of creating a shader in Godot that could sample the correct part of the texture for each object in the viewport[^3]. As I moved from conceptualization to implementation, I realized it would be a larger undertaking than I was willing to commit to the project at the time.
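The real shader would live in Godot’s shading language, but the core lookup is just a rectangle remap, sketched here in Python with illustrative values:

```python
# Sketch: the heart of mega-texture sampling is remapping an object's local
# UV into its assigned rectangle of the atlas. In the real project this
# would run per-fragment in a Godot shader; values here are illustrative.
def atlas_uv(u, v, rect):
    """rect = (x, y, w, h): the object's region in 0..1 atlas space."""
    x, y, w, h = rect
    return (x + u * w, y + v * h)

# An object packed into the top-left quarter of the atlas:
print(atlas_uv(0.5, 0.5, (0.0, 0.0, 0.25, 0.25)))  # -> (0.125, 0.125)
```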


[^1]: As opposed to AI as a replacement.

[^2]: Why everyone is living underground has yet to be determined.

[^3]: The concept is called “virtual texturing” and is the technique that was created for, and popularized by, id Software’s Rage.