All Moving Sprites in One Draw Call!!?

I never thought it was possible, but yes, it is. I’ve tried it and used it in Academia, with the help of somebody else’s script of course. It’s not without drawbacks.

Optimizing multiple moving sprites is, I think, the hardest optimization to pull off. Even if the sprites already share the same material, they still don’t batch consistently. And when they do batch, there are so many of them that dynamic batching itself takes more CPU cycles and bogs down the frame rate. Most of the time, the batched sprites are scattered all around anyway.

[Image: Moving sprites are batched like this]

While browsing the internet for optimization ideas, I stumbled upon the good old Unify Community wiki. I hadn’t visited the site in a while, and I saw that there are a lot of new shared scripts. I went through each one and found this SpriteManager by Brady Wright (thank you so much!). It’s actually an old script. It claims that it can draw multiple moving objects in one draw call. A bold statement, I thought. I went ahead and tried a simple test. The usage is a bit rough, but I was able to roll with it. To my amazement, the script works. I spent the next two days integrating the gem into our game. Now, all the moving agents in the game are rendered in a single draw call.

[Image: OneDrawCall]

I took a good look at the code to understand how it works. It maintains a single mesh for all the sprites requested from the manager. In LateUpdate(), the manager transforms each vertex using the Transform of the sprite’s owner, then reassigns the transformed vertices to the mesh, updating the positions of the quads. The engine just renders this one big mesh. It’s simple and clever.
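
To illustrate the idea, here is my own rough sketch of that technique. This is not the actual SpriteManager code; class and method names are just for illustration.

using System.Collections.Generic;
using UnityEngine;

// My own rough sketch of the idea; this is not the actual SpriteManager code.
// One mesh holds a quad per registered sprite and the vertices are rebuilt
// from each owner's Transform every LateUpdate().
[RequireComponent(typeof(MeshFilter))]
public class SingleMeshSpriteManager : MonoBehaviour {

    private class Entry {
        public Transform owner;      // The object that "owns" this quad
        public Vector3[] localQuad;  // The quad's four corners in the owner's local space
    }

    private readonly List<Entry> entries = new List<Entry>();
    private Mesh mesh;
    private Vector3[] vertices; // Cached so we don't allocate a new array every frame

    public void Add(Transform owner, Vector3[] localQuad) {
        this.entries.Add(new Entry { owner = owner, localQuad = localQuad });

        // Rebuild the combined mesh (UVs and colors omitted for brevity)
        this.vertices = new Vector3[this.entries.Count * 4];
        int[] triangles = new int[this.entries.Count * 6];
        for (int i = 0; i < this.entries.Count; ++i) {
            int v = i * 4;
            int t = i * 6;
            triangles[t] = v; triangles[t + 1] = v + 1; triangles[t + 2] = v + 2;
            triangles[t + 3] = v; triangles[t + 4] = v + 2; triangles[t + 5] = v + 3;
        }

        this.mesh = new Mesh();
        this.mesh.vertices = this.vertices;
        this.mesh.triangles = triangles;
        GetComponent<MeshFilter>().mesh = this.mesh;
    }

    private void LateUpdate() {
        if (this.mesh == null) {
            return; // Nothing registered yet
        }

        for (int i = 0; i < this.entries.Count; ++i) {
            Entry entry = this.entries[i];
            int start = i * 4; // 4 vertices per quad

            for (int v = 0; v < 4; ++v) {
                // Transform each corner to world space using the owner's Transform
                this.vertices[start + v] = entry.owner.TransformPoint(entry.localQuad[v]);
            }
        }

        // Reassigning the vertices moves the quads inside the one big mesh
        this.mesh.vertices = this.vertices;
    }
}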

An obvious drawback is that it doesn’t do culling. Objects outside the camera’s view are still rendered, and the vertex transformations are still computed for sprites that should have been culled. This isn’t a real issue for us because most players keep the camera zoomed out most of the time. We expect players to play this way because one would naturally want to see how their whole school is doing. Culling would not contribute much.

Eventually, of course, we’ll probably hit a number of sprites where this system starts to bog down. We can always implement culling, or optimize the code ourselves, when we hit that number. Until then, we’re letting it be. It’s already good.
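
If we ever do need culling, a cheap first pass could be to skip the transform work for owners that fall outside the camera’s view. Something like this hypothetical check, assuming an axis-aligned 2D orthographic camera (this is not part of SpriteManager):

using UnityEngine;

// Hypothetical culling check; not part of SpriteManager. Assumes an axis-aligned
// 2D orthographic camera. Owners that fail this check could simply be skipped
// in LateUpdate() so their vertices are never transformed.
public static class SpriteCulling {

    public static bool IsVisible(Camera camera, Vector3 worldPosition, float padding) {
        float halfHeight = camera.orthographicSize + padding;
        float halfWidth = camera.orthographicSize * camera.aspect + padding;

        Vector3 offset = worldPosition - camera.transform.position;
        return Mathf.Abs(offset.x) <= halfWidth && Mathf.Abs(offset.y) <= halfHeight;
    }
}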


The Easiest Mod We Could Support

I have no experience yet when it comes to making games with modding support; Academia is my first. To test the waters, I made the easiest type of mod: an image replacement, or “reskin”. In this post, I will describe our setup and how the modding process works.

In my previous post, I described how we use StreamingAssets to load our image assets. This is very different from using sprites inside Unity, where you drag and drop a sprite asset onto a SpriteRenderer. What we do is load images using Texture2D.LoadImage(), pack them into one big atlas, and assign them to our custom quads, all at runtime. The rationale, of course, is modding support. We want to be able to load and use images that are specified by a modder.
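
For reference, loading a single image from StreamingAssets at runtime is roughly this. It’s a minimal sketch; the actual loader does more error handling and feeds the result to the atlas packer, and the class name here is just for illustration.

using System.IO;
using UnityEngine;

// Minimal sketch of loading one PNG from StreamingAssets at runtime.
// The real loader does more error handling and feeds the result to the atlas packer.
public static class StreamingImageLoader {

    public static Texture2D Load(string relativePath) {
        string fullPath = Path.Combine(Application.streamingAssetsPath, relativePath);
        byte[] bytes = File.ReadAllBytes(fullPath);

        Texture2D texture = new Texture2D(2, 2, TextureFormat.ARGB32, false);
        texture.LoadImage(bytes); // Resizes the texture to the image's actual dimensions
        return texture;
    }
}

// Usage
Texture2D chairTexture = StreamingImageLoader.Load("BaseGame/Tiles/Objects/ChairStudentDown.png");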

We maintain an XML file that acts as a database mapping IDs to image paths under StreamingAssets. It looks like this:

<TileSpriteMapping>
    ...
    <!-- Student Chair -->
    <Entry id="StudentChairDown" streamingPath="BaseGame/Tiles/Objects/ChairStudentDown.png" />
    <Entry id="StudentChairUp" streamingPath="BaseGame/Tiles/Objects/ChairStudentUp.png" />
    <Entry id="StudentChairLeft" streamingPath="BaseGame/Tiles/Objects/ChairStudentLeft.png" />

    <!-- Standard Chair -->
    <Entry id="ChairDown" streamingPath="BaseGame/Tiles/Objects/ChairStandardDown.png" />
    <Entry id="ChairUp" streamingPath="BaseGame/Tiles/Objects/ChairStandardUp.png" />
    <Entry id="ChairLeft" streamingPath="BaseGame/Tiles/Objects/ChairStandardLeft.png" />

    <!-- Kitchen Sink -->
    <Entry id="KitchenSinkDown" streamingPath="BaseGame/Tiles/Objects/KitchenSinkDown.png" />
    <Entry id="KitchenSinkLeft" streamingPath="BaseGame/Tiles/Objects/KitchenSinkLeft.png" />
    <Entry id="KitchenSinkUp" streamingPath="BaseGame/Tiles/Objects/KitchenSinkUp.png" />
    ...
</TileSpriteMapping>

We then have a manager that parses this file and handles the loading and delivery of these assets. We can easily get a Sprite, or a Texture and its UV coordinates, by using the id of an item in this XML file.

renderer.sprite = SpriteManager.Request("KitchenSinkDown");
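
Here’s a hedged sketch of the parsing side of such a manager. The real one also loads the textures, packs them, and hands out Sprites; the class and method names here are just for illustration.

using System.Collections.Generic;
using System.Xml;

// Hedged sketch of the parsing side only; names are for illustration.
// The real manager also loads the textures, packs them, and hands out Sprites.
public static class TileSpriteMappingParser {

    public static Dictionary<string, string> Parse(string xmlPath) {
        Dictionary<string, string> mapping = new Dictionary<string, string>();

        XmlDocument document = new XmlDocument();
        document.Load(xmlPath);

        foreach (XmlNode entry in document.SelectNodes("/TileSpriteMapping/Entry")) {
            string id = entry.Attributes["id"].Value;
            string streamingPath = entry.Attributes["streamingPath"].Value;
            mapping[id] = streamingPath; // Later entries override earlier ones
        }

        return mapping;
    }
}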

To change a particular image, the modder only has to provide their own rendition of the image and an XML file with the same format as ours. The difference is that the base folder is the root folder of the mod instead of StreamingAssets. The modder doesn’t have to specify all entries, only the ones they wish to change. For example, if a modder would like to change the kitchen sink’s image, this could be their XML mapping:

<TileSpriteMapping>
    <Entry id="KitchenSinkDown" streamingPath="Images/Objects/KitchenSinkGraffitiDown.png" />
    <Entry id="KitchenSinkLeft" streamingPath="Images/Objects/KitchenSinkGraffitiLeft.png" />
    <Entry id="KitchenSinkUp" streamingPath="Images/Objects/KitchenSinkGraffitiUp.png" />
</TileSpriteMapping>

The modder can organize their images however they like; the custom images are simply expected at the specified paths. The game parses our default base mapping first. Then, during mod installation, we look for this XML file in every mod folder, parse it if it exists, and use its entries to replace the entries in our default mapping. When sprite packing executes, the image paths specified by the mod are the ones that get loaded. And that’s it. The modder can easily replace any image in the game as long as it is in the XML mapping.
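
The “replace the entries” step is essentially just a dictionary merge. Here’s a rough sketch, assuming the mod’s mapping file is named TileSpriteMapping.xml (an assumption) and reusing the hypothetical parser from above:

using System.Collections.Generic;
using System.IO;

// Rough sketch of the override step; the mapping file name and class names are assumptions.
public static class ModMappingMerger {

    public static void ApplyMods(Dictionary<string, string> baseMapping, IEnumerable<string> modFolders) {
        foreach (string modFolder in modFolders) {
            string mappingPath = Path.Combine(modFolder, "TileSpriteMapping.xml"); // Assumed file name
            if (!File.Exists(mappingPath)) {
                continue; // This mod doesn't replace any images
            }

            Dictionary<string, string> modMapping = TileSpriteMappingParser.Parse(mappingPath);
            foreach (KeyValuePair<string, string> entry in modMapping) {
                // Only the entries the mod specifies are overridden; everything else keeps the base image.
                // Mod paths resolve under the mod's folder instead of StreamingAssets.
                baseMapping[entry.Key] = Path.Combine(modFolder, entry.Value);
            }
        }
    }
}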

[Image: Sample image-replacement mods that I have tested]

Why XML?

It’s the most readable and easiest-to-understand text format, in my opinion. Non-programmers can easily understand it. Most importantly, it supports comments. I plan to distribute a sample mod folder that modders can just download and build from. With XML, I can write some documentation within the file itself as comments.

Another benefit is that it saves us from making a modding editor application. A custom editor would be nice, but in this case it’s not really needed. We want to release this simple modding environment as soon as we can.

Unity Game Considerations for Modding Support

Last week, I began working on modding support for Academia. I have to pause that work for at least this week in favor of bug fixing; I vastly underestimated the amount of initial work needed to make it happen. For this post, I would like to add to the very few resources on this topic and hopefully help others prepare their Unity game better.

From the very beginning, we decided to include modding support. It’s important to make this decision before making the game, as it affects the implementation details. In our case, we use StreamingAssets to load our images and text data in the game. This means we load the assets as if they are external files: they are not packed in Unity’s own format when the game is built, and the files remain as is in the StreamingAssets folder. The reason for doing this is that we want to be able to load files from other folders specified by modders.

As a lesson, your game should be able to load assets from other folders, not just from StreamingAssets. I had assumed that mod files from Steam Workshop could be downloaded into the StreamingAssets folder and that I could load them from there. Nope! Steam downloads the mod files from Steam Workshop into folders that are maintained in the Steam library folder. The mod folder is usually in {Steam Library}/{Steam Game ID}/{Workshop ID}. Because of this, your game should be able to accept a different base folder and load assets from that specified folder. For example:

// Instead of this:
private Texture2D Load(string assetPath) {
    string fullPath = Path.Combine(Application.streamingAssetsPath, assetPath);
    ...
}

// Use something like this
private Texture2D Load(string baseFolder, string assetPath) {
    string fullPath = Path.Combine(baseFolder, assetPath);
    ...
}

// Use it like this to load assets from Application.streamingAssetsPath
Texture2D texture = Load(Application.streamingAssetsPath, "SomeSprite.png");

In other words, your asset management system should be flexible enough to load from other folders.

Another difficulty I had was the implementation of the Steam Workshop API itself. We are using Steamworks.NET to interface with Steam. While the Steam Workshop API is readily available with it, it’s not that easy to understand. You still have to wrap the API calls to make them manageable for your use, which could take days. To save time, I bought an asset instead. While I didn’t like its predefined prefabs (which is how you’re meant to use the asset), I did use its wrapper class for querying workshop items and downloading them.

The next challenge was mod installation management. I wanted players to be able to install and stack multiple mods. This meant that I needed some kind of internal mods database and a UI for it. Implementing this took the whole week.

[Image: ModsManagement]

Aside from this, a system for testing, uploading, and updating mods is also needed. I plan to add it as a tab on this screen. Loading assets from any folder will be valuable here: I imagine the mod developer specifying the full path to their mod folder, then subscribing to that local folder to test their creation. There will also be buttons for uploading and updating the mod. This is why I had to stop; it may take up another whole week, and I still have lots of bugs to fix.

There you have it. These are the considerations I thought about while implementing mod support. I hope they help you with yours.

[Image: Simple mods that I installed. Blue characters and some graffiti on those kitchen sinks.]

A Simple Mesh Combiner

Character meshes in our game Academia are always the hardest to optimize. There are many of them, and they hardly batch even after I combined all character sprites into one single texture. A character in the game has two quads, the body and the head. I had given up on batching multiple characters, but what frustrated me is that these two meshes are always rendered in separate draw calls. This is because the head sprite has transparency and must be rendered on top of the body sprite, so the two can’t be batched. What’s a motivated guy got to do? I combined them into a single mesh.

[Image: On the left are the original face and body quads. On the right is the combined mesh.]

It’s easy to combine meshes; there are existing tools for this purpose. However, I wanted something different. I want to be able to change the UV coordinates of some parts of the combined mesh at runtime. For example, when the character blinks, I want to change the UV coordinates of the face quad. I also wanted it to be lightweight; there are assets like Mesh Baker that I find too bloated for my needs.

Time for some code! Here’s the MeshPartHandle. It holds the minimal data needed to represent a “part” of the combined mesh: just a starting index and the vertex count of that part. I’ll show the usage later.

    public class MeshPartHandle {

        private int startIndex;
        private readonly int vertexCount;

        public MeshPartHandle(int vertexCount) {
            this.vertexCount = vertexCount;
        }

        public int VertexCount {
            get {
                return vertexCount;
            }
        }

        public int StartIndex {
            get {
                return startIndex;
            }

            set {
                startIndex = value;
            }
        }

    }

The following is the mesh combiner itself:

    [RequireComponent(typeof(MeshFilter))]
    [RequireComponent(typeof(MeshRenderer))]
    public class CombinedMesh : MonoBehaviour {

        private Mesh mesh;
        private MeshFilter meshFilter;
        private MeshRenderer meshRenderer;

        private readonly Dictionary<Transform, Mesh> meshMap = new Dictionary<Transform, Mesh>();

        private readonly Dictionary<Transform, MeshPartHandle> handleMap = new Dictionary<Transform, MeshPartHandle>();

        private Transform selfTransform;

        public void Clear() {
            this.meshMap.Clear();
            this.handleMap.Clear();
        }

        public MeshPartHandle Add(Transform owner, Mesh mesh) {
            Assertion.Assert(!this.meshMap.ContainsKey(owner)); // Should not contain the specified owner yet

            this.meshMap[owner] = mesh;

            MeshPartHandle handle = new MeshPartHandle(mesh.vertices.Length);
            this.handleMap[owner] = handle;

            return handle;
        }

        private List<Vector3> vertices = new List<Vector3>();
        private List<Color> colors = new List<Color>();
        private List<Vector3> normals = new List<Vector3>();
        private List<Vector2> uvs = new List<Vector2>();
        private List<Vector2> uvs2 = new List<Vector2>();
        private List<int> triangles = new List<int>();

        // Cache array so we could easily set new UV values
        private Vector2[] uvArray;
        private Vector2[] uv2Array;

        public void Build() {
            this.vertices.Clear();
            this.colors.Clear();
            this.normals.Clear();
            this.uvs.Clear();
            this.uvs2.Clear();
            this.triangles.Clear();

            foreach (KeyValuePair<Transform, Mesh> entry in this.meshMap) {
                AddToBuild(entry.Key, entry.Value);
            }

            this.mesh = new Mesh();
            this.mesh.vertices = this.vertices.ToArray();
            this.mesh.colors = this.colors.ToArray();
            this.mesh.triangles = this.triangles.ToArray();
            this.mesh.normals = this.normals.ToArray();

            this.uvArray = this.uvs.ToArray();
            this.mesh.uv = this.uvArray;

            this.uv2Array = this.uvs2.ToArray();
            this.mesh.uv2 = this.uv2Array;

            this.meshFilter = GetComponent<MeshFilter>();
            Assertion.AssertNotNull(this.meshFilter);
            this.meshFilter.mesh = this.mesh;

            this.meshRenderer = GetComponent<MeshRenderer>();
            Assertion.AssertNotNull(this.meshRenderer);
        }

        private void AddToBuild(Transform owner, Mesh mesh) {
            MeshPartHandle handle = this.handleMap[owner];
            handle.StartIndex = this.vertices.Count;

            this.colors.AddRange(mesh.colors);
            this.normals.AddRange(mesh.normals);
            this.uvs.AddRange(mesh.uv);

            // Special case for UV2
            // Other meshes don't have it so we use zeroes
            if(mesh.uv2.Length == 0) {
                for(int i = 0; i < mesh.vertices.Length; ++i) {
                    this.uvs2.Add(VectorUtils.ZERO_2D);
                }
            } else {
                Assertion.Assert(mesh.uv.Length == mesh.uv2.Length);
                this.uvs2.AddRange(mesh.uv2);
            }

            // Adjust the triangle indices
            for(int i = 0; i < mesh.triangles.Length; ++i) {
                this.triangles.Add(mesh.triangles[i] + handle.StartIndex);
            }

            if(this.selfTransform == null) {
                this.selfTransform = this.transform; // Cache
            }

            // Transform the vertices from its owner
            for(int i = 0; i < mesh.vertices.Length; ++i) {
                Vector3 transformedVertex = this.selfTransform.InverseTransformPoint(owner.TransformPoint(mesh.vertices[i]));
                this.vertices.Add(transformedVertex);
            }
        }

        public void SetMaterial(Material material) {
            this.meshRenderer.material = material;
        }

        public void SetSortingLayer(string sortingLayerName) {
            this.meshRenderer.sortingLayerName = sortingLayerName;
        }

        public void SetUvs(MeshPartHandle handle, Vector2[] uvs) {
            for(int i = 0; i < handle.VertexCount; ++i) {
                this.uvArray[handle.StartIndex + i] = uvs[i];
            }
            this.meshFilter.mesh.uv = this.uvArray;
        }

        public void SetUvs2(MeshPartHandle handle, Vector2[] uvs) {
            for (int i = 0; i < handle.VertexCount; ++i) {
                this.uv2Array[handle.StartIndex + i] = uvs[i];
            }
            this.meshFilter.mesh.uv2 = this.uv2Array;
        }

        public Transform SelfTransform {
            get {
                if(this.selfTransform == null) {
                    this.selfTransform = this.transform;
                }

                return selfTransform;
            }
        }

    }

This is then how it is used:

CombinedMesh combinedMesh = GetComponent<CombinedMesh>(); // Or any other way of getting this instance

// Each part needs its own owner Transform because the combiner keys parts by Transform.
// (headTransform and bodyTransform here stand for the head and body quads' transforms.)
MeshPartHandle headHandle = combinedMesh.Add(this.headTransform, this.headMesh);
MeshPartHandle bodyHandle = combinedMesh.Add(this.bodyTransform, this.bodyMesh);
combinedMesh.Build(); // Builds the combined mesh

// During gameplay, say we want the character to use the sprite that's facing left
Vector2[] leftHeadUvs = GetUvs("Head", Orientation.LEFT);
combinedMesh.SetUvs(headHandle, leftHeadUvs);

Vector2[] leftBodyUvs = GetUvs("Body", Orientation.LEFT);
combinedMesh.SetUvs(bodyHandle, leftBodyUvs);

The combiner class is very straightforward. It just maintains lists of mesh data: vertices, colors, triangle indices, normals, and UVs. Meshes are registered through CombinedMesh.Add(), and when Build() is called, each registered mesh’s data is appended to these locally maintained lists. Each MeshPartHandle remembers how many vertices its part has and where its starting index is. The handles are then used to change the parts of the mesh they represent. For now, it can only change UVs, but it’s certainly possible to allow changes to colors, normals, etc.

By using this, Unity now renders a character in a single draw call. An added benefit is that batching multiple characters is now possible under certain conditions. For example, characters that lie on the same horizontal line can now be batched.

[Image: From the frame debugger]

This is just one use. I think I’m going to use this to combine other objects. Our rendering is still terrible. Here’s hoping that I can improve it by using this simple mesh combiner.

Nightmare on Release Day

This last week has been very exhausting. It was the week of the release of Academia’s Early Access. I thought we’d have a smooth release this time because we already had a good candidate build by Tuesday, September 5. We were releasing on Friday, September 8. Things were pretty chill. We distributed some keys to YouTubers so they could start recording their videos. Then on Wednesday night, one of the YouTubers reported this:

[Image: Zombies]

Every single agent in the game acted like a zombie with nothing to chase. They didn’t do anything else; they were stuck in that trance state for eternity. Worst of all, I couldn’t replicate it on my machine. Our artist, Ryan Sumo, could replicate it on his end consistently. This was an unacceptable bug because it stops the flow of the game. The bugs that are hardest to fix are the ones you can’t replicate, and we only found out about this ugly one two days before release.

The whole of Thursday was spent blindly fixing this bug. Anything I tried was just a guess. I’d build the game, upload it to our Steam testing branch, Ryan would download it, and… “I’m sorry, your fix didn’t work.” My emotions went from hope to heartbreak, back to hope whenever I thought of a good guess, and back to sorrow again when the guess failed. It was very tiring.

My best guess revolved around our A* processing thread. We have a separate thread that holds a queue of A* requests. The agents can enqueue an A* request whenever they need one and wait for the result. The thread checks the queue and executes the A* searches, and it runs indefinitely. The agents in the gif are doing their default behavior while waiting for the A* result they requested: they roam around in their current tile. My guess was that the thread had stopped working, so they never got their results, hence the behavior of waiting in vain.

This was the unfixed code (not the actual code; this is a shortened version):

    // Runs the thread that constantly checks the queue
    class AStarThreadQueue {
        private ActionThread thread;
        private Queue<AStarResolution> queue = new Queue<AStarResolution>();

        public void RunThread() {
            this.thread = UnityThreadHelper.CreateThread((Action)Process);
        }

        public void Enqueue(AStarResolution resolution) {
            Assertion.AssertNotNull(resolution);
            this.queue.Enqueue(resolution);
        }

        private static readonly object SYNC_LOCK = new object();

        // This is the method that the thread executes indefinitely
        private void Process() {
            while(true) {
                if (this.queue.Count > 0) {
                    AStarResolution resolution = null;

                    lock (SYNC_LOCK) {
                        resolution = this.queue.Dequeue();
                    }

                    resolution.Execute();
                }
            }
        }
    }

This code works without problems in the Unity editor, even with hundreds of agents. But when built into an exe and tested on other computers, the thread breaks. If you know multithreading, you can probably see the problems from a mile away. First, I didn’t have exception handling inside that while loop; if an exception occurs inside, the loop breaks and the thread ends. Second, the queue is not locked in Enqueue(). Based on the logs, the code

                    lock (SYNC_LOCK) {
                        resolution = this.queue.Dequeue();
                    }

returns null even though the queue count was checked and the items in it are guaranteed to be non-null. This is probably due to concurrent access to the queue.

This is what the fixed code looks like:

    // Runs the thread that constantly checks the queue
    class AStarThreadQueue {
        private ActionThread thread;
        private Queue<AStarResolution> queue = new Queue<AStarResolution>();

        public void RunThread() {
            this.thread = UnityThreadHelper.CreateThread((Action)Process);
        }

        public void Enqueue(AStarResolution resolution) {
            Assertion.AssertNotNull(resolution);

            lock (SYNC_LOCK) {
                this.queue.Enqueue(resolution);
            }
        }

        private static readonly object SYNC_LOCK = new object();

        // This is the method that the thread executes indefinitely
        private void Process() {
            while(true) {
                try {
                    if (this.queue.Count > 0) {
                        AStarResolution resolution = null;

                        lock (SYNC_LOCK) {
                            resolution = this.queue.Dequeue();
                        }

                        resolution.Execute();
                    }
                } catch(Exception e) {
                    // We log the error but do not end the thread
                    Debug.LogError(e.Message);
                }
            }
        }
    }

With this, Ryan finally declared, “OK, the fix seems to work!” I cried inside. The fixed code looks easy now, but the journey to it was not. I tried a lot of other fixes before this. I even turned off Unity’s Graphics Jobs feature because it might have been messing with our thread. It was a silly theory, but I was desperate; might as well try it.

While multithreading is useful, I realized that it can also be ruthless if you don’t know what you are doing. Treat this as a cautionary tale if you use threads in your projects. Test your build on a variety of machines and with the most complex state of the game. Avoid this kind of nightmare on your release day.

Our game Academia: School Simulator is now available on Steam Early Access. It’s currently discounted by 20%. We still have a long way to go and lots of features to implement. Buy it now while it’s cheap.

Things I Wish I had Done in Academia

This post is going to be a bit different. To be honest, I have run out of programming topics that are at least as interesting as the previous ones. So instead, I’ve thought about the architectural mistakes I’ve made and wish I had implemented differently. It might help those who want to develop games in the same genre. Consider this a sort of post-mortem from the programmer’s perspective.

I wish I had used Behavior Trees under our GOAP framework

We are using GOAP to drive all our NPC behavior. I’ve written about it here and here. Each action inside our GOAP framework is not just a single action. Instead, it is a sequence of multiple “atom actions” that the agent executes one after another. This way, we can easily write individual atom action classes and reuse them. A GOAP action, then, is just a container of atom actions.
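
To illustrate the idea, here’s a rough sketch only; this is not our actual framework code, and the names are made up for illustration.

using System.Collections.Generic;

// Not our actual framework code; just a sketch of the idea.
// An atom action is a small reusable step; a GOAP action simply runs a list of them in order.
public interface IAtomAction {
    // Returns true when this atom is finished
    bool Update();
}

public class GoapAction {

    private readonly List<IAtomAction> atomActions = new List<IAtomAction>();
    private int current;

    public void Add(IAtomAction atomAction) {
        this.atomActions.Add(atomAction);
    }

    // Executes the atoms one after another. Note that this is a straight
    // sequence; there is no branching, which is the limitation described below.
    public bool Update() {
        while (this.current < this.atomActions.Count) {
            if (!this.atomActions[this.current].Update()) {
                return false; // Current atom still running
            }

            ++this.current; // Atom finished, move to the next one
        }

        return true; // All atoms done
    }
}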

[Image: A screenshot of our GOAP editor]

The problem here is the sequential execution. There’s no way to express branching behaviors, for example, “if the main sequence fails, then do this other one.” If I want something like that, I either add more preconditions to the action or create another GOAP action for the branching behavior. It’s awkward most of the time and adds complexity. This would have been elegantly solved had I used a behavior tree instead of a sequence of actions. It would have added development time for the editor, but it would have been worth it. Now I’m stuck with the current implementation because we’re just too deep into its usage.

I wish I maintained all our assets in a single large texture

This is probably my biggest mistake in this game. I knew it in the back of my head but didn’t do it because I was lazy and wanted a working prototype right away. Now it’s biting me in the neck. I mentioned this in my previous post: I wouldn’t have had to do that runtime packing had I done this. It would have saved us some game loading time and memory.

I wish I had used a text format like XML for the object definitions

An object definition contains all the information about an object in the game. Take a refrigerator, for example. Its definition contains information like: it occupies 2×1 tiles, it uses these sprites, it has this price, it should be added to this layer, it has this offset, it uses this prefab, everything! Stupid as it may seem, yes, we’re not using a text format for this. We are using Unity’s serialized classes instead, and they have a fancy editor:

[Image: ObjectDefinitionSample]

Unity’s serialized classes are very easy to write. You write the serialized class and the inspector automagically creates editor fields for it; you don’t have to write a parser! Since laziness wins, I stuck with it. The problem is that our designer and artist can’t use it due to edit conflicts. I’m almost always updating this data, and my edits are much more critical, so I always end up overriding theirs.

What happened is that I ended up making another definition file, which our designer maintains, to avoid conflicts. Now we have multiple definition files, and this is bad! I also expect this to bite me soon because it does not sit well with modding support.

Life would have been different if the definitions were maintained in a single XML file. Conflicts would be easier to fix and my teammates would be happier. Sure, I could generate an XML file from my existing definitions, but my point is that’s effort that could have been avoided.
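
As a purely hypothetical example (we never actually made this file), a definition entry in XML could have looked something like this:

<!-- Purely hypothetical example; we never actually made this file -->
<ObjectDefinitions>
    <Object id="Refrigerator"
            tilesX="2" tilesY="1"
            price="500"
            layer="Objects"
            prefab="Prefabs/Objects/Refrigerator"
            spriteDown="RefrigeratorDown"
            spriteUp="RefrigeratorUp"
            spriteLeft="RefrigeratorLeft" />
</ObjectDefinitions>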

I wish I included the body in animation clips

Our NPCs play some animations, like when they are eating, reading a book, cooking, etc. I made this work using a prefab that contains the hands and the animations. Whenever an NPC is generated, I also add the hands prefab to it. I separated the character prefab and the hands prefab so it’s easier to maintain the different classes like students, teachers, workers, etc.

[Image: The “Hands” prefab in all its glory]

Unfortunately, the hands prefab really only has the hands. It doesn’t include the character’s body, so the animation clips can only animate hands. This is a problem because later on we may want to animate the body, like when the character is giggling, jumping, shaking, or angry. The current implementation may be more maintainable, but it drastically reduces the animation possibilities.

Conclusion

This is obviously an incomplete list, but these are probably the major ones. The thing is, I can’t make a major refactor right now; we are on the brink of release. Academia will be out on Early Access on September 8! Even with these mishaps, the current state of the game works great. I’ll most likely fix some of this stuff in the near future.

Hope you have enjoyed this and I’ll see you on the next one.

Reducing Draw Calls Using a Simple Texture Packer

When we started making Academia, we didn’t really plan out how we were going to manage our sprites. We just did the quickest thing, which was to make them individually and load them into the game. All of our game-world sprites are stored in StreamingAssets and loaded dynamically when the game runs. This is different from the normal way of using an imported texture. We did it this way in preparation for modding support: I envisioned that modders could add their own folders, and we would provide the mechanism to override the base game images.

Ideally, all game sprites should be in a single big texture. This lets you use a common material across your game objects so that dynamic batching can actually batch them. Now that the game has gotten bigger, it’s harder to put all of our sprites into one single atlas. Our artist wouldn’t agree to it as it’s a lot of work, and we no longer have the time. We’re releasing our Early Access this September 8.

[Image: A few samples of our sprites. We have folders of these.]

While coming up with solutions, I thought: what if I packed the images dynamically instead and used the generated atlas? It should be simple enough to recompute the UVs of the packed sprites. I scoured the internet for algorithms on how to optimally pack rectangles into a bigger one. It turns out this is an interesting problem; there are numerous papers about it. It also turns out I didn’t have to roll my own packer. Unity already has one.

It needs some help, though. I needed something that keeps track of the packed textures and a way to get the same sprite back out of the packed one. Time to code!

Here’s the class that abstracts an “entry” of a packed texture:

    public class PackedTextureEntry {

        private readonly Texture2D atlas;
        private readonly Rect uvRect;
        private readonly Rect spriteRect;
        private readonly Sprite sprite;

        public PackedTextureEntry(Texture2D atlas, Rect uvRect) {
            this.atlas = atlas;
            this.uvRect = uvRect;
            this.spriteRect = new Rect(this.uvRect.x * this.atlas.width, this.uvRect.y * this.atlas.height,
                this.uvRect.width * this.atlas.width, this.uvRect.height * this.atlas.height);
            this.sprite = Sprite.Create(this.atlas, this.spriteRect, new Vector2(0.5f, 0.5f), 768); // Pivoted at the center, 768 pixels per unit
        }

        public Texture2D Atlas {
            get {
                return atlas;
            }
        }

        public Rect UvRect {
            get {
                return uvRect;
            }
        }

        public Rect SpriteRect {
            get {
                return spriteRect;
            }
        }

        public Sprite Sprite {
            get {
                return sprite;
            }
        }

        public Sprite CreateSprite(Vector2 pivot) {
            return Sprite.Create(this.atlas, this.spriteRect, pivot, 768);
        }

    }

Basically, it’s just a container for the generated atlas and the UV coordinates of a particular sprite entry. The Rect passed in the constructor is in normalized UVs (values from zero to one). Sprites, however, are created using pixels, so we need a new Rect, which is just the UV rect multiplied by the dimensions of the atlas. This class also holds a pre-generated Sprite pivoted at the center.

The following class is the texture packer itself:

    public class TexturePacker {

        // Contains the associated names of the added texture so we can easily query its entry after packing
        private List<string> names = new List<string>();

        // This contains the textures to pack
        // Used a list here so we could easily convert to array during packing
        private List<Texture2D> textures = new List<Texture2D>();

        // Keeps track of the packed entries
        private Dictionary<string, PackedTextureEntry> entriesMap = new Dictionary<string, PackedTextureEntry>();

        private Texture2D atlas;

        public TexturePacker() {
        }

        public void Add(string key, Texture2D texture) {
            this.names.Add(key);
            this.textures.Add(texture);
        }

        public void Pack() {
            this.atlas = new Texture2D(2, 2, TextureFormat.ARGB32, false); // Will expand on packing
            Rect[] rects = this.atlas.PackTextures(this.textures.ToArray(), 0, 8192, true);

            // Populate entries
            this.entriesMap.Clear();
            Assertion.Assert(this.names.Count == this.textures.Count);
            for(int i = 0; i < this.names.Count; ++i) {
                this.entriesMap[this.names[i]] = new PackedTextureEntry(this.atlas, rects[i]);
            }

            // Clear to save memory
            // These textures may also be released
            this.textures.Clear();
        }

        public PackedTextureEntry GetEntry(string key) {
            return this.entriesMap[key];
        }

    }

Usage is self-explanatory: create an instance of the packer, add the textures you want to pack, each associated with a string key, then call Pack(). Use GetEntry() to get the PackedTextureEntry associated with a sprite, and use its Sprite property to access the sprite that now comes from the packed texture.

TexturePacker packer = new TexturePacker();

// Let's just say you have a library of textures associated by name
foreach(TextureEntry entry in entries) {
    packer.Add(entry.Name, entry.Texture);
}

packer.Pack();

// Get a packed entry and use its sprite
PackedTextureEntry packedEntry = packer.GetEntry("Grass");
spriteRenderer.sprite = packedEntry.Sprite;

And that’s it! It’s really simple, but this thing helped batching a lot.

[Image: Yuuuge batch! This is from the frame debugger.]

 

Reflection Series – Part 3: Attributes are Magic

Attributes in C# are those things wrapped in “[]” that are sprinkled on classes, properties, or variables. In Unity, I usually use [SerializeField] so that a variable becomes editable in the editor while remaining private or protected in code. There are other attributes, too, like Range, ExecuteInEditMode, HideInInspector, DisallowMultipleComponent, etc. Somewhere in Unity’s internals, these attributes are accessed through reflection and the code they dictate is executed. The coolest part is that you can make your own attributes. The following is a very simple one:

[AttributeUsage(AttributeTargets.Property)]
public class PropertyGroup : Attribute {

    private readonly string name;

    public PropertyGroup(string name) {
        this.name = name;
    }

    public string Name {
        get {
            return name;
        }
    }

}

It’s an attribute that can be attached to properties (AttributeTargets.Property). It requires a string name. This is how it can be used:

        ...

        [PropertyGroup("Settings")]
        public string Gender {
            get {
                return gender;
            }

            set {
                this.gender = value;
            }
        }

        [PropertyGroup("Settings")]
        public string Type {
            get {
                return type;
            }

            set {
                this.type = value;
            }
        }

        [PropertyGroup("OrientationIds")]
        public string DownSpriteId {
            get {
                return downSpriteId;
            }

            set {
                this.downSpriteId = value;
            }
        }

        [PropertyGroup("OrientationIds")]
        public string RightSpriteId {
            get {
                return rightSpriteId;
            }

            set {
                this.rightSpriteId = value;
            }
        }

        ...

We use the PropertyGroup attribute as a grouping mechanism for our generic editor. In our editor code, we look for this attribute and collect the marked properties into their own containers before rendering them. There are many ways to access an attribute through reflection; the following is one way:

public static T GetCustomAttribute<T>(PropertyInfo property) where T : Attribute {
    Attribute attribute = Attribute.GetCustomAttribute(property, typeof(T));
    return attribute as T;
}

// Usage
PropertyGroup group = GetCustomAttribute<PropertyGroup>(propertyInfo);

// Do something with the attribute like access its name
List<PropertyInfo> propertyList = GetPropertyList(group.Name);

With this, we can change the look of our editor by just adding attributes to our classes.
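
Collecting the grouped properties before rendering is then just a matter of bucketing them by the attribute’s name. Here’s a simplified sketch (not our actual editor code):

using System;
using System.Collections.Generic;
using System.Reflection;

// Simplified sketch: bucket properties by their PropertyGroup name before rendering.
public static class PropertyGrouping {

    public static Dictionary<string, List<PropertyInfo>> Group(Type type) {
        Dictionary<string, List<PropertyInfo>> groups = new Dictionary<string, List<PropertyInfo>>();

        foreach (PropertyInfo property in type.GetProperties(BindingFlags.Public | BindingFlags.Instance)) {
            PropertyGroup group = (PropertyGroup)Attribute.GetCustomAttribute(property, typeof(PropertyGroup));
            if (group == null) {
                continue; // Ungrouped properties are rendered elsewhere
            }

            List<PropertyInfo> list;
            if (!groups.TryGetValue(group.Name, out list)) {
                list = new List<PropertyInfo>();
                groups[group.Name] = list;
            }

            list.Add(property);
        }

        return groups;
    }
}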

[Image: GroupedProperties]

Other Uses

Persistence

In Part 2 of this series, I discussed an XML writer/reader that works by reading the properties of classes. On top of that, we also have a property attribute named Persist, used to selectively write and read only the properties marked with it. The attribute also allows specifying a default value, and we use it heavily in our persistence system. The default value is useful for properties that are not present in old save files.

    [AttributeUsage(AttributeTargets.Property)]
    public class Persist : Attribute {

        private object defaultValue;

        public Persist() {
        }

        public Persist(object defaultValue) {
            this.defaultValue = defaultValue;
        }

        public object DefaultValue {
            get {
                return defaultValue;
            }
        }

    }

    // Sample usage
    [Persist]
    public float PolledTime {
        get {
            return polledTime;
        }

        set {
            this.polledTime = value;
        }
    }

    [Persist(1)]
    public int Day {
        get {
            return day;
        }

        set {
            this.day = value;
        }
    }

    [Persist(1)]
    public int Year {
        get {
            return year;
        }

        set {
            this.year = value;
        }
    }

    ...

Class Browser

In Part 1, I wrote about loading classes using their class names. Because of that, we have a lot of different editors where a user can select from classes that derive from a single base class. The number of classes written for these editors can get out of hand, so we made a class-level attribute to group classes into their respective domains.

    [AttributeUsage(AttributeTargets.Class)]
    public class Group : Attribute {

        private readonly string name;

        public Group(string name) {
            this.name = name;
        }

        public string Name {
            get {
                return name;
            }
        }

    }

    // Usage
    [Group("Game.CharacterNeeds")]
    public class AddNeedAmount : ComponentAction<CharacterNeeds> {
        ...
    }

    [Group("Game.Task")]
    class GetCurrentTaskEffortPerTile : ComponentAction<TaskDoer> {
    }

With this attribute, we can make our class browsers look like this:

[Image: ClassBrowser]

Conclusion

Reflection is a worthy tool to add to your gamedev toolbox. It’s powerful and useful. What I did in this series is probably just the tip of the iceberg. I’ve read about modders who opened up, ripped out, and replaced parts of C# games using reflection. Yikes!

I’m going to end the series for now. I may add to it later. I hope I have convinced you that reflection is a cool toy to learn.

Reflection Series – Part 2: Harnessing Properties

One of my favorite features in reflection is the ability to know about the properties of a certain instance at runtime. This has helped a lot in making tools to aid game development.

Basics

The following code is the simplest way to iterate through the properties of an instance:

// Let's say there's some instance named myInstance
Type type = myInstance.GetType();

// Get the properties that are public and not static
PropertyInfo[] properties = type.GetProperties(BindingFlags.Public | BindingFlags.Instance);

foreach(PropertyInfo property in properties) {
    // Do something with property...
}

PropertyInfo contains the information about a property. You can get its getter and setter methods and invoke them. You can get its return type. You can query the attributes attached to the property. You can learn a lot from it (just head over to its API reference).

Example Usage

One utility class that I’ve made is a generic XML writer and loader that can accept any instance and write that instance’s public properties to XML. This way, I don’t have to write custom XML-writing code for each type of data; I can just use this class and tell it to write the instance.

I don’t write all public properties, only those with a public getter and setter. The following utility method decides whether a property should be written:

public static bool IsVariableProperty(PropertyInfo property) {
    // should be writable and readable
    if(!(property.CanRead && property.CanWrite)) {
        return false;
    }

    // methods should be public
    MethodInfo getMethod = property.GetGetMethod(false);
    if(getMethod == null) {
        return false;
    }

    MethodInfo setMethod = property.GetSetMethod(false);
    if(setMethod == null) {
        return false;
    }

    return true;
}

This is how I built the writer class (showing only the important parts):

class InstanceWriter {

    private readonly Type type;
    private PropertyInfo[] properties;

    // A common delegate for writing a property of each type
    private delegate void PropertyWriter(XmlWriter writer, PropertyInfo property, object instance);
    private Dictionary<Type, PropertyWriter> attributeWriterMap = new Dictionary<Type, PropertyWriter>();

    // Constructor
    public InstanceWriter(Type type) {
        this.type = type;
        this.properties = this.type.GetProperties(BindingFlags.Public | BindingFlags.Instance); // Cached

        // Populate map of writers
        this.attributeWriterMap[typeof(string)] = WriteAsAttribute;
        this.attributeWriterMap[typeof(int)] = WriteAsAttribute;
        this.attributeWriterMap[typeof(float)] = WriteAsAttribute;
        this.attributeWriterMap[typeof(bool)] = WriteAsAttribute;
        // ... More types can be added here if needed
    }

    // Writes the specified instance to the writer
    public void Write(XmlWriter writer, object instance) {
        writer.WriteStartElement(this.type.Name);

        // Traverse properties
        foreach (PropertyInfo property in this.properties) {
            if (!TypeUtils.IsVariableProperty(property)) {
                // Not a candidate to be written
                continue;
            }

            // Must have a writer
            PropertyWriter propWriter = null;
            if (this.attributeWriterMap.TryGetValue(property.PropertyType, out propWriter)) {
                // Invokes the property writer
                propWriter(writer, property, instance);
            }
        }

        writer.WriteEndElement();
    }

    // Writer methods. There are more of these to support the types you need to support.
    private static void WriteAsAttribute(XmlWriter writer, PropertyInfo property, object instance) {
        // Invoking the getter using reflection
        object value = property.GetGetMethod().Invoke(instance, null);
        if (value != null) {
            writer.WriteAttributeString(property.Name, value.ToString());
        }
    }

}

This class maintains a dictionary of writer methods keyed by Type. While traversing the properties, it checks whether each property should be written and whether its type (PropertyInfo.PropertyType) has a mapped writer method. It then invokes that writer method, which writes the value to the XmlWriter.

The actual class is more elaborate than this. There’s a separate map of property writers for types that need child elements; for example, a Vector3 is written as a child element with its x, y, and z values stored as attributes. We also took it further by checking whether the property’s type implements a custom interface of ours named IXmlSerializable and, if so, invoking that type’s own writer method.
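
For illustration, a child-element writer for Vector3 might look roughly like this. It’s a sketch in the same style as WriteAsAttribute above, not the actual code:

// Sketch only (same style as WriteAsAttribute above): a writer for Vector3 properties
// that writes a child element with the x, y, z values stored as attributes.
private static void WriteVector3AsElement(XmlWriter writer, PropertyInfo property, object instance) {
    object value = property.GetGetMethod().Invoke(instance, null);
    if (value == null) {
        return;
    }

    Vector3 vector = (Vector3)value;

    writer.WriteStartElement(property.Name);
    writer.WriteAttributeString("x", vector.x.ToString());
    writer.WriteAttributeString("y", vector.y.ToString());
    writer.WriteAttributeString("z", vector.z.ToString());
    writer.WriteEndElement();
}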

The InstanceWriter itself is then used like this:

InstanceWriter writer = new InstanceWriter(typeof(MyClass));
writer.Write(xmlWriter, myClassInstance); // Writes a whole element representing MyClass

My XmlLoader was made using the same concept, except that the mapped methods invoke the setter of the property. This is one of the loader methods:

private static void LoadString(SimpleXmlNode node, PropertyInfo property, object instance) {
    if(!node.HasAttribute(property.Name)) {
        // No value from XML. Set a default value.
        SetDefault(property, instance);
        return;
    }

    string value = node.GetAttribute(property.Name);
    property.GetSetMethod().Invoke(instance, new object[] { value });
}

Other Uses

I’ve made a generic data editor in Unity that can support any type of data class, as long as it exposes its editable variables as properties. As you can see in my other posts, most of our data editors have the same look. That’s because they use the same base editor code and provide custom rendering only when necessary.

[Image: One of our generic data editors]

In Part 1 of this series, where I talked about loading instances from a mere string, we also use property manipulation to expose the variables of those instances in the editor. The following is a screenshot from our GOAP data editor.

[Image: GoapVariables]

All of these are possible because of reflection.

Render Two Sprites in One Shader

Most of the character faces in Academia are generated procedurally. Each character has its own combination of face and head. Heads can be either hair or a construction hat for workers. Both faces and heads are contained in a single texture so that Unity can apply batching.

[Image: Part of the faces and heads texture]

To generate a character, you simply render a face and render its head on top of it.

[Image: Characters]

This can be done with two sprites, but there are disadvantages. First, more draw calls. Second, it looks weird when characters overlap because of z-fighting. I needed a way to render both the face and the head in only one sprite, which reduces draw calls and prevents the weird z-fighting. Fortunately, this is easy enough to do with a custom shader.

While working on the custom quad mesh that the characters are displayed on, I realized that vertices can have more than one UV. This means I can use the first UV for the face and a second UV to draw the head. These two renders can be done within a single shader.

Shader "Common/TwoUvsLayeredTexture"
{
	Properties
	{
		_Texture("Main Texture", 2D) = "white" {}
	}

	SubShader
	{
		Tags{ "Queue" = "Transparent" "IgnoreProjector" = "True" "RenderType" = "Transparent" }
		ZWrite Off Lighting Off Cull Off Fog{ Mode Off } Blend SrcAlpha OneMinusSrcAlpha
		LOD 110

		Pass
		{
			CGPROGRAM
			#pragma vertex vert_vct
			#pragma fragment frag_mult 
			#pragma fragmentoption ARB_precision_hint_fastest
			#include "UnityCG.cginc"

			sampler2D _Texture;

			struct vin_vct
			{
				float4 vertex : POSITION;
				float4 color : COLOR;
				float2 texcoord0 : TEXCOORD0;
				float2 texcoord1 : TEXCOORD1;
			};

			struct v2f_vct
			{
				float4 vertex : SV_POSITION;
				fixed4 color : COLOR;
				float2 texcoord0 : TEXCOORD0;
				float2 texcoord1 : TEXCOORD1;
			};

			v2f_vct vert_vct(vin_vct v)
			{
				v2f_vct o;
				o.vertex = UnityObjectToClipPos(v.vertex);
				o.color = v.color;
				o.texcoord0 = v.texcoord0;
				o.texcoord1 = v.texcoord1;
				return o;
			}

			fixed4 frag_mult(v2f_vct i) : SV_Target
			{
				fixed4 col1 = tex2D(_Texture, i.texcoord0) * i.color;
				fixed4 col2 = tex2D(_Texture, i.texcoord1) * i.color;

				fixed4 output;
				output.rgb = (col1.rgb * (1.0f - col2.a)) + (col2.rgb * col2.a);
				output.a = min(col1.a + col2.a, 1.0f);
				return output;
			}

			ENDCG
		}
	}
}

The magic is in frag_mult(v2f_vct i). Actually, I’m not too sure if that’s the proper way to blend two images; I’m not very good with shaders and arrived at it through trial and error. Let me know if there’s a better way.
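
On the C# side, the quad just needs its second UV channel filled in so the shader’s texcoord1 has something to sample. A minimal sketch (not our actual quad code; names are for illustration):

using UnityEngine;

// Sketch only; not our actual quad code. The first UV channel (texcoord0) carries the
// face's atlas rect and the second channel (texcoord1) carries the head's atlas rect.
public static class TwoUvQuad {

    public static void ApplyUvs(Mesh mesh, Vector2[] faceUvs, Vector2[] headUvs) {
        mesh.uv = faceUvs;   // Sampled as texcoord0 in the shader (the face)
        mesh.uv2 = headUvs;  // Sampled as texcoord1 in the shader (the head)
    }
}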

[Edit] Some people pointed out that it’s generally not ok to have conditionals in shader code. Thus, I’ve updated the code with a better one. This is why I blog.