Third-person, stylized action platformer in which the player cleans up a corrupted garden.
Reaching checkpoints cleanses the world and brings its color back; capture critters and bring them home to achieve this.
36 Weeks (20 people)
Time: 2024-08 - 2025-06
Main roles: Technical Design, Audio
Genre: 3D Platformer, Action Adventure
Game Engine: Unreal Engine 5.4
Released on: Steam
Platforms: PC
Reception: 97% Positive (2025, 41 reviews)
Technical Design - Light Baking, Optimization, Profiling, Components.
Audio Design - Implementation, Composer Communication, Wwise.
Pipelines and Guidelines - Extra Quality Assurance
Prototyping - Systems concepting
The game originally had a water tank that refilled itself in the safe area and drained while spraying water.
Later we added a sliding element that would also drain the tank, and had experiments lined up that required instant drainage. To support these features I made multiple fills/drains able to run at the same time, and made the tank instantly drainable or fillable.
For this I built a component that could be attached to any player character, so I could work in tandem with others and the functionality would not be tied to a specific character.
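The tank logic described above can be sketched in engine-agnostic C++. This is a minimal illustration, not the shipped Blueprint component: several sources (spraying, sliding, safe-area refill) register rates that drain or fill at the same time, and the level can be snapped instantly. All names are assumptions made for this sketch.

```cpp
#include <algorithm>
#include <vector>

// Sketch of a tank component: multiple simultaneous fill/drain rates
// plus instant drain/fill, mirroring the features described above.
class WaterTankComponent {
public:
    explicit WaterTankComponent(float capacity) : Capacity(capacity), Level(capacity) {}

    // Rates are in units/second; negative drains, positive fills.
    // Returns a handle so a gameplay system can remove its own rate later.
    int AddRate(float unitsPerSecond) {
        Rates.push_back({NextHandle, unitsPerSecond});
        return NextHandle++;
    }
    void RemoveRate(int handle) {
        Rates.erase(std::remove_if(Rates.begin(), Rates.end(),
            [handle](const RateEntry& r){ return r.Handle == handle; }), Rates.end());
    }

    // Instant drain/fill support for the experimental features.
    void SetLevelInstant(float level) { Level = std::clamp(level, 0.0f, Capacity); }

    // Called every frame; all active rates are summed so multiple
    // fills/drains work at the same time.
    void Tick(float deltaSeconds) {
        float total = 0.0f;
        for (const RateEntry& r : Rates) total += r.Rate;
        Level = std::clamp(Level + total * deltaSeconds, 0.0f, Capacity);
    }

    float GetLevel() const { return Level; }

private:
    struct RateEntry { int Handle; float Rate; };
    float Capacity;
    float Level;
    std::vector<RateEntry> Rates;
    int NextHandle = 0;
};
```

Because the tank is its own component rather than character logic, any character can carry one, which is what allowed the parallel workflow mentioned above.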
To test the game we made a level that we could run on an external device using command line parameters. Since this was early in the project, we decided that having a camera fly through the level, triggering the respective features, was the best way to implement this.
I got the command line arguments working and handed them over to the programmer who had set up the device they would run on; I also implemented the trigger system that fires the cleansing effect.
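A plain-C++ sketch of the kind of switch handling involved, with made-up flag names for illustration (Unreal itself reads the raw command line through FCommandLine/FParse rather than argv):

```cpp
#include <cstring>
#include <string>

// Hypothetical launch options for the automated flythrough test level.
struct LaunchOptions {
    bool RunFlythrough = false;    // e.g. "-Flythrough" (illustrative name)
    std::string MapName = "MainMenu";
};

LaunchOptions ParseLaunchOptions(int argc, const char* argv[]) {
    LaunchOptions opts;
    for (int i = 1; i < argc; ++i) {
        if (std::strcmp(argv[i], "-Flythrough") == 0) {
            opts.RunFlythrough = true;          // camera flies the level
        } else if (std::strncmp(argv[i], "-Map=", 5) == 0) {
            opts.MapName = argv[i] + 5;         // which level to load
        }
    }
    return opts;
}
```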
This level was also used to test light baking, allowing us to work in tandem with the VFX team so neither discipline got postponed.
It was later handed to the enemy team so they could better profile their AI behaviours. The basics worked through interfaces and overlap events.
Once the main level reached a workable stage during development, I was able to port our profiling system over to it.
The first thing I did was implement triggers for the checkpoints and lay a single spline through the entire level. The spline passes the most important paths a player will cross and fires other events, such as breaking a wall, simulating what will happen in the game without needing the finalized player character.
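The spline idea can be sketched as follows: a camera advances along the path at constant speed and each event fires once its distance along the path is passed. This is an illustrative model, not the actual Blueprint spline-and-overlap setup.

```cpp
#include <vector>

// An event placed at some distance along the profiling spline.
struct PathEvent {
    float DistanceAlongPath;
    bool Fired = false;
};

class ProfilingRun {
public:
    explicit ProfilingRun(float speed) : Speed(speed) {}
    void AddEvent(float distance) { Events.push_back({distance}); }

    // Advance the camera; returns how many events fired this tick.
    int Tick(float deltaSeconds) {
        Distance += Speed * deltaSeconds;
        int fired = 0;
        for (PathEvent& e : Events) {
            if (!e.Fired && Distance >= e.DistanceAlongPath) {
                e.Fired = true; // would notify the gameplay system here
                ++fired;
            }
        }
        return fired;
    }

private:
    float Speed;
    float Distance = 0.0f;
    std::vector<PathEvent> Events;
};
```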
Framerate Chart
One of the team goals was getting our game running on lower-end hardware and the Steam Deck.
Another goal was having a very pretty game, with lighting that looked as good as (if not better than) the engine's built-in real-time lighting.
To achieve both, I researched GPU light baking in Unreal using GPU Lightmass. The plugin was in beta at the time, but after experimenting with settings we got something workable.
This greatly improved the game's GPU performance and let me work closely with the VFX artists and programmers on our project.
The main bottleneck prior to baked lighting was GPU time: 52 FPS average.
With baking as the only change, GPU time improved dramatically: 110 FPS average.
To speed up team processes we wanted asset validation, primarily so everyone stuck to the naming conventions and placed assets in the right folders.
For this I made a quick Blueprint-based system that people could adapt to whatever assets they wished to validate.
All settings are documented and robust: you can add prefixes, suffixes, and folders, and select the object to validate, making it quick to set up for anyone. It was built to grow with the project's needs.
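The core check behind such a validator can be sketched like this, assuming a rule of allowed prefixes plus a required folder per asset type (the real version is Blueprint-based and the rule fields here are illustrative):

```cpp
#include <string>
#include <vector>

// Hypothetical naming/placement rule for one asset type.
struct ValidationRule {
    std::vector<std::string> Prefixes; // e.g. {"SM_"} for static meshes
    std::string RequiredFolder;        // e.g. "/Game/Meshes"
};

static bool StartsWith(const std::string& s, const std::string& p) {
    return s.rfind(p, 0) == 0;
}

// An asset passes if its name carries an allowed prefix and it lives
// under the required folder.
bool ValidateAsset(const std::string& name, const std::string& folder,
                   const ValidationRule& rule) {
    bool prefixOk = rule.Prefixes.empty();
    for (const std::string& p : rule.Prefixes)
        if (StartsWith(name, p)) { prefixOk = true; break; }
    bool folderOk = StartsWith(folder, rule.RequiredFolder);
    return prefixOk && folderOk;
}
```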
To work in tandem with the UI developer I made a simple macro library holding the functionality to call the audio events. Since I was still actively working on those events and changing them in the backend, this was a good way to avoid blocking each other's work.
Because people requested certain functionality, I created a macro library with simple features: a For loop with a delay, a Do Once that resets itself when another Do Once trigger is hit, and a Do Once that automatically resets itself after a period of time.
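The timed auto-reset Do Once can be modeled as a small struct ticked with game time. This is a sketch of the behavior only; the shipped version is a Blueprint macro.

```cpp
// Do Once that becomes usable again after a cooldown, mirroring the
// auto-resetting macro described above.
struct TimedDoOnce {
    float ResetAfterSeconds;
    float Elapsed = 0.0f;
    bool Used = false;

    // Returns true only the first time it is hit, until the reset
    // timer has elapsed.
    bool TryFire() {
        if (Used) return false;
        Used = true;
        Elapsed = 0.0f;
        return true;
    }

    void Tick(float deltaSeconds) {
        if (!Used) return;
        Elapsed += deltaSeconds;
        if (Elapsed >= ResetAfterSeconds) Used = false; // ready again
    }

    void Reset() { Used = false; } // manual reset, like the paired Do Once
};
```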
While my technical implementation work did benefit the team, I am somewhat saddened by the limited time I had to focus on it, since I also had to do a lot of audio work and explore the Perforce pipeline.
Even though my work was limited by time, I feel the quality I needed to reach was achieved. The profiling workflow was built to grow and was quick for other developers to adopt. The removal of the Battery Component wasn't expected, but it did start our characters' transition to being component-based, so more people could work on them at once while keeping things manageable.
I am also quite proud that we made it possible to run our game properly on a Steam Deck with a steady frame rate. We cannot get it verified by Valve manually, but when they get around to our game it should simply work out of the box.
Synthesized using Vital.
Implemented using Wwise.
For this project we wanted at least a playable level of audio quality, which meant I needed to explore implementation options.
This led me to try out MetaSounds (built-in), FMOD, and Wwise. Because Wwise has a more inheritance-based structure rather than a DAW-like one, I got used to it quickly. MetaSounds was not preferred because the composer had more experience with Wwise, so using it would have made communication harder.
After deciding on Wwise, a programmer and I integrated it into the project, which is version controlled using Perforce.
Someone on the team got in contact with a music composer and brought them onto the project. I had bi-weekly calls with the composer to align on the vision for the music and to showcase the direction our game was heading in.
Wwise was used to adapt the audio to the gameplay and was also helpful for managing the many audio assets. Due to scope changes I had to remove quite a bit, and having everything in one place in Wwise, quickly toggleable, helped a ton.
For creating audio assets I used Reaper and Vital; these synthesized sounds are used for the UI.
The in-world audio and player audio were sourced from the GDC audio packs, with effects and filters applied to fit our game's needs. I also layered multiple sounds and had the audio adapt to the gameplay.
This adaptation follows the player's speed, makes the game's progress audible in the music, and more.
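Speed-driven adaptation like this typically maps a gameplay value onto a Wwise RTPC (game parameter), set from code via AK::SoundEngine::SetRTPCValue. The mapping itself can be sketched as a plain function; the speed range here is an illustrative assumption, not the project's actual tuning.

```cpp
#include <algorithm>

// Map player speed into 0..100, the default RTPC range in Wwise,
// clamped at both ends. Wwise then drives volume, pitch, or layer
// blends off this value.
float SpeedToRtpc(float speed, float minSpeed, float maxSpeed) {
    if (maxSpeed <= minSpeed) return 0.0f;
    float t = (speed - minSpeed) / (maxSpeed - minSpeed);
    return std::clamp(t, 0.0f, 1.0f) * 100.0f;
}
```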
The philosophy while creating audio was very important: we wanted the world to feel natural, like an entity that would normally reside in this world.
The UI and the robot itself we wanted to lean more synthetic; it's a robot, so it should make robot noises. So while the water spraying and environment sounds are natural, the noises the robot itself makes are more robotic.
I was the team's sole audio designer, so I spent some extra time onboarding a programmer who had time on their hands into the Wwise software.
This was necessary because no one else had the time or desire to learn it, and being the sole person able to change, adapt, or implement audio would have meant a big bottleneck if anything happened to me.
Writing the tutorial document also gave me the opportunity to explore a wider feature set and decide whether we would need each feature or whether it was not viable for the project.
After writing the document I sat down in person with the programmer and walked them through the project setup step by step, fully onboarding them.
To keep Blueprints readable by anyone on the team, we needed a proper structure for them.
For this I set up guidelines for comment colors, variable naming conventions, and quality assurance.
There was a Blueprint in the project containing all the comment colors, and I explained how to quickly add them to the color picker submenu. These guidelines made it easier to quickly find what you need in another person's Blueprint.
Together with the people working on lighting, two programmers and I documented the Lightmass plugin, including our experiments, performance metrics, and things to look out for.
Using this data we got a near 1:1 match with the real-time lighting, but with the performance of baked lighting. I took the lead in this process and made sure everyone involved was informed about the decisions I made, why I made them, and how the pipeline works in engine.
Perforce Swarm reviews were introduced to ensure higher-quality submissions, since our automatic build pipeline sometimes produced builds with compile errors caused by human mistakes.
For this we made a submit pipeline that was simple to learn, well documented, and robust. A programmer and I sat down together to explore what Perforce has to offer and experimented in depth with the Swarm review process. This bore great fruit: most issues that arose from people learning the process could be solved quickly.