- Follow the story of Kaity, the girl who lost her head to a goat
- Singleplayer
- In development since August 2023
- Developed during my time at Hairy Heart Games as a Junior Game Developer
This was the project for which I was initially hired at Hairy Heart Games. The goal was to see whether modern AI-powered motion capture technology could be used to create lifelike animations for a story-focused video game based on an old Scottish fairy tale. To complement the animations, we would create physical puppets of the characters and scan them using 3D capture software.
In the game, the player moves between cutscenes built from the motion-captured animations; at certain points they are prompted to choose how Kate should react to something, either as her human self or her goat self, navigating the branching story.
This project is an excellent example of why I love working in the industry: not only did I get to work with a tool I had never used before, I also got to collaborate with people from professions I had never interacted with before: puppet makers, model makers and theater actors, to name a few.
New Tools
One of the tools we used for this project was move.ai, an AI-powered motion capture solution that processes footage of live actor performances into animation data we could then use inside Unity to animate the characters.
As the team for this project was quite small, operating this software during the shoot became my responsibility: keeping the different cameras operational, enforcing consistent file naming, keeping track of the different takes, uploading the footage and finally importing the resulting animations into Unity.
While not exactly a typical task for a game designer, this was actually exhilarating, and it is one of the things I love about this job: there is always something new to learn, some new process to adopt or investigate.
Prototyping and Iteration
Unlike the other project I worked on at Hairy Heart Games (RallyAllyAlly), I was on Crackernuts from the start. That meant the game had to be built from the ground up, including all the tech in the background, some of which fell to me to implement.
The approach here was iterative: I first developed a prototype using test footage we had captured earlier to explore how Unity could best play out the cutscenes. We needed to support multiple characters in each scene and the different story branches, preferably while letting us edit clip timings to synchronize animation and audio. For this we used Unity Timeline, a built-in tool for sequencing animations and audio.
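To make the branching structure concrete, here is a minimal, hypothetical sketch of how branching cutscenes can be modelled as a small graph: each node names a timeline asset and maps choice labels (here "human"/"goat", matching the game's two reactions) to the next node. The scene names and asset names are illustrative only; the actual project used Unity Timeline assets and C#.

```python
# Illustrative branching-cutscene graph: node -> timeline asset + choices.
SCENES = {
    "intro":  {"timeline": "Intro.playable",  "choices": {"human": "market", "goat": "forest"}},
    "market": {"timeline": "Market.playable", "choices": {}},
    "forest": {"timeline": "Forest.playable", "choices": {}},
}

def next_scene(current: str, choice: str) -> str:
    """Return the scene the player's choice leads to (stay put if the
    current scene offers no such choice)."""
    return SCENES[current]["choices"].get(choice, current)
```

A driver loop would play the node's timeline, pause at the choice point, then follow `next_scene` with whichever option the player picked.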
However, we quickly realized that Timeline didn't quite offer what we needed: there was no easy way to blend between different timelines, so a new system had to be developed to hide the transitions. The system we ended up with went through three iterations. At first we had the timeline trigger a transition on a Unity Animator, since those transitions allowed for blending. In the second iteration we removed the Animator and "simply" ensured that when timelines switched, they did so on the exact same frame of the current animation to hide the cut.
However, both of these approaches suffered from the same flaw, caused by a limitation of move.ai: the tool only allowed capturing and processing clips up to 4 minutes in length, meaning we had to introduce cuts at least that often, after which the actors had to reset. This produced clear mismatches in posing between cuts that needed to be hidden. So for the final iteration we implemented a system where each timeline drove a separate, hidden set of characters, and we blended between the sets during transitions. This way the transitions were always smooth, even if the start and end positions differed.
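The core of that final iteration is just a cross-fade between two character sets. Below is a minimal sketch of the idea, assuming a linear blend over a fixed transition duration; the function names and the per-bone `dict` pose representation are illustrative assumptions, not the project's actual Unity C# code.

```python
def blend_weight(elapsed: float, duration: float) -> float:
    """Linear 0 -> 1 weight for fading the incoming character set in."""
    if duration <= 0.0:
        return 1.0
    return min(max(elapsed / duration, 0.0), 1.0)

def blend_pose(pose_out: dict, pose_in: dict, w: float) -> dict:
    """Per-bone linear interpolation between the outgoing and incoming
    character sets: at w=0 only the old set shows, at w=1 only the new."""
    return {bone: (1.0 - w) * pose_out[bone] + w * pose_in[bone]
            for bone in pose_out}
```

Because the blend interpolates between whatever poses the two sets happen to be in, it hides the cut even when the actors reset to a noticeably different position between takes.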
The way we got characters to hold objects went through a similar process. At first, objects were simply placed into the characters' hands, but this often didn't look great: the actors were merely miming holding the objects, so their hand positions weren't consistent. For the second iteration we used IK to move the hands to the object instead, and added timeline functionality to blend in and out of the IK state.
