DEVLOG #12
12 July, 2024
Author: Harry Findlay, Editor: Alex Crean
Ski Lodge Murder Dev-log #12: Bringing Characters To Life With Code
BlendShapes (manually, without the system integrated) in the Unity Game Engine
INTRODUCTION:
Many of us have played video games, both good and bad. So, what affects the quality of a game? Well, the answer is: many different things; it depends on what we are evaluating.
A game is made to feel good via its gameplay mechanics. A game is made to look polished and smooth via its animations. A game is made to feel immersive and interesting by its environments and characters. So, let’s have a chat regarding the characters in 'The Ski Lodge Murder'.
As a narrative-heavy game, with the need for interesting and unique characters, having generic and bland animations and facial expressions simply doesn't cut it for us!
Explore how we use BlendShapes, with custom scripting, to create emotional responses and micro-expressions, and to enable deeper emotional connections between players and characters in narrative storytelling.
WARNING! THINGS GET TECHY AHEAD... But we'll try to explain it the best we can!
Facial Expressions Using BlendShapes and Coding Handlers
SO WHAT ARE BLENDSHAPES/SHAPEKEYS?
In their simplest form, BlendShapes (also known as ShapeKeys) are a technique used to deform a mesh into different shapes. These shapes are created by altering the positions of vertices in the mesh to achieve various expressions or animations, such as facial expressions, mouth movements for speech, or other complex deformations. By interpolating between these predefined shapes, animators can create smooth transitions and realistic animations without needing to manually adjust the mesh for each frame.
Once the system has run the data through its rules and math, the SkinnedMeshRenderer is then referenced and, as long as the changes requested are valid, the blend shape changes are applied and the character changes in the relevant manner. For example, after being walked into, the character gradually changes from a smile to a frown.
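To make that concrete, here is a minimal sketch (not our actual production script) of driving a single blend shape through Unity's SkinnedMeshRenderer. The blend shape name "MouthSmile" is just an illustrative example; real names come from the character model.

    using UnityEngine;

    // Minimal sketch: drive one blend shape on a character's SkinnedMeshRenderer.
    // "MouthSmile" is an example name; real blend shape names come from the model.
    public class SmileExample : MonoBehaviour
    {
        [SerializeField] private SkinnedMeshRenderer faceRenderer;

        public void SetSmile(float intensityPercent)
        {
            // Look up the blend shape index by name on the shared mesh.
            int index = faceRenderer.sharedMesh.GetBlendShapeIndex("MouthSmile");
            if (index < 0) return; // Blend shape not found on this mesh.

            // Unity blend shape weights run from 0 to 100.
            faceRenderer.SetBlendShapeWeight(index, Mathf.Clamp(intensityPercent, 0f, 100f));
        }
    }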
THE BRIEF
Our intern programmer, Harry Findlay, was tasked with creating an "Event-Based NPC Body Language System". The core functionality of the system was to adapt the characters' blend shapes to respond to certain events within the game.
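As a rough illustration of the "event-based" part (hypothetical names, not the project's actual code), a gameplay script might raise an event when the player bumps into an NPC, and the body language system listens for it and reacts:

    using System;
    using UnityEngine;

    // Hypothetical sketch of event-based wiring: one component raises an event
    // when the player collides with the NPC, another listens and changes the face.
    public class NpcEvents : MonoBehaviour
    {
        public event Action PlayerBumpedInto;

        private void OnCollisionEnter(Collision collision)
        {
            if (collision.gameObject.CompareTag("Player"))
                PlayerBumpedInto?.Invoke();
        }
    }

    public class NpcExpressionListener : MonoBehaviour
    {
        [SerializeField] private NpcEvents events;
        [SerializeField] private SmileExample face; // From the earlier sketch.

        private void OnEnable()  { events.PlayerBumpedInto += ReactToBump; }
        private void OnDisable() { events.PlayerBumpedInto -= ReactToBump; }

        // Example reaction: drop the smile when the player walks into the NPC.
        private void ReactToBump() => face.SetSmile(0f);
    }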
Once the data has been set up, the system uses this information to ensure that the desired change to the character can be made without errors or insufficient data. The system uses “clamping”: in case anyone is unsure of what clamping means in game development, it refers to restrictions placed on data to keep it within a certain range or to make it follow certain rules. In this case, the clamping keeps the intensity of the blend shape changes to increments of 20%, which makes the values easy to work with: 0% intensity is a straight face, 40% intensity is a slight smile, and 100% intensity is a complete or “cheesy” smile.
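As a small sketch of what that clamping might look like (the 20% step comes from the description above; the names and structure here are illustrative):

    using UnityEngine;

    // Illustrative clamping helper: keep intensity in the 0-100 range and
    // snap it to 20% steps (0, 20, 40, 60, 80, 100).
    public static class IntensityRules
    {
        private const float Step = 20f;

        public static float ClampIntensity(float requestedPercent)
        {
            float clamped = Mathf.Clamp(requestedPercent, 0f, 100f);
            return Mathf.Round(clamped / Step) * Step;
        }
    }

For example, ClampIntensity(37f) returns 40f (a slight smile) and ClampIntensity(130f) returns 100f (a full “cheesy” smile).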
The changes made are then “lerped”, or gradually changed. Lerp is short for linear interpolation, and it allows the changes to be gradual and blended, preventing the character from snapping from one expression to another almost instantly. In code, this also makes use of other in-engine features such as IEnumerators and Coroutines; we won’t go into depth about these, as they can be thought of simply as “timers” that allow for gradual changes in code and, just like the lerping, prevent snapping or instant changes.
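For readers who like to see it in code, here is a minimal sketch (assumed names, not the actual handler from the project) of lerping a blend shape weight inside a coroutine so the expression changes gradually instead of snapping:

    using System.Collections;
    using UnityEngine;

    // Sketch of a gradual expression change: a coroutine lerps the blend shape
    // weight from its current value to a target value over a set duration.
    public class ExpressionLerper : MonoBehaviour
    {
        [SerializeField] private SkinnedMeshRenderer faceRenderer;

        public void BlendTo(int blendShapeIndex, float targetWeight, float duration)
        {
            StartCoroutine(BlendRoutine(blendShapeIndex, targetWeight, duration));
        }

        private IEnumerator BlendRoutine(int blendShapeIndex, float targetWeight, float duration)
        {
            float startWeight = faceRenderer.GetBlendShapeWeight(blendShapeIndex);
            float elapsed = 0f;

            while (elapsed < duration)
            {
                elapsed += Time.deltaTime;
                // Lerp (linear interpolation) between the start and target weights.
                float weight = Mathf.Lerp(startWeight, targetWeight, elapsed / duration);
                faceRenderer.SetBlendShapeWeight(blendShapeIndex, weight);
                yield return null; // Wait one frame, acting as the "timer".
            }

            faceRenderer.SetBlendShapeWeight(blendShapeIndex, targetWeight);
        }
    }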
THE RESULT:
The system has been a success and was met with high praise. It successfully allowed for dynamic changes to the characters in-game, letting them show emotions and react to the game world in real time. This was key in adding to the world-building, environment and overall storytelling because, as stated in the introduction, “A game is made to feel immersive and interesting by its environments and characters”.
A Small Snippet of Code
THE RESEARCH
Let’s have a chat about the ideas and functionality behind the system. There were several ideas floating around:
Using Convai for NPCs in Games
Facial Animation Tool from Unity Store
BlendShapes for Character Adaptation
Motion Capture Integration
Traditional Facial Animation Rigging
Firstly, Harry Findlay investigated different approaches to how the characters could be adapted and altered to reflect their personalities further. After extensive research, utilising BlendShapes seemed to be effective and to have a low resource requirement (time and money). We aren't discounting the other approaches, and we may well use Motion Capture and other tools in combination with BlendShapes to further enhance our NPCs!
Visual Brief
A great thing about BlendShapes is their modular nature: new character poses can be created and implemented into the Event-Based NPC Body Language System even after the initial import.
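One way to picture that modularity (a sketch under our own assumptions, not necessarily how the system is written) is to look blend shapes up by name at runtime, so any pose added to the model after the initial import is picked up without code changes:

    using System.Collections.Generic;
    using UnityEngine;

    // Sketch: build a name-to-index map from whatever blend shapes are on the mesh,
    // so newly imported poses become available without changing the code.
    public class BlendShapeCatalogue : MonoBehaviour
    {
        [SerializeField] private SkinnedMeshRenderer faceRenderer;
        private readonly Dictionary<string, int> shapeIndices = new Dictionary<string, int>();

        private void Awake()
        {
            Mesh mesh = faceRenderer.sharedMesh;
            for (int i = 0; i < mesh.blendShapeCount; i++)
                shapeIndices[mesh.GetBlendShapeName(i)] = i;
        }

        public bool TryGetIndex(string shapeName, out int index)
            => shapeIndices.TryGetValue(shapeName, out index);
    }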
LET'S GET TECHNICAL! THE SYSTEM:
The system designed and developed works in a simple and efficient manner. Firstly, the data within the system’s scripts is defined so that the system has all of the information it will need to register and apply the desired changes. Some of the data registered in the scripts includes (a simplified sketch follows the list):
Character Name
Current Character (Reference to Game Object)
Blend Shape Type (such as mouth smile or mouth open)
Intensity of Blend Shape (how wide the smile, how open the mouth)
SkinnedMeshRenderer (how the blend shape is changed in-code)
A dictionary containing Blend shape type and intensity (to pair them in one data structure)
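Pulling that list together, a stripped-down sketch of how that data might be laid out (field and type names here are illustrative, not the actual script) could look like this:

    using System.Collections.Generic;
    using UnityEngine;

    // Illustrative data layout for one NPC in an event-based body language system.
    public enum BlendShapeType { MouthSmile, MouthOpen }

    public class NpcExpressionData : MonoBehaviour
    {
        public string characterName;                    // Character name
        public GameObject currentCharacter;             // Reference to the character's GameObject
        public SkinnedMeshRenderer skinnedMeshRenderer; // How the blend shape is changed in code

        // Pairs each blend shape type with its intensity (0-100%).
        public Dictionary<BlendShapeType, float> targetIntensities =
            new Dictionary<BlendShapeType, float>();
    }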