CHAT SIMULATOR
A puzzle-solving game driven by NPC facial expressions that change in real time
Details
Genre: Simulation game
Engine: Unreal Engine 5.0 Early Access
Platform: PC (Windows)
Date: June 2022
The project utilizes assets sourced from the Unreal Engine Marketplace.
Demo video
Gameplay
The player takes part in an online gathering with five former classmates. During the gathering, C consistently behaves unusually, suggesting that something out of the ordinary has happened to her. Playing as D, the player tries to understand the reason behind C's behavior and offer her some help. To steer the conversation toward the desired outcome, the player must choose the appropriate responses based on what everyone says and on their facial expressions during the video chat.
Game flowchart
Programming
1. Dialogue system
Dialogue lines are stored in an array. Clicking the play button iterates through the array elements, playing each line, adjusting the characters' expressions, and checking whether a branching decision is due. After the player makes a choice, the array elements that do not belong to the selected branch are removed, so playback continues along the chosen path.
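As a rough illustration, here is a C++ sketch of this array-based approach. The project itself implements it in Blueprints; the types FDialogueLine and FDialoguePlayer below are hypothetical, and the sketch assumes branch lines appear only after their choice point.

    #include "CoreMinimal.h"

    // One dialogue line. BranchId 0 marks a shared line; a nonzero id
    // marks a line that belongs to a specific branch.
    struct FDialogueLine
    {
        FString Speaker;
        FString Text;
        int32 BranchId = 0;
        bool bIsChoicePoint = false; // pause here and show the options
    };

    class FDialoguePlayer
    {
    public:
        TArray<FDialogueLine> Lines;
        int32 Cursor = 0;

        // Called when the play button is clicked: advance one line.
        // The caller displays the text and updates expressions.
        const FDialogueLine* PlayNext()
        {
            return Cursor < Lines.Num() ? &Lines[Cursor++] : nullptr;
        }

        // After the player chooses, drop every line that belongs to a
        // branch that was not selected, then continue playback.
        void SelectBranch(int32 ChosenBranchId)
        {
            Lines.RemoveAll([ChosenBranchId](const FDialogueLine& Line)
            {
                return Line.BranchId != 0 && Line.BranchId != ChosenBranchId;
            });
        }
    };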
2. Expression system
Characters who are currently speaking use recorded expressions: dynamic facial animation, including mouth movements, is pre-recorded and played back from the blueprint of each line. Characters who do not speak in the following line use parameter expressions instead. These are implemented as a two-dimensional Blendspace whose axes are the Happy and Amazement parameters. Four expressions, Amaze, Lost, Smile, and Expressionless, are pre-recorded and placed as three endpoints and one central point on the coordinate axes. Each character carries individual Happy and Amazement parameters whose initial values represent that character's expression at the start of the first line; at the end of each subsequent line, the two parameters are modified to change the character's expression. A code sketch of this parameter-driven approach follows the figures below.
Livelink-recorded facial expressions
2D Blendspace parameter expressions
4 preset Blendspace expressions
Blueprint for controlling expressions
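Here is a minimal C++ sketch of the parameter-driven side of this system, assuming a UAnimInstance subclass whose AnimGraph feeds the two floats into the 2D Blendspace. The project does this in Blueprints; class and member names here are illustrative, and the smooth interpolation is an assumption (the project may set the values directly).

    #include "CoreMinimal.h"
    #include "Animation/AnimInstance.h"
    #include "ExpressionAnimInstance.generated.h"

    UCLASS()
    class UExpressionAnimInstance : public UAnimInstance
    {
        GENERATED_BODY()
    public:
        // Current values read by the Blendspace node in the AnimGraph.
        UPROPERTY(BlueprintReadOnly, Category = "Expression")
        float Happy = 0.f;

        UPROPERTY(BlueprintReadOnly, Category = "Expression")
        float Amazement = 0.f;

        // Targets set at the end of each dialogue line.
        float TargetHappy = 0.f;
        float TargetAmazement = 0.f;

        UFUNCTION(BlueprintCallable, Category = "Expression")
        void SetExpressionTarget(float InHappy, float InAmazement)
        {
            TargetHappy = InHappy;
            TargetAmazement = InAmazement;
        }

        virtual void NativeUpdateAnimation(float DeltaSeconds) override
        {
            Super::NativeUpdateAnimation(DeltaSeconds);
            // Ease toward the target so the expression shifts smoothly
            // instead of snapping.
            Happy     = FMath::FInterpTo(Happy, TargetHappy, DeltaSeconds, 3.f);
            Amazement = FMath::FInterpTo(Amazement, TargetAmazement, DeltaSeconds, 3.f);
        }
    };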
Game art
5 rapidly prototyped MetaHumans
Facial expressions change in real-time to reflect the character's emotional state
After each line of dialogue, the participants' expressions change noticeably.
Making process
Inspirations
1. Detroit: Become Human
Players face multiple possible outcomes depending on their choices.
2. Many RPG games
The stiff expressions of NPCs can break the player's immersion.
3. MetaHuman & LiveLinkFace
4. Everyday conversations
Failure to understand others' facial expressions can lead to awkward conversations.
Why I made this game
1. Help people practice chatting
Some people, to avoid the negative social consequences of failing to read others' facial expressions, deliberately converse more often and consciously train this ability. However, this practice can incur high social costs when they misspeak. I created Chat Simulator to address this problem: by letting individuals hold lifelike conversations with in-game NPCs, it offers a low-cost way to practice chatting in a variety of situations. I hope that, with the help of rapidly advancing AI technology, this approach can bring tangible benefits to people's lives.
2. Offering a solution for vividly performing character expression
I believe two characteristics account for the stiff expressions of many NPCs in today's RPG games: their expressions do not vary with each line of dialogue, and the expressions' amplitude is too small.
The MetaHuman+LiveLinkFace workflow I've developed offers a way to create more lifelike expressions for NPCs. Moreover, because the workflow drives expressions in real time through facial capture and adjustments to the Happy and Amazement values, it could potentially be applied to multiplayer games. For example, incorporating real-time expression changes into VRChat could make player interactions much more lively.
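Purely as a hypothetical illustration of that multiplayer idea, the two expression parameters could be replicated like any other actor property in Unreal. Nothing below exists in the project; the class and names are made up.

    #include "CoreMinimal.h"
    #include "GameFramework/Character.h"
    #include "Net/UnrealNetwork.h"
    #include "ExpressiveCharacter.generated.h"

    UCLASS()
    class AExpressiveCharacter : public ACharacter
    {
        GENERATED_BODY()
    public:
        // Driven by facial capture on the owning client and replicated
        // to the other players, who feed the values into their local
        // expression Blendspace.
        UPROPERTY(Replicated, BlueprintReadOnly, Category = "Expression")
        float Happy = 0.f;

        UPROPERTY(Replicated, BlueprintReadOnly, Category = "Expression")
        float Amazement = 0.f;

        virtual void GetLifetimeReplicatedProps(
            TArray<FLifetimeProperty>& OutLifetimeProps) const override
        {
            Super::GetLifetimeReplicatedProps(OutLifetimeProps);
            DOREPLIFETIME(AExpressiveCharacter, Happy);
            DOREPLIFETIME(AExpressiveCharacter, Amazement);
        }
    };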
3. Exploring the use of facial expressions as clues in games
Facial expressions hold a lot of information and can serve as clues in puzzle games. I believe this would be a rather interesting gameplay element. Just like in real life, trying to guess what others are thinking based on their expressions can be intriguing. Curiosity about this game mode also motivated me to create Chat Simulator. In this game, the correct choices are often reflected in the NPC's changing expressions.
Iteration
1. Game mechanics
Initially, I wanted to recreate real-life conversation scenarios by staging the game as a roundtable discussion. However, MetaHumans are performance-intensive, and loading a scene with five of them caused lag; in addition, severe distortions appeared when binding animations to the MetaHuman bodies. I therefore reworked the game into a video chat format, which resolved both issues and also portrayed the characters' facial expression changes more effectively.
MetaHuman with body deformities
2. User Interface
At first, I wanted to display all the information, including the two branching options, on the UI at once, but this made it hard to emphasize the most crucial information: the characters' facial expressions. So I made several changes. Regular dialogue and branching options are no longer displayed simultaneously, and I reduced the text size to enlarge the facial view. I also gave each character a distinct background to convey that the five characters are chatting from different places. Finally, I added game background images and used Photoshop to create elements such as rounded frames, speech bubbles, and call indicators, enhancing the UI's visual appeal.
Initial UI
Optimized UI