How to Create 3D Characters Built on Storytelling

Concepting, 3D modeling/texturing, blendshapes, and integrating the Hyprface facial mocap solution

--

What makes a character alive and beloved? We asked this simple and powerful question — and from here, Sandie & Avery were born.

In this article, we will follow the steps of creating Sandie & Avery: concepting, 3D modeling/texturing, blendshapes, and integration of our facial mocap SDK, Hyprface.

Edit (Aug 24th, 2020): We finally released an avatar video-chat/streaming app that lets you create your own 3D avatars! Currently, the app, Hyprmeet, is free to download as a beta, so check it out and let us know what you think.

hyprmeet.com

It contains a built-in avatar customizing tool, a virtual webcam feature for streaming/video-call software such as Zoom, OBS, or Discord, and our real-time facial mocap technology. We are planning to add more features, such as VRoid 3D model support and button-triggered animations, so stay tuned!

What is Our Intention?

We, Hyprsense, develop facial mocap solutions that power characters to be animated in real time. While facial tracking technology is our core engine, we believe that CHARACTERS are the window through which the full potential of our technology is revealed.

What Makes a Character Beloved?

Our team wanted to create a girl and a boy who catch everyone’s eye and turn viewers into fans. Someone like… Anna & Kristoff, Judy & Nick, Rapunzel & Flynn — beloved duos we could name all day.

What makes a character beloved? The answer lies in a feature these characters all share: “imperfection.”

Storytelling built upon imperfection breathes life into a character and connects them with people’s emotions: why the characters have flaws like ours, and how they overcome them. On top of that, their appearance should visually reflect that story.

Keeping that in mind, let’s explore how our team found inspiration and created the beloved Sandie & Avery.

Sandie & Avery’s Concept

*We collected inspirational images on Pinterest.

First, we set our initial focus: Sandie & Avery are lovely, warm-hearted teenagers that anyone would fall in love with. We collected inspirational images of teenagers’ favorite figures these days. Who? Instagram influencers and K-pop stars! Soon the appearance concepts narrowed down to a beach-lifestyle look for Sandie and a K-pop boy-group look for Avery.

The next step was to build their story and “imperfection.” The challenging part was creating characteristics that could be conveyed through their appearance. After rounds and rounds of ideation, we came up with freckles and pastel colors.

*We collected inspirational images on Pinterest.

Disney-styled 3D Modeling

To make Sandie & Avery familiar to the general public, we followed the fundamentals of Disney character design and analyzed what makes a character “Disney-style.” Disney characters share similar body/face proportions and signature expressions.

Disney duo characters from Tangled (left) and Frozen (right)

Based on the research, the first version of 3D modeling and mock-up expressions came out. Here are some pieces of the very first version.

After a few iterations, we polished the look and signature expressions in a way that can be easily integrated into the Hyprface SDK.

Sandie’s 3D modeling, material, and skin texture maps
Avery’s 3D modeling, material, and skin texture maps

Signature Expression & Blendshapes

The next step is sculpting signature expressions and corresponding blendshapes. As the purpose of the characters is to showcase Hyprface facial mocap technology, the blendshapes of Sandie & Avery are optimized for smoother animation. Refer to docs.hyprsense.com for the list of Hyprface-supported blendshapes and SDK integration specifications.


Then we worked on joint controls and keyframe pose assets in Maya. For the PC integration of the Hyprface SDK, we created pose assets in Unreal Engine; the SDK also supports Unity, which we used to build our mobile demo app.

Once the blendshape values are combined, Sandie & Avery begin to show their own signature expressions. While integrating the Hyprface SDK, we also cleaned up colliding blendshape combinations and smoothed the expressions.
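For readers unfamiliar with how blendshape values drive a face, the combination step above boils down to simple linear math: each blendshape stores per-vertex offsets from the neutral mesh, and the tracked weights mix those offsets. Here is a minimal, illustrative sketch (the names and data are made up for this example, not the actual Hyprface API):

```python
import numpy as np

# A toy 3-vertex "mesh" standing in for a character's neutral face.
neutral = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])

# Each blendshape is a set of per-vertex deltas from the neutral mesh.
blendshapes = {
    "jawOpen":    np.array([[0.0, -0.2, 0.0],
                            [0.0, -0.1, 0.0],
                            [0.0,  0.0, 0.0]]),
    "mouthSmile": np.array([[0.1,  0.0, 0.0],
                            [0.2,  0.1, 0.0],
                            [0.0,  0.0, 0.0]]),
}

def apply_blendshapes(neutral, blendshapes, weights):
    """Deform the mesh: neutral + sum(weight_i * delta_i)."""
    mesh = neutral.copy()
    for name, w in weights.items():
        mesh += w * blendshapes[name]
    return mesh

# Weights in [0, 1] would come from the facial tracker each frame.
posed = apply_blendshapes(neutral, blendshapes,
                          {"jawOpen": 0.5, "mouthSmile": 1.0})
```

Because the combination is linear, independently sculpted shapes can collide when mixed at full weight — which is why corrective shapes and the cleanup pass mentioned above are needed.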

Converting 3D Characters for a Mobile App

PC version Sandie & Avery

Although the PC version of Sandie & Avery was ready, yet another quest awaited: creating a mobile version. To make testing our real-time facial mocap solution more accessible, we built a mobile demo app and included a mobile version of Sandie & Avery as a feature. The app is now released and available on both iOS & Android.

Hyprface mobile app screenshot images

To optimize the character assets, the mesh and look-dev data must be compressed. For mesh data, we reduced the number of polygon vertices and blendshapes by combining selected corrective shapes and extracting in-between shapes. For look-dev data, we reduced the size of the textures and packed multiple material maps (e.g., metallic, occlusion, smoothness) into one texture using its RGB channels. Refer to channel packing in Unity.
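The channel-packing idea can be sketched in a few lines: three grayscale material maps become the R, G, and B channels of a single texture, so the engine samples one texture instead of three. The arrays below stand in for loaded image data (this is an illustration of the technique, not our production pipeline):

```python
import numpy as np

# Three 8-bit grayscale material maps for a tiny 4x4 texture.
h, w = 4, 4
metallic   = np.full((h, w), 255, dtype=np.uint8)  # fully metallic
occlusion  = np.full((h, w), 128, dtype=np.uint8)  # half-occluded
smoothness = np.full((h, w),  64, dtype=np.uint8)  # fairly rough

# Pack: one HxWx3 texture instead of three HxWx1 textures.
packed = np.stack([metallic, occlusion, smoothness], axis=-1)

# Unpack at runtime: a shader simply samples .r, .g, or .b
# from the packed texture to recover each material map.
recovered_occlusion = packed[..., 1]
```

Besides cutting texture memory and download size, packing reduces the number of texture samplers a mobile shader needs, which matters on lower-end GPUs.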

Because of how AR apps interact with phone cameras, we focused on refining the characters’ final look while minimizing distortion from camera angle and field of view. As a result, the mobile version of Sandie looks rounder and simpler.

Mobile version of Sandie at various angles/fields of view

Creating a 3D character requires both artistic sense and technical skill, and we are grateful to have a team that covers both. That’s it for today — we hope you enjoyed reading this article. Download the Hyprface app, available on both iOS & Android, and request a trial SDK for collaborative projects.

Hyprsense develops real-time human sensing technology. Hyprface is our product-ready software, fully built in-house, that tracks expression values and remaps them onto a 3D character in real time. The SDK supports iOS, Android, Windows, Unity, Unreal, and Linux. If you are interested, feel free to ask us for a free 30-day trial SDK. Don’t forget to give us your 👏 !


Hyprsense develops real-time facial expression tracking technology to light up the creation of live animation.