Date in The Metaverse Project
Project Description
The project explored the intersection of virtual reality and television entertainment. This pilot initiative focused on animating a blind date scenario within a metaverse environment, using Unreal Engine 5 (UE5) as the foundational technology. The project sought to examine the interpersonal dynamics between two contrasting virtual characters during a date: one appearing as a realistic human (a MetaHuman) and the other as a fantastical monster.
Innovation
The innovation of this project lay in its experimental approach to storytelling and audience engagement. By leveraging the metaverse concept, the team aimed to redefine the viewing experience of a reality TV setup. It showcased how advanced digital technologies could blur the boundaries between the real and the virtual, offering new narrative possibilities and a fresh perspective on character interaction and development in a controlled environment.
Integrated Production Pipeline and Tools Explored
• Unreal Engine 5: Epic Games' real-time engine was used to build photorealistic environments and render the characters' animated performances.
• Live Link Face: Epic's iOS app enabled real-time facial animation capture during the live blind date shoot, streaming the nuances of human expression captured via ARKit on the iPhone's TrueDepth camera.
• Audio2Face: NVIDIA's tool was explored for its potential to automate facial animation, generating lip-sync and expression directly from the dialogue audio, particularly for the MetaHuman character.
• Quixel Megascans: Utilized for high-fidelity environmental textures and assets.
• Mixamo: Adobe's library of rigged 3D characters and animations, used as a source for character body motion.
• Maya: Autodesk's 3D computer graphics application was used to create and refine the facial blendshapes essential to the characters' expressive animation.
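
To illustrate the blendshape concept underlying the facial pipeline above, the sketch below shows how per-vertex shape offsets are combined by animation weights into a final facial pose. This is a minimal, hypothetical example (not project code); the shape names echo ARKit-style channels like those streamed by Live Link Face, but the vertex data here is invented for illustration.

```python
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

def apply_blendshapes(
    neutral: List[Vec3],
    shapes: Dict[str, List[Vec3]],   # per-shape vertex offsets from the neutral mesh
    weights: Dict[str, float],       # animation weights, typically in [0, 1]
) -> List[Vec3]:
    """Deform the neutral mesh by the weighted sum of active shape offsets."""
    result = [list(v) for v in neutral]
    for name, weight in weights.items():
        if weight == 0.0 or name not in shapes:
            continue
        for i, offset in enumerate(shapes[name]):
            for axis in range(3):
                result[i][axis] += weight * offset[axis]
    return [tuple(v) for v in result]

# Toy example: a single vertex and two hypothetical shapes.
neutral = [(0.0, 0.0, 0.0)]
shapes = {"jawOpen": [(0.0, -1.0, 0.0)], "smile": [(0.5, 0.2, 0.0)]}
pose = apply_blendshapes(neutral, shapes, {"jawOpen": 0.5, "smile": 1.0})
print(pose)
```

In production, the capture app supplies the per-frame weights while the engine (or Maya, during authoring) owns the meshes, so refining a blendshape changes the character's expression range without touching the animation data.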