How Performance Capture Will Bring the Metaverse to Life
There is undeniable overlap between the gaming industry and the metaverse, but they are not interchangeable. VR headsets may allow us to enter the metaverse, but its success as an immersive environment will be determined by movement.
Brett Ineson, President of Animatrik, discusses why real-time performance capture will be critical in bringing our digital selves to life in the future shared online world.
Brett Ineson has nearly 20 years of visual effects experience and serves on the board of the Motion Capture Society. After working with industry leaders such as Weta Digital, he founded Animatrik Film Design in 2004 to specialize in performance capture for film, games, and television.
Animatrik is based in Vancouver and operates the largest independent motion capture stage in North America, with a second location in Los Angeles, California. Its recent credits include Aquaman, Spider-Man: Homecoming, Gears of War 5, Avengers: Endgame, Ready Player One, X-Men: Dark Phoenix, and Rogue One: A Star Wars Story.
Without a doubt, the metaverse is one of the most hotly debated concepts in the tech world. There is much speculation about what it is and how we will interact with it, but one thing is certain: realistic simulation of movement will be one of the foundational pillars.
Brands that are already positioning themselves and investing in the metaverse must create content that can withstand scrutiny in a 3D open-world environment similar to gaming worlds. In fact, when something is described as being “in the metaverse,” it usually means one of those popular online games that let users roam freely and interact socially, such as Fortnite or Roblox.
There is undeniable overlap between the gaming industry and the metaverse, but they are not interchangeable. Consumer behavior in gaming has paved the way for new experiences such as virtual concerts and mixed reality events, which have branched out into other areas of entertainment. On the technological front, the convergence of media and the power of game engines such as Unreal Engine 5 is now spreading far beyond the games industry, with use cases becoming more common in film, television, and digital asset creation in general.
Bridging the Virtual and Physical Worlds
Thriving communities have long existed in gaming worlds, but more is required if we are to bridge the virtual and physical worlds. VR headsets may allow us to enter the metaverse, but its success as an immersive environment will be determined by movement. Any brand or creator wishing to stage an event in the metaverse faces a challenge in recreating realistic movement in a digital environment. This is where performance capture comes in: technology that records a performer’s entire body and face, rather than body movement alone as in traditional motion capture.
The performer is given the opportunity to truly embody the digital character they inhabit on the virtual stage, right down to the nuances of their facial expressions. Because of the evolution of this technology, as well as the limitations of live events in recent years, the popularity of virtual concerts and interactive events has skyrocketed.
A-list celebrities ranging from Justin Bieber to Ariana Grande have entered these digital spaces accessible to fans all over the world. One of the first examples was Lil Nas X performing in Roblox in front of a 33-million-strong audience. While the haptic element has yet to be realized, attendees of these shows were able to interact with the artist and each other in real time, creating a collective social experience.
Real-Time Rendering Is Essential for Live Experiences
Without real-time rendering, virtual live experiences like the ones described above would be impossible. The technique, typically implemented in game engines such as Unreal Engine 5 or Unity3D, analyzes and produces images in real time, allowing users to interact with the render as it is being generated. Real-time rendering is frequently used in virtual production and in-camera VFX for TV and film, with notable examples including the Matrix Awakens Unreal Engine 5 demo, The Mandalorian, and The Lion King.
A sense of immediacy is essential for a memorable virtual concert experience. Live performance capture combined with real-time rendering has the potential to establish that human connection between performer and audience within the digital space.
The Justin Bieber virtual concert experience even gave fans a behind-the-scenes look, at one point displaying Bieber himself in the corner of the screen to demonstrate that his movements were in sync with his digital avatar on stage. These experiences are regarded as forerunners of the metaverse, but they are also native to and thrive in online gaming environments.
Artistic Expression Through Movement
The widespread use of performance capture is opening up new avenues for artistic expression, both within and outside of the metaverse. The metaverse has the potential to take all existing forms of virtual events to the next level, from virtual concerts with worldwide fan participation to moving art installations and mixed reality circus performances. Movement as art is already a distinguishing feature of virtual forms of entertainment. The use of avatars in games already demonstrates this; how they appear, how users can interact with them, and their mannerisms are all defining elements of the gameplay experience.
Users frequently buy and sell skins and accessories within games to make their avatars stand out. This will be especially true in the metaverse, where the avatar is not just a character the user inhabits within the game’s world but a true representation of the person. Customization will extend beyond features, skin tone, and hair to mimicking the way a person walks, smiles, laughs, or frowns.