
FEATURE: Making mocap moves

Motion capture, or mocap, has long been heralded as a solution for producers’ two biggest challenges—constraints on cost and time. But the tech’s uptake has been slow because, at least initially, the bulky motion-capture suits (those dark outfits that are sometimes covered with dots), high-priced software and slow rendering processes made it expensive and complicated.

Even today, mocap hasn’t fully shed its reputation as an enigmatic and prohibitively expensive process. As a result, it’s still alien to a lot of producers who are used to animating 52 x 11-minute shows in more traditional ways.

But thanks to inexpensive licensed software, a surge in new motion-capture studios and apps on smartphones that can turn live-action footage into animation, mocap is booming. And kids content producers are starting to embrace the tech, finding ways to make mocap their own and developing new projects that lean into its strengths.

Mocap is helping studios deliver content quickly—an especially critical capability when it comes to capturing the attention of digital audiences before they move on to the next IP. How quickly, you ask? Australian mocap studio Fika Entertainment (Teletubbies Let’s Go!—pictured above, PJ Masks in Real Life) was able to produce 300 minutes of finished animation in just 12 months with a team of fewer than 50 people, says CEO Jordan Beth Vincent. (To put that in perspective, hundreds of animators typically spend 18 to 24 months on the 572 minutes of a traditionally produced 52 x 11-minute series.)

Fika Entertainment produced 300 minutes of finished animation in only a year using a small team of motion-capture performers and technicians.
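
The arithmetic behind that comparison is worth spelling out. The sketch below is a rough, back-of-the-envelope calculation using the figures quoted above; the exact team sizes (50 and 200 people) are assumptions standing in for “fewer than 50” and “hundreds,” so treat the output as an order-of-magnitude comparison rather than a precise benchmark.

```python
# Back-of-the-envelope throughput comparison using the figures quoted above.
# Team sizes are assumptions: the article says only "fewer than 50" people for
# the mocap pipeline and "hundreds" of animators (200 here) for traditional CG.

mocap_minutes, mocap_people, mocap_months = 300, 50, 12
trad_minutes, trad_people, trad_months = 572, 200, 24   # 52 x 11-minute series

mocap_rate = mocap_minutes / (mocap_people * mocap_months)  # finished minutes per person-month
trad_rate = trad_minutes / (trad_people * trad_months)

print(f"mocap:       {mocap_rate:.2f} min per person-month")   # ~0.50
print(f"traditional: {trad_rate:.2f} min per person-month")    # ~0.12
print(f"rough speed-up: ~{mocap_rate / trad_rate:.0f}x")        # ~4x
```

Even with generous assumptions for the traditional team, the per-person output of the mocap pipeline comes out several times higher, which is the gap the rest of this piece keeps returning to.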

Poland’s Platige Image (The Witcher), which opened its own mocap studio in March, has seen how developments in motion-tracking systems, animation software and cameras have sped up the back end of mocap work. The tech’s cost has also come down to the point where talent can sometimes be the most expensive part of a shoot, says Elżbieta Trosińska, a motion-capture producer at the studio.

The quality of work a producer can get from mocap has improved so much that characters have a realistic feel, which can be difficult to achieve in traditional animation, says Halle Stanford, president of television for The Jim Henson Company, which has been working with mocap tech for about 20 years. Famous for its iconic Muppets, Henson sees a lot of similarities between its early puppetry work and what it’s doing these days with mocap. In the past, it built hand-crafted characters that puppeteers brought to life; it’s doing much the same thing now, but with the high-tech twist of using computers to turn performances into finished animation.

Stanford says Henson has identified two underrated uses for the tech: capturing movement (especially dance) and creating characters that can really resonate with kids. And the studio is in the process of developing two new preschool shows that lean into these strengths.

In Monster Jam—created by veteran puppeteer John Tartaglia (Sesame Street) and former Nickelodeon exec Russell Hicks (SpongeBob SquarePants)—monsters of all shapes and sizes dance on screen and encourage kids to get up and be active. And Dani Can Dance (pictured below) from Alex Rockwell (Word Party) centers around a young girl who learns different styles of dance in each episode, ranging from ballet to Bollywood.

“The style of movement and dance, where it looks real, gives these shows an aspirational feel [that inspires kids] to get moving, and that’s something we couldn’t capture in traditional animation,” says Stanford. In Dani Can Dance, mocap makes it easy to change the sets so Dani can be dancing anywhere. This opens up the world of the show in a way that wouldn’t be possible without expensive animation, or traveling to different locations in the case of live action.

In Dani Can Dance, mocap backgrounds can show Dani dancing anywhere, saving on costly animation.

Henson has its own proprietary hardware called the Henson Digital Performance System, through which performers’ motions are captured and used to move animated characters in real time. Combined with other animation tools like Epic Games’ Unreal Engine, Henson’s performers can see a rendered, almost-finished version of what the animation will look like on the monitor while they’re puppeteering.
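
The Henson Digital Performance System is proprietary and its internals aren’t public, but the real-time loop described above (capture a performer’s motion each frame, retarget it onto the character rig, and push the result to a game-engine renderer so the puppeteer can see it immediately) can be sketched in generic terms. Everything below, including the class and function names, is a hypothetical illustration rather than Henson’s or Epic’s actual API.

```python
import time
import random

# Hypothetical stand-ins for a capture device and a game-engine renderer.
# A real pipeline would talk to mocap hardware and to an engine such as Unreal
# (e.g. via Live Link); here both ends are simulated so the loop is runnable.

JOINTS = ["hips", "spine", "head", "l_arm", "r_arm"]

class FakeCaptureDevice:
    """Simulates a performer: returns one rotation (degrees) per joint each frame."""
    def read_frame(self):
        return {joint: random.uniform(-45, 45) for joint in JOINTS}

class FakeRenderer:
    """Stands in for the real-time engine drawing the character on the monitor."""
    def draw(self, pose):
        summary = ", ".join(f"{j}={r:+.1f}" for j, r in pose.items())
        print(f"render frame: {summary}")

def retarget(raw_pose, scale=1.0):
    """Map performer joint rotations onto the character rig (trivially, here)."""
    return {joint: rotation * scale for joint, rotation in raw_pose.items()}

def run_live_session(frames=5, fps=30):
    device, renderer = FakeCaptureDevice(), FakeRenderer()
    for _ in range(frames):
        pose = device.read_frame()        # capture the performer's motion
        character_pose = retarget(pose)   # drive the animated character with it
        renderer.draw(character_pose)     # puppeteer sees the result immediately
        time.sleep(1 / fps)

if __name__ == "__main__":
    run_live_session()
```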

Today’s mocap characters are more polished and can be pretty much fully rendered right away, which leads to shorter post times, says Stanford, adding that the tech has developed to the point where the company could even have its characters do live interviews. And mocap characters, with their tactile look and realistic movements, appeal to audiences across generations in the same way that puppets can.

Beyond mocap’s improved quality, the speed at which it can produce content is a major reason why it’s now becoming a critical tool for producers. Demand for animation has been accelerating, due in large part to audiences who are used to YouTube delivering new content to them constantly. But animation production has not kept pace, says Tom Box, co-founder of Blue Zoo Animation in the UK.

To satisfy that appetite, Blue Zoo is developing its own motion-capture hardware and software called MoPo, which doesn’t require suits and lets animators quickly animate new content for digital distribution. With MoPo, an animator can don a VR headset and use handheld controllers to puppeteer a character layer by animated layer. They can move any part of the character and then see exactly what it looks like on the monitor—in real time, thanks to Unreal Engine—before moving on to the next part of the character. Epic Games gave Blue Zoo a grant in April to develop MoPo, and the studio is already having a lot of success with it.
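
MoPo itself isn’t publicly documented, but the layer-by-layer workflow Box describes (record one part of the character while the passes already captured play back underneath, then move on to the next part) can be sketched roughly as follows. The class and data below are hypothetical stand-ins, not Blue Zoo’s software.

```python
# Hypothetical sketch of layered puppeteering: each pass records keyframes for
# one body part while all previously recorded layers play back underneath.

class LayeredPuppeteer:
    def __init__(self, character_parts):
        self.parts = character_parts
        self.layers = {}   # part name -> list of per-frame values

    def record_layer(self, part, controller_samples):
        """Record one animation pass for a single part (e.g. from VR controllers)."""
        if part not in self.parts:
            raise ValueError(f"unknown part: {part}")
        self.layers[part] = list(controller_samples)

    def playback_frame(self, frame):
        """Combine every recorded layer into the pose shown on the monitor."""
        return {part: samples[frame % len(samples)]
                for part, samples in self.layers.items()}

# Usage: animate the head first, then add an arm layer on top of it.
rig = LayeredPuppeteer(["head", "l_arm", "r_arm", "body"])
rig.record_layer("head", [0, 5, 10, 5, 0])       # nod
rig.record_layer("l_arm", [0, 20, 40, 20, 0])    # wave, layered over the nod
for f in range(5):
    print(rig.playback_frame(f))
```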

Significantly, a creator can animate one minute of content per day with MoPo, while it can take a few weeks to produce that much using traditional CG-animation tools, says Box.

In recent months, Blue Zoo has been busy building use cases for the technology. It’s producing 12 one- to two-minute episodes of the comedy series Silly Duck for its Little Zoo YouTube channel (414,000 subscribers), highlighting a man’s attempts to do everyday things (like boarding a bus or cooking) with a duck sidekick who is always getting him into ridiculous trouble. Released back in January, the first episode alone has racked up more than 600,000 views on YouTube, and the series as a whole has netted upwards of 1.3 million views. This popularity illustrates that there’s demand for simpler, less expensive animation that can be delivered to audiences fast, says Box.

In the long term, Blue Zoo’s goal is to license out its MoPo software for other companies to use. But that isn’t an overnight process, as it would require a team to support the product and manage licensing.

It’s early days for the tech, but the studio is developing new music-focused projects that are made specifically for MoPo and will launch on YouTube. And it’s also in discussions with studios and broadcasters about providing service work and animating third-party shows with the software.

“One person can do this work from home, instead of needing a team of multiple people,” says Box. “Kids also really embrace the low-fi, frenetic energy in the content you can make with mocap, and now we can use Unreal to render it so it looks more polished.”

While producers like the speed and quality of mocap, not all of them are after the puppeteered look that often comes with it. But one previous holdout, Vancouver-based Relish Studios (Barbie’s Dreamworld), is now embracing mocap because the tech is proving to be much more versatile than it originally thought.

“Mocap has been said to be the silver bullet of the animation industry for 10 years, and animators hearing about it can be quick to greet it with healthy skepticism,” says Relish CEO Paul Pattison. “But what we’re finding now is that AI can make the animation more lifelike, and less clean-up work is required.”

In its new projects, the studio is exploring three different mocap techniques: full mocap suits; the app Move.AI (AI-driven iPhone software that can track movement and overlay it with animation); and licensed software from LA-based Wonder Dynamics that allows Relish to shoot regular live-action footage, then drag and drop elements onto the timeline during editing (the software instantly makes the real actor look like an animated character).
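
What the three approaches share is the back half of the pipeline: however the motion is captured, it ends up as tracked performance data that gets retargeted onto a character and rendered. The sketch below models that shared shape with placeholder capture back-ends; it does not use Move.AI’s or Wonder Dynamics’ real interfaces, which aren’t reproduced here.

```python
# Hypothetical sketch: three interchangeable capture back-ends feeding one
# shared retarget-and-render step. The back-end classes are placeholders, not
# the real Move.AI or Wonder Dynamics interfaces.

class SuitCapture:
    def track(self, source):
        return f"skeleton from mocap suit session '{source}'"

class PhoneAICapture:
    def track(self, source):
        return f"skeleton estimated by phone-based AI from video '{source}'"

class LiveActionOverlay:
    def track(self, source):
        return f"actor in footage '{source}' replaced with a CG character"

def produce_shot(backend, source, character):
    performance = backend.track(source)               # capture step differs per technique
    return f"{character} animated using {performance}"  # retarget + render are shared

for backend, source in [(SuitCapture(), "stage_take_01"),
                        (PhoneAICapture(), "iphone_clip.mov"),
                        (LiveActionOverlay(), "scene_12.mp4")]:
    print(produce_shot(backend, source, "hero_character"))
```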

Developments like these make Pattison feel optimistic about mocap’s potential to help studios produce content faster and cheaper. But one big question remains: How far can the tech go in delivering quality animation that’s less expensive and faster to turn around? “Can we shoot animation the way live action gets shot? That’s going to be the real experiment.”

This story originally appeared in Kidscreen’s October/November 2023 magazine issue.
