You can do it on an iPhone with no additional equipment and can apply it to a MetaHuman model, which is a relatively high-quality human model that Epic provides tools to generate.
It’s worth noting that both the FaceCap and Moves By Maxon apps can also capture facial animation using only an iPhone with no additional equipment.
It’s hard to tell from the video, but I wouldn’t be surprised if they used their face capture animation to drive wrinkle maps[1] on their MetaHuman models. This is something that other apps don’t offer out of the box, but which can significantly increase realism.
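For the curious, here's a minimal sketch of what driving wrinkle maps from capture data can look like: the blendshape coefficients coming out of face capture get remapped into per-region mask weights, and a material then uses those weights to blend in wrinkle normal maps. The coefficient names (browInnerUp, browDownLeft, ...) follow ARKit's naming, but the region mapping itself is hypothetical; every rig defines its own.

```swift
// Hypothetical sketch: deriving wrinkle-map mask weights from capture
// blendshape coefficients. The WrinkleMask regions are made up for
// illustration; a real character rig defines its own mapping.
struct WrinkleMask {
    var foreheadWeight: Float = 0  // drives forehead wrinkle normal map
    var browWeight: Float = 0      // drives frown-line map between the eyes
}

func wrinkleMask(from blendShapes: [String: Float]) -> WrinkleMask {
    var mask = WrinkleMask()
    // Raising the brows compresses the forehead skin -> forehead wrinkles.
    mask.foreheadWeight = blendShapes["browInnerUp", default: 0]
    // Furrowing the brow creates vertical frown lines.
    mask.browWeight = max(blendShapes["browDownLeft", default: 0],
                          blendShapes["browDownRight", default: 0])
    // A shader would then blend normal maps by each mask weight, roughly:
    //   normal = mix(neutralNormal, wrinkleNormal, maskWeight)
    return mask
}
```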
What do you do with those app outputs though? I’m guessing it still feels hard because the output won’t automatically map to a mesh that’s ready to go in your game engine?
The MetaHuman drop-in models are what makes this magic, imo. It’s not the motion capture tech so much as the complete pipeline to produce game assets.
FaceCap gives you an FBX animated with about 50 different morph targets. If you wanted to use that in a game, you’d probably need to load all the morph targets into a shader yourself.
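To make that concrete, here's a rough sketch of what that blending amounts to: each morph target is a set of per-vertex offsets from the base mesh, and each animation frame supplies a weight per target. The types and names below are illustrative, not FaceCap's actual FBX layout.

```swift
// Linear blend-shape evaluation: result = base + sum(weight_i * offsets_i).
// MorphTarget and the weight dictionary are illustrative stand-ins for
// whatever you parse out of the FBX.
struct MorphTarget {
    let name: String            // e.g. "jawOpen", "mouthSmileLeft"
    let offsets: [SIMD3<Float>] // one offset per base-mesh vertex
}

func blendVertices(base: [SIMD3<Float>],
                   targets: [MorphTarget],
                   weights: [String: Float]) -> [SIMD3<Float>] {
    var result = base
    for target in targets {
        // Skip targets this frame's animation doesn't activate.
        guard let w = weights[target.name], w != 0 else { continue }
        for i in result.indices {
            result[i] += target.offsets[i] * w
        }
    }
    return result
}
```

In practice you'd run this sum in a vertex or compute shader with the offsets packed into GPU buffers, which is exactly the plumbing the MetaHuman pipeline spares you.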
I agree that the automatic integration with MetaHuman is the main benefit.
Actually you’ve already been able to do that for a few years now with Epic’s LiveLinkFace app (which was basically just a wrapper around Apple’s ARKit). This is just their own (much higher quality) version of it.
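For reference, this is roughly the data ARKit hands such an app each frame; the blendShapes API shown here is real, while the part where LiveLinkFace streams the coefficients to Unreal over the network is elided.

```swift
import ARKit

// Each frame, ARFaceAnchor exposes ~52 blendshape coefficients in 0...1.
// A capture app forwards this coefficient set to the engine.
class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let shapes = faceAnchor.blendShapes
            let jawOpen = shapes[.jawOpen]?.floatValue ?? 0
            let browUp = shapes[.browInnerUp]?.floatValue ?? 0
            // ...send the full coefficient set over the network here.
            print("jawOpen: \(jawOpen), browInnerUp: \(browUp)")
        }
    }
}

// Usage: requires a device with a TrueDepth camera.
let delegate = FaceCaptureDelegate()
let session = ARSession()
session.delegate = delegate // ARSession holds its delegate weakly
session.run(ARFaceTrackingConfiguration())
```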
I keep seeing people say this without explaining what the huge thing strapped to the actor’s face is, since it clearly isn’t a phone. It isn’t explained in the article either.
I’m not sure what you mean. There’s one shot in the video where they show a full motion rig and clearly explain that the system works with either a phone or a professional stereo capture system.
I’m guessing that’s for when you want to move your body around and capture the facial expressions that go with the movement. You need some sort of rig to keep the camera pointed at your face while you’re moving.