New third-party apps and creative methods allow you to create something far removed from the world of Animoji & Memoji.
It's been a few years since the iPhone X introduced users to the TrueDepth camera and facial recognition software, and with the release of the iPhone XR and iPhone 11 Pro, third-party app makers and creators have had more incentive than ever to explore TrueDepth's possibilities. This will interest any artist wondering whether their iPhone's AR toolkit can be unlocked for motion-capture work to produce something more substantive than a nodding chicken head to send to their friends.
Anyone wondering whether body-capture work is possible with the iPhone can try a technique pioneered by Cory Strassburger, co-founder of LA-based cinematic VR studio Kite & Lightning. Using an iPhone X in tandem with Xsens inertial motion-capture technology, Cory shows that you can produce simultaneous full-body and facial performance capture, with the final animated character live-streamed, transferred and cleaned via IKINEMA LiveAction into Epic Games' Unreal Engine. This all happens in real time, as the video below demonstrates.
The method relies on a DIY mocap helmet with an iPhone X pointed at the user's face, an Xsens MVN system and IKINEMA LiveAction to stream and retarget the motion to your character of choice in Unreal Engine. With this setup, users can act out a scene wherever they are, as Cory demonstrated at SIGGRAPH 2018.
Those interested in facial capture work, meanwhile, can have a go with Live Face, an app which featured on our sister site Macworld. Developed by Reallusion, this free app streams facial tracking data to a Mac or PC in real time: using your iPhone as a hotspot, it connects to your computer over Wi-Fi and tracks data points on your face.
Note, though, that the end data is received and processed by Reallusion's CrazyTalk Animator 3 suite; the app is currently compatible only with this software, meaning anyone without it will have to shell out £84.99/$89.99 on the App Store if interested.
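Live Face's wire protocol is proprietary, but a real-time tracking link of this kind generally boils down to small per-frame packets sent over the local network and applied to a rig on the desktop side. Here's a minimal, purely illustrative sketch in Python; the port number and JSON payload are made up for the example and are not Live Face's actual format:

```python
import json
import socket

PORT = 14785  # hypothetical port, for illustration only

def serve_one_frame(sock):
    """Receive one frame of tracking data and return its blendshape weights."""
    data, _addr = sock.recvfrom(4096)
    frame = json.loads(data.decode("utf-8"))
    return frame["blendshapes"]  # e.g. {"jawOpen": 0.42, "browInnerUp": 0.1}

if __name__ == "__main__":
    # Desktop side: listen for tracking packets on the local network.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", PORT))

    # Stand-in for the phone: send one frame of (made-up) tracking data.
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    packet = json.dumps({"blendshapes": {"jawOpen": 0.42, "browInnerUp": 0.1}})
    sender.sendto(packet.encode("utf-8"), ("127.0.0.1", PORT))

    weights = serve_one_frame(receiver)
    print(weights["jawOpen"])  # prints 0.42
```

In a real pipeline the receiving end would run inside the host application (here, CrazyTalk Animator 3) and apply each frame's weights to the character rig as it arrives.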
Another option for facial capture is CV-AR from Maxon, a free app also released in June that's compatible with Maxon's flagship Cinema 4D. The app captures your facial animation and sends it to C4D, textures and sound included, with the data stored locally on your iPhone inside the app itself.
The app is designed to make the capture and transfer of facial animation as seamless and effortless as possible; transfers are made possible by scanning a QR code, so there are no hotspot or USB options with this one.
A free sister app joined it in March 2020 in the form of Moves by Maxon, which enables iPhone X/XR and 11 Pro users to capture facial motion and whole-body movement, letting artists bring motion sequences into Cinema 4D with minimal technical effort.
A separate plugin, downloadable for free from the Cineversity website, is required to transfer the captured data from your Apple device to Cinema 4D. The plugin is only compatible with Cinema 4D Release 21 or later.
A more low-key release is Face Cap from solo developer Niels Jansson. This one is particularly interesting because its output works not only with C4D but also with LightWave 3D, Autodesk Maya, Houdini and Blender, putting its competitors to shame (insert blush-face emoji).
With Face Cap you can record 50 different facial expressions for up to 10 minutes at a time. It exports the generic head mesh, blendshapes and animation as ASCII FBX, and offers the native iOS sharing interface so you can email or Dropbox your recordings.
Lastly, let's have a little look at something different - the Animation for iPhone X service from France's Polywink.
Polywink is an online platform for 3D facial animation that aims to save studios and 3D professionals time and budget by automatically generating blendshapes and rigs. In other words, it's an outsourcing service, with its iPhone option allowing you to upload a neutral 3D head model and receive a character head ready to be animated.
The service automatically generates a set of 51 blendshapes adapted to the specific topology and morphology of your character. It closely follows ARKit's documentation, meaning you can plug your model into the ARKit Unity Plugin and let the iPhone's face tracking do the rest; no additional rigging or modelling is required. The service will set you back $299.00 per head, and promises a 24-hour turnaround.
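Under the hood, each of those blendshapes is a deformed copy of the neutral head, and the per-frame coefficients ARKit tracks (each between 0 and 1) mix them together using a standard weighted-delta blend: each vertex is the neutral position plus the sum of weight × (target − neutral). A minimal sketch of that formula, using a toy two-vertex mesh and a made-up "jawOpen" target rather than any real model data:

```python
# Toy illustration of how ARKit-style blendshape weights deform a mesh.
# Real models carry one full delta mesh per blendshape (around 51 of them,
# e.g. jawOpen, browInnerUp); here we use a two-vertex mesh and one target.

def apply_blendshapes(neutral, targets, weights):
    """Blend each weighted target into the neutral mesh: v = n + sum(w * (t - n))."""
    result = [list(v) for v in neutral]
    for name, weight in weights.items():
        for i, (base, target) in enumerate(zip(neutral, targets[name])):
            for axis in range(3):
                result[i][axis] += weight * (target[axis] - base[axis])
    return result

neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]                  # toy 2-vertex mesh
targets = {"jawOpen": [(0.0, -1.0, 0.0), (1.0, -1.0, 0.0)]}   # jaw drops 1 unit
print(apply_blendshapes(neutral, targets, {"jawOpen": 0.5}))
# prints [[0.0, -0.5, 0.0], [1.0, -0.5, 0.0]]
```

With a Polywink-style head, the ARKit Unity Plugin performs exactly this mixing every frame, driven by the coefficients streamed from the TrueDepth camera.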