Renaissance sculpture Adam brought to interactive life with mocap and digital puppetry

The Metropolitan Museum of Art in New York is staging an interactive digital performance installation to celebrate the restoration and return of the Italian Renaissance sculpture Adam (ca. 1490–95) by Tullio Lombardo. 

The statue suffered a catastrophic fall in 2002, but has since been painstakingly restored.

Designed and directed by new media artist Reid Farrington, and commissioned by the live arts series Met Museum Presents, The Return blends digital animation with live performance and motion capture to tell the story of the sculpture’s creation, travels and return to the gallery.

Farrington said, “My vision was to bring Adam to life in a believable and genuinely interactive way. By using a motion capture rig and IKinema LiveAction for Unreal Engine 4 to drive the animation in real time, I’ve been able to deliver the level of realism I wanted.”

Animation design consultant Athomas Goldberg of Lifelike & Believable designed and built the digital puppetry system, which enables visitors to interact in real time with ‘digital Adam’.

Guests can speak directly to the digital character and pose questions, as well as visit the mocap theatre within the Museum for a behind-the-scenes experience.

The Return has been more than two years in development, and from the outset the team agreed on a fundamental principle: no pre-recorded material. Everything is generated live, ensuring each visitor’s experience is unique and engaging.

The result is two hours of material spanning 14 scenes with two characters – digital Adam and a museum docent who leads visitors through the performance.

As the performance runs all day during Museum hours, three pairs of performers share the role; when not performing, they have been trained to drive the puppetry system, controlling the pre-set lighting, audio and effects. The 16-camera OptiTrack system is hooked up to NaturalPoint’s Motive software, which streams the mocap data to IKinema LiveAction for solving and retargeting into Unreal Engine 4.

Goldberg said: “We’re using IKinema LiveAction to drive both the characters and the props. There are other full-body IK solutions out there, but nothing that gave me the flexibility and modularity to create a runtime rig exactly to my specifications, with the ability to easily adapt to each actor’s unique proportions in a wide variety of rapidly changing environments and situations.”
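The retargeting step Goldberg describes, adapting captured motion to each performer’s proportions, can be sketched in simplified form. The function below is a hypothetical illustration, not the IKinema LiveAction API: it rescales a chain of captured joint positions so the same pose fits a target skeleton with different bone lengths.

```python
# Hypothetical sketch of positional retargeting (not the IKinema API):
# keep each captured bone's direction, but rescale its length to match
# the target character's proportions.

def retarget(pose, target_lengths):
    """Remap a joint chain (root first) onto target bone lengths.

    pose: list of (x, y, z) joint positions forming a simple chain.
    target_lengths: desired length of each bone between consecutive joints.
    """
    out = [pose[0]]  # the root joint stays where it was captured
    for i in range(1, len(pose)):
        # Direction vector of the captured bone
        dx = [pose[i][k] - pose[i - 1][k] for k in range(3)]
        norm = sum(c * c for c in dx) ** 0.5 or 1.0
        scale = target_lengths[i - 1] / norm  # rescale to target proportion
        prev = out[-1]
        out.append(tuple(prev[k] + dx[k] * scale for k in range(3)))
    return out

# Example: an actor's 1 m "bones" mapped onto a character with 0.5 m bones.
pose = [(0, 0, 0), (0, 1, 0), (0, 2, 0)]
print(retarget(pose, [0.5, 0.5]))
```

A production solver like LiveAction works on full rotational skeletons with per-joint constraints, but the core idea is the same: the solve separates the performer’s motion from their proportions, so one rig definition can serve all three pairs of performers.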
