Thumbprint Animation helps bring the story of an Egyptian priest to life with 3D reconstructions of a 3,000-year-old mummy and the city of Karnak.
A new exhibition at the British Museum due to open July 1 promises to take the visitor back in time 3,000 years to Ancient Egypt. Entitled Mummy – The Inside Story, the exhibition features a cylindrical 3D projection theatre, which allows the visitor to explore what lies beneath the wrappings of Nesperennub – an Egyptian mummy housed at the museum since 1899.
Using SGI visualization technology to convert CAT scan data of the mummy, a lifelike 3D virtual model of Nesperennub, priest of Karnak, was created. Complementing the reconstruction, several 3D animation and live-action sequences produced by Windfall Films reveal the world in which the priest lived. Thumbprint artists Jon Burden, Simon Reid and Martin Davison created the 3D elements for the footage, while 2D compositing and effects were produced in Avid DS by Ian McPherson at post house The Station.
Thumbprint was invited by Windfall to pitch for the project on the strength of its previous work, in particular a documentary about Karnak for Channel 5 that it had completed in collaboration with The Station.
“We like to work with our clients from stage one,” says Simon Reid, 3D animator at Thumbprint. “Sometimes we work with sketches, we often storyboard ourselves or even produce sculptures from clay or plasticine.” (Both Davison and Reid previously worked as physical-effects modelmakers for film and advertising.)
However, for this project, Thumbprint was provided with a storyboard from the outset, and animatics and aesthetics were decided during the bidding stage, says Reid.
“The brief was pretty solid from the outset – make it big, make it real. The main concern of the director was that it should not look cartoony or stylized,” explains Reid. “We talked through the various shots with the director and producer before they left for Egypt and worked out a ‘shopping list’ of images we would require from them.
“This is very important for two reasons,” he continues. “Firstly, for reference regarding lighting, textures, landscape and atmosphere, and secondly to create realistic materials that we could apply to the 3D models, so they would more accurately match the live plates.”
Five different scenic backplates of Karnak were produced to house the live action, including the city of Karnak, Khons Temple and the Sacred Lake. With little remaining of the ancient city and its temple, reconstructing the buildings and landscape proved a challenge for the team. Fortunately, the Windfall production team had been thorough in documenting the remains at Karnak, and John Taylor, assistant keeper, Department of Ancient Egypt and Sudan at the British Museum, provided drawings and photographs, and approved the authenticity of Thumbprint’s 3D models.
“No one knows exactly how Karnak looked, so we were advised by Egyptologist John Taylor to underplay the amount of paintings on the wall and buildings,” explains Reid.
“If we’d had the time and resources, it would have been good to have added some people to the city, maybe even a few horses and cows. And a bit of dust or a little sand storm would have added depth to the scene,” he says.
The models of Karnak were created on Intel Xeon and Pentium 4 workstations using Alias Maya 4.5. “We chose Maya because we know it better than any other package,” says Jon Burden, 3D animator at Thumbprint. “But these days all the 3D packages can produce very similar results.”
“We were aiming for photo-realism, and I think we came pretty close,” says Martin Davison, 3D animator at Thumbprint. “You need to study the real world: the way light bounces, the way tint, de-saturation and black point change with distance, even depth of field as in photographs.”
The largest shot of the project involved a 48-second flyover of the city of Karnak that was designed to be projected from three cameras onto the cylindrical screen, filling the viewer’s peripheral vision.
SGI provided the Thumbprint team with a blueprint of its projection system, with distances and angles marked. The team then consulted a lens specialist who suggested a suitable field-of-view angle at which to render the images. Test renders using this angle were incorporated into several animatics of the flyover, which were then tested in a mock up of the projection system at SGI.
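The geometry behind that field-of-view choice can be sketched in a few lines. The following is a rough, hypothetical illustration in Python, assuming the three cameras divide the screen’s arc equally with a small edge-blend overlap; the actual angles and overlap used on the project were worked out with the lens specialist and are not stated in the article.

```python
import math

def per_camera_fov(covered_arc_deg=180.0, num_cameras=3, overlap_deg=5.0):
    """Split a cylindrical screen's total arc across several cameras,
    adding a small overlap for edge blending. All figures are illustrative."""
    return covered_arc_deg / num_cameras + overlap_deg

def vertical_fov(h_fov_deg, aspect):
    """Derive the vertical field of view from a horizontal FOV and the
    render's width:height aspect ratio (standard pinhole-camera relation)."""
    h = math.radians(h_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) / aspect))
```

Renders made at these angles could then be dropped into animatics and checked against the physical mock-up, as the team did at SGI.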
The path of the flyover was agreed with Windfall in advance of modelling the city in detail.
“Because of the enormous size – a resolution of 2,195-x-486 – and complexity of the sequence, we rendered in several layers: shadow, fog, depth, diffuse, reflection, and specular, then combined them all at the end,” explains Davison. “Also, because it had to look real, there was the issue of global illumination rendering times. We faked soft light and radiosity by setting up our own lighting system. The textures were then baked onto the models, which greatly reduced rendering times. Baking textures was originally developed for the games industry, but it allowed us to render in-house, avoiding the use of an external render farm.”
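At the composite stage, recombining layers like these comes down to simple per-pixel arithmetic. A minimal sketch in Python with NumPy, assuming conventional pass semantics (a shadow mask attenuates the diffuse contribution, specular and reflection add on top, and a depth-driven fog mask blends toward an atmosphere colour); on the project itself the combination was performed in Avid DS at The Station.

```python
import numpy as np

def composite(diffuse, specular, reflection, shadow, fog, fog_colour):
    """Recombine separately rendered passes into a final image.
    Colour passes are (H, W, 3) float arrays in [0, 1]; shadow and fog
    are (H, W) single-channel masks. Pass names follow the article's list."""
    beauty = diffuse * shadow[..., None]   # shadow attenuates direct light
    beauty = beauty + specular + reflection  # additive light contributions
    # fog mask blends toward the atmosphere colour with distance
    beauty = beauty * (1.0 - fog[..., None]) + fog_colour * fog[..., None]
    return np.clip(beauty, 0.0, 1.0)
```

Working this way lets each layer be re-rendered or adjusted independently without redoing the whole frame.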
As Windfall Films planned to use some of the exhibition footage in a television documentary of the project, to be broadcast after the opening of the exhibition, Thumbprint needed to re-render the sequence at 25fps without the lens distortion. “All the other sequences were designed with PAL in mind from the beginning of the project,” says Reid.
Planning is vital when it comes to working on animation projects, asserts Burden. “At the start of the project, we sat down with several coffees and worked out a detailed list of the requirements for each shot,” he says. “From this we generated a Gantt chart to fit it into our pipeline with the other projects we were running at the time. This meant we could plan and co-ordinate our resources as effectively as possible. However, new problems always crop up that just can’t be planned for, so there were a few very late nights and weekends in front of the good old Mitsubishi Diamondtron monitor, which always pleases the girlfriend.”
“We also kept in close contact with The Station, Windfall, SGI and, of course, the British Museum,” says Reid. “QuickTime animations and high-res test renders were emailed regularly to all so everyone could see how things – both technical and creative – were progressing, and allowed problems to be tackled early on.”
Thumbprint also had to adapt its pipeline to complete shots in a particular order for the compositor at The Station in order to coincide with the delivery of greenscreen elements and live action footage from the production team.
“Delivering the images to The Station in a precise sequence was one of the hardest aspects,” says Reid.
Thumbprint puts a face to Nesperennub – a 3,000-year-old mummy
Thumbprint worked with The Unit of Art in Medicine at Manchester University on the 3D reconstruction of the face of the mummy. The Unit supplied the team with 3D data from a medical sculpture produced from the skeletal remains. This was then converted and optimized for animation, with the polygon count reduced from two million to around 100,000, explains Davison.
Davison and his colleagues added eye colour, muscle texture, skin tone and facial hair to the model, and animated the elements to produce a revolving shot revealing the successive layers from skull to face.
The source textures used for the fat and muscle layers were supplied by Thumbprint’s local butcher. Chicken breasts, streaky bacon and beef steaks were all scanned in to help create the fleshy texture used on the model’s muscle layer. The fatty areas of these images were then compiled to help create the translucent look of the model’s fat layer, explains Davison. “Later that night we had a really good barbecue,” he recalls.
The mummy’s face was created by mapping the model with a photograph of the actor who played the character in the documentary, and the skull texture was created using standard layered material – a mix of bitmaps and procedural shading, says Davison.
Thumbprint used animated fall-off maps in the opacity channel to reveal, one by one, the various layers that make up the mummy’s head.
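The fall-off-map technique can be thought of as a soft threshold that an animated value sweeps through over the shot. A minimal, hypothetical Python version, assuming the fall-off map and the animated reveal parameter are both normalised to the range 0 to 1; the names and softness value are illustrative, not taken from the production files.

```python
import numpy as np

def reveal_opacity(falloff, t, softness=0.1):
    """Animated layer reveal. `falloff` is a per-pixel map in [0, 1];
    `t` sweeps from 0 to 1 over the animation. Pixels whose fall-off
    value is below t become transparent, with a soft edge of width
    `softness` so the wipe doesn't alias."""
    return np.clip((falloff - t) / softness + 1.0, 0.0, 1.0)
```

Keying `t` per layer, offset in time, produces the one-by-one peel from skull to face described above.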
“Once the model and textures were completed, we animated a camera to revolve around the head and finish in a position matching the live footage of the actor. The transition from CG to live footage was a simple crossfade,” adds Davison.