UK visual-effects studio Mainframe demonstrates the fine art of creating and shattering porcelain characters in CG, before piecing them back together in this TV promo.

Shattered lives pieced back together – that’s the premise behind Virgin Media Television’s new show Rehab, which delivers an uncompromising look at addiction.

The show, which follows seven people in the public eye as they look to rebuild their lives through rehab, demanded a startling series of visuals.

The brief for visual-effects studio Mainframe was simple: visualize the piecing back together of a broken life. “We originally pitched an idea employing different techniques for each of the participants – the porcelain smash, a hollow man effect, a deflated figure and a participant deconstructed in slices,” says Mainframe’s creative director Lee Walker.

“In the end, the client preferred the porcelain route and wanted to use it for all the clips.” Yet while the initial direction was straightforward, the project wasn’t without its challenges.

The visuals demanded that every shot look as though it came from the same explosion, which made continuity tricky to nail down.

The solution – a generic explosion that could be added to each shot – had to be mapped to unique models, and the team had to rely on plain old hard graft to fix and re-time the explosions.
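That re-timing idea can be illustrated with a minimal Python sketch: a single generic radial burst, given a per-shot time offset and scale so every cut appears to come from the same explosion. The drag model, names and values here are illustrative assumptions, not Mainframe’s actual RayFire setup.

```python
import math

def burst_position(direction, speed, t, drag=0.9):
    """Position of a fragment at time t for a radial burst from the
    origin, with simple exponential drag - an illustration, not
    RayFire's actual solver."""
    k = -math.log(drag)                      # so that drag**t = exp(-k*t)
    travelled = speed * (1.0 - math.exp(-k * t)) / k
    return tuple(d * travelled for d in direction)

def retime(direction, speed, t, offset=0.0, scale=1.0):
    """Re-time the same generic burst for a different shot: shift its
    start and stretch its clock so every cut appears to come from one
    explosion."""
    return burst_position(direction, speed, max(0.0, (t - offset) * scale))
```

With `offset` and `scale` per shot, the one canonical burst can be slipped and stretched against each plate without changing the fragments’ paths.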

By relying on a particle-effect solution, Mainframe managed to make it look as though everything was demolished in the same explosion. Deadlines were tight, says Walker.

“The shoot [on a Red One camera] took place the week before Christmas, and we had time to get preliminary cuts signed off before the holiday. Then we had just three weeks to complete all four animations – two of the animations being delivered at the end of two weeks.

“The three-week process included refining edits, previz animations and sign-off, modelling all elements, all animation, compositing – and bleary eyes all round.

“Because of the short turnaround, the team needed to be quite small, and it included three modellers, two animators and a compositor.”

The workflow was fairly simple, says Mainframe’s senior 3D artist Jimmy Johansson. A team of three worked on model creation, with the models then shattered using the RayFire plug-in for 3DS Max, while Johansson worked on establishing lighting and rendering preparation.

Render and material presets were created so that when the shattering was completed, the presets were loaded and the segments were ready to render.

Generic pieces were added where needed, and small amounts of flowing debris were added before rendering. RayFire proved a workable solution, arrived at after an initial stumbling block: the team first tried to create everything from scratch in Maya, says Johansson, but quickly realized they simply didn’t have time to script all the particles.

“We were looking into several plug-ins to solve the shattering effect and, after studying Blast Code for Maya and RayFire for 3DS Max, we quickly decided that we should do most of the project in 3DS Max,” he says.

“RayFire seemed more promising for what we were creating and especially for the particles. Utilizing Particle Flow within 3DS Max has been a saviour for us – today we see it as a useful particle plug-in and it will be integrated in our pipeline for future projects.”

For a Maya-based studio, this was a bold move. “We first thought we would be able to use Maya as our primary application for this project, and it worked fine up until a certain point,” says Johansson.

He continues: “We were exporting and importing between 3DS Max and Maya all the time, but we hit the wall when we were supposed to import around 5,000 small debris objects into Maya.

“That was when we decided that we should do pretty much everything in 3DS Max. I have next to no professional experience in 3DS Max, which became a huge challenge for me. We ended up doing all the models and light rigs in Maya and exported them as .FBX into 3DS Max – and it worked like a charm.”

Initial modelling was still handled with Maya over 4K Red plates, making constant reference to photos from the shoot. The models needed to be camera-mapped, so the team homed in on creating precise outlines of the models.

With the modelling locked down, the models were camera-mapped in 3DS Max, with the models shelled and material IDs established. Mainframe’s senior 3D artist Arvid Niklasson says that the shell modifier in 3DS Max proved an invaluable tool.

“It was perfect to do this in Max, since the shell modifier lets you define the different IDs at creation, so all we needed to do was add the multi-sub shader on the shelled model and it was ready to go.

“Each model had three material IDs. The outer material was the camera map, the inner material was shiny porcelain, and a third material was used for the cracks.”
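As a toy sketch of that ID scheme in Python: the roles, ID numbers and shader names come from the quote above, but the data structures are purely hypothetical – the real work happens inside 3DS Max’s shell modifier and multi-sub material.

```python
# IDs mirroring the quote above: 1 = camera-mapped outer surface,
# 2 = shiny porcelain inner surface, 3 = crack edges.
MULTI_SUB = {1: "cameraMap_vray", 2: "porcelain_vray", 3: "crack_vray"}

def assign_material_ids(faces):
    """Given faces tagged by a (hypothetical) shell step as 'outer',
    'inner' or 'edge', return the per-face ID list a multi-sub
    material would look up."""
    role_to_id = {"outer": 1, "inner": 2, "edge": 3}
    return [role_to_id[face["role"]] for face in faces]
```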

Material concerns

These materials were created using V-Ray materials, with the outer surface material made up of an image of the actor from a specific shot. A second material was used for the edges of the pieces seen as they fly apart: a matte, almost granular look, achieved with a noise bump.

The last material – deployed for the inside of the pieces – was highly reflective and carried the highlights. It was rendered as a white surface with an 80 per cent reflective material, giving finer control over it during compositing.

Technical issues also tested the team, according to Johansson: “The first issue was to have small debris objects. It was the first time I had used Particle Flow, so I thought I would have major issues getting the particles to simulate correctly.

“After the first hour with Particle Flow I was finished with the first shot, and I was amazed how straightforward the workflow was – I couldn’t believe that I hadn’t used it before. Although it got a bit more complex when we had a flow with 5,000 instanced particles that needed to collide with all the generic pieces and the floor.”
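The particle-versus-floor behaviour described here can be illustrated with a toy integration step in Python – a stand-in for Particle Flow’s collision operator, with made-up constants, not the production setup:

```python
def step_particle(pos, vel, dt=1.0 / 25.0, gravity=-9.8, floor_y=0.0, bounce=0.3):
    """One Euler integration step with a floor collision. Each particle
    falls under gravity; if it dips below the floor it is clamped back
    and its vertical velocity is reflected with damping."""
    vx, vy, vz = vel
    vy += gravity * dt                       # gravity accelerates the fall
    x, y, z = pos
    x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
    if y < floor_y:                          # hit the floor: clamp and bounce
        y = floor_y
        vy = -vy * bounce
    return (x, y, z), (vx, vy, vz)
```

Running this step per frame over thousands of instanced fragments is, in spirit, what the Particle Flow collision setup automates.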

For lighting, the team switched back to their familiar Maya, which meant creating a light rig proved less stressful, adds Johansson. “By taking photos of a diffuse ball on set and matching the light to all the reference photos from the location, we could get a decent light rig going within no time at all.

“We simply created a sphere inside Maya and matched it with the diffuse ball. If we, for instance, wanted to add more specular in a specific shot we simply added a spotlight to emit specular.”
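The diffuse-ball trick rests on a simple piece of geometry: on a matte (Lambertian) sphere, the surface normal at the brightest point aims straight at the key light. A hedged Python sketch, assuming an orthographic view of the ball – illustrative only:

```python
import math

def light_direction_from_ball(u, v, radius):
    """Estimate the key-light direction from a photographed diffuse
    ball. (u, v) is the brightest pixel's offset from the ball's
    centre in the image, in pixels; the returned vector is the
    surface normal there, which points at the light."""
    x, y = u / radius, v / radius
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))  # normal faces the camera
    return (x, y, z)
```

A highlight dead-centre on the ball means the light sits behind the camera; a highlight at the ball’s edge means the light is side-on.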

Other tools included Photoshop and SynthEyes – the latter was deployed for tracking work. “We’ve tried a lot of tracking software, but in my opinion SynthEyes is the tracking software of the moment,” says Niklasson.

“These shots were very easy to track, but on other jobs when the tracking needs some real effort SynthEyes lets you mould the track into shape.”

Despite the huge array of particles, dust, material and objects in play, compositing the project was fairly straightforward. “We decided to composite in After Effects, as we have multiple machines that run the application and the 3D team could also get involved in comping, if they could find the time,” says Walker.

“The advantage of working so closely together was that any errors, issues or suggestions for the 3D passes could often be identified, rectified and background rendered within a couple of hours.

“On average, we used about eight passes to comp the scenes, but with our limited timeframe, some of the 3D passes weren’t always how we expected them,” he continues.
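At its simplest, multi-pass compositing of this kind is additive recombination with a per-pass gain – which is also why rendering the inner surface white with 80 per cent reflectivity gave finer control in comp. A simplified single-pixel sketch; the pass names are illustrative:

```python
def comp_pixel(passes, gains=None):
    """Additive recombination of render passes for a single pixel:
    beauty = sum(gain * pass). A deliberately simplified model of
    the After Effects comp, not Mainframe's node tree."""
    gains = gains or {}
    return sum(value * gains.get(name, 1.0) for name, value in passes.items())
```

Dialling a gain per pass in comp is what lets an artist rebalance, say, the reflection pass in minutes instead of re-rendering.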

The restricted time for testing meant that some of the specular passes contained very little information. In these cases, rather than taking time away from the mountains of animation that still needed to be created, the Mainframe team invested in Taronites’ ZBornToy plug-in.

They could then use the Z-depth pass to add directional highlights to many of the wide shots, Walker explains. With the project wrapped, Niklasson says his favourites are the end scenes, where the viewer is treated to the final pieces being assembled. Great backplate shots mean that the subjects’ eyes and emotions surface easily – which is crucial to the promos’ impact.
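The depth-based highlight trick can be approximated like this: treat the depth pass as a heightfield, derive a slope from neighbouring samples, and brighten slopes that face the light. A toy Python version – not the plug-in’s actual algorithm, and all names are illustrative:

```python
def depth_highlight(depth, x, y, light=(0.7, 0.7), strength=1.0):
    """Fake a directional highlight from a Z-depth pass. `depth` is a
    2D list of depth samples; central differences give the local
    slope, and slopes tilted towards `light` pick up the highlight."""
    dzdx = depth[y][x + 1] - depth[y][x - 1]
    dzdy = depth[y + 1][x] - depth[y - 1][x]
    lit = -(dzdx * light[0] + dzdy * light[1]) * strength
    return max(0.0, lit)                      # clamp faces turned away
```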

“If we had more time, we’d have loved to refine all the elements of the sequences such as more realistic and more irregular pieces, better dust, more time for comping/finishing and more time for lighting,” he says.

“As it was, the timeline required us to finish two scenes per day – from start to finish – which meant we had to be innovative with how we approached the project and how we created the animations.”

The promo starts with porcelain fragments lying on the floor, which then move and reassemble themselves into the shape of a person.

Things fall apart

The show follows a selection of celebrity offspring, former X Factor competitors, glamour models and former pop stars as they attend in-depth rehab sessions at a Malibu clinic to grapple with alcoholism, body issues, and drug addictions.

Tricky breakups

Achieving the look of shattered porcelain demanded a high degree of research, with the team kicking off the project examining high-speed shots of solid objects shattering to get a feel for the effect.

“We realized that there was a lot of fine debris flying about looking like dust, which we ended up replicating by having millions of particles simulating,” says Johansson. “It’s hard to see, but it actually adds quite a bit to the final composite.”

The models were created in a similar way to Russian dolls, as three different layers of materials that nest within each other, reflecting the different surfaces of the ‘porcelain’ models.

Since all the explosions needed to appear to come from the same source, it was crucial to map the velocity and reflections that would occur.
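A minimal sketch of that velocity mapping, assuming a simple radial model with distance falloff – the constants and falloff curve are illustrative, not the production values:

```python
import math

def fragment_velocity(point, origin=(0.0, 0.0, 0.0), energy=12.0):
    """Initial velocity for a fragment driven by a shared explosion
    source: direction is radially away from the origin, and speed
    falls off with distance, so every shot's blast stays consistent."""
    d = [p - o for p, o in zip(point, origin)]
    dist = math.sqrt(sum(c * c for c in d)) or 1e-9   # avoid divide-by-zero
    speed = energy / (1.0 + dist)                      # nearer pieces fly faster
    return tuple(c / dist * speed for c in d)
```

Because every fragment's velocity is derived from the same origin and energy, any model dropped into the rig inherits a blast that matches the other shots.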


Project: Promo spots for Rehab
Client: Virgin Media Television
Studio: Mainframe
Software: Adobe After Effects, Adobe Photoshop, Andersson Technologies SynthEyes, Autodesk 3DS Max, Autodesk Maya, RayFire, Taronites ZBornToy, V-Ray
On the CD: You can view the spot on this month’s cover disc.