Mistabishi's 'Printer Jam' music video creation detailed

DA: How did you turn this into the final piece?

KF: "I started by taking the track and cutting it up into individual numbered sections, then wrote down what I saw happening in each section and which objects are introduced. I then made a very rough animatic using sketches, rough renders and camcorder footage of one of my old printers. Once I had an idea of the flow of the piece, I researched the things I wanted to make, modelled the room, printer, creature and objects, and then started animating to each of the individual sections of the track I’d cut up.

"I’d make sure I had stuff to render overnight, and once these sections were rendered I placed them into the animatic, so I always had fresh material to show Hospital. When all the renders were finished I put them together, added effects, TV noise and the ‘printer garbage’ to gel everything together."

DA: What was the biggest challenge you faced, and how did you overcome this?

KF: "My biggest challenge was always going to be render times. I only had five weeks to produce the video and don’t have access to a render farm; outsourcing the rendering would have cost ten times the budget, so I had to find a way to keep render times short without sacrificing the look of the piece. I wanted that soft, semi-realistic painted look, but that means using depth of field and soft shadows, which can take an age to render. To get past this I relied on careful planning and a lot of post-render tricks to achieve the effects I wanted.

"I started by modelling, texturing and lighting the complete room, then baked all the textures so the soft shadows became part of the original texture. This meant I could then render the room as a background plate without using any lights. It took a while to set up, but it reduced my render times from four minutes a frame down to one second.
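Texture baking amounts to evaluating the expensive lighting once, storing the result in the texture, and thereafter sampling the texture instead of re-lighting every frame. A minimal NumPy sketch of the idea follows; the function names and the made-up lighting map are invented for illustration and have nothing to do with 3ds Max's actual Render To Texture baker.

```python
import numpy as np

def bake_lighting(albedo, lightmap):
    """Fold a precomputed lighting/soft-shadow map into the base texture.

    albedo   : (H, W, 3) base colour texture, values in [0, 1]
    lightmap : (H, W) scalar lighting term (1 = fully lit, 0 = in shadow)
    """
    return albedo * lightmap[..., None]

# Expensive step, done once: evaluate soft shadows into a light map
# (here just a made-up vertical falloff standing in for a real renderer).
H, W = 4, 4
albedo = np.full((H, W, 3), 0.8)
lightmap = np.linspace(1.0, 0.25, H)[:, None].repeat(W, axis=1)

baked = bake_lighting(albedo, lightmap)

# Cheap step, done every frame: the "render" is now a plain texture
# lookup, with no lights in the scene at all.
```

Because the shadows live in the baked texture, the per-frame cost drops to a lookup, which is the same trade the quote describes: four minutes a frame down to one second.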

"I then rendered the printer, creature and objects on a shadow/matte material to capture their shadows, then comped the elements together. The depth-of-field look was achieved by rendering a Z-depth pass of the scene, which was then used to calculate the distance for post blurring. Using these techniques slashed my render times and gave me more control over each element."
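The Z-depth trick can be sketched in a few lines. The following is a minimal NumPy illustration of the principle, not After Effects' actual lens-blur algorithm: the function names, the box blur and the linear mapping from depth to blur radius are all assumptions made for the example.

```python
import numpy as np

def box_blur(img, radius):
    """Average over a (2r+1)^2 window; edges wrap, which is fine for a sketch."""
    if radius == 0:
        return img.astype(float)
    acc = np.zeros(img.shape, dtype=float)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

def depth_blur(img, depth, focus, max_radius=3):
    """Fake depth of field for a greyscale frame using a Z-depth pass.

    depth is normalised to [0, 1]; pixels at the focal distance stay
    sharp, and the blur radius grows linearly with |depth - focus|.
    """
    blurs = np.stack([box_blur(img, r) for r in range(max_radius + 1)])
    coc = np.clip(np.abs(depth - focus) * max_radius, 0, max_radius)
    lo = np.floor(coc).astype(int)   # nearest precomputed radii
    hi = np.minimum(lo + 1, max_radius)
    t = coc - lo                     # blend factor between the two
    iy, ix = np.indices(img.shape)
    return (1 - t) * blurs[lo, iy, ix] + t * blurs[hi, iy, ix]
```

The key property is the one the quote relies on: the blur happens entirely in post, per pixel, so the 3D render itself never pays for depth of field.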

DA: What software did you use to create the piece?

KF: "The majority of the work was produced in 3ds Max. I created and animated all the models there, and used Photoshop to create the textures and ‘printer garbage’. After rendering I was left with up to five different elements per shot to comp together: foreground objects, shadows, the background plate, effects and the Z-depth pass. I put these together in After Effects, did some colour correction, then exported the finished section, which was taken into Premiere Pro for editing and exporting."

