A new ten-minute film produced for the UK Film Council’s DV Shorts scheme sees Hot Knife Digital Media blend CG characters with live-action video in a tale of a mad scientist who invents a strange camera, and his robotic helper.

Combining live action with CG, Hot Knife Digital Media’s short film F.stop is a revenge story about two characters – a wayward scientist who has invented a camera that steals the views it photographs, and his helper, a robot, who is bullied into developing the photographs back in a cluttered laboratory.

Produced as part of the Film Council’s DV Shorts scheme in conjunction with EMMedia – a scheme that aims to encourage filmmakers to produce innovative, contemporary short films reflecting the independent spirit of digital film – F.stop was conceived, written and directed by Hot Knife’s Andrew Whitney and Simon Wallett.

“We had recently completed a CG short based on Boom Bip’s Mannequin Hand and wanted to experiment with a story that combined both video and CG,” says Whitney.

There were many technical obstacles facing the pair in the making of the film. These ranged from working with DV-quality video and integrating real actors with CG characters to visualizing environments that had to disappear while an actor walked through them, and producing 3D holograms of the photographs that the scientist had taken.

Having finalized the script for the film, the duo worked on storyboarding the scenes, then scanned the boards into the studio’s editing software to appear on a timeline. This allowed them to change, add and delete shots to get a rough idea of the pace, timing and effects that the production would require.

“We also produced the soundtrack, which was as integral to the themes and the story as the visuals, and early music scores were transferred to the edit,” says Wallett. “We’ve worked on many sequences integrating CG and video before, but none that required the number and variety of shots demanded by a full ten-minute film. Planning each shot in detail helped us both when filming with the actor and crew, and later during the animation and compositing stages of production.”

The film’s main CG character – the robot – was designed and built in the early stages of production. The model was built in Discreet 3DS Max using low-polygon modelling and subdivision, with secondary details, such as the leather that joined the body plates together, controlled with a skin-&-flex method. It was animated using Character Studio.

References for the robot were placed on the film set, which helped the actor playing the scientist to work to a position or an eye line. Hot Knife employed a separate voiceover actor to play the robot so that the two actors could bounce dialogue off each other during key scenes that required arguments.

“We showed both actors visuals of the robot in order to help them visualize what we would later composite into the shots. For the laboratory scenes, we sat the robot on a chessboard, which was easy to track and get a solid result with, and this worked with the story as the robot was the more cunning and intelligent of the two characters,” says Whitney.
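The chessboard made tracking easier because its corners are simple for software to find and fit a camera pose to. The article doesn’t say which tracking tool Hot Knife used, but the sketch below – using OpenCV’s standard chessboard routines, with a guessed board size and camera matrix – illustrates the general idea of recovering a camera pose from one frame of such a plate.

```python
# Minimal sketch: recovering a camera pose from a chessboard in one frame.
# Illustrative only - board size, camera matrix and file name are assumptions.
import cv2
import numpy as np

PATTERN = (9, 6)          # inner corners across/down (hypothetical board size)
SQUARE = 0.03             # square size in metres (assumed)

# 3D positions of the corners on the board plane (z = 0)
object_points = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
object_points[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

# A guessed intrinsic matrix for a 720x576 DV frame; a real pipeline would calibrate first.
K = np.array([[800.0, 0, 360], [0, 800.0, 288], [0, 0, 1]])
dist = np.zeros(5)

frame = cv2.imread("lab_frame_0001.png")          # hypothetical frame from the lab plate
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

found, corners = cv2.findChessboardCorners(gray, PATTERN)
if found:
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1),
                               (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    # solvePnP gives the camera rotation/translation relative to the board,
    # which is what a matchmove needs to place a CG character on it.
    ok, rvec, tvec = cv2.solvePnP(object_points, corners, K, dist)
    print("camera pose:", rvec.ravel(), tvec.ravel())
```

Run per frame, a solve like this yields the moving camera path that the CG robot is later rendered through.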

Hot Knife spent three days filming exterior shots, and two days on the interior, ‘laboratory’ scenes. “We’d spent days scouting outdoor locations which would also be striking enough to recreate as 3D holograms for scenes back at the lab, and even longer finding a laboratory that could be used for the film,” recalls Wallett.

The storyline called for part of the landscape in many of the exterior scenes to disappear when a photograph was taken. Various takes of each shot were filmed, with and without the actor, providing a clean plate from which the landscape could be removed to white, with the actor then composited back over.
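As a rough illustration of that clean-plate approach, the sketch below – a minimal NumPy example with made-up file names and threshold, not Hot Knife’s actual pipeline – pulls a difference matte between the takes with and without the actor, turns everything that isn’t the actor white, and lays the actor back over the result.

```python
# Minimal sketch of the clean-plate idea, using NumPy.
# Assumes the two takes are aligned; real shots would need stabilising,
# a soft matte and manual clean-up rather than a hard threshold.
import numpy as np
import imageio.v3 as iio   # any image I/O library would do

with_actor = iio.imread("take_with_actor.png").astype(np.float32) / 255.0
clean_plate = iio.imread("take_clean_plate.png").astype(np.float32) / 255.0

# Difference matte: where the two takes disagree, the actor is present.
diff = np.abs(with_actor - clean_plate).max(axis=-1)
actor_matte = (diff > 0.12).astype(np.float32)[..., None]   # threshold is a guess

# "Remove the landscape to white": everything that isn't the actor becomes white,
# then the actor from the original take is composited back over.
white = np.ones_like(with_actor)
result = actor_matte * with_actor + (1.0 - actor_matte) * white

iio.imwrite("landscape_to_white.png", (result * 255).astype(np.uint8))
```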

“We had a couple of methods for removing the landscape to white,” says Whitney. “Another way was to build an approximation of the object or scene and camera-match the shot in 3DS Max. We used this method for the first outdoor scene, set in a grotty back alley. We’d taken hi-res stills of the scene and mapped these onto the Max model. It was then relatively simple to map white rectangles appearing on the model, timed to the shutter’s sound, and it allowed us to add a bit of camera movement. When finally graded, it was impossible to tell it from the DV footage.”

Another technical challenge was creating shots of the actor walking through a white outdoor environment. Rather than film the actor against a blue/green screen, Whitney and Wallett simply filmed a second pass of him outdoors on location against a white projector screen. “Due to the bright sunlight and the quirks of DV compression, it was easy to overexpose to pure white and achieve the desired effect,” says Whitney. “We then added a rectangle here or there of sky or tarmac path from a clear shot of the scene without the projector screen or actor. If we needed the actor to pass in front of anything, we added a simple mask.”
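The logic of that trick can be sketched in a few lines, assuming plain RGB frames and with file names, the luminance threshold and rectangle coordinates all made up for illustration: the overexposed screen keys to pure white, a rectangle of the real scene is patched back in from a clean shot, and a hand-drawn mask keeps the actor in front where needed. The real composite was done in the studio’s own tools; this only shows the idea.

```python
# Minimal sketch of the projector-screen trick described above.
import numpy as np
import imageio.v3 as iio

shot = iio.imread("actor_vs_screen.png").astype(np.float32) / 255.0    # actor in front of white screen
clean = iio.imread("scene_no_screen.png").astype(np.float32) / 255.0   # same framing, no screen or actor

# The overexposed screen keys easily: anything bright enough becomes pure white.
luma = shot @ np.array([0.299, 0.587, 0.114])
keyed = np.where((luma > 0.92)[..., None], 1.0, shot)

# "A rectangle here or there" of the real scene, copied back from the clean shot.
y0, y1, x0, x1 = 400, 576, 0, 720          # e.g. a strip of tarmac path (guessed coordinates)
keyed[y0:y1, x0:x1] = clean[y0:y1, x0:x1]

# Where the actor must pass in front of the patched area, a hand-drawn mask keeps him on top.
mask = iio.imread("actor_mask.png").astype(np.float32) / 255.0
if mask.ndim == 2:
    mask = mask[..., None]
keyed = mask * shot + (1.0 - mask) * keyed

iio.imwrite("white_world.png", (np.clip(keyed, 0, 1) * 255).astype(np.uint8))
```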

Animation

The third stage of the film’s production involved editing, animating and compositing. “When producing the film it was essential to have a good workflow that allowed for quick adaptation, especially when a scene’s edit may change in length or mood depending on which angle or take worked best,” comments Wallett. “We aimed to produce as final an edit as quickly as possible so we could begin listing the shots that needed tracking, animating, compositing or colouring. The list quickly became very long and we had to prioritize the shots that were essential to telling the story. We had the storyboard as a blueprint but sometimes we discovered unexpected gems during filming and so adapted the timeline accordingly.”

For the laboratory scenes with the robot, and subsequent sequences that contained his pet bee, Hot Knife tracked the handheld shots and imported them into 3DS Max, where a virtual camera path was created to match the virtual set that had been built from digital stills. A matte-&-shadow material was applied to the set to ensure that any shadows cast by the robot would be saved in the 32-bit alpha and affect the video shot when composited, which helped both elements to gel, explains Wallett.
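The payoff of that matte-&-shadow setup is that a single RGBA render carries both the robot and the shadow it casts, so a standard premultiplied ‘over’ composite darkens the plate in the right places. The sketch below illustrates that operation with NumPy and placeholder file names; the actual compositing was done in the studio’s own software, so treat this purely as an illustration of the principle.

```python
# Minimal sketch: compositing a premultiplied RGBA render of the robot
# (whose alpha also carries the shadow it casts) over the video plate.
import numpy as np
import imageio.v3 as iio

plate = iio.imread("lab_plate.png").astype(np.float32) / 255.0       # video frame (RGB)
render = iio.imread("robot_rgba.png").astype(np.float32) / 255.0     # 32-bit RGBA CG render

fg = render[..., :3]          # premultiplied colour from the renderer
alpha = render[..., 3:4]      # coverage *and* shadow density from the matte-&-shadow set

# Over composite: where alpha is high the robot replaces the plate,
# and in shadow regions (dark fg, partial alpha) the plate is darkened.
comp = fg + (1.0 - alpha) * plate

iio.imwrite("robot_over_plate.png", (np.clip(comp, 0, 1) * 255).astype(np.uint8))
```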

The 3D robot was then animated: “You have to get into the mind of the character, how it feels, what it’s thinking, how to react to the actor. It feels odd sitting there gesticulating to yourself but you soon get into the flow,” says Wallett. “We imported the sequences into the viewport so that we could animate the robot reacting dynamically to the actor.”

One scene, towards the end of the film, proved particularly challenging for the duo. It involved several elements – the robot’s pet bee, the actor on a path surrounded by white, and a door. Whitney and Wallett built the entire scene in 3DS Max, creating a virtual path and doorway based on hi-res stills they had taken while filming. The actor was created as a low-polygon figure onto which the pair mapped the actor’s clothes, hair, face and even shoes. The camera and tripod were also recreated in 3DS Max. “We then animated a simple walk along the path in front of the door, while the bee flew in close to the camera. The end result was simple but effective,” explains Whitney.

With the edit nearing completion, the CG shots were rendered using ART’s Pure card and added to the edit. A rough cut of the film was sent to the scheme’s producers and, after feedback, a finalized edit was agreed with a few additions. Finally the film was composited and graded.

“We had just upgraded to the Leitch/DPS Velocity Quattrus, which afforded us several bonuses, including real-time compositing of multiple layers of video and animation, solid audio tools necessary for online editing, and integration with Digital Fusion, which provided us with a seamless tool for compositing and grading. These were all extremely necessary to complete the film within the tight timescale,” explains Wallett.

“We normally use Combustion for post-production because of its powerful colour correction and image manipulation; however, this slows the production process down on something like a ten-minute short, as edited sequences would have had to be exported, worked upon and re-imported back into the edit across the network. Digital Fusion enabled us to work directly from the NLE timeline, effecting the full multilayered composite and automatically bringing the result back onto the timeline,” adds Whitney.

Learning experience

“The project enabled us to learn a great deal about designing shots, whether for special effects and post-production or simply for directing actors and crew. It also taught us more about combining video, CG animation and storytelling,” says Whitney.

“You can sometimes get carried away with new technology and software and the endless possibilities it can open up, but you always have to remember that you’re telling a story. Advances in animation and compositing software helped us create shots that would have been too expensive or impossible to film, but we tried to make sure the shots were integral to the story and not just effects for effects’ sake. The processes we used were exactly the same as those used in feature films, just on a different scale – understanding them is the important part.”