How to capture, edit and composite immersive video for VR, YouTube, Facebook and more

You can pan around this 360-degree music video for Avicii's Waiting for Love if you're viewing it in Google's Chrome browser

We’ve been getting inside a lot of immersive video this year. Viewers can pan around a full 360 degrees by moving their heads inside a VR headset like the Oculus Rift, Samsung Gear VR or Google Cardboard – or by using overlaid controls such as those offered on Facebook and YouTube's immersive videos (try it in the video above).

Back in June we received some Californian sunshine from Linkin Park’s Mike Shinoda who put together a VR promo with collaborators at The Uprising Creative in LA and Venice Beach-based WEVR. We got a more autumnal feel in September, as immersive experience studio Marshmallow Laser Feast (MLF) took us on a virtual reality walk through Grizedale Forest, the Lake District wood famous for its sculptures.

More recently, the New York Times has created a wealth of VR content, including this film which sees actors including Melissa McCarthy, Michael Fassbender, Kristen Wiig, Benicio del Toro and Charlize Theron (below) taking flight.

Last month we also explored Puzzle's first single, Godlike (below), as Territory Studio showed us how a mix of CG and Kinect-captured movement could create an immersive experience. 

Creating these videos is fundamentally different from traditional video-based storytelling, say those with a lot of experience producing immersive content. 

“360 content and VR feel more like working with theatre than films,” says Territory creative director David Sheldon-Hicks. “With films we can edit between shots and light, focus and compose for a single beautiful shot. However, with 360 and VR it becomes tricky to create edits as the experience is jarring. We also have to consider that the viewer doesn’t have to look where you, the creator, might like them to. Diverting the viewer’s attention to the right action at the right time becomes a new challenge and asks us to rethink narrative storytelling.”

Despite the challenges, there are good reasons to get into creating immersive videos, as the audience for such content is exploding. Millions of people can watch 360-degree videos through social networks, such as Facebook's dedicated channel and YouTube's support for playback of 360-degree spherical videos in Chrome, Firefox, Internet Explorer, and Opera browsers (though you’ll need to add custom metadata before uploading your own).
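That metadata requirement is worth a closer look: YouTube only treats an upload as spherical if the file carries spatial metadata, which Google's open-source spatial-media injector tool normally adds for you. As an illustrative sketch (the RDF boilerplate below follows Google's V1 spherical video spec; for real uploads, prefer the official injector rather than hand-rolling it), the XML block it embeds looks roughly like this:

```python
# Sketch of the spherical-metadata XML that tools such as Google's
# spatial-media injector embed in an MP4 before a 360 upload.
# Tag names follow the V1 Spherical Video RDF spec; treat the exact
# boilerplate as an assumption and use the official tool in practice.

def spherical_metadata_xml(stitching_software="Unknown"):
    return (
        '<rdf:SphericalVideo '
        'xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" '
        'xmlns:GSpherical="http://ns.google.com/videos/1.0/spherical/">'
        "<GSpherical:Spherical>true</GSpherical:Spherical>"
        "<GSpherical:Stitched>true</GSpherical:Stitched>"
        f"<GSpherical:StitchingSoftware>{stitching_software}</GSpherical:StitchingSoftware>"
        "<GSpherical:ProjectionType>equirectangular</GSpherical:ProjectionType>"
        "</rdf:SphericalVideo>"
    )

print(spherical_metadata_xml("MyStitcher"))
```

The metadata simply declares that the flat, equirectangular frames in the file should be wrapped around a sphere at playback time.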

For a more immersive experience, people can watch 360-degree videos with the low-cost Google Cardboard viewer and the YouTube app. Facebook-owned Oculus VR has teamed up with Samsung to offer Gear VR, which mounts a Galaxy smartphone in a more sophisticated headset to offer immersive content.

“The best way to watch VR at the moment is with a head-mounted display,” says Anthony Karydis, CEO of immersive video producer Mativision, which produced a 360-degree concert by Run DMC (above) for Vodafone. “And the easiest way to implement this is with your smartphone and a cardboard viewer. For the viewer to be completely engaged, the setup should have no cables or buttons – there should be nothing physical that you need to use outside the experience itself. The viewer should be totally immersed in their view. If you have to make choices, then it has to be about where to look and focus.”

There are also a number of other viewing devices becoming available. Oculus Rift, the VR headset developed by Oculus VR, has long been available as a development kit. The consumer version of the Rift, shipping in the new year, requires an attached PC running Windows and can be used with gamepad controllers. There’s also the PC-gaming VR system from HTC and Steam, the HTC Vive, as well as Sony’s PlayStation VR, shipping next year as a visor attachment for the PlayStation 4.

Oculus Rift has long been available as a development kit

What you need to shoot 360-degree video

Unlike our recent look at HDR, where the kit required to shoot content starts at around £2K, there’s a wide choice of equipment to create immersive video, with prices ranging from pocket money to megabucks – so your initial experiments with the format won't break the bank.

Kodak’s SP360 Action Camera is designed to capture 360-degree HD videos and immersive images in one shot. The small handheld device sports a fixed dome-shaped lens that captures 360-degree HD video at 30fps, which can then be viewed from multiple vantage points, including a 360-degree Panorama mode.

The Ricoh Theta S records spherical video in full HD at 30fps, with a maximum recording time of 25 minutes. An enlarged image sensor and fast f/2.0 lenses allow more light through Ricoh's proprietary ultra-compact twin-lens folded optical system.

More 360-degree video capture devices are available from surveillance firm IC Real Tech, which offers the HD-capable ic720 and the stationary, Wi-Fi-enabled ALLie.

Many custom rigs have been built for VR using RED cameras and the hyper-portable GoPro, sometimes specially 3D-printed for the purpose. The nascent VR video community got really excited when Google gave a glimpse of a 360-degree rig holding 16 GoPro cameras, called Google Jump, earlier this year.

This concept has now become a product in a rig called Odyssey (above), which holds 16 GoPro HERO4 Black cameras that each record 2.7K video at 30/25fps. Currently available only to ‘select industry professionals’ at $15k, Odyssey captures 4:3 aspect-ratio video with all the cameras synced as one, then uses the Jump Assembler engine to output H.264 MP4 files as panoramic 2K and 8K video.

GoPro has also linked up with Kolor to offer spherical camera rigs holding up to six GoPros, along with Kolor's Autopano synchronising and stitching software to accompany them. There’s even a submarine version (able to record down to 130m underwater) for fans of Steve Zissou.

The Nokia OZO (above) is another model to explore, if you have a spare £40k. OZO captures stereoscopic 3D video through eight synchronised global-shutter 2K sensors, and spatial audio through eight integrated microphones. Software built for OZO enables real-time 3D viewing, with a playback solution that removes the need to pre-assemble a panoramic image.

VR content company Jaunt Studios has also recently announced the Neo, a cinematic VR camera that simultaneously records 3D stereoscopic video in all directions. It also has custom optics specifically designed for 3D light-field capture, a very interesting direction of travel it shares with the Lytro Immerge (see more on Digital Arts here).

How to shoot 360-degree video

As for actually shooting the content, Anthony Karydis says immersive video offers a totally different set of challenges to those you’ll find on an everyday film set. 

“Always keep in mind the placement of cameras, as we are talking about totally different types of cameras when filming 360,” he advises. “The key consideration is that they need to be in close proximity to the action, because the cameras normally used for 360-degree video content don’t allow for zooming in and out – so once they are positioned and filming starts, they cannot be moved.”

“Height is a key consideration too,” adds Anthony, whose production company produces 360 content including concert videos for the likes of Muse (which can be watched here). “With 360-degree video, we record a full sphere, so the camera does not need to move. This means that if it is placed at the wrong height it may make people look taller or shorter than they are. This can give an unnatural perspective and will therefore not be as engaging for the viewer.” 

Lighting is a major consideration on set. “When you’re filming 360, lights are going to have to be part of what you’re filming. This means you need to integrate them into the scene so that they don’t look out of place,” says Anthony. “So either you need to use lights that work as part of the scene, or try a lighting setup where they can be hidden within the seams of the separate lenses. The same issue occurs when it comes to placing mics.”

Francisco Lima, VFX technology supervisor at Gramercy Park Studios, has a lot of experience of building VR rigs and shooting for immersive content. He advises a previz/postviz approach of going out and shooting rough 360 video of your location, then building the world of your VR environment before the main shoot. 

“This allows you to plan for lighting and camera positions, or for hiding crew,” he says. “You can also remove some of that in post. But seeing it beforehand and having a vision really helps. For the DP, the production designer, the director and the VFX supervisor, being able to visualise the world together before going on set gives so much more value.

"We see it as a tool for content creation, for conception, for building the design spaces where we can prototype, and one that allows us to focus more on the creative aspects in post.”

MPC has been at the forefront of a lot of cutting-edge imaging, so it’s not surprising the studio has been getting into VR too. Dan Philips, MPC’s head of digital and interactive, says there are issues in current rigs with the scarcity of lens and rig data, timecodes for editing, and sporadic changes in frame rate, not to mention clean-up in every feasible direction.

“Then there are the problems that directors and DPs will face with the lack of on-set playback or review, depth and angling, lighting, movement of characters.

“From a production perspective the main challenge is probably moving the camera, whether it be mounted on a head rig, or a dolly. A moving camera quickly induces nausea which makes the whole experience beyond unpleasant. And the effect seems to be different for each person.

"The emergent cardinal rule of VR development seems to be, under no circumstances take the control of the camera away from the viewer – an alteration of viewpoint for someone whose head remains static is an instant one-way ticket to nausea.” 

How to edit 360 video 

According to MPC’s Dan, there are three common methods for editing 360-degree video. 

“There’s editing from the hero camera and then stitching the footage, which can introduce issues with footage in other angles that weren’t viewed at the point of editing,” he says. “Or multi-camera editing in the chosen editing software, and then stitching, which can make it hard for the editor to keep the overview on all of their footage.  

"There’s also pre-stitching all footage or selects from the shoot and editing straight from the latlong data [a flat projection of the spherical image], which has the downside of a lot of data transfer and rendering, but ensures that all the footage can be seen at the point of editing and is aligned 100 percent correctly. We would tend to opt for the latter approach.”
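The latlong projection mentioned above has a simple pixel mapping: longitude (yaw) runs across the image's width and latitude (pitch) down its height, so every view direction lands at a predictable pixel. A minimal sketch, assuming yaw/pitch in radians and a hypothetical 4096 x 2048 frame:

```python
import math

# Map a view direction to pixel coordinates in a latlong (equirectangular)
# frame: yaw spans the full width, pitch spans the full height.
# Frame size is a hypothetical example, not tied to any camera above.

def latlong_pixel(yaw, pitch, width=4096, height=2048):
    """Return (x, y) for a view direction.
    yaw: -pi..pi (0 = straight ahead), pitch: -pi/2..pi/2 (positive = up)."""
    x = (yaw / (2 * math.pi) + 0.5) * width
    y = (0.5 - pitch / math.pi) * height
    return x, y

print(latlong_pixel(0.0, 0.0))           # straight ahead -> centre of frame
print(latlong_pixel(0.0, math.pi / 2))   # straight up -> top row
```

Editing on this flat projection means the whole sphere is visible at once, which is exactly why it preserves the overview Dan describes.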

There are a number of tools available for VR post-production. Skybox Studio from Mettle is a four-part plug-in for After Effects that generates 360-degree content from AE comps and outputs to YouTube 360 video. The Trapcode Suite, available from Red Giant Software, also features an immersive 360-degree plug-in for After Effects.

Dashwood Cinema Solutions showed off its new 360VR Toolbox (above) during the IBC video-industry trade show in Amsterdam in September. This full VR toolset offers the ability to preview 360° spherical footage in the Oculus Rift while playing back an edited sequence in Final Cut Pro, Premiere Pro or After Effects. The downloadable public beta is now available on the FxFactory plugin store.

The Foundry’s Nuke quickly became the go-to compositor for stereo 3D video, and the company is likely to make a similar impact in immersive content when it releases a 360-degree compositing toolset next year.

“When stereo came along we did a lot of engineering at the time to make sure that our systems such as Nuke were capable of dealing with multiple image streams at once,” says The Foundry’s chief scientist Simon Robinson. “So the workflow to deal with that was a good stepping stone to this. A large part of the work with 360-degree video is about looking at the correlations between adjacent images, which is something we did a lot of in stereo 3D.

"We solved various aspects of the multi-camera alignment, stitching and colour-correction problems in various ways historically. A lot of the new things are to do with moving the workflow away from the idea that the artist is looking at a flat rectilinear image, and trying to make sense of working in the spherical world in image systems which are fundamentally flat. I think that’s the real challenge.”
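One concrete example of the flat-versus-spherical tension Robinson describes: in a latlong frame, a pure yaw (horizontal) rotation of the sphere is nothing more than a circular shift of the image's columns, while pitch or roll rotations warp every pixel. A toy sketch on a tiny stand-in "image" (illustrative, not any particular tool's API):

```python
# Toy illustration: rotating an equirectangular image about the vertical
# axis is just a circular shift of each row's pixels. The "image" here
# is a tiny nested list standing in for real pixel data.

def yaw_rotate(rows, shift):
    """Rotate a latlong image by `shift` columns (a pure yaw rotation)."""
    return [row[-shift:] + row[:-shift] for row in rows]

frame = [[0, 1, 2, 3],
         [4, 5, 6, 7]]
print(yaw_rotate(frame, 1))   # each row shifted one column to the right
```

Pitch and roll don't reduce to simple shifts like this, which is part of why compositing systems need to become genuinely spherical-aware rather than treating the frame as an ordinary flat image.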

