Remember those cool but ultimately flawed Lytro cameras that used light field technology to create refocusable, variable-focus images? It turns out the first-generation, kaleidoscope-shaped Lytro shooter and the more traditional-looking Illum were just the beginning.
Lytro announced it has jumped into the video arena with Immerge, a video camera that uses light field tech to capture video with depth information for use in The Foundry's Nuke visual effects application. The information can be used to blend CG elements with footage more easily and realistically.
“Immerge builds on the technical achievements that were unlocked by Illum and moves the technology further into live action video and a 360 virtual space,” Ariel Braunstein, chief product officer at Lytro, told us.
This is a natural evolution for Lytro. Last summer, the company released a massive update to Illum – a combined firmware and desktop upgrade that positioned the camera as a content-creation tool for virtual reality headsets like the Oculus Rift, Sony PlayStation VR, Microsoft HoloLens, and HTC Vive.
Still a prototype
Immerge, still in the prototype stage, will be released to creative storytellers in the first quarter of 2016, the company said.
“We chose not to reinvent the editing environment,” Braunstein said. “Nuke is a post-production visual effects standard. Editors can edit every pixel, mix it with computer graphics, and apply visual effects.”
While AR and VR have typically been used to create 3D gaming environments, one breakthrough for Immerge is its capability for realistic and immersive output for dramas and documentaries. This would significantly broaden the target viewing audience as well as creative possibilities for moviemakers.
Immerge is designed to capture once and play back on any device. While it’s optimized for high-end headsets, it also works with mobile phones as the screen for a VR device, such as Google’s Cardboard or other simple, low-cost headsets that include only optics.
Rectifying 2D issues
Immerge was created to combine live action content with computer graphics, but first it had to solve the 2D problem. “A lot of the technologies available today are cobbled together from 2D and don’t work well in a 3D environment,” Braunstein said. “2D tech provides a compromised sense of presence – meaning you know where you are.”
For example, 2D images in VR must be stitched together, and the resulting seams and lines create a distracting, discontinuous space. A talented video editor can hide those seams, but not eliminate them.
Within a light field volume, Immerge captures data allowing for virtual views to be generated from any point, facing any direction, and with any field of view. This creates a much more realistic sense of presence, previously experienced only with computer graphics.
In addition, Immerge places viewers in the action by replicating natural light flow, and it corrects stereoscopic alignment to keep scenes consistent as users move their heads from side to side. The camera corrects flawed visual parallax, the phenomenon in which objects at different distances appear to shift at different rates as the viewer moves.
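To see why parallax depends on distance, a simple pinhole-camera model is enough. This is an illustrative sketch, not Lytro's actual correction pipeline, and the focal length and baseline values are made up for the example:

```python
# Illustrative pinhole-camera parallax sketch (not Lytro's algorithm).
# When the viewpoint translates sideways by `baseline_m` meters, a point
# at `depth_m` meters shifts on the image plane by roughly
# baseline * focal / depth, so near objects shift more than far ones.

def parallax_shift_px(baseline_m: float, focal_px: float, depth_m: float) -> float:
    """Approximate horizontal image shift (pixels) of a point at depth_m
    when the camera moves sideways by baseline_m."""
    return baseline_m * focal_px / depth_m

# Example: a 6 cm head shift, assumed 1000 px focal length.
near = parallax_shift_px(0.06, 1000.0, 1.0)   # object 1 m away
far = parallax_shift_px(0.06, 1000.0, 10.0)   # object 10 m away
print(near, far)  # the near object shifts 10x as far as the distant one
```

If a stitched 2D panorama ignores this depth-dependent shift, nearby objects appear "glued" to the background when the viewer's head moves, which is exactly the inconsistency light field capture avoids.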
It allows for natural (“six degrees of freedom”) head movement in any direction: translation along three axes (side to side, up and down, and back and forth) plus rotation around three more (yaw, pitch, and roll).
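A six-degrees-of-freedom head pose is conventionally represented as three translation components plus three rotation components. A minimal sketch of that representation (the field names are illustrative, not from any Lytro SDK):

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """A 6DoF pose: three translational + three rotational degrees of freedom."""
    # Translation (meters)
    x: float      # side to side
    y: float      # up and down
    z: float      # back and forth
    # Rotation (radians)
    yaw: float    # turning left/right
    pitch: float  # looking up/down
    roll: float   # tilting the head

pose = HeadPose(x=0.0, y=0.0, z=0.0, yaw=0.0, pitch=0.0, roll=0.0)
print(len(HeadPose.__dataclass_fields__))  # 6 degrees of freedom
```

Stereoscopic 360 video typically honors only the three rotational components; capturing a light field volume is what lets playback respond to the three translational ones as well.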