Artificial limbs, touching holograms in mid-air and Star Wars Battlefront in virtual reality (VR) – is this the future of VR and augmented reality (AR)?
These innovations and plenty more will be showcased at this year's Siggraph conference in Los Angeles at the end of this month, as part of the conference's Emerging Technologies program, which gives conference-goers the chance to interact with experiences that merge digital technology, cinema, art and science.
Essentially, it's an insight into what our world could look like in 10 years' time. At last year's Siggraph, the highlights included a "vibro-tactile sensations" VR suit, ShapeSpace VR, and 3D renderers such as Pixar's RenderMan. This year it's all about cyborg-like extensions and real-time simulations.
There are expected to be 26 Emerging Technologies installations and 13 experience presentations.
Watch a quick overview of some of the best installations in the trailer below.
MetaLimbs is an installation definitely worth trying out – a "multiple arms interaction metamorphism" where two robotic arms are attached to the human body.
The arms track the motion of the legs and feet relative to the torso, along with the local motion of the toes. This data is mapped to arm and hand motion, and even down to the fingers gripping with the artificial limbs. The system also adds force feedback to the feet, relaying what the manipulator's touch sensors detect.
But if cyborg realities are too much for you, how about touching holograms in mid air?
This demonstration is based on a touch development kit from Ultrahaptics – billed as the only mid-air tactile feedback technology. It provides a sensation of touch without any mechanical equipment in the visualisation area.
Alongside the Emerging Technologies program is Siggraph's 2017 Studio – this year called Cyborg Self: Extensions, Adaptations, and Integrations of Technology with the Body.
As wearable devices continue to proliferate, developments in extensions, insertions and interventions will either excite or completely frighten you. The prospect of hybridisation in our physical evolution has been explored in many sci-fi films, so while the concept isn't a new one, the technology showcased at The Studio will present a wide range of industry concepts in wearables, bio-tech, e-textiles and sensory extensions.
Watch the preview trailer below.
The Studio program will feature seven installations and eight workshops, including the manipulation of 3D-printed objects for entertainment applications.
Another is Textile++. Based on resistive touch sensing, the technology consists of cloth that can be applied directly to conventional clothing through folding and sewing – and at low cost.
And of course, what would Siggraph be without some of the latest techniques in interactive visuals?
The Mill teamed up with Epic Games and Chevrolet to create the short film The Human Race, which debuted at Game Developers Conference (GDC) 2017.
Powered by Unreal Engine 4 (UE4), The Mill's Cyclops production system and The Mill Blackbird, the film is one that anyone can play. Watch how The Mill created The Human Race here.
Watch the trailer for Real-Time LIVE! below.
Penrose Maestro is a suite of tools that enables artists to work on a VR story collaboratively in real-time, and within the VR environment.
Artists can move into production together, allowing for spatial context when taking notes from a director.
And how could we forget Star Wars Battlefront VR, the first VR game created using Frostbite, the same engine behind games like Battlefield?
The production team at Criterion Games faced the challenge of creating a VR game on PS4 with visuals that would match those of AAA titles.