The 7 best highlights from Siggraph 2017

Pinpointing the best things at Siggraph is a tricky exercise, but here are seven highlights spanning virtual reality (VR) and augmented reality (AR) experiences, animations, realtime rendered events and new tech from the recent event in Los Angeles.

For a more in-depth look at the realtime entertainment, robotics and serious tech from Siggraph this year, check out Ian Failes' round-up.

Neurable: Using your brain to journey into VR

Like something out of a movie - and Siggraph can feel like that occasionally - University of Michigan startup Neurable showcased a brain-computer interface headset that let users control virtual objects with their minds.

Essentially, the headset was a modified HTC Vive fitted with electroencephalography (EEG) sensors and eye-tracking sensors. So, did it work? Basically, yes. Neurable plans to use the tech to control toys and electronic devices, and to jump into VR games.

Neurable / Manuel Alducin

Magic Bench: Together in augmented reality

One of the longest lines at Siggraph was to try out Disney Research’s Magic Bench AR experience. Here, users sat on a bench looking at a screen that displayed their image and a host of animated characters coming up to them, sitting down, and generally interacting with the participants.

Disney made that happen via a depth sensor and a 3D reconstruction of the scene with pre-animated content and some extra smarts to allow for the fun interaction.

Pinscreen: 3D avatars from one photo, quickly

Every day we seem to get closer and closer to being able to communicate in VR and AR, but making an accurate CG representation of yourself can be complex, which is why Pinscreen wants to help you do it with just a single image. To pull this off, Pinscreen relies on neural networks and optimization techniques to infer the hair style, the geometry of the face, and the textures and lighting needed for your avatar. The company even showed the avatar being animated in realtime by your own facial expressions.


Meetmike: Meet the future of digital humans

Fxguide’s Mike Seymour and a host of companies in the digital human, realtime rendering and VR space were behind MEETMIKE, a daily presentation at Siggraph that involved Seymour’s virtual avatar interviewing industry players, themselves in virtual form, in realtime on the show floor.

Seymour’s avatar took advantage of months of work to scan and build a photorealistic version of the presenter, which was then driven live via a facial capture headset and rendered in realtime (by Epic Games’ Unreal Engine).

Meetmike / Manuel Alducin

Song of a Toad: Best in Show and new tools to boot

Computer animation is a mainstay at Siggraph, but the Computer Animation Festival winner, Song of a Toad, by Filmakademie Baden-Württemberg’s Kariem Saleh, went one step further. It employed several practical devices, including live puppeteering for animation and even realtime compositing in The Foundry’s NUKE, in the filmmaking process. The effect was an added level of authenticity to the humorous tale.

Change your style

You might not have known you wanted to do this, but after seeing a demo from Czech Technical University and Adobe, you surely will be interested in experimenting with something called 'Stylized Facial Animation'. It enables you to apply a certain artistic style - such as a pastel sketch or a bronze statue - to video of an actor's face. Adobe has even released a cut-down demo called FaceStyle that you can try out.


Flock: Become a virtual bird

Finally, the one thing to remember at Siggraph is that it's OK to act a little strange, and one of the strangest sights saw users in untethered VR headsets and adjustable feathered wings hanging out together as a 'flock' - if only virtually.

The idea here is to enable large-group interaction, still one of the tougher things to pull off in VR.

Flock / Manuel Alducin

