Though mainly aimed at marketers, Adobe’s latest Sneaks session includes AI-driven tech that may transfer to Creative Cloud.
Sneaks is a regular part of Adobe’s conferences – giving a chance for the company’s R&D staff to get up on stage and show off early-stage new tech well before it appears in products (if it even makes it that far). It began at its Max conference for designers and artists, and has become part of the Summit expo for marketers.
The R&D staff demo what they’ve been working on to an audience of thousands alongside a celebrity presenter – past hosts have ranged from Nick Offerman and Get Out director Jordan Peele at Adobe Max in the US, to Davina McCall and Jonathan Ross in the UK. At this year’s Summit in London, it was Rob Brydon’s job to pretend to understand what he was being presented with, mock an audience of marketers for working in marketing, and have his head stitched onto a shoe (below).
There were six new tools shown, of which only one would directly impact creatives – and maybe not in a good way – but the underlying tech could appear in Creative Cloud apps in other forms.
The creator of Video Ad AI claims it uses machine learning to measure the effectiveness of video ads destined for social media, and can suggest how to improve them based on the performance of other ads. It can also automatically create shorter versions of videos – of a 60-second TV commercial, for example – to make them more effective on social media platforms such as Instagram.
The tech can even create these edits as Premiere Pro projects, for editors to refine and adjust.
Some editors and content producers will find this useful, others will see it as a computer trying to tell them how to do their job. Your mileage may vary.
Project New View is a VR-and-voice version of Adobe’s business and marketing analytics tools. The staffer showed 3D performance graphs displayed in VR – which made ‘anomalies’ easier to see – and the distribution of visitors to a website displayed on a 3D globe. The staffer was also able to control the demo using Alexa/Siri-style voice controls.
While this was aimed at users of Adobe Analytics, the use of voice control could be applied across many different apps to speed up certain features – for example, telling Photoshop to select a particular brush from a set of hundreds by name – or getting After Effects to export to a specific format by saying a single phrase.
See and Shop automatically added sales info and links to a website, while Launch It added tracking to a site without anyone having to manually tag everything. Experience Analytics brought together online and offline analytics to tell you, for example, that online customers don’t buy swimwear because they want to try it on in a store first.
These are all technically impressive uses of machine learning – but hardly creative. Neither was Master Plan, but the tech inside it is something that most creatives would love to have access to – a tool that scans long email threads and automatically picks out dates, times and actions so you don’t miss things buried in them.
Master Plan is aimed at planning marketing campaigns – but it would be great to use it to schedule the elements of creative projects.
That all of these use some kind of machine learning shows how much Adobe is looking to its Sensei ‘artificial intelligence’ platform for future innovation – not only to automate processes but to find new ways to display information that’s genuinely useful.