As part of Screen Producers Australia’s digital-only Screen Forever program, four industry experts sat down to discuss the role of Epic Games’ Unreal Engine in virtual post-production. Moderated by Unreal Engine Evangelist Chris Murphy, the panel brought together Clayton Jacobson of Dreamscreen Australia, Connie Kennedy of Epic Games’ LA Lab, and Josh Tanner, writer/director of Decommissioned, one of six short films showcased by Epic Games in last year’s Unreal Engine Short Film Challenge.
By combining physical sets with real-time effects projected onto screens, productions of all sizes are reaping the benefits of the game engine, which powers high-profile videogames like Fortnite and Mass Effect alongside the visual effects for acclaimed screen productions like Star Wars: The Mandalorian.
The focus of the panel was not what Unreal could replace, but what it could enable. Kennedy pointed out that Unreal offers a broad enough set of creative tools to fill the gaps in a huge range of projects, from a high-budget production like Star Wars: The Mandalorian to an indie short: ‘it’s giving the opportunity for anyone to be able to use the engine with whatever resources they’ve got.’
UNREAL ENGINE SHORT FILM CHALLENGE
In 2020, Epic Games, the developers of Unreal Engine, offered a free two-week course in using Unreal Engine for virtual post-production, with one film from each state or territory awarded $20,000 in production funding at the conclusion of the course. Further to this, one national winner was awarded an additional $50,000 in prize money, and all the finalist films were promoted on Unreal Engine’s website.
Tanner’s Decommissioned, a science fiction horror film set on the International Space Station, was one of the six finalists. Employing a combination of physical sets and VFX in Unreal, the small team were able to realise a script that Tanner had had in his back pocket for a long time: ‘We had written the script for that short years ago, around 2017, but we just couldn’t see how it would ever be possible to make that short film with a traditional VFX pipeline… the fact that this technology sort of sparked that back into life was just incredible.’
WHAT THE UNREAL MAKES REAL
On top of making high-quality visual effects available to teams of all sizes, Unreal Engine can also render visual effects in-camera, creating what Ashley refers to as a ‘convergence of departments that would have been in isolation from one another’, allowing tasks that would usually be assigned to post-production to take place alongside pre-production and production. To reap the benefits, a team will need to focus early on production budgeting, clarifying in the earliest stages which assets will be physical and which virtual, and whether they’ll appear in the fore-, mid- or background of different shots.
Kennedy pointed out that once that budgeting had taken place, using real-time effects in Unreal allowed the entire team to see how the visual effects would look in the final cut. This imbued sets with a sense of spontaneity and serendipity that might otherwise be absent when relying on post-production effects. ‘All of those opportunities for improvisation are really difficult to plan,’ she said, ‘And I think when you have the ability to work in real time, you’re able to capture all of that.’
Ashley described a scene where a character was faced with a waving skeleton played on set by an actor in a motion-capture suit that was rigged in Unreal to an animated skeleton.
‘What was interesting was the skeleton waved in such a way that was very sort of unique, and the actor mimicked it, and then the skeleton gave him this look of contempt,’ Ashley said, ‘and I realised right then and there, you would never get that on, you know, a normal shoot, because the actor wouldn’t know what was going to eventually be on screen.’
Tanner agreed, adding that it changed not only the performances but the tone of the entire set: ‘the fact that everyone on set was taking photos on their phones, and saying, you know, “Oh my God, this looks like space!” it was an incredibly charging experience to sort of go forward with.’
GETTING PRACTICAL WITH VIRTUAL POST-PRODUCTION
Ashley added that while the engine’s ability to generate effects was exciting, what excited him more were the less flashy production solutions it could offer.
‘Think of those 30 scenes that you have to film at the end of production – the scene at the dentist, the scene where the couple meet on a street corner… you know, those one-offs that still mean you need the crew to lock down a whole street just for one small moment where Sandra says, “Yes, I love him, and I think I’ll go to the party!”
‘Now we can truncate all those, and minimise what would be a month and a half of really arduous location moves into a much more manageable month.’
But the biggest thing that stuck with him after working with real-time effects on set was the sense of respect it fostered between departments. Far from the ‘military operation’ of a traditional film set, he described the experience as ‘one of the most gentle, quietest sets I’ve been on.’
‘What was really amazing was how quickly a respect built up between all departments,’ he continued, ‘because for the first time, crew weren’t just saying “Let’s leave it to post” and not really knowing what that means. They were literally seeing the architects of the visuals that they were then capturing live.’
Screen Producers Australia’s annual conference Screen Forever ran 16–18 February.
Watch the Unreal Short Film Finalists on the Unreal Engine website.