The Mandalorian television series was built largely with game development tools. The result is the ability to create far richer worlds for television and feature films going forward. The group working with Lucasfilm spans a diverse range of companies, and the centerpiece of the demonstration is the Unreal Engine.
Over 50 percent of The Mandalorian Season 1 was filmed using this ground-breaking new methodology, eliminating the need for location shoots entirely. Instead, actors performed inside a massive, immersive 20’-high, 270-degree semicircular LED video wall and ceiling enclosing a 75’-diameter performance space, where practical set pieces were combined with digital extensions on the screens.
Digital 3D environments created by ILM played back interactively on the LED walls and could be edited in real time during the shoot, allowing for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution on systems powered by NVIDIA GPUs.
The environments were lit and rendered from the perspective of the camera to provide real-time parallax, as if the camera were capturing a physical environment, with accurate interactive light falling on the actors and practical sets. This gave showrunner Jon Favreau, executive producer and director Dave Filoni, visual effects supervisor Richard Bluff, cinematographers Greig Fraser and Barry “Baz” Idoine, and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve real-time in-camera composites on set.
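The perspective-correct rendering described above is commonly built on a "generalized perspective projection": given the tracked camera position and the corners of a flat display panel, an asymmetric (off-axis) view frustum is computed so the imagery on the panel lines up exactly with the camera's viewpoint, producing correct parallax as the camera moves. The sketch below is an illustration of that standard technique only, not ILM's actual StageCraft code; the function name and panel geometry are assumptions for the example.

```python
# Illustrative sketch of an off-axis (asymmetric) frustum computation for a
# flat screen panel viewed from a tracked camera position. This is the
# general technique behind perspective-correct LED-wall rendering, not any
# studio's actual implementation.

def off_axis_frustum(eye, pa, pb, pc, near=0.1):
    """Return (left, right, bottom, top) frustum extents at the near plane
    for a camera at `eye`, given screen corners in world units:
      pa = lower-left, pb = lower-right, pc = upper-left.
    """
    sub = lambda u, v: [a - b for a, b in zip(u, v)]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    norm = lambda u: [a / dot(u, u) ** 0.5 for a in u]

    vr = norm(sub(pb, pa))                    # screen right axis
    vu = norm(sub(pc, pa))                    # screen up axis
    vn = norm([vr[1]*vu[2] - vr[2]*vu[1],     # screen normal = right x up
               vr[2]*vu[0] - vr[0]*vu[2],
               vr[0]*vu[1] - vr[1]*vu[0]])

    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)            # distance from the eye to the screen plane
    s = near / d                # scale screen extents onto the near plane
    left   = dot(vr, va) * s
    right  = dot(vr, vb) * s
    bottom = dot(vu, va) * s
    top    = dot(vu, vc) * s
    return left, right, bottom, top
```

With the camera centered in front of the panel the frustum comes out symmetric; shifting the camera sideways skews it, which is exactly what makes the on-wall image track the camera's changing viewpoint.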
The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of partners such as Golem Creations, Fuse, Lux Machina, Profile Studios, and ARRI together with ILM’s StageCraft virtual production filmmaking platform and ultimately the real-time interactivity of the Unreal Engine platform.
The demonstration is highly informative, letting other studios study Lucasfilm's achievements. The resources involved can now be applied more widely.
The 2005 film Doom was based on the game of the same name. The film's opening and closing credits were rendered entirely using the id Tech engine configured for ray tracing. While game engines are designed for gaming, they can be adapted for feature-film rendering.
Since the golden age of first-person shooters, rapid technological evolution has been shaping what was to come. Motion capture allowed for completely immersive design while films still depended on actors and film stock. Much of this has now converged, and the future will likely grow more imaginative as software continues to improve.