The following article was originally published by Epic Games and Unreal Engine in collaboration with Journey.
Set against the evocative backdrop of the Tuwaiq mountains, Qiddiya City is one of Saudi Arabia’s most high-profile megaprojects. This vast entertainment city, comprising 25 districts and spanning 376 square kilometers, will feature the largest theme park in the Middle East and a new world-class motorsports racetrack, alongside art centers, golf courses, hotels, concert venues, and more.
To say the project is huge is an understatement.
The task of conveying the breadth of what’s on offer at Qiddiya City—and more than that, telling its story—fell to Journey.
Using Unreal Engine-powered virtual production techniques and real-time visualization, our team created a slew of films to build awareness and promote various aspects of the city.
Epic Games caught up with the team to hear more about the experience of working on one of the world’s largest and most complicated projects, and how using real-time technology made for a smoother, slicker filmmaking process.
Hello! Please could you introduce yourself—what’s your name and role at Journey? What’s your background?
I’m Joseph Yong, the Lead Unreal Developer at Journey. With a background as a developer, technical artist, and 3D generalist, I’ve been working with Unreal Engine for about 12 to 13 years. Five years ago, I joined Journey and took on the task of building our Unreal team from the ground up.
Some of Journey’s studios have been in business for over 20 years. How, why, and when did you first start using Unreal Engine?
Journey started using Unreal Engine in 2019, but we’ve evolved and expanded our use as UE has evolved. We initially started using it for project pitches, and we also used it internally for its VR capabilities, which allowed us to explore spaces with our internal creative teams.
Then, there was a shift in the architecture world, and everyone wanted to move to real-time as a design tool, especially for its ability to produce renderings. Unreal Engine was the clear-cut option: it could render quickly and deliver excellent visuals while the overall design of a project continued to evolve.
More broadly, the faster iteration times and interactive nature of game engines enable us to have more efficient conversations with clients and within our team.
Once Unreal Engine 5 came out in 2022, we were able to branch out further and use it to create digital content. Now, we use it in our museum work to help us preview media at scale and in context, especially when it is projected onto an irregular screen or mapped around a curved wall. That’s the kind of visual you can’t really judge by looking at a computer screen.
In addition, we’ve been building a library of Unreal tools and plugins that we can rapidly deploy across all our projects (for example, a PCG crowd-generation plugin). This has helped us thrive in a very fast-paced industry.
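By way of illustration, here is a minimal Unreal Python editor-scripting sketch in the spirit of that kind of quickly deployable tool: it simply scatters placeholder crowd meshes across a flat area of a level. The asset path, counts, and spawn logic are hypothetical examples, not Journey’s actual PCG-based plugin.

```python
import random
import unreal

# Hypothetical crowd-scatter helper: load a placeholder mesh and spawn copies
# at random points on a flat area of the open level. This is a simple
# editor-scripting sketch, not Journey's PCG-based plugin.
def scatter_crowd(mesh_path, count=200, half_extent=5000.0):
    mesh = unreal.EditorAssetLibrary.load_asset(mesh_path)
    actors = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
    for _ in range(count):
        location = unreal.Vector(
            random.uniform(-half_extent, half_extent),
            random.uniform(-half_extent, half_extent),
            0.0,
        )
        rotation = unreal.Rotator()
        rotation.yaw = random.uniform(0.0, 360.0)  # random facing direction
        actors.spawn_actor_from_object(mesh, location, rotation)

# Example call with a hypothetical asset path.
scatter_crowd("/Game/Crowd/SM_Person_Placeholder")
```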
Can you tell us about the films you created for Qiddiya City? What did the work entail, how long did it take, and what will the films be used for?
The Qiddiya City marketing team is using the films as part of a coordinated campaign to build awareness and promote a number of exciting districts in the city, from a pioneering stadium to anime villages. With virtual film production techniques, Journey was able to showcase all 376 square kilometers of the world-class sports and entertainment destination—including the Formula One-ready racetrack, the first Six Flags theme park outside of North America, and the prestigious culture and arts center.
The films also underscore how Qiddiya’s monumental attractions overlap with the city, creating a multidimensional entertainment zone that blends, blurs, and augments reality.
One film showcases a futuristic motorsports track that will weave through the heart of Qiddiya. With quick cuts and high energy, the film follows a race car whipping around the track and past attractions like a roller coaster and a glass-bottomed pool.
The work, which was done in Unreal Engine, was structured much like a typical video production: narrative, pre-production, linear production, editing, sound, and grade. However, because it was a virtual production (VP), we could do away with clunky green screens and instead create a hyperrealistic virtual backdrop for Qiddiya City. UE virtual production techniques gave us a three-dimensional virtual space that was remarkably immersive for the actors, crew, and audience.
The film took about four months from start to finish, from initial narrative to final output. That might sound like a long time, but it was significantly expedited by overnight renders in Unreal. Before UE, it would take us a weekend or more for things to render. Now, we can re-render a shot overnight, fully textured and with all the bells and whistles, and we can see the final pixels much faster.
What were the “must-haves” on the project—was achieving photorealism critical, or conveying the expanse of the project, for example—and how did UE help deliver on this?
Above all, we needed the project to be exciting. From the speed of the race cars tearing around the circuit to the vibrations to the music, the film needed to convey the energy and noise of the Qiddiya City motorsports track. It was a nice shift for us since many of our projects involve slow-panning shots and deliberate camerawork. This one had to be action-packed to demonstrate the master plan of Qiddiya as a fun, vibrant location with lots of lights and excitement.
To achieve that effect, we turned to virtual production techniques. VP generally involves large three-dimensional LED stages called “volumes” that display real-time “virtual twin” backgrounds of real environments. The backdrop is rendered to match the tracked motion of the camera, adding realistic depth and parallax to the filmed shots. VP avoids the need for layers of VFX and post-production, and it means the actors can be “on location” without ever leaving the studio.
VP also allowed us to quickly see the details of what we were rendering, which was crucial. If we had filmed the shots in the traditional way, it would have been challenging to previsualize them and communicate the vision to the client in a reasonable timeframe. But with a virtual production in UE, we could see everything in real time and adjust it all in shot. It allowed us to be hands-on with what we were creating rather than waiting for computers to render it out.
Why did it make sense to create these films in Unreal Engine rather than using another game engine or opting for offline rendering?
Over the last five years, Unreal has become our bread and butter because of its efficiency. Offline renders may still have a visual edge in some cases, but they can’t compete on turnaround times.
Because we work in an industry where projects take two to three months, not years, we need to go from storyboard to final pixel—and facilitate conversations around each stage—very efficiently. The quicker we can have a decently rendered shot that conveys the full beats of what we want to communicate, the quicker we can receive client feedback and iterate on it. When it doesn’t take two weeks to re-render a shot, we can be much more flexible and creative in a much shorter time frame.
Unreal also allows us to use virtual production whenever we can, which is another time-saver. It lets us do a lot of preplanning and techvis to limit the time spent compositing. Simply put, we wouldn’t be able to deliver the kinds of projects we’re known for without Unreal.
Please can you tell us about any UE features you relied on to produce the films, and what advantages they brought?
It’s hard to single out one specific feature here or there because we use essentially everything available in UE. That said, certain features are indispensable for our core work.
- Lumen lets us incorporate real-time global illumination and reflections into our scenes to make the lighting look believable.
- World Partition with Data Layers lets us build enormous maps and load only the sections we need, when we need them.
- Nanite means we can bring production-quality models with pixel-scale detail into a game engine. Without it, we couldn’t make these shots at all.
- Procedural Content Generation (PCG) lets us build iterative tools and content of any complexity. It’s how we can quickly auto-populate a forest full of trees (among many other assets).
- Movie Render Queue gives us enormous-resolution shots and the ability to render out additional passes for post-production (see the scripting sketch below).
But really, Journey uses everything under the hood in Unreal Engine 5.
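To make the Movie Render Queue point a little more concrete, here is a minimal Python sketch of the kind of render-queue scripting the list above refers to: it queues one level sequence and renders it in-editor to a high-resolution EXR image sequence. The map path, sequence path, job name, and resolution are placeholders, not Journey’s actual configuration.

```python
import unreal

# Queue one shot in the Movie Render Queue and render it in the editor.
# All asset paths and settings below are placeholder values.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.job_name = "Shot_010_Racetrack"
job.map = unreal.SoftObjectPath("/Game/Maps/Masterplan")
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Shot_010")

config = job.get_configuration()
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_resolution = unreal.IntPoint(3840, 2160)  # UHD frame size

# Render the standard deferred pass and write an EXR image sequence,
# keeping extra data available for grading and compositing in post.
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)

# Kick off the render with the in-editor (PIE) executor.
executor = subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```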
What were your biggest challenges in pulling off this project, and how did you tackle them?
Because it represents an entire city, the scale of the Qiddiya master plan was immense. That alone presented technical challenges. If everything in the master plan loads at once, you can’t move around the scene; there’s so much data that navigation becomes slow and unwieldy. Thankfully, Unreal’s World Partition feature let us load in and out what we needed without disrupting the scene.
Another challenge was believability. We couldn’t shoot a race car on a track that doesn’t exist yet, but we could get the race car in the studio. To make it look as close to the real thing as possible, we combined live-action physical props—such as the race car and the roller coaster car—with virtual production. Then, we had to think about tiny details like the reflections in the driver’s helmet visor, which change as he races around the track. Traditional CGI would have made those reflections look fake, and compositing them in would have taken too much time. Unreal’s Lumen feature gave us all the ambient light and reflections we needed, which made the scene far more physically accurate.
The result was a comprehensive and action-packed representation of the Qiddiya master plan, produced in a groundbreaking way to bring the concept to life. Within one hour of launch, it was the trending topic on X in Saudi Arabia, and it generated over 650 media reports and 363 million social media impressions.
How does real-time technology change the workflow compared to the traditional method of creating film?
Real-time technology helps us align the creative and pre-production teams. It means we can quickly review a master plan, discover angles, link shots together to define the narrative, and see what works. This is a massive time-saver, and it helps us iterate on the project quickly. (We’ve even sent a livestream of a shoot to a client so they could give us feedback as we captured the shots.)
Beyond that, real-time technology means we can shoot footage in order, so the storyboarding makes sense, but we can also deviate from that order if needed. If someone isn’t available for a shot, we can change the background and film a different shot while waiting. Likewise, if we need to capture a shot at golden hour, we can make it golden hour for five hours. Using this technology means we can be reactive on shoot days, and we can always get the exact shot we need.
Can you tell us a little about how you used Unreal on the Peak Tram project, including any specific features that helped complete it?
The Peak Tram, a major tourist attraction in Hong Kong and the oldest funicular in Asia, was the first project on which we rendered a linear production in Unreal (as opposed to an offline renderer). To engage and educate the attraction’s estimated four million annual passengers, Journey created immersive experiences for the newly renovated tram from the bottom to the top of Victoria Peak.
To create immersive installations for the Peak Tram visitor experience, we needed an eight-minute render for three synchronized screens, and we needed it to be one continuous shot. That’s roughly 14,000 frames from start to finish (eight minutes at around 30 frames per second). With a traditional render tool, we just couldn’t have done it.
The size of the screen was another challenge. It was ten meters long and three to four meters tall, and there was also a screen across the ceiling and another screen on the back wall. With 100 people in the waiting area at a time, we had to ensure that everyone had something visually pleasing to look at from where they were standing. It was a massive undertaking.
The benefit of working in Unreal Engine was that we could get as close to the final pixel as possible. Working in the editor, we saw 99% of the shot at all times, and everyone who looked at the screen had the same context. That was especially important because the project was taking place during the pandemic. We had to do everything remotely and make sure that the client in Hong Kong could still understand our vision and assess whether it had the impact they wanted.
Then, during production of the Peak Tram project, Unreal Engine 5 came out. Suddenly, the lighting effects and the render support were much better; we could do so much more. You don’t normally switch engines midway through a project—it’s a significant risk—but it offered such a leap in visual quality that it was a no-brainer.
Although we were already producing a lot of interactive media, using Unreal was a big step forward. Our team had to learn to render in a game engine instead of a traditional renderer, but it made a massive difference. We never would have understood the scale of the screen or been able to balance the composition otherwise.
How did your use of Unreal on the Battersea Power Station project differ from other work you’ve done with Unreal, and what was the benefit of using the engine?
It’s our job as creatives—especially in museums and cultural spaces—to find the most engaging way for the client to tell their story. This is especially true when the audience is wide, and the content needs to appeal to an eight-year-old as much as a 78-year-old visitor.
With that in mind, the Lift 109 Experience at Battersea Power Station offered a twist on the usual methods of conveying information about a cultural site. We used Unreal on an interactive touchscreen display that lets users move coal into a turbine and power a massive physical chandelier. Although the concept is simple, it gives visitors a playful and accessible way to understand how a turbine works while also complementing the broader cultural story of the site.
Where do you see the merging of digital media and physical reality going in the future, and what’s exciting about the opportunities afforded by game engine tech in this space?
Within the museum and cultural space, we’re going to be pushing the visual quality of what we can do. We’ll also be doing more and more work with interactive elements as the renderers improve. As Unreal continues to develop, the quality we can deliver will get better and better.
Within the architecture and construction space, our focus is moving toward digital twins. These twins allow us to create a visual representation of a place or a future place, but with an extra layer to help visualize data like traffic patterns or energy consumption. It’s a way of facilitating important conversations and helping people tell the right stories about what they’re building, which is a big part of what we do at Journey.
We know we’ll be able to achieve more as game engine tech evolves. Because Journey deliberately maintains all its tools and stacks in-house, we can take our big technical breakthroughs with us to each new project. That allows us to focus our energy on being creative and delivering cutting-edge projects.
As game engines continue to develop, we’re excited to see where they go. We’re at the forefront of the industry right now and looking forward to it becoming even bigger than it already is.