Explore How UEFN Was Used to Create “Metallica: Fuel. Fire. Fury.” in Fortnite!
The Fortnite Team
“Metallica: Fuel. Fire. Fury.” is a Fortnite experience that blends the feel of a real-life stadium concert with fantastical VFX to take you into the world of hard rock titans Metallica.
Created by Magnopus and Harmonix in collaboration with the band, the experience takes players on a journey through five of Metallica’s best-known tracks, including “For Whom the Bell Tolls,” “Enter Sandman,” and “Master of Puppets.”
This experience is so much more than just a virtual concert.
Each song has distinct playable moments that have been designed specifically to align with the beat of the music, making it one of the most ambitious music experiences ever made in Fortnite.
That achievement is doubly impressive given Fuel. Fire. Fury. was developed in the live version of UEFN that anyone can download today!
“We made this entire experience using the publicly available version of UEFN,” says Dan Taylor, Creative Director at Magnopus. “We didn't get any special access to any fancy future builds or anything like that. It's the same tool that any player has got access to.”
In this post, we’ll dive into the advantages the team found in using UEFN to deliver Fuel. Fire. Fury., how iterating and live playtesting changed the game on the project, the tricks Magnopus and Harmonix employed to deliver an epic experience within memory constraints, and much more.
Collaborative Iteration with UEFN
Magnopus is a creative technology studio that produces “butterflies-in-your-stomach” immersive experiences. These range from virtual production for shows like The Mandalorian and Fallout to VR projects with NASA.

Even for a company accustomed to working with world-renowned names, a gig with Metallica is a huge deal. “Every single person in this company wanted to work on this,” says Sol Rogers, Global Director of Innovation at the company.
At music gaming studio Harmonix (part of Epic Games), that sense of excitement came with a degree of trepidation — as well as aiming to wow existing fans, they’d have the responsibility of breaking Metallica to a new audience.
“We've been wanting to do a game with Metallica for the entire time that Harmonix has been a company,” says Helen McWilliams, Creative Director of the Harmonix team at Epic. “All of us wanted to make sure we got it right.”
Magnopus has used Unreal Engine in the past on a wide range of spectacular immersive projects. For Fuel. Fire. Fury., however, the studio would be developing in UEFN in order to deliver the experience directly to Fortnite’s huge audience of millions.
Working in UEFN’s live collaborative environment would bring a number of advantages. In traditional game development, team members can often feel they’re working in isolation. With UEFN, that all changed.
From the very early days of the project, the team was jumping in and out of the live environment to iterate together. “It allowed amazing collaboration between all of the people on the team,” says Daryl Atkins, Executive Creative Director at Magnopus. “We’d meet every day inside UEFN, play sequences, iterate and tune them in real time — and closing that feedback loop was incredible.”
As well as providing a more enjoyable, cooperative way to work, the team also found this approach to development did wonders for the creative ideation process. “One of the things that's great about UEFN is just how quickly you can churn out ideas,” says McWilliams. “You can go from having an idea to putting it in the build so quickly. It allowed us to try things out and experiment.”
That ability to quickly test and iterate on ideas would be paramount as the project’s expansive scope became clear.
With each section having its own distinct art style and gameplay elements, Fuel. Fire. Fury. is effectively five different experiences in one, morphing from car racing to rhythm-based gameplay and beyond.
The team not only had to come up with a range of different environments; they also needed to build bespoke gameplay elements for each song, and to find subtle ways to blend them into the experience.
Striking the right balance between the music and gameplay was key. The team carefully designed the playable sections to be engaging without ever becoming distracting, providing plenty of space for players to appreciate the music at every moment.
They spent a lot of time thinking about the essential elements of the songs and coming up with ways to convey the feel of them through gameplay and the tools available in UEFN. For example, in the opening section, the powerful riff and high-energy, driving beat translated into a high-speed car chase through a volcanic world.
“It was really fun to take some of these songs and distill them down to their elements, and then blow that back up into an otherworldly experience,” says McWilliams.
To achieve this interactivity, the team broke down the musical elements of Metallica’s songs so they could be leveraged for gameplay. They created a beatmap grid in Unreal Engine 5 (UE5) and ported it over to UEFN, where it drove the entire experience.
“It was a great tool for all the departments, from VFX and lighting to animation, to make sure that everything was tightly coupled with the music perfectly,” says Atkins. “We're able to synchronize gameplay events, visual effects, audio cues to that beatmap grid.”
Using the beatmap, the team was able to drive the gameplay to the beat of the music, from the volcanic geysers that fire in unison on “Lux Æterna” to the bells swinging perfectly in time on “For Whom the Bell Tolls.”
These little details serve to reinforce the rhythm—not just visually, but in the gameplay as well.
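The post doesn’t share the team’s actual beatmap code, but the core idea, deriving a beat interval from a song’s tempo and firing synchronized events on it, can be sketched in a few lines of Verse. Everything here is an assumption for illustration: the BPM value, the device name, and the use of a VFX Spawner’s Restart call as the per-beat trigger.

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Minimal beat-scheduler sketch. The tempo value, device name, and
# the use of a VFX Spawner's Restart() as the per-beat trigger are
# all illustrative assumptions, not the team's actual beatmap code.
beat_scheduler_device := class(creative_device):

    # Hypothetical: a VFX Spawner placed in the level and
    # retriggered on every beat.
    @editable
    BeatVFX : vfx_spawner_device = vfx_spawner_device{}

    # Illustrative tempo; a real beatmap would carry per-song data,
    # not a single flat BPM.
    BPM : float = 116.0

    OnBegin<override>()<suspends> : void =
        BeatInterval := 60.0 / BPM
        loop:
            BeatVFX.Restart()   # fire the synced effect on the beat
            Sleep(BeatInterval) # wait exactly one beat
```

A real beatmap would be authored per song and drive far more than VFX, but the loop-and-Sleep pattern is the simplest way to keep Verse logic locked to musical time.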
Another key tool in the team’s arsenal was UEFN’s multi-track editor, Sequencer. Because the experience is largely linear, Sequencer was used to drive the content and sync everything up, ensuring each element played at the right time and for the right events.
“Sequencer was an absolute lifesaver on this project,” says Ross Beardsall, Lead Engineer at Magnopus. “We hung the entire experience out of it.”
The boss fight gameplay that occurs during the “Master of Puppets” section is a case in point.
“What I love about the ‘Master of Puppets’ boss fight is there's a lot of cool stuff going on in sync to the music,” says Taylor. “Not only do you have his attacks working in sync, but the lightning is firing in time with the guitar riffs, his eyes are flashing in time with the beat.”
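Again, the team’s actual setup isn’t public, but driving a largely linear show from Sequencer can be approximated in Verse with the Cinematic Sequence device. In this sketch, the class name, the per-section sequence array, and the assumption that StoppedEvent fires when a sequence finishes are all illustrative:

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Illustrative sketch: play one master sequence per song section,
# waiting for each to finish before starting the next.
show_runner_device := class(creative_device):

    # Hypothetical: one Cinematic Sequence device per song,
    # assigned in the editor in playback order.
    @editable
    SongSequences : []cinematic_sequence_device = array{}

    OnBegin<override>()<suspends> : void =
        var Index : int = 0
        loop:
            if (Sequence := SongSequences[Index]):
                Sequence.Play()
                # Assumption: StoppedEvent signals when playback ends.
                Sequence.StoppedEvent.Await()
                set Index += 1
            else:
                # No more sections: the show is over.
                break
```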
Handling Memory Constraints with Data Streaming
In order to bring distinct flavor and personality to every track in Fuel. Fire. Fury., the team had to push the boundaries of what was possible within the memory limitations of a Fortnite island.

They deployed a clever data-streaming technique, loading in the data layers for each section as it played, so that every section could make full use of the available memory budget.
“We associated all of our content for each of the sections to individual data layers that we could stream in and stream out on the fly in UEFN, which meant we could have a rich experience for each of those sections and not have to worry about cramming all of that into a fixed memory overhead,” says Beardsall.
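The blog doesn’t include the team’s streaming code, and the helpers below are hypothetical stand-ins rather than a real data-layer API. The sketch only illustrates the swap pattern Beardsall describes: stream one section’s content out as the next streams in, so only one song’s assets are resident at a time.

```verse
using { /UnrealEngine.com/Temporary/Diagnostics }

# Pattern sketch only. LoadSection and UnloadSection are hypothetical
# placeholders, not a real UEFN data-layer API; in the actual project,
# each section's content lived on its own data layer.
section_streamer := class:

    # Hypothetical helper: stream in the data layer for one song section.
    LoadSection(Name : string) : void =
        Print("Streaming in: {Name}")

    # Hypothetical helper: stream out a section that has finished.
    UnloadSection(Name : string) : void =
        Print("Streaming out: {Name}")

    # Swap sections so only one song's content is resident at a time,
    # keeping each section inside the island's fixed memory budget.
    TransitionTo(Previous : string, Next : string) : void =
        UnloadSection(Previous)
        LoadSection(Next)
```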
With the engineers hard at work squeezing everything they could out of UEFN’s memory systems, the creative directors on the project were looking to push the visual and gameplay design to its limits. They were always confident they could make something visually spectacular.
“The interesting thing when you're developing in UEFN is that because you have the full graphics pipeline of Unreal, you can make it look amazing,” says Taylor. “The trick is taking the gameplay elements from Fortnite and tweaking them in an innovative way.”
Verse proved essential to building these innovative interactions. The team used the programming language extensively to string together the gameplay elements provided out of the box in UEFN.
In the “Master of Puppets” section, for example, players must grind on six lightning rails in time to the music. Verse was used to build the system that registered where a player would grind, triggering specific VFX for incorrect grinds and awarding accolades for correct ones.
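As a rough illustration of that pattern, the sketch below wires zone volumes on each rail to a shared handler and awards an accolade on a grind. The device names are hypothetical, and the beatmap comparison that decides whether a grind is correct is left as a placeholder comment:

```verse
using { /Fortnite.com/Devices }
using { /Verse.org/Simulation }

# Sketch of the grind-check pattern described above. Device names
# and the single shared handler are illustrative assumptions; the
# real system also tracked which rail fired and compared it against
# the beatmap's expected rail for that beat.
rail_checker_device := class(creative_device):

    # Hypothetical: one mutator zone volume per lightning rail.
    @editable
    RailZones : []mutator_zone_device = array{}

    # Accolade granted for a correct, on-beat grind.
    @editable
    GrindAccolade : accolades_device = accolades_device{}

    OnBegin<override>()<suspends> : void =
        for (Zone : RailZones):
            Zone.AgentEntersEvent.Subscribe(OnRailEntered)

    OnRailEntered(Agent : agent) : void =
        # Placeholder check: a real build would consult the beatmap
        # here and trigger "wrong rail" VFX on a mismatch.
        GrindAccolade.Award(Agent)
```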
The team also leaned heavily on Unreal’s Niagara visual effects system to bring the wow-factor, creating effects that wouldn’t be possible in a real-life concert.
“When working with music clients, they often want to recreate their stage show, and we give them the opportunity to go far beyond it,” says Rogers. “They have to worry about health and safety. Not in our world. You can turn all the Niagara tools up to 11 and you can be bathed in fire if you want to. You could be underwater — you could fly to space. Working with artists to give them that new toolset and that new opportunity is so exciting.”
Niagara was used heavily throughout the experience, from the driving sequences, fireballs, and lightning strikes to the volumetric light shafts piercing the auditorium.
“Niagara is probably the most valuable tool alongside Sequencer in terms of producing music experiences, because it allows us to deliver so much in terms of the effects and spectacle, which is really at the heart of these concert experiences,” says Atkins.
Playtesting on Multiple Platforms
Beyond the design of the experience itself, the team found working in UEFN brought a further paradigm shift when it came to playtesting. Every day, the entire team would jump into Fortnite and play the latest version of the build. “We'd have open and honest discussions about what was working and what wasn't working, which let us very quickly find the best elements to focus on and the most fun things to put into the game,” says Taylor.

For those working on the engineering side, like Beardsall, playtesting in UEFN was refreshingly easy — everything was ready to go, straight off the bat.
“One of the most liberating things about working in UEFN is the fact we didn't have to set up any server infrastructure,” he says. “We could just use everything out of the box from Epic that's battle tested across billions of users.”
What’s more, because Fortnite is optimized to work across a range of devices, it was easy to test on different hardware.
“In UEFN, there is an ability to preview all of these different platforms,” says Atkins. “So, we got a very immediate sense of how it might look on different devices. But because we could immediately deploy to them, we could also test on the hardware itself — and that gave us a really close feedback loop to test how the features looked on different target platforms.”
The Future of Music Experiences in Fortnite
Atkins believes we’re only scratching the surface of what’s possible with virtual music experiences, particularly when it comes to gameplay.

He envisions a future where artists use these experiences to take audiences into the world of their songs and their ideas. “For me, that's the most exciting part of this — finding new ways to not just do a concert experience, but how do we create experiences that feel like a new form of expression for artists?” he says.
He highlights how UEFN’s Patchwork suite of devices, used for creating and manipulating music and visuals, gives artists the chance to begin creating these experiences themselves.
It’s a point echoed by McWilliams. She hopes Fuel. Fire. Fury. will inspire artists to dip a toe into virtual concerts — and bring totally different types of experiences.
“One of the big goals of this project was to inspire people and show them what can be done using UEFN,” she says. “We really want bands and artists and individuals out there to make their own content for everybody.”