Fortnite’s Re:Imagine London - Bringing London to Life in UEFN
The Fortnite Team
Re:Imagine London (island code: 1442-4257-4418) is a collaboration between the renowned Zaha Hadid Architects (ZHA) and Epic Games to accurately recreate an iconic area of London. Fortnite players are invited to contribute by designing their own unique buildings within the island.
This project had several key objectives:
- To demonstrate a new type of gameplay within Unreal Editor for Fortnite (UEFN).
- To push the boundaries of Verse and other UEFN features, testing them to their full potential and driving improvements.
- To showcase how Fortnite can introduce innovative ideas to a wide audience in an engaging way.
- To explore how experienced Unreal Engine users can apply their expertise to UEFN.
- Most importantly, to inspire players to envision how future cities could be more walkable, vibrant, green, and sustainable.
ZHA, already experienced with Unreal Engine, played a pivotal role in this project. They not only designed the buildings but also crafted the gameplay, incorporating real-world architectural and construction concepts. ZHA and Epic also collaborated with Accucities, another expert in Unreal Engine, who brought highly accurate models of London into UEFN to create the playable area in the game.
To learn more about the story behind Re:Imagine London, check out this video on The B1M YouTube channel.
A New Kind of Building
Re:Imagine London showcases a whole new building system written in Verse. Players choose from six building types (Walkway, Structure, Park, Commercial, Office, and Residential) and design their building by placing voxels of those types within the build site.

The area players can explore features several iconic landmarks, such as St. Paul’s Cathedral, the Tate Modern gallery, and Shakespeare’s Globe Theatre. Within this playable area of London, there are four build sites.
Players define the shape and category of buildings by placing voxels, while the new system automatically generates realistic structures. ZHA designed the modular pieces and set the rules for how these pieces fit together, enabling the creation of buildings much larger than a single voxel.
Voxel Grid and Raycasting
At the heart of Re:Imagine London is a 3D grid of “cells” for each build site, which stores information about the type of building voxel present (if any). Implementing this in Verse is straightforward when using an array of “optional” references. Additionally, a simple raycast routine takes a starting location and direction, stepping through the grid until it encounters an occupied cell.

Input Handling
The island uses a number of Input Trigger Devices to respond to controls such as Fire (add voxel), Aim (remove voxel), Next/Previous Item (change category), and Pickaxe (open custom menu).

When a player presses the Fire button, we first check if they are in a “build zone” and then run a raycast to determine which voxel face they’re looking at. If applicable, a new voxel is added at that location.
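As a rough sketch of the flow described above, assuming invented names and an integer-stepped raycast (a simplification of sampling the player’s actual view ray), the grid, raycast, and Fire handling might look like:

```python
# Illustrative sketch only: the island implements this in Verse using an
# array of "optional" references. All names and sizes here are assumptions.

SIZE = 8  # cells per axis of a build site (assumed)

class BuildSite:
    def __init__(self):
        # A missing key plays the role of an unset optional: the cell is empty.
        self.cells = {}

    def in_bounds(self, cell):
        return all(0 <= c < SIZE for c in cell)

    def get(self, cell):
        return self.cells.get(cell)

    def add_voxel(self, cell, category):
        if self.in_bounds(cell) and self.get(cell) is None:
            self.cells[cell] = category
            return True
        return False

    def raycast(self, start, step):
        """Step cell-by-cell from `start` until an occupied cell is hit.
        Returns (hit_cell, last_empty_cell); the last empty cell is the
        face-adjacent cell where a new voxel would be placed."""
        cell, prev = start, None
        while self.in_bounds(cell):
            if self.get(cell) is not None:
                return cell, prev
            prev = cell
            cell = tuple(c + d for c, d in zip(cell, step))
        return None, prev

def on_fire(site, player):
    """Fire pressed: only acts if the player is inside a build zone."""
    if not player["in_build_zone"]:
        return False
    hit, face_cell = site.raycast(player["eye_cell"], player["aim_step"])
    if hit is None or face_cell is None:
        return False
    return site.add_voxel(face_cell, player["category"])
```

In UEFN this logic would be wired to Input Trigger Devices rather than called directly; the sketch only shows the order of checks.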
Procedural Generation in Verse
Re:Imagine London implements two types of procedural generation in Verse: Shape Grammar and Wave Function Collapse. Shape Grammar is applied to 3D buildings (Structure, Commercial, Office, Residential), while Wave Function Collapse is used for 2D “flat” areas (Walkways, Parks).

For both techniques, the team created a large set of modular Building Props — over 330 in total — which Verse then spawns at runtime. The code is deterministic, only deleting and spawning Props as needed.
Here’s a brief explanation of how these techniques work:
Shape Grammar
First, all voxels of each category are “decomposed” into larger convex boxes in order to apply Shape Grammar.

Shape Grammar consists of simple rules where each rule takes a box and generates one or more sub-boxes for subsequent rules. For example, a rule might slice a tall box into a one-voxel-high “floor,” while another rule assigns the corners to one rule and the walls to another. A special rule spawns a Prop at the size and location of the box.
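To illustrate the decomposition step, here is a minimal sketch in 2D (the island’s Verse code does the equivalent in 3D, and the names below are invented) that greedily merges voxels into larger boxes:

```python
# Hypothetical greedy decomposition of a voxel footprint into larger
# axis-aligned boxes, shown in 2D for brevity.

def decompose(voxels):
    """Merge a set of (x, y) cells into rectangles, returned as
    (x, y, width, height) tuples."""
    remaining = set(voxels)
    boxes = []
    while remaining:
        x0, y0 = min(remaining)  # lowest remaining cell
        # Grow the box along x as far as cells are present...
        w = 1
        while (x0 + w, y0) in remaining:
            w += 1
        # ...then along y while every cell of the next row is present.
        h = 1
        while all((x0 + i, y0 + h) in remaining for i in range(w)):
            h += 1
        # Consume the covered cells and record the box.
        for i in range(w):
            for j in range(h):
                remaining.remove((x0 + i, y0 + j))
        boxes.append((x0, y0, w, h))
    return boxes
```

A greedy pass like this does not always find the minimal number of boxes, but it keeps the rule system working on a few large boxes instead of hundreds of unit voxels.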
Each rule is defined as a separate Verse class, which is assembled into a “tree” in code. This approach simplifies the creation of new rules, experimentation with different ideas, and the assignment of distinct styles to each type of building. Applying different rules to the same set of voxels yields varied results, as demonstrated in the image below.
Rules can also select an “upgraded” piece when specific combinations of voxels are placed together, such as a Park and a Residential voxel.
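A toy version of this rule tree, with invented rule and box names (the island defines each rule as a Verse class), might look like:

```python
# Each rule takes an axis-aligned box of voxels and either produces
# sub-boxes for a child rule or, at a leaf, "spawns" a prop at the
# box's size and location. Names are illustrative, not from the island.

from dataclasses import dataclass

@dataclass(frozen=True)
class Box:
    x: int; y: int; z: int  # min corner, in voxels
    w: int; d: int; h: int  # size, in voxels

class SpawnProp:
    """Terminal rule: records one prop covering the whole box."""
    def apply(self, box, out):
        out.append(("prop", box))

class SliceFloors:
    """Slices a tall box into one-voxel-high 'floors' for a child rule."""
    def __init__(self, child):
        self.child = child
    def apply(self, box, out):
        for level in range(box.h):
            self.child.apply(Box(box.x, box.y, box.z + level,
                                 box.w, box.d, 1), out)

# Rules are assembled into a tree, so swapping a subtree restyles a
# whole building category:
office_style = SliceFloors(SpawnProp())
props = []
office_style.apply(Box(0, 0, 0, 2, 2, 3), props)  # 2x2 footprint, 3 high
```

Because the tree is plain data, running a different tree over the same boxes yields a different style, which matches the varied results described above.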
Wave Function Collapse
Wave Function Collapse (WFC) is a technique for randomly generating an area based on rules that determine how pieces can fit together. This method was discussed in the State of Unreal 2022 talk, The Matrix Awakens: Generating a World.

In this implementation, the team defined a set of tiles and specified which tiles can be adjacent to each other. A “label” is applied to each edge, and tiles can only be placed if the labels match. The algorithm selects a location on the grid, randomly chooses (or “collapses”) from the possible options, and then propagates the consequences of that choice to the possible options at other locations. This process continues until the entire region is generated.
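A self-contained toy version of the edge-label scheme, using an invented tile set on a 1D strip (the island works on a 2D grid of walkway and park tiles), could look like:

```python
import random

# Two tiles may sit side by side only if the touching edge labels match.
# Tile names and labels are invented for this sketch.
TILES = {
    "grass":      {"left": "g", "right": "g"},
    "grass2path": {"left": "g", "right": "p"},
    "path":       {"left": "p", "right": "p"},
    "path2grass": {"left": "p", "right": "g"},
}

def compatible(a, b):
    return TILES[a]["right"] == TILES[b]["left"]

def collapse(length, rng):
    # Every cell starts in "superposition": all tiles are still possible.
    options = [set(TILES) for _ in range(length)]
    while any(len(o) > 1 for o in options):
        # Collapse the most-constrained undecided cell to one random tile...
        i = min((k for k, o in enumerate(options) if len(o) > 1),
                key=lambda k: len(options[k]))
        options[i] = {rng.choice(sorted(options[i]))}
        # ...then propagate the consequences until nothing changes.
        changed = True
        while changed:
            changed = False
            for j in range(length - 1):
                ok_r = {t for t in options[j + 1]
                        if any(compatible(a, t) for a in options[j])}
                ok_l = {t for t in options[j]
                        if any(compatible(t, b) for b in options[j + 1])}
                if ok_l != options[j] or ok_r != options[j + 1]:
                    options[j], options[j + 1] = ok_l, ok_r
                    changed = True
    return [o.pop() for o in options]

row = collapse(6, random.Random(7))
```

With this tile set every label has a matching tile on both sides, so the toy never hits a contradiction; production WFC implementations also need a strategy (such as backtracking or restarting) for when propagation empties a cell.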
View Mode
To help players visualize what kinds of voxels they’ve placed, they can toggle a View Mode, which overlays the building site with colored boxes.

This feature was implemented by spawning normally invisible boxes and using a Cinematic Sequence Device to control a Material Parameter Collection. The Material Parameter Collection, in turn, adjusts material opacity. The “Instigator Only” option allows players to control this view independently, meaning it only affects the player who enables it, not others.
Saving Buildings
Players can work on their buildings across multiple sessions and share their creations with friends thanks to the Verse Persistence feature. Build voxels are converted into a text string, saved as an array, and can be reloaded later.

The island also saves information such as the player’s in-game level, completed Quests, and pieces they have unlocked. All because of Verse Persistence!
Pedestrians
To make the city feel alive, three different techniques were employed:
- Sequencer
- Used to animate pedestrians, cars, bikes, boats, and trains throughout the city.
- StaticMeshes
- For pedestrians that are standing still or following predefined paths, StaticMeshes with vertex animation materials were used instead of Skeletal Mesh animation. This approach is more efficient as it runs entirely on the GPU. You can learn more about this technique in our Vertex Animation overview.
- NPC Spawner Device
- For pedestrians within build sites, the NPC Spawner Device was used with a custom Verse behavior. These NPCs navigate around park and walkway areas, with more being dynamically spawned as the site develops. The NPCs use the MetaHuman skeleton and locomotion animation set.
Procedural Music
Patchwork was used to implement a procedural music system that evolves as players build. As construction progresses on a site, the music gradually becomes more dynamic. Three composed layers of music were imported into fusion patches, which were then loaded into Patchwork Instrument Players and triggered by a MIDI track in the Song Synchronizer. The speakers are faded up and down based on the building density.

Night and Day
A custom day/night cycle was implemented to showcase the city in both daylight and nighttime. Sequencer was used to adjust lighting parameters over time, triggering changes via Verse.

A Data Layer allows easy toggling of streetlights on and off. Additionally, a Material Parameter Collection adjusts the brightness of specific materials — like windows — at night.
We hope you enjoy exploring and creating in Re:Imagine London, and that it gives you ideas for your future islands!