IGN - State of Unreal Livestream: A Glimpse of Next-Gen Gaming


hey everyone it’s great to be back at GDC

you know back in 2020 as the pandemic was ramping up we were the last major

company to drop out of GDC we were the first to go all remote and then we

headed into the pandemic it was a completely crazy time with thousands of

people working from their homes but the company thrived and so I’d like

to go through some of the things that we’ve done since the last GDC

first of all we launched Project Liberty epic challenged the App Store monopolies

yeah we’re in front of the ninth Circuit Court Europe has passed major new

legislation and the fight goes on for the freedom of all Developers

Mediatonic joined epic launched Fall Guys for free and

brought in 60 million new players bandcamp the Indie music site joined

epic bandcamp helps Indie musicians reach fans online today and we’re going

to build metaverse opportunities for all musicians in the future

fortnite thrived during this time we launched two major chapters lots of

seasons and brought the fortnite audience up to 500 million player

accounts Travis Scott launched the astronomical

concert in fortnite and Ariana Grande launched the Rift Tour

and so overall fortnite had a really great time with tons and tons of players

enjoying and the world constantly evolving

we also learned some hard lessons epic entered into a massive settlement with the

FTC it really brought home to us and to the whole industry that kids in

technology are a pressing topic for all developers to take seriously and that

the expectations of regulators here in the United States and around the world

are much higher than ever before and so we’ve done a lot over the past

few years to improve things we’ve added huge sets of new parental controls to

fortnite and the epic games store and child safety features like cabined

accounts and being epic when we solve a problem for ourselves we like to bring

it to all other developers so they can benefit from the work and to that end

SuperAwesome joined epic SuperAwesome makes verifiable parental consent tools

they are available to all Developers for free so it becomes much easier to comply

with all the regulatory laws around children’s access to games worldwide

we also launched Unreal Engine 5 into early access during the pandemic we

went through a year of early access followed by release we updated fortnite

to Unreal Engine 5 and then we spent a whole year visually upgrading fortnite

to the latest capabilities of Unreal Engine 5 which we launched late last

year and we’ll be showing later in this show

epic games store also thrived the player base has grown massively during this

time to over 230 million players on PC we’ve paid out more than a billion

dollars to all developers and we’ve also just very recently launched

self-publishing so it’s easier than ever before to launch your game to all

customers in the PC market with really straightforward tools and earn 88% of

all revenue but that’s a summary of the past and

this State of Unreal is about the future so I’d like to invite Nick Penwarden to

come up and talk about Unreal Engine 5.2 thanks

all right thank you Tim

thanks Tim and hi everyone I’m excited to get to show you some of the new

experimental features we’ve been working on for Unreal Engine 5.2

let’s take a look

all right so last year we added several new features to the engine to support

foliage rendering and the fortnite team used those features to ship battle

royale chapter 4 at the same time Jacob over there and the team at Quixel were

experimenting with what’s possible for photoreal foliage environments as well

as testing out the latest functionality that we’ve been building for Unreal

Engine so Jacob’s here with us today in the

unreal editor let’s explore the environment and what better way to do

that than off-roading and what better way to off-road than in a Rivian R1T

now rivian uses unreal to power their instrument cluster including 3D

visualization of their vehicles so we worked with them to bring the r1t to

life in this experience let’s head on out Jacob sure thing on my

way all right so we’re building tools for interactive and dynamic worlds so

here we have chaos physics simulating rocks that tumble as we drive over them

leaves Bend out of the way and we also added some real-time fluid simulation

we worked with the team at Rivian to set up unreal’s chaos vehicle model to

simulate the suspension of the truck and how the electric motors Drive each

individual wheel chaos also simulates how the tires

compress and deform and MetaSounds enabled the team to precisely

re-synthesize the sounds of the electric motors and mix them with the ambisonics

of the jungle

so Rivian provided us with a highly detailed model of the truck about 71

million polygons that we’re able to render in real time thanks to nanite

now the rivian not only looks incredibly realistic because of lumen and nanite

but also because of its materials and today we’re introducing substrate our new material

framework

and to better demonstrate it let’s swap the paint out for opal

now of course you can’t order a rivian with opal body panels but opal was the

internal code name for this project and also a really great demonstration of

substrate’s capabilities the base layer models the iridescence

refraction and Reflections that occur inside of an opal and on top of that is

a layer representing the polished surface and how light is absorbed as it

travels through that clear layer of varying depth

and now we can add back on the dust and dirt layers

and notice how the reflection changes when interacting with the dust layer and

that there are no artifacts along the transition from dirt to dust to opal

so substrate is more expressive enabling artists to create materials like this

with different shading models and compose and layer those materials as

they see fit all right let’s uh head on out Jacob on

my way in terms of performance substrate

materials that are similar to the current Unreal Engine shading model cost

about the same but now artists have the freedom to author more complex materials

for extremely detailed use cases like in cinematics and in film

so we’re gonna drive under this fallen tree here and everything that you’ve

seen up to this point was painstakingly hand built by the environment team at

quixel everything since that fallen tree has been built using our brand new

experimental suite of procedural content generation tools entirely in engine that

are flexible deterministic and artist driven

our guiding principle in building these systems was to empower artists to make

tools for artists so Jacob’s going to go ahead and add a

procedural assembly to the world and the cool thing is that it

communicates thank you

yeah pretty cool

and the cool thing is that it communicates with other nearby

procedural elements in the scene like the creek bed so let’s say a designer

comes by wants to direct the player to drive to the left Jacob can simply move

the assembly to the right and everything updates to accommodate that change

game design is iterative so let’s say the designer comes back wants to give

the player the choice of going left or right again Jacob can simply move the

assembly back over

now the artist who created this assembly also added some additional handles that

Jacob can use to art direct where rock slides occur which allows them to customize

the piece a little bit more make it a little easier for the Rivian to drive

by so we started by hand crafting that

original part of the level to set the visuals and art direction for the entire

piece and then built out procedural tools that allowed the team to create a

much larger play Space much more quickly now let’s see how we can use these

procedural tools to make larger sweeping changes to the environment

so Jacob let’s start by removing some of the trees in this area absolutely that’s

easy enough actually

all right a little too much let’s add some trees back in okay and let’s

also add in some Cliff formations give it a little bit more variability

so the procedural systems are all deterministic as Jacob is experimenting

with different sets of input parameters once he finds a set that he likes he can

always go back to it and get exactly the same results
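
The determinism Nick describes is the key property of these procedural tools: the same seed and input parameters always regenerate the same result. As a rough illustration only (this is not Epic's PCG framework; the function and field names below are invented), a scatter step keyed by an explicit seed behaves like this:

```python
import random
from dataclasses import dataclass

@dataclass
class Placement:
    asset: str
    x: float
    y: float
    rotation_deg: float

def scatter(seed: int, assets: list[str], count: int, area_m: float) -> list[Placement]:
    """Deterministic scatter: identical inputs always produce identical output,
    so an artist can note down a seed, keep experimenting, and later regenerate
    exactly the same layout."""
    rng = random.Random(seed)  # RNG state depends only on the seed
    return [
        Placement(
            asset=rng.choice(assets),
            x=rng.uniform(0.0, area_m),
            y=rng.uniform(0.0, area_m),
            rotation_deg=rng.uniform(0.0, 360.0),
        )
        for _ in range(count)
    ]

# same inputs, same result, run after run
assert scatter(42, ["tree", "rock"], 5, 4000.0) == scatter(42, ["tree", "rock"], 5, 4000.0)
```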

and the procedural systems aren’t just placing trees and rocks but also fog

cards bugs birds everything that’s needed to bring this environment to life

and everything that you’ve seen here works at scale this environment is four

kilometers by four kilometers if we hide all the procedural elements

we can see that original hand built area about 200 meters by 200 meters

we believe that there will always be the need for hand building environments so

we design these procedural systems to be tools for artists that work in concert

with hand-built content both substrate and the new procedural

tools will be available in experimental form in 5.2

and everything you’ve seen here is running in the unreal editor in real

time on a developer machine with an Intel i9-13900K CPU and Nvidia RTX 4090 GPU

so Jacob thanks for being here and helping us out today thank you very much

Nick for having me it’s been a pleasure

all right so come visit us at the Unreal Engine booth on the Expo floor you can

check out this demo for yourself and this afternoon here in the ybca we’ll

have a tech talk diving into the details behind this demo

in addition the Preview 1 build of Unreal Engine 5.2 is available now so

you can test out all of the experimental new features we showed today as well as

numerous other improvements head over to the epic game store to download the

binary build and also get the full source code from the Unreal Engine

repository on GitHub now we’ve found that we make the best

technology for developers when we use it ourselves when we really put the engine

through its Paces both through Tech demos like this one as well as full game

Productions so to talk about how the fortnite team used the latest Unreal

Engine 5 capabilities to ship battle royale chapter 4. please help me welcome

John laugh [Music]

[Applause]

thanks Nick for chapter four we wanted to show

fortnite in a way players had never seen before

we upgraded with broad Strokes using Lumen nanite and other ue5 Tech

we worked directly with the engine team to improve these features and ensure they

scaled on all platforms fortnite ships on

first up was lighting now our options in the past to improve lighting have been

somewhat limited because fortnite is a really Dynamic game

baked lighting just doesn’t work because the moment a player destroys a wall

light maps are invalidated so we were really excited to give Lumen

a shot it updates Global Illumination in real

time as the environment changes early in chapter 4 development we

captured a video of lumen enabled in a fortnite test build and the player in

the video destroyed a wall and light just came flooding into the room and

honestly it was pretty stunning it brought new life to the environment and

the realistic bounce light worked great with fortnite’s vibrant Style

while initial results were exciting you know nothing is that easy in game dev

and as we discovered Real World Lighting can create real world problems

playtests revealed areas of the map like attics and basements that had no windows

and were just too dark for gameplay and it was also the first time we were

using Auto exposure and it was causing bloomed out areas when players were in

dark Interiors looking outside we solved these issues using ue5

features like local exposure and some art directable controls within lumen

that enabled us to provide a final image much better for gameplay

while the art team refined content the engine team provided some new

scalability options so that we could run Lumen on next-gen consoles at 60 frames

per second in addition to lighting nanite opened

the door for us to add an incredible amount of detail to the fortnite island

it was introduced in ue5 as a virtualized geometry system that

supports extreme mesh complexity so we spent some time experimenting

looking for a good balance between stylized art Direction and detail

so we could increase visual quality but still maintain fortnite’s iconic Style

now that artists weren’t limited by triangle counts we scaled our content

pipelines to support nanite on high-end hardware

Tech artists modeled new vegetation assets

and the art team created some high detailed props and some amazing hero

assets but this still left us with the large

building library that needed a visual upgrade

we resolved this using an offline process that took displacement maps from

our materials and created high resolution nanite

meshes it was really cool to see these Classic

fortnite Materials like brick stone and wood get a high resolution facelift and

pop off their surfaces
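
The displacement-to-geometry step can be pictured as tessellating a flat surface and pushing each vertex out by the value sampled from the displacement map. The sketch below is only an illustration of that idea, not the Fortnite team's offline tooling; the grid layout and parameter names are assumptions:

```python
def displace_grid(heights, cell_size=0.1, amplitude=0.05):
    """Turn a 2D displacement map (rows of 0..1 values) into a displaced mesh:
    one vertex per sample, offset along +Z by the sampled displacement,
    and two triangles per grid cell."""
    rows, cols = len(heights), len(heights[0])
    vertices = [
        (i * cell_size, j * cell_size, heights[j][i] * amplitude)
        for j in range(rows)
        for i in range(cols)
    ]
    faces = []
    for j in range(rows - 1):
        for i in range(cols - 1):
            a = j * cols + i        # top-left corner of the cell
            b = a + 1               # top-right
            c = a + cols            # bottom-left
            d = c + 1               # bottom-right
            faces.append((a, b, d))
            faces.append((a, d, c))
    return vertices, faces

# tiny 3x3 "brick" displacement sample
verts, faces = displace_grid([[0.0, 1.0, 0.0],
                              [1.0, 0.2, 1.0],
                              [0.0, 1.0, 0.0]])
print(len(verts), "vertices,", len(faces), "triangles")  # 9 vertices, 8 triangles
```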

now past nanite demos have relied on static hard surface environments but

fortnite’s an animated world trees blow in the wind and buildings

wobble when hit so we worked with the nanite team and they

extended the material pipeline to include masked and animated materials on

nanite geometry

the island in fortnite is constantly evolving with major changes each season

and complete reboots when chapters launch so it’s essential that we evolve

our workflows too during chapter 4 development we started

utilizing two new ue5 features that changed how we build levels World

partition and one file per actor World partition automates the level

streaming and it allowed us to work in one large level

we use level instance actors to group content logically like a building and

all of its props one file per actor ensured our large Dev

team could work in this space without Source control conflicts

now these features hadn’t yet been put through their paces by a large dev team so

there were some early bumps in the road but our LDs were in daily communication

with the engine team and we saw consistent improvements

not long into production we were developing fortnite in a much more

collaborative environment our goal for chapter four was to deliver

an awesome experience to our players but we also hope by using and improving

the engine’s newest features we’ve improved your experience as well

now for a look at the amazing work developers are already

doing in ue5 please welcome Dana Cowley

[Applause] [Music]

tools and amazing content at your fingertips making games is hard and it

gets even trickier when you’re trying to ship while you’re upgrading your

features and we put a lot of effort into making sure that the

transition from ue4 to ue5 was as smooth as possible and you’ve proven

that it worked and we’ve been impressed with all the teams who have made the

jump ever so quickly you know many of you moved your games over right in the

middle of development and quite a few of you have left internal Tech in favor of

ue5 and as of last month 77 percent of all Unreal Engine users are on ue5 and

that’s rapid adoption en masse and over 750,000 of you are actively

building with Unreal Engine each and every month

you’ve invested in us and we’re investing in you

and Beyond games you’re telling Amazing Stories across live action and animated

entertainment and to date more than 550 television and film projects have been

made with Unreal Engine in production now these are some of the leading

studios using UE5 today this represents a brain trust of

talented game developers from all over the world and it’s getting bigger every

day you have validated that we’re on the

right path and we’re blown away by how you’ve embraced the tools and shown us

what’s possible and now you’re about to hear from

several talented teams who are bringing their awesome games to PC console and

mobile first up we have qubit Studios a small

team whose game Infinitesimals has not only received an epic MegaGrant but

it’s also coming to the epic game store with the help of Epic Games publishing

here to give you an exclusive New Look is Michael forzard

hi everyone

I was invited here today to share our story with you Cubit Studios is a small

Studio from Europe and we have one mission to bring the

world of infinitesimals to life in our first game you play as Captain

orkney Redding rank as xiani’s crew explore a mysterious planet filled with

large environments now a universe is already big at our

scale but when your main character is only a few millimeters tall it takes on

a whole new meaning as a studio we have big ideas but as a small

team we need to make choices and that’s where switching from UE4 to

UE5 made a big difference for us it has empowered us by giving us time back

let me take you through a few examples our main character is very small so we want players to truly

feel a sense of scale by having a large space to explore

by using World Partition in UE5 we managed to save months of dev time not

just because of the size of the world but also because we can build on top of

it it allowed us to specialize a lot of systems very quickly

and with the built-in HLOD we get strong runtime performance while being

able to load the entire world visually in editor

a vast world is great but it doesn’t do us any good if it’s empty and that’s where

the new packed actors and level instances came in handy

a level designer can use prefabs to quickly prototype ideas while artists

can refine those prefabs in editor by assembling assets together

and with one file per actor it all happens in parallel

on the animation side motion warping allows us to use a single animation for

a large range of situations and with control rig we can quickly create those

first pass animations in editor but the most outstanding features are

nanite and lumen we don’t need to spend time on topology

optimization and can directly use high-poly meshes lighting is more realistic and

we don’t have to use tricks for transition between exterior and interior

all in all working with UE5 has been a tremendous time saver for us and

upgrading from UE4 was surprisingly easy we were up and running in a couple of

days we can now iterate more quickly focus

on gameplay and allocate time to riskier ideas

but enough talking go on the epic game store and wishlist the game now

thank you everyone enjoy the rest of the show

your team has built a beautiful game Michael fantastic work now our next

partner is one that needs little introduction Kabam develops and

publishes world-class highly social multiplayer mobile games and today they

are here to make a special announcement please join me in welcoming vice

president of product Tyler black [Applause]

thank you Dana for that kind introduction hi everyone at Kabam we’ve

worked with the world’s biggest brands and created our own stories to bring

some of the most successful mobile games to players

today we’re proud to announce a new project

one that ushers in a whole new era for Kabam

King Arthur Legends rise

it’s a new cross-platform Squad RPG set in a stunning reimagined Arthurian world

when we began this ambitious project we knew that Unreal Engine was the only

solution that had the tools to bring incredible visuals to both mobile and PC

at launch so here’s how we put those tools to use

King Arthur Legends rise allows players to explore campaign stages and interact

with their environment using a variety of animations

if we were to use traditional animation techniques the number of assets that we

would need would be massive thanks to unreal’s Advanced animation

technology like ik rig retargeting motion warping and property access we

can create high quality detailed and attractive animations with relatively

few resources but that isn’t the only way that we’re

using Unreal Engine 5’s tools to deepen the player experience

we’re also making use of dynamic weather to affect character strengths and other

stats and keep the battle gameplay fresh and immersive

that’s all from us today we can’t wait to share more with you

about King Arthur Legends rise in the coming months so check out our website

playkingarthur.com and sign up to stay up to date on our newest game

thanks Tyler now the next developer coming to the

stage was founded a few years ago and they’re already making a big name for

themselves hexworks is a CI Games studio and they are capturing the imaginations

of fans of Dark Fantasy with their new action RPG Lords of the Fallen

creative director Cezar Virtosu is here to give you an exclusive look at how

they’re using ue5 to build one of this year’s most highly anticipated games

thank you Dana thank you for the introduction and handling my name so

graciously hello everyone I am very excited to be

here and represent hexworks our incredibly talented multinational team

behind Lords of the Fallen our upcoming dark fantasy action RPG and today we’ll

be sharing how Unreal Engine 5 has helped bring our highly ambitious and

decidedly Grim Vision to life please enjoy

welcome to the Lords of the Fallen technical showcase

here at hexworks our goal has always been to create the most immersive game

experience possible

today we’re journeying to Skyrest Bridge one of the game’s early locations

to demonstrate just some of the impressive ways Unreal Engine 5’s

state-of-the-art technologies have helped us deliver on this ambitious

vision for our upcoming action RPG we want players to feel fully immersed

in our world by playing as their own unique virtual persona to achieve this

we’ve used a combination of Technologies including 3D scans of real people and

ue5’s own character customization Tech players can create unique faces and

bodies by dynamically morphing between a huge range of shapes before finessing

the finer details our extensive selection of armor sets

seamlessly adapt to whatever shape the player chooses it’s really important to

us that we represent as wide a proportion of our audience as possible
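
Morph-based customization of this kind generally comes down to blend shapes: each shape stores per-vertex offsets from a base mesh, and sliders weight those offsets. The following is a minimal sketch of that idea, not Hexworks' actual character system; names and data are illustrative:

```python
def apply_morphs(base_vertices, morph_targets, weights):
    """Blend-shape morphing: each morph target stores per-vertex offsets from
    the base mesh, and the final shape is base + sum(weight * offsets)."""
    result = [list(v) for v in base_vertices]
    for name, weight in weights.items():
        for vi, (dx, dy, dz) in enumerate(morph_targets[name]):
            result[vi][0] += weight * dx
            result[vi][1] += weight * dy
            result[vi][2] += weight * dz
    return [tuple(v) for v in result]

# two vertices, two sliders the player might drag
base = [(0.0, 0.0, 0.0), (1.0, 2.0, 0.0)]
targets = {
    "jaw_width": [(0.2, 0.0, 0.0), (-0.2, 0.0, 0.0)],
    "height":    [(0.0, 0.1, 0.0), (0.0, 0.3, 0.0)],
}
print(apply_morphs(base, targets, {"jaw_width": 0.5, "height": 1.0}))
# [(0.1, 0.1, 0.0), (0.9, 2.3, 0.0)]
```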

being able to customize your hero in Lords of the Fallen is only the

beginning each and every one of our characters is

incredibly detailed using high resolution textures

get up close to any surface in the game like this tunic for example and detail

texturing provides incredible levels of micro detail

how these objects move and behave is also crucial for player immersion ue5’s

chaos physics engine facilitates advanced simulations for cloth chains

hair belts and a whole lot more this helps to substantially enhance the

secondary Motion in all our characters making their movements look much more

lifelike [Music]

the lighting in Lords of the Fallen takes full advantage of unreal’s brand

new Lumen GI we’ve been able to light our complex environments in real time

meaning we can immediately see the effect of a light bounce for example

without waiting for the traditional slow baking processes watch how the lighting

reacts as we move our light source around this environment

Global illumination like this can be quite demanding on processing so we also

make use of emissive shapes to add additional details to the lighting of

any space we can then fine-tune these light bounces on our nanite meshes

providing our world with impressive levels of detail and minimal impact on

performance

perhaps the most important element of Lords of the Fallen is the ability to

travel between two worlds the realm of the living Axiom and the realm of the

dead Umbral we’ve created our own custom tool set

within ue5 it allows us to intricately craft these two environments side by

side and seamlessly swap between them this means our artists and designers can

ensure these worlds feel intrinsically linked like two sides of the same coin

even if one side is decidedly more horrific than the other

on behalf of hexworks thanks for watching today’s brief Tech presentation

the twin Realms of umbral and Axiom await Fearless adventurers later this

year when Lords of the Fallen Launches on PC and current gen consoles

thank you looking awesome thank you

this next team is known for making some of the world’s most popular

MMORPGs and recently they’ve been pushing the boundaries of cinematic

storytelling and the results speak for themselves

here with more is NCSoft’s chief strategy officer Dr Songyee Yoon

[Applause] thank you so much for the introduction

Dana good morning everyone

I’m excited to share our latest production Project M with you today

as you may know NCSoft is recognized as one of the best

MMORPG developers thanks to our Blockbuster franchises like lineage and

Guild Wars however

we’re not content with this achievement we’re always pushing boundaries and

exploring new technologies so let me introduce Project M to you

it is our latest Innovation harnessing the power of Cutting Edge Ai and

Graphics technology to an unprecedented level

with the help of Unreal Engine we have achieved our vision for Project M by

seamlessly integrating our foundational AI technology resulting

in breathtaking detail

so to give you a sneak peek we have prepared a video

featuring our CEO and chief creative officer TJ

who will personally guide you through our newest creation as a digital version

of himself thank you so much for your attention and

I hope you enjoy Project M

[Music]

[Music] thank you

Project M is one of ncsoft’s most ambitious projects today

[Music]

what would it mean to you if the world was comprised of informational particles

instead of physical particles oh I got a little ahead of myself I am

TJ Kim from NCSoft welcome to Project M

[Music]

how would the fabric of simulated reality change our perceived world if we

could tamper with the arrangement of those particles

this very idea is what sparked Project M


the informational particles that shape project M’s world can transform reality

based on your choices and each choice you make will change

your experience

the information that exists in the present reality determines how the world

will unfold [Music]

right

value every encounter and every moment these will be essential in the world of

Project M and this is where our journey ends for now I

look forward to seeing you all again soon

[Music]

thank you Dr Yoon Project M is looking amazing and it’s unlike anything we’ve

seen and thanks to all the teams that have

given us a new look at your incredible games

like we just saw in the worlds you’re building

these are awesome looking characters with loads of detail and authenticity

and our team understands that believable digital humans are at the heart of great

storytelling and for more on that I’m happy to hand it over to our CTO Kim

Libreri thank you some incredible games there hey

everybody wow it’s great to be back after four

years um it’s good to see you all

um so it’s been two years since we released metahuman Creator This Cloud

hosted tool allows you to create photorealistic digital humans for your

game or experience with the Simplicity of an RPG character creator

yet with a level of realism and customization that you’ve never seen

before our community has created millions of

metahumans for their unreal projects and the work we’ve seen has been

unbelievably breathtaking and I’m sure a bunch of you in the audience actually

are making MetaHumans as well let’s take a look at some of this work

thank you

amazing you know there isn’t a day that thank you

so there isn’t a day that goes by that we don’t see some post on LinkedIn or

Facebook or wherever showing us amazing usage of metahumans and I think it’s

been a really big game changer for anybody trying to tell a story in their

game or their experience um but today we want to take things to

the next level so I’m going to actually introduce two of my most favorite

friends um to the stage our head of digital

human technology Vladimir Mastilovic and the actress best known as Senua from

Ninja Theory’s fantastic Hellblade series Melina Juergens how are you both

doing good thank you it’s been a while since we were last on the stage in fact

I think it was seven years ago that we went through this that’s a long time

ago but it was the first time we all came together

Ninja Theory epic Cubic Motion 3Lateral to do something that we thought

would wow audiences here and sort of set a template for the future of using

characters in games so anyway um I think we’re I think we’re in a

pretty good pretty amazing place right now with the industry but we do want to

take things to the next level and what I want to do a little bit about is how’s

things going at Ninja Theory how’s that how’s the new game going yeah the team

and I are doing really good and we’re just extremely busy working on the

follow-up to the first hellblade which will be called senua Saga held light too

I can’t wait the first game was fantastic I really can’t wait to play it

so get it out soon anyway I’m Gonna Leave the stage and let these two take

over go and blow people’s minds thank you

hi everyone

our guiding vision for metahuman has been the democratization of complex

character Technologies allowing you to work faster and see the results

immediately a character is only truly believable if

it’s motion Fidelity matches its visual Fidelity but animating at this level is

a hard task for even the most skilled Studios

some of our best work leveraged 4D capture but this took specialized

Hardware in weeks or even months of processing time

while meta human creature gave you the ability to generate high quality

characters animating them still wasn’t as easy

this is why I’m very excited to announce a new addition to the MetaHuman product

MetaHuman Animator MetaHuman Animator contains the

essence of our 4D pipeline but optimized to run on a single machine

it is able to use iPhone as well as stereo professional systems and today

we’re going to demonstrate how it works for this we’re going to need now our

technician John Cook and just a phone

Mel can you take your position please sure let me know when you’re ready

okay and action I need performance capture to work like

a mirror I need it to capture whether I’m acting scared

or angry and sometimes all I need is a look

cut thanks Mel that was great yeah you’re

welcome okay

our technician John is currently pulling Mel’s performance from the phone onto his

machine where everything will be processed locally

we have updated our live link face mobile app to capture all data at the

best resolution possible with the device MetaHuman Animator uses video and depth

data to convert this data into High Fidelity performance animation and it

can even use audio to produce convincing tongue animation

John is currently scrubbing through the take to pick the section that he wants

to process John are we all good with the data

awesome so from now on it’s just a single button click to kick off the

processing which for a performance of this length will take less than a minute

to convert into animation

so Mel while that is processing let me show you something else

oh is that me yeah this is what we refer to as your meta-human DNA cool

as generated by the capture we made earlier yeah that’s right so from only

three frames of video and that data we can generate a rig that predicts all of

your facial expressions in just a couple of minutes so we only need

to do this once for each actor that’s right it then transfers the animation

over to your face so that we can produce the performance in a way that

faithfully reproduces your original performance sounds cool yeah so let’s

check back on the processing which today is on the latest CPU and GPU

Hardware from AMD metahuman animator uses a custom epic

facial solver and landmark detector we can interactively look at the

animation while it’s being solved and compare it to your original performance

so it looks like it just it’s almost finished

after this it’s going to do one more pass to make the curves more stable

really quick and from here on

we just need to export the animation this takes only a few seconds

and then John needs to drop it in the level and add the audio so that we can

see the result so Mel’s MetaHuman should now be ready

in the level well are you excited to see the results yeah I can’t wait to see it

I need performance capture to work like a mirror I need it to capture whether I’m

acting scared or angry

and sometimes all I need is a look

thank you so Mel what do you think I think it’s incredible because it usually

takes months between performance capture and getting any results back so this is

blowing my mind and all of this is solved directly onto animator-friendly

controls in this case we are using a bespoke 4D rig which we created together

with Ninja Theory for hellblade 2 but it’s also ready to use on any meta human

or any other rig that follows our new metahuman standard

let’s have a look at that

I need performance capture to work like a mirror

I need it to capture whether I’m acting scared

or angry and sometimes all I need is a look so

the same thing works even on stylized characters

thank you

thank you all these technologies are completely redefining our creative

process and they will redefine yours once we release MetaHuman Animator to everyone

in just a couple of months

show you we haven’t forgotten about the need for full performance capture

shoots what you’re about to see is animation that has not been polished or

edited in any way and it took metahuman animator just minutes to process start

to finish yeah so here’s one of my favorite lines from Ninja Theory’s

upcoming game Senua’s Saga Hellblade II and I really hope you enjoy it and the rest

of the show thank you very much thank you

Senua perf cut take 13.

[Music] I see through your Darkness now

I see through your lies I will show them how to see us I do

I will not appease your gods I will destroy them

good you like it [Applause]

man was that cool all right let’s talk about the future of

our creator marketplaces we all know one of the biggest challenges in game

development is the time and cost associated with making great art and

Visually stunning environments now we’ve seen that by giving you access

to high quality game-ready assets like quixel Mega scans World building can be

faster more efficient and more economical the resulting explosion of

creativity that we’ve seen from our Unreal Engine developer Community has

been awesome in fact it’s inspired us to double down

on a mission that started several years ago

since the launch of Unreal Engine Marketplace in 2014 we’ve been working

to aggregate and organize all the world’s best digital assets for you to

use in your work in 2018 we joined forces with quixel

whose Mega scans have now been downloaded over 40 million times

in 2021 art station became part of the Epic family and we welcomed a community

of the world’s best digital artists who’ve since uploaded nearly 12 million

projects to the site later that same year sketchfab joined

epic and now has more than 10 million registered users who are using the

platform to discover share and manage 3D models

on the web now each of these businesses has their

own unique superpowers incredible content of course vibrant communities

and killer capabilities we’re now working to bring all this

together into a unified offering let’s take a look

[Applause] [Music]

[Music]

[Applause] we’ll be merging our content offerings

into a new unified creative Marketplace called Fab built on the strong

Foundation of sketchfab.com we’re going to drop the sketch and go all in on Fab

Fab will be an Open Marketplace serving the entire digital content creation

industry that means we’re going to offer assets for all game engines All Digital

content creation tools and metaverse inspired open worlds so whether you’re

a studio making a blockbuster game or movie or you’re a player building your first

island in fortnite our goal is to bring you everything you need to build your

world in one place we’ll have 3D models material sound

plugins animations visual effects and more

that’ll be the new home of quixel Mega scans metahumans and other incredible

epic content designed to push the limits of what’s possible

we’ll also bring you high quality assets and environment packs from leading art

studios AAA game developers media companies and even musicians who are

taking fan engagement to the next level but our primary focus is on bringing you

the best stuff out there from millions of Independent Artists who will benefit

from the lowest fees in the industry pocketing 88% of revenue from their sales

so they can build real businesses and invest in making more great assets for

you to use that’s right

now with millions of assets we have to make it super simple for you to find

what you’re looking for and get inspired along the way we’ll leverage automated

tagging to consistently and accurately describe content and we’ll give you the

ability to inspect 3D models before you download making it easy to understand

the game Readiness of an asset so there are no surprises

building on the success of quixel bridge the Fab plug-in will enable you to

search discover drag and drop content right into your scene without leaving

the game editor

we want to improve your workflows by connecting Fab to all

industry leading digital content creation tools for example our

partnership with Adobe will make it easy for you to export and publish models

from Adobe substance painter directly to Fab

finally leveraging sketchfab’s digital Asset Management capabilities Fab will

be the place where you go to store manage and share assets either privately

with teammates for collaboration or publicly on the web to Showcase your

work with collaboration tools like this we

want to help small teams do big things now it’s no longer just major Studios

who are driving our industry forward World building has become accessible to

everyone and we’re seeing this today with tens of millions of new creators

building awesome experiences in fortnite Roblox Minecraft other open worlds

this is increasing the demand for ready to use content and it’s creating an

exciting new opportunity for asset creators to reach massive new audiences

We Believe an open and fair marketplace where artists and developers can come

together to buy and sell assets is essential to the creation of the

emerging metaverse now we’re going to keep you posted over

the course of the next several months as this rolls out and we’re looking forward

to partnering with you all to make this an industry-wide success and now please

welcome Saxs Persson

[Applause] [Music]

[Applause] it’s great to be here today

we’ve been working towards this moment for a very long time

this next segment is exactly why I’m here at Epic to move the industry

forward and help make developers successful

I’m here to introduce powerful new tools to existing fortnite creators powerful

new opportunities for professional Developers

fortnite is becoming an ecosystem that means new tools to design develop

and publish games and a new economy that rewards Developers

these updates bring us one step closer to epics vision of a connected metaverse

with billions of players enjoying high quality creations made by millions of

developers at the center of this all is unreal

editor for fortnite available today in public beta on epic game store

it’s a new pc application that brings the power of unreal engine to the scale

of the fortnite audience deeply integrated with the game with new

workflows that allow PC and all other platforms to create together fast and

fluid we call it uefn for short but first let’s talk about how we got

here fortnite changed when we released

fortnite creative four years ago creative is an in-game toolkit for

players to make sandbox games inside fortnite and because of creative

fortnite is already much more than a game

it’s already a place where players find games and developers find players

but why should you develop in fortnite well our players are highly social

content hungry with over 500 million player accounts massive PC presence and

the largest console player base it’s the best most vibrant place for developers

to find an audience quickly and publish new games

islands built using creative tools already account for roughly 40% of play

time in fortnite that’s billions of hours a year and we expect that number

to keep growing with uefn so this is a sample of games in fortnite

today what we have noticed is that players are

more engaged they have more fun when they play epic’s own games like battle

royale and community games made by creators like you the combination

is what makes fortnite special and what is going to help us all grow

as we’ve shown with unreal engine epic’s mission is to deliver an awesome suite

of tools for developers unreal editor for fortnite is for experienced game

developers like many of you in this room but it’s also for the growing

community of fortnite Island creators that are ready for unreal engines proven

PC editor workflow as you’re going to see in the demo

coming up uefn and the existing creative toolkit

is already a great combo for developer and fortnite Creator teams to make

incredible experiences of course our long-term goal is to make

the entire feature set of unreal engine or bring the entire feature set of

unreal engine to uefn but also over time to expose many of the other services

that epic offers Bill just announced Fab Fab is launching

today also inside uefn in an alpha version and products like metahuman will

be coming to uefn in the future to support your Creative Vision

for the next stage of fortnite and ultimately the open metaverse we also

need brand new tools that solve really hard problems interoperability

scalability and resilience a real issue when you operate at this scale

and expect this level of durability this is what led us to develop verse

it’s a powerful new general programming language designed specifically for this

purpose you’re going to see us use it in the demo in just a moment

the second thing we’re working towards is high interoperability between uefn

and Unreal Engine that’ll enable you to take your work anywhere and

by the way behind me is the new Creator portal also launching today where you

can manage all your Islands one click publish them to the world and when you

publish through uefn your original IP and your assets are yours to take anywhere

now let’s take a closer look at uefn features and workflows we put together a

short video and then we’re going to do a demo thank you

let’s start with the basics unreal editor for fortnite runs on PC and is

integrated directly with fortnite so you have access to over four years of

content all for free to get you started what’s special about uefn is what we

call live edit live edit allows anyone on your team to join a uefn session from

any platform that runs fortnite that means someone can join from a console

using the normal fortnite client and they’re able to work alongside and

collaborate with PC users uefn has access to all fortnite creative

devices our modular gameplay systems that work in the in-game editor this

allows you to instantly add gameplay and quickly bootstrap your game from the

hundreds of gameplay devices already available the uefn beta has many of the

same features we use to create fortnite including the landscape editing toolset

you can edit the landscape to change the look of your island or make sweeping

changes to create something completely new and then quickly play it in game to

check out the results so even though you have access to tons

of fortnite content with uefn you can make content that looks nothing like

fortnite this is a section from forest Guardian a

tech demo built in uefn that we are launching today

a big feature of uefn is the ability to import your own custom assets so we used

a few 3D models textures and materials that we built just for the project

and all the lighting in the cave was made possible thanks to Lumen our

real-time Global illumination system you can also find more content like

quixel megascans and sketchfab models in our initial alpha release of the Fab

plugin for uefn all assets are curated and optimized for use in fortnite and

the full version of Fab will launch later this year you can also create and

modify materials so you can change the look and feel of objects easily and you

can import skeletal meshes and then animate using sequencer and control rig

everything you’ve just seen is available today in the public beta version of

unreal editor for fortnite

so it’s a short video of just like the most condensed way we could show

somebody in like the more important workflows but I think a much better way

is to tempt fate let’s do a live demo on PC running on fortnite public servers

right here on stage welcome Michael and Ray to the stage

take it away [Applause]

[Music]

I’m glad that went well all right hey folks so let’s talk about

how all this was made Michael here has the scene open in unreal editor so we

can take a closer look for the environment we built this level using

quixel megascan assets many of which are available in the Fab uefn plugin we also

use custom content that we built just for this demo

for the gameplay you just saw we hand placed enemies using a

creative device called a guard spawner which generates the aliens that you saw

at specific locations however we wanted to do more and introduce Dynamic play

that goes beyond what the current creative devices can do

so to do this we used our new programming language called verse

so in this short section of code every second we grab the position of the

player and for each spawn location we then calculate the distance to the

player and if it’s within a certain threshold we tell that guard to spawn

into the level so now we have gameplay that’s more

reactive to the player’s actions this is just a short example of using

verse but for a deeper dive check out the verse Tech talk later today
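
For readers who want the shape of that logic outside the demo, here is a small illustrative sketch in Python rather than Verse, since the actual device code wasn't shown in full; the Spawner class, threshold, and polling interval are assumptions, not Fortnite APIs:

```python
import math
import time

class Spawner:
    """Stand-in for a guard spawner placed at a fixed world location."""
    def __init__(self, position):
        self.position = position
        self.triggered = False

    def spawn(self):
        self.triggered = True
        print(f"spawning guard at {self.position}")

def proximity_spawn_loop(get_player_position, spawners, threshold=25.0,
                         poll_seconds=1.0, ticks=10):
    """Every poll interval, grab the player position, measure the distance to
    each spawn location, and trigger any spawner within the threshold that
    has not fired yet."""
    for _ in range(ticks):
        player_pos = get_player_position()
        for spawner in spawners:
            if not spawner.triggered and math.dist(player_pos, spawner.position) <= threshold:
                spawner.spawn()
        time.sleep(poll_seconds)

# toy run: the player walks toward a spawner at (30, 0, 0)
positions = iter([(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (20.0, 0.0, 0.0)])
proximity_spawn_loop(lambda: next(positions), [Spawner((30.0, 0.0, 0.0))],
                     threshold=25.0, poll_seconds=0.0, ticks=3)
```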

now Michael is going to show us how we put the intro cinematic together hi

everybody so before the gameplay section we played this quick cinematic animation

this is actually created entirely in uefn using sequencer sequencer is our

multi-track editor and it’s been used in everything from in-game cinematics to

Hollywood feature films so let’s take a look uh now all of this stuff is

available right inside uefn so let’s take a look at this last shot from a

slightly different perspective so the other thing we featured in this

section was a bunch of Niagara effects Niagara is now available in uefn and it’s our

high-end VFX system and just like in the movies you can

frame up shots and have great effects so what we’re going to do here is just set

this up

so just like in the movies you have slow-mo too makes everything better right

so this is just a sampling of some of the stuff that you can do inside

sequencer but let’s get back to the game all right so now that we’re back in the

game in uefn you have access to fortnite’s time of day system or you

could use what we did and do what we did and use a completely custom lighting

solution and thanks to Lumen both daytime and night time look great now

let’s go to the final gameplay section and let’s close out the demo

but as a reminder everything that you’re seeing here was created in the same

version of uefn that we’re releasing today I think I’m gonna jump in here and

help you right I am all for that

um


[Applause]

thanks Michael and Ray and everybody that put this together

the demo came about when we asked the special projects team inside epic

that came straight off the Matrix demo why don’t you put uefn through its

paces and one amazing fact about this demo is that it’s less than 400 Megs it

downloads and plays in less than a minute and it plays on any platform that

fortnite runs on like that is the promise of what uefn is

so again unreal editor for fortnite is available today in public beta in the epic

game store if you want an early glimpse of some other technical showcases that

our internal teams made with uefn and verse check out deserted domination

Forest guardian and the space inside all three available to play right now in

fortnite discover so we’ve talked about brand new tools

and the fortnite ecosystem now I want to talk about the next generation of

fortnite’s economy that both epic and creators will participate in

we call it creator economy 2.0 here’s how it works

the money in the economy comes from player spending in the item shop

fortnite generates billions of dollars a year in revenue from player purchases

fortnite players who have fun engaging in Islands tend to spend more in the

item shop creators who make popular islands are bringing real value to the

fortnite ecosystem and we’re going to share the resulting Revenue with them

this is the engine powering a Creator economy 2.0

we believe this so strongly we will distribute 40 percent of fortnite’s

global net revenue to eligible creators who publish games in fortnite both

independent developers and Epic anything we make like Battle Royale zero

build also participates in the pool shared with creators this will be the

primary way that epic will pay for our own game development in fortnite going

forward

thank you

I’m glad you feel it’s a big deal I feel it too

this is transformative and a big gut check for us too we imagine

thousands of third-party development teams building businesses and thriving

with this model payments from the pool are based on

performance of The Island we take into account data like Island popularity

engagement retention attracting new players
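
Epic did not publish the exact payout formula, but the structure described, a pool of 40% of net revenue split across islands (Epic's own modes included) in proportion to engagement-style metrics, can be illustrated with invented numbers:

```python
def creator_payouts(net_revenue, engagement_by_island, pool_share=0.40):
    """Illustrative only: split a 40% revenue pool across islands in proportion
    to a single engagement score. The real program weighs popularity, engagement,
    retention and new-player acquisition; every number here is invented."""
    pool = net_revenue * pool_share
    total = sum(engagement_by_island.values())
    return {island: pool * score / total
            for island, score in engagement_by_island.items()}

# hypothetical month: $100M net revenue across three islands, Epic's own mode included
print(creator_payouts(100_000_000, {
    "battle_royale_zero_build": 600.0,  # engagement score, arbitrary units
    "creator_island_a": 300.0,
    "creator_island_b": 100.0,
}))
# {'battle_royale_zero_build': 24000000.0, 'creator_island_a': 12000000.0, 'creator_island_b': 4000000.0}
```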

and the benefit of this new approach is rooted in player fun and rewards

everyone’s creative work both yours and epics

there’s no need to design cunning monetization Loops or extractive gating

items you make an island that players love that is all you need to be part of

the new economy Creator economy 2.0 is live right now

for eligible creators who are currently publishing

we are backdating the engagement payouts to the beginning of this month

and for more details and to sign up go to create.fortnite.com

everything you’ve seen today represents epic’s biggest bet ever the release of

advanced tools that publish directly into fortnite on PC console mobile and

Cloud lowering the barrier for developers to be part of one of the

biggest entertainment ecosystems we welcome anyone to join us in the

future where original content is owned by its creators where developer earnings

are a function of fun and where ecosystems can be directly linked through common

language standards like verse so to support developers who

adopt uefn we’re broadening the Epic Mega grants program to help teams

bootstrap projects that use uefn and verse

we’re just getting started and we hope that you’ll join us and help shape the

future of development now we’re going to give you a look at

what uefn can do right inside fortnite after the video Tim is going to come

back out to talk about epic’s Future Vision thank you very much

[Applause]

thank you

thanks we think this is going to be a really powerful combination putting

together fortnite’s 70 million monthly active users with the power of the

unreal editor and a new Creator economy to share the revenue with creators

through great engaging experiences and it’s another Milestone on the path to a

new kind of entertainment medium which science fiction literature calls the

metaverse because there’s a crazy amount of hype

around this whole topic I’d like to step back and take a long-term look at what

we see here and the core of it is some very real

growth starting with fortnite’s 70 million monthly active users Roblox’s 250

million monthly active users Minecraft’s 100 million and pubg mobile and Apex

Legends and numerous other metaverse inspired games are leading to an

identifiable audience today of over 600 million active users in these Virtual

worlds and it’s on a growth trajectory that will put it at billions of users by

the end of this decade and so we can set aside the crazy hype

cycle around nfts and VR goggles you know these Technologies may play a role

in the future but they are not required this revolution is happening right now

and the core of it is something every gamer already understands it’s you and

your friends getting together online and going around as a group on voice chat

having a fun time in social entertainment experiences and some of

these experiences are serious games like Battle Royale some of them are going to

a concert and dancing or chatting with friends and just having a good time

we see this as the next big change in gaming and in epic’s evolution

as a company you know we started out back in 1991

making 2D games we recognized the opportunity with 3D so we built the first

Unreal Engine and the first Unreal Tournament game we evolved to make

console games like Gears of War then we evolved to make online games like

Paragon and fortnite but when fortnite shipped first in 2017

it was just our game uh but over time we recognized the audience the opportunity

to bring it to a much wider audience so we built fortnite creative mode

um and then the unreal editor for fortnite coming out now but we see the

future of this medium primarily not about epic’s work but about the work of

independent creators and we’re building towards the sofa

metaverse from two different directions first of all we’re taking the fortnite

audience we have and enhancing the development capabilities with the new

tools and this Creator economy to support everybody’s work

but we’re also helping developers building standalone products evolve

towards building metaverse experiences themselves the Unreal Engine is a tool

for this the epic game store is a distribution vehicle for it epic online

services has taken fortnite and epic’s 700 million player accounts and 5 billion

social connections and opened them up for free to all developers so they can plug

into voice chat and participate in these social experiences in a very easy to use

way and all of these are on-ramps to the Future metaverse

we really think that over the next decade today’s separate apps and

ecosystems are going to join together to form tomorrow’s open metaverse

and because the last couple decades have been about walled gardens walled

gardens walled gardens let’s back up and talk about how open systems have

successfully come about in the past you know this has happened really big time

back in the early 1980s and 1990s when the open internet was built

you know back in those days there were a bunch of closed networks at different

companies and universities that couldn’t talk to each other

and so the industry got together and defined open standards to connect these

closed networks into one big open network they connected email systems by

employing the @ sign in email addresses so people on one service could

communicate with people on other services and then they began to Define

Open Standards like JavaScript and HTML for the open internet so any user could

participate and the rule at that point was that any company could connect as an

equal into this system just by following the Open Standards

now let’s talk about what this means for the open metaverse of the future

you know we have the opportunity to take all these different online ecosystems

for gaming consoles and Publishers and connect them together into a single

place where anybody can talk to anybody else

we have the opportunity to turn today’s game engines into tomorrow’s metaverse

browser engines if you take a look at Unreal Engine and

unity and Godot you have some very powerful 3D engines that have an

increasingly common set of features and are increasingly adopting industry standards

as ways of communicating with each other and with other tools there’s the glTF

content standard there’s Pixar’s Universal Scene Description format and

then there are a lot of new standards bodies that have formed to help

standardize the metaverse there’s the Khronos Group there’s the metaverse standards

forum there’s the academy software foundation and the open 3D foundation

and we have the opportunity for all developers to work together to define

the future of this thing and there’s also the opportunity to

connect the economies of these different ecosystems you know what a user would

really like is to be able to buy a cool looking outfit in one place and take it

everywhere they go and as we were designing this new

fortnite Creator economy one of the things we were constantly thinking about

is how this could in the future be connected into an open metaverse economy

and there’s no reason that the core model there in which revenue from item

shops is shared with creators of experiences based on engagement couldn’t

be turned into an economic model for the open metaverse at large with economic

peering and overseen by governance groups to make this work in a safe way

for all players and this gets us to a really key

question here and that is who is going to build the metaverse in the future

and we believe the answer has to be all of us together it has to be a

combination of indie game developers together with double A and AAA game

developers and fortnite creators and Minecraft creators and Roblox creators

each bringing their best specialized knowledge to the to the new world and

Brands like Disney and Ferrari and Ralph Lauren have already experienced the

metaverse through crossovers with fortnite we think this is just the very

beginning of a very long-term opportunity for all these companies to have

a much larger presence in the digital world in which it becomes a

major and first class business line for all of them the future open metaverse will

be about musicians and music labels and film and television Studios who are

using virtual production today to build world-class photorealistic content being

able to bring it and launch it playably in the metaverse timed together with their

film and television experiences in an endless stream of awesome entertainment

content that’s open to everybody to participate in

now epic has a long history of supporting developers this way you know

we’ve built a lot of different services that are available to everybody to use

there’s the Unreal Engine there’s epic online services which are open to all

platforms and all stores there’s the epic game store which is open to

software built on all engines using all different online services and we believe

the metaverse has to be open it can’t be another monopoly’s walled garden

and epic is all in on this you know we went from

the pandemic to the tech downturn the crypto implosion

and now banks are failing and it’s a really crazy time again but we’re

investing very heavily in the future with the belief that companies who

invest now through these hard times are going to come out

as the strongest companies in the future

some of these Investments we’re making in fortnite initially and then bringing

the work to Unreal Engine some things we’re building in Unreal Engine and taking to

fortnite but the aim is to support everything the AAA game developers are

doing today and more this project to build this all out

into an open system is going to take most of the decade and that is the

journey from Unreal Engine 5 to Unreal Engine 6. that will happen over this

time period it is the same Journey to the open metaverse but there’s no need

to wait because we’re doing it live we’re building all these systems and

deploying them as we build them right into Unreal Engine 5 openly with

partners so we hope you’ll download this stuff download unreal editor for

fortnite download the Unreal Engine and get started and come see some of our

tech talks where we talk about this in way more technical detail thank you very

much for your time

thank you