Quixel – “Rebirth”: Leveraging Scan Data in UE4 | SIGGRAPH 2019 | Unreal Engine

>>Galen: Alright.
Hey, everybody. My name is Galen,
and I work at Quixel. I am the head of
Evangelism for Quixel. So basically what that means
is that I am out in the field talking to developers,
talking to film studios, talking to anyone
that is going to participate in the Megascans ecosystem, and basically kind of
figuring out how scan data and Megascans could potentially
fit in with their workflows. So I will kind of
talk a little bit about what we are
going to cover today so you guys have an idea
about what is in store. Then from there,
I am going to be here after the presentation
to answer questions. If you guys have anything
you want to ask, I will stick around
here for a bit. So yeah, I guess we are
going to start with just an overview
of what Quixel is. If you guys are not familiar
with what we do, we are the world’s
largest library of physically
based scanned Assets that is currently available
in the industry. We have an ecoregion approach
to gathering this data, so what that means
is that, effectively, we are going out
into the field, we are scanning on
five continents at any time. From there what
we are doing is, we are actually breaking
down an ecosystem to all of its most
basic elements, down to the minutia
of everything you could possibly need to reassemble
that environment. So what does that look like? That means we are
scanning with drones. That means we have about
14 different types of scanners that we are actually kind of
bringing out into the field and working to gather
all this data, and to collect it in the
highest possible quality. So we are breaking down
the largest cliffs down to the smallest rock,
twig, leaf, or branch, and from there we are using
a proprietary technology that ensures that we are
getting this data at the highest
possible quality, on
the processing side as well. So what that means is that we put it through
a proprietary workflow that allows us to render
in an agnostic way, extract all different
types of calibrations, whether you are using
an offline renderer, or a real-time solution
like Unreal Engine. And you can switch between all of these different
calibrations on the fly. We have a ton of
training content that is available
on our YouTube channel as well. We very much want
to give our users, and everyone in
the development community, everything they possibly need in
order to understand what we do, and have a firm understanding
of some of our products. What we are going
to talk about today specifically is
our short “Rebirth.” So if you guys
have not seen it, we are going to roll it here
really quick, and then we are going
to go into a deep dive about everything that we did
in order to actually build that, and talk about some
of the tech specifics. So let us go ahead
and roll this real quick.

[VIDEO PLAYS]
[MUSIC]

>>Narrator: Adaptation — the ability to learn
from past experience. The use of knowledge
to alter the environment. These virtues
defined our creators and drove them to the
brink of destruction. But we cannot exist
without them. We must save her. What of our creators
exist within us? Humanity has always
had the potential to recognize its flaws,
and choose a better way. Can we save humanity? Was bringing her here
the right choice?

>>Galen: Alright,
so that is Rebirth. So basically,
I will talk a little bit about what the goal
was from the start there, and then we will start
to really break it down into some of the things that actually brought us
to the final completion here. The goal from the start
was really to prove ultimately that photorealism
is absolutely possible today, and not only that
it is possible, but it is possible
in a real-time setting. And we obviously chose
Unreal Engine for that. We were able to extract
really photorealistic images from the Editor here,
as you saw in the trailer. So the composition of the team
was such that we actually only had
three environment artists that worked on this demo. We had support from our friends
at SideFX, on the technical side. They helped us out
with some processing and some kind of
under-the-hood type stuff, to enable us to
kind of work faster. But at the end of the day, what we want to do
is actually kind of assemble the vendors
of these different industries; we talk a lot at Quixel
about how there is a really kind of
special time in the industry right now of this convergence of all these different
industries coming together, learning a lot from each
other, right? So we have VFX quality
supervisors and film guys that were actually
supervising our AAA game artists, and then from there we
actually had a previs team that was focused more on archviz
and the enterprise space. So kind of coming together, and all sort of learning from
each of these different fields is ultimately what
led us to Rebirth. So that was something
that we wanted to prove, is that this convergence
is happening now, and Unreal Engine
really finds itself at the nexus of all of it. From there, I will talk a little bit about
the genesis of the trip here. We took a scan trip to Iceland. It is definitely the biggest
trip that we have ever taken; it is the most
ambitious trip as well. This is over a dozen team
members from Quixel actually went to go and scan
in Iceland for over a month. We gathered so much
data from this trip, that it is crazy. It is the most challenging trip
that we have ever taken, and the conditions proved, just in
the time that we were there that it is incredibly difficult
to get this data when everyone is freezing, and it was also just looking
at the types of Assets and the areas
that we were going, and they are some of the most
complex types of Assets and ecoregions
that we had ever been to. So we were scanning here
with drones, we are scanning here
on the field, getting everything you
could possibly need to break down a lot
of these environments. And you are actually seeing
a lot of that content going up on the site. It has been going up
ever since the trip. So yeah, that is kind of
where we knew, after Iceland, that we had to tell a story
in this space. It was something
that we just needed to do. So I will talk a little bit
about the previs, and sort of what went into that. We had our friends at Beauty
and The Bit start to construct some paintings for us
that largely informed what we are actually going
to be doing for this, right? We knew we wanted it to kind of
take place in this ecoregion. But we were not really sure
exactly what the world was — what was the tone
that we were going for? And this painting, specifically, was one that I think really
kind of solidified it in our minds: this is what we want
to be doing, right? So Victor Bonafonte
is the author of this. He did a bunch of
amazing paintings for us. But the thing that is
really neat is that he is actually
taking photo ref that we gathered out
on our scan trip in Iceland, and just photo-bashing it
into this painting, and then putting in this sort
of ambiguous space ship that is sort of in the back.
This is really when we knew this is a demo that
we wanted to make. So we really wanted to,
as best we could, reconstruct these
environments, based on these paintings
that Victor did here. So the first one was a shot
that was actually cut. But this one was another one,
again, where we just knew, this is the direction
that we wanted to go. And we wanted to match this
as close as we could. I will talk a little bit about
the importance of reference, and how this reference
largely impacted the decisions that our artists were actually making
inside of Unreal. So this is
a very pedestrian photo, shot on an iPhone
out the window, on the way to one
of the locations that we were looking to scan.
But there is so much information that can be gathered
from this type of data. So a very simple picture,
like I said, but all that we are
getting from this is effectively that we can look at the way that the moss
actually sits on the ground. We can see the way
the greenery is actually creeping up the mountains —
all these things largely impact the way that our artists
are making decisions. So again,
simple scouting photos — these photos are
super important for us in actually kind of
building out this world, making it actually something
that is believable and realistic. This is one that
is pretty amazing for looking at the reference
for atmospherics, right? Really kind of
looking at the way the fog sits on the ground,
the way that it actually affects anything that
is in the distance, and these types of things. All super important
for really nailing that level of realism, right?
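The talk does not show the scene setup itself, so purely as an illustration of the kind of fog controls Unreal exposes, here is a minimal UE4 Editor Python sketch that spawns an Exponential Height Fog Actor and sets a few properties. The density and falloff values are invented placeholders, not the ones used in Rebirth.

```python
# Minimal sketch: spawn and tune an Exponential Height Fog Actor
# from the UE4 Editor Python console. Values are placeholders.
import unreal

# Drop a height fog Actor into the current Level at the origin.
fog_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.ExponentialHeightFog, unreal.Vector(0.0, 0.0, 0.0))

# Grab its fog component and push in some test values.
fog_comp = fog_actor.get_component_by_class(unreal.ExponentialHeightFogComponent)
fog_comp.set_editor_property("fog_density", 0.05)        # thick ground fog
fog_comp.set_editor_property("fog_height_falloff", 0.2)  # how fast it thins with height
fog_comp.set_editor_property("volumetric_fog", True)     # enable volumetric fog
```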
The scans themselves kind of lend, obviously, the photorealistic quality
to what we are seeing on the textures
and the surfaces. But Unreal Engine
has some amazing tools, and really allowed us to match
this level of atmospherics at that photorealistic quality. So the mashup of all those
different things, really, is what kind of lends itself
to the photorealistic quality that we had in creating Rebirth. Again, just some simple
scouting photos. These photos are really
important as well. These are two
of our hero Assets, shots that really kind of made
it very prominently in the demo. But we wanted to kind of
give our guys access to these types of photos
so they can see, actually, how maybe sand creeps up
next to these types of rocks; how different plants
kind of lay next to it — all these things
are super important. It turns out that we are not as
creative as we would like to be, so you have to really kind of
rely on this photo reference in order to really get
that photorealistic quality when you are making decisions
inside of the Engine. The next thing that we will talk
about is the vehicle. So we knew we wanted to have
this vehicle in the scene. We were not really sure
what it was that we were looking
to do with it, so we actually brought in some
really heavy-hitting talent in Fausto De Martini, who is
an amazing concept designer. He has done some amazing work. Just go check out his
ArtStation — it is ridiculous. He did this awesome concept
for us in 3D, and we actually kind of
landed on this pretty quickly. We knew we kind of wanted it
to be sort of sports car-ish, with also a utility aspect,
maybe, so a little bit more
industrial in that way. This is ultimately
what we landed on. I will not steal
too much of their thunder, but if you guys watch
the SideFX presentation that they did on the livestream, I think it was last week
or the week before, they cover some of the tools
in the game Dev toolset that is available in Houdini. With those tools
inside of Houdini, we were actually able
to crunch this thing down and make it something
that was game-res in almost no time whatsoever.
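The exact network is SideFX's to show, but as a hedged sketch of what that kind of crunch-to-game-res step can look like when driven from Houdini Python — with the file paths and the keep-percentage made up for illustration:

```python
# Hedged sketch (not Quixel's actual setup): decimating a high-poly
# mesh toward game-res with Houdini's PolyReduce SOP, driven from
# Python. Paths and the keep-percentage are placeholder values.
import hou

geo = hou.node("/obj").createNode("geo", "vehicle_decimate")

# Load the high-poly concept mesh (hypothetical path).
file_sop = geo.createNode("file")
file_sop.parm("file").set("$HIP/geo/vehicle_highpoly.bgeo")

# Keep only a small fraction of the original polygons.
reduce_sop = geo.createNode("polyreduce::2.0")
reduce_sop.setInput(0, file_sop)
reduce_sop.parm("percentage").set(5.0)  # keep ~5% of the polys

# Write the game-res result back out (hypothetical path).
rop = geo.createNode("rop_geometry")
rop.setInput(0, reduce_sop)
rop.parm("sopoutput").set("$HIP/geo/vehicle_gameres.bgeo")
rop.parm("execute").pressButton()
```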
We knew that our guys were not going to be able to spend
the time here to retopologize this,
bake it down, and texture it that way —
we had other fish to fry, as far as making really
photorealistic environments. So our environment artist just did not have
the time to do this. So Houdini largely
paved the way in order to make these types of Assets, just a really,
really simple task for us. If you guys are not familiar with how the Megascans
Library works, we have a lot of amazing tools
that allow artists to go in and search
the different types of scans that are currently available across a wide variety
of ecoregions. I am just showing
the Iceland scans here. The thing that I really want
to stress about the way that
we approach content is that we are very much
looking to democratize content, make it accessible
for all people, across all different industries. So the reason I am telling you
that is because when we were making Rebirth,
we wanted to make sure that our Assets are hitting
the store day and date, for every single person
that had access to them. We did not want to hold this
content back and have our moment when Rebirth came out to be,
like, all right, now you can play
with our toys, right? We wanted to make sure that
everyone had access to them, day and date, and the same day
that our artists were actually putting them directly into
the shots that we were building. This is just in Bridge, so if you guys are
not familiar with Bridge, it is just a really simple way of browsing
the Megascans ecosystem. I am just going to
roll it real quick. But ultimately, what it is,
is just a really simple way of browsing the content and being able to see
what is available. So the reason
I show this quick pan here is actually that we have only
crunched about 15 percent of the total data
from our Iceland haul, so we still have
a ton of content that is still to come
from all the ecoregions that we scanned in Iceland. The next thing I want to talk
about is the terrain, right? Terrain is something
that is super important, even if it is something
that we laid out in the vista. With that,
we went to Open Topography and actually sourced LIDAR directly for all the shots
that you are seeing here. So unfortunately, there is
no LIDAR terrain data available for Iceland, so we actually
sourced from Alaska. And these are the exact
two areas that we sourced from. You guys can go and download
this stuff right now. Another really cool thing that Houdini can do is
actually take this point cloud data and make it something
that is digestible, that you can put into a scene
in Unreal.
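Their conversion ran through Houdini, but to illustrate the general idea in a few lines of plain Python — binning LIDAR points into a grid and writing a 16-bit heightmap that the Landscape importer can read — something like this works. The laspy/Pillow libraries and the file names here are assumptions, not their pipeline:

```python
# Rough sketch of the general idea (the actual pipeline used Houdini):
# bin LIDAR points into a grid and write a 16-bit grayscale heightmap
# suitable for Unreal's Landscape importer. laspy/numpy/Pillow assumed.
import laspy
import numpy as np
from PIL import Image

RES = 1009  # one of Unreal's recommended landscape resolutions

las = laspy.read("alaska_tile.laz")  # hypothetical Open Topography tile
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

# Map each point to a grid cell, keeping the highest point per cell.
col = ((x - x.min()) / (x.max() - x.min()) * (RES - 1)).astype(int)
row = ((y - y.min()) / (y.max() - y.min()) * (RES - 1)).astype(int)
height = np.zeros((RES, RES))
np.maximum.at(height, (row, col), z - z.min())

# Normalize to the 16-bit range Unreal expects and save as PNG.
height16 = (height / height.max() * 65535).astype(np.uint16)
Image.fromarray(height16, mode="I;16").save("alaska_heightmap.png")
```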
So these are literally the areas that we sourced from that are actually building
up a lot of the areas that you are seeing in the back. If you go back
and watch the short, you are going to be able to see
that we are actually sourcing from real-world locations, and then again,
I will not steal their thunder, but if you watch
the SideFX presentation, they showed how we made
a lot of the decisions, as far as having
the moss creep up, and really figuring out
erosion in the different ways that we can make this
actually look realistic. So we started with some
really simple tests inside of the Engine. This is a really early
prototype from Owen, one of our
environment artists — if you guys are familiar
with Crab Rave, Owen was one of our guys
that actually worked on this. So he created this really simple
prototype early on for us, to see what motion would
look like with a vehicle in an Unreal scene. This was about four
hours of work for him, just kind of throwing
a couple simple Megascans Assets into the scene. We called it the “cigarette box”
for a while, with lights on it just moving through the scene,
just to see how it looked. We iterated on it very quickly
inside of the Engine, because we knew we wanted to see
how motion worked with lights affecting our Assets, and all of the
different Fog Actors that are working in tandem. So these are the shots
that are super easy to prototype inside of the Engine.
So we took advantage of that, being able to iterate
on this stuff very quickly.
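For anyone curious, that kind of blockout is also easy to script. A hypothetical UE4 Editor Python version of the "cigarette box with lights" idea — the proportions and asset path are placeholders — could look like:

```python
# Hypothetical blockout helper: spawn a simple "cigarette box" stand-in
# and strap a point light to it, all from UE4 Editor Python.
import unreal

# Spawn a basic cube as the vehicle stand-in.
box = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.StaticMeshActor, unreal.Vector(0.0, 0.0, 100.0))
cube = unreal.EditorAssetLibrary.load_asset("/Engine/BasicShapes/Cube.Cube")
box.static_mesh_component.set_static_mesh(cube)
box.set_actor_scale3d(unreal.Vector(4.0, 1.5, 1.0))  # rough car proportions

# Attach a headlight-ish point light to the front of the box.
light = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.PointLight, unreal.Vector(220.0, 0.0, 100.0))
light.attach_to_actor(box, "", unreal.AttachmentRule.KEEP_WORLD,
                      unreal.AttachmentRule.KEEP_WORLD,
                      unreal.AttachmentRule.KEEP_WORLD, False)
```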
On top of that, one of the considerations we had to make early on
was performance. Unreal Engine
has an amazing Viewport, and we were able to throw
a ton of polygons at it. But if you are familiar
with the Megascans Library, we have cinematic-grade
quality meshes that weigh in at crazy high polygon counts.
We wanted to figure out,
is there a way that we could get that cinematic-grade quality
inside of the Editor? It turns out that yes, there was
a way for us to do that. It was again via Houdini. Houdini and our
friends at SideFX crafted some amazing
tools for us that are basically like a fancy
version of Decimation Master.
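Their reduction went through Houdini, but purely for comparison, stock UE4 can also auto-generate reduced LODs from Editor Python. A hedged sketch, with the asset path and percentages invented:

```python
# Hedged sketch (the Rebirth reduction ran through Houdini): asking
# UE4's Editor Python to auto-generate reduced LODs for a heavy scan.
# The asset path and triangle percentages are invented.
import unreal

mesh = unreal.EditorAssetLibrary.load_asset("/Game/Scans/Cliff_01.Cliff_01")

options = unreal.EditorScriptingMeshReductionOptions()
options.auto_compute_lod_screen_size = False
settings = []
for percent, screen_size in [(1.0, 1.0), (0.5, 0.5), (0.1, 0.25)]:
    s = unreal.EditorScriptingMeshReductionSettings()
    s.percent_triangles = percent  # fraction of triangles to keep
    s.screen_size = screen_size    # when this LOD takes over
    settings.append(s)
options.reduction_settings = settings

# Rebuild the LOD chain in place, then save the asset.
unreal.EditorStaticMeshLibrary.set_lods(mesh, options)
unreal.EditorAssetLibrary.save_loaded_asset(mesh)
```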
Again, this is a quick prototype scene that Owen threw together
in about four hours, early on of just throwing
these Assets into the scene, figuring out how cameras
and the atmospherics and all the different
Assets worked together. What was really cool is just
seeing these types of prototype shots, and being able to actually make
decisions about performance based on what
we were seeing here. Again, just really trying
to figure out how we can get that content
into the Editor, and make it something that
is functioning, and runs well. We have a full breakdown
on our YouTube, by the way, about 45 minutes
of in-Engine content that you guys
can take a look at, of exactly how we made
a lot of the decisions for a lot of these scenes here. But basically,
we are going to be releasing a version of this scene
that you are seeing here today. It is actually a 360 version
of a couple of different shots from Rebirth,
mashed into a single Level, running at incredibly
high frame rates. Really, the goal in showing this
is ultimately just that it is possible to go and
prototype these shots, take some really
basic camera moves, and add a really nice level
of photorealism to these scenes. I will not go
into this too much, but definitely go check out — it is literally a
45 minute breakdown of everything from fog
to cameras, to placing actors —
all that stuff. Go and check it out
on our YouTube channel. But ultimately, it just shows how we are able
to construct these shots, and how easy it was to
make all of these decisions inside of the Engine
very, very quickly, to extract final pixel quality that literally rivals
offline render quality. So these are just some of the
breakdowns of some of the shots. You guys can kind of see exactly
how we were able to get this level of detail. Really taking all of these
amazing assemblies that were in the shots,
and making something that was actually workable
inside the Editor. Again, I talk a lot
about performance, and something that we
really need to make sure this stuff was working
super well inside of the Editor. So these types of
shots really show how we were able to craft
something that is nice. So from there, the thing
that I want to stress about this is that there were
no matte paintings, or anything like that,
in this at all. Everything that you are seeing
is actually geometry, and we were able to really
kind of push the Editor as far as it could go. One of the things that we really
wanted to make special in this demo, again,
is if you go back and actually listen to this
with headphones, you might put it
into perspective after I show you this. We had an amazing musician
named Terje Isungset, and he is Norwegian. We actually
flew him out to Canada and rented an ice rink
there. All of his instruments
are literally carved from ice. If you go back and listen
to the soundtrack again with headphones, and you identify
some textures that maybe you are not familiar
with — that is ice. So literally, he builds horns,
he builds xylophones, the percussion instruments,
and all this type of stuff. And we layered all these sounds
from this shoot that we did specifically
for Rebirth — we layered that on top
to actually make the soundtrack. I mean, that is new and unique, and hopefully something that
people had never heard before. So it was really cool
and apparent immediately when I went to dinner
with our composer and told him about the project.
He was, like, “Oh, yeah, no, my big inspirations early on
with my music career are actually based
on Icelandic music.” I was, like, okay,
we found the right guy. We are talking
to the right person here. This is Jason,
who is our composer. He did an amazing job,
so definitely go and check out the soundtrack again
with headphones, if you get a chance. The next thing I want to talk
about really quick here is that we actually layered
on some nice camera techniques, to add another level of realism
into the shot. While you can add procedural
noise to lots of camera moves inside of Sequencer
and these different things, we wanted to take it
to the next level. We had a couple of shots where,
literally, we put our guy in a VR kind of setup,
and put him in a chair. And for the shots
where he is moving in a car, like down the canyon, we are shaking him
in different ways, trying to get some camera
vibrations that were realistic. What is cool is that we were
able to interpolate between — in Sequencer, interpolate
between the actual camera move that we had programmed
initially, and these really, really subtle
movements from tapping him and making him
do different things. But if you go back and watch
the camera moves inside of the Editor, again,
a lot of them are super-subtle, and we were able to,
with curves, tone down areas that maybe got
a little too ridiculous. But this is the process
that we added in, in order to get
that level of realism.
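The real blend happened in Sequencer with curves, but the idea itself is simple enough to show in plain Python — a programmed path plus recorded jitter, mixed through a per-frame weight curve. Everything here (the path, the noise, the weights) is a stand-in:

```python
# Plain-Python illustration of the blend idea (the production version
# lived in Sequencer): mix a programmed camera path with recorded
# human jitter using a per-frame weight curve. All values are stand-ins.
import numpy as np

frames = 240
t = np.linspace(0.0, 1.0, frames)

# Programmed move: a smooth dolly along X at a fixed height.
programmed = np.stack(
    [t * 1000.0, np.zeros(frames), np.full(frames, 150.0)], axis=1)

# Recorded operator jitter (placeholder noise standing in for the capture).
rng = np.random.default_rng(7)
jitter = rng.normal(scale=2.0, size=(frames, 3))

# Weight curve: ease the jitter in and out, capped where it got too wild.
weight = np.clip(np.sin(t * np.pi), 0.0, 0.6)

# Final per-frame camera positions: programmed path plus weighted jitter.
final = programmed + weight[:, None] * jitter
```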
I want to give a huge shout-out to all of our partners in this that helped make it possible. Again, Beauty and The Bit
for all of their previs work, SideFX for everything they did
under the hood for us, Ember Lab doing the music
and the cameras, and helping us out there, and obviously Epic Games
for really believing in us, and really helping
us out along the way. If you guys have any questions
at all, definitely come up. I will be here
for the next couple of hours, and for the rest of the week. Thank you so much for coming,
and yeah, see you next time.
