
Lyon Rosenblatt Demo 2023

by PlanckLiveArchitect on 30 May 2023 for Rookie Awards 2023

This is my entry for this year's Rookies. It presents a breakdown of the individual projects and the USD pipeline surrounding them. https://github.com/lyon040502003 https://www.linkedin.com/in/lyon-rosenblatt-580445211/


"Hello World"

Hi there, thanks for coming around. 

This article centres around my first demo. I will go over the projects and tell you how they came about; after that, I will dive a bit deeper into my pipeline, so the later sections will focus more on the technical side. 

So I hope you will enjoy it, and maybe I can even inspire you to take your pipeline into the new USD field.

Project Breakdown

In this section, I will go through the individual projects with a focus on the artistic side. I might describe some performance ideas here and there, but I will try to keep them short. 

PRJ1 ini

PRJ1 info 

So let's talk about the first project.

The project was first called The Goddess Tree and was imagined as a big tree with the sun setting behind it. But while I was planning the project, I came across a documentary about Torres del Paine National Park, and I just fell in love with the mountain in the middle.

So I decided to recreate that mountain, and this project was born. At first, the project was set in a jungle, but I abandoned that idea directly after testing it, because it felt strange to have a single mountain with hardly any trees in a place where plants usually take over everything.

Armed with the test results, I moved the mountain into a more European forest setting. At the same time, I decided that framing this setting with the sun behind the mountain would not look as interesting, so I moved the sun around and gave the mountain a few very nice sunbeams.

After settling on how the project should look, I got right to work. I knew that the project would end up relatively heavy with all of the plants, so I tested Karma XPU against heavy geometry and was quite happy with it. The rest of the preparation was creating tools to easily instance objects with transform options.

The work itself was a combination of kitbashing, sculpting and procedural modelling. The textures were always a combination of procedural shading, masking and sometimes Mari projection and hand-painted textures. The mountain, for example, was done mainly in Mari; having the real mountain as a projection made this work quite straightforward. The lighting, layout and shading were all handled in Solaris, and every object that existed more than once was instanced; this helped with performance quite a bit.

A few things were done regarding performance optimisation; one was instancing all the leaves with just one instancer. This approach also makes it very easy to randomise the leaves.

I created some matte paintings for the background, plus volumetric effects like clouds and fog cards. While working on the fog cards, I started using deep compositing after running into the limitations of position-pass points. In general, the comp was rather simple. The matte painting was created in Photoshop; I also created some projections in Photoshop to fill some areas or add extra detail.

After all that, I emulated camera behaviours that you might know from early film cameras, namely the slight pink tint you see in the shadows and the random colour shift towards one of the three primary colour directions.

To finish the comp, I applied the most important compositing tools of them all: a bit of contrast and bloom.

Bada bing bada bum project finished. 

Full Scene leaf instancing

This small section will focus on leaf instancing and why it is essential. 

The slides demonstrate the setup used to compare the rendering behaviour. You can see the better render times with leaf instancing; on top of that, it also cuts memory use in half. This is especially useful if you have heavy scenes or want to render on your GPU. 

But wait, there is more: this system also makes it very simple to randomise the leaves in your scene. You can throw down a single random colour node and use that attribute to randomise colour, spec and so on; this approach makes it fast and simple to change the randomisation. Plus, if you give your source points a name attribute, you still have all the control you want in case you want different behaviour per plant species. 
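To make the idea concrete, here is a minimal sketch of the randomisation step as a Python SOP running over the scatter points before they feed the instancer. The attribute names (leaf_seed) and the colour maths are purely illustrative, not the exact setup from the demo:

```python
# Python SOP sketch: give every instance point a stable random seed that a
# shader-side random colour node (or a Cd attribute, as done here) can use.
import random
import hou

geo = hou.pwd().geometry()
seed_attrib = geo.findPointAttrib("leaf_seed") or geo.addAttrib(
    hou.attribType.Point, "leaf_seed", 0.0)
cd_attrib = geo.findPointAttrib("Cd") or geo.addAttrib(
    hou.attribType.Point, "Cd", (1.0, 1.0, 1.0))
has_name = geo.findPointAttrib("name") is not None

for pt in geo.points():
    # Seed on the point number (plus the species name if present) so the
    # randomisation stays stable between cooks and can differ per species.
    key = (pt.number(), pt.stringAttribValue("name")) if has_name else pt.number()
    rng = random.Random(hash(key))
    pt.setAttribValue(seed_attrib, rng.random())
    # A simple per-leaf colour variation as an example of using the seed.
    g = 0.8 + 0.2 * rng.random()
    pt.setAttribValue(cd_attrib, (0.9 * g, g, 0.8 * g))
```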

So, in general, I would say: you might need a bit more setup at the start, but once you have automated that setup, your life will be better. 

PRJ2 ini

PRJ2 info

With the first project finished, let's move on to the second one, shall we?

The second project was planned alongside the first one but executed afterwards; the main inspiration for it was that I wanted to build up a workflow that included ZBrush. 

Ultimately, the house asset was chosen to be built up in ZBrush, sent through retopology and then finished in Mari. The workflow was relatively straightforward. The only thing I did differently from most ZBrush workflows was that I ended up baking all the maps with the Houdini Labs Maps Baker instead of the ZBrush internal baker. The reason for this decision was that the Houdini baker was quicker, and the maps always worked with a strength of one in Karma; I know that with the proper baking settings, the same effect could be achieved within ZBrush, but Houdini made it a bit simpler for me. 

The second project also included a 3D-scanned asset, the axe in the foreground. The axe was done with the 3D scanning function in Substance Sampler. It was just meant as a test at the start, but it worked pretty well, so I used the asset in the scene. 

The project was also intended to feature V-Ray as its render engine, but V-Ray for Solaris was a bit buggy. I did discuss my issues with the support team, and over a few months, they fixed all of them. But in the time they took to resolve the problems, I switched to Karma CPU to be safe. I will probably work with V-Ray in the future, but sometimes you must take the safe route if you aren't sure what the other one will bring. 

Other than that, the project is straightforward; the snow and the icicles were done with two tools that I will discuss later, and the trees are just a combination of trees built inside of Houdini and SpeedTree for the big one in the front. 

The project was overall way shorter than the first one, but it also led to a new asset export workflow that I will discuss at the end of this article. 

Project Tools 

Snow and icicle tool

Project Tools: Snow and icicle info

So now we are starting the first tools section; it will go deeper and deeper as you read on. 

The first tool I built for the second project was the snow paint setup. First, I tested which type of snow generation would give me the best results. The first idea was to create the snow purely based on procedural masks. This approach worked but was quite limited, because I couldn't decide where I wanted snow and where I didn't. The second approach was to simulate all the falling snow with a POP network; to no one's surprise, this worked well but was extremely slow. In the end, I built a system that let me paint where I wanted the snow and then added detail with the help of procedural masks and some VEX expressions, which worked well. 
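Conceptually, the final mask is just the painted attribute gated by how much the surface faces up, with some breakup on top. A rough Python SOP sketch of that logic; the snow_paint and snow attribute names and the stand-in noise are illustrative, the real tool uses proper noise and VEX:

```python
# Python SOP sketch of the snow mask: painted mask * upward-facing term,
# broken up by a cheap stand-in noise. Point normals (N) are assumed upstream.
import math
import hou

geo = hou.pwd().geometry()
snow_attrib = geo.findPointAttrib("snow") or geo.addAttrib(
    hou.attribType.Point, "snow", 0.0)
UP = hou.Vector3(0, 1, 0)

def breakup(p, scale=4.0):
    # Stand-in for a real noise; a hashed sine pattern is enough for a sketch.
    v = math.sin(p[0] * 12.9898 * scale + p[2] * 78.233 * scale) * 43758.5453
    return v - math.floor(v)

for pt in geo.points():
    painted = pt.floatAttribValue("snow_paint")        # hand-painted mask, 0..1
    n = hou.Vector3(pt.attribValue("N")).normalized()
    facing = max(n.dot(UP), 0.0) ** 2                  # favour upward-facing areas
    mask = painted * facing * (0.6 + 0.4 * breakup(pt.position()))
    pt.setAttribValue(snow_attrib, min(max(mask, 0.0), 1.0))
```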

The icicle tool, on the other hand, was very straightforward. 

There were two options for placing the icicles: the first was to paint a mask and then scatter the source points, and the second was to place them by hand. I knew the number of icicles would be limited, so I placed them by hand to keep more control. The rest of the tool is just a combination of curves that I manipulated with some displacement. I used the first point of each curve to figure out the closest part of the surrounding geometry and used this information to add some of that geometry to the icicle geometry; this geo then serves as the point where the icicle meets its surroundings. To finish it all up, I ran everything through a VDB to get consistent geometry and properly working displacement in the render. 

Karma and MaterialX

Soooo, next section. 

In this section, I will talk a bit about Karma's MaterialX support and the advantages and disadvantages of a system that lets you use a lot of "low-level" functions versus a system like V-Ray that wraps many of those functions into nodes to make them more user-friendly. I will tailor the examples to V-Ray and Karma because those are the two engines I tested against each other. 


Triplanar HDA

Triplanar info

Let's talk about the triplanar node and the HDA I created from it. 

The default triplanar node in MaterialX only allows you to project three image files from three sides, with no further controls over the position, rotation or scale of the projection. I looked at how V-Ray implemented its triplanar node to make mine more user-friendly.  

So my triplanar HDA uses the position and normal passes to transform, rotate and scale the projection. I also added randomisation options for those three functions, driven by a projected noise. My HDA is just a wrapper around the default functionality of the node, but this simple wrapping makes it way more user-friendly and lets me work faster. 
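For anyone who has not built one before, the core of any triplanar setup looks roughly like this; the sample_image function stands in for the actual MaterialX image nodes, and the parameters mirror the kind of controls I exposed on the HDA:

```python
# Conceptual triplanar: transform/scale the position, sample the texture from
# three axes, and blend the results by the normal. Purely illustrative.
def triplanar(position, normal, sample_image, scale=1.0,
              offset=(0.0, 0.0, 0.0), blend=4.0):
    px, py, pz = [(p + o) / scale for p, o in zip(position, offset)]

    # Project from the X, Y and Z axes (YZ, XZ and XY planes).
    sample_x = sample_image((py, pz))
    sample_y = sample_image((px, pz))
    sample_z = sample_image((px, py))

    # Blend weights come from the normal; the exponent sharpens the
    # transition between the three projections.
    wx, wy, wz = (abs(n) ** blend for n in normal)
    total = wx + wy + wz
    wx, wy, wz = wx / total, wy / total, wz / total

    return tuple(wx * sx + wy * sy + wz * sz
                 for sx, sy, sz in zip(sample_x, sample_y, sample_z))
```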

In future versions, I intend to let the node randomise via an attribute, and I also want to add global scale controls for the randomisation and the projection scaling. 

Snow and Noise HDA

Snow and noise info

The snow and noise HDA.

The snow HDA is just a normal-based masking tool with some integrated breakup.

In the future, I intend to update it to take things like AO into account. I also plan to give the node an option to know the sun's rotation so the snow can react to the sun's position and look more realistic.

For the wood house project, the simple normal-based masking was enough, because it only needed to handle a few assets here and there.

 

The noise HDA is a wrapper around the default Perlin noise node, implemented with three noise nodes acting on each other to give extra detail. It uses the same scale, transform and rotation module present in the triplanar HDA. It also exposes the default options of those three noise nodes, like the pivot option, and allows controlling all three noises with a single parameter. 
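The layering idea is easier to show than to describe. Here is a small sketch of three noises acting on each other, with one detail parameter driving how strongly each layer warps the next; the base_noise function is only a stand-in for the real Perlin node:

```python
import math

def base_noise(p):
    # Placeholder noise; the HDA uses the MaterialX Perlin noise node instead.
    return math.sin(p[0] * 12.9898 + p[1] * 78.233 + p[2] * 37.719) * 0.5 + 0.5

def layered_noise(p, detail=1.0, scale=1.0):
    p = tuple(c / scale for c in p)
    n1 = base_noise(p)
    # Each noise offsets the lookup of the next one, which is where the
    # extra detail comes from; "detail" drives all layers at once.
    p2 = tuple(c + n1 * detail for c in p)
    n2 = base_noise(tuple(c * 2.0 for c in p2))
    p3 = tuple(c + n2 * detail * 0.5 for c in p2)
    n3 = base_noise(tuple(c * 4.0 for c in p3))
    return (n1 + 0.5 * n2 + 0.25 * n3) / 1.75
```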

Going forward, I intend to push an updated version of this node that includes all the default noise types and some functions to mix them for even more detail. 

Grade Curve and add mix HDA

Grade Curve and add mix Node 

The grade curve node is similar to the RGB curve node present in Arnold, with the slight difference that my HDA features a value curve to control all three channels simultaneously; it also offers a simple HSV grade at the top. The node was created to make it easy to grade plants into different seasons. This was especially useful for the wood house project, because most plants and plant textures you find online are usually created around mid-summer, when the plants are at their finest. 
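In pseudo-shader terms the node boils down to something like the sketch below; the gamma-style lambda stands in for the actual curve ramp, and the parameter names and example values are illustrative:

```python
# Value curve applied to all three channels, followed by a simple HSV grade.
import colorsys

def grade(rgb, value_curve, hue_shift=0.0, sat_mult=1.0, val_mult=1.0):
    r, g, b = (value_curve(c) for c in rgb)        # value curve on R, G and B
    h, s, v = colorsys.rgb_to_hsv(r, g, b)         # HSV grade on top
    return colorsys.hsv_to_rgb((h + hue_shift) % 1.0,
                               min(s * sat_mult, 1.0), v * val_mult)

# Example: pull a mid-summer green towards autumn by darkening it slightly
# and nudging the hue towards yellow.
autumn = grade((0.20, 0.45, 0.12), value_curve=lambda c: c ** 1.3,
               hue_shift=-0.06, sat_mult=1.1, val_mult=0.9)
```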

The add-mix HDA is precisely what it sounds like: just an add node with a mix option. It allows easy mixing of 0-centred EXR displacements with negative values. Using this node, you can mix multiple displacement maps and then mask where their effects should be present. This made it easy to keep the house in one shader while still exporting separate maps for the different parts; it let me overwrite the map for a particular UDIM without overwriting the actual file on disk. 
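The maths behind the add-mix node fits in one line; the mix input (or a painted mask) simply fades the added displacement layer in and out, which is what makes the per-UDIM overrides possible:

```python
# 0-centred displacement layers add on top of the base; negative values
# subtract as expected, and mix (or a painted mask) fades the layer in/out.
def add_mix(base, layer, mix):
    return base + layer * mix
```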

USD asset exporter HDA

In this section, I will talk about my USD asset exporter and why I decided not to work with the default Component Builder. 

So let's start with some pros and cons of the Component Builder. 

The main pro is that the USD asset you get out of it is very clean and easy to use; the system forces you into the standards set by USD. At the same time, the system is built to be simple: there are no LOD functions and no variant functions. This keeps the system easy and lets everybody use it, regardless of their experience with variants and other setups inside Houdini. 

But at the same time, the way the system works does not really encourage you to use things like variants or LODs; you could implement them, but it would not be straightforward. Another issue is that the default Component Builder is not built to generate animated assets. I believe this is either (a) because the Layout Asset Gallery used to work poorly with animated assets (this is better now) or (b) because I didn't press the right button.
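For reference, this is roughly what authoring an LOD variant set by hand looks like with the USD Python API; it is the kind of structure the Component Builder does not hand you out of the box. The paths, variant names and referenced files are illustrative, not the actual output of my tools:

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("asset_lod_example.usda")
asset = UsdGeom.Xform.Define(stage, "/asset").GetPrim()
stage.SetDefaultPrim(asset)

lod_set = asset.GetVariantSets().AddVariantSet("LOD")
for lod_name, mesh_file in [("lod0", "geo_lod0.usd"),
                            ("lod1", "geo_lod1.usd"),
                            ("lod2", "geo_lod2.usd")]:
    lod_set.AddVariant(lod_name)
    lod_set.SetVariantSelection(lod_name)
    with lod_set.GetVariantEditContext():
        # Each variant references a differently reduced version of the geo.
        geo = stage.DefinePrim("/asset/geo")
        geo.GetReferences().AddReference(mesh_file)

lod_set.SetVariantSelection("lod0")
stage.GetRootLayer().Save()
```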

No matter the reasons behind the way the Component Builder is built, I decided that I wanted my assets to conform to a very explicit standard in the future, so I made the OP_asst_builder. Just so you know, the OP prefix stands for the Open Planck project. The name was chosen to gather all my open-source projects under one name and make them easier to find; it is shortened to OP in Houdini to make finding the tools faster. 

The rest of this section will focus on the individual nodes and how they work. 

OP_Importer

OP_Importer info

The OP_Importer has two options for importing geometry meshes.

The first is importing the mesh directly from a SOP network. This option is great if your asset has no complicated inner structure, but it will fail you if you need a lot of subgroups for your asset. 

The second option imports via LOPs; I prefer this one because it lets me structure my assets in LOPs instead of setting all the paths via attributes inside SOPs. The LOP importer also lets you unpack nested USDs. 

The main window has a time-dependent checkbox for animated assets, and it also lets you choose whether you want your proxy to be time-dependent. This may or may not be helpful, depending on your situation. 


The geo-gen window allows you to set all the LOD and proxy generation options.

This includes how many LODs you want and how they should be generated. There are three options for that: bounding-box wrapping, poly reduction, or remeshing via VDB. All of those options have their ups and downs. In any case, you only have to set a value that decides how strong the poly falloff for the later LODs should be, and the system figures out the rest for you.
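As a simplified illustration (not the HDA's exact maths), think of each LOD keeping a fraction of the previous one's polygons, so one falloff value expands into per-LOD targets:

```python
# Illustrative only: 1.0 = full resolution, each further LOD keeps
# "falloff" times the polygons of the previous one.
def lod_targets(lod_count, falloff=0.5):
    return [100.0 * falloff ** i for i in range(lod_count)]

print(lod_targets(4))        # [100.0, 50.0, 25.0, 12.5] percent of the source
print(lod_targets(4, 0.25))  # [100.0, 25.0, 6.25, 1.5625]
```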

You should also know that all your attributes and geo paths will still work as expected. Only the subdivision will be deactivated, for obvious reasons. 

On top of the LOD generation, the tool generates a viewport proxy and a sim proxy. 
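On the USD side, those proxies end up as prims flagged with the proxy purpose next to the render geometry. A simplified sketch with the USD Python API (prim paths are illustrative):

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("proxy_purpose_example.usda")
render_mesh = UsdGeom.Mesh.Define(stage, "/asset/geo/render")
proxy_mesh = UsdGeom.Mesh.Define(stage, "/asset/geo/proxy")

# Viewports draw the light "proxy" prim, renderers pick up the "render" prim.
UsdGeom.Imageable(render_mesh.GetPrim()).CreatePurposeAttr(UsdGeom.Tokens.render)
UsdGeom.Imageable(proxy_mesh.GetPrim()).CreatePurposeAttr(UsdGeom.Tokens.proxy)

# Link the proxy to the render prim so DCCs know which one stands in for which.
UsdGeom.Imageable(render_mesh.GetPrim()).SetProxyPrim(proxy_mesh.GetPrim())
stage.GetRootLayer().Save()
```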

And to top it all off, the system will also generate missing UVs; they will not be perfect and live on just one UDIM, but they are enough for a lookdev artist to paint things in if needed. The system also generates attributes like rest, volume, area and velocity for you, so you don't even need to remember to add them. 

OP_Asset_mat_var_assigner

OP_Asset_mat_var_assigner info

The asset material assignment HDA is built to, who would have guessed it, assign materials, with the slight twist that all materials are assigned as variants instead of as a regular assignment. 

It might be essential to know that the import node will also create three utility materials: a clay material, a render holdout setup and a variant with no material. This is helpful if you want to debug things in your scene or if you want to assign materials globally; in that case, having a set with no material helps with conflicting assignment strengths. 

Other than that, the node takes the OP_asset_importer as its first input and a material library as its second. It also needs a name for the material variant; all user material variants are created under the user_mat variant set. The node also allows previewing the variant on its own, which is nice if you want to lookdev your materials at this step, but you will need to deactivate that function for the export. 
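The core of the material-variant idea is that the binding itself is authored inside the variant, so switching the variant swaps the assignment. A stripped-down sketch with the USD Python API (paths, variant names and materials are illustrative, not the HDA's output):

```python
from pxr import Usd, UsdGeom, UsdShade

stage = Usd.Stage.CreateNew("asset_mat_variants.usda")
asset = UsdGeom.Xform.Define(stage, "/asset").GetPrim()
geo = UsdGeom.Mesh.Define(stage, "/asset/geo").GetPrim()
stage.SetDefaultPrim(asset)

mat_set = asset.GetVariantSets().AddVariantSet("user_mat")
for variant, mat_path in [("clay", "/asset/mtl/clay"),
                          ("wood_painted", "/asset/mtl/wood_painted")]:
    material = UsdShade.Material.Define(stage, mat_path)
    mat_set.AddVariant(variant)
    mat_set.SetVariantSelection(variant)
    with mat_set.GetVariantEditContext():
        # The binding lives inside the variant, so swapping the variant
        # swaps the material assignment.
        UsdShade.MaterialBindingAPI(geo).Bind(material)

mat_set.SetVariantSelection("clay")
stage.GetRootLayer().Save()
```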

OP_asset_exporter

OP_asset_exporter info

The asset exporter HDA serves a few purposes. 

First, it makes your asset ready to be exported, creates all the directories for the export, renames the asset, and then writes the asset. 

At the same time, the exporter will also create a low-resolution turntable and a thumbnail for your asset. The turntable is converted into an MP4 via FFmpeg, so FFmpeg needs to be installed. 
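The conversion itself is nothing exotic; it is the kind of FFmpeg call you would run by hand, just fired off by the exporter. The frame pattern, frame rate and output path below are illustrative:

```python
import subprocess

subprocess.run([
    "ffmpeg", "-y",
    "-framerate", "24",
    "-i", "turntable/turntable.%04d.png",  # rendered turntable frames
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",                 # keeps the MP4 widely playable
    "turntable/turntable_preview.mp4",
], check=True)
```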

In future versions, I want to integrate automatic .rat conversion and an option to copy all texture files from the asset to the export location. 

The center asset and transform asset options will only apply to the renderings, not the export. 

All the functions in this node are executed via a TOP network, so you can attach a TOP network monitor to see the status of your export. 

Pipeline Section

This section will focus on my "system" for storing assets, and I want to talk a bit about the pros and cons of some other structures. I will also go over my USD Backpack tool and why I created it in the first place.

Many of you might be familiar with the set-project options that Maya and Houdini give you by default. Those options are great for smaller projects, but the fact that you usually copy your assets into the project makes them relatively heavy on your drive.

And while this might be fine for smaller scenes, it becomes a problem the bigger your scenes get. For example, say you have an asset that comes with four LODs and five texture scale versions so your artists can choose the correct texture scale for their scene. In this case, your asset will probably be huge, maybe even more than 10 GB. At that point, copying the asset into every project takes time, and the copies fill up the available storage quickly.


So what is my proposal to fix this issue?

I try to work with three directories. In a perfect world, those directories would be some kind of database (DB) like MongoDB, but they remain drive folders for now.

The first DB is always my general asset DB. Here I store all the assets that could be used everywhere, like HDRIs, general trees, 3D scans, etc.

The second DB is specific to the project I am working on. It holds all the assets intended to be used only for that project; the wood house asset would be an example of this.

And the last DB is some system that lets me organise and store all my work files; in my case, this is Prism Pipeline.

The advantage of having all those things separated like this is that I don't need to copy an asset to use it elsewhere. This keeps my storage small and motivates me to build assets that can be used in multiple scenes, increasing the reusability of my assets.

Unfortunately, it comes with a small but important problem: if you use a system like this, all your paths will probably be absolute rather than relative, so you can't just grab your scene and move it to the next PC. This might be a problem for some, but it is fine for me. I only need to move my assets from my shared folders to a PC outside my network when I want to render via a render farm, and the solution to that problem comes up next. 

The USD Backpack 

USD Backpack info

The USD Backpack is a tool designed to run through a .usda file and find all the files that it, its references and its sublayers need to render the final image. 

After it finds all those USD and texture files, it creates a new location, copies the image files there, and writes new USD files with corrected file paths.

This allows you to give the tool a render-ready .usda file and get a folder with a new render-ready .usda file with all its textures back. 
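If you want to build something similar, the discovery half can be done with the stock USD Python API; UsdUtils walks the composition arcs and reports every layer and asset a file pulls in. This sketch only gathers and copies, it does not rewrite paths, and it is not the Backpack's actual implementation (file names are illustrative):

```python
import shutil
from pathlib import Path
from pxr import Sdf, UsdUtils

root_file = "shot_render_ready.usda"            # illustrative input
target_dir = Path("backpack_output")
target_dir.mkdir(exist_ok=True)

# Layers = USD files, assets = textures/volumes/etc., plus anything unresolved.
layers, assets, unresolved = UsdUtils.ComputeAllDependencies(Sdf.AssetPath(root_file))

for layer in layers:
    print("layer:", layer.realPath)
for asset_path in assets:
    src = Path(asset_path)
    if src.exists():
        shutil.copy2(src, target_dir / src.name)

if unresolved:
    print("could not resolve:", unresolved)
```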

The tool also allows you to convert all the image files to .rat files. This will give you the same render result, but decrease the render time and memory utilisation. 

With this tool, you don't need to have all your assets in the same folder, and you will still be able to send the files to a farm or a partner that might not have access to your local file storage. 


I am also reworking the tool to accept all USD file types and to use the native pxr_usd library for its execution. In the future, I hope to give the user more options for converting formats, and I would like to implement a system that allows a bit of file compression to make uploading faster. 

Outro

Damn, you made it. First of all, thanks for sticking with me through this article. I know it's one of the longer ones, and this one has many technical aspects. But I hope you enjoyed it and maybe even found some things you want to integrate for yourself. 

I will put my GitHub and other contact options at the end of this article; feel free to download all the tools shown here. They are all open source, so you can use them to your heart's content. 

Yeah, so what's left to say? Working on this demo was a pleasure and a pain at the same time, and I would do it again if I could. And hey, maybe we will hear from each other somewhere else. If there are any open questions, let me know; I will try to answer them as best I can. 

Lastly, many thanks for reading through this article; having you here was a pleasure. I wish you all the best for your projects. I have to go now; the next project is already waiting for me. Let's rock. 

