Friday, May 31, 2019

Quain Courtyard pt 3, Unity

In my last post, I developed ten materials in Substance Alchemist. Now it's time to start putting everything together in Unity.

I should also mention I've switched from working on the big machine to my personal laptop, so my graphics processing power has been cut in three (at least, haha). Still, my 15in Spectre is not too bad. I'd rather be targeting midrange PCs anyways.

The first thing I need to do is export all of my Alchemist materials to something usable. I notice right away that I can't consistently export 4K texture maps... Alchemist will crash, though I did manage to export two of my ten materials at 4K. Perhaps it's a bug (it is still in beta), or maybe my laptop. Loading 4K maps in the viewport will also crash. Either way, I'm stuck at 2K textures for now.
Another frustration is that I have NO idea where Alchemist saves my working files, so the only way to get assets OUT is through the exporter. At the moment, that means I cannot transfer my working documents to a more powerful computer (to rule out the weaker hardware as the issue). To be fair, Allegorithmic warned us not to use the beta release in production, so I kinda asked for this.

Jumping ahead: later on I realize that the Substance plugin for Unity reads SBSAR files, which are essentially packaged versions of your materials. It's much easier and more accurate to bring these in instead of the exported image files. Still, I'm clamped at 2K exports for whatever reason.
--


So now, I have a folder full of 2K SBSAR files. I did some investigating into current Unity project templates. I've been out of the game (pun intended!) for a little while now, so I kind of wanted to work on something more modern (again, probably asking for trouble).

I first looked into the High Definition Render Pipeline, which was used on the Fontainebleau demo (the same demo that published the expert guide on photogrammetry for games).

I also looked into the Lightweight Render Pipeline. It's not the legacy renderer but a newer, more optimized scriptable pipeline that targets a wider range of platforms, including mobile. Their demos seemed quite pretty as well, and in some initial testing it ran totally fine on my laptop and played nicely with the Substance shaders. Either scriptable pipeline beats the legacy renderer for me because the lighting is more sophisticated.

I went back and forth a lot here, and I'm still not totally convinced I'm settled on HDRP. I actually have three WIP versions started. One thing I was realllyyy hoping for was to get the shaders as close to the Substance preview as possible. Those sneaky bastards... the height maps really displace the geometry in the preview, which (as far as I know) is not the case in realtime rendering elsewhere. HDRP does have a height map channel and claims vertex displacement, but just like normal maps, it does not seem to alter the silhouette.
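To convince myself why a normal map can never change the outline, here's a minimal sketch. It's plain Python, not HDRP's actual shader code, and `displace_vertex` is just a name I made up: true vertex displacement moves each vertex along its normal by the sampled height (which is what changes the silhouette), while a normal map only perturbs shading and never touches positions.

```python
def displace_vertex(position, normal, height, scale=0.05):
    """Offset a vertex along its surface normal by a height-map sample.

    This is what real vertex displacement does; a normal map skips this
    step entirely, which is why it can't change the silhouette.
    """
    return tuple(p + n * height * scale for p, n in zip(position, normal))

# A vertex on a flat wall facing +Z, with a mid-gray height sample:
moved = displace_vertex((1.0, 2.0, 0.0), (0.0, 0.0, 1.0), height=0.5)
print(moved)  # the vertex really moves outward along Z
```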


The deciding factor, ultimately, was that I really wanted the lighting quality and to follow the expert guide.

So now, I drag drop my ten SBSAR materials into my Unity project.


Notice the pink? So, the Substance plugin brings in your material with the Standard shader. The Standard shader belongs to the legacy renderer and canNOT be rendered with HDRP. So I need to open up each one, change it to the HDRP "Lit" shader, set the quality to 2048, and double check the maps. In most cases I had to plug them all back in for some reason.

Unfortunately, doing this breaks the thumbnail in the asset bin. :-(
This does not affect the material.


Concrete is expanded to show the HDRP material nested in the SBSAR ^^



For the frosted glass, first the Material Type needs to be set to Translucent.
Then, the base color needs an alpha value of <1. Simply click on the color picker icon and slide the A slider down to taste.
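For my own reference, the reason dropping the alpha below 1 works: with standard "over" blending, the surface color gets mixed with whatever is behind it in proportion to alpha. A tiny sketch of just that math (my own illustration in plain Python, not Unity's actual blend code):

```python
def alpha_blend(src, dst, alpha):
    """Classic 'over' blend: result = src * a + dst * (1 - a)."""
    return tuple(s * alpha + d * (1.0 - alpha) for s, d in zip(src, dst))

# White glass at 30% alpha over a black background reads as dark gray,
# letting the background show through:
result = alpha_blend((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), 0.3)
print(result)
```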



Another setting I really like in the SBSAR properties is the ability to switch between UV and triplanar projection. Triplanar works great for static meshes and essentially removes the need for UV unwrapping, with the click of a button.
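A note-to-self on how triplanar projection gets away without UVs: the texture is projected along each world axis, and the three projections are blended by how much the surface normal faces each axis. A rough sketch of just the blend-weight part (my own illustration, not the plugin's code):

```python
def triplanar_weights(normal, sharpness=4.0):
    """Blend weights for the X/Y/Z planar projections, from a surface normal.

    Raising |n| to a power sharpens the transition between projections.
    """
    ax = [abs(n) ** sharpness for n in normal]
    total = sum(ax)
    return tuple(a / total for a in ax)

# A face pointing straight up uses only the top-down projection:
top = triplanar_weights((0.0, 1.0, 0.0))
print(top)  # (0.0, 1.0, 0.0)
```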

I mock up a quick greybox version of the space based on the layout geo (generated with the 360 camera).



Honestly, I could have done this step in Unity, but I work faster in Maya. I limit myself to about 30 minutes here. I use Maya's Game Exporter to export all objects as separate FBXs (check the plugins list if you don't see it under File).


I import back into Unity.
A couple of noob notes for myself RE moving an entire scene into Unity:

Maya to Unity:
1. Do not freeze transforms in Maya (Modify > Bake Pivot to get transforms back)
2. Export everything separately using the Game Exporter
3. Import into Assets, select all, and check the Convert Units checkbox
4. Drag into the Hierarchy. Everything should be in its place
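A note on step 3, as I understand it: Maya defaults to centimeters while Unity works in meters, so "Convert Units" effectively applies a 0.01 scale on import. Roughly what the checkbox is doing, sketched in plain Python (my own illustration, assuming default Maya units):

```python
MAYA_CM_TO_UNITY_M = 0.01  # Maya default unit (cm) -> Unity unit (m)

def convert_translation(t_cm):
    """Scale a Maya translation (in cm) into Unity's meters."""
    return tuple(v * MAYA_CM_TO_UNITY_M for v in t_cm)

# A pillar placed 250 cm along X in Maya lands 2.5 m along X in Unity:
pos = convert_translation((250.0, 0.0, 100.0))
print(pos)  # (2.5, 0.0, 1.0)
```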


Again, first import the rough layout geo... (iterative design!)

Then everything else. I'm glossing over a few steps here, but I did a quick pass at applying all the materials and tweaking them a bit.


and this is a screengrab, without the layout geo.
A couple of obvious things stick out right away. 

First, there are obviously tiling materials. That's partly because they're placeholders, and I'm using them on objects that are much larger than what I intend for the final assembly.
Still, I had a lot of problems with scale, and had to manually tweak each material to look right. I realize I really needed to take real-world measurements and plan ahead for texel density. Unity's guide says to capture a 1 meter square of any material scan; I grabbed whatever I could get.
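As a note-to-self on the texel density math (my own back-of-envelope helpers; nothing here is from Unity's guide beyond the 1 meter square recommendation):

```python
def texel_density(texture_res_px, scan_coverage_m=1.0):
    """Texels per meter for a scan with known real-world coverage."""
    return texture_res_px / scan_coverage_m

def tiling_for(object_size_m, scan_coverage_m=1.0):
    """How many times the scan must tile across an object to keep density."""
    return object_size_m / scan_coverage_m

# A 2K scan of a 1 m square gives 2048 px/m, and a 4 m wall needs the
# material tiled 4 times along that axis to hold the same density:
density = texel_density(2048)   # 2048.0 px/m
tiles = tiling_for(4.0)         # 4.0
print(density, tiles)
```

The point being: if I'd measured the real-world coverage of each scan, the tiling value per material would be a simple division instead of eyeballing.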
Second, my placeholder geo kinda sucks, haha. I would have been better off with cubes, I think? It's actually related to the scale issue: I added bevels without thinking about how noticeable they'd look.

The positives are, it's starting to come together! The rendering and post processing are really, really nice. It doesn't have that typical "Unity student project" look that's so common in Unity games (even Overcooked has symptoms of it... something about the lack of antialiasing and color correction, dinky looking shadows, and shaders that fall off to black? I can't totally put my finger on it, but there's definitely a look to the legacy renderer.)

Walking around the space is starting to feel like the space. Getting up close to the bricks is actually kind of exciting. 

Next post I'm going to implement the scanned assets... stay tuned!




Wednesday, May 29, 2019

Quain Courtyard pt 2, scan material processing

I had a successful shoot day!
It was rather bright outside, so I had to deal with a lot of harsh sun and directional shadows.
I took a lot of notes as well, and recorded sound.











First I wanna talk about Photoshop's 'Photomerge' function... it's a tool I've been using more lately, especially with Alchemist. It's a simple photo stitching feature, found under File > Automate.
It's gotten better than I remember from 5-10 years ago. It keeps all photos on separate layers with masks, and judging by the chunky mask shapes, the stitching is rather sophisticated.

In the future, I would like to use a tripod and carefully measure out and scan a large area for material capture. However, I got successful results from just a handful of photos (5-8 or fewer). I can also repurpose the photogrammetry photo batches (intended for geo reconstruction).

 

I was able to make this material kit in about a day. I love Alchemist's "Generate Board" feature, which quickly exports your material collection as a thumbnail sheet.


The four slate textures were especially fun... in the courtyard, there are four types of slate tiles. I started with two types of slate pulled from photos, then used the "Pavement Generator" to define two types of tile patterns.


There are still some frustrations with this Beta release... it will crash on saving, and is a little slow on my laptop.

Something important to mention: during my capture day I also took a couple of 'reference photos'... wide shots of the area or, in general, any photo that documented the qualities of the space and wasn't intended for reconstruction. These were especially valuable, as they helped inform some of my color choices.

I've been feeling a little overwhelmed about asset generating, but I recently rewatched the Star Wars Battlefront GDC talk. Their level kit contained like, 8 assets. They were simple enough that they could be scaled/rotated and repeated throughout the environment. Quality over quantity! And my favorite quote is still that the experience of going on location was more valuable to the artists than the data itself. It's time we stop making CG art based off of Google Images.

Thursday, May 16, 2019

More on Material Scanning... with Substance Alchemist!

I'm very excited to talk about Substance Alchemist... it has really solved a huge chapter in my reality capture explorations.

The methods I had previously researched for material scanning were tedious and required a lot of manual labor. The scan-based material authoring methods I've found generally fall into two procedures: one uses only photo data, and the other uses geometry from photogrammetry.

Photo-only: https://www.youtube.com/watch?v=kDMz6djZCkc
Geo baking: https://www.youtube.com/watch?v=HMPZV_Se8W0



I had planned to compare the geo-based workflow to photo-based workflows... and I was getting stuck on the amount of time needed. To pull maps off of a photogrammetry reconstruction, I need to retopo and UV a clean mesh first. That's fine for one or two materials, but processing more than that would be a lot.

The other documented method recommends building a scan box to control light, which is cool and provides a lot of accuracy. If I were scanning a large batch of, say, fabric samples that are all roughly the same size, this would be awesome.

It's DIY but it's also pretty complicated. 
However, to apply it to shooting on site, usually outdoors, sometimes vertically... I'd have to engineer some kind of portable scan box and probably need an assistant to help with lighting.

Throughout all this research, I came across Substance's announcement for their new Project Alchemist, which threw all the above out the window. 

I took my baked brick photo texture, literally dragged and dropped it into a new document, and immediately got this:


A few minutes playing with the auto tile feature...



At around 20 minutes I had this. It's by no means perfect, but holy shit, it was FAST. And easy to pick up. Everything in the Alchemist document is parametric, so I can revisit and re-export. (Like any Substance product, I can export all the PBR maps as image files.)


Here's the thing... one of the reasons it wasn't perfect is that I was using a re-projected texture map rather than a straight photo (above), so the data already wasn't clean. If the bricks weren't distorted, I think the material would be ready to use.

So I took a few more minutes and generated more.





Here I'm correcting the perspective in Photoshop first.




Alchemist has a feature which generates variants based on color palettes pulled from images. I could see this potentially being used to color correct a large batch of materials against an existing palette. It's also fun.


Here I gave myself a challenge to replicate one of the materials using NO imported images and only the built in, parametric tools. This is how far I got in about 10-15 minutes.


Lastly, I tested the maps in Unity with its standard shader.
 

It's never as pretty as in the Substance viewport, but the import works and is very straightforward. I'll have to play with Unity shaders a bit in the future.

Tuesday, May 14, 2019

Hologram Portraits

I got to show off my scan portraits the other day on the Looking Glass holo display. I used a pre-built app to display the models, and paired with a Leap Motion, I could pinch to rotate and zoom in on the models.


It was really great to share this with others, especially when it was someone interacting with their own portrait. Sometimes I get really lost in the bigger projects and research ideas, so it was a good feeling to come back to simpler ideas like these portraits. Everyone loves a portrait. They're so fundamental and human.

The only issue was running the display off of my laptop, which has an NVIDIA graphics card but still lagged. 

To prepare the models, I first decimate the mesh to under 1000 verts (about 300 polys). I do this right in Zephyr.


Quain Courtyard pt 1, Scouting




Quain Courtyard... a small outdoor space I found on UPenn's campus, hidden in their Levine building. I technically don't have access to it, which adds an extra layer of fun. I scouted this space because it's semi-enclosed and contains a lot of different materials, foliage, and an overall interesting atmosphere.
