Saturday, June 1, 2019

Quain Courtyard pt 4, mild success

I have written before about getting game-ready assets, but this time I approached scanning in a totally different way. Actually, I think it was ultimately a bad way to do this. Plus, I realized a lot of things here simply need to be redone.
https://drive.google.com/open?id=1yHXE2Gp5NKyeFHSFZbfRDL9HpJDpTSiC

Note the screenshots are totally out of order. I've added captions for clarity.



Unity's expert guide covers how to capture different types of objects and emphasizes collecting the space in separate pieces. So I decided to strategize and break up the space according to... what felt right... and now I've learned what worked and what didn't.

I've talked about preparing game-ready assets in my previous projects.
These are the steps I used this time:

1. In Zephyr, make a clean selection of the dense mesh. Generate mesh, then texture mesh.
2. Decimate the mesh to something reasonable. (*You could export, retopo, and UV here.) Generate the textured mesh again for the lo poly mesh.
3. Export both, and enable resize to single texture.

4. In Substance Designer, load both.
5. Open the lo poly mesh in the baker. Assign the high poly mesh.
6. Bake all maps

7. Import the lo poly mesh and all maps into Unity.
8. Create a new material (or dupe an existing material to reuse settings) and plug in the maps.

--

Overall, some combinations worked better than others.
this one looks okay
I thought I could optimize this asset by capturing around the edges...
...and filling in with a material. In reality, I'm not saving that many polys.
scan vs model

maybe both?
Surprisingly pleased with this one, though the scan quality is lacking. But the two decorative pieces merged well.
too many things overlapping...



Things that are going wrong...
  • My world scale is so completely wrong. Photogrammetry is not a precise method, so the reconstructions were all over the place. I should have centered all my assets before exporting. I found myself nudging rotations back and forth, baaaack and fooorth, scaling down by 70-80%, carefully eyeballing everything into place. Plus the Z-up axis didn't help. Actually, I'm really glad I had the 360 reconstruction (layout pass) to help place everything.
  • ARGH!
  • The materials I generated in Alchemist didn't totally translate over to Unity. I kind of expected that, but one thing I got really stuck on is the vertex displacements. The height maps look so pretty in Substance! And after some research, I realized you can, in fact, do real geo displacement:
    https://blogs.unity3d.com/2019/02/12/creating-an-interactive-vertex-effect-using-shader-graph/

    But right now I'm stuck on one issue: the height separates the faces.

    Increasing the height value does not displace the silhouette of the mesh; instead, it opens up these separations between faces. I need to ask someone who knows more about shader dev to clarify what's going on. (My best guess is sketched out after this list.)
  • The other issue I had with material scanning was texel density, again due to mismatched world scale elements. Currently, if the game camera is more than a few feet away from one of the materials I made, the repetitions become obvious. While yes, it is tileable and the document is somewhat parametric, that can't make up for the texel scale issue. That is, if I want my bricks to be brick sized, the material has to repeat dozens of times (some quick arithmetic after this list). Even in something as random as the concrete (which had an extra randomizing filter on it), the repetitions are obvious from far enough away.
    I should have listened to Unity and photographed a one meter square, to keep the texel density consistent.
  • Foliage is totally out of scope.
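
About that face-separation issue above: my unconfirmed guess is that the placeholder meshes have split vertices at their hard edges. Each face keeps its own copy of a shared corner, with its own normal, so displacing along the normals pushes the copies in different directions. A tiny numpy sketch of that arithmetic (not HDRP's actual shader, just the idea):

    import numpy as np

    # Two quads meeting at a 90-degree hard edge. The vertices along the
    # shared edge are duplicated, one copy per face, each with that face's normal.
    positions = np.array([
        [0.0, 0.0, 0.0], [1.0, 0.0, 0.0],  # the floor quad's copy of the edge
        [0.0, 0.0, 0.0], [1.0, 0.0, 0.0],  # the wall quad's copy of the same edge
    ])
    normals = np.array([
        [0.0, 1.0, 0.0], [0.0, 1.0, 0.0],  # floor normal points up
        [0.0, 0.0, 1.0], [0.0, 0.0, 1.0],  # wall normal points out
    ])

    height = 0.1  # the shader's height/displacement amount
    displaced = positions + height * normals

    # The duplicated vertices no longer share a position, so the faces tear apart.
    gap = np.linalg.norm(displaced[0] - displaced[2])
    print(f"gap along the edge: {gap:.3f}")  # 0.141, not 0

If that's right, welding the verts (or averaging the normals) before displacing should close the gaps. Again, need to confirm with a shader person.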
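And to put numbers on the texel density problem, some back-of-the-envelope math. The patch size here is a hypothetical stand-in for whatever I actually photographed:

    # Back-of-the-envelope texel density check (the sizes are hypothetical).
    texture_px = 2048        # I'm clamped to 2K exports right now
    captured_size_m = 0.3    # rough physical size of the patch I photographed
    wall_size_m = 10.0       # a courtyard wall

    # To keep bricks brick sized, the material must tile at its captured scale.
    tiles = wall_size_m / captured_size_m
    texels_per_meter = texture_px / captured_size_m

    print(f"{tiles:.0f} repetitions across the wall")  # ~33, very visible
    print(f"{texels_per_meter:.0f} texels per meter")

    # Had I photographed a full 1 m square, like Unity recommends:
    print(f"{wall_size_m / 1.0:.0f} repetitions instead")  # 10
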
Things that went right
  • The HDRP is a dream to work with. As the director of one of Unity's short film demos describes it: "High Definition Render Pipeline (HDRP), which — from an artist perspective — just automatically makes things look nice." https://blog.siggraph.org/2019/03/an-irresistible-way-to-make-films-using-real-time-technology-for-book-of-the-dead.html/
    • Actually this is a really exciting observation for the state of computer graphics... tools like the HDRP and Alchemist are helping things just automatically look nice. I'm here for it. 
    • She also uses the word 'opportunistic' to talk about her work. I love the idea of games being spontaneous and opportunistic, since game development is traditionally such slow and unforgiving work.
  • The materials were gorgeous to look at anyways, and the Substance plugin was great to work with. Tri-planar projection solves SO many problems: it improves on UV mapping (doesn't quite erase it, except in simple scenarios), and it unifies the world scale (so multiple assets can share the same material, and a mesh can be scaled up/down without affecting the material). There are more features to talk about here, but all in all a total lifesaver. None of that back-and-forth targa export nonsense. (A tiny sketch of the tri-planar idea after this list.)
  • The layout pass (the 360 quick pass reconstruction) was EXTREMELY helpful. It served as a blueprint of the space to iterate on.
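
Since tri-planar projection is doing so much work for me, here's a toy numpy sketch of the idea (not how Substance or Unity actually implement it): the texture is projected along each world axis, and the three samples get blended by the surface normal. No UVs needed, and the tiling stays locked to world scale.

    import numpy as np

    def triplanar_sample(texture, world_pos, normal, tiles_per_meter=1.0, sharpness=4.0):
        """Toy tri-planar lookup: project the world position onto the three
        axis planes, sample the texture on each, and blend by the normal."""
        h, w = texture.shape[:2]

        def sample(u, v):
            # wrap the coordinates so the texture tiles in world space
            x = int(((u * tiles_per_meter) % 1.0) * (w - 1))
            y = int(((v * tiles_per_meter) % 1.0) * (h - 1))
            return texture[y, x]

        x_proj = sample(world_pos[1], world_pos[2])  # projected down the X axis
        y_proj = sample(world_pos[0], world_pos[2])  # projected down the Y axis
        z_proj = sample(world_pos[0], world_pos[1])  # projected down the Z axis

        # weights come from the normal: an upward-facing surface mostly uses y_proj
        wts = np.abs(normal) ** sharpness
        wts = wts / wts.sum()
        return wts[0] * x_proj + wts[1] * y_proj + wts[2] * z_proj

Because the lookup is driven by world position, scaling a mesh up or down doesn't change the material's real-world tiling, which is exactly why multiple assets can share one material.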
General takeaways.
  • Realizing most of these assets, like the walls, are better off modeled.
  • Realizing that de-lighting is not just a 'clean' albedo; it's also color matching and ensuring matched shaders.
  • Realizing that polycounts are not as restrictive as they are in my head.
  • Restraining myself from brute-force cleaning assets... I know I can make things look good if I sink hours into it... my goal is to determine best practices, not output a portfolio piece.




Final polishing touches:



  • Water? In the fountain?
  • Obtain a HDR latlong image for the sky. Examples here:
    https://assetstore.unity.com/packages/essentials/beta-projects/unity-hdri-pack-72511
  • While I'm at it... actually, I just wanna reshoot the whole thing again. This video from Gnomon https://www.youtube.com/watch?v=IU5XTtS6ALk demonstrates capturing an indoor, architectural space for VR, and the trailer goes through the whole process in a timelapse.

    Looking at the camera icons, he just brute-force shot a ton of photos. We can see some individual assets being processed, but mainly, he's working with one giant dataset. This is kind of the opposite of my approach this term, so I'm really interested in replicating it, especially with the 360 camera.


Friday, May 31, 2019

Quain Courtyard pt 3, Unity

In my last post, I developed ten materials in Substance Alchemist. Now it's time to start putting everything together in Unity.

I should also mention I've switched from working on the big machine to my personal laptop, so my graphics processing power has been cut to a third (at least, haha). Still, my 15in Spectre is not too bad. I'd rather be targeting midrange PCs anyways.

The first thing I need to do is export all of my Alchemist materials out to something usable. I notice right away that I can't consistently export 4K texture maps... Alchemist will crash, though I was able to export two of my ten materials. Perhaps a bug because it is still in beta, or maybe my laptop. Viewing 4K maps in the viewport will also crash it. Either way, I'm stuck at 2K textures for now.
Another frustration is that I have NO idea where Alchemist is saving my working files, so the only way to get assets OUT is through the exporter. So at the moment, I cannot transfer my working documents to a more powerful computer (to rule out my weaker hardware as the issue). To be fair, Allegorithmic warned us not to use their beta release in production, so I kinda asked for this.

Jumping ahead: later on I realize that the Substance plugin for Unity reads SBSAR files, which are essentially packaged, parametric versions of your materials. It's much easier and more accurate to bring these in instead of the image files. Still, though, I am clamped at 2K exports for whatever reason.
--


So now, I have a folder full of 2K SBSAR files. I did some investigating into current Unity project templates. I've been out of the game (pun intended!) for a little while now, so I kind of wanted to work on something more modern (again, probably asking for trouble).

I first looked into the High Definition Render Pipeline, which was used on the Fontainebleau demo (the same demo that published the expert guide on photogrammetry for games).

I also looked into the Lightweight Render Pipeline, which is not the legacy renderer but a newer, more optimized scriptable pipeline that targets a wider range of platforms, including mobile. Their demos seemed quite pretty as well; with some initial testing, it ran totally fine on my laptop and played nicely with the Substance shaders. I preferred it over the legacy renderer because the lighting is more sophisticated.

I went back and forth a lot here, and I'm still not totally convinced I'm settled on HDRP. I actually have three WIP versions started. One thing I was realllyyy hoping for was to get the shaders as close to the Substance preview as possible. Those sneaky bastards: in Substance, the height maps really displace the geometry, which is not the case (as far as I know) in realtime rendering elsewhere. HDRP does have a height map channel and claims vertex displacement, but just like normal maps, it does not seem to alter the silhouette.


The deciding factor was, ultimately, that I really wanted the lighting quality and to follow the expert guide.

So now, I drag and drop my ten SBSAR materials into my Unity project.


Notice the pink? The Substance plugin brings in your material with the Standard shader. The Standard shader belongs to the legacy renderer, and canNOT be rendered with the HDRP. So I need to open up each one, change it to the HDRP "Lit" shader, set the quality to 2048, and double-check the maps. In most cases I had to plug them all back in for some reason.

Unfortunately, doing this breaks the thumbnail in the asset bin. :-(
This does not affect the material.


Concrete is expanded to show the HDRP material nested in the SBSAR ^^



For the frosted glass, first the Material Type needs to be set to Translucent.
Then, the base color needs an alpha value of <1. Simply click on the color picker icon and slide the A slider down to taste.



Another setting I really like in the SBSAR properties is the ability to switch between UV and Triplanar projection. Triplanar works great for static meshes, and essentially takes away the need for UV unwrapping with the click of a button.

I mock up a quick greybox version of the space based on the layout geo (generated with the 360 camera).



Honestly, I could have done this step in Unity, but I work faster in Maya. I limit myself to about 30 minutes here. I use Maya's Game Exporter to export all objects as separate FBXs (check the plugins list if you don't see it under File).


I import back into Unity.
Couple of noob notes for myself RE moving an entire scene into Unity

Maya to Unity:
1. Do not freeze transforms in Maya (Modify > Bake Pivot to get transforms back).
2. Export all objects separately using the Game Exporter (a scripted sketch of this step is below).
3. Import into Assets. Select all and check the Convert Units checkbox.
4. Drag into the hierarchy. Everything should be in its place.
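
For reference, step 2 can also be scripted. A rough maya.cmds sketch of the same "export everything separately" idea (the output folder is hypothetical, and the Game Exporter adds plenty of options on top of this):

    import os
    import maya.cmds as cmds

    out_dir = "C:/scans/quain/fbx"  # hypothetical output folder
    cmds.loadPlugin("fbxmaya", quiet=True)

    # export each top-level transform in the scene as its own FBX
    for node in cmds.ls(assemblies=True):
        if node in ("persp", "top", "front", "side"):
            continue  # skip the default cameras
        cmds.select(node, replace=True)
        path = os.path.join(out_dir, node + ".fbx")
        cmds.file(path, force=True, options="v=0", type="FBX export",
                  preserveReferences=True, exportSelected=True)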


Again, first import the rough layout geo... (iterative design!)

Then everything else. I'm glossing over a few steps here, but I did a quick pass at applying all the materials and tweaking them a bit.


and this is a screengrab, without the layout geo.
A couple of obvious things stick out right away.

First, there are obviously tiling materials. This is partly because they are placeholders, and I'm using them on objects that are much larger than what I intend for the final assembly.
Still, though, I had a lot of problems with scale. I had to manually tweak each material to look right. I realize that I really needed to take real-world measurements and plan ahead for texel density. Unity says to scan a 1 meter square of any material. I grabbed whatever I could get.
Second, my placeholder geo kinda sucks, haha. I would have been better off with cubes, I think? It's actually related to the scale issue: I added bevels without thinking about how noticeable they look.

The positives are, it's starting to come together! The rendering and post-processing are really, really nice. It doesn't have that typical "Unity student project" look that I have noticed is really common in Unity games (even Overcooked has symptoms of this... something about lacking antialiasing, color correction, dinky-looking shadows, and shaders that fall off to black? I can't totally put my finger on it, but there's definitely a look to the legacy renderer).

Walking around the scene is starting to feel like being in the real space. Getting up close to the bricks is actually kind of exciting.

Next post I'm going to implement the scanned assets... stay tuned!




Wednesday, May 29, 2019

Quain Courtyard pt 2, scan material processing

I had a successful shoot day!
It was rather bright outside, so I had to deal with a lot of harsh sun and directional shadows.
I took a lot of notes as well, and recorded sound.











First I wanna talk about Photoshop's 'Photomerge' function... it's a tool I've been using more lately, especially with Alchemist. It's a simple photo-stitching feature, under File > Automate.
It's gotten better than I remember from 5-10 years ago. It keeps all photos on separate layers with masks, and looking at the chunky shapes, the stitching is rather sophisticated.

In the future, I would like to use a tripod and carefully measure out and scan a large area for material capture. However, I got successful results from just a handful of photos (5-8 or fewer). I can also repurpose the photogrammetry photo batches (the ones intended for geo reconstruction).

 

I was able to make this material kit in about a day. I love Alchemist's "Generate Board" feature, which quickly exports your material collection into a thumbnail sheet.


The four slate textures were especially fun... in the courtyard, there are four types of slate tiles. I started with two types of slate pulled from photos, then used the "Pavement Generator" to define two types of tile patterns.


There are still some frustrations with this Beta release... it will crash on saving, and is a little slow on my laptop.

Something important to mention: during my capture day I also took a couple of 'reference photos', wide shots of the area or, in general, any photo that documented the qualities of the space and wasn't intended for reconstruction. These were especially valuable, as they helped inform some of my color choices.

I've been feeling a little overwhelmed about asset generation, but I recently rewatched the Star Wars Battlefront GDC talk. Their level kit contained, like, 8 assets. They were simple enough that they could be scaled/rotated and repeated throughout the environment. Quality over quantity! And my favorite quote is still that the experience of going on location was more valuable to the artists than the data itself. It's time we stop making CG art based off of Google Images.

Thursday, May 16, 2019

More on Material Scanning... with Substance Alchemist!

I'm very excited to talk about Substance Alchemist; it has really closed out a huge chapter in my reality capture explorations.

The methods I had previously researched for material scanning were tedious and required a lot of manual labor. I can generally group the scan-based material authoring methods into two procedures: one uses only photo data, and the other uses geometry from photogrammetry.

Photo-only: https://www.youtube.com/watch?v=kDMz6djZCkc
Geo baking: https://www.youtube.com/watch?v=HMPZV_Se8W0



I had planned to compare the geo-based workflow to the photo-based workflows... and I was getting stuck on the amount of time needed. To pull maps off of a photogrammetry reconstruction, I need to retopo and UV a clean mesh first. This is fine for one or two materials, but processing more than that would be a lot.

The other documented method recommends building a scan box to control light, which is cool and provides a lot of accuracy. If I were scanning a large batch of, say, fabric samples that are all roughly the same size, this would be awesome.

It's DIY, but it's also pretty complicated.
And applying it to shooting on site, usually outdoors, sometimes vertically... I'd have to engineer some kind of portable scanbox and probably have an assistant to help with lighting.

Throughout all this research, I came across Substance's announcement for their new Project Alchemist, which threw all the above out the window. 

I took my baked brick photo texture, literally dragged and dropped it into a new document, and immediately got this:


A few minutes playing with the auto tile feature...



At around 20 minutes I had this. It's by no means perfect, but holy shit, it was FAST. And easy to pick up. Everything in the Alchemist document is parametric, so I can revisit and re-export. (Like any Substance product, I can export all the PBR maps as image files.)


Here's the thing... one of the reasons it wasn't perfect is that I was using a reprojected texture map rather than a straight photo (above), so the data already wasn't clean. If the bricks weren't distorted, I think the material would be ready to use.

So I took a few more minutes and generated more.





Here I'm correcting the perspective in Photoshop first.




Alchemist has a feature that generates variants based on color palettes pulled from images. I could see this potentially being used to color correct a large batch of materials against an existing palette. It's also fun.


Here I gave myself a challenge: replicate one of the materials using NO imported images, only the built-in parametric tools. This is how far I got in about 10-15 minutes.


Lastly, I tested the maps in Unity with its Standard shader.
 

It's never as pretty as in the Substance viewport, but the import works and is very straightforward. I'll have to play with Unity shaders a bit in the future.

Tuesday, May 14, 2019

Hologram Portraits

I got to show off my scan portraits the other day on the Looking Glass holo display. I used a pre-built app to display the models, and, paired with a Leap Motion, I could pinch to rotate and zoom in on the models.


It was really great to share this with others, especially when it was someone interacting with their own portrait. Sometimes I get really lost in the bigger projects and research ideas, so it was a good feeling to come back to simpler ideas like these portraits. Everyone loves a portrait. They're so fundamental and human.

The only issue was running the display off of my laptop, which has an NVIDIA graphics card but still lagged. 

To prepare the models, I first decimate the mesh to under 1000 verts (about 300 polys). I do this right in Zephyr.
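
(For a batch of portraits, the same decimation could be scripted outside Zephyr, e.g. with Open3D. A sketch with made-up file names:)

    import open3d as o3d

    # Decimate a scanned portrait down to a small triangle budget.
    mesh = o3d.io.read_triangle_mesh("portrait_hi.obj")
    low = mesh.simplify_quadric_decimation(target_number_of_triangles=300)
    low.compute_vertex_normals()
    o3d.io.write_triangle_mesh("portrait_lo.obj", low)

    print(len(low.vertices), "verts,", len(low.triangles), "tris")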

