Thursday, May 16, 2019

More on Material Scanning... with Substance Alchemist!

I'm very excited to talk about Substance Alchemist; it has closed out a huge chapter in my reality capture explorations.

The methods I had previously researched for material scanning were tedious and required a lot of manual labor. The scan-based material authoring methods I've found generally fall into two groups: one uses only photo data, and the other uses geometry from photogrammetry.

Photo-only: https://www.youtube.com/watch?v=kDMz6djZCkc
Geo baking: https://www.youtube.com/watch?v=HMPZV_Se8W0



I had plans to compare the geo-based workflow to photo-based workflows... and I was getting stuck on the amount of time needed. To pull maps off of a photogrammetry reconstruction, I need to re-topo and UV a clean mesh first. This is fine for one or two materials, but processing more than that would take a lot of time.

The other documented method recommends building a scan box to control light, which is cool, and provides a lot of accuracy. If I were scanning a large batch of, say, fabric samples that are all roughly the same size, this would be awesome.

It's DIY, but it's also pretty complicated.
And to apply it to shooting on site, usually outdoors and sometimes on vertical surfaces, I'd have to engineer some kind of portable scan box and probably bring an assistant to help with lighting.

Throughout all this research, I came across Substance's announcement of their new Project Alchemist, which threw all of the above out the window.

I took my baked brick photo texture, literally dragged and dropped it into a new document, and immediately got this:


A few minutes of playing with the auto-tile feature...



At around 20 minutes I had this. It's by no means perfect, but holy shit it was FAST. And easy to pick up. Everything in the Alchemist document is parametric, so I can revisit it later and re-export. (Like any Substance product, I can export all the PBR maps as image files.)
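As a sanity check on the auto-tiled result, I like the idea of a quick half-offset wrap on an exported map (the old Photoshop Offset trick), so the tile borders land in the middle of the image where any seam is easy to spot. Here's a rough Python sketch of that check; the exported file name is just a placeholder.

```python
# Rough seam check for a tiled texture: wrap the image by half its size
# (like Photoshop's Offset filter) so the original borders meet in the
# middle of the frame, where a visible seam is easy to spot.
# "albedo.png" is a hypothetical exported map name.
import numpy as np
from PIL import Image

img = np.array(Image.open("albedo.png"))
h, w = img.shape[:2]

# np.roll wraps pixels around, moving the tile edges to the image center.
wrapped = np.roll(img, shift=(h // 2, w // 2), axis=(0, 1))
Image.fromarray(wrapped).save("albedo_seam_check.png")
```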


Here's the thing... one of the reasons it wasn't perfect is that I was using a reprojected texture map rather than a straight photo (above), so the data wasn't clean to begin with. If the bricks weren't distorted, I think the material would be ready to use.

So I took a few more minutes and generated more.





Here I'm correcting the perspective in Photoshop first.
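For what it's worth, the same perspective fix can be scripted instead of done by hand. Below is a rough OpenCV sketch of the idea: pick the four corners of the brick area and remap them onto a flat rectangle. The corner coordinates, file names, and output size are made up for the example, not taken from my actual shots.

```python
# Scripted alternative to the Photoshop perspective fix: map four
# hand-picked corners of the brick region onto a flat rectangle.
import cv2
import numpy as np

img = cv2.imread("bricks_photo.jpg")

# Corners of the distorted brick region, clockwise from top-left
# (example values only; these would come from eyeballing the photo).
src = np.float32([[120, 80], [1850, 60], [1900, 1400], [90, 1450]])
# Target rectangle the region gets flattened onto.
dst = np.float32([[0, 0], [2048, 0], [2048, 1536], [0, 1536]])

M = cv2.getPerspectiveTransform(src, dst)
corrected = cv2.warpPerspective(img, M, (2048, 1536))
cv2.imwrite("bricks_corrected.jpg", corrected)
```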




Alchemist has a feature that generates variants based on color palettes pulled from images. I could see this being used to color correct a large batch of materials against an existing palette. It's also fun.
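The underlying idea, as I understand it, is basically palette extraction. A toy version of pulling a small palette out of a reference image with k-means clustering might look something like this (the file name and cluster count are arbitrary choices for the example):

```python
# Toy palette extraction: cluster the pixels of a reference image and
# treat the cluster centers as the image's dominant colors.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

ref = np.array(Image.open("reference.jpg").convert("RGB"), dtype=np.float32)
pixels = ref.reshape(-1, 3)

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)
palette = kmeans.cluster_centers_.astype(np.uint8)
print(palette)  # five RGB colors summarizing the reference image
```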


Here I gave myself a challenge: replicate one of the materials using NO imported images and only the built-in, parametric tools. This is how far I got in about 10-15 minutes.


Lastly, I tested the maps in Unity with its Standard shader.
 

It's never as pretty as in the Substance viewport, but the import works and is very straightforward. I'll have to play with Unity shaders a bit in the future.
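One gotcha worth noting: Unity's Standard shader expects smoothness rather than roughness, typically packed into the alpha channel of the metallic map, while Substance exports a roughness map. A quick Python sketch of that conversion, with placeholder file names, could look like this:

```python
# Convert a Substance-style roughness export into a Unity-style
# metallic+smoothness texture (smoothness in the alpha channel).
# File names are placeholders for whatever Alchemist exported.
import numpy as np
from PIL import Image

rough = np.array(Image.open("roughness.png").convert("L"))
metal = np.array(Image.open("metallic.png").convert("L"))

smooth = 255 - rough  # smoothness is the inverse of roughness
rgba = np.dstack([metal, metal, metal, smooth]).astype(np.uint8)
Image.fromarray(rgba, mode="RGBA").save("metallic_smoothness.png")
```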

