Monday, February 25, 2019

Pointcloud Meshing with Separations, pt1

Something unique to scanning a room-scale environment is the process of meshing the pointcloud into separated pieces. I decided to run some tests with re-projection on a smaller object.


In virtual environments it's important to remember that we are building a fictional set, not an accurate representation of the world. Typically, the background is a skydome, and some elements may sit on a billboard/card. The closer objects are to the foreground, the more accurate their models and texture detail need to be.


This is all to say that I wanted to work out the process of hand modeling some of the detailed components from my room scan. In this case, I took the handle on the door. 
In my last post I talked about starting the manual retopo process. It's still going, slowly. This is why I wanted to pause and run a texture re-projection test, to make sure I am retopoing at a good mesh resolution.


So, I hid everything except the door handle. Remember that I modeled off of the Generated Textured Mesh, so in theory everything should line up. I took some time to add bevels and UV unwrap.

I exported him alone and brought him into Zephyr.
Import>"Import Mesh with UVs"
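Side note: if you end up repeating this export a lot, it's scriptable. Here's a minimal sketch assuming Maya's Python API; the mesh name ("doorHandle_geo") and the output path are placeholders, not my actual scene:

    # Export one mesh as OBJ with UVs so Zephyr can pick them up on import.
    import maya.cmds as cmds

    cmds.loadPlugin('objExport', quiet=True)      # the OBJ exporter ships as a plugin
    cmds.select('doorHandle_geo', replace=True)   # placeholder name for the handle mesh
    cmds.file(
        'D:/scans/doorHandle.obj',                # placeholder path
        force=True,
        exportSelected=True,
        type='OBJexport',
        # keep UVs/normals, skip materials since Zephyr re-projects the texture anyway
        options='groups=1;ptgroups=1;materials=0;smoothing=1;normals=1',
    )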


I wrote about how Zephyr will not know what to do with these new UVs, so while it imports the model into the Textured Meshes bin, it will be incorrect. RClick + Hold the model and select Extract Stereo Mesh.






It does a quick low-res texture projection (keep the settings fairly low for this step) and puts the model into the Meshes bin. RClick + Hold and select Make Structure Mesh.
This time you will want your texture projection settings turned all the way up.






As you can see... the results are just OK. I exported it and brought it into Maya to examine the UVs a little more closely.



Not sure why this happened: the shells stayed in place, but each face was separated (white lines = seams).
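If it comes up again, those per-face splits can probably be welded back together rather than re-unwrapping. A rough sketch assuming Maya's Python API, with the same placeholder mesh name:

    # Merge UVs that sit on top of each other so each shell becomes
    # one connected piece again. The distance threshold is a guess.
    import maya.cmds as cmds

    cmds.polyMergeUV('doorHandle_geo.map[*]', distance=0.001)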

Lastly, for fun, I imported and re-projected onto this super low-res mesh (my work in progress from the last post).


Weird! Some strange issues with texel density.

--

So here's what I learned from this test:

  • The dataset was not perfectly aligned to the world axes. This was annoying because the projections were just a little off.
  • When modeling the handle I eyeballed measurements and then scaled up a little to overcompensate. Next time I will be less aggressive and just trust the generated textured mesh.
  • I suspect the texel density mostly depends on the mesh resolution (more than on the UV layout) because it is a projected texture. The image above shows serious issues because the mesh resolution is insanely low (see the rough texel-density check after this list).
  • I think I might try modeling just the extruded handle and consider the base plate part of the door. I was going into too much detail with this particular separation.
  • Remember I did not bake normals for this test!
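To put a rough number on that texel density hunch, here is a back-of-the-envelope check in Python. Every value in it is a placeholder guess, not something I measured:

    # Average texel density = texture pixels landing on the mesh, per cm of surface.
    import math

    texture_res = 4096         # projected texture size in pixels (per side), placeholder
    uv_coverage = 0.6          # fraction of UV space the shells actually fill, placeholder
    surface_area_cm2 = 350.0   # rough surface area of the handle, placeholder

    pixels_on_mesh = (texture_res ** 2) * uv_coverage
    texels_per_cm = math.sqrt(pixels_on_mesh / surface_area_cm2)
    print('~{:.0f} texels per cm'.format(texels_per_cm))

The formula only sees texture resolution and UV coverage; the ugly result above suggests that once the mesh is this low-poly, the projection falls apart before the UV layout even gets a say, which lines up with the suspicion in the list.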
