Update - 6 Jan 2021
Hey everyone! This year, for the weekly drills, I’ve teamed up with Owen Cabalceta to learn and practice making little augmented reality apps in Unreal Engine. Next week, our posts will be true play-by-plays of our workflow, so you can follow what we’ve learned along the way while we’re still learning ourselves!
I’m working on modeling, look dev, texturing, and lighting (plus rendering and compositing some stills of the assets), and Owen is handling the programming. Check out his post for details on his workflow in Unreal!
After exchanging different ideas about what to make, we settled on a dumbbell that we could “lift” in AR using image tracking.
I modeled the dumbbell in Maya and textured it in Substance Painter (one of the main materials I used can be found here on Substance Source - I changed the roughness, added extra scratches and dirt, added extra bump for the weight surface, and made my own alphas for the weight measurements and the textured handle), then rendered some still images in Marmoset.
Afterward, I brought the model (as an OBJ) and the texture maps into Unreal and attached the model to the tracked image. I then created a material in Unreal and connected the Base Color, Roughness, Metallic, Ambient Occlusion, and Normal texture maps to it.
I applied it to the model and ran the app. The object showed up, but the tracking was unreliable and the lighting wasn’t convincing either.
Owen and I discussed and tried a few different variations of trackable images, and we learned that the more detailed and visually unique an image is, the better it tracks - a busy, high-contrast image gives the tracker more distinct features to lock onto. Once we figured that out, the object was tracked pretty well.
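For anyone coming at this from the native ARKit side (where I’ve done most of my past AR work): in Unreal the tracking image is configured in the editor rather than in code, but the same idea in ARKit looks roughly like the sketch below. This is a hedged illustration, not our actual project code - the image name and physical width are made up.

```swift
import UIKit
import ARKit

// A minimal sketch of ARKit image tracking (illustrative names, assumed setup).
// The lesson from our tests applies either way: detailed, high-contrast,
// non-repetitive images give the tracker more features to lock onto.
func makeImageTrackingConfiguration() -> ARImageTrackingConfiguration? {
    // "dumbbell-marker" is a hypothetical image in the app's asset catalog.
    guard let cgImage = UIImage(named: "dumbbell-marker")?.cgImage else { return nil }

    // physicalWidth is the real-world width of the printed image in meters;
    // getting this right noticeably helps tracking stability.
    let reference = ARReferenceImage(cgImage,
                                     orientation: .up,
                                     physicalWidth: 0.15)
    reference.name = "dumbbell-marker"

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = [reference]
    configuration.maximumNumberOfTrackedImages = 1
    return configuration
}
```

You would then run this configuration on the view’s session (e.g. `sceneView.session.run(configuration)`); in Unreal, the equivalent settings live in the AR session config asset instead.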
I also tinkered with the lighting a bit. Ultimately we were OK with the lighting we have now, but we agreed it would be great to have light estimation based on the captured AR scene. I couldn’t figure out how to do that in Unreal (I tried digging through videos and documentation). Below are some screenshots of other lighting settings I messed with in Unreal.
For reference, I put the asset into Apple’s Reality Composer (I’ve been making AR apps with Xcode, and more recently Reality Composer, for a little over a year now) to see what scene-capture light estimation would look like (both versions are above for comparison).
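For anyone curious what that scene-capture light estimation looks like in code: in native ARKit it’s only a couple of lines. The sketch below is a hedged illustration of the ARKit side (this is not how Reality Composer or Unreal exposes it, and the class and function names are my own):

```swift
import ARKit

// Sketch: reading ARKit's per-frame light estimate (illustrative setup).
final class LightEstimationDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        // ambientIntensity is in lumens (around 1000 for a well-lit scene);
        // ambientColorTemperature is in kelvin (around 6500 for daylight).
        // Feeding these into a scene light is what makes the asset
        // match the real room's brightness and color.
        print(estimate.ambientIntensity, estimate.ambientColorTemperature)
    }
}

func runSession(with delegate: LightEstimationDelegate) -> ARSession {
    let configuration = ARWorldTrackingConfiguration()
    configuration.isLightEstimationEnabled = true  // on by default; shown explicitly

    let session = ARSession()
    session.delegate = delegate
    session.run(configuration)
    return session
}
```

Reality Composer handles all of this for you, which is why the comparison shots were so quick to produce there.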
Hopefully, I can learn how to do light estimation in Unreal for future drills! The goal is to make the assets look like they truly belong in the world space.
Below are some references for the weights: