Weekly Drills 39 - Fitness Equipment
  Play by Play

by odc on 6 Jan 2021

Simple AR app made in the Unreal Engine for iOS. Collaborated with Natalia Cabalceta (ncab). Natalia did the 3D modeling, texturing and lighting, while I did the programming for the augmented reality (AR) application.


Update - 9 Jan 2021

Look at the time(!), we're a day away from our deadline and I still haven't followed up on my previous post... yikes, let's fix that.

Due to the lack of time (maybe I shouldn't have binged Jujutsu Kaisen as often), I'll mainly be focusing on the high-level process/thinking that went into developing this week's drill and its logical abstractions... by that I mean I drew a picture down below and I'll walk you through it!

I want to keep this as simple as possible. But for those looking for more details (like code, logic, machine learning details, etc.), don't worry, since we'll be mainly working on AR for the near future. So treat this as an intro; as we progress throughout the month, I'll be posting those details in a series-like fashion. Right, now back to business.

The image above is basically our simple little AR app (detect a target image and spawn a dumbbell there) in terms of high-level abstractions and logic. Let's break it down into 9 sections:

1.

We're expecting our app to read some video feed as a stream of images that we'll process and do something with at some point.
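The step above can be sketched roughly like this: a loop that consumes the camera feed one frame at a time and hands each frame off for processing. This is a minimal stand-in sketch, not actual Unreal code; the `Frame` type and its fields are hypothetical placeholders for whatever raw buffer the AR framework delivers each tick.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """Hypothetical stand-in for one image from the camera feed."""
    timestamp: float
    pixels: list = field(default_factory=list)  # placeholder for raw image data

def process_feed(frames):
    """Consume the video feed frame by frame; each frame would be
    handed off for trackable-geometry extraction (step 2)."""
    results = []
    for frame in frames:
        results.append(f"processed frame at t={frame.timestamp}")
    return results

feed = [Frame(0.0), Frame(0.033)]
print(process_feed(feed))
```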

2.

The stream of images (left alone) is not useful for our app... why? Well, think about it like this. We're trying to teach someone how to make our favorite pie (a secret pie of some sort) in a room full of ingredients, but we don't tell them which ingredients are needed or what those ingredients are. To the app, the stream of images is just a stream of raw data (unknown ingredients), and it needs help separating the useful from the not-so-useful pieces of data. In our case, the app wants the sections of the images that can be extracted and tracked (trackable geometry)... so get rid of the non-pie ingredients!
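In code terms, "getting rid of the non-pie ingredients" is just a filter over the raw regions in a frame. A toy sketch under stated assumptions: the `kind` labels and dict layout here are hypothetical illustration, nothing like the AR framework's real data structures.

```python
def extract_trackables(frame_regions):
    """Keep only the regions the tracker can lock onto
    (trackable geometry) and drop the rest of the raw data."""
    return [r for r in frame_regions if r["kind"] == "trackable"]

regions = [
    {"kind": "trackable", "id": 1},
    {"kind": "noise", "id": 2},
    {"kind": "trackable", "id": 3},
]
print(extract_trackables(regions))  # only regions 1 and 3 survive
```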

3.

Great, the app has extracted the pieces of information that it needs (trackable geometries)... now what? Well (going back to making pies), we don't need every pie-related ingredient to make a specific pie. In our app's case, it doesn't care about every section of trackable geometry in a given image. It's actually looking for the specific piece(s) of geometry containing the image that matches our target image(s). In other words, give us the secret pie recipe!

4.

Now that the app knows which image (target image) it's looking for in the incoming set of tracked geometry data, it'll want to filter the data, leaving only the tracked geometry corresponding to the target image. In terms of baking pies, remove all the unneeded pie ingredients.
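Steps 3 and 4 together boil down to one more filter: out of all the tracked geometry, keep only the piece(s) whose detected image matches the target. Again a hedged sketch, not Unreal code; the `TARGET_IMAGE` name and the `matched_image` field are hypothetical.

```python
TARGET_IMAGE = "dumbbell_marker"  # hypothetical name for our target image

def filter_for_target(trackables, target=TARGET_IMAGE):
    """Keep only the tracked geometry whose detected image
    matches our target image (steps 3-4)."""
    return [t for t in trackables if t["matched_image"] == target]

tracked = [
    {"matched_image": "poster", "position": (0.0, 0.0, 0.0)},
    {"matched_image": "dumbbell_marker", "position": (1.0, 0.0, 2.0)},
]
print(filter_for_target(tracked))
```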

5.

It's time to bake the pie! With the filtered data, the app now has all the information it needs to add the dumbbell to our scene. In our case, the dumbbell (obj) is the purple box in our image [#6]. Currently, no dumbbell exists in our scene (i.e., the world space... more on this in the next weekly drill). So the app is going to need to spawn a new dumbbell (i.e., an actor... more on this in the next weekly drill) at the coordinates of the tracked geometry that matched our target image.
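The spawn step looks roughly like this. In Unreal this would be an actor-spawning call; here the "scene" is just a dict mapping a name to a position, purely for illustration.

```python
def spawn_dumbbell(scene, tracked_geometry):
    """Step 5: spawn the dumbbell at the coordinates of the tracked
    geometry that matched the target image. (A sketch; in Unreal
    this would spawn an actor into the world space.)"""
    scene["dumbbell"] = tracked_geometry["position"]
    return scene

scene = {}
spawn_dumbbell(scene, {"matched_image": "dumbbell_marker",
                       "position": (1.0, 0.0, 2.0)})
print(scene)
```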

6.

I'll let my collaborator (ncab) talk about this section (the modeling, texturing, lighting) in her post.

7.

Lastly, the Unreal Engine is going to do a lot behind the scenes and render the dumbbell in the world space (an overlay over the actual scene). More on rendering in the Unreal Engine in future posts.

8.

There you have it. That's a nice and simple AR app!

9.

If the dumbbell has already been spawned, then move the dumbbell to the location of the target image (if it moved). The app at this point is already keeping track of this data (the orange cylinder); we'll talk more about this in our next drill.
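The spawn-or-move decision from step 9 can be sketched as a single "place" operation: spawn on the first sighting, move on every sighting after that. Same toy dict-as-scene assumption as before, not actual Unreal code.

```python
def place_dumbbell(scene, position):
    """Step 9: spawn the dumbbell if it doesn't exist yet,
    otherwise move the existing one to the target image's
    latest position. Returns which action was taken."""
    spawned = "dumbbell" not in scene
    scene["dumbbell"] = position
    return "spawned" if spawned else "moved"

scene = {}
print(place_dumbbell(scene, (1.0, 0.0, 2.0)))  # first sighting
print(place_dumbbell(scene, (1.2, 0.0, 2.1)))  # target image moved
```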


Finally, we're done. I hope you enjoyed reading this as much as I enjoyed writing it! I'll see you in the next drill!




Update - 7 Jan 2021

Although the deadline for this drill is three days away, I wanted to give the play-by-play feature a shot (being a new user). At this point I've spent about 6ish hours on this drill, mainly focused on developing the AR app with ncab. However, those 6ish hours exclude all the prep work of installing tools (this is my first time developing in Unreal Engine) and setting up our dev pipelines for this and future projects/drills.

All in all, I put in about 12ish hours... yes, that includes googling "best source control software for game developing" (and other topics that first-time game devs would search lol), reading documentation, researching and ultimately choosing which tools to use, and the first pass of our dev pipelines. I'll probably save the setup (tooling, pipelines, etc.) for another post. I'll bet that I'll come back to these decisions and iterate on the tooling/pipelines (being new to game dev)... but hey(!), you've got to start somewhere.

Heads up though: software engineering and AI are my passion and trade... so I'll be mainly writing my posts/work from that particular perspective.

So what have we completed so far? Well, we just completed our minimum viable product (MVP): a basic AR app that tracks an image and spawns a dumbbell, where the modeling, texturing and lighting were done by ncab:

In my next post, I'll talk more about my setup and other topics, such as the actual code and, if anyone is interested, details about how we detected and augmented the 2D image in the video and how the AR library does this as well.


Update - 6 Jan 2021

This project is my entry for the weekly-drills contest #039 and was a collaboration with Natalia Cabalceta (check out her post about the project and her perspective). She took on the modeling, textures, and UVs, while I took on the programming of the augmented reality (AR) application.