Alternate Portals: Augmented Reality VFX App
This is my complete workflow detailing the evolution of my AR application, deployed to an iOS mobile device, that explores real-time FX through user interactivity. All of the work in this process was created by me as part of my final project in the Master of Arts in Interactive Design & Game Development program at SCAD.
This is a mobile application that guides a user through portals into augmented reality to enable an interactive and dynamic visual effects experience. The project responds to the question: "How might a static, physical surface be transformed into a dynamic, interactive experience?"
The application was developed in Unity 3D, with geometry generated in Houdini using HDAs (Houdini Digital Assets) and VFX translated from Houdini into Unity's VFX Graph for real-time use. User interface (UI) prototyping was structured in Figma before being integrated into Unity 3D. The augmented reality application was deployed to an iPad Pro and filmed in use. A usability study was performed through surveys and observations and documented for the project. All work, including concept creation, modelling, prototyping, UI development, application programming, the usability study, and project documentation, was completed entirely by me. The project served as my final project in the Master of Arts in Interactive Design & Game Development at Savannah College of Art & Design.
Final iOS Application
The following video and screenshots showcase the application in use. A clear and concise UI launches users into an active space battle that is driven by physically manipulating the augmented reality environment assets: the targets. This emerging visualization strategy opens up unique opportunities to showcase digital content and interact with it, all at the fingertips of everyday users.
The following sections detail the project process, including procedural modelling, geometry and VFX integration from Houdini into a real-time environment, UI prototyping and development, and the final application programming and composition.
Procedural Modelling
A procedural modelling approach was taken in Houdini to generate the low-poly geometry assets and their corresponding UV layouts. Most importantly, this allowed for quick iteration, with multiple shape variations produced through a control panel exposed in Houdini.
This is a snapshot of the entire node network created to generate the geometries. The network spawns from a single node at the top of the hierarchy that defines the shape and parameters for each variation: square, circle, triangle, n-gon, etc. Specific parameters throughout the network are linked to the control panel to allow for iterating on variations.
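The core idea of that top-level node can be sketched outside Houdini: a single parameterized generator that yields every shape family from one control. The snippet below is a minimal, hypothetical Python illustration (the actual project used a Houdini node network with promoted parameters, not this code); triangle, square, and circle all fall out of one n-gon function by varying the side count.

```python
import math

def ngon_points(sides, radius=1.0, center=(0.0, 0.0)):
    """Vertex positions for a regular n-gon: 3 sides gives a triangle,
    4 a square, and a high side count approximates a circle."""
    cx, cy = center
    return [
        (cx + radius * math.cos(2 * math.pi * i / sides),
         cy + radius * math.sin(2 * math.pi * i / sides))
        for i in range(sides)
    ]

# Analogous to flipping the shape parameter on the HDA control panel:
variants = {name: ngon_points(n)
            for name, n in [("triangle", 3), ("square", 4), ("circle", 32)]}
```

In the HDA itself, the equivalent of the `sides` and `radius` arguments are parameters promoted to the asset's interface, so each variation is a slider change rather than a rebuild.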
Geometry & VFX Integration
The low-poly assets and their accompanying textures were integrated into Unity 3D and coordinated with VFX, using the target points as the source for the particles. The point caches were exported from Houdini and linked into Unity's VFX Graph, where the real-time effects were developed.
User Interface Prototype
The UI prototype was developed in Figma and walked through the screens a user encounters when using the AR application. The Figma prototype was published as a live website for delivery to the users participating in the usability study. An accompanying survey collected user feedback, which was then used to refine the UI prototype before final programming.
The AR environment and UI for the application were programmed and developed in Unity 3D with many custom C# scripts.