November

Unity AR Model Design

At the current stage, we have established a four-part interaction engine, consisting of audio source management, an event engine, touch management, and sensor management; it encapsulates the displayed lists of game objects as well as their activity cycles. Most of the architecture follows Unity's typical workflow (C# scripting plus the GUI Editor). The application flow is described below. Our experience so far informs many of the design and architecture decisions made here.
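As a rough sketch of how the four-part engine could be composed in Unity, the snippet below shows one possible layout: a top-level component holding references to the four managers and tracking the displayed game objects and their active/inactive cycle. All class, field, and method names here are illustrative assumptions, not the actual project code.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Stand-in components for the four managers; the real project's class
// names and responsibilities may differ.
public class AudioSourceManager : MonoBehaviour { }
public class EventEngine : MonoBehaviour { }
public class TouchManager : MonoBehaviour { }
public class SensorManager : MonoBehaviour { }

// Hypothetical composition of the four-part interaction engine: it holds
// references to the four managers and tracks the displayed game objects
// together with a simple active/inactive activity cycle.
public class InteractionEngine : MonoBehaviour
{
    public AudioSourceManager audioManager;
    public EventEngine eventEngine;
    public TouchManager touchManager;
    public SensorManager sensorManager;

    // The currently displayed game objects.
    private readonly List<GameObject> displayed = new List<GameObject>();

    public void Show(GameObject go)
    {
        go.SetActive(true);   // begin the object's activity cycle
        displayed.Add(go);
    }

    public void Hide(GameObject go)
    {
        go.SetActive(false);  // end the object's activity cycle
        displayed.Remove(go);
    }
}
```

Wiring the managers as sibling components on a single engine object keeps them independently swappable while the engine owns the shared object list.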

Touch Events Manager. On each update frame, the touch events manager checks for input and casts rays into the scene to detect valid touch events and set the corresponding state variables.
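The per-frame check described above could look something like the following: poll for a touch in `Update`, raycast from the AR camera through the touch point, and record the hit object. The class and field names (`arCamera`, `lastHitObject`) are hypothetical; this is a minimal sketch of the pattern, not the project's actual implementation.

```csharp
using UnityEngine;

// Hypothetical touch events manager: each Update frame, check for a new
// touch and raycast from the camera through the touch position to find
// a valid target object.
public class TouchEventsManager : MonoBehaviour
{
    public Camera arCamera;          // assumed reference to the AR camera
    public GameObject lastHitObject; // "valid variable" set when a hit occurs

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the screen-space touch position into the scene.
        Ray ray = arCamera.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            lastHitObject = hit.collider.gameObject; // valid event detected
        }
    }
}
```

Gating on `TouchPhase.Began` means each touch fires at most one event, which keeps the raycast cost to a single cast per new touch rather than one per frame.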

(More detailed explanations to follow.)

Thanks to this model, we produced a lot of fun demos; my personal favorite is the "random chatter" demo.

Bil's demos:

  • Multitrack (flower toggle)

  • Scaling (surreal)

  • Experimental (talking gibberish)

  • Full application demo

This has been an absolute blast. Can't wait for the whole app to come together soon!!

Kelly and Bil