MINDCONSOLE was commissioned by AVL to create an AR app showcasing AVL's new autonomous driving and parking capabilities.
To give the AR app a target object, Mindconsole had to select a suitable model and design it so that it would serve as a reliable AR target.
The vehicle design went through several iterations: creating a rough design, testing it as an AR target, and, depending on the outcome, redesigning and adjusting the details to fit the requirements.
The autonomous driving capabilities are driven by several AVL sensors that Mindconsole needed to showcase in the AR experience. Being able to visualize the range and area of influence of each sensor was crucial to the entire experience.
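One common way to make a sensor's area of influence visible in a Unity-based AR scene is to attach a translucent mesh to the sensor and scale it to the sensor's range. The sketch below illustrates that idea; the class, field names, and range value are assumptions for illustration, not taken from the actual project.

```csharp
using UnityEngine;

// Illustrative sketch (not the project's actual code): scales a translucent
// child mesh so that a sensor's range and coverage are visible in AR.
public class SensorRangeVisualizer : MonoBehaviour
{
    [SerializeField] private float rangeMeters = 8f;  // assumed sensor range
    [SerializeField] private Transform rangeMesh;     // translucent sphere/cone child, assigned in the Inspector

    private void Update()
    {
        // Uniformly scale the mesh so its extent matches the sensor range.
        rangeMesh.localScale = Vector3.one * rangeMeters;
    }
}
```

A transparent, unlit material on the range mesh keeps the visualization readable without occluding the vehicle itself.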
The experience required three scenes in total. We decided to keep them fairly simple so the viewer's attention stays on the important things. To improve the AR experience, Mindconsole kept the scenes below a certain size, which allowed for better tracking and increased comfort throughout the experience.
Mindconsole found an elegant solution for the animation process. Thanks to Unity's Timeline feature, we were able to animate all of the required scenes and objects directly inside the game engine, allowing us to iterate on feedback quickly and painlessly. Being able to see and adjust the overall animation in real time is an enormous advantage over the typical workflow.
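Timelines authored this way are typically driven at runtime through Unity's PlayableDirector component. As a rough sketch of how such a scene animation might be started when the AR target is detected (the class name and callback names here are assumptions, not the project's actual code):

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Illustrative sketch: plays a Timeline asset when the AR target (the truck)
// is found, and pauses it when tracking is lost. Callback names are hypothetical.
public class SceneTimelinePlayer : MonoBehaviour
{
    [SerializeField] private PlayableDirector director; // references the scene's Timeline, assigned in the Inspector

    // Hypothetical hook invoked by the tracking system on target detection.
    public void OnTargetFound()
    {
        director.time = 0;  // restart the animation from the beginning
        director.Play();
    }

    // Hypothetical hook invoked when tracking is lost.
    public void OnTargetLost()
    {
        director.Pause();
    }
}
```

Because the Timeline lives inside the Unity project, designers can rework the animation and preview it in the editor without any code changes, which is what made the fast iteration loop possible.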
Choosing the Unity game engine proved to be the right decision. Despite the short timeframe and the difficulty of using a real object (a truck) as the target for the AR experience, Mindconsole managed to implement all required features and content on time. C# as the main coding language allows for quick iterations and a comfortable workflow.
If you are interested in seeing what the entire experience looked like, take a look at the video below. It is basically a screen recording of one session, just without the "AR" factor that we shipped with the app.