A centrepiece of LEAP’s stand at Sydney Build 2022 was an Augmented Reality (AR) experience built around a LEGO model of the Sydney Opera House (a fitting choice, given the event’s theme and location!). We chose Vuforia Studio to rapidly create the digital overlays and user interface within the AR experience, and to leverage Vuforia’s tracking technology, which ensures that all digital assets accurately lock onto and track the physical LEGO model on display.
Why did we choose to showcase AR at Sydney Build?
Interest in emerging digital technologies such as AR and IoT has been growing rapidly within the built environment and construction industries. We aimed to demonstrate how AR can be used across a variety of use cases: incorporating additional 3D CAD information on a physical asset, adding 2D overlays, animated 3D instructions and simulation results (such as Computational Fluid Dynamics (CFD) results showing wind engineering scenarios), and live-streaming IoT sensor data from equipment within the building so it can be displayed in context (i.e. at the precise location in 3D space where that data is being measured).
What did our team need to make this happen?
First off, we needed access to an “as built” 3D CAD model of the LEGO Opera House – not the actual Opera House, but a 3D model of the LEGO model itself (as close as possible to how it was constructed). With an accurate 3D CAD model, LEAP’s team could make use of Vuforia Studio’s model target capabilities. Model targets in Vuforia rely on a 1:1 correspondence between the physical object and the 3D geometry. This is great news for engineers and product designers working at manufacturers, who have easy access to their own 3D models (matching the manufactured product), but it can be a challenge when dealing with 3rd party products (or buildings, as in this case!) where access to the original CAD is more difficult. Luckily, we were able to obtain 3D CAD that matched the LEGO Opera House model, which our team verified by overlaying a semi-transparent render on the physical model to identify (and fix) any discrepancies.
This ensured LEAP’s team could generate accurate model targets – these anchor the experience in physical 3D space, and also allow the physical object to occlude the view of any 3D content (such as flow streamlines) which may sit behind it from the current angle of view (occlusion uses the physical model to clip or obstruct the view of 3D content). In our experience, the use of occlusion in these physical/digital AR experiences is essential to making the experience look as realistic as possible (particularly while the viewer walks around the model and various components come in and out of view).
How did we approach the creation of the GUI?
How the user interacted with the AR experience was a focal point for the 2D user interface (UI), with our intention being to make it as simple and easy to use as possible for people who may never have used AR before. We focused on making it easy to cycle between multiple use cases and ensuring that only the relevant information was displayed at any given time. In Vuforia Studio, it is quick and easy to define or rearrange the layout of buttons, labels and text fields using drag-and-drop functionality. We also took advantage of the in-built formatting and CSS capabilities, which are a powerful way to enhance the UI and allowed for rapid, consistent development of an appealing and highly functional user interface.
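As a rough illustration of the use-case cycling logic described above, the sketch below models it in plain JavaScript so it runs standalone. Vuforia Studio experiences are scripted in JavaScript, but in Studio this logic would toggle widget visibility through the application’s bindings and widget properties rather than a free-standing function; the use case names and widget identifiers (such as `wdgCadModel`) are hypothetical, chosen purely for this example.

```javascript
// Each use case maps to the set of overlay widgets that should be visible
// while it is active. Names here are illustrative, not real widget IDs.
const useCases = [
  { name: 'CAD overlay',     visible: ['wdgCadModel'] },
  { name: 'CFD streamlines', visible: ['wdgCfdStreamlines'] },
  { name: 'IoT sensor data', visible: ['wdgTempGauge', 'wdgPowerGauge'] },
];

// Visibility map for one use case: true only for that case's overlays,
// so switching use cases hides everything that is no longer relevant.
function visibilityFor(useCase, allWidgets) {
  const active = new Set(useCase.visible);
  return Object.fromEntries(allWidgets.map((w) => [w, active.has(w)]));
}

// Advance to the next use case, wrapping back to the first at the end.
function nextUseCase(index) {
  return (index + 1) % useCases.length;
}

const allWidgets = ['wdgCadModel', 'wdgCfdStreamlines', 'wdgTempGauge', 'wdgPowerGauge'];
let current = 0;
current = nextUseCase(current); // user taps "next" -> CFD streamlines
console.log(useCases[current].name, visibilityFor(useCases[current], allWidgets));
```

Keeping a single source of truth for “which overlays belong to which use case” is what makes it easy to guarantee that only the relevant information is on screen at any given time.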
Vuforia Studio’s integration with the ThingWorx (TWX) platform also provides the ability to stream real-time sensor information which can be displayed using gauges. Within Vuforia Studio, these gauges can be positioned within the 3D interface to display in-context sensor information (strategically located at key points within the model). In this example, we included gauges which represented variables such as temperature, power usage and humidity. The refresh intervals of these values can also be set, so the user can choose to stream information as frequently as is required (noting that for different sensors it may make sense to have significantly different refresh intervals).
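The idea of per-sensor refresh intervals can be sketched as a small scheduling function. This is a simplified standalone model in plain JavaScript, not ThingWorx or Vuforia Studio code (in Studio the refresh interval is simply configured on the data binding); the sensor names and interval values are illustrative assumptions.

```javascript
// Each sensor carries its own refresh interval: fast-changing values like
// power usage poll frequently, slow-moving ones like humidity much less so.
// Names and intervals are hypothetical examples.
const sensors = [
  { name: 'temperature', intervalMs: 5000,  lastRefresh: -Infinity },
  { name: 'power',       intervalMs: 1000,  lastRefresh: -Infinity },
  { name: 'humidity',    intervalMs: 30000, lastRefresh: -Infinity },
];

// Returns the names of sensors due for a refresh at time `nowMs`, and
// records the refresh so each sensor is not polled again until its own
// interval has elapsed.
function dueForRefresh(nowMs) {
  const due = [];
  for (const s of sensors) {
    if (nowMs - s.lastRefresh >= s.intervalMs) {
      s.lastRefresh = nowMs;
      due.push(s.name);
    }
  }
  return due;
}

console.log(dueForRefresh(0)); // first poll refreshes every sensor
```

Giving each gauge its own cadence keeps the experience responsive for fast-changing values without wasting bandwidth re-fetching data that rarely changes.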
It’s a LEGO model… so, could AR be used to show assembly and disassembly instructions?
Absolutely! A major benefit of AR is the ability for users to interact with information when and where they require it, rather than relying on the author to capture all appropriate vantage points beforehand. This is commonly seen in assembly processes, where a user might need to view the model from various angles to better understand how it is assembled or deconstructed (and the best or most likely viewing angle may be unpredictable to the person writing the instructions!). The LEGO Opera House experience included an assembly/disassembly sequence to demonstrate this key benefit of AR, helping users step through the removal or assembly of key components of the LEGO model while walking 360 degrees around the physical object. This sequence was first created in Creo Illustrate and brought into Vuforia Studio as a .pvz file – the sequences included highlighting of key parts in bright flashing colours, and demonstrated how LEGO parts were translated within 3D space to replicate the desired assembly or removal process.
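The step-through behaviour described above can be sketched as a minimal sequencer in plain JavaScript. In the real experience the steps come from the Creo Illustrate sequence inside the .pvz file and are driven by Vuforia Studio’s sequence playback; the step names below are purely illustrative, not the actual sequence.

```javascript
// Hypothetical assembly/disassembly steps; in practice these are authored
// in Creo Illustrate and delivered inside the .pvz file.
const steps = [
  'Remove roof sail section',
  'Remove upper podium',
  'Remove forecourt stairs',
];

// Move one step forward or backward, clamped so the user can press
// next/previous freely while walking around the model without running
// off either end of the sequence.
function stepTo(index, direction) {
  const next = index + (direction === 'forward' ? 1 : -1);
  return Math.min(Math.max(next, 0), steps.length - 1);
}

let step = 0;
step = stepTo(step, 'forward'); // advance to the second step
console.log(steps[step]);
```

Because the sequence is just indexed playback, the same steps work equally well as assembly (forward) or disassembly (backward) instructions.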
Want to get started with your own AR experience? Whether it’s a tabletop LEGO model or perhaps a full-sized pump, compressor, transformer, truck or even a full aircraft (!), LEAP has the experience with AR to help you deliver a successful project. You can arrange a time to speak with one of our local AR experts via this page – we look forward to working with you on your next AR project!