January 20, 2020 | 10 Min

How to drive engagement with Augmented Reality: Part 1

Rhys Simpson

Augmented Reality (AR) refers to a set of technologies that combine the real and virtual (computer-generated) worlds. A real-world subject, typically captured via a live video feed, is 'augmented' with extra layers of digital information, making the virtual elements appear as if they exist in the real world. Once the realm of science fiction, AR is now widely used in mobile gaming, and some retailers have seen the opportunity to use the technology to enhance the shopping experience and drive conversion by enabling their products to be viewed in a real-world context.

Both Ikea and Wayfair have taken advantage of the technology to allow customers to see how a particular piece of furniture will look in the context of their own home. With a single tap, a user can place the selected item wherever they want to see it within the camera feed. This simplicity makes AR experiences ideal for use on a storefront, giving users a quick way to make a more informed purchase.

There’s so much to say about AR that we’ve divided this blog post into two parts. In part 1 we’ll examine the current state of AR technology on the web and take a look at the model viewers and formats supported by iOS and Android. In part 2 we’ll dive into the details of delivering and consuming AR capable content from Amplience Content Hub.

AR model viewers

Recent advances in phone technology and AR tracking software have greatly improved the quality of AR imagery and the usability of AR applications. Until recently, AR on mobile was only possible through dedicated apps, reaching only those users invested enough to download them.

Mobile apps are a great way to engage users, but the reality is that the majority of visits are via the browser. We want AR content to be available regardless of whether the user has downloaded an app or not.

One approach is to make use of WebXR: a set of standards designed to let JavaScript applications support VR and AR features, used alongside WebGL to render virtual objects in a real-world space. WebXR is very capable and provides full control over the environment, but the downside is that it is a new standard and browser support is limited. The browsers that do support it currently lock it behind a flag or a special build of the browser.
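If you do want to experiment with WebXR, a page can feature-detect it before falling back to the platform viewers described below. This is a minimal sketch, assuming the browser exposes the WebXR Device API; in unsupported browsers the check simply fails gracefully.

```html
<script>
  // Minimal sketch: check whether this browser can start a WebXR AR session.
  // If it can't, fall back to the Quick Look / Scene Viewer approaches below.
  if (navigator.xr && navigator.xr.isSessionSupported) {
    navigator.xr.isSessionSupported('immersive-ar').then(function (supported) {
      console.log('WebXR immersive-ar supported:', supported);
    });
  } else {
    console.log('WebXR is not available in this browser');
  }
</script>
```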

Quick Look and Scene Viewer

A better way of reaching a large number of mobile devices is to use the Augmented Reality model viewers built into the latest versions of iOS and Android. With the release of Apple’s ARKit in iOS 11 and Google’s ARCore (released in 2018 for Android Nougat and later), a large percentage of smartphones now support a well-developed AR framework. These frameworks allow apps and browsers to open and preview 3D models in AR using two platform-specific technologies: Quick Look on iOS and Scene Viewer on Android.

iOS Quick Look, introduced in iOS 12, allows any application to link to a 3D model that can be placed in Augmented Reality. Simply by opening a model file, usually from a link, users are given the option to view it with an orbit camera or place it in the real world using ARKit. An example from the Quick Look demo page is shown below, together with an image of the object viewed in AR.
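For illustration, launching Quick Look from a web page is just a link to a USDZ file. The asset URLs below are hypothetical; Safari opens the model in AR Quick Look when the anchor carries rel="ar" and directly wraps a single image.

```html
<!-- Hypothetical asset URLs. Safari launches AR Quick Look when an anchor
     with rel="ar" wrapping a single <img> points at a USDZ file. -->
<a rel="ar" href="https://example.com/models/sofa.usdz">
  <img src="https://example.com/images/sofa-thumbnail.jpg" alt="Grey three-seater sofa">
</a>
```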

The ARCore equivalent on Android is Scene Viewer, an applet that can also be invoked from a browser. Although OS version adoption isn’t as high as on iOS, it’s still the best means of delivering AR content to Android users.
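On Android, a web page can hand a model to Scene Viewer using an intent link. The sketch below uses a hypothetical model URL with the intent format documented for Scene Viewer; mode=ar_only skips the 3D preview and goes straight to AR.

```html
<!-- Hypothetical model URL. The intent hands the GLB file to Scene Viewer,
     with a fallback page for devices that can't open it. -->
<a href="intent://arvr.google.com/scene-viewer/1.0?file=https://example.com/models/sofa.glb&mode=ar_only#Intent;scheme=https;package=com.google.ar.core;action=android.intent.action.VIEW;S.browser_fallback_url=https://example.com/products/sofa;end;">
  View in your space
</a>
```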

Driven by native AR libraries, Quick Look and Scene Viewer support a wide range of features such as robust tracking, light and environment-map estimation, plane detection and more. All of these combine to create a very stable experience across a wide variety of phones. Both allow a loaded 3D model to be placed on any detected floor plane, making them ideal for previewing products in the real world.

Bridging the gap with <model-viewer>

3D model formats are complicated, and to add to the complexity Apple and Google have chosen to support different ones. For iOS, Apple have chosen USDZ, a relatively open model format with a focus on physically based rendering. On Android, Google use GLTF, a JSON-based model format designed to minimize the processing needed to load a model into graphics APIs such as OpenGL, and which supports Physically Based Rendering (PBR) materials and other features through extensions. Both formats can bundle all related assets, such as textures, into a single model file (GLTF via its binary GLB packaging), making them easy to download and hand off to the native viewer. However, each platform supports only its respective format, so to target both Android and iOS you must handle both.

To make it easier to handle multiple 3D model formats, Google have created a web component called <model-viewer> that renders a regular interactive model preview in the browser and supports both AR platforms with relative ease.

To use <model-viewer>, just import the component using its script tags and embed a GLTF model anywhere on a web page with the src attribute. The model will be loaded and displayed as an interactive 3D preview, and if the element includes the “ar” attribute the user will be able to tap a button to open the model in the platform’s native AR viewer. However, if you want to support iOS as well as Android, you’ll also need to reference the USDZ format of the model, so it doesn’t solve the problem of having to supply the model in multiple formats.
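A minimal example of that usage might look like the snippet below. The script URL, asset paths and the ios-src attribute for the USDZ variant reflect the <model-viewer> releases available at the time of writing; check the project documentation for current details.

```html
<!-- Load the web component (illustrative CDN URL), then embed the model.
     src supplies the GLTF/GLB for browsers and Android; ios-src supplies the
     USDZ that Quick Look needs on iOS. -->
<script type="module" src="https://unpkg.com/@google/model-viewer/dist/model-viewer.min.js"></script>

<model-viewer
  src="https://example.com/models/sofa.glb"
  ios-src="https://example.com/models/sofa.usdz"
  alt="Grey three-seater sofa"
  ar
  camera-controls>
</model-viewer>
```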

An image from the <model-viewer> demo is shown below.

Providing AR capable content: what we need

Let’s revisit the example we discussed in the introduction and work out what we need to build it using the <model-viewer> approach outlined above.

A furniture retailer wants to use AR to let customers see how an item looks in the context of their own home. The user journey might start with the user browsing to a product page, then picking an AR model from a slider of product images in order to view the product as it would appear in their own front room.

Here's how a sofa might look when viewed in AR.

What do we need to make this a reality? We’ll assume we want to support both iOS and Android users.

  • We need two different models: one in the GLTF format required by Android’s ARCore and <model-viewer>, and a second in the USDZ format required by iOS Quick Look.

  • We’ll need a thumbnail image to display in the slider, and to use as a placeholder while the asset loads.

  • A key element is a DAM (Digital Asset Management system) capable of storing and serving GLTF/GLB/USDZ models; we’ll use Amplience Content Hub.

  • We need a product viewer capable of presenting our thumbnail image alongside other images of the product, and of launching either the 3D AR experience (in the correct format for the mobile OS) or a 2D image depending on the selection. Amplience viewer-kit is a set of open-source components designed to speed up development of exactly these kinds of viewers, so we’ll start there.

  • There also needs to be a way to link our AR models and thumbnail with a set of standard 2D product media. This is a key requirement because most AR experiences will augment existing product imagery or video. Media sets are a core feature in Content Hub and support mixed media types as standard, so we’ll use these to add our 3D model.

  • There needs to be a way to link the two formats of the model with the thumbnail so the viewer can determine which model to launch when the thumbnail is selected. We already mentioned that media sets can be used to link assets, so to solve this problem we can add all related assets to the set and use matching names to tie them together (a sketch of this matching approach follows the list).
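As a sketch of that matching-names idea, the viewer could resolve the right model variants for a selected thumbnail as shown below. The media set structure here is purely hypothetical and simply stands in for whatever the DAM returns; part 2 will cover the real Content Hub integration.

```html
<script>
  // Hypothetical media set; in practice this data would come from the DAM.
  var mediaSet = [
    { name: 'sofa-thumbnail', type: 'img' },
    { name: 'sofa', type: 'gltf' },
    { name: 'sofa', type: 'usdz' }
  ];

  // Find the model variants that share a base name with the selected thumbnail.
  function findModels(thumbnailName) {
    var baseName = thumbnailName.replace(/-thumbnail$/, '');
    return {
      gltf: mediaSet.find(function (a) { return a.name === baseName && a.type === 'gltf'; }),
      usdz: mediaSet.find(function (a) { return a.name === baseName && a.type === 'usdz'; })
    };
  }

  // e.g. { gltf: { name: 'sofa', ... }, usdz: { name: 'sofa', ... } }
  console.log(findModels('sofa-thumbnail'));
</script>
```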

That’s all for this post. We hope it served as a good introduction to AR and its current state on the web. In part 2 we’ll walk you through an end-to-end solution showing how AR can be leveraged in mobile web content.