Augmented Reality in Retail

AR, or Augmented Reality: is it a gimmick whose sole purpose is capturing Pokemon across the globe? Or are there real-world uses? And how close are we getting to them?

Illustration by Tiffany Ta

The Seattle chapter of the VR/AR Association recently hosted a panel discussion on AR for Retail: Designing for Retail Customers. The panelists included Angela Argentati of Best Buy, Alex Goldberg of REI, Ashleigh Miller of Amazon, and Hannah Mintek of Valence.

Here are my top takeaways from the discussion. Use them to talk smarter about the opportunities and fine-tune your company’s approach to the technology.

1. Just call it 3D.

AR, VR, XR, MR… it’s confusing enough for industry folks like us to keep these acronyms straight. Unsurprisingly, it’s even more confusing for customers. But across the board, people know what 3D is. Ashleigh Miller showed how Amazon uses the term “View in Your Room” to make 3D even more understandable.

And just in case you wanted to know: 

VR - Virtual Reality. Being immersed in a totally digital environment. For now, always experienced with a headset. Example devices: Oculus, Vive

AR - Augmented Reality.  Using a device to layer a digital world on top of the real world. Best example: Pokemon Go.

Image credit: Pokemon Go Live https://pokemongolive.com/img/homepage/vid-still.jpg


Image credit: Pokemon Go Live

MR - Mixed/Merged Reality. Honestly, I don’t know why this is separate from AR, but MR is just a supercharged version of AR. While AR allows for a digital layer to be placed over the real world (like a sticker over a pane of glass), MR uses depth sensors to integrate with the real world.

For example, let’s say you want to place a 3D rendering of a lamp in a room using your phone’s camera. In AR, that lamp will show up on your screen but you won’t be able to place it behind a real object — it’ll just float in front of whatever the camera sees. MR can sense depth, so if you place a 3D lamp behind a real couch, that lamp will be partially occluded by the couch. Example devices: Microsoft HoloLens, Magic Leap
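The lamp-behind-the-couch behavior comes down to a per-pixel depth test, the same idea as a z-buffer in graphics. Here’s a toy sketch in Python — the depth map, the couch, and the lamp placement are all invented for illustration, not how any real MR device is programmed:

```python
import numpy as np

# Toy 4x4 depth map of the real scene, in meters (smaller = closer).
# A couch occupies the right half at 1.5 m; the wall behind is at 4.0 m.
real_depth = np.array([
    [4.0, 4.0, 1.5, 1.5],
    [4.0, 4.0, 1.5, 1.5],
    [4.0, 4.0, 1.5, 1.5],
    [4.0, 4.0, 1.5, 1.5],
])

# A virtual lamp placed 2.0 m from the camera, covering a patch of pixels.
lamp_depth = np.full((4, 4), np.inf)   # inf = no lamp at that pixel
lamp_depth[1:3, 1:4] = 2.0

# Per-pixel depth test: draw the lamp only where it is nearer than the real scene.
lamp_visible = lamp_depth < real_depth

print(lamp_visible.astype(int))
# [[0 0 0 0]
#  [0 1 0 0]
#  [0 1 0 0]
#  [0 0 0 0]]
```

Where the lamp (2.0 m) falls in front of the couch (1.5 m), the depth test fails and those pixels stay hidden — the couch occludes the lamp. Plain AR skips this test entirely, which is why the lamp would just float in front of everything.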

XR - Extended Reality.  This is a bucket that holds everything: VR, AR, MR, and whatever _R comes next.

2. Seeing things in 3D builds confidence in purchasing big-ticket items.

During the REI Labor Day Sale, REI made one of its tents (the Grand Hut 4) viewable in AR through its app. About 75% of the people who purchased that tent used the AR functionality.

Alex Goldberg, Senior Innovation Steward at REI, notes that the point of AR in retail is higher conversion and fewer returns. Physical retail is on the decline and purchasing online is the new norm. The problem for consumers is that they don’t know how clothes are going to fit, or whether that couch is going to look good in their living room. Obviously, returning a couch is much more complicated than returning a shirt, and AR is here to make sure you don’t have to.

3. Testing in AR is difficult.

There is no A/B testing in AR because adding the z-axis (versus just an x and y on a flat screen) makes it exponentially more complicated. The size of a room, the lighting, the amount of crap on the floor/walls, your height, your location, the color of the carpet, etc.—it all matters when placing a 3D object in your space.

Angela Argentati, Product Manager at Best Buy, discussed how conventional and inexpensive user testing via recorded screens is not helpful. With screen recordings, you can see how a user is interacting by tracking their mouse or eyes. In 3D space, the phone is essentially the mouse — so imagine those nausea-inducing screen recordings.

This means that user testing for AR at Best Buy is done in-house and in person, which means a slower process and a smaller data set to draw conclusions from. Until testing becomes easier, I don’t see companies investing heavily in creating AR mock-ups of products.

4. AR will be a game-changer... when we have the gear to go with it.

You can change the color of your room as often as you want, or your bus ride can be decorated top to bottom in Ariana Grande, while your seat-mate dons the bus in drippy black metal.
Tiffany Ta

We all remember the rise and spectacular fall of Google Glass. It was creepy, invasive, and worst of all, extremely dorky looking.

Currently, the AR headset market exists at the enterprise level. HoloLens, Vuzix, and yes, even Google Glass are being used as a virtual prosthetic for employees who work mainly in manufacturing and logistics. The headsets provide voice commands and access to training videos, plus allow for hands-free collaborative video calls, and more.

When will this tip over to the consumer market? The technology is here, but do we have the appetite for it? Will headsets ever be affordable enough for widespread public adoption? Rumors of Facebook and Apple joining the fray could drive demand.

Imagine a world where you can personalize everything to your taste. You can change the color of your room as often as you want, or your bus ride can be decorated top to bottom in Ariana Grande, while your seat-mate dons the bus in drippy black metal. The Seattle I’ve-met-you-five-times-but-sure-I’ll-introduce-myself-again phenomenon will disappear because a person’s name and what you’ve talked about will pop up right next to their head in your AR headset. You’ll control your world and be an omnipresent, all-knowing being.

On the flip side, what happens to the people left behind when we all adopt glasses (or whatever form the hardware takes)? Will the non-adopters be stranded in a barren wasteland where shop signs and public information live only in our headsets and everything in the “real world” is a blank (or worse, QR codes)? Will human interaction become even more strained when we put a digital layer over our everyday world? Who controls the amount of information you’ll be bombarded with? We already can’t control ourselves with notifications from our phones; having them sit literally on top of our faces can’t be any better, can it?

Will human interaction become even more strained when we put a digital layer over our everyday world?

5. The path to becoming an XR designer is undefined.

Hannah Mintek, Creative Director at Valence, was the best example of this idea. She transitioned from documenting the war in Georgia as a photojournalist to owning a house-painting business to working in customer service to designing for UX, which eventually led her to the 3D space. The other panelists also came from varied backgrounds: dancer, game designer, civil engineer.

To me, that’s exciting. XR is a burgeoning industry and the designers who are willing to explore this undefined medium are no doubt shaping our future. We need people with diverse backgrounds and empathy building our new (virtual) world. So...

6. Let’s get started.

View our work