UX Magazine

Defining and Informing the Complex Field of User Experience (UX)
Article No. 1010 April 30, 2013

Augmented Reality and a Better User Experience

After decades on the periphery, augmented reality finally appears poised to explode into the mainstream. Consumers will soon be dazzled by products such as the much-hyped Google Glass, the Field Trip app, and the Oculus Rift, prompting many brands to explore this strange new world for the first time.

Successful development still has its challenges, however. While Hollywood has given us some thrilling examples of the possibilities in films like Minority Report, Total Recall, and Iron Man, little substantive headway has been made in real life. Some cool work exists, but there’s been far more gratuitous AR eye candy with little utility or long-term usefulness.

To alleviate some of the mystery, below are six key lessons for building AR apps that will enhance the user experience and improve end results.

1. Categorize the Experience and Plan for It

Not all apps and their use cases are created equal. Think about the environmental conditions of where the campaign will be staged, as well as the type of interaction users will have with it. Rob Manson of AR UX has a great way of thinking about the scenarios AR users are likely to find themselves in:

  • “Public”—The whole body is involved, as are large screens and full movement of limbs and torso; think Nintendo Wii or Microsoft Kinect.
  • “Intimate”—Think of the user in front of their desktop computer with a webcam, generally sitting down with their body an average of 2-3 feet from the camera.
  • “Personal”—Using a smartphone in a public 360º space, with AR browsers such as Layar and Junaio. Here the user will be standing up and/or walking and interacting with the environment around them.
  • “Private”—The newest category, including wearable tech: glasses like Google Glass or headgear like the Oculus Rift. This will be a personal and intimate experience that will be completely sensory and visual. In the case of the Oculus, it may completely remove the real-world stimulus from the user.

Identifying these scenarios upfront goes a long way toward getting interaction design and technical requirements sorted out before the heavy lifting starts.
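One way to carry these scenarios into the technical-requirements phase is to encode them as data the app can plan against. The sketch below is illustrative only: the fields, distances, and the `needs_safety_review` rule are assumptions layered onto Manson's four categories, not part of his taxonomy.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ARScenario:
    """One of the four AR interaction scenarios described above."""
    name: str
    typical_distance_m: float   # rough user-to-sensor distance (assumed values)
    user_is_mobile: bool        # standing/walking vs. seated
    full_body_input: bool       # limbs and torso vs. handheld/worn device

SCENARIOS = {
    "public":   ARScenario("public",   3.0,  True,  True),   # Wii/Kinect-style
    "intimate": ARScenario("intimate", 0.75, False, False),  # desktop webcam, 2-3 feet away
    "personal": ARScenario("personal", 0.4,  True,  False),  # smartphone in a 360-degree space
    "private":  ARScenario("private",  0.05, True,  False),  # glasses or headgear
}

def needs_safety_review(scenario: ARScenario) -> bool:
    # Mobile users interacting with the surrounding environment need the
    # real-world location and safety checks described in the next section.
    return scenario.user_is_mobile

print(needs_safety_review(SCENARIOS["personal"]))  # -> True
```

Keeping the scenario explicit like this makes it harder to slide unexamined from, say, an "intimate" webcam design into a "personal" walking-around deployment without revisiting the requirements.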

2. Get Out in the Real World before Coding

The concept of "measure twice, cut once" is especially important in building AR apps. Unlike a website, which typically has a reasonably generic set of conditions under which it is viewed, the environmental conditions of an AR app are as broad as they are challenging. Start by being a location scout and collect all the details of the physical environment to be augmented. Details matter: Is there sunlight or are there shadows? Interior or exterior? Static or motion? The more devices and lighting and environmental conditions that can be tested before building starts, the better.

Also keep the user in mind when calculating all the technical details. If the campaign features “real-world” locations in public spaces, be sure to consider issues such as reception quality (nothing worse than having a perfect location only to have zero connectivity) and safety (a post-industrial landscape may sound great in the brief but will the audience really feel safe whipping out their expensive smartphones there?). Legwork done upfront saves headaches down the road.

3. Design for the Small Screen First

Designers may use the latest 27" iMac with a native resolution of 2560x1440, but chances are most users will be on something much smaller, so design for their experience. Viewing an app on a small screen means typefaces must be clear and easy to read, and colors must be high-contrast, with interface elements positioned so they don't block the user's view of the world. This becomes especially important when designing for wearable displays, where there's even less physical space to work with. The risk of users developing eye strain or fatigue over the long run from flashing or heavy-duty animations is also much greater than on a desktop.
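"High-contrast" can be checked numerically rather than by eye. The sketch below implements the WCAG 2.0 contrast-ratio formula, which is a reasonable proxy for legibility on small screens; note that the 4.5:1 threshold used in the example is WCAG's minimum for normal text, not an AR-specific standard.

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance for an 8-bit sRGB color."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio from 1:1 (identical colors) to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# White text over a dark scene passes easily; two similar grays do not.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))        # -> 21.0
print(contrast_ratio((128, 128, 128), (100, 100, 100)) >= 4.5)     # -> False
```

Running candidate palettes through a check like this before testing on-device helps catch combinations that look fine on a large, calibrated monitor but wash out on a phone in sunlight.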

4. Challenge Users Mentally, not Physically

User fatigue isn't limited to eye strain. Regardless of how much people might enjoy a single experience, they won't want to walk around with their hands up, holding a device for an extended period of time. They also shouldn't be asked to walk around staring at a small mobile screen the entire time rather than at their surrounding environment, which is not only distracting but dangerous. Users will generally tolerate a higher degree of interaction while sitting, or while holding their mobile device at chest level or lower (up to 15-20 degrees of tilt above eye level is fine), with the arm and elbow nearly fully extended. It's surprising how heavy a tablet can get when held outstretched for an extended time.
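An app can detect a fatiguing pose and adapt, for example by pausing interaction-heavy features until the device is lowered. This is a minimal sketch assuming a typical phone accelerometer axis convention (y along the long edge, z out of the screen); the 85-degree comfort cutoff is an illustrative assumption, and a real app would read the sensor through its platform's own API.

```python
import math

def device_pitch_degrees(ax, ay, az):
    """Pitch of the device from a 3-axis accelerometer reading (in g units):
    0 degrees = lying flat, 90 degrees = held fully vertical."""
    return math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))

def should_prompt_to_lower(pitch_deg, comfort_max_deg=85.0):
    # Chest-level holds (moderate pitch) are sustainable; a near-vertical,
    # arm-outstretched pose is fatiguing over time. 85 degrees is an
    # assumed cutoff, not a measured ergonomic constant.
    return pitch_deg > comfort_max_deg

print(should_prompt_to_lower(device_pitch_degrees(0.0, 1.0, 0.0)))    # vertical -> True
print(should_prompt_to_lower(device_pitch_degrees(0.0, 0.5, 0.866)))  # ~30 deg -> False
```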

5. Frame Rates are as Important as Image Density

Director Peter Jackson made a big deal about moving up to 48fps in his recent films, but AR experiences can go as high as 60fps. There’s a big caveat, of course: this level of super-fast animation requires greater processing power. Keep in mind both the complexity of the 3D objects created and the frame rate at which they appear. Given the wide range of user devices, find a sweet spot. Running moderately complex polygons at anywhere from 48fps to 60fps will generally work well as long as GPU acceleration is added to the experience (it’s not recommended to go under 24fps). Taking the time to optimize the application before it goes out will ensure a better experience, regardless of the device.
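The trade-off between frame rate and scene complexity can be handled with a simple feedback loop: pick a target inside the 24-60fps range discussed above, then shed 3D-model detail when the measured rate falls short. The device tiers, detail levels, and thresholds below are illustrative assumptions, not values from the article.

```python
def choose_target_fps(gpu_accelerated, device_tier):
    """Pick a frame-rate target inside the range discussed above: never
    below 24fps, ideally 48-60fps. device_tier is an assumed
    'low'/'mid'/'high' label a real app would derive from the hardware."""
    if not gpu_accelerated:
        return 24
    return {"low": 30, "mid": 48, "high": 60}[device_tier]

def adjust_detail(current_detail, measured_fps, target_fps):
    """Feedback loop: drop one level of 3D-object detail when the measured
    frame rate falls well below target, raise it when there is comfortable
    headroom. Detail levels 0 (lowest) to 3 (highest) are illustrative."""
    if measured_fps < target_fps * 0.9 and current_detail > 0:
        return current_detail - 1
    if measured_fps > target_fps * 1.2 and current_detail < 3:
        return current_detail + 1
    return current_detail

print(choose_target_fps(True, "high"))                   # -> 60
print(adjust_detail(3, measured_fps=40, target_fps=60))  # -> 2
```

The point of the pattern is the "sweet spot" the article describes: rather than shipping one fixed asset complexity, the app degrades gracefully across the wide range of user devices.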

That also applies when working on the overall "feel" of the experience. Many experiences are limited to a 180º field of vision. While this can sometimes work for a simple heads-up display, most experiences should seamlessly integrate as closely as possible into the real world. That means thinking in a 3D space, even if a phone screen is flat.

6. Two Hands are Better than One

When designing a mobile AR experience for use in a public space, build a high degree of automation into the app. Users find it awkward to hold a device with one hand, look through the screen, and simultaneously tap or swipe navigation elements with the other hand. Letting users hold their device with two hands provides a better experience and allows them to focus on the experience rather than the device.

Utility is a core element of user experience, so forget about gimmicky AR and make the app both fun and useful. If today’s savvy, highly demanding users take the time to download your app and learn it, make it worth their time and effort. Invest in users, and they will invest back in you.

ABOUT THE AUTHOR(S)


Olli Siebelt is Head of Client Solutions at The Famous Group in Culver City, California, a full-service hybrid production company, handling all aspects of live action filming and editing, CG/motion graphics, digital development, and production for broadcast and the web.


Michael Marina is Director of Interactive at The Famous Group in Culver City, California, a full-service hybrid production company, handling all aspects of live action filming and editing, CG/motion graphics, digital development, and production for broadcast and the web.


Comments


Thanks for the great ideas about AR design. I found your intro curious, though. Oculus Rift is clearly VR and not AR. There's a huge difference. More subtle but also important, Google Glass is a HUD, not true AR.


Cool article, guys. But I think it is too focused on technology and not enough on the user experience itself. Also, Google Glass is not AR, for a series of technical and conceptual reasons. The Kinect/Wii are not AR either; they belong to other types of interaction: Augmented Virtuality is the opposite of AR (more digital information, less real world), and both sit within the same "world" of Mixed Reality.
The last ISMAR congress categorized these experiences more consistently than this article does; I believe there are actually only a small number of scientific articles on UX for AR.
Remember, even though the technology has almost 50 years of research behind it, the current stage for mobile devices is equivalent to the internet in the '80s, with the Mosaic browser.


I would also add to #1 that the AR use-case needs to be specifically evaluated. Is AR being added to an existing app with its legacy functionality, or is the app going to be a stand-alone AR experience? AR shouldn't be added on top of a planned or existing app just because it's the latest technology that has Madison Avenue's attention - that's almost a sure way to create a poor UX and to continue the impression that AR is limited to gimmicky implementations. Rather, the desired functionality needs to be strategically evaluated to fit with the experience that the AR display medium can offer.

I generally think of Rob's categories as 1st, 2nd and 3rd person. "Public" & "Intimate" are both 3rd person where the user is being augmented as viewed by a 3rd person - like a magic mirror. "Personal" is 2nd person where the user is viewing the augmented world, but through the filter and condensed viewpoint of a screen. And "Private" is 1st person where the user is now viewing the world as augmented with as much "suspension of disbelief" as possible. This dictates not only the general type of experience that the developer is framing, but also the UI - when ideating on functionality or how to present the augmented content, the developer needs to consider how the UI impacts the UX. In other words, which side of the glass the interaction with the content or functionality needs to (and should) take place - either actually on the user or on the screen ("near-side" - usually with motions, swipes, touches, or clicks) or out in the camera-view ("far-side" - usually with device motion or verbal cues).

Near-side UI for 1st, 2nd, or Intimate 3rd Person UX implies a much more private interaction. This interaction for 1st & 2nd Person UX takes place on the screen of the mobile device, which is generally viewed as the user's personal space and which will tend to broaden the kind of content or functionality that can be appropriately accessed. The interaction can be more discreet and/or secure.

Far-side UI for 1st, 2nd or Public 3rd Person UX implies a much more publicly visible interaction so certain kinds of content or interaction should be avoided. If developing for Glass or other eyewear, the developer shouldn't create verbal command cues about personal information, and similarly needs to limit the kind of gestural input required of the user.


Awesome article!
I strongly believe one of the key groups that will turn AR around is the user experience design community. As the cofounder of Augmented World Expo (AWE 2013), now in its 4th year and the largest AR event, I can say we put a special focus on sessions about UX design for AR.
The AWE agenda includes many design-oriented sessions delivered by top UX talent from around the world. Hope to see many of you there!

http://augmentedworldexpo.com


You mention Google Glass but not Microsoft Kinect in an article for Augmented Reality.