
How UX Can Drive Sales in Mobile Apps

by Jeffrey Powers, Vikas Reddy, Jeremy Olson
23 min read
The creators of RedLaser, a popular iPhone app, tell the UX story behind their success.

This is an interview with Jeff Powers and Vikas Reddy, the founders of Occipital and creators of the popular iPhone app RedLaser. We became interested in their story when we learned that the differentiating factor between a somewhat unsuccessful first version and a wildly popular second version was their attention to UX.


Olson: Hi, I’m Jeremy Olson, a contributor for UX Magazine and developer of a recent iPhone application called Grades. I’m the writer of the tapity.com blog. And I’m here with Jeff Powers and Vikas Reddy who are the founders of Occipital, and the creators of a very popular iPhone application called RedLaser. So, Jeff and Vikas, could you introduce yourselves briefly and talk a little bit about your company?
Powers: Hey, thanks Jeremy, this is Jeff. I’ll tell you a bit about myself. I was working on a PhD at the University of Michigan, where I met Vikas, when I realized two different things. First was that the science of computer vision, this field, was incredible, and there’s a lot of new science—very practical stuff—in the field, but almost none of it was being used by everyday people. And second, I realized that the only way to really change that would be to start my own company. And then Vikas joined a few months later.
Reddy: Yeah, so I was actually working for a startup in New York called Xanga.com and I quit that to join up with Jeff to do the startup. So we started working on some ideas around photo organization and visualization using computer vision, artificial intelligence, things of that nature.
So we got into TechStars in 2008, which is a seed-stage incubator based here in Boulder, and went through that program. We tried to raise money after that and weren’t successful. We changed our focus to trying to get cash-flow positive and also switched to mobile, and that’s when we launched ClearCam, a super-resolution application on the iPhone that let you take 4-megapixel photos with the original camera. And that got us cash-flow positive, on very little money.
Powers: And that brought us up to present day, where we launched RedLaser.
Olson: Great. So why don’t you tell us a little about RedLaser, the iPhone app?
Powers: Sure. RedLaser is our most popular application so far. RedLaser is a barcode scanner for the iPhone, and in addition to scanning barcodes it can also check prices. The idea is that if you’re in a store and you want to know if something in front of you is a good price, you can scan it and find out within a few seconds whether it is.
We called it RedLaser because we wanted people to think of it like those checkout scanners that use red lasers. We didn’t want to call it something like “Camera Scan” because we didn’t want to emphasize the camera too much. We wanted users to think about it like they were actually pointing a red laser at a barcode.
The way that RedLaser came to be is that I came to work at our office (a basement office in a deli), and I was looking at a copy of Scientific American with the usual barcode printed on the cover. So I guess in a way you could say the idea came from Scientific American, but we didn’t get past the cover. A few days before I looked at the barcode on that cover, we had been brainstorming all of the things we could get the phone to recognize via the camera, and it sort of hit me at that point that despite the iPhone being well on its way to becoming a mainstream device, there was absolutely no effective barcode scanning technology for it.
And usually when you have an idea like that that seems pretty obvious, you get disappointed—a quick Web search reveals that something else out there is already doing what you just thought of. But in this case there really wasn’t anything. Everything that was out there was pretty terrible or required a special add-on lens. And the reason for the terrible experience with the existing applications was that the iPhone camera couldn’t focus up close. So this led to the perception that barcode scanning on the iPhone was impossible…until we came along.
Olson: So you created this great technology and combined it with a practical product idea, and then you finally got it on the App Store. So what happened?
Powers: Our first release was moderately successful. We got to the front page of Digg, which was exciting. We peaked at something like number 87 in the top 100 of all App Store apps—paid apps, specifically. And so we were pretty excited about that, but it fizzled out pretty quickly and RedLaser went back into relative obscurity. So our first release wasn’t a big hit, in the end.
Olson: But obviously something happened—the idea was obviously good—since now it’s one of the most successful apps in the App Store. So what happened that bridged that gap between then and now?
Powers: Truthfully, it just wasn’t easy enough to use. We actually thought it was pretty good, and it was indeed better than anything else out there at the time. But that wasn’t enough. Outside of very tech-savvy users, the experience just wasn’t intuitive. Specifically, the first version was photo-based. It required you to hold the barcode some distance away (about 10–12 inches) and take a still photo. In addition, there wasn’t any feedback on the device itself; you had to send a photo up to our servers, which would process the image and then respond to you. So it took a lot longer.
We thought it was pretty good. At one point I actually went home to Ann Arbor to visit my grandparents, and I showed my grandma how easy this new app that I had built was to use. And it turned out it was completely foreign to her. Even though she’d used checkout scanners for years, she just had no idea how to use it. So we really had to make the capture process more in line with her mental model for how a barcode scanner should actually work.
Olson: So you revamped the capturing experience. What exactly did you do to change it?
Powers: We totally revamped that technology in the second major version of RedLaser—version 2.0 and then on. Instead of using still photos like we did before, we used real-time scanning straight from the video feed. Rather than waiting for a server to respond several seconds later, we gave feedback in a fraction of a second. We have brackets on the screen that will turn green and inform the user that the barcode’s in range. It then informs them to stay still while the scan occurs. So you get some feedback immediately with no training.
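RedLaser predated Apple’s built-in scanning APIs and used its own computer-vision pipeline, but the interaction pattern Powers describes—decoding straight from the live video feed, with bracket feedback within a fraction of a second—can be sketched today with AVFoundation. The following is a minimal Swift illustration of that pattern, not RedLaser’s actual code; the bracket styling and the lookUpProduct hand-off are assumptions.

```swift
import AVFoundation
import UIKit

final class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()
    private let bracketView = UIView()  // the on-screen brackets described above

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Decode straight from the live video feed -- no still photo,
        // no server round trip.
        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.ean13, .ean8, .upce]  // retail barcode symbologies

        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        view.layer.addSublayer(preview)

        bracketView.frame = CGRect(x: 60, y: 200, width: 200, height: 120)
        bracketView.layer.borderWidth = 2
        bracketView.layer.borderColor = UIColor.white.cgColor
        view.addSubview(bracketView)

        session.startRunning()
    }

    // Fires within a fraction of a second of a barcode entering the frame.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let value = code.stringValue else {
            bracketView.layer.borderColor = UIColor.white.cgColor  // nothing in range
            return
        }
        bracketView.layer.borderColor = UIColor.green.cgColor  // barcode in range
        session.stopRunning()
        lookUpProduct(barcode: value)  // hypothetical hand-off to the results screen
    }

    private func lookUpProduct(barcode: String) {
        // ...query a product database and drop the user straight into results...
    }
}
```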
And then the accuracy of the process was greatly improved. One of the things we did was specifically focus on reducing the chances that the scanner would decode a barcode incorrectly—by that I mean, make an incorrect guess about what the barcode is—which is a big departure from version 1, where the application would always make a guess. Even if it wasn’t sure at all whether it was right, it would always make a guess. And that frustrated some users because you’d scan a barcode and it’d come back with complete gibberish. We thought that was better than returning nothing, but in version 2 I think we realized it’s better to return nothing than to return something completely wrong. Once we did all of these things, there was a threshold we hit where people started telling their friends about the application.
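The interview doesn’t spell out how version 2 suppressed wrong guesses. One common technique for trading recall for precision in a frame-by-frame decoder—purely an assumption here, not necessarily RedLaser’s mechanism—is to accept a result only after several consecutive frames agree on the same value, and otherwise return nothing:

```swift
// A minimal sketch: trust a per-frame decode only once the same value has
// been seen in several consecutive frames. The threshold is illustrative.
struct DecodeStabilizer {
    private var lastValue: String?
    private var streak = 0
    let requiredStreak = 3  // consecutive agreeing frames before we accept a decode

    // Feed the (possibly wrong) per-frame guess; returns a value only when confident.
    mutating func accept(_ candidate: String?) -> String? {
        guard let candidate = candidate else {
            lastValue = nil
            streak = 0
            return nil  // better to return nothing than a wrong guess
        }
        if candidate == lastValue {
            streak += 1
        } else {
            lastValue = candidate
            streak = 1
        }
        return streak >= requiredStreak ? candidate : nil
    }
}
```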
Olson: So this is version 2 now, you’ve made these changes…I noticed that for version 2, you got a lot of great press; I think TechCrunch and a bunch of different places covered you. So what’d you do to get their attention? What did you do to get noticed?
Powers: The new version took off by word of mouth, really. And although you’ll see a lot of great press now if you do a historical search, we didn’t really get any at first. Remember it was version 2, so it was an update. You’re launching an application now yourself, and I think you’ll find that when you come out with version 2 it’s going to be really hard to get people to pick up that story—just because it’s an update, it’s not that newsworthy. So we had a lot of trouble getting people to pick it up.
But nevertheless, after the update that really increased the quality of the experience, new users per day started to grow exponentially. It literally doubled several days in a row, taking the app from the obscurity it had fallen into after that first launch all the way up to the top five in the App Store, just by doubling and doubling and doubling. Based on that exponential growth, we were able to infer that the growth really was word of mouth.
Olson: This was before you had any press for it?
Reddy: The press was actually interesting because it was probably an effect rather than a cause. The word of mouth got us popular, and that then got us the attention of the press that wouldn’t cover us before. And it probably helped that it was holiday season as well, and there was this new idea about mobile and shopping that helped it get picked up by some pretty major press.
Olson: This really shows how word of mouth can push an application to the top charts without even the help of the press, which is pretty amazing to me. Given that word of mouth played such a big role with this release, what do you think motivated people to share the experience? How can we make viral, shareable experiences that people want to share with their friends?
Powers: In the case of RedLaser, the experience was new—there was a lot of novelty—and it was something you could only do on a mobile device; you couldn’t really do it on a computer. And it was something that mimicked an experience people had every day. They saw these barcode scanners doing this, but they couldn’t do it themselves on a mobile device. So you had this big novelty factor—now you can do something you could never do before on a mobile device—yet you’re prepared to understand it because you’ve seen it day in and day out. So we were able to give them a novel experience that they immediately picked up on, one that was completely easy for them to use and share with their friends.
And I think a similar pattern could be followed: essentially look for something where people are prepared to understand it, but they’re not able to use it today because it just doesn’t exist. Create an application that can then allow them to do that thing on their mobile device—and by the way, if it’s something that you can only do on a mobile device, that’s going to add to the novelty and the cool factor of it.
If you put all those things together, I think you have something people are going to want to tell their friends about, especially as they get a new iPhone and want to show off. “Why did I get this new iPhone? What’s so cool about it?” They’re sort of bragging to their friends, and they can show off these experiences. If, on the other hand, it was a new experience that was very hard to understand and none of your friends would even get it if you explained it to them, you wouldn’t want to tell them. With barcode scanning it was completely obvious and easy to explain, so I think that led to a lot of word of mouth.
Olson: Speaking of this novelty factor, you mentioned people showing it to their friends…what kinds of users were these? Were these mainly serious users? Could you talk about the kinds of users that you got?
Reddy: There was definitely a novelty factor to the application. We anticipated it a little bit, but the extent to which users downloaded the app or thought it was cool just for the novelty of walking through their cabinets scanning barcodes for fun was pretty interesting. But the core use is still price comparison; a lot of people are using it for that. And then there has been some use of it for inventory. I’d say those are probably the three main categories. We released an SDK so people can take our technology and use it in other applications, and we see that as a way for people to take on other uses.
Olson: Let’s talk a little bit more about the user experience of RedLaser. What were some of the other aspects of the user experience other than the capture interface that you think contributed to RedLaser’s success?
Powers: The other aspect of the RedLaser user experience that was important was just an extremely simple experience, or an extremely simple interface. The capture was good, and you were able to scan something and be immediately dropped into results. You didn’t have to go through two or three steps to get to results. There was one button press to bring up the scanner, and there were actually no more button presses after that—you were done. So you press one button to bring up the scanner, you scan the barcode, and you’re dropped into information.
And so I think that dead simplicity was key to the user experience. In addition, we did a lot of work to increase the odds that after you completed the scan you’d actually get something back, because we knew it would be a bad user experience if you scanned a barcode and the application said, “I don’t know what that is,” and then you scanned another barcode and it said the same thing.
So we put a lot of effort into increasing the odds that it would at least be able to tell you something—maybe who the manufacturer is, or what the product title is at a minimum—and then, in the best case, return a bunch of prices and all kinds of other information. So the keys were the dead simplicity and the odds that you got something back, making it a pleasant experience after you did complete that initial scan.
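A sketch of that tiered fallback might look like the following. The type names and source structure are hypothetical, purely to illustrate the “return at least something” idea; the interview doesn’t describe RedLaser’s actual backend.

```swift
// Hypothetical types illustrating a tiered "return at least something" lookup.
struct ProductInfo {
    var title: String?        // sparse case: may be all we know
    var manufacturer: String?
    var prices: [Double] = [] // best case: full price-comparison data
}

// Try richer sources first (price databases), then sparser ones
// (UPC registries that may only know a title or manufacturer).
func lookUp(barcode: String,
            sources: [(String) -> ProductInfo?]) -> ProductInfo? {
    for source in sources {
        if let info = source(barcode) {
            return info  // even a bare manufacturer beats "I don't know what that is"
        }
    }
    return nil  // only after every source has come up empty
}
```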
Olson: Your app definitely got a lot of good word of mouth and obviously became very popular. And a lot of apps become popular, but only for a time. I’ve seen so many apps get into the top 100 of the App Store and then fall out. Your app is still in the top 100 and seems to have been doing very well for a very extended period of time. What were some of the factors that you think contributed to that continued success?
Powers: The timing was great—and we didn’t necessarily anticipate this—but the update to RedLaser occurred in September and that was a couple of months before the holiday season. So what we had was an initial spurt due to the novelty factor and word of mouth growth, then we had a little bit of a decrease in actual sales and popularity.
But then this led right into the holiday season, just as we were getting some kind of critical mass of users and people in the media were starting to learn about the application as well. As people started shopping and the media started running shopping stories, it was pretty obvious to connect shopping with mobile, which was another hot topic. And so we ended up getting a lot of stories in the major press and media—on television, in the New York Times, everywhere—about RedLaser, because it was this perfect combination of mobile and shopping.
And so that helped us sustain users for quite some time, all the way through December. After that we still managed to maintain fairly high popularity, and I think a lot of that is due to the fact that word of mouth growth doesn’t tail off as fast as an ephemeral press mention or the initial launch of your application. I think what’s contributing to this is that whenever someone gets a new iPhone, their friends that already have iPhones and applications mention RedLaser as one of the first applications to get.
So we keep getting a lot of new sales just based on the fact that there are new iPhones coming out every day and RedLaser’s one of the apps that’s often mentioned. And we’ve seen this happen a lot of times on Twitter, where someone will say they got a new iPhone and then you’ll see someone else telling them all the apps they should get, and RedLaser’s often among them. Those are two key factors.
And then the other theory we have—which we haven’t been able to prove, but it’s an interesting theory nonetheless—is that since barcodes are printed on just about everything, they’re kind of a constant mental reminder of the application. Since you see barcodes every day, we think it possibly increases the odds that you’ll remember RedLaser among the other hundred apps on your iPhone. It’ll stay top of mind, so when someone asks you what apps they should get, you might remember RedLaser just because you stared at a barcode five seconds earlier.
Olson: One of my friends got an iPhone a while back, and the first app he showed me was RedLaser. So now that your app’s been out there for a little while, how has seeing what people are actually using your app for affected your strategy and your plans for the future?
Reddy: The only big thing we did was add two features centered around food—nutrition information and some allergy information—because we saw that a lot of people were scanning foods, partly because when you first download the app you’re in your house and the first things around you are probably food or other household items. Other than that, though, we’ve stayed pretty true to the goal of price comparison and product information. The SDK is supposed to address the other possible uses of barcode scanning. We anticipated that this technology isn’t just for price comparison; you can actually use it for many, many purposes.
Powers: And to add to that, the SDK (since we haven’t talked about it yet) is essentially a component we’ve created that other iPhone developers can use to add barcode scanning to their own applications. As Vikas just mentioned, we didn’t think we could create every application that uses barcode scanning, so we enabled others to hopefully create all the other cool applications as well.
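As a hedged sketch of what embedding such a component could look like in a host app—every type and method name below is hypothetical and illustrates the idea of a reusable scanning component, not the actual RedLaser SDK API:

```swift
import UIKit

// Hypothetical SDK surface: a drop-in scanner screen plus a delegate callback.
protocol BarcodeScannerDelegate: AnyObject {
    func scanner(_ scanner: BarcodeScannerController, didScan barcode: String)
}

final class BarcodeScannerController: UIViewController {
    weak var delegate: BarcodeScannerDelegate?
    // ...presents the camera view and runs the recognition pipeline...
}

// A host app (say, an inventory tracker) only handles the decoded value.
final class InventoryViewController: UIViewController, BarcodeScannerDelegate {
    func startScan() {
        let scanner = BarcodeScannerController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func scanner(_ scanner: BarcodeScannerController, didScan barcode: String) {
        dismiss(animated: true)
        // ...add `barcode` to the inventory list...
    }
}
```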
Olson: Let’s move to more general user experience on the iPhone and your thoughts on that. Based on your success with RedLaser, what do you think are the most crucial principles of user experience in iPhone apps?
Reddy: From what we’ve seen so far—obviously this is from our perspective, this isn’t globally true—simple, easily understandable actions with high reward seem to do really well on mobile. For example, Shazam, our own application, or tons of games like Doodle Jump—that game’s been doing really well for the last couple of months—where it’s just a very simple action but the payoff in enjoyment, information, or any kind of utility is high. And the apps that you couldn’t really do on non-mobile platforms seem to do best—the ones that are inherently mobile, for example, using the accelerometer, or the GPS, or the compass, things you couldn’t really do sitting at your desktop, that are made specifically for mobile.
Olson: I’ve definitely seen a lot of mistakes made in terms of user experience in iPhone apps. What do you think are the biggest mistakes developers make when developing for the iPhone?
Reddy: I’ve seen a lot of cases where people are trying to port a desktop or Web application to mobile, but instead of focusing on what’s good or unique about the mobile version, they try to jam all the features into it. Some people have actually handled this really well, avoiding clutter. One really good example I’ve seen is the New York Times; they didn’t just take their website and try to jam every single feature and every link into their mobile version, they pared it down and figured out the core information people want to see. And, interestingly enough, I personally prefer the mobile app to the Web one.
I think the other mistake is complex value propositions. There are a lot of cases where you have to jump through a couple of hoops before the app becomes useful to you, which is an issue because people seem to have a shorter attention span on mobile. So if you have to do X, Y, and Z before you get some kind of reward—set up an account, add a bunch of things to a list before the app is useful to you—you don’t get that immediate payoff, and you risk the person uninstalling it or losing interest.
Olson: I’ve definitely seen that. When the first thing you see is a signup screen, or a register screen, or a login, that’s not a good sign. I want to talk about learning user experience. Am I correct to say that neither of you really came into this whole experience with a user experience background? Then how has your experience with RedLaser changed your view of UX, if at all, and what would you suggest to other developers who aren’t necessarily trained in user experience who are seeking to learn more about it?
Powers: Yeah, you’re correct, we had essentially no background in user experience, especially nothing formal. We were both engineers and there isn’t a lot of focus on user experience, at least in engineering at the University of Michigan. So no, we didn’t have much experience at all. But we were users, as well, so we knew what made sense from a user perspective.
But that said, we did learn a lot. Some of the things we realized—and these may not be strict changes in our thought process so much as an evolution of it—include that you need to have something that is easy to use without any training at all. Barcode scanning in our first version required some training, and because users have a very short attention span—especially on mobile—that didn’t fly. Even though it was the best technology for the purpose, it required training and thus it kind of failed.
Another thing we realized, which came to our advantage early on, was that we were able to be successful even without the greatest database in the backend because the user experience—that initial capture experience—was very good. When we launched version 2.0, RedLaser still didn’t have the greatest database backing it, yet it became probably the most popular barcode scanning application on the iPhone even while its database was still fledgling.
I think the takeaway there is that user experience can trump the end value, because with a lot of applications you’ll never get to see the end value—you can’t get past that initial user experience hump. That’s something we realized, and it worked to our advantage because we were focused on that capture experience more than anything.
Another thing we’ve learned is that you shouldn’t worry about adding a lot of features to the application if users can’t even use them or get to them. So focus on the initial features people are absolutely going to use, and later add the features that are nice to have, because there’s no sense in throwing a big feature list at everyone if they’re never going to get to use most of it.
And then the last thing we learned is that the best way to find out how your user experience works is just to launch it. Don’t worry too much about the first version being terrible, because the truth is not a lot of people are going to see the first version if it’s terrible. We were a little over-concerned with that, and in the end the initial users who told us our first version was terrible accounted for maybe 0.1 percent of our total user base. So it isn’t really worthwhile being too afraid of launching something that’s not quite there.
Olson: Alright, so now let’s talk about Occipital’s future—what you guys are working on next. I know you’re interested in augmented reality. How would you describe augmented reality, and what do you think people think when they hear that term?
Reddy: I think broadly it means overlaying virtual objects and information onto real-world scenes. On mobile devices today, that’s typically a heads-up display on top of a video feed. In general, you can think of it as Terminator vision or, if you want something a little less evil, Iron Man vision: when he looks at things through his helmet he often sees information overlaid on top of the scene in front of him. It’s a pretty broad field, with a variety of technologies trying to achieve this overlay.
Olson: I’ve messed with a lot of augmented reality experiences, and they tend to be (at least the ones I’ve tried) a bit unsatisfying—they have a kind of novelty factor. Use it once, it’s kind of cool, but it’s not really that useful. So what do you see as the downsides of the current technology?
Powers: I think that’s a common experience: there’s a bit of novelty in trying out a new AR application, and then there’s just something about it that isn’t quite satisfying or useful. What we think that is, really, is the way people are going about augmented reality on consumer devices today: relying on the compass, the GPS, and the accelerometer to understand where the user is and what they’re looking at.
And that works to some degree, but it tends to break down a lot. Compasses, specifically, are notoriously inaccurate anywhere near a piece of metal, and cities (which is where these AR applications are particularly interesting) are actually where compasses work the worst. Not to mention that GPS works pretty terribly in urban canyons as well.
And so you get this jumpy experience where the information being overlaid bears no real correlation to the screen itself, or to the video feed itself, and I think you get really confused. Then you start to wonder: is this really better than looking at a map, or some top-down view where everything isn’t jumping around quite as much? That’s the conclusion we’ve come to: the current approach to augmented reality lacks a true correspondence between virtual information and the real world. And until you can address that, the user experience is going to be jumpy, it’s going to be floaty, it’s going to be frustrating, and so you’re not going to use it.
In fact, even if you just turn on the compass function on the iPhone 3GS, you can get frustrated just because it isn’t pointing in the right direction. And I think Apple realized that as well; they made it so that compass feature wasn’t on by default because they knew users were going to get a little frustrated by it. Those approaches just don’t get you to the point where AR is truly intuitive and useful.
Olson: So that being said, what is your vision for the future of augmented reality?
Reddy: In our vision, we see computer vision specifically as the way to bridge this user experience gap. If you take that technology and advance it far enough, it enables pixel-accurate augmentation, which means you can easily maintain the illusion of virtual objects existing in the real world—meaning if there’s a virtual character on a tabletop, it doesn’t jump around as you move your phone or your device around. That’s what we see as the future, in the sense of bridging that gap.
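That vision has since materialized in frameworks like Apple’s ARKit, which uses exactly this kind of vision-based world tracking. Here is a minimal Swift sketch of a virtual object pinned to a detected tabletop; the scene contents are illustrative, and ARKit itself postdates this interview.

```swift
import ARKit
import SceneKit

final class TabletopViewController: UIViewController, ARSCNViewDelegate {
    private let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.delegate = self
        view.addSubview(arView)

        // Vision-based world tracking with horizontal plane detection --
        // this is what keeps virtual objects pinned as the device moves.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arView.session.run(config)
    }

    // Called when ARKit detects a surface such as a tabletop.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Attach a simple box to the tracked anchor; because the anchor is
        // updated by visual tracking, the box doesn't drift or jump around
        // the way compass/GPS-driven overlays do.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0.05, 0)
        node.addChildNode(box)
    }
}
```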
Olson: Can you give any use cases for that kind of technology?
Powers: Sure. One example of a very relevant use case, especially in the context of RedLaser, is product visualization or product preview. You can imagine pointing your camera at your living room and seeing what it would look like with a certain model of couch over in the corner, or pointing your camera at your kitchen counter and seeing what a new espresso machine would look like sitting there.
And to the extent that that can be a realistic experience, it’ll let you feel confident ordering things online that you’d normally have to go into a store to see. Especially when you think about furniture—people typically have to see it before they buy it. But with augmented reality, particularly very accurate positioning, you can start to get comfortable with doing that kind of thing.
Another example that we think about a lot is in gaming. You can imagine pointing a camera at a tabletop and playing a game where there’s creatures running around your table, jumping off of real-world objects and doing things like that. So you can create a game experience where the real world is actually kind of the level, so we think there’re some interesting things there.
You can think of a lot of other examples, like outdoor navigation. One of the examples we’ve posted on our blog is a video showing how you could overlay the route someone can take to get from point A to point B on top of their video feed. In some sense this is being done in car navigation systems: they show you the route in front of you, but not on top of the actual video feed. At the point where we can actually show it on top of a real video feed, it becomes very tangible, very easy to see and follow, whereas looking at a street name on a screen and trying to figure out how it matches up with the street in front of you can be confusing.
But again, until we can get those very precise overlays, it’s almost worse to try to overlay it and tell you the wrong direction. So better augmented reality navigation is coming, and that’s an application I think is interesting.
Reddy: One idea a lot of people bring up to us is that we should build an app where you hold your phone up to somebody’s face and it recognizes them and gives you information about them. But I just don’t think that’s going to happen until we have heads-up displays, because you can imagine how it would feel to go up to somebody and hold your phone up to their face while it tries to recognize them. It’s kind of a ridiculous user experience. It’s definitely possible in the future once we get some sort of glasses that can overlay information.
Olson: Sounds pretty exciting, I’ll be looking forward to what you all come up with. It’s been great talking to you guys, Vikas and Jeff, thanks so much for the interview.
Powers: Thank you.
Reddy: It was a pleasure.
Jeffrey Powers
Jeffrey Powers was a PhD student at the University of Michigan in Ann Arbor, but left early to start Occipital after realizing that the fastest way to make an impact was to start a company. While in Ann Arbor, he led a robotics research project involving autonomous flying blimps, was elected chapter president of the engineering society Tau Beta Pi, helped build a collaborative filtering engine at Xanga.com, and spent two summers as an intern at the NSA.

Vikas Reddy
Vikas Reddy is co-founder of Occipital, a mobile computer vision startup based in Boulder, CO. In his almost non-existent spare time, he writes at AwkwardRules.com, runs on various trails, plays tennis, and reads sci-fi and history. Vikas has a bachelor's degree in Computer Science Engineering from the University of Michigan, with minors in Mathematics and History. He previously worked at the NYC-based startup Xanga.com and at IBM.

Jeremy Olson
Jeremy has always been right at the intersection of design and software. As a homeschooled kid, he loved art and was delighted to discover software development as an outlet for his creativity. After ten years of building software, Jeremy landed his first big success at 19, as a sophomore at UNC Charlotte, when he built an iPhone app called Grades. Apple not only featured Grades on the App Store homepage but awarded the second version the coveted Apple Design Award in 2011. Grades has since become one of the most popular Education apps on the App Store, featured in national press such as Fox News and the Huffington Post. Now a senior, Jeremy has transformed his company Tapity into a full-time business, and his team works with startups and big brands alike to craft delightful apps for iOS.
