
Tech Companies: If You Create Addicts, You Need to Help Them

by Nir Eyal

How could tech companies help people become less addicted to their products? And should they?

Two years ago, I published a book on how to make products more habit-forming. The book became a bestseller and I’m frequently asked to consult with companies — particularly tech companies — looking to make their goods and services stickier and harder to stop using.

Unfortunately, making things more engaging also makes them more potentially addictive.

The techniques I describe in the book, intended to help product designers build healthy habits in their users (like using a wellness app, keeping better track of personal finances, or staying in touch with family and friends), are the same tactics some use to keep people unhealthily hooked.

The solution is not stripping out what makes these products engaging; rather, it’s helping the addicts.

Luckily, the two-way nature of Internet-connected services means companies can identify, message, and assist people who want to moderate use.

Use and Abuse

For example, instead of auto-starting the next episode on Netflix or Amazon Video, the binge-inducing video streaming services could ask users if they’d like to limit the number of hours they watch in a given weekend.

Online games could offer players who cancel their accounts the option of blacklisting their credit cards to prevent future relapses.

Facebook could let users turn off their newsfeeds during certain times of the day.

And rather than making it so fiendishly difficult to figure out how to turn off notifications from particularly addictive apps, Apple and Android could proactively ask certain users if they’d like to turn off or limit these triggers.

What to Do About It

These services know the usage patterns of each and every user. They don’t need to bother everyone, just those people showing patterns of behavior indicative of a problem.

For example, a trigger based on the number of hours a user spends on a service could prompt the company to reach out and suggest ways to cut back or disable certain features.
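As a thought experiment, such a trigger could be as simple as comparing each user's weekly hours against a cutoff and flagging only those who exceed it. This is a minimal sketch; the function names, data shape, and the 20-hour threshold are all illustrative assumptions, not any company's actual system.

```python
# Hypothetical usage-threshold trigger: flag users whose weekly time on a
# service suggests problematic overuse, so the company can reach out.
from dataclasses import dataclass


@dataclass
class UsageRecord:
    user_id: str
    weekly_hours: float


def flag_heavy_users(records, threshold_hours=20.0):
    """Return IDs of users whose weekly usage exceeds the threshold.

    These are the users a service might proactively message with an
    offer to help cut back, rather than bothering everyone.
    """
    return [r.user_id for r in records if r.weekly_hours > threshold_hours]


records = [
    UsageRecord("alice", 4.5),
    UsageRecord("bob", 31.0),
    UsageRecord("carol", 22.5),
]
flagged = flag_heavy_users(records)  # only "bob" and "carol" exceed 20 hours
```

In practice a real system would look at patterns over time rather than a single week, but the point stands: services already have this data, so the detection step is trivial.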

Indeed, the benefit of all the data being collected about us these days is that companies could use this information to help people who may be harmed by their products’ overuse.

Clearly, there are many things tech companies could do to help users break the cycle of addiction. Whether they actually do anything, however, is another matter.

There are some industries and companies that can’t and won’t help addicts. It’s not just dealers of illegal drugs who benefit from harmful addictions. Legitimate industries depend on addicts as well.

For example, those ubiquitous ads for online games like Clash of Clans and Candy Crush are fishing for what the industry calls “whales” — the 0.15 percent of players who bring in 50 percent of revenue.

In an industry where the cost of acquiring a player is just barely less than the revenue made per user, whales tip the scales to profitability. Without these extreme customers, their businesses aren’t viable.

We Can’t Let Silicon Valley Become Las Vegas

Similarly, the casino industry depends on a disproportionate share of revenue coming from a small group of likely addicted gamblers, some of whom are known to wear adult diapers to avoid having to stop playing.

Many industries earn an outsized proportion of their revenue from their most loyal customers. The fast food industry, for example, amusingly calls the 20 percent of diners who account for 60 percent of its revenue “heavy users,” according to the Wall Street Journal. While there’s nothing unethical about being a patron’s favorite brand, a line is crossed when a company knowingly exploits people with addiction problems the way the gaming and gambling industries do.

For example, though most American casinos are required by law to have “self-exclusion” programs for gamblers who wish to stop their addiction, casinos have been known to welcome problem gamblers back with open arms. A similar situation revealed itself during a discussion on ethics I recently led at a publicly-traded online gaming company. The product managers confessed that they also allow people to play even when the players have explicitly asked to be cut off.

Casinos escape liability through a legal loophole protecting them from prosecution. Nevertheless, it is unethical to accept patronage from someone a company knows wants to stop using your product but can’t. This moral standard should apply to all industries that collect personal usage data on individuals and therefore have the ability to identify, message, and help problem users.

The trouble is, gambling and gaming companies are as addicted to their addicts as their addicts are to the companies’ products.

Doing the right thing is an existential threat since luring whales can mean the difference between the success and failure of a game or casino. Without outsized proceeds from the few addicted players, these industries would have a hard time making a profit.

Thankfully, not all companies are as dependent on addicted users as the casino and online gaming industries. Helping addicts wouldn’t much hurt Facebook or Reddit, for example.

In fact, some tech companies are already limiting overuse, albeit in rudimentary ways. Stack Overflow, a technical question and answer site used by 6 million coders, was designed with built-in brakes. “The current system is designed to reward continued participation, but not to the point that it creates obsession,” according to a post on the site by co-founder Jeff Atwood. “Programmers should be out there in the world creating things too,” Atwood noted, stressing that Stack Overflow should be a utility, not an addiction.

Unlike, say, cigarettes — potentially addictive products where the manufacturer does not know the user personally — online services are intimately aware of their users’ behaviors and can therefore intervene.

Of course, tech companies won’t be able to “cure” addictions, nor should they attempt to do so. Nor should they act paternalistically, turning off access after arbitrarily determining that a user has had enough. Rather, tech companies owe it to their users simply to reach out and ask if they can be helpful, just as a concerned friend might do. If the user indicates they need assistance cutting back, the company should offer a helping hand.

With the data these companies collect, identifying and reaching out to potential addicts is a relatively easy step. A harder one, it seems, is caring enough to do the right thing.

Post author: Nir Eyal

Nir Eyal writes, consults, and teaches about the intersection of psychology, technology, and business. He has founded two tech companies since 2003 and is a past Lecturer in Marketing at the Stanford Graduate School of Business. He advises several Bay Area start-ups and incubators; his last company received venture funding from Kleiner Perkins Caufield & Byers and was acquired in 2011. In addition to blogging at NirAndFar.com, Nir is a contributing writer for Forbes, TechCrunch, and Psychology Today. He attended the Stanford Graduate School of Business and Emory University. Nir Eyal is the author of Hooked: How to Build Habit-Forming Products and blogs about the psychology of products at NirAndFar.com. For more insights on using psychology to change behavior, join his newsletter and receive a free workbook.

