Two years ago, I published a book on how to make products more habit-forming. The book became a bestseller and I’m frequently asked to consult with companies — particularly tech companies — looking to make their goods and services stickier and harder to stop using.
Unfortunately, making things more engaging also makes them potentially more addictive.
The techniques I describe in the book, intended to help product designers build healthy habits in their users (like using a wellness app, keeping better track of personal finances, or staying in touch with family and friends), are the same tactics some use to keep people unhealthily hooked.
The solution is not stripping out what makes these products engaging; rather, it’s helping the addicts.
Luckily, the two-way nature of Internet-connected services means companies can identify, message, and assist people who want to moderate use.
Use and Abuse
For example, instead of auto-starting the next episode on Netflix or Amazon Video, the binge-inducing video streaming services could ask users if they’d like to limit the number of hours they watch in a given weekend.
Online games could offer players who cancel their accounts the option of blacklisting their credit cards to prevent future relapses.
Facebook could let users turn off their newsfeeds during certain times of the day.
And rather than making it so fiendishly difficult to figure out how to turn off notifications from particularly addictive apps, Apple and Google could proactively ask certain users if they’d like to turn off or limit these triggers.
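To make the first of these suggestions concrete, here is a minimal sketch of what an opt-in viewing cap might look like. It assumes a hypothetical streaming service; the class, function, and field names are illustrative, not any real company’s API.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: a user-chosen cap on weekend viewing hours that
# gates autoplay of the next episode. All names are illustrative.

@dataclass
class ViewingPrefs:
    weekend_hours_cap: Optional[float] = None  # None means the user declined a cap

def should_autoplay_next(prefs: ViewingPrefs, hours_watched_this_weekend: float) -> bool:
    """Autoplay only while the viewer is under their self-selected cap."""
    if prefs.weekend_hours_cap is None:
        return True
    return hours_watched_this_weekend < prefs.weekend_hours_cap

# Example: a viewer who asked to stop after 4 hours has already watched 4.5,
# so the service shows a "keep watching?" prompt instead of autoplaying.
prefs = ViewingPrefs(weekend_hours_cap=4.0)
print(should_autoplay_next(prefs, 4.5))  # False
```

The point of the sketch is that the limit is set by the user, not imposed by the service; the product simply honors a preference it already has the data to enforce.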
What to Do About It
These services know the usage patterns of each and every user. They don’t need to bother everyone, just those people showing patterns of behavior indicative of a problem.
For example, setting a trigger based on the number of hours spent using an online service could prompt the company to reach out, suggest ways to cut back, or turn off certain features.
Indeed, the benefit of all the data being collected about us these days is that companies could use this information to help people who may be harmed by their products’ overuse.
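A sketch of the trigger described above might look like the following. The threshold, field names, and function are assumptions for illustration only; any real implementation would tune them to the service’s own usage data.

```python
from typing import Dict, List

# Hypothetical sketch: flag accounts whose weekly usage exceeds a threshold
# so the service can reach out and offer ways to cut back.
WEEKLY_HOURS_THRESHOLD = 40.0  # illustrative cutoff, not an established figure

def users_to_contact(weekly_hours_by_user: Dict[str, float]) -> List[str]:
    """Return the ids of users whose logged usage suggests they may want help moderating."""
    return [
        user_id
        for user_id, hours in weekly_hours_by_user.items()
        if hours > WEEKLY_HOURS_THRESHOLD
    ]

# Example: only the second account would receive a gentle "want to cut back?" message.
usage = {"user_a": 6.5, "user_b": 52.0}
print(users_to_contact(usage))  # ['user_b']
```

Nothing here requires new data collection; it only repurposes usage logs the company already keeps.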
Clearly, there are many things tech companies could do to help users break the cycle of addiction. Whether they actually do anything, however, is another matter.
There are some industries and companies that can’t and won’t help addicts. It’s not just dealers of illegal drugs who benefit from harmful addictions. Legitimate industries depend on addicts as well.
For example, those ubiquitous ads for online games like Clash of Clans and Candy Crush are fishing for what the industry calls “whales” — the 0.15 percent of players who bring in 50 percent of revenue.
In an industry where the cost of acquiring a player is just barely less than the revenue made per user, whales tip the scales to profitability. Without these extreme customers, their businesses aren’t viable.
We Can’t Let Silicon Valley Become Las Vegas
Similarly, the casino industry depends on a disproportionate share of revenue coming from a small group of likely addicted gamblers, some of whom are known to wear adult diapers to avoid having to stop playing.
Many industries earn an outsized proportion of their revenue from their most loyal customers. The fast food industry, for example, amusingly calls the 20 percent of diners who account for 60 percent of its revenue “heavy users,” according to the Wall Street Journal. While there’s nothing unethical about being a patron’s favorite brand, a line is crossed when a company knowingly exploits people with addiction problems the way the gaming and gambling industries do.
For example, though most American casinos are required by law to have “self-exclusion” programs for gamblers who want to stop, casinos have been known to welcome problem gamblers back with open arms. A similar situation revealed itself during a discussion on ethics I recently led at a publicly traded online gaming company. The product managers confessed that they, too, allow people to keep playing even after those players have explicitly asked to be cut off.
Casinos escape liability through a legal loophole protecting them from prosecution. Nevertheless, it is unethical for a company to accept patronage from someone it knows wants to stop using its product but can’t. This moral standard should apply to all industries that collect personal usage data on individuals and therefore have the ability to identify, message, and help problem users.
The trouble is, gambling and gaming companies are as addicted to their addicts as their addicts are to the companies’ products.
Doing the right thing is an existential threat since luring whales can mean the difference between the success and failure of a game or casino. Without outsized proceeds from the few addicted players, these industries would have a hard time making a profit.
Thankfully, not all companies are as dependent on addicted users as the casino and online gaming industries. Helping addicts wouldn’t much hurt Facebook or Reddit, for example.
In fact, some tech companies are already limiting overuse, albeit in rudimentary ways. Stack Overflow, a technical question and answer site used by 6 million coders, was designed with brakes built in. “The current system is designed to reward continued participation, but not to the point that it creates obsession,” according to a post on the site by co-founder Jeff Atwood. “Programmers should be out there in the world creating things too,” Atwood noted, stressing that Stack Overflow should be a utility, not an addiction.
Unlike, say, cigarettes — potentially addictive products where the manufacturer does not know the user personally — online services are intimately aware of their users’ behaviors and can therefore intervene.
Of course, tech companies won’t be able to “cure” addictions, nor should they attempt to do so. Neither should they act paternalistically, turning off access after arbitrarily determining that a user has had enough. Rather, tech companies owe it to their users simply to reach out and ask if they can be helpful, just as a concerned friend might do. If the user indicates they need assistance cutting back, the company should offer a helping hand.
With the data these companies collect, identifying and reaching out to potential addicts is a relatively easy step. A harder one, it seems, is caring enough to do the right thing.