I decided to go back the next day rather than spend any more time in traffic that night. I told myself it could have been worse: I could have received and taken the wrong drug. But that thought pissed me off even more. I tuned the whole thing out with a plate of food over whatever I was binge-watching on Netflix that night.
Think about the last time something like this happened to you. Maybe it was your local fast food restaurant. Maybe it was the dry cleaners. Whatever the case, it wasn’t a pleasant experience, and if it happens often enough, you might end your patronage with that business. That’s what happened to me; I recently stopped using this pharmacy and switched to a mail-order service.
Sometimes these issues occur simply as a result of human error. But often they occur because there is some other force in play — like metrics. Consider fast food chains. Years ago, they used to have a timer at the drive-through indicating how long you had been waiting on your order. Doubtless there was a metric in place measuring employee output. In the case of pharmacies, there is likely a metric in place for how fast a prescription can be filled or how many prescriptions are filled per hour. These are called performance metrics.
What happens when you put a performance metric in place in a retail environment? What happens when you place a metric on user engagement in a social networking app? How about metrics in the health care system or anywhere else for that matter?
What do we minimize when we put metrics in place? Usually, it is the user experience.
Goodhart’s Law, named after the British economist Charles Goodhart, states that a measurement ceases to have any real meaning once it is used as a means of control. Put another way, when we use a metric to regulate or control (as we do with performance metrics), we alter human behavior in such a way that the metric becomes a target rather than a true measurement of performance. This skews the measurement and, in effect, induces the Hawthorne effect.
If you’re not familiar, the Hawthorne effect was the result of more than eight years of studies at the Hawthorne Works factory, a Western Electric plant just outside of Chicago. The studies involved changing the environmental conditions of workers — such as lighting, the location of workstations, and even the cleanliness of the work area — to observe the effect on employee productivity. The study revealed that the productivity of the workers increased with changes in their environment such as illumination. However, on later analysis, it was posited that the mere fact of the workers being observed was responsible for the increase.
The power of observation to affect and change human behavior has been proven time and again in studies. Metrics are a means of observation and will influence behavior. Thus, we can induce the Hawthorne effect by establishing known metrics.
In user experience design, we often find ourselves building systems that regulate users through certain metrics and measurements. Consider the nurse who must administer a medication by a certain time, recorded via a time/date stamp in an electronic medical record (EMR). They know that the time/date stamp will be recorded and they adjust their behavior accordingly. If, say, the nurse is running behind on rounds, they have a few choices in this situation.
They can allow the system to take a true measurement and be late administering the medication (even if it is only by a minute) when there is some greater patient need on the unit. But this will have consequences for them and may reflect on their annual review or result in a reprimand. They could even find themselves looking for a new position if late too often.
They could establish a workaround (quite a common practice in health care systems). That is, they could enter the medication into the EMR prior to administering it (just to meet the time/date stamp requirement) and attend to more immediate matters. However, this could result in adverse consequences for the patient if the nurse forgets to actually administer the medication until much later.
A more realistic scenario is for the nurse to cut corners in care to achieve the desired metric: rushing through treatment protocols for other patients, skipping hand washing, or ignoring sanitary guidelines in medication administration.
Again, what takes a backseat to established metrics? In this case, patient care.
This is a reality in health care today. This is one metric in one hospital for one nurse. Consider what this means for the entire population of health care professionals in a single country like the United States.
I can assure you this is the tip of the iceberg. As a health care UX designer, I have been in countless meetings where metrics are discussed, employed, and integrated into an interface with the intention of regulating, mitigating, or measuring a specific behavior. But we aren’t measuring the behavior via the metric — we are changing the behavior.
We don’t just find this behavior in retail chains and health care; you’ll see it in almost any business. Metrics are subtle, and you won’t necessarily identify them unless you are looking at how a given system is engineered.
Consider a big social networking company like Facebook or Twitter. They have the ability, through algorithms and metrics, to choose what content receives the most exposure. But an algorithm is a human construct and can be manipulated. Moreover, content pushed to the top of a feed will naturally receive more attention. That is, a “like” or “retweet” will beget more of the same.
We’d like to think this process is controlled organically through metrics and objective algorithms. But there are other forces at play. Imagine yourself sitting in an executive meeting for one of these large companies, and a prominent member of the meeting explains the company’s stock has fallen. They detail how the company must “drive more user engagement,” optimize growth, and increase traffic.
The metrics in this situation are now shaping the overall design of the product. Mike Monteiro alludes to this very scenario in his article Twitter’s Great Depression. Facebook’s recent fake news scandal is a direct result of its business goals — a major one being increasing user engagement.
How do you monetize your business as a social networking company? Through increased growth and user engagement. Controversial material (such as Trump’s tweets or fake news) increases engagement, which increases clicks, which increases advertising dollars and revenue.
Facebook is clearly playing a numbers game, as outlined in a recent article in the New Yorker. Consider this:
New hires learned that a crucial measure of the company’s performance was how many people had logged in to Facebook on six of the previous seven days, a measurement known as L6/7. “You could say it’s how many people love this service so much they use it six out of seven days,” Parakilas, who left the company in 2012, said. “But, if your job is to get that number up, at some point you run out of good, purely positive ways. You start thinking about ‘Well, what are the dark patterns that I can use to get people to log back in?’ ”
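The L6/7 measure quoted above is simple to compute, which is part of what makes it so seductive as a target. Here is a minimal sketch of how such a metric might be calculated; the function name and data shapes are my own illustration, not Facebook’s actual implementation.

```python
from datetime import date, timedelta

def l6_of_7(login_dates, today):
    """Count users who logged in on at least 6 of the previous 7 days.

    login_dates: dict mapping user id -> set of dates on which the user logged in.
    (Hypothetical shape, for illustration only.)
    """
    # The window is the 7 calendar days immediately before `today`.
    window = {today - timedelta(days=n) for n in range(1, 8)}
    return sum(1 for days in login_dates.values() if len(days & window) >= 6)

today = date(2018, 1, 8)
logins = {
    "alice": {today - timedelta(days=n) for n in range(1, 8)},  # all 7 days
    "bob":   {today - timedelta(days=n) for n in (1, 2, 3)},    # only 3 days
}
print(l6_of_7(logins, today))  # prints 1 — only alice meets the L6/7 bar
```

Notice that the number says nothing about *why* a user logged in — delight and dark patterns are indistinguishable to the metric, which is exactly Parakilas’s point.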
What suffers in the end? Clearly, the user experience. In my own career, I have seen this happen many times. A way to measure progress, usage, or growth becomes a target, and we’ll do anything to make the projected numbers a reality, forgetting whatever mission or vision statement our company posts on its website. We forget our true purpose. Our true purpose as designers, as I have often written, is to serve the customer’s needs, not the business’s. Our purpose is to build the best user experience we can.
I once sat in a meeting where we discussed a feature in terms of how we would assign a default value to a given selection. The selection was a menu where a clinician could choose between a brand name or generic medication. On a new prescription, the default selected value was “generic,” even if the patient had specifically requested the brand name. My colleague and I argued against this on the basis that a patient might have a specific reason for requesting the brand name over the generic, and we should not override a patient’s preference with a default selection. We also argued that generic medications can contain different inactive ingredients than brand name drugs, and these ingredients can cause allergic reactions.
We considered this a patient safety issue, but we also bristled at the idea of making a selection for a patient (on subsequent fills) without getting informed consent from the patient — a concept that ensures a patient understands the implications of a certain treatment so they can make an educated decision about their own health. When we put metrics in place that ultimately alter the user’s experience without adequately informing them, it becomes an ethical issue.
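The design fix we argued for is tiny in code terms, which is what made the refusal so telling. A sketch of the two behaviors, with hypothetical names of my own (this is not the actual EMR code):

```python
def metric_driven_default(patient_preference=None):
    """The behavior we argued against: always default to 'generic',
    because a KPI rewards generic fills — the recorded preference is ignored."""
    return "generic"

def preference_respecting_default(patient_preference=None):
    """The safer design: fall back to 'generic' only when no
    preference is on file for the patient."""
    return patient_preference or "generic"

print(metric_driven_default("brand"))         # prints generic — preference discarded
print(preference_respecting_default("brand")) # prints brand — preference honored
print(preference_respecting_default())        # prints generic — sensible fallback
```

The second function costs nothing extra to build; the only thing it costs is the KPI.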
Our argument was met with talk of KPIs and metrics associated with the sale of generic medication, measurements that were not going to change or budge. With that single statement, the conversation was over. It was clear the business needs were superseding the patients’ needs.
I sit in meetings where engagement is discussed, where metrics become the target over user experience. Unfortunately, I know there will always be more of these meetings. Corporations will continue to prioritize stock market values over the customer’s values. They will continue to hire user experience teams not because we add value to the user experience or because we make the customer’s world a better place. No, that’s considered a tertiary benefit. They hire us because we help build products that sell more, increase user engagement, and ultimately increase their bottom line.
The meetings will go on. Vile decisions will be made to release your data to large corporations or to the public. Vile decisions will be made to set your preferences to default values — more lucrative values — in a health record system. Vile decisions will be made to compromise your safety, your privacy, or your well-being.
These decisions will not be made by evil people who have evil intentions. They will be made by regular employees just like you or me. They are just doing a job, trying to support their families. But they need to make their quota. They need to strive for that quarterly or annual bonus. There is a set of metrics shaping their behavior too. And that will eventually affect you, the customer.
These vile and unethical decisions are made daily in corporations around the world. They are part of a numbers game. And when you are a customer or purveyor of a service from one of these organizations, you play the game as well. You become a number — just one data point, a singular analytic.
As UX professionals, we don’t always have the ability to change these policies. We don’t always have the ability to reframe the target from the metric to the user experience. But we do have a voice.
I have a voice, and I will continue to use that voice. I will continue to argue against putting a feature in place simply because it will increase our numbers for a given objective. I will continue to defend the user experience over all else. I have a voice in whom I choose to provide my services to, whom I choose to work for.
The metric is not the target. The user experience is the target. The customer experience is the target. And if I work at an organization that does not value this and my voice is not heard, I will exercise my voice in another way. I will find a place of employment that does value the user experience.
User experience professionals are in a unique position. We are often the sole defenders of the user: the customer, the person exploited at the hands of large corporations. I believe our highest ethical obligation and professional duty is to do no harm.
I know where my line is drawn. Do you?