Dark Patterns are a growing problem online, with deceitful interfaces becoming more prevalent in webpages, software installation processes, and mobile apps. I'm not crying conspiracy, but I'm sure some companies use analytics tools to discover ways to fool or mislead visitors in order to increase their revenue, giving rise to dark UX.
From the perspective of individual responsibility, it’s up to everyone to be savvier in recognizing manipulative interfaces—to acquire design awareness. By studying these Dark Patterns and surfing the Internet, we get better at detecting them. However, I wonder what we can do as a design community to have a significant impact on these practices as a whole.
So far, the main response has been to catalog Dark Patterns on darkpatterns.org, a communal library created by Harry Brignull. Exposure and shame have worked since medieval times, and being outed for using deceitful interfaces can curtail bad behavior, but is shaming the only option? I suggest a two-pronged strategy to fight Dark Patterns: a dedicated social network with reviews, combined with a licensing system called "Open Experience Design."
A UX Social Network with Reviews
Good user experience is already rewarded through social networks, while companies that use tricky mechanisms are openly criticized. Ryanair is an example: the company was reported to be using the "tricky question" dark pattern for several years, and kept it even after a redesign of its interface's look and feel. There is now a Facebook group called "Ryanair Sucks" with 4,070 unhappy followers, a significant number considering that the company's official page has 7,000 likes.
Good experiences are also rewarded on TripAdvisor, the site dedicated to reviewing hotels, restaurants, and activities. It is a frequently used tool for organizing a trip: people rate everything according to the experiences they had at tourist attractions, hotels, restaurants, and so on, and many users base their final choice of a hotel or restaurant on the opinions of those who have left reviews.
I envision a platform with similar characteristics dedicated to digital experiences (webpages, mobile apps, etc.). Like TripAdvisor, it would be useful to people and would also provide valuable feedback to companies. A simple rating system could help pinpoint dark patterns and highlight the most trustworthy and user-friendly sites. Naturally, other aspects of design could also be discussed.
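As a rough sketch of what such a platform's data might look like, here is a hypothetical review record written in TypeScript. The structure and field names are my own illustrative assumptions, not an existing service or API: the idea is simply that a star rating can travel together with tags for the specific Dark Patterns a reviewer encountered.

```typescript
// Hypothetical shape of a review on the proposed UX review platform.
// Field names are illustrative assumptions, not a real API.

interface DarkPatternReport {
  pattern: string;        // e.g. "tricky question", "sneak into basket"
  whereObserved: string;  // page or flow where the reviewer hit it
}

interface ExperienceReview {
  product: string;                    // site or app being reviewed
  rating: 1 | 2 | 3 | 4 | 5;          // overall experience, TripAdvisor-style
  summary: string;
  darkPatterns: DarkPatternReport[];  // empty when nothing deceptive was found
}

// Example review for a fictional airline site.
const review: ExperienceReview = {
  product: "example-airline.com",
  rating: 2,
  summary: "Booking works, but opting out of insurance is deliberately confusing.",
  darkPatterns: [
    { pattern: "tricky question", whereObserved: "travel insurance step of checkout" },
  ],
};
```

Aggregating records like this one is what would let the platform pinpoint which Dark Patterns are most common on a given site, rather than only showing an overall score.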
By providing a stream of "free user research," this type of platform might bring about the change needed to improve the Internet as a whole. Over time, every brand with a digital presence would see the value of being on such a platform: the potential for good reviews. It would also give designers dozens of arguments to justify or propose changes in their work. And for users, sharing frustrations could be pretty therapeutic.
Open Experience Design: A License for Avoiding Frustration
It's often been suggested that UX could benefit from a code of ethics. But what would such a code contain? The User Experience Professionals Association created one, which includes ethical principles regarding honesty and information privacy, among other things. It works up to a point, but I often ask myself how we could implement it for the benefit of the user. If an ethics code does not become a tool users can rely on to defend themselves against deceitful practices, it serves little purpose.
John Stuart Mill, a 19th-century English philosopher, formulated his "greatest happiness principle," which says that one must always act so as to produce the greatest aggregate happiness among all sentient beings. We likewise need to ensure websites are designed for the happiness of the greatest possible number of sentient beings (users). We know some are instead designed with Dark Patterns to take money from as many people as possible without paying attention to their happiness.
Fortunately, Mill also described happiness as pleasure and the absence of pain, which leads me to think that happiness in a digital experience is the absence of frustration. This definition is practical because frustration is something we can identify and fight: it appears when there is bad design, deception, or unrealistic expectations.
Bad design only implies that designers need to improve their work. Deception and unrealistic expectations, on the other hand, depend on the designer's intent (or, in some cases, their supervisor's). Usually, when users get what they were promised, everything is fine. But when there is a misalignment between users' needs and what the service communicates (its promises) or how the interaction is designed, things start going wrong.
The solution I propose for avoiding frustration is a new UX code of ethics based on openness: we ask designers to open up their creative process and explain their intent, goals, and decisions in pursuit of a more honest form of experience design. This is what I call Open Experience Design, by analogy with other open models and frameworks. It is a set of principles to help designers make the goals and methodologies behind a product's or service's user experience open and transparent.
In the Linux world, developers share their code to show how the software works, which allows everyone to contribute ("given enough eyeballs, all bugs are shallow"). Open-source and open-knowledge licenses have been revolutionary for software development and for collecting and sharing knowledge, Wikipedia being the obvious example. If websites explained the purpose behind each design decision, they could quickly build trust.
I admit that code behaves very differently (it is easily split into combinable pieces of functionality), and there are reasons to think such a code of UX ethics wouldn't work. One objection is that there aren't many designs released under open licenses, or many open design libraries. Another is that collaborative design doesn't work because "design is very specific." Here, however, the word open is not a call for co-working but a way of building common ground between the user and the service.
The disclosure I am asking for does not need to be complete in every case. An e-commerce site, a newspaper site, a portal for managing electricity bills, and a social game all call for different design approaches. Depending on the site, designers could license their experiences at different degrees of openness (as with Creative Commons).
For instance, a first type might include only an intent declaration and a list of Dark Patterns the site doesn't use; a second type could add an explanation of most of the design elements; and a third type might go deep into the design process (including wireframes and user research).
Even though the third type would clearly be the best for the user, merely complying with the first would already be a great improvement. In e-commerce, for instance, an intent declaration might commit to items such as "easy sign-up and unsubscribe processes," "no undesired e-mails," "clear prices during shopping," "no extra charges," and "no sneaking items into the basket."
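To make the idea more tangible, here is a hypothetical sketch in TypeScript of what a machine-readable Open Experience Design declaration for an e-commerce site could look like, using the three types and the checklist above. The names, fields, and levels are my own illustrative assumptions, not an existing standard or schema.

```typescript
// Hypothetical sketch of an Open Experience Design (OXD) declaration.
// Nothing here is a real standard; structure and names are illustrative only.

type OpennessLevel =
  | "intent-only"        // type 1: intent declaration + commitments
  | "design-explained"   // type 2: adds explanations for most design elements
  | "process-open";      // type 3: opens the full process (wireframes, research)

interface OpenExperienceDeclaration {
  site: string;
  level: OpennessLevel;
  intent: string;                 // plain-language statement of the design's goals
  commitments: string[];          // practices the site promises to follow or avoid
  designRationaleUrl?: string;    // expected for "design-explained" and above
  processArtifactsUrl?: string;   // expected for "process-open"
}

// Example: a type-1 declaration for a fictional e-commerce site,
// using the checklist items suggested above.
const exampleDeclaration: OpenExperienceDeclaration = {
  site: "https://shop.example.com",
  level: "intent-only",
  intent: "Help customers find and buy products with no hidden costs or traps.",
  commitments: [
    "easy sign-up and unsubscribe processes",
    "no undesired e-mails",
    "clear prices during shopping",
    "no extra charges",
    "no sneaking items into the basket",
  ],
};

// A verifier (such as the role proposed for darkpatterns.org below) could check
// that the declared level matches the evidence actually provided.
function hasRequiredEvidence(d: OpenExperienceDeclaration): boolean {
  if (d.level === "design-explained") return Boolean(d.designRationaleUrl);
  if (d.level === "process-open") {
    return Boolean(d.designRationaleUrl) && Boolean(d.processArtifactsUrl);
  }
  return d.intent.length > 0 && d.commitments.length > 0;
}

console.log(hasRequiredEvidence(exampleDeclaration)); // true for this type-1 example
```

The point of such a format is not the specific fields but that a declaration could be published, read, and checked, so the degree of openness a site claims is something third parties can verify.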
I recall the lack of trust in financial transactions during the early years of e-commerce. Websites addressed it by including credit card logos, the PayPal logo, or something as simple as an image of a lock next to the words "safe" or "secure." Similarly, after releasing a new site or a new version, designers could apply for a Dark Patterns-free badge. For any of the three types, from the easiest to comply with to the most demanding one that opens up the entire design process, darkpatterns.org could verify compliance and add the site to a database of "white hat" practices.
Organizing Ourselves Against Difficult User Experiences
With digital products and services taking over more and more aspects of our daily lives, we need an effective strategy for preventing Dark Patterns. What I propose works to eliminate them from both the user side (the social network) and the designer and company side (the licenses).
From a company's perspective, using these strategies to build trust comes with costs: designers' time, and revenue that might be lost once Dark Patterns are eliminated. The fear of revealing key information is also strong. But adopting these strategies will almost certainly bolster customer loyalty and vastly improve the environment we're all designing in together.