
Using Open Experience Design and Social Networking to Stamp Out Dark UX

by Marc Miquel
6 min read

A two-pronged strategy for dealing with the Dark Patterns that deceive users, frustrate designers, and ruin the reputations of organizations and companies of all shapes and sizes.

Dark Patterns are a growing problem online, with deceitful interfaces becoming more prevalent in webpages, software installation processes, and mobile apps. I’m not crying conspiracy, but I’m sure some companies use analytics tools to discover ways to fool or mislead visitors in order to increase their revenue, giving rise to dark UX.

From the perspective of individual responsibility, it’s up to everyone to be savvier in recognizing manipulative interfaces—to acquire design awareness. By studying these Dark Patterns and surfing the Internet, we get better at detecting them. However, I wonder what we can do as a design community to have a significant impact on these practices as a whole.

So far, my solution has been to catalog Dark Patterns on the site darkpatterns.org—a communal library created by Harry Brignull. Exposure and shame have worked since medieval times, and being outed for using deceitful interfaces can curtail bad behavior, but is it the only option? I suggest a two-pronged strategy to fight Dark Patterns: a dedicated social network with reviews combined with a licensing system called “Open Experience Design.”

A UX Social Network with Reviews

Good user experience is already rewarded through social networks, while companies using tricky mechanisms are openly criticized. Ryanair is one example: it was reported for using the “tricky question” dark pattern for several years and kept it even after a redesign of its interface. There is now a Facebook group called “Ryanair Sucks” with 4,070 unhappy followers, a significant number considering that the company’s official page has 7,000 likes.

Good experiences are also rewarded on TripAdvisor, the website dedicated to reviewing hotels, restaurants, and activities. It is a frequently used tool for organizing trips, and people rate everything according to the experiences they had at tourist attractions, hotels, and restaurants. For many users, the final decision on a hotel or restaurant is based on reading the opinions of those who have left reviews.

I envision a platform with similar characteristics dedicated to digital experiences (webpages, mobile apps, etc.). Like TripAdvisor, it would be useful to people and would also provide valuable feedback to companies. A simple rating system could help pinpoint dark patterns and highlight the most trustworthy and user-friendly sites. Naturally, other aspects of design could also be discussed.
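As a rough illustration, the core of such a platform could be a simple data model in which each review carries an overall rating plus zero or more dark-pattern tags, so the most frequently flagged tricks surface automatically. Everything below is a hypothetical sketch in Python; none of these names refer to an existing service or API.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Review:
    rating: int                                        # 1-5 overall experience score
    dark_patterns: list = field(default_factory=list)  # e.g. ["tricky question"]
    comment: str = ""

@dataclass
class DigitalProduct:
    name: str
    reviews: list = field(default_factory=list)

    def average_rating(self) -> float:
        """Mean rating across all reviews; 0.0 if there are none yet."""
        return mean(r.rating for r in self.reviews) if self.reviews else 0.0

    def reported_dark_patterns(self) -> dict:
        """Tally how often each dark pattern has been flagged by reviewers."""
        counts = {}
        for r in self.reviews:
            for p in r.dark_patterns:
                counts[p] = counts.get(p, 0) + 1
        return counts

# A site accumulates reviews; the dark-pattern tally pinpoints recurring tricks.
site = DigitalProduct("example-airline.com")
site.reviews.append(Review(2, ["tricky question"], "Opt-out hidden in a dropdown"))
site.reviews.append(Review(4))
```

The tally, aggregated over many reviewers, is what would distinguish an isolated complaint from a deliberate pattern, and is the kind of “free user research” designers could cite when arguing for changes.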

By providing a stream of “free user research” this type of platform might bring about the change needed to improve the Internet as a whole. Over time, every brand with a digital presence would see the value of having a presence on such a platform: the potential for good reviews. It would also give designers dozens of arguments to justify or propose changes in their work. For the user, sharing frustrations could be pretty therapeutic.

Open Experience Design: A License for Avoiding Frustration

It has often been suggested that UX could benefit from a code of ethics. But what would such a code contain? The User Experience Professionals Association created one, which includes ethical principles on honesty and information privacy, among other things. It works up to a point, but I often ask myself how we could implement it to the benefit of the user. If an ethics code does not become a practical tool for users to defend themselves against deceitful practices, then it serves little purpose.

John Stuart Mill, a 19th-century English philosopher, formulated his “greatest happiness principle”: one must always act so as to produce the greatest aggregate happiness among all sentient beings. By the same logic, websites should be designed for the happiness of the greatest possible number of users. Yet we know some are designed with Dark Patterns to take money from as many people as possible without paying attention to their happiness.


Fortunately, Mill also believed that “happiness is the absence of pain,” which leads me to think of happiness in a digital experience as the “absence of frustration.” This definition is workable because frustration is something we can identify and fight: it appears when there is bad design, deception, or when unrealistic expectations are set.

Bad design simply means designers need to improve their work. Deception and unrealistic expectations, by contrast, come down to the designer’s intent (or, in some cases, their supervisor’s). Usually, when users get what they were promised, everything is fine. But when there is a misalignment between user needs and service communication (promises) or interaction design, things start going wrong.

The solution I propose for avoiding frustration is a new UX code of ethics based on openness. We would ask designers to open up their creative process and explain their intent, goals, and decisions, making for a more honest form of experience design. This is what I call Open Experience Design, analogous to other open models and frameworks. It is a set of principles to help designers make their goals and methodologies for creating the user experience of a service or product open and transparent.

In Linux environments, developers share their code to explain how software works, which allows everyone to contribute (“given enough eyeballs, all bugs are shallow”). Open code and open knowledge licenses have been revolutionary for software development and for collecting and sharing knowledge, in a manner reminiscent of Wikipedia. If websites explained the purpose behind each design decision, they could quickly build trust.

I admit that code behaves very differently (it is easily fragmented into combinable functionalities), and there are reasons to think such a code of UX ethics wouldn’t work. One objection is that there aren’t many designs with open licenses or open design libraries. Another is the claim that collaborative design doesn’t work because “design is very specific.” Here, however, “open” is not a call for co-working but a way of building shared ground between the user and the service.

The disclosure I am asking for does not need to be complete in all cases. Creating an e-commerce site, a newspaper site, a site for customers to manage their electricity bills, or a social game all require different design approaches. Depending on the site, designers could license their experiences at different degrees of openness (as with Creative Commons).

For instance, a first type might include only an intent declaration and a list of Dark Patterns the site doesn’t use; a second type could add an explanation of most of the design elements; and a third type might go deep into the design process (including wireframes and user research).

Even though the third type would be best for the user, complying with even the first would already be a great improvement. In e-commerce, for instance, an intent declaration might cover items such as: “easy sign-up and unsubscribe processes,” “no undesired e-mails,” “clear prices during shopping,” “no extra charges,” and “no sneaking items into the basket.”
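To make this concrete, a tiered declaration like the one described above could even be machine-readable, so a registry (or darkpatterns.org itself) could check compliance automatically. The tier numbers, field names, and validator below are purely illustrative assumptions, not an existing standard:

```python
# Hypothetical tier model: each tier requires the fields of the tiers below it,
# plus additional disclosure (all names are illustrative assumptions).
TIER_REQUIREMENTS = {
    1: {"intent_declaration", "dark_patterns_avoided"},
    2: {"intent_declaration", "dark_patterns_avoided", "design_rationale"},
    3: {"intent_declaration", "dark_patterns_avoided", "design_rationale",
        "wireframes", "user_research"},
}

def validate_declaration(declaration: dict) -> bool:
    """Check that a declaration includes every field its stated tier requires."""
    required = TIER_REQUIREMENTS.get(declaration.get("tier"), set())
    return bool(required) and required.issubset(declaration)

# A first-tier declaration for a hypothetical e-commerce site, using the
# checklist items from the article.
shop_declaration = {
    "tier": 1,
    "intent_declaration": [
        "easy sign-up and unsubscribe processes",
        "no undesired e-mails",
        "clear prices during shopping",
        "no extra charges",
        "no sneaking items into the basket",
    ],
    "dark_patterns_avoided": ["sneak into basket", "hidden costs"],
}
```

A declaration claiming a higher tier without the matching disclosure would fail validation, which is what would give the license teeth.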

I recall the lack of trust around financial transactions in the early years of e-commerce. Websites solved this problem by displaying credit card logos, the PayPal logo, or something as simple as a lock icon next to the words “safe” or “secure.” Similarly, after releasing a new version or a new site, designers could apply for a Dark Patterns-free badge. From the easiest type to comply with (the first) to the hardest (which would make the entire design process available), darkpatterns.org could verify compliance and add the webpage to a database of sites using “white hat” practices.

Organizing Ourselves Against Difficult User Experiences

With digital products and services taking over more and more aspects of our daily lives, we need an effective strategy for preventing Dark Patterns. What I propose attacks them from both the user side (the social network) and the designer/company side (the licenses).

From a company perspective, using these strategies to build trust comes with costs: designers’ time, and revenue that might be lost once Dark Patterns are eliminated. The fear of revealing key information is also strong. But adopting these strategies will almost certainly bolster customer loyalty and vastly improve the environment we’re all designing in together.



Marc Miquel
Marc Miquel is a User Engagement PhD researcher and game designer in Barcelona, Catalonia. He worked in the gambling industry as a user researcher and now curates darkpatterns.org with Harry Brignull in an effort to stamp out user interfaces designed to trick users and to improve the web. He also created the company Catalan Games to explain Catalonia’s culture in a fun way.
