

Crossing the Great UX–Agile Divide

by Mike Bulajewski
13 min read

Understanding the labor issues at the core of the agile manifesto can help experience designers find new ways of working to enlighten developers.

In 2002, interaction design pioneer Alan Cooper debated Kent Beck, creator of the Extreme Programming (XP) variant of agile, about the differences between their respective approaches to building software. Unfortunately, their disagreements were strong and by the end, they had found precious little common ground.

Twelve years later, the war rages on. Every year, the UX community musters more articles, interviews, conference workshops, and panel discussions in an effort to resolve the seemingly unresolvable challenge of integrating UX into an agile process. Now more than ever, it’s important to step back from the growing body of tips, strategies and best practices, and ask why this conflict exists in the first place.

Agile methodologies are famous for their flexibility and ability to adapt to changing circumstances. Agile is a family of methodologies that embrace a wide range of practices rather than dogmatically sticking to a single process. The movement hasn’t stood still either. Once dominant, XP lost its place to Scrum as the most popular practice, and lean has seen growing interest in recent years.

Given all this flexibility, diversity, and evolution, we might expect agile teams to be predisposed to openness toward the alternative perspective that UX brings. So why do we find the opposite?

Since at least as far back as the Cooper–Beck debate in 2002, we’ve assumed that agile and UX are different methodologies with the common goal of building quality software. But a closer look at the history and rhetoric of the agile manifesto and other writings by its authors suggests another agenda that has nothing to do with software per se: to empower and protect the interests of software developers at work.

The Agile Labor Union

In 2001, a group of leading developers and consultants met to discuss their shared vision of how to transform software development practice. They named their movement “agile” and wrote a manifesto with 12 supporting principles. Some related to how work is conducted within software teams, but five were aimed at refashioning the relationship between developers, managers, and customers.

  • Agile principle #5: Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  • Agile principle #11: The best architectures, requirements, and designs emerge from self-organizing teams.

The fifth principle asks that managers trust developers and create a pleasant work environment that motivates good work. The eleventh principle advocates for self-organizing teams—giving developers autonomy and freedom to choose the projects and features that most interest them and avoiding a dictatorial style of management. Among agile developers, there is a strong anti-management sentiment that has been described as the “We don’t need no stinking managers” attitude.

In the movement, the concept of servant leadership reigns as the dominant role envisioned for managers. It’s a more nurturing and empathetic style that rejects reliance on command-and-control authority and focuses on empowering and enabling the worker.

Self-organization has always been a controversial topic. In 2007, Jim Highsmith, one of the signatories of the agile manifesto, argued that the term itself had become a problem:

I’ve been thinking recently that the term “self-organizing” has outlived its usefulness in the agile community and needs to be replaced. While self-organizing is a good term, it has, unfortunately, become confused with anarchy in the minds of many. Why has this occurred? Because there is a contingent within the agile community that is fundamentally anarchist at heart and it has latched onto the term self-organizing because it sounds better than anarchy.

Highsmith is right about the anarchist streak. Fred George, a software developer and consultant, has proposed an agile methodology called “Programmer Anarchy” that rejects the concept of managers altogether.

  • Agile principle #4: Business people and developers must work together daily throughout the project.
  • Agile principle #6: The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

These principles mandate co-location of the engineering team and daily interaction with business people. Agile leaders justify them with observations about the inefficiency of distributed teams communicating across time zones and the necessity of face-to-face contact as a catalyst for the innovative and creative spirit of the team.

The agile movement only began in 2001, but the methodologies at its core were developed and used much earlier, starting in the 1990s. At that time, economic globalization was a hot political issue. Outsourcing of software and IT work to developing countries like India was beginning to boom and many software developers in the West wondered if they would still have jobs.

Although these concerns aren’t directly mentioned in the manifesto, in practice, the agile principles of co-location and face-to-face interaction with business people amount to protection against job losses from outsourcing.

Work Hour Limits
  • Agile principle #8: Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.

The eighth principle is about “sustainable development.” In Extreme Programming Explained, Kent Beck clarifies this as meaning that software developers should work no more than 40 hours a week, an important rule in an industry that has been known to demand 12-hour days, 7 days a week.

Using these five principles, the agile movement tries to limit the number of working hours, demands autonomy and self-management, rejects authoritarian styles of management, prevents software jobs from being shipped overseas, and ensures that software engineers are involved in their workplace as decision makers. In other economic sectors, these principles would be recognized as a list of work rules, part of the labor agreement that union leaders would negotiate with an employer on behalf of union members.

The Unity of the Development Process

Agile methodologies are framed in opposition to waterfall, a linear model of software development where requirements are written up by managers and then “thrown over the wall” for developers to implement. Instead, agile methodologies stress working software over comprehensive documentation, a preference that is a major reason why designers and design researchers have difficulty working with agile teams.

Agile promoters criticize waterfall for many reasons: requirements are a moving target, the process isn’t iterative, and it leaves no time for customer feedback. But there’s something more at stake than arguments about quality.

Agile leaders sometimes argue against the practice of Taylorism (or Scientific Management), an efficiency-oriented management theory that originated in the late 19th century to improve productivity through time- and motion-based measurement, standardization, rationalization of work, and division of labor. Often associated with factory work, a key principle of Taylorism is “the separation of conception from execution”—a way of organizing work so that a manager or expert decides what needs to be done and how to do it, then hands the instructions over to the worker, who simply executes them in an efficient but mechanical way; “throwing it over the wall,” in the agile idiom. In this model, there’s no room for the worker to bring his or her own creativity to the work. Those higher functions are reserved for managers and experts.

Martin Fowler, a leader of the agile movement, wrote an introductory essay that includes a section called “Separation of Design and Construction” where he criticizes this idea:

The notion of people as resources is deeply ingrained in business thinking, its roots going back to the impact of Frederick Taylor’s Scientific Management approach. In running a factory, this Taylorist approach makes sense. But for the highly creative and professional work, which I believe software development to be, this does not hold.

In Extreme Programming Explained, Beck makes a similar argument:

No one walks around a development shop with a clipboard and a stopwatch. The problem for software development is that Taylorism implies a social structure of work. While we’ve lost the rituals and trappings of time-and-motion studies, we have unconsciously inherited this social structure and it is bizarrely unsuited to software development. The first step of social engineering in Taylorism is the separation of planning from execution. It is the educated engineers who decide how work is to be done and how long it will take. The workers must faithfully execute the assigned task in the assigned way in the allotted time and all will be well. Workers are cogs in a machine. No one wants to think of himself as a cog.

These agile leaders owe this line of criticism and even the particular phrase “separation of conception from execution” to the American socialist writer and labor activist Harry Braverman. In his wide-ranging critique of industrial capitalism Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century published in 1974, Braverman makes the following point:

In the human, as we have seen, the essential feature that makes for a labor capacity superior to that of the animal is the combination of execution with a conception of the thing to be done. But as human labor becomes a social rather than an individual phenomenon, it is possible—unlike in the instance of the animal where the motive force, instinct, is inseparable from action—to divorce conception from execution. This dehumanization of the labor process, in which workers are reduced almost to the level of labor in its animal form, while purposeless and unthinkable in the case of the self-organized and self-motivated social labor of a community of producers, becomes crucial for the management of purchased labor. For if the workers’ execution is guided by their own conception, it is not possible, as we have seen, to enforce upon them either the methodological efficiency or the working pace desired by capital. The capitalist therefore learns from the start to take advantage of this aspect of human labor power, and to break the unity of the labor process.

Many proponents of agile reject the use of requirements and specs on the grounds that they amount to unnecessary overhead that slows down the team. Designers and researchers respond by speeding up the pace, becoming more efficient and using ever lighter and faster methods. The widely accepted best practice for working with agile teams is the “one sprint ahead” model: designers complete designs in one sprint, then turn over their work to be implemented by developers in the next.

Taking the agile methodologies at face value, that approach should work. But many agile proponents have rejected it as “mini-waterfall” despite the increased speed and agility. Perhaps speed is not really what’s at stake. Taking their cues from Harry Braverman, leading thinkers in the agile movement and many agile teams insist on the unity of the labor process as a way of preserving the autonomy and self-management of software developers.

Taken to its logical conclusion, the mere existence of designers and researchers as specialized job roles in a software team divorces conception from execution, regardless of how much or little documentation they create, how fast they do it, or whether they do it upfront or incrementally.

Traditionally, a software project is evaluated in terms of business and technical considerations: Was it delivered on time and under budget? Does it have all the needed features? Is it free of bugs? Does it perform well? Is it stable, easy to maintain, and secure? If you’re in the UX field, right about now you’re suppressing the urge to say “OK yes, but what about the users!”

That instinct signals a far-reaching industry-wide shift in our thinking about the nature of technology. The rise of specialized UX job roles is rooted in the growing recognition in the industry that a software project also needs to be evaluated from the perspective of the user’s experience. Is it simple, effective, usable, attractive, and culturally and contextually appropriate? Does it support or hinder the user’s goal? Is it unnecessarily complex, confusing, or intolerant of errors?

The expansion in our understanding of what makes a successful technology project means an expansion in the number of skills that must be brought to bear. We increasingly draw on a wide range of competencies beyond those traditionally taught in university computer science departments: research methods from anthropology, sociology and other social sciences; usability techniques developed by cognitive psychologists; visual, interaction and motion design; information architecture; branding; prototyping; and so on.

The agile ideal is unity of the labor process, where a software developer’s execution is guided by their own conception. But in many cases, the agile vision of the labor process hasn’t kept pace with these new competencies. Agile teams use software development processes to ensure autonomy and self-organization in the workplace, but these structures end up doing more harm than good when they fail to recognize the importance of new skills that are outside the traditional purview of software engineering. These skills are pushed outside of the team, reintroducing the feared separation of conception and execution.

What Can UX Designers Do?

If the purpose of agile methodologies is to protect software developers’ autonomy at work, then our existing attempts to combine agile with UX practices are making a category error. The fraught relationship between UX and agile looks more like a labor conflict than a disagreement about the best way to build software.

Designers and researchers surely don’t intend to harm the workplace interests of the software developers we work alongside. User-centered design has a historical connection to the Cooperative Design movement of Scandinavian trade unions. It is a fundamentally democratic and egalitarian approach that recognizes the practical and—dare we say it—moral importance of empowering users (who are often workers) by including them in the design process.

A diplomatic approach can be helpful to smooth over tensions with developers while we’re trying to create conditions for effective design. Some teams may be receptive to the view that UX could be an ally of software developers in their quest for autonomy and self-management—delighted users would be a clear signal to business owners that the process is working.

But in the long term, the presence of UX design and research as specialized job roles commoditizes software development work, moving crucial creative and strategic activities outside the team. This may lead to an increase in standardization and routinization of programming work, lower compensation, and outsourcing for all but the most talented developers.

The sociologist George Ritzer coined the term “McDonaldization” to describe this process, where innovation and creative problem solving are removed from work activities. A hyper-rational form of work more commonly associated with fast food restaurants takes its place, one characterized by efficiency, calculability, predictability, and control.

Although many software developers today enjoy the high salaries and excellent working conditions associated with white-collar work, it may not stay that way and UX design could be a contributing factor. The McDonaldization of software work may already be underway in the guise of lean software development, a newer methodology, which has grown rapidly in influence and adoption.

Lean draws inspiration from Taylorist management philosophies, which have been used to improve the efficiency of automobile factory work. It focuses relentlessly on continuous improvement, waste minimization, and optimization of performance metrics. Unions and labor activists call it “management by stress” because of the way the work process is decomposed into repetitive motions and optimized down to the second.


Software developers have faced the pressures of McDonaldization before, most notably in the so-called Software Crisis of the ‘60s and ‘70s, a period when the industry was concerned with low quality and rising labor costs. Prominent academics in computer science departments and research labs developed new techniques to address the problem. They were surprisingly open about their intent to reduce the skill and eliminate the creativity associated with the job, making it more like factory work.

In 1974, the creators of the database query language SQL explicitly framed the goal of their project as reducing the labor costs of software, much like the structured programming technique that had been developed in the same period:

As computer systems become more advanced we see a gradual evolution from procedural to declarative problem specification. There are two major reasons for this evolution. First, a means must be found to lower software costs among professional programmers. The costs of program creation, maintenance, and modification have been rising very rapidly. The concepts of structured programming have been introduced in order to simplify programming and reduce the cost of software. Secondly, there is an increasing need to bring the non-professional user into effective communication with a formatted database. Much of the success of the computer industry depends on developing a class of users other than trained computer specialists.
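The shift the SQL authors describe, from procedural to declarative problem specification, can be made concrete with a toy contrast. This sketch uses Python’s built-in sqlite3 module and invented sample data; it is an illustration of the general idea, not anything from the 1974 paper. In the procedural version the programmer spells out how to scan and filter rows; in the declarative version the query states only what result is wanted, and the database engine decides how to compute it.

```python
import sqlite3

# Hypothetical sample data in an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ada", "eng", 120.0), ("Grace", "eng", 130.0), ("Edsger", "ops", 90.0)],
)

# Procedural style: the programmer specifies HOW — fetch every row,
# test each one, and accumulate the total by hand.
total = 0.0
for name, dept, salary in conn.execute("SELECT name, dept, salary FROM employees"):
    if dept == "eng":
        total += salary

# Declarative style: the query states WHAT is wanted; filtering and
# aggregation strategy are left to the database engine.
(declarative_total,) = conn.execute(
    "SELECT SUM(salary) FROM employees WHERE dept = 'eng'"
).fetchone()

assert total == declarative_total == 250.0
```

The second form is exactly what the quote means by bringing “the non-professional user into effective communication with a formatted database”: no loops, no control flow, just a specification of the desired result.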

In the 1990s, object-oriented programming was used for very similar purposes. Brad Cox, creator of the Objective-C programming language, believed that software costs would finally be kept under control with the silver bullet of the object-oriented paradigm:

The silver bullet is a cultural change rather than a technological change. It is a paradigm shift—a software industrial revolution based on reusable and interchangeable parts that will alter the software universe as surely as the industrial revolution changed manufacturing … I use a separate term, software industrial revolution, to mean what object-oriented has always meant to me: transforming programming from a solitary cut-to-fit craft, like the cottage industries of colonial America, into an organizational enterprise like manufacturing is today.

Over the last 50 years, the organization of software work through nearly universally adopted techniques like object-oriented programming and structured programming has tended toward a trajectory of deskilling and routinization. It’s easy to see why there would be a reaction against this trend, and why the agile philosophy emerged in the 1990s to counter it. By creating processes that protected their interests, agile methodologies allowed programmers to benefit from the quality improvements brought by new software engineering techniques while mitigating some of the potentially harmful effects on their jobs.

Many in the UX field have felt obstructed by development teams who insist on adopting agile methodologies and confused about why their arguments for paying closer attention to users go unheeded. It may be helpful to ask whether the presence of UX designers and researchers on the team contributes to rolling back some of the hard-earned job protections that software developers have won, and to find new ways of talking about and practicing our craft that highlight the benefits to them.


Image of ants building a bridge courtesy Shutterstock.

Mike Bulajewski holds a Master’s degree from the University of Washington’s Human Centered Design & Engineering program and works as a UX designer and design technologist with experience in e-commerce, education, social media and collaboration software. His writing on the social, economic and political implications of technology has appeared in West Space Journal, The New Inquiry and Cyborgology.


