UX Magazine

Defining and Informing the Complex Field of User Experience (UX)
Article No. 1258 June 23, 2014

Crossing the Great UX–Agile Divide

In 2002, interaction design pioneer Alan Cooper debated Kent Beck, creator of the Extreme Programming (XP) variant of agile, about the differences between their respective approaches to building software. Unfortunately, their disagreements were strong and by the end, they had found precious little common ground.

Twelve years later, the war rages on. Every year, the UX community musters more articles, interviews, conference workshops, and panel discussions in an effort to resolve the seemingly unresolvable challenge of integrating UX into an agile process. Now more than ever, it’s important to step back from the growing body of tips, strategies and best practices, and ask why this conflict exists in the first place.

Agile methodologies are famous for their flexibility and ability to adapt to changing circumstances. Agile is not a single, dogmatic process but a family of methodologies embracing a wide range of practices. Nor has the movement stood still: XP, once dominant, has given way to Scrum as the most popular practice, and lean has attracted growing interest in recent years.

With all this flexibility, diversity, and evolution, we might expect agile teams to be predisposed toward openness to the alternative perspective that UX brings. So why do we find the opposite?

Since at least as far back as the Cooper–Beck debate in 2002, we've always assumed that agile and UX are different methodologies with the common goal of building quality software. But a closer look at the history and rhetoric of the agile manifesto and other writings by its authors suggests another agenda that has nothing to do with software per se: to empower and protect the interests of software developers at work.

The Agile Labor Union

In 2001, a group of leading developers and consultants met to discuss their shared vision of how to transform software development practice. They coined their movement "agile" and wrote a manifesto with 12 principles. Some were related to how work was conducted within software teams, but five principles were aimed at refashioning the relationship between developers, managers, and customers.

Autonomy
  • Agile principle #5: Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  • Agile principle #11: The best architectures, requirements, and designs emerge from self-organizing teams.

The fifth principle asks that managers trust developers and create a pleasant work environment that motivates good work. The eleventh principle advocates for self-organizing teams—giving developers autonomy and freedom to choose the projects and features that most interest them and avoiding a dictatorial style of management. Among agile developers, there is a strong anti-management sentiment that has been described as the "We don't need no stinking managers" attitude.

In the movement, the concept of servant leadership reigns as the dominant role envisioned for managers. It's a more nurturing and empathetic style that rejects reliance on command-and-control authority and focuses on empowering and enabling the worker.

Self-organization has always been a controversial topic. In 2007, Jim Highsmith, one of the signatories of the agile manifesto, argued that the term itself had become a problem:

I’ve been thinking recently that the term “self-organizing” has outlived its usefulness in the agile community and needs to be replaced. While self-organizing is a good term, it has, unfortunately, become confused with anarchy in the minds of many. Why has this occurred? Because there is a contingent within the agile community that is fundamentally anarchist at heart and it has latched onto the term self-organizing because it sounds better than anarchy.

Highsmith is right about the anarchist streak. Fred George, a software developer and consultant, has proposed an agile methodology called “Programmer Anarchy” that rejects the concept of managers altogether.

Colocation
  • Agile principle #4: Business people and developers must work together daily throughout the project.
  • Agile principle #6: The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.

These principles mandate colocation of the engineering team and daily interaction with business people. Agile leaders justify them with observations about the inefficiency of distributed teams communicating across time zones and the necessity of face-to-face contact as a catalyst for the innovative and creative spirit of the team.

The agile movement only began in 2001, but the methodologies at its core were developed and used much earlier, starting in the 1990s. At that time, economic globalization was a hot political issue. Outsourcing of software and IT work to developing countries like India was beginning to boom and many software developers in the West wondered if they would still have jobs.

Although these concerns aren't directly mentioned in the manifesto, in practice, the agile principles of colocation and face-to-face interaction with business people amount to protection against job losses due to outsourcing.

Work Hour Limits
  • Agile principle #8: Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.

The eighth principle is about "sustainable development." In Extreme Programming Explained, Kent Beck clarifies this as meaning that software developers should work no more than 40 hours a week, an important rule for an industry that has been known to demand 12-hour days, seven days a week.

Using these five principles, the agile movement limits working hours, demands autonomy and self-management, rejects authoritarian styles of management, prevents software jobs from being shipped overseas, and ensures that software engineers are involved in workplace decision making. In other economic sectors, these principles would be recognized as a list of work rules, part of the labor agreement that union leaders negotiate with an employer on behalf of union members.

The Unity of the Development Process

Agile methodologies are framed in opposition to waterfall, a linear model of software development in which requirements are written up by managers and then "thrown over the wall" for developers to implement. Instead, agile methodologies stress working software over comprehensive documentation, and this preference is a major reason why designers and design researchers have difficulty working with agile teams.

Agile promoters criticize waterfall for many reasons: requirements are a moving target, and waterfall isn't iterative and doesn't leave time for customer feedback. But there's something more at stake than arguments about quality.

Agile leaders sometimes argue against the practice of Taylorism (or Scientific Management), an efficiency-oriented management theory that originated in the late 19th century to improve productivity through time- and motion-based measurement, standardization, rationalization of work, and division of labor. Although Taylorism is most often associated with factory work, one of its key principles is "the separation of conception from execution": a way of organizing work so that a manager or expert decides what needs to be done and how to do it, then hands the instructions over to the worker, who simply executes them in an efficient but mechanical way ("throwing it over the wall," in the agile idiom). In this model, there's no room for the worker to bring his or her own creativity to the work. Those higher functions are reserved for managers and experts.

Martin Fowler, a leader of the agile movement, wrote an introductory essay that includes a section called “Separation of Design and Construction” where he criticizes this idea:

The notion of people as resources is deeply ingrained in business thinking, its roots going back to the impact of Frederick Taylor’s Scientific Management approach. In running a factory, this Taylorist approach makes sense. But for the highly creative and professional work, which I believe software development to be, this does not hold.

In Extreme Programming Explained, Beck makes a similar argument:

No one walks around a development shop with a clipboard and a stopwatch. The problem for software development is that Taylorism implies a social structure of work. While we’ve lost the rituals and trappings of time-and-motion studies, we have unconsciously inherited this social structure and it is bizarrely unsuited to software development. The first step of social engineering in Taylorism is the separation of planning from execution. It is the educated engineers who decide how work is to be done and how long it will take. The workers must faithfully execute the assigned task in the assigned way in the allotted time and all will be well. Workers are cogs in a machine. No one wants to think of himself as a cog.

These agile leaders owe this line of criticism, and even the particular phrase "separation of conception from execution," to the American socialist writer and labor activist Harry Braverman. In his wide-ranging 1974 critique of industrial capitalism, Labor and Monopoly Capital: The Degradation of Work in the Twentieth Century, Braverman makes the following point:

In the human, as we have seen, the essential feature that makes for a labor capacity superior to that of the animal is the combination of execution with a conception of the thing to be done. But as human labor becomes a social rather than an individual phenomenon, it is possible—unlike in the instance of the animal where the motive force, instinct, is inseparable from action—to divorce conception from execution. This dehumanization of the labor process, in which workers are reduced almost to the level of labor in its animal form, while purposeless and unthinkable in the case of the self-organized and self-motivated social labor of a community of producers, becomes crucial for the management of purchased labor. For if the workers’ execution is guided by their own conception, it is not possible, as we have seen, to enforce upon them either the methodological efficiency or the working pace desired by capital. The capitalist therefore learns from the start to take advantage of this aspect of human labor power, and to break the unity of the labor process.

Many proponents of agile reject the use of requirements documents and specs on the grounds that they amount to unnecessary overhead that slows down the team. Designers and researchers respond by speeding up the pace, becoming more efficient, and using ever lighter and faster methods. The widely accepted best practice for working with agile teams is the "one sprint ahead" model: designers complete their designs one sprint in advance, then turn the work over to developers to implement.

Taking the agile methodologies at face value, that approach should work. But many agile proponents have rejected it as "mini-waterfall" despite the increased speed and agility. Perhaps speed is not really what's at stake. Taking their cues from Harry Braverman, leading thinkers in the agile movement and many agile teams insist on the unity of the labor process as a way of preserving the autonomy and self-management of software developers.

Taken to its logical conclusion, the mere existence of designers and researchers as specialized job roles in a software team divorces conception from execution, regardless of how much or little documentation they create, how fast they do it, or whether they do it upfront or incrementally.

Traditionally, a software project is evaluated in terms of business and technical considerations: Was it delivered on time and under budget? Does it have all the needed features? Is it free of bugs? Does it perform well? Is it stable, easy to maintain, and secure? If you're in the UX field, right about now you're suppressing the urge to say "OK yes, but what about the users!"

That instinct signals a far-reaching industry-wide shift in our thinking about the nature of technology. The rise of specialized UX job roles is rooted in the growing recognition in the industry that a software project also needs to be evaluated from the perspective of the user's experience. Is it simple, effective, usable, attractive, and culturally and contextually appropriate? Does it support or hinder the user's goal? Is it unnecessarily complex, confusing, or intolerant of errors?

The expansion in our understanding of what makes a successful technology project means an expansion in the number of skills that must be brought to bear. We increasingly draw on a wide range of competencies beyond those traditionally taught in university computer science departments: research methods from anthropology, sociology and other social sciences; usability techniques developed by cognitive psychologists; visual, interaction and motion design; information architecture; branding; prototyping; and so on.

The agile ideal is unity of the labor process, where a software developer's execution is guided by their own conception. But in many cases, the agile vision of the labor process hasn't kept pace with these new competencies. Agile teams use software development processes to ensure autonomy and self-organization in the workplace, but these structures end up doing more harm than good when they fail to recognize the importance of new skills that are outside the traditional purview of software engineering. These skills are pushed outside of the team, reintroducing the feared separation of conception and execution.

What Can UX Designers Do?

If the purpose of agile methodologies is to protect software developers’ autonomy at work, then our existing attempts to combine agile with UX practices are making a category error. The fraught relationship between UX and agile looks more like a labor conflict than a disagreement about the best way to build software.

Designers and researchers surely don't intend to harm the workplace interests of the software developers we work alongside. User-centered design has a historical connection to the Cooperative Design movement of Scandinavian trade unions. It is a fundamentally democratic and egalitarian approach that recognizes the practical and—dare we say it—moral importance of empowering users (who are often workers) by including them in the design process.

A diplomatic approach can be helpful to smooth over tensions with developers while we're trying to create conditions for effective design. Some teams may be receptive to the view that UX could be an ally of software developers in their quest for autonomy and self-management—delighted users would be a clear signal to business owners that the process is working.

But in the long term, the presence of UX design and research as specialized job roles commoditizes software development work, moving crucial creative and strategic activities outside the team. This may lead to an increase in standardization and routinization of programming work, lower compensation, and outsourcing for all but the most talented developers.

The sociologist George Ritzer coined the term “McDonaldization” to describe this process, where innovation and creative problem solving are removed from work activities. A hyper-rational form of work more commonly associated with fast food restaurants takes its place, one characterized by efficiency, calculability, predictability, and control.

Although many software developers today enjoy the high salaries and excellent working conditions associated with white-collar work, it may not stay that way and UX design could be a contributing factor. The McDonaldization of software work may already be underway in the guise of lean software development, a newer methodology, which has grown rapidly in influence and adoption.


Lean draws inspiration from Taylorist management philosophies, which have been used to improve the efficiency of automobile factory work. It focuses relentlessly on continuous improvement, waste minimization, and optimization of performance metrics. Unions and labor activists call it “management by stress” because of the way the work process is decomposed into repetitive motions and optimized down to the second.

Resolution

Software developers have faced the pressures of McDonaldization in the past, most notably in the so-called Software Crisis of the ’60s and ’70s, a period when the industry was preoccupied with problems of low quality and rising labor costs. Prominent academics in computer science departments and research labs developed new techniques to address this problem, and they were surprisingly open about their intent to reduce the skill required and eliminate the creativity associated with the job, making it more like factory work.

In 1974, the creators of the database query language SQL explicitly framed the goal of their project as reducing the labor costs of software, much like the structured programming techniques that had been developed in the same period:

As computer systems become more advanced we see a gradual evolution from procedural to declarative problem specification. There are two major reasons for this evolution. First, a means must be found to lower software costs among professional programmers. The costs of program creation, maintenance, and modification have been rising very rapidly. The concepts of structured programming have been introduced in order to simplify programming and reduce the cost of software. Secondly, there is an increasing need to bring the non-professional user into effective communication with a formatted database. Much of the success of the computer industry depends on developing a class of users other than trained computer specialists.
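To make the procedural-to-declarative contrast concrete, here is a minimal sketch in Python (the employees table, its columns, and the sample data are hypothetical illustrations, not drawn from the 1974 paper). The procedural version spells out how to compute the answer step by step; the declarative version only states what is wanted and leaves the how to the database engine.

    import sqlite3

    # Hypothetical data: find well-paid employees two different ways.
    rows = [("Ada", 120000), ("Grace", 95000), ("Edsger", 87000)]

    # Procedural: the programmer specifies *how*, step by step.
    high_earners = []
    for name, salary in rows:
        if salary > 90000:
            high_earners.append(name)

    # Declarative: the SQL query states *what* is wanted; the engine decides how.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
    conn.executemany("INSERT INTO employees VALUES (?, ?)", rows)
    declared = [n for (n,) in conn.execute(
        "SELECT name FROM employees WHERE salary > 90000")]

    assert sorted(high_earners) == sorted(declared)

Either way the result is the same; what changes is how much of the procedure the programmer must write and maintain, which is precisely the labor-cost argument the quote is making.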

In the 1990s, object-oriented programming was promoted for very similar reasons. Brad Cox, creator of the Objective-C programming language, believed that software costs would finally be brought under control by the silver bullet of the object-oriented paradigm:

The silver bullet is a cultural change rather than a technological change. It is a paradigm shift—a software industrial revolution based on reusable and interchangeable parts that will alter the software universe as surely as the industrial revolution changed manufacturing … I use a separate term, software industrial revolution, to mean what object-oriented has always meant to me: transforming programming from a solitary cut-to-fit craft, like the cottage industries of colonial America, into an organizational enterprise like manufacturing is today.
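A rough sketch of what "reusable and interchangeable parts" looks like in code may help here; the Notifier example below is a hypothetical illustration, not Cox's own. Any component that exposes the same interface can be plugged into existing code without rewriting it.

    from typing import Protocol

    class Notifier(Protocol):
        # The shared interface that every interchangeable part must expose.
        def send(self, message: str) -> None: ...

    class EmailNotifier:
        def send(self, message: str) -> None:
            print(f"emailing: {message}")

    class SMSNotifier:
        def send(self, message: str) -> None:
            print(f"texting: {message}")

    def alert_on_failure(notifier: Notifier) -> None:
        # The same reusable logic works with whichever concrete part is supplied.
        notifier.send("build failed")

    alert_on_failure(EmailNotifier())
    alert_on_failure(SMSNotifier())

Cox's craft-versus-manufacturing framing follows from this idea: once parts are standardized behind interfaces, more of the work becomes assembling existing components rather than writing new code.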

Over the last 50 years, the organization of software work through nearly universally adopted techniques like object-oriented programming and structured programming has tended toward deskilling and routinization. It's easy to see why there would be a reaction against this trend, and why the agile philosophy emerged in the 1990s to counter it. By creating processes that protected their interests, agile methodologies allowed programmers to benefit from the quality improvements brought by new software engineering techniques while mitigating some of the potentially harmful effects on their jobs.

Many in the UX field have felt obstructed by development teams that insist on adopting agile methodologies, and confused about why their arguments for paying closer attention to users go unheeded. It may be helpful to ask whether the presence of UX designers and researchers on the team contributes to rolling back some of the hard-earned job protections that software developers have won, and to find new ways of talking about and practicing our craft that highlight the benefits to developers.

 

Image of ants building a bridge courtesy Shutterstock.


Comments

From the first paragraph, this article is full of misunderstandings and inaccurate or just plain wrong statements.

Let's start with the first statement: "twelve years later the war rages on."

Well, if you are referring to Alan Cooper, you are simply wrong. I was at Alan Cooper's keynote at Agile 2008, where he spoke about the similarities in what Agile and interaction design were trying to accomplish. You can find that interview at http://www.infoq.com/interviews/Interaction-Design-Alan-Cooper

In 2009, Cooper released a full endorsement of Agile principles and of product quality based on skills learned in a craft-like way, stating that, like programming, IxD is a craft. You can find that long endorsement here:

http://www.cooper.com/journal/insurgency_of_quality/insurgency-of-quality.pdf

Agile teams and product owners have been heavily influenced by personas, activity mapping (also called user story mapping), and many other discovery techniques that originated in the UCD community.

In 2011 I wrote the article Agile User Interface Design and Information Architecture from the trenches, describing how Just In Time UX can be integrated into Agile teams. http://www.innovel.net/?p=220

The misunderstandings permeating this article make me wonder: where are the editors and the fact checking?

Hi Mike,

This article is before its time. I have been uncomfortable with the methodologies used by industry for a number of years but have not been able to articulate it as well as you have in this article. Well done. Managers seem to be more concerned with 'managing the software process' than with the end user. I am starting to wonder if half these IT managers even know what 'usability' or 'UX' actually is :).

Regards,

Reece

This article seems to take a scholarly, academic approach to proving its point. You have to experience agile to understand it.

It seems like software here is perceived as a flat, working GUI. That is true in certain types of systems, like simple web systems. However, in general, I don't think there is any threat of the McDonaldization of software creation. Software is not UI.

First, @Mike, I do believe you're an ignoramus and that you've authored this article intending to stir a pot or stoke a fire. I hope you fail in that goal.

Second, Agile methods aim to solve certain business problems: to enable both continual innovation AND disciplined execution -- all in an iterative, collaborative, team-based setting.

Lean UX aims to solve certain other business problems: to quickly test decisions made regarding the design and/or implementation of a system.

Both methods are useful -- they are different tools which serve DIFFERENT YET COMPLEMENTARY purposes. Agile and Lean UX are both about inspection and adaptation, both encourage quick feedback loops, test-driven development and/or testing and validation of design.

A great Scrum or XP team will be capable of incorporating Lean UX practices really well. And an experienced Lean UX practitioner will appreciate the way that Scrum or XP provide helpful team-based routines, rigor, and self-organization.

If you believe there's a divide between the two practices -- or worse, a zero-sum "competition" -- then I encourage you to observe Lean UX as practiced by a great Scrum or XP team. (come visit us at http://myplanet.io)

This article has some merit, but misses a key component: agile techniques are designed to prepare for change, and "McDonaldization" isn't. By putting UX at odds with agile practices, both practices are undermined. The value of agile practices to a UX designer is that if a UX design rests on a fundamentally flawed assumption about its users, the software has been built to be easily changed and retested. Even frequent usability tests with high-fidelity prototypes can't uncover all the pitfalls a UI will produce. True usability is uncovered through observation, and a prototype lessens its effectiveness.
I'm not defending the "anarchist" tendencies of agilists. I'm saying agile, if practiced well, can complement UX.

I think your point about McDonaldization not being prepared for change is rooted in a common misconception about ways of organizing work according to Taylorist principles. The stereotype of assembly line work as rigid doesn't match up with today's highly flexible and dynamic supply chain networks. The most interesting examples are "fast fashion" brands like Forever 21 and Zara, which have adopted rapidly iterative manufacturing processes that are responsive to constantly evolving "customer requirements" (i.e., fashion trends) and able to deliver new products to stores within weeks, where traditional clothing manufacturers may take six months or more. But these companies are also linked to serious abuses like child labor, sweatshops, even slave labor, which underscores the point that reorganizing work in this way isn't necessarily good for workers.

The issue of testing prototypes vs. working products is an interesting one. I think if we let go of disciplinary divisions, it's clear that you can evaluate a product at many different levels of fidelity, depending on the product and what aspects you're evaluating. Of course it's true that some things can't be evaluated except at the highest fidelity, but some agile thinkers believe that only working code should be evaluated. To me, this is really about who is involved in the evaluation and who decides how to respond to the findings. Insisting on working code usually guarantees developers a seat at the table.

Let's separate the "software coders" from the "system architects" from the "UX designers" first. Coders succeed when they can build to some agreed-upon specification. System architects succeed when their design of the structure of the software program optimizes the build goals (code quickly and efficiently) and the operating goals (cheap to maintain and conforming to end users' needs). But neither of these professions is equipped to design and build the optimal software user experience. People with any of these three unique skill sets have been trained in their disciplines and should be good at what they do within their own discipline.
In my experience, the resolution of the tension between coder, system architect, and UX designer seems to happen naturally and organically at shops that are building commercial or retail software for sale. The mutual respect between these three disciplines for the other skill sets is palpable and contributes to the overall quality, cost, and marketability of the products.
However, if the software is a product of an internal corporate IT shop, the build model is focused on cheap and quick code development, to fit an arbitrary time frame and budget. Good system designers or UX designers do not spit out code at a regular and predictable pace, so these designers are often not part of the build process. The business's attraction to Agile processes in internal corporate environments is all about seeing some code in production as fast as possible, so that the arbitrary budget and build timelines can be met; good UX and system designs are expensive and stall the code delivery process, so they get thrown out with the bathwater.

This went in a much different direction than I expected.

I disagree with the assessment of colocation as solely a job-loss protection. I have worked on distributed teams as an interface designer. On one project, half of the developers were sitting in the cubes next to me, while the other half were on the other side of the globe. Guess which half delivered components with less rework. It wasn't the developers whose working hours were 12 hours offset from mine. And it wasn't for a lack of skill; in fact, there were some amazing developers on the distant team. I attribute it entirely to the fact that the colocated developers could discuss design issues and seek further direction with a 15-minute conversation in shared space. The teleconferences and video chats with the other team were nowhere near as productive.

Also, I depend upon the creativity of the developers I work with. I have found that discussing what the user needs to do with a developer often leads to creative solutions for how the user can complete that task. In my experience, developers typically ask for more information about the use cases/scenarios, not less. This is especially true if the use case is framed as a user goal rather than a feature of the product. This makes me rather certain that the UX method does not represent a true or perceived threat to developers' creativity.

I have seen, time and time again, that when UX is not considered, projects (even when delivered to spec or through agile) are relegated to shelfware, often replaced by spreadsheets and paper. Yes. Spreadsheets and paper.

Yet managers are the ones who are reluctant to hire these UX skills. They seem not to grasp that software with a great UX amplifies the efficiencies they seek from software solutions, through increased adoption, less user training, and the efficiencies of a well-designed app (such as intuitive layout, good use of colour, clear and understandable terminology).

To reliably deliver great software, UX has to be not just a consideration but a focus on a par with the requirements. Achieving this in an agile way involves everyone. To use a Formula 1 analogy, the designers could be the aerodynamicists and the developers the engine manufacturers: both skills are highly specialised and completely different, yet both work together toward the common goal.

Developers are also generally rubbish at UX, but still think they can do it, even today, in the face of a changed world where even they reject the 99 apps in the App Store for the one that works best: the one that is fast, fluid, reliable, fun, and intuitive. They are choosing apps based on UX, yet expect their users not to because they're in a corporate environment. Software development is about getting people to work faster and better, not about slowing them down. It's not enough to deliver to spec.

Ha.

The last "UX Architect" I worked with was rubbish at UX. He didn't think he was, though; he thought he was brilliant. He was one of those guys who would get in the elevator and say, "See how these buttons are arranged? It is all wrong, they should do this and this and...."

Some engineer or team of engineers designed the panel and the elevator based on concepts and factors that this guy had no clue about. It was useless conjecture and blather, much like most of his ideas.

I fired him after 3 months on the project.

I am a senior developer at a London based financial company. There is always a massive amount for developers to learn as new technologies come about, architectures change, and programming languages and frameworks develop and evolve. It is simply not true that developers cannot use their creativity and problem solving skills while focussing purely on development. Development is a challenging and rewarding skill that provides businesses with a lot of value.

"I am a senior developer at a London based financial company. ... It is simply not true that developers cannot use their creativity and problem solving skills while focussing purely on development."

It's just that when it comes to financial software, they choose not to.

Agreed. These were my thoughts as well. In addition, the developers on our team actually WANT to have the functionality defined for them, so that they can get down to work. The more thought-out and better defined the specifications are, the happier they are.

Great article