
Evaluating UX by Measuring Task Efficiency

by Cathleen Zapata
5 min read

Using the Keystroke-Level Model to evaluate physical and cognitive demands.

Efficiency, or the amount of time an expert user takes to complete a task, can have a direct impact on the bottom line. A few seconds off a call center task can save hundreds of thousands of dollars in a year. However, what is the impact of efficiency on consumers?

Measuring Task Efficiency

CogTool is a software product developed in Dr. Bonnie John’s lab at the Human-Computer Interaction Institute at Carnegie Mellon University. CogTool evaluates a design with a predictive human-performance model, estimating the time an expert would take to complete a task. CogTool’s modeling procedure is based on the Keystroke-Level Model (KLM), which assigns a time value to each physical and cognitive action. Actions may include reading text on a screen, moving the mouse, moving the hand from the mouse to the keyboard, and typing.
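At its core, a KLM prediction is a sum of per-operator time estimates. The following is a minimal sketch using the classic published KLM operator values; CogTool’s actual underlying model is considerably more sophisticated, so treat this only as an illustration of the idea:

```python
# Classic KLM operator time estimates (seconds). CogTool's internal
# model refines these; the values here are the standard textbook ones.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between mouse and keyboard
    "M": 1.35,  # mental preparation
}

def predict_time(sequence):
    """Predict expert task time (seconds) for a string of KLM operators."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Example: decide (M), point at a form field (P), click (B, B),
# move hands to keyboard (H), type a 5-character answer (KKKKK).
print(round(predict_time("MPBBH" + "K" * 5), 2))  # → 4.45
```

Summing these small per-action costs over an entire application flow is how a multi-minute prediction, like those in the study below, is built up.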

Example in Use

CogTool can be used to evaluate and refine a form application process, checkout flow, or any software UI where speed of completion is essential to the bottom line. For example, Metrics Marketing Group’s Research & Customer Experience team conducted a study using CogTool to evaluate and compare the account-opening process among 13 top financial institutions. The goal of this study was to measure the task time of opening a checking account online, benchmark the process, and use the resulting insights to optimize this common flow on banks’ websites.

Through this study, Metrics learned that the average predicted time to complete an online checking account application was 6 minutes, 49 seconds, with the shortest at 4 minutes, 37 seconds and the longest over 10 minutes. Because these predictions assume an “expert” user, this finding raised a concern: for non-expert users, actual completion times are likely to be substantially longer. In fact, research shows that when performing tasks online, the slowest 25% of users take 2.4 times as long as the fastest 25%.[*]

In addition to using CogTool to measure task efficiency, the study also counted the questions needed to complete each application. On average, applications contained 42 questions in total, 33 of them required.

Sample Results:

BANK NAME         TIME     REQUIRED QUESTIONS   TOTAL QUESTIONS
BB&T              4m 37s   31                   38
Principal         5m 27s   37                   43
US Bank           5m 32s   35                   45
Bank of America   6m 1s    19                   29
Wells Fargo       7m 38s   31                   42
CitiBank          10m+     37                   46

Table 1: Predicted times shown for a user to complete an online checking account application.

The following table shows efficiency excluding time spent reading disclosures. This was important to review because previous research indicates that users do not typically read disclosures in full; almost 80% of users scan a page rather than reading it word for word online. (Source: Sun)

BANK NAME          TIME
Fifth Third Bank   3m 55s
BB&T               4m 37s
Bank of America    5m 8s
Principal          5m 27s
CitiBank           5m 30s
US Bank            5m 32s
Wells Fargo        6m 2s

Table 2: Predicted times shown without the task of reading disclosures.

The efficiency of opening a checking account online is only one obstacle to overcome. The UX team at Metrics has used survey research, usability testing, and numerous other methods to uncover common UX issues with this process. Key findings from these other methods include frequent use of jargon, users not knowing what to do next, users failing to see site errors or not knowing how to correct them, concerns about site credibility and security, and too many questions, leading to frustration, abandonment, and failure to complete, to highlight just a few.

Additional Points of Measurement

Incorporating multiple points of measurement to evaluate the user experience often yields the best results. According to Forrester Research, 58% of visitors who experience usability problems on a website never return. In addition, the average website can double its conversion rate by designing for increased usability. (Source: IBM)

Evaluating efficiency and task times is a helpful way to benchmark, identify elements of a task that take longer than others, or compare design ideas. For interfaces where efficiency is the most vital evaluation metric, CogTool can be a very helpful tool. It provides just one point of measurement for evaluating a site’s usability in order to improve conversion and the overall customer experience. However, it is important to keep in mind that efficiency testing only goes so far.

It does NOT tell us:

  • The “Whys” of User Behavior – We might know how long it will take an expert to complete a task, but we do not know why it takes actual site users longer. We do not know why a user might be unwilling to complete the application.
  • Potential Risks in Design – We are unable to identify places where users will be confused, unwilling or unable to convert, or have a negative experience.
  • Actual User Expectations and Needs – Task times do not provide insight into what actual users want to do, what they expect, what their goals are, how they go about selecting a product in the first place, or what overall process this interface fits into. For example, we can answer the question of how long it would take an expert user to complete the online checking account application, but not if that expert would even want to complete it in the first place.
  • Non-Expert User Performance – CogTool and the KLM methodology are designed to measure expert use. However, for most interfaces we interact with—whether it’s a website or a DVR programming screen—actual users often do not use it frequently enough to become an expert.

Other user research methods, such as usability testing, heuristic evaluations, user personas, card sorting, focus groups, and surveys, are essential to fully evaluate the user experience and maximize conversion.

To comprehensively measure and benchmark, additional points of measurement to consider include:

  • Task Completion Success / Failure
  • Time to Complete Task (of actual system users)
  • Steps to Complete Task
  • Frustration / Delight
  • Number of Errors
  • Severity of Errors
  • Learnability
  • Memorability
  • System Usability Scale (SUS) Score
  • Eye Tracking Measurements, such as:
    • Gaze Sequence
    • Hit Rate
    • Dwell Time
  • Subjective Satisfaction

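Of the measures above, the System Usability Scale has a simple, standard scoring rule worth spelling out: each of the 10 items is rated 1–5, odd (positively worded) items contribute the rating minus 1, even (negatively worded) items contribute 5 minus the rating, and the total is multiplied by 2.5 to yield a 0–100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from 10 ratings (each 1-5).

    Odd-numbered items are positively worded: contribution = rating - 1.
    Even-numbered items are negatively worded: contribution = 5 - rating.
    The summed contributions are scaled by 2.5 to a 0-100 range.
    """
    assert len(responses) == 10
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# All-neutral responses (3 on every item) land exactly in the middle:
print(sus_score([3] * 10))  # → 50.0
```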
Choose points of measurement based upon the overall goals of the interface, as well as the areas of the user experience that specifically need to be benchmarked and measured.


Cathleen Zapata
Cathleen Zapata is currently the Vice President of Research and Customer Experience at Metrics Marketing Group, an analytics-driven database and interactive marketing firm in Cleveland. She is also the President of the Northeast Ohio chapter of the Usability Professionals’ Association, one of the most trusted organizations in the field of usability with over 2500 members worldwide. Cathy has over 15 years experience in quantitative and qualitative research, specializing in usability. Her background includes a wealth of experience in usability testing, eye tracking, heuristic evaluations, user research and conversion enhancement strategies. She is the architect and manager of MetricsLabSM, a groundbreaking research facility in Cleveland, Ohio, making Metrics Marketing the only interactive marketing firm with a full solution of both owned in-lab and mobile eye tracking across the entire Midwest. Cathy has been featured in MarketingSherpa, About.com, Crain's Cleveland Business and The Columbus Dispatch, and regularly teaches on topics of usability and conversion strategies at colleges, conferences and events internationally, including Shop.org, BAI Retail Delivery, UPA, and more. She has worked with numerous Fortune 500 companies to improve their user experiences including American Greetings, Sherwin-Williams, Coach, Inc., Midas, Speedway, KeyBank, Citizens Bank, The Cleveland Clinic, Medical Mutual of Ohio, Nationwide Insurance, and more.
