Putting Your Content to Work: A user-centric approach to evaluation

by Lindy Roux
4 min read
Taking stock of your content’s punch and relevance to users reveals pathways to rewarding interactions.

Too often, we view digital content from the perspective of the organization or business, rather than the user. No matter how beautiful your site design, how elegant your user interface, or how appropriate your information architecture, if your content is not what your users are looking for, their entire experience can be compromised.

For most organizations, digital content is a critical element in establishing the company’s credentials, reinforcing the brand, supporting sales, and delivering value through thought leadership. But excessive content can dilute that value, and it can be difficult for a company to be objective about what works and what doesn’t.

Typically, Siteworx finds that a majority of the content views on a website (up to 80%) are concentrated around a small number of pages (as few as 10%). Of course this ratio varies by industry and by company, but there is almost always a cluster of more popular pages and a long tail of pages viewed far less frequently (or not at all).

Increased mobile usage and the trend towards responsive design have also necessitated a closer look at content, with a focus on delivering only the most important content on any given screen.

So how should we, as designers, go about assessing what content is important? Using a score sheet to evaluate your content, both qualitatively and quantitatively, can provide an apples-to-apples comparison that helps determine just how valuable content is to your users. Content scores can be used to rationalize old content, track the effectiveness of current content, and steer the focus of future content.

How It’s Done

Quantitative measures that can be used to evaluate the value of your content to your users include:

  • Page views: The more popular your content is, the more likely it is to meet your users’ needs. Of course, lack of page views does not automatically disqualify content, since it could just be hard to find on your site.
  • Time on page: This is a tricky metric, since some pages are designed to be navigational, which means we want the time on page to be low. It is best to confine this measurement to content-rich pages, designed to be immersive.
  • Date published and/or updated: The more recently content was published or updated, the likelier it is to be relevant and current (although this may not always be the case, since some content is evergreen).
  • Number of inbound links: Links or references to your web page, particularly from third-party sites, indicate its value to the community.

Since the objective of the content scoring exercise is to develop an aggregate score for comparative evaluation, we typically rate each of these measures on a scale of one to five, assessing overall site metrics as well as industry standards to create benchmark levels for a particular site or experience. For example, 10 page views could mean a score of one for some sites but five for others.
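
To make the benchmarking step concrete, here is a minimal sketch in Python that converts raw metric values into one-to-five scores against per-metric thresholds. The metric names and threshold values are hypothetical placeholders; in practice they would come from your own site analytics and industry benchmarks.

```python
# A minimal sketch of benchmark-based scoring. The thresholds below are
# hypothetical; real benchmarks come from your own analytics and industry data.
from bisect import bisect_right

# Four ascending thresholds per metric; clearing n of them earns a score of n + 1,
# so every metric lands on the same one-to-five scale.
BENCHMARKS = {
    "page_views":        [50, 200, 1000, 5000],   # hypothetical monthly views
    "time_on_page":      [15, 45, 90, 180],       # hypothetical seconds (content-rich pages only)
    "days_since_update": [730, 365, 180, 90],     # lower is better; thresholds are staleness ceilings
    "inbound_links":     [1, 5, 20, 50],          # hypothetical link counts
}

def score_metric(metric: str, value: float) -> int:
    """Convert a raw metric value into a 1-5 score against its benchmarks."""
    thresholds = BENCHMARKS[metric]
    if metric == "days_since_update":
        # Lower is better: count how many staleness ceilings the page beats.
        return 1 + sum(value <= t for t in thresholds)
    # Higher is better: count how many thresholds the value clears.
    return 1 + bisect_right(thresholds, value)

# Example: with these thresholds, 1,200 monthly page views scores a 4, while the
# same traffic on a higher-volume site (higher thresholds) might score a 1.
print(score_metric("page_views", 1200))  # -> 4
```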

Similarly, the qualitative evaluation rates content on a scale of one to five for the following factors:

  • Scannability: Can a reader glance at a page and tell you the basic gist of the content? Look for use of bullets, numbers, and sub-headings.
  • Compelling: Is there a clear call to action on the page and compelling language around it? Look for cues as to what the user should do with this content—share it, bookmark it, click for more information, etc.
  • Current: Does the page reference events that have passed and/or information that is no longer true?
  • Relevant: Is your content consistent with user needs and business goals? Do its topics appear frequently in internal or external searches or on social media?
  • Search engine optimized: Does your content incorporate SEO keywords appropriately?
  • Consistent with brand attributes: If your brand is friendly and playful, does the language on your site reflect these attributes?
  • Accessible, plain language: Does your content comply with plain language standards?
  • Grammar/spelling: Does your content contain grammatical or spelling errors?

Combining the qualitative and quantitative scores will give you a picture of the relative quality and relevance of your content. This assessment can then inform a content rationalization exercise: content that scores low on both qualitative and quantitative measures is targeted for decommissioning.

On the other hand, content that scores well quantitatively, but poorly qualitatively, may need revising or updating, while content that scores well across the board might be kept or migrated as-is. This methodology has been helpful to companies Siteworx has worked with, reducing redundant and outdated content (in some cases by up to 90%) and improving the quality of the remaining content.
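
As an illustration only, the decision logic described above can be captured in a few lines. The sketch below assumes the qualitative and quantitative ratings have already been averaged onto the one-to-five scale, and it uses an arbitrary cutoff of 3 to separate "low" from "high"; neither the names nor the cutoff are part of the Siteworx methodology.

```python
# A minimal sketch of the rationalization buckets described above.
# The cutoff is an arbitrary, illustrative choice; tune it per site.
from dataclasses import dataclass

LOW_CUTOFF = 3.0

@dataclass
class PageScore:
    url: str
    quantitative: float  # average of the 1-5 quantitative ratings
    qualitative: float   # average of the 1-5 qualitative ratings

def recommend(page: PageScore) -> str:
    """Map a page's combined scores to a rationalization action."""
    low_quant = page.quantitative < LOW_CUTOFF
    low_qual = page.qualitative < LOW_CUTOFF
    if low_quant and low_qual:
        return "decommission"            # weak performance and weak quality
    if not low_quant and low_qual:
        return "revise or update"        # users find it, but the content needs work
    if low_quant and not low_qual:
        # Good content that few users see; as noted earlier, it may just be hard to find.
        return "improve findability"
    return "keep or migrate as-is"       # strong across the board

print(recommend(PageScore("/pricing", quantitative=4.2, qualitative=2.1)))  # -> revise or update
```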

In addition to a one-off rationalization exercise, the content scorecard can be used to assess the value and performance of your content on an ongoing basis. Typically, we recommend conducting regular content reviews in batches (rather than reviewing your entire universe of content at once). Using a baseline score, you can assess whether your content has continued to perform, and decommission or revise as needed.
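
If the baseline score from the initial audit is stored alongside each page, the ongoing check can be as simple as the sketch below; the half-point tolerance is an arbitrary, illustrative threshold.

```python
# A minimal sketch of an ongoing review against a stored baseline.
TOLERANCE = 0.5  # hypothetical drop allowed before a page is flagged

def needs_review(baseline: float, current: float) -> bool:
    """Flag a page whose combined score has slipped below its baseline."""
    return current < baseline - TOLERANCE

print(needs_review(baseline=4.0, current=3.2))  # -> True
```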

Finally, a content scorecard may reveal topics or content styles that consistently perform well. We worked with a B2B organization that used this information actively in managing its editorial calendar. Content consumption increased because what it was producing was more relevant to its audience.

Conclusion

A content-first approach to user experience design requires a rigorous and user-focused assessment of your content in order to deliver only the content the user wants, when they want it. What criteria are you using to assess and optimize your content?

 

Image of tools courtesy Shutterstock

Lindy Roux
As the Principal Content Strategist at Siteworx, Lindy is responsible for articulating, guiding, and overseeing the various content services that we deliver, including content audits and strategy sessions, migration planning, taxonomy development, content definition, modeling, and creation. She has over 15 years of experience defining, creating, and managing engaging digital experiences that resonate with required audiences and drive conversion, helping to meet business and user goals. Lindy has worked on a broad range of projects including enterprise-level websites, intranets, e-commerce sites, experiential microsites, online communities, mobile sites and applications, and email and social media campaigns, across a number of verticals, including retail, telecommunications, consulting, healthcare, banking, insurance, pharma, and food.

When not busy facilitating strategy or brainstorming sessions, Lindy can be found evangelizing content strategy and talking about our content-centric approach to user experience. Recent speaking engagements include a workshop on Responsive Design at User Focus 2012; a panel on mobile content strategy at Content Strategy Applied 2012; and a workshop on multi-channel content strategy at a Content Mavens Meetup.
