
Putting Your Content to Work: A user-centric approach to evaluation

by Lindy Roux
4 min read

Taking stock of your content’s punch and relevance to users reveals pathways to rewarding interactions.

Too often, we view digital content from the perspective of the organization or business, rather than the user. No matter how beautiful your site design, how elegant your user interface, or how appropriate your information architecture, if your content is not what your users are looking for, their entire experience can be compromised.

For most organizations, digital content is a critical element in establishing the company’s credentials, reinforcing the brand, supporting sales, and delivering value through thought leadership. But excessive content can dilute the value of your message and it can be difficult for a company to be objective about what works and what doesn’t.

Typically, Siteworx finds that a majority of the content views on a website (up to 80%) are concentrated around a small number of pages (as few as 10%). Of course this ratio varies by industry and by company, but there is almost always a cluster of more popular pages and a long tail of pages viewed far less frequently (or not at all).

Increased mobile usage and the trend towards responsive design have also necessitated a closer look at content, with a focus on delivering only the most important content on any given screen.

So how should we, as designers, go about assessing what content is important? Using a scorecard to evaluate your content, both qualitatively and quantitatively, can provide an apples-to-apples comparison that helps determine just how valuable content is to your users. Content scores can be used to rationalize old content, track the effectiveness of current content, and steer the focus of future content.

How It’s Done

Quantitative measures that can be used to evaluate the value of your content to your users include:

  • Page views: The more popular your content is, the more likely it is to meet your users’ needs. Of course, lack of page views does not automatically disqualify content, since it could just be hard to find on your site.
  • Time on page: This is a tricky metric, since some pages are designed to be navigational, which means we want the time on page to be low. It is best to confine this measurement to content-rich pages designed to be immersive.
  • Date published and/or updated: The more recently content was published or updated, the likelier it is to be relevant and current (although this may not always be the case, since some content is evergreen).
  • Number of inbound links: Links or references to your web page, particularly from third-party sites, indicate its value to the community.

Since the objective of the content scoring exercise is to develop an aggregate score for comparative evaluation, we typically rate each of these measures on a scale of one to five, assessing overall site metrics as well as industry standards to create benchmark levels for a particular site or experience. For example, 10 page views could mean a score of one for some sites but five for others.
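
To make that benchmarking step concrete, here is a minimal sketch, in Python, of how raw metrics might be mapped onto the one-to-five scale against site-specific thresholds. The metric names, cut-offs, and example values are hypothetical; in practice they would come from your own analytics baseline and industry benchmarks.

```python
# Minimal sketch: normalize raw quantitative metrics to a 1-5 score against
# site-specific benchmark thresholds. The thresholds and example values below
# are hypothetical; real cut-offs come from your own analytics and industry data.

def score_metric(value: float, thresholds: list[float]) -> int:
    """Return a 1-5 score: 1 below the first threshold, 5 at or above the last."""
    score = 1
    for cutoff in thresholds:
        if value >= cutoff:
            score += 1
    return min(score, 5)

# Example benchmark levels for one hypothetical site.
BENCHMARKS = {
    "page_views": [50, 200, 1000, 5000],   # monthly page views
    "time_on_page": [30, 60, 120, 240],    # seconds, content-rich pages only
    "inbound_links": [1, 5, 20, 50],       # links from third-party sites
}

page = {"page_views": 750, "time_on_page": 95, "inbound_links": 8}
quantitative_scores = {m: score_metric(page[m], BENCHMARKS[m]) for m in page}
print(quantitative_scores)  # {'page_views': 3, 'time_on_page': 3, 'inbound_links': 3}
```

Freshness (date published or updated) would use an inverted scale, since fewer days since the last update should generally earn a higher score.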

Similarly, the qualitative evaluation rates content on a scale of one to five for the following factors (a simple scorecard sketch follows the list):

  • Scannability: Can a reader glance at a page and grasp the basic gist of the content? Look for the use of bullets, numbered lists, and subheadings.
  • Compelling: Is there a clear call to action on the page and compelling language around it? Look for cues as to what the user should do with this content—share it, bookmark it, click for more information, etc.
  • Current: Is the page free of references to events that have passed or information that is no longer true?
  • Relevant: Is your content consistent with user needs and business goals? Do its topics occur frequently in internal or external searches or on social media?
  • Search engine optimized: Does your content incorporate SEO keywords appropriately?
  • Consistent with brand attributes: If your brand is friendly and playful, does the language on your site reflect these attributes?
  • Accessible, plain language: Does your content comply with plain language standards?
  • Grammar/spelling: Is your content free of grammatical and spelling errors?
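
Below is a minimal sketch of such a scorecard, assuming each factor is rated one to five by a reviewer and the ratings are averaged into a single qualitative score. The factor keys and example ratings are invented purely for illustration.

```python
# Minimal sketch of a qualitative scorecard: each factor is rated 1-5 by a
# reviewer and the ratings are averaged into a single qualitative score.
# The example ratings below are invented for illustration only.

QUALITATIVE_FACTORS = [
    "scannability",
    "compelling",
    "current",
    "relevant",
    "seo",
    "brand_consistency",
    "plain_language",
    "grammar_spelling",
]

def qualitative_score(ratings: dict[str, int]) -> float:
    """Average the 1-5 ratings for the factors that were reviewed."""
    reviewed = [ratings[f] for f in QUALITATIVE_FACTORS if f in ratings]
    return round(sum(reviewed) / len(reviewed), 1)

example_ratings = {
    "scannability": 4, "compelling": 2, "current": 5, "relevant": 4,
    "seo": 3, "brand_consistency": 4, "plain_language": 3, "grammar_spelling": 5,
}
print(qualitative_score(example_ratings))  # 3.8
```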

Combining the qualitative and quantitative scores will give you a picture of the relative quality and relevance of your content. This assessment can then inform a content rationalization exercise. Content that scores low on both qualitative and quantitative measures is targeted for decommissioning.

On the other hand, content that scores well quantitatively, but poorly qualitatively, may need revising or updating, while content that scores well across the board might be kept or migrated as-is. This methodology has been helpful to companies Siteworx has worked with, reducing redundant and outdated content (in some cases by up to 90%) and improving the quality of the remaining content.
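
One way to operationalize that rationalization logic is a simple quadrant rule over the two aggregate scores. The sketch below assumes a cut-off of 3 on the one-to-five scale, and the fourth quadrant (content that reads well but is rarely viewed, perhaps because it is hard to find) is an inference rather than something the method prescribes.

```python
# Minimal sketch of the rationalization decision described above: content is
# bucketed by its aggregate quantitative and qualitative scores. The cut-off
# of 3 on the 1-5 scale is an assumption chosen for illustration.

def rationalize(quantitative: float, qualitative: float, cutoff: float = 3.0) -> str:
    if quantitative < cutoff and qualitative < cutoff:
        return "decommission"          # low value, low quality
    if quantitative >= cutoff and qualitative < cutoff:
        return "revise"                # popular but poorly written: update it
    if quantitative < cutoff and qualitative >= cutoff:
        return "promote or merge"      # well written but rarely found (inference)
    return "keep / migrate as-is"      # performs well on both dimensions

print(rationalize(quantitative=2.1, qualitative=1.8))  # decommission
print(rationalize(quantitative=4.2, qualitative=2.4))  # revise
print(rationalize(quantitative=4.5, qualitative=4.0))  # keep / migrate as-is
```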

In addition to a one-off rationalization exercise, the content scorecard can be used to assess the value and performance of your content on an ongoing basis. Typically, we recommend conducting regular content reviews in batches (rather than reviewing your entire universe of content at once). Using a baseline score, you can assess whether your content has continued to perform, and decommission or revise as needed.
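
A baseline comparison for those ongoing reviews can be as simple as the sketch below, which flags pages whose combined score has dropped by more than a chosen amount. The drop threshold of 1.0 and the page paths are hypothetical.

```python
# Minimal sketch of an ongoing review: compare each page's latest combined
# score against its baseline and flag pages that have slipped. The drop
# threshold of 1.0 is a hypothetical choice, not a prescribed value.

def flag_for_review(baseline: dict[str, float], latest: dict[str, float],
                    max_drop: float = 1.0) -> list[str]:
    """Return the pages whose combined score fell by more than max_drop."""
    return [page for page, score in latest.items()
            if page in baseline and baseline[page] - score > max_drop]

baseline_scores = {"/pricing": 4.2, "/blog/old-webinar": 3.5, "/about": 3.9}
latest_scores = {"/pricing": 4.0, "/blog/old-webinar": 2.1, "/about": 3.8}
print(flag_for_review(baseline_scores, latest_scores))  # ['/blog/old-webinar']
```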

Finally, a content scorecard may reveal topics or content styles that always perform well. We worked with a B2B organization that used this information actively in managing their editorial calendar. Content consumption increased because what they were producing was more relevant to their audience.

Conclusion

A content-first approach to user experience design requires a rigorous and user-focused assessment of your content in order to deliver only the content the user wants, when they want it. What criteria are you using to assess and optimize your content?

 

Image of tools courtesy Shutterstock

About the author

As the Principal Content Strategist at Siteworx, Lindy Roux is responsible for articulating, guiding, and overseeing the content services that Siteworx delivers, including content audits and strategy sessions, migration planning, taxonomy development, and content definition, modeling, and creation. She has over 15 years of experience defining, creating, and managing engaging digital experiences that resonate with target audiences, drive conversion, and help meet business and user goals. Lindy has worked on a broad range of projects, including enterprise-level websites, intranets, e-commerce sites, experiential microsites, online communities, mobile sites and applications, and email and social media campaigns, across a number of verticals, including retail, telecommunications, consulting, healthcare, banking, insurance, pharma, and food. When not facilitating strategy or brainstorming sessions, Lindy can be found evangelizing content strategy and talking about Siteworx’s content-centric approach to user experience. Recent speaking engagements include a workshop on Responsive Design at User Focus 2012; a panel on mobile content strategy at Content Strategy Applied 2012; and a workshop on multi-channel content strategy at a Content Mavens Meetup.
