Over the last month, Jonathan from the Web Services team has carried out an audit of guide pages on the University website. It’s been an incredibly useful exercise, not least because it’s made me reflect on why we created guides in the first place.
Guides were one of the first content types we developed in 2019, in the early days of the website transformation project. Back then, we understood that users visit a website primarily to access information that helps them perform a task. This is obviously still the case today.
As users arrive at a web page with a specific information need, it’s vital that we provide answers to their questions. To make it easy for them, we should do this through content that is clear, in plain English, and optimised for the web.
User versus business benefits
Naturally, we tend to focus on the benefit to users when arguing the case for useful web content, but there are sound business reasons for this approach too.
- It increases customer satisfaction.
- It impacts positively on the University brand and enhances our reputation.
- It takes pressure off our other organisational channels to provide information (e.g. support calls and emails).
- If content is created based on user needs then it should impact positively on our search rankings for these pages.
Task-based content
Guides were our solution for structuring task-based content, and the audit has shown that they are some of the most popular pages on the website – see table below. Many relate to IT or student recruitment tasks like:
- Accessing your email
- English language requirements
- Paying your tuition fees
Page views for popular guides
Over a period of 12 months, guide pages as a whole received almost 950,000 visits. Interestingly, that traffic is split around 50/50 between desktop and mobile. That poses a question about the optimal amount of content for these pages. We know from previous user testing sessions that mobile users get frustrated when there’s a lot of content on a page. If we end up with a particularly long guide page, it might be an indication that we’re trying to fulfil too many user needs on a single page and that it would be better split into separate pages.
However, page visits don’t really tell the full story about whether the content is useful.
Bounce rate
Bounce rate was another metric we looked at in the guides audit. Bounce rate indicates the percentage of visitors who arrive at a page on a website and then leave without visiting any other pages. A low bounce rate is often taken to indicate page content that engages users enough to visit other pages on the website. Conversely, a high bounce rate can suggest that users haven’t found what they were looking for and have gone elsewhere.
The ‘Access your email using Outlook on the web’ guide has a bounce rate of 82%. Is this a poorly performing page? Well, no, because the call-to-action link prompts the user to leave the website and visit outlook.com. If users reach this link, they have completed their task – a happy user, and the guide has fulfilled its intended purpose!
Most discussions about whether bounce rate is useful as a performance metric are prefaced with the words “well, it depends”, and guides are no exception.
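To make the metric concrete, here is a minimal sketch of how bounce rate can be computed from raw page-view data. The data and page paths are hypothetical, and for simplicity a “bounce” is treated as any session that viewed only a single page (real analytics tools also account for which page the session landed on):

```python
from collections import Counter

# Hypothetical analytics export: one (session_id, page) row per page view.
page_views = [
    ("s1", "/guides/access-your-email"),
    ("s2", "/guides/access-your-email"),
    ("s2", "/guides/paying-tuition-fees"),
    ("s3", "/guides/access-your-email"),
]

def bounce_rates(views):
    """Bounce rate per page: single-page sessions on that page,
    divided by all sessions that viewed the page."""
    pages_per_session = {}
    for session, page in views:
        pages_per_session.setdefault(session, []).append(page)

    viewed = Counter()   # sessions that viewed the page at all
    bounced = Counter()  # sessions that viewed only this page
    for pages in pages_per_session.values():
        for page in set(pages):
            viewed[page] += 1
        if len(set(pages)) == 1:
            bounced[pages[0]] += 1

    return {page: bounced[page] / viewed[page] for page in viewed}

print(bounce_rates(page_views))
# The email guide is viewed in three sessions, two of which are
# single-page, giving a bounce rate of about 67% in this toy data.
```

As the 82% example shows, a high number here is not automatically bad – the interpretation depends on what the page asks the user to do next.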
Beyond quantitative metrics
One of the qualities we aim for across all website content is consistency. If we are consistent in the way guide content is written and structured, it builds trust between the organisation and the user. If users trust us, they can confidently make decisions.
To understand if we’re taking a consistent approach to creating guide content, we need to move beyond quantitative metrics and look at the quality of the content itself.
As part of the audit we (or should I say Jonathan mainly) reviewed each of the 348 guides on the website – yes it took a while! This process involved:
- Establishing if the content had been updated recently.
- Contacting owners to check if the page was still needed.
- Reviewing the content against our guide standards.
From this we discovered that:
- 160 guides had not been updated in the last 12 months
- 21 guides had not been updated in the last 2 years
This doesn’t necessarily mean the content is inaccurate, but it could be taken as a warning sign that it might be and needs to be checked.
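This kind of staleness check is easy to automate once last-updated dates are held in a content inventory. A minimal sketch, using made-up guide titles and dates and an approximate month length:

```python
from datetime import date, timedelta

# Hypothetical inventory rows: (guide title, last-updated date).
guides = [
    ("Accessing your email", date(2021, 3, 1)),
    ("English language requirements", date(2019, 6, 15)),
    ("Paying your tuition fees", date(2022, 1, 10)),
]

def stale_guides(inventory, today, months=12):
    """Return guides whose last update is older than roughly `months` months."""
    cutoff = today - timedelta(days=months * 30)  # approximate month length
    return [title for title, updated in inventory if updated < cutoff]

today = date(2022, 3, 1)
print(stale_guides(guides, today))             # not updated in ~12 months
print(stale_guides(guides, today, months=24))  # not updated in ~2 years
```

Running a script like this on a schedule would surface the “needs checked” list automatically, rather than waiting for the next manual audit.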
Defining ownership
We archived 50 guides after confirming with staff that the content was no longer needed. However, during this process we realised that it was often difficult to establish who was responsible for content. Sometimes it wasn’t obvious who to contact or, even if it was, a response wasn’t forthcoming. It was clear that we needed a better process for defining ownership and the responsibilities of the role.
Going forward, our aim is for all guides to have a named content owner. In practice, this will usually be the person who works with Web Services to create the content in the first instance.
Once we establish a content owner, we can check if the content is:
- valid (whether it is accurate)
- relevant (whether our users care about it)
When speaking to content owners, it is easier for them to confirm validity than relevancy. Content owners often just assume that because the content is published there will be an audience for it – this isn’t always the case!
Obviously, there will be considerations around content owners moving to different jobs or if the guide overlaps with different services, but we can take a pragmatic approach to this.
Content standards
Our guide standards were written back in 2020 and, whilst we’ve adhered to many of the principles defined in them, the audit identified guides that could be improved in terms of content clarity and plain English.
Documentation and standards are essential, but they also need to be accompanied by a defined content creation process that is followed for every guide. This is something we intend to remedy with an updated workflow for guides that can be followed during projects and will be documented in our standards.
The starting point for content creation is establishing, from evidence, a clear user need for a guide. If the user need doesn’t exist then neither should the content!
Test published guides with users
We put a lot of emphasis on testing content with users during the content creation process but less so once it’s been published. Content relevancy can diminish over time so it’s important that we test guides with users regularly to establish whether the content is still needed.
The sheer number of guides on the website makes user testing all of them a challenge. One way of prioritising the guides to test with users is to consider whether the content relates to a top task.
A top task is one that occurs frequently across our audience needs. Gerry McGovern (who created the top task methodology) recommends that it’s best to…
“focus on what really matters (the top tasks) and defocus on what matters less (the tiny tasks).”
Taking this on board, we intend to focus testing on content related to these top tasks. User testing sessions can be time-consuming, so it would be interesting to explore other techniques like guerrilla testing to quickly gather feedback from staff and students and use this as a basis for refining and improving guide content.
Make more time for content maintenance
Guides are definitely one of the success stories of the website transformation project, but the audit has shown that we can’t take this for granted.
We can maintain high standards by making time for content maintenance and improving our standards and processes. If we can treat content more like a product that has a lifespan beyond publish date, to be refined and improved, then ultimately it will work harder for the business and help users over a much longer period.