Centering Data on a Technical Writing Team
We make hundreds of decisions every day at our jobs. I have found that at most companies, the majority of those decisions are made based on gut feeling and experience rather than data. As a technical writer, it can be difficult to know what data should guide your decisions. As you document features, best practices, and getting-started guides, you hope that you are making each user's life easier. But how can you use quantifiable metrics to demonstrate that you have? And what data can you use to guide future decisions about your technical writing and knowledge management strategy?
I have previously written about how we bootstrapped a usability testing program to use data to guide key product UX decisions (Optimizing UX: How to set up your own Usability Testing Program in-house). Continuing that theme, let's look at using data to bolster gut decisions by walking through how we defined tangible KPIs to measure and assess the success of our technical writing program.
When we began working toward a data-centered philosophy for our technical writing team, our first step was to define tangible metrics to track success. In defining those metrics, it was essential that they correlate with our company's high-level goals and track progress toward achieving them. For us, that meant focusing on how documentation lightens the load on our support team:
- If public documentation exists to answer a customer's question, the support team can simply send a link rather than reinvent the wheel each time the same question comes in.
- If internal documentation exists to answer the question, the support team can self-educate. Equipped with the necessary knowledge, they can respond to questions without meeting with peers or embarking on wild goose chases through email history or shared file networks.
Both of these goals can be difficult to quantify. Here’s how we measure them:
- Track the number of support tickets per customer: We tracked the number of support tickets per customer over time to see whether it trended downward as we built out our help center and internal resources. This is a nuanced number, because an increase in support tickets can also correlate with increased customer engagement, increased usage of new features, and so on, so we were careful not to read this metric in isolation. (A sketch of how these ticket metrics might be computed follows this list.)
- Track the number of support tickets associated with documentation requests: We have a process that allows anyone at Tasktop to submit a documentation request if they aren't able to find the information they need in our help center. By tracking the percentage of support tickets associated with documentation requests each quarter, we were able to measure how likely it is that the support team can find the information they need in our existing resources.
- Measure support ticket categories: Our support team uses tags to categorize the types of questions they receive — covering themes like upgrades, user management, and error messages. Each quarter, we review the top trends, and then use those trends to drive future areas of focus for our team and to assess whether previous documentation efforts to address support trends have been successful.
- Good ol’ pageviews: We also use a web analytics tool to track pageviews for our user docs. We pay particular attention to pages covering new product functionality, to understand whether users turn to our documentation to learn about those new features. In addition to pageviews, we also look at user sessions, bounce rates, and time on each page to get a fuller picture of user engagement. (A sketch of this kind of report also follows this list.)
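To make the ticket metrics above concrete, here is a minimal sketch of how they might be computed from a help desk export. Everything specific here is an assumption for illustration: the CSV file, the column names (ticket_id, customer, created_at, tags), and the "docs-request" tag are hypothetical, not the schema of our actual support tooling.

```python
import pandas as pd

# Hypothetical export: one row per ticket, with a comma-separated tags column.
tickets = pd.read_csv("support_tickets.csv", parse_dates=["created_at"])
tickets["quarter"] = tickets["created_at"].dt.to_period("Q")

# Metric 1: average tickets per customer, per quarter. Watch the trend,
# not the absolute value; growing product usage also raises ticket counts.
tickets_per_customer = (
    tickets.groupby(["quarter", "customer"]).size().groupby(level="quarter").mean()
)

# Metric 2: percentage of tickets carrying the (hypothetical) docs-request tag.
is_doc_request = tickets["tags"].fillna("").str.contains("docs-request")
doc_request_pct = is_doc_request.groupby(tickets["quarter"]).mean() * 100

# Metric 3: top five ticket categories per quarter, to guide future docs work.
tags = tickets.assign(tag=tickets["tags"].fillna("").str.split(",")).explode("tag")
tags["tag"] = tags["tag"].str.strip()
top_tags = (
    tags[tags["tag"] != ""]
    .groupby(["quarter", "tag"])
    .size()
    .sort_values(ascending=False)
    .groupby(level="quarter")
    .head(5)
    .sort_index()
)

print(tickets_per_customer, doc_request_pct, top_tags, sep="\n\n")
```

Most help desk tools can produce a similar export, and the same aggregations can often be built as saved reports inside the tool itself.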
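The pageview report can be sketched the same way, assuming the analytics tool can export per-page stats to CSV. Again, the file name, column names, and the /help/new-features/ path filter are illustrative assumptions rather than our actual setup.

```python
import pandas as pd

# Hypothetical per-page export from a web analytics tool.
stats = pd.read_csv("docs_analytics_export.csv")

# Focus on docs pages covering new functionality (hypothetical path prefix).
new_feature_docs = stats[stats["page_path"].str.startswith("/help/new-features/")]

# Rank by pageviews, but keep the engagement signals alongside: a page with
# high views, a high bounce rate, and low time on page is a flag, not a win.
report = new_feature_docs.sort_values("pageviews", ascending=False)[
    ["page_path", "pageviews", "sessions", "bounce_rate", "avg_time_on_page"]
]
print(report.head(10).to_string(index=False))
```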
And there’s more! As our technical writing team grows more sophisticated, we are excited to explore click-path analytics tools to track anonymous behavioral trends across users, such as:
- What percentage of users clicked the help icon on a new page of our product?
- What page are users frequently on right before they click the “Contact Support” link? (And a follow-up question: are there ways to surface help resources on that page to decrease support clicks?) A sketch of this kind of analysis follows this list.
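To show what answering these questions could look like, here is a minimal sketch over a hypothetical anonymized event log with columns user_id, timestamp, page, and event; the event names and page path are invented for illustration.

```python
import pandas as pd

# Hypothetical anonymized event log: user_id, timestamp, page, event.
events = pd.read_csv("product_events.csv", parse_dates=["timestamp"])
events = events.sort_values(["user_id", "timestamp"])

# Question 1: what share of users who viewed a new page clicked its help icon?
NEW_PAGE = "/app/new-feature"  # hypothetical page path
viewed = set(
    events.loc[(events["event"] == "pageview") & (events["page"] == NEW_PAGE), "user_id"]
)
clicked = set(
    events.loc[(events["event"] == "help_click") & (events["page"] == NEW_PAGE), "user_id"]
)
if viewed:
    print(f"{len(clicked & viewed) / len(viewed):.1%} of visitors clicked help")

# Question 2: which pages do users most often leave via "Contact Support"?
support_clicks = events[events["event"] == "contact_support_click"]
print(support_clicks["page"].value_counts().head(5))
```

Pages that top that second list are the natural candidates for embedded help resources.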
These metrics have helped us measure our team’s success, but more importantly, they have given us a foundation of data to guide our future decisions.
Written by: Rebecca Dobbin
Originally published at https://www.tasktop.com on December 3, 2019.