One challenge I consistently face while maintaining technical documentation is ensuring it evolves in line with user needs and expectations.
The solution? Actively engage with and listen to your users. Their feedback doesn't just point out what's missing or off the mark but gives a real sense of the documentation's pain points.
In this article, I'll share the specific strategies I use to collect and integrate user feedback into my documentation process.
Getting feedback before publishing
As docs writers, it's vital to be the first users of our documentation, testing its utility and ensuring its relevance firsthand.
Before sharing documentation more broadly, an internal review is always a good idea. This helps catch any overlooked issues.
Ideally, I seek feedback from three distinct profiles:
A subject matter expert: To ensure the content is technically accurate.
A fellow technical writer: To have a second opinion on clarity and adherence to style standards.
Someone without previous context: To have a fresh perspective on clarity and usefulness. This could be an individual within your organization unfamiliar with the specific feature or product being documented.
Securing timely commitments from reviewers is essential, as delays can affect the release timeline. If all three reviews aren't feasible within the schedule, I prioritize feedback from the SME and another technical writer. This ensures that both technical accuracy and stylistic consistency checks pass before publishing.
One last tip: Factor this review period into your estimates, as it's frequently overlooked.
Adding direct feedback mechanisms
A quick way for users to share their thoughts is by having a feedback feature right on the documentation page.
If your readers are mostly developers, a link to the GitHub repo might be enough. But this method can be limiting: Some users might not be familiar with GitHub. Plus, they'd need to leave the documentation page, and the feedback isn't anonymous, which can deter some users.
That's why I prefer feedback tools right on the page. This allows anonymous feedback with as few steps as possible.
For example, I use the PushFeedback widget. It's designed for documentation sites, letting users point out issues directly without leaving the page.
Disclaimer: I'm the founder of PushFeedback! It's free for open-source projects, and reasonably priced for everyone else :)
Then, I forward this feedback to our issue tracker to ensure each suggestion is addressed.
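As a sketch of that forwarding step, here's how feedback entries could be turned into issues programmatically. The feedback field names (`page`, `message`, `rating`) and the repository are hypothetical assumptions for illustration, not my actual setup:

```python
# Sketch: convert a feedback entry from an on-page widget into a
# GitHub issue payload. Field names below are hypothetical examples.

def feedback_to_issue(feedback: dict) -> dict:
    """Build a GitHub issue payload from one feedback entry."""
    rating = feedback.get("rating", "n/a")
    title = f"[docs feedback] {feedback['page']}"
    body = (
        f"Page: {feedback['page']}\n"
        f"Rating: {rating}\n\n"
        f"Message:\n{feedback['message']}"
    )
    return {"title": title, "body": body, "labels": ["documentation"]}

# Submitting it would look roughly like this (needs a token with repo scope):
# import requests
# requests.post(
#     "https://api.github.com/repos/OWNER/REPO/issues",
#     json=feedback_to_issue(entry),
#     headers={"Authorization": "Bearer <token>"},
# )
```

Even a small helper like this keeps feedback from getting lost: every widget submission lands in the same tracker the team already triages.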
Talking with users directly
Direct engagement offers deep insights that might be overlooked through other feedback channels.
Whenever I get the opportunity, I aim to have a conversation, either through video or in person, with users who've recently worked with a specific part of our documentation. Such interactions provide me a firsthand glimpse into their journey, the challenges they face, and the sections they found particularly helpful.
However, organizing these discussions can be particularly challenging when the documentation is intended for external users. So while it's a valuable method, it's more about seizing the moment when the opportunity presents itself rather than relying on it as a consistent feedback mechanism.
Conducting user testing sessions
User testing sessions, where users engage with documentation in real time, are extremely valuable. While traditional methods like group discussions are helpful, making the session fun or competitive can elicit even better feedback.
Take the following initiative by Finboot to test their documentation: a live "Escape Room" challenge. Here's how it unfolded:
The scenario: An evil adversary had set a challenging puzzle. The participants' mission? Solve it using only the company's documentation.
The gameplay: The entire company was divided into two teams. Their goal was to follow the documentation and send blockchain transactions in the correct sequence to solve the puzzle.
This wasn't just a one-off team-building event but an actionable feedback session.
Throughout the challenge, we gathered insights, spotted inconsistencies, and identified areas of ambiguity. Post-event, the commitment to refine the documentation, fueled by the shared experience, was stronger than ever.
Asking for feedback through surveys
Surveys offer a broad view of user sentiment. While some users might not have recently interacted with the documentation, they can provide an overview of their overall perception.
For this reason, I prioritize generic questions such as:
How helpful do you find our documentation (1-10)?
Did you notice any missing topics or details that should be included?
How can we improve our docs?
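Once responses come in, a few lines of code are enough to summarize them. This is a minimal sketch assuming a simple list-of-dicts response format; the field names and sample answers are invented for illustration:

```python
# Sketch: summarize survey responses. The response shape
# (helpfulness score plus two free-text answers) is an assumption.
from statistics import mean

responses = [
    {"helpfulness": 8, "missing": "", "improve": "More examples"},
    {"helpfulness": 6, "missing": "API error codes", "improve": ""},
    {"helpfulness": 9, "missing": "", "improve": "Shorter intros"},
]

# Average score gives the broad sentiment; free-text answers give direction.
avg_score = mean(r["helpfulness"] for r in responses)
gaps = [r["missing"] for r in responses if r["missing"]]
suggestions = [r["improve"] for r in responses if r["improve"]]

print(f"Average helpfulness: {avg_score:.1f}/10")
print("Reported gaps:", gaps)
print("Improvement ideas:", suggestions)
```

The numeric score is useful for tracking sentiment over time, but in my experience the free-text answers are where the actionable items live.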
Monitoring communication channels
Feedback isn't always direct. I regularly monitor communication channels, issue trackers, and social media. You'd be surprised how often users discuss documentation or run into problems that could be fixed with better docs.
One might argue that indirect feedback is even more telling than its direct counterpart. It's raw, unfiltered, and frequently captures users' immediate reactions to their pain points.
Whenever I encounter issues where the documentation didn't meet expectations, I create a separate issue. This ensures we circle back and consider potential refinements to enhance our docs.
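A lightweight way to triage those channels is a keyword scan over exported messages. This sketch assumes a simple list-of-dicts export format and a hand-picked keyword list, both hypothetical:

```python
# Sketch: flag chat messages that likely concern documentation.
# The export format and keyword list are assumptions for illustration.
import re

DOC_KEYWORDS = re.compile(
    r"\b(docs?|documentation|readme|tutorial|guide)\b", re.IGNORECASE
)

def docs_related(messages: list[dict]) -> list[dict]:
    """Return the messages that mention documentation-related terms."""
    return [m for m in messages if DOC_KEYWORDS.search(m["text"])]

messages = [
    {"user": "alice", "text": "The setup guide skips the auth step."},
    {"user": "bob", "text": "Deploy finished, all green."},
    {"user": "carol", "text": "Where are the docs for the webhook API?"},
]

for m in docs_related(messages):
    print(f"{m['user']}: {m['text']}")
```

A scan like this won't catch everything (users rarely say "documentation" when they're stuck), so it complements rather than replaces reading the channels yourself.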
Technical documentation is always a work in progress. It's a continuous cycle of feedback, refinement, and adaptation. A holistic strategy that includes both direct and indirect feedback ensures that our documentation is constantly evolving, relevant, and user-focused.