We’re back from our Christmas pod break with a new episode. We explore the challenges of receiving and responding to criticism as a technical writer. Documentation plays a crucial role in user experience, and receiving feedback, whether constructive or harsh, can be an opportunity for growth. We discuss practical strategies for handling feedback, evaluating its validity, and implementing improvements to enhance documentation quality.
Key topics covered
- Why receiving feedback (even negative) is better than receiving none
- How to separate personal feelings from professional criticism
- The importance of acknowledging user feedback and addressing concerns
- Types of criticism: constructive vs. unconstructive
- Methods for evaluating the validity of feedback
- Tools and techniques to measure documentation quality (e.g., IBM Quality Matrix, analytics, usability testing)
- Addressing common documentation challenges: clarity, findability, audience mismatch, and linking
- Steps for implementing improvements and tracking their impact
- Preventative measures for reducing future criticism
Key points
- Criticism is not personal – it’s about improving the documentation, not attacking the writer.
- Acknowledging feedback is crucial to building trust and ensuring continuous improvement.
- Evaluating feedback critically helps differentiate between valid concerns and personal preferences.
- Quality measurement techniques (analytics, support ticket trends, usability testing) can validate feedback.
- Structured improvements through linking, clearer writing, audience targeting, and prioritization can make a big impact.
- Continuous monitoring is necessary to ensure long-term effectiveness.
Mentioned resources
- IBM Quality Matrix for documentation assessment
- “Every Page is Page One” by Mark Baker
Want help improving your documentation?
Cherryleaf specialises in fixing developer portals and technical documentation. If you’re struggling with user feedback, contact us for expert guidance.
Transcript
Hello, and welcome to the Cherryleaf Podcast. This is our first podcast in 2025. In this episode, we’re going to look at the topic of dealing with criticism.
When you’re a technical writer or technical author, there will be times when you develop documentation and then receive feedback saying that it isn’t good – that there are problems with it.
If you speak to most technical authors, they will tell you that this is something they have experienced, and it can be difficult. You put your heart and soul into creating clear, concise, and helpful documentation, only to receive feedback stating that the information is unclear, confusing, hard to find, or even useless.
So, how do you deal with these situations? What do you do when you receive this type of feedback?
First, any feedback – good or bad – should not be taken personally. The criticism is directed at the documentation, not the individual. Within the Government Digital Service, when they review content, one philosophy they follow is that everyone did the best they could with the time, resources, and information available at that time. That may well be true in your situation as well.
It’s important to respond to critical feedback by acknowledging the concerns and appreciating the time taken to provide it. You need to recognise the user’s experience and thank them for their input.
From there, your response can vary. You can acknowledge their input and state that it will be used to improve the documentation, or you can work to specifically address the problems they have highlighted for a quicker and more immediate remedy.
Let’s start with the positive: feedback is useful. In many ways, it’s better to receive feedback – good or bad – than to receive none at all. Without feedback, you won’t know if a problem exists, and if a problem does exist, you won’t be able to solve it. Feedback is an essential part of creating effective documentation.
You need to carefully listen to or read the feedback and try to understand the user’s perspective. Acknowledge the feedback, thank the user for their time, and recognise their frustration.
A good place to start is to evaluate the validity of the criticism. There are different types of criticism – some constructive and some unconstructive. Examine the feedback: is it specific and actionable? Does the user provide concrete examples of where the documentation falls short? For example, do they point out missing or incorrect steps?
Or is the feedback vague and emotional, such as simply saying, “It’s useless,” without any direction on how to fix it? Is the feedback a personal opinion, and is that opinion representative of the majority of users?
Sometimes, criticism revolves around language choices – such as whether contractions should be used, whether simple phrases should be preferred, or whether buzzwords should be included or avoided. Another consideration is whether the criticism pertains to the scope of the documentation. The user might say, “The documentation doesn’t teach me how to solve a problem,” when that was never the intended purpose of the documentation.
Does the feedback align with analytical data? Can you examine support call records to determine if others are experiencing similar issues? Can you assess whether your documentation is effectively reducing support requests?
Can you reproduce the user’s problem? Can you follow their steps and experience the same difficulties? Can you identify a root cause? Is the issue a lack of clarity, missing information, poor organization, or something else? Are there patterns in the feedback? If multiple users report the same issue, it likely needs urgent attention.
One common problem is documentation quality. There are different measures to evaluate this, such as the IBM Quality Matrix, which provides a checklist for assessing documentation based on different criteria. You can use a spreadsheet with a red-amber-green traffic light system to prioritise issues. Quality assessment criteria include:
- Clarity and conciseness: Is the documentation overly verbose or complex?
- Completeness: Are important tasks or features missing?
- Tone and style: Is the tone appropriate – neither too technical and intimidating nor overly friendly and casual?
- Accuracy and currency: Does the documentation reflect the current version of the product?
- Findability: Is the information well-organised, indexed, and easy to locate?
Another way to validate feedback is by gathering data. This could include analytics on documentation usage, support tickets, and common user issues. Some websites feature feedback mechanisms, such as thumbs-up/thumbs-down icons or forms for users to provide comments. Usability testing tools can track mouse movements and interactions to identify pain points.
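By way of illustration, here is a minimal Python sketch of how a thumbs-up/thumbs-down export from a docs site might be tallied per page, to check whether an individual complaint shows up in the wider data. The file name, column names, and threshold are assumptions made for the example, not part of any particular analytics tool.

```python
import csv
from collections import defaultdict

# Hypothetical export of page-level feedback: one row per rating,
# with columns "page" and "rating" ("up" or "down").
FEEDBACK_CSV = "docs_feedback_export.csv"  # assumed file name
NEGATIVE_THRESHOLD = 0.4  # assumed cut-off: flag pages where 40%+ of ratings are "down"

votes = defaultdict(lambda: {"up": 0, "down": 0})

with open(FEEDBACK_CSV, newline="") as f:
    for row in csv.DictReader(f):
        rating = row["rating"].strip().lower()
        if rating in ("up", "down"):
            votes[row["page"]][rating] += 1

# Flag pages where the share of negative ratings suggests a wider problem,
# rather than a single unhappy reader.
for page, counts in sorted(votes.items()):
    total = counts["up"] + counts["down"]
    if total == 0:
        continue
    negative_share = counts["down"] / total
    if negative_share >= NEGATIVE_THRESHOLD:
        print(f"{page}: {counts['down']}/{total} negative ({negative_share:.0%})")
```

The point of a script like this is simply to separate one person’s frustration from a pattern: pages that keep attracting negative ratings are the ones worth investigating first.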
You should also assess the intended audience. Is there a mismatch between the documentation’s intended audience and those providing feedback? A common issue is content that doesn’t match the skill level of its users. If you’re writing developer documentation, for example, product managers might review it to decide whether to invest in an API. In such cases, the documentation needs to guide them on the API’s benefits and importance.
Related to this is the “Every Page is Page One” concept, from a book by Mark Baker. Many users arrive at documentation via search engines, landing on a specific page rather than starting at the beginning. If that page lacks context or assumes prior knowledge, new users may struggle. A solution is to provide internal links to related topics, as seen on Wikipedia.
One challenge for technical writers is information silos. Documentation, training materials, support articles, and developer guides may be created by different teams using separate platforms. Consequently, search functionality may be limited to specific repositories. This fragmentation can lead to user complaints that crucial information is missing when, in reality, it exists elsewhere.
To address this, technical writers should maintain an index of available content, including descriptions of what each resource covers, its intended audience, and its purpose. By linking relevant materials, you can guide users to helpful content beyond your immediate documentation.
Once you identify an issue, the next step is to fix it. Solutions include:
- Writing new content
- Improving existing content
- Adding internal links
- Breaking complex topics into smaller steps
- Incorporating more visuals
- Converting passive voice into active voice
You may end up with a long list of required improvements. Prioritisation is key. Using a spreadsheet with a red-amber-green system can help focus on the most critical issues first.
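As a quick sketch of that prioritisation idea, the red-amber-green ratings can live in a few lines of Python just as easily as in a spreadsheet; the issues and ratings below are invented purely to show the sorting.

```python
# Each documentation issue gets a red-amber-green (RAG) rating;
# sorting by that rating puts the most critical fixes first.
RAG_ORDER = {"red": 0, "amber": 1, "green": 2}

issues = [
    {"issue": "Installation steps missing for Linux", "rag": "red"},
    {"issue": "Passive voice throughout the tutorial", "rag": "green"},
    {"issue": "API reference not linked from the quick-start guide", "rag": "amber"},
]

for item in sorted(issues, key=lambda i: RAG_ORDER[i["rag"]]):
    print(f"[{item['rag'].upper():5}] {item['issue']}")
```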
Since you likely can’t fix everything at once, set realistic deadlines for each task. A structured plan also allows you to communicate updates to users who provided feedback, informing them that their concerns are being addressed and when they can expect changes.
After implementing changes, track their impact. Did the problem resolve, or are users still encountering the same issue?
Finally, consider ways to prevent similar issues in the future. Depending on the problem, solutions might include:
- Researching your target audience to ensure content is tailored appropriately
- Establishing style guides and templates to standardise writing
- Enhancing findability through better information architecture
- Regularly reviewing analytics and usability data
To summarise: criticism isn’t a personal attack; it’s an opportunity to improve documentation. Common issues often involve navigation, clarity, and audience alignment. Having a structured plan prevents feeling overwhelmed by necessary changes. Many fixes are relatively simple yet significantly improve user experience.
At Cherryleaf, we help improve developer portals and technical documentation. If you need assistance, feel free to contact us at info at Cherryleaf.com.