We’re a month into our trial of assessing candidate CVs using a points system, so it’s a good moment to review its progress and look at how Technical Author candidates are assessed in general.
On our scoring system, the scores for the Technical Author CVs we’ve received recently have mostly fallen within 10% of each other, with a few outliers outside that range.
So far, no candidate has asked to know their score, which is perhaps to be expected from people who typically have “IN” Myers-Briggs personality types. We decided not to email candidates their score “on spec”, in case they felt we were being condescending, insulting or unfair, and took offence.
One key benefit has been that it has helped clients recruiting a Technical Author to better assess the relative merits of different candidates.
In short, the scoring system seems to be useful.
It can be difficult for organisations recruiting a Technical Author to assess the candidates. For example, past examples of a candidate’s work may not be available, due to confidentiality issues. Also, many of the skills involved, such as project planning and project management, can be hard to judge from a CV or at an interview. This can lead organisations to use a writing test to assess candidates. Technical Authors can resent such tests: you don’t ask an accountant to add up numbers or a nurse to do physiotherapy at an interview, so why ask an author to write? However, in the absence of other ways to measure a Technical Author candidate’s capabilities, it’s one of the few measures recruiters can use.
We plan to develop additional ways to help organisations select the best candidate for their vacancy.
How do you assess Technical Authors applying for positions at your organisation? Please use the comment box below.
I always score the CV as well. I think it is a two-part process, though: a score for skills and experience, and a score for the presentation of those skills. I would only interview candidates who showed that they could present themselves well. It is, after all, what we’re supposed to be good at: presenting information in an easily digestible format. The ability to use Word is also a major part of the scoring.
Additionally, I always set a writing task. You would be surprised at the rubbish I have received from so-called technical authors.
First, I look for achievements, rather than role descriptions. For example, “Responsible for the XYZ Documentation” is not as good as “Increased the usability of the XYZ Documentation over x release cycles by …”
Second, I look for an understanding that the job of a technical writer is to assist users in achieving their goals.
Third, I look for technical knowledge of the things they say they’ve written about. I don’t expect general technical knowledge, but I would expect them to be able to talk about their last two or three projects.
I might give junior authors a writing test. I have, in the past, given candidates a paragraph or two of text and asked them to re-organise it and/or chunk it under logical headings.
I recommend that writers prepare a few short (3-8 page) genericised excerpts (with company identification removed) from their previous projects that illustrate different skills, such as a procedure, something containing a flowchart, a concept description, or part of an installation or quick-start instruction, which they can submit as emailable writing samples if such samples are requested. After being unable to schedule an interview with my current employer, I submitted such samples and was hired without a face-to-face interview.
As a technical author, I hate writing tests. Not only are they demeaning, they rarely come close to mimicking the kind of writing that would actually be required in the job. I rely on my portfolio instead. If a prospective employer still insists on a test, I probably wouldn’t want to work for them anyway.