Of course in that article, we’re talking more about ad-hoc feedback, and the importance of giving it throughout the year. A 360 review is a much more structured and formal way of giving feedback; it complements ad-hoc feedback, and can provide a framework for working with a staff member to develop them through mentoring.
Our jobs involve a lot of elements. What am I doing well? What could I do better? We can worry about these questions, but without some form of feedback, we’re just guessing. Maybe in some areas we’re too harsh on ourselves, whilst in others we think we’re doing superbly, when really we’re falling into the trap of overestimating our competence. Without feedback, we’ll never know – and worse still, we may be fretting about the wrong things.
It’s easy to see the process of feedback through review or appraisal as a negative, soul-destroying piece of HR. And some companies do indeed wield it as such. But the process really needs to be a constructive one, focused on putting time and energy into helping people develop.
During the performance appraisals I’ve done, I’ve often talked with members of staff about how we enter our careers in our early 20s and exit in our late 60s. We’re not going to just “develop a level of skill” and stay static. Every year will involve some level of growth and learning, whether in a technology area, a people-skills area, or simply the project area we’re working on. We’re never going to be able to just “coast” on what we know now from year to year.
Collecting viewpoints for a 360 review
A 360 review involves taking feedback on a staff member from a number of peers within our wider team (yes – not just testers). The form asks questions about each peer’s view of the staff member’s competency across a number of areas.
I’ve tailored our 360 form to touch on a series of diverse competencies which, knitted together, form the “tester skillset”. It’s tough – the questions have to be broad enough to cover a diverse range of topics (it’s called a 360 because it allows you to see the whole picture, after all), but not so many that people get bored and don’t want to participate in giving feedback.
I try to set the tone for the feedback with the following upfront statement,
Thank you for your commitment to provide feedback for this software tester. The process of feedback allows team members to find out areas in which they’re doing well, together with identifying areas in which they can develop and set goals for the coming year.
In providing this feedback, you’re helping to invest and develop another’s career. Although comments made here will be treated anonymously, we strongly recommend you attempt to give anything important directly to the person being reviewed – whether that feedback is around areas of success or suggestions for growth.
I think the core questions for asking for feedback on a tester cover these areas,
- What is their work ethic and drive to succeed like?
- How well do they communicate?
- How do they give feedback (including defects)?
- How good are they at problem identification and taking the initiative?
- What is their technical ability like?
- What is their understanding of the core business like?
- What areas of growth do you see for this person?
I only expect one answer to each core question, but I provide some sub-questions to stimulate a response from the person giving feedback.
What is their work ethic and drive to succeed like?
Areas to think about,
- How do they approach their work?
- Would you describe them as a professional?
- Do they have passion to succeed?
- Do they seem to enjoy their work?
- Do they champion the company values?
How well do they communicate?
Areas to think about,
- Do they express their viewpoint well?
- Do you feel your position is respected?
- Do they form positive relationships in our team and with the customer?
- Do you feel you’re working in the same team as this tester, or against them?
How do they give feedback (including defects)?
As testers, the main way we add value is the feedback we give on software, often in the form of defects. Being able to give effective and appropriate feedback, whether on software or a document, is thus a key tester skill.

Areas to think about,
- How does this tester approach you with problems?
- Do you feel comfortable asking this tester for feedback on yourself or a piece of your work?
- Do you feel they raise the right level of problems? Too many? Do they miss items?
How good are they at problem identification and taking the initiative?
Areas to think about,
- When given a new piece of work, or encountering a defect, have you observed this tester’s approach to identifying and solving problems?
- What could be improved in their approach? What has worked well?
- When a serious problem has occurred, have you witnessed this tester using their initiative to make decisions and attempt to get resolutions? How effective were they?
What is their technical ability like?
Areas to think about,
- Are they competent in any tools they need to use?
- What are the areas of technical strength for the person?
- Have they displayed having an understanding of how applications are intended to work?
- Have they shown an understanding of the suite of applications we develop in our group?
- Are they able to work beyond just requirements (when required) to understand how the application should work?
- Are you aware of any areas of opportunity in which the tester can further develop their technical ability?
What is their understanding of the core business like?
As an IT business, we’re not just producing software, but business solutions. How well does this tester demonstrate an understanding of the important values and factors of this business?

Areas to think about,
- How has this tester shown understanding of the end users and customers of the system under test?
- Does this tester show understanding of the business solution we’re trying to achieve in our software?
- How has this tester shown understanding of the target market we’re developing software for?
What areas of growth do you see for this person?
Areas to think about,
- What should this person do to help improve what they contribute to their team?
- As you have observed them, how does their ambition compare with their abilities at present?
- What things would help their growth in their current role?
- What areas of challenge should they seek more of that are out of their current comfort zone?
Of course, no tester is really expected to excel at all of this – some will be better at business understanding than technical understanding, or vice versa. This isn’t really for measuring a tester; it’s for showing them where they’re doing well, and where they might want to consider setting some goals – either to address things they’re not happy about, or to extend themselves in an area they enjoy and “take it to the next level”.
I collate all the comments, try to keep them as anonymous as I can, and add my own commentary where I think there’s something additional that needs to be said (which I make clear as being from me). It’s important to mention here that everything needs to be done confidentially. Hence I keep all the comments in my private drive, and even book a meeting room to go through them in private on my laptop. People deserve to know these notes won’t be left on display for all to see if you get called away from your desk!
For most people, this is going to be really simple, but for a few, there may be issues highlighted through the feedback which need investigation, or some prior thought before giving it. If I feel out of my depth, I know I need to talk with someone in higher management or HR about how to handle things if I’m expecting issues.
Giving the results of the feedback to the individual
With most people, there is an unspoken elephant in the room that the 360 review feeds into a pay review. This makes people tense for obvious reasons.
I can’t speak for your pay scheme, but for myself, it makes sense to keep the 360 review separate from the performance appraisal. I let the individual know that the 360 review feeds into the performance appraisal, but it’s a part, not the whole thing. Ideally the two need to be spaced out, to allow for some reflection between them.
Giving the 360 review needs privacy and confidentiality (again, a meeting room, if possible far away from the team). I need the individual I’m doing the review with to feel relaxed, have an open mind, and not feel on the defensive. I book at least an hour: I’m not just going to go through the feedback, but also give them an opportunity to discuss how they feel.
Before I go through the appraisal, I reiterate the goals of the exercise: to give them feedback about their strengths, and to investigate areas they want to grow in, which feeds back into our mentoring sessions. This helps put them at ease beforehand. I then go through the responses an area at a time, discussing how they feel about the comments, and whether they think they’re fair. I give them time to explain anything they’re unhappy with. I try to be sympathetic if they feel something’s unfair, but offer details where I think they’re mistaken. It’s important that you don’t let them convince you the whole world is wrong. If they’re agitated, I give them time to calm down. If they won’t, I have to consider terminating the meeting (it will be really rare that it gets to this).
That said, some feedback will be “unfair”. Sometimes an individual will be seen as negligent when that’s untrue. The problem isn’t that the tester is negligent, but that people aren’t seeing a particular activity happening. For instance, I once had feedback that my testing was disorganised, and that I never recorded what I did. That was actually untrue. What was happening was that I wasn’t advertising enough where all that information was, or broadcasting enough when it got updated. I worked with a delivery manager to address that issue. It’s our job to try and determine what the truth of a situation is (wisdom of Solomon needed here). But again, this is why I need to do groundwork on anything that’s likely to be contentious.
This feedback will of course create a roadmap of things the individual wants to develop. Sometimes it’s a roadmap of mentoring and upskilling (which we’ve covered in previous articles). Sometimes it’s about making something they do more visible to the rest of the team (without over-advertising). Generally it’s good to set 3-5 realistic development goals for the year ahead (too many goals, and they’ll easily fall by the wayside) – goals that feed into the work environment ahead. I have talked about SMART goals in the past…
If someone has goals for a certain area, I make sure that they’re considered for any upcoming work that matches (but I try to avoid showing them undue favour). It’s important to try and play fair by their aspirations – being fair is incredibly important to me.
But if they have goals I just cannot match them to, it may be worth asking how important those goals are, and whether they want to transfer to another unit that can accommodate them (even if only on a temporary basis). No-one can keep hold of talent whose heart lies in work you can’t provide for them. If it’s that important to them, you shouldn’t promise what you can’t deliver. People tend to respect being dealt with frankly and having these cards put on the table.
The 360 review process seems scary – especially with the rare “what if someone takes this badly” scenario – but it’s worth it. Never lose focus that this feedback is about being constructive. You are not doing this to make someone feel bad about themselves, but to genuinely help them to develop themselves. If the review you have prepared doesn’t feel like that, go back to the people who have provided you with information, and get them to expand.
It helps if you have a regular relationship with this individual – at a past company I only used to see the manager who did my 360s about 3 times a year, so there was little relationship there. Also remember that if you’re giving information in the 360 which feels like a bolt-out-of-the-blue surprise, then something is fundamentally flawed with your department. Nothing should come as a huge surprise. Feedback should be happening throughout the year; the 360 should just be collecting it up, with perhaps the odd new item, but again nothing that’s going to shock.
360s are worth it because they give us a meaningful “big picture” of where our efforts are best applied in developing our testers to be the best they can be. The results feed back to the tester, and into our mentoring and one-on-one sessions to give them real goals.