The Do’s and Don’ts: Getting 360 Feedback Right
We are biased, but we love 360 feedback, and over the years we have seen many successful implementations of these tools lead to positive individual eureka moments and organisational culture change. Having had many conversations about 360 feedback, we’ve noticed a consistent theme among those who are resistant to using it: they have all had, or know of, a bad experience with this type of tool, and when you dig deeper this is usually down to the way in which the tool was implemented. Needless to say, in the wrong hands 360 can be divisive, unhelpful or, at worst, harmful, and a complete waste of time. Here are some thoughts on getting it right and what to consider when planning to use 360 Feedback.
Don’t...
· Give the reports directly to the participants without support
We have seen grown men and women in tears when receiving their feedback, because in their eyes it is tough; we have also seen jubilation and amazement. Bearing this in mind, support is key when utilising the data. Ideally, an experienced coach would help unpick the data, provide an objective view and ask insightful questions. We understand that not all budgets allow for this, especially when 360 Feedback is being rolled out on a large scale. At the very least, the participant’s (well-briefed) manager should walk them through the feedback, helping them to remain detached from the emotion and the specifics of any judgments, and to balance their view between their strengths and areas for development.
· Try to measure everything / make the questionnaire too long
You may be tempted to measure every behaviour at every level of your business. That is fine if the data is essential and all of it will be useful in the debrief, but a note of caution: the more questions you have, the longer the questionnaire will take to complete. Because of the nature of 360 feedback, the time investment compounds, as for every candidate or delegate going through the process there could be upwards of 10 people responding to their request. For example, a 40-question survey sent to 10 reviewers for each of 20 participants adds up to 8,000 individual responses. The other challenge is attitude: if reviewers see that there are a lot of questions, you may find the quality of responses drops.
· Just focus on the weaknesses
For some this may sound fluffy, and others may say, ‘What, I don’t have a blind spot?’, but the truth is that human nature often draws us to the negative: ‘What do I need to fix?’ In our experience there are always strengths that can be identified from the 360 data, and the debrief should be balanced, helping the delegate to understand the things they do well and the things that reviewers value.
Do...
· Brief the process up front, and be transparent about how the data will be used and the level of confidentiality
One of the biggest fears about these tools, for both participants and reviewers, is what the data will be used for and how it will be shown in the reports. For example, you could say to participants, ‘This process will only be used to support your development. You will have a conversation with a coach to discuss the results,’ and to reviewers, ‘Your data is completely confidential; any information you put into this questionnaire will be combined with other feedback and no comments will be attributed to you.’ Being up front about these points will ease any anxiety and ensure that the feedback is more genuine.
· Measure the most important competencies and behaviours for your organisation
Keep it simple and only measure behaviours that colleagues and direct reports can observe. To avoid survey fatigue, keep it short; people will be much more likely to contribute quality feedback if they won’t need to book a week off to get through all the requests.
· Keep reporting simple
Where 360 Feedback is really tested is in how easy it is to understand, interpret and use the data that has been collected. If those taking part in the process can’t understand the information they are presented with, it gives them more reasons not to take action. Inaction, or a lack of response, can have a massive effect on the culture of feedback (but that is probably a blog for another day).
· Follow up
Feedback should not be given in isolation; for it to be transformative, it needs to be reinforced. Keilty & Goldsmith’s research into the impact of training over time suggested that the most effective interventions followed up on the feedback, so check in on progress: how has the 360 participant put the feedback into action 3, 6 and 9 months after the initial review? Where possible, re-measure the 360 after an appropriate interval, probably no more frequently than every 12 months. Re-measuring provides good insight into how effectively the changes have been embedded, and gives another chance to review the strengths that are still valued and whether there are any further areas for development.
These do’s and don’ts are basic, but essential to guiding your thinking. We recognise there is a lot of grey ‘magic’ in any process that helps you collate opinions and draw actionable conclusions from them. The list could be much longer; in fact, you could write a blog with 50 do’s and 50 don’ts given the time and inclination, but following just these few will stand you in good stead to deliver a positive and useful 360 feedback experience. Good luck.