It’s that time again when we have to submit our quarterly reports within the region and then compile an uber-report to go up to national level.
Sadly this time a combination of choosing the wrong seat at a meeting and Anika's total failure to jump right up and volunteer (she is normally a joiner-in, unlike me) meant I am now part of the national report compilation team. Joy.
Of course I had completed my report; it consists of a single side of A4 with the words “fixed some computers, innit”, a smiley face and a crudely drawn phallus.
Other divisions and services though, what with actual busy-busy work and other critical stuff to do, haven’t completed theirs. So I now assume the role of nagmaster general, making myself even more unpopular in the office (and indeed who would have thought that was possible) by chasing up outstanding reports.
And then of course there is the other small problem.
In an attempt to build a streamlined, monitored and efficient management structure the Ministry has previously embarked on a process of implementing the dreaded balanced scorecard at all levels. For anyone not familiar with this it’s a management tool designed to help you identify important stuff, target it, plan to accomplish those targets and then monitor ongoing performance. Blah blah blah.
Correctly done, like a lot of tools, it can be a powerful way of identifying what is important to your service/business and provide easy evaluation.
Our implementation of it consisted of a very highly paid external consultant coming down for a week and basically playing buzzword bingo until my ears were bleeding. I won’t dwell on the minutiae but suffice to say a lot of what he presented as apparent “standard practice” with scorecards was news to me and didn’t make much sense (to me). He was probably right of course, as why else would we be paying him a small fortune?
Anyhow, the outcome of this was a number of these wonderful scorecards which in many cases didn’t really align with what we’re doing.
Some central Ministry objectives were stated and all of ours had to fit into theirs. Even if they didn’t, in which case they couldn’t be listed. Some could be got round with careful use of English but others had to fall by the wayside.
Then we were told that all that matters is your scorecard. All activities should be aligned entirely to the scorecard, and only those activities should be done.
If, as was also planned, the ongoing day-to-day management were to be based upon these scorecards, this would have been quite a scary outcome. Luckily, of course, no-one pays attention to them, except for quarterly reports.
For quarterly reports we’ve been told we must report using our scorecard. Simples.
Yet no. A large part of the very important stuff people (not me) have been doing doesn’t fit on the scorecard, and even if it does… national level have a template, some bastard offspring of their scorecard, which is what we must actually submit.
AND NONE OF THE OBJECTIVES/MEASURES MATCH.
So step one is to nag everyone to complete their scorecard. Step two, go through these in laborious detail trying to tie stuff up with the national one. Step three, identify a whole load of indicators and measures not yet reported by services. Step four, fight with each of these services over who should provide the data. Step five, chase for this data. Step six, compile the national submission.
In step seven you just hope national level pay great attention to it. I reckon there might well be yet another reporting structure up there which, at a guess, doesn’t fit the report we’ve submitted to them.
Joined-up thinking and ensuring the golden thread (both terms used by our consultant in entirely inappropriate and seemingly random contexts) it is not.
END OF RANT
Off now to continue the great compilation.