Benchmarking and Customer Satisfaction

If part of the purpose of your job is to spread the use of information and communications technology, it's a good idea to start collecting statistics in order to benchmark your performance.

This article looks at a fairly simple approach to benchmarking which does not take long to implement, but which can be extremely useful.

It is true that you could content yourself with collecting statistics on how many people are using the educational technology facilities, but I regard that as necessary but not sufficient. For a start, it tells you nothing about the quality of what people are doing, and it is more than likely that if you start to insist on high standards of work, or even merely that colleagues do not use the computer facilities as a fall-back when they don't have a lesson planned, you will start to see a fall in the amount of usage -- at least in the short term.

Furthermore, there is little you can do about increasing the usage until you know why people use or don't use the facilities. Hence, some deeper probing is required.

A very good "way in" is the customer satisfaction survey. If your school or organisation has a history of poor performance and bad experiences in this area, you may feel that carrying out customer surveys would lay you wide open to criticism, and would therefore be the last thing you'd want to do. In fact, in those circumstances finding out what people like and dislike about the service on offer is even more essential.

There is another dimension to this as well. In general, although people are often happy to criticise someone or something when they are part of a crowd and anonymous, they are usually much more considered when asked to do so in writing, with their name attached. In one of my jobs, the IT service was constantly being criticised by Headteachers: not directly to me, but to my boss. As well as being upsetting for me, it was also upsetting for my team, who tried to do a good job and, judging from the feedback they received whilst in school, thought that they were doing one. Once I'd implemented the customer survey regime, my boss and I had a couple of conversations along the following lines before the unwarranted criticisms stopped altogether:

Boss: At the meeting today, the Headteachers were complaining that your team take ages to respond to a call for assistance, and never complete the work properly.

Me: That's strange, because according to the customer satisfaction records we've been keeping, 95% of the schools rated our service as excellent, and the rest rated it as very good. Was there anyone in particular who was leading the complaints?

Boss: Yes, Fred Bloggs.

Me: Hmm, that's a bit odd. Looking at his last completed customer survey sheet, he said "An excellent service. The technician was really helpful and fixed the problem with no interruption to the school's computer network at all." Would you like a copy?

Now, there was no intention on my part to stifle criticism. However, I think that if you are going to criticise someone, especially when people's jobs are potentially at stake, you need to be very specific about what was wrong. The trouble with educational technology is that people have come to expect the same level of service as they enjoy from the electricity board. And so they should, but they do not always understand the wider forces at work. Thus it was that when an internet worm knocked out computer systems all over the world, my team got the blame! When things like that happened, the Headteachers would complain in their meetings with the boss that the IT service was useless, not realising what the real causes were. Given that on no occasion, as far as I know, did any of them contact him out of the blue to say "The IT service is fantastic today!", the impression the boss had was that we were not doing our jobs properly. The implementation of the customer survey approach counteracted that by being very specific, and by providing hard evidence of how Headteachers found the service in general over the long term, as opposed to how they felt immediately after the most recent virus alert.

OK, so how do you conduct a customer survey? I would suggest that you ask people to complete a very simple form, and sign and date it. Then transfer the details to a spreadsheet; this won't take long once the spreadsheet has been set up in the first place. You will then be able to generate useful statistics.

The questions themselves will differ according to the nature of the service you are running, of course, but if you are an ICT Co-ordinator (Technology Co-ordinator) I would suggest the following items be put on the form:

  1. Name of teacher

  2. Class

  3. Date

  4. Subject

  5. Was the room tidy when you entered it?

  6. How did you find the speed of the system? Fine/too slow

  7. How easy was it to achieve what you set out to achieve? Very easy/very hard

  8. Please add a brief explanatory note, especially if it was very hard.

  9. Any suggestions as to how the facilities or service could be improved?

As you can see, a very simple form, which not only helps you to obtain some information in a consistent manner, but also indicates pretty clearly what your own concerns are -- the room being left tidy, for example.

I'd strongly suggest you assign numerical values to the responses (e.g. 1 = Very good) and use a spreadsheet to collate and analyse them, because that makes it easier to calculate averages where necessary.
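If you would rather automate the collation than do it all by hand in the spreadsheet, here is a minimal sketch of the same idea in Python: the text responses are coded numerically and then averaged. The file name, the column heading and the scoring scale below are all assumptions -- substitute whatever your own form and export actually use.

```python
import csv
from statistics import mean

# Hypothetical numerical coding for the responses to question 7,
# with 1 as the best outcome. Adjust to match your own scale.
SCORES = {
    "very easy": 1,
    "quite easy": 2,
    "quite hard": 3,
    "very hard": 4,
}

def average_score(filename, column):
    """Average the numerically coded responses found in one column of the survey file."""
    scores = []
    with open(filename, newline="") as f:
        for row in csv.DictReader(f):
            response = row.get(column, "").strip().lower()
            if response in SCORES:
                scores.append(SCORES[response])
    return mean(scores) if scores else None

if __name__ == "__main__":
    # Assumes the completed forms have been typed up (or exported) as
    # survey_responses.csv, with one column per question on the form.
    result = average_score(
        "survey_responses.csv",
        "How easy was it to achieve what you set out to achieve?",
    )
    print("Average ease-of-use score:", result)
```

Run something like this over a half term's worth of responses and the average gives you a single figure you can track from one term to the next, which is exactly the kind of benchmark this article is about.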

Run this for half a term, and see if you can spot a pattern emerging. If so, it will help you to prioritise future developments.

How helpful did you find this article? Please leave a comment. If you like the customer focus approach, you will probably find this article interesting too, and this one on the Framework for ICT Support.

An earlier version of this article was published on 16th September 2008.