For an educator or trainer, the end of every module or workshop is accompanied by a course evaluation exercise. The aim is to ensure that teaching standards are high, that students’ needs are met, and that money is being spent wisely…
Yet I’ve often wondered whether the information in feedback forms is really relevant. A recent publication reveals just how inaccurate some feedback can be… Some obviously false statements were included in a standard evaluation form, and surprisingly large numbers of students (up to 69%) agreed that the false statements were true. For example, 28% said it was true that “the instructor never even attempted to answer any student questions related to the course”. If that were true, it would be an appalling teaching event. I suspect it’s more likely that students either failed to read the statement or were careless in how they answered.
As a trainer, I am really keen to hear how to improve my content or delivery. I know it’s impossible to please all the people all the time, so some feedback may be true only for the individual rather than for everyone, while other feedback may be more generally applicable. I’m keen to hear both kinds.
I also know that, as a course participant, I’m often rushed at the end of a workshop, and that it can be embarrassing to write potentially negative or critical feedback while the trainer is still in the room.
So what’s the way forward? Well, I guess the key is to have several different ways of evaluating content, delivery and impact. Personally, I’d also love to see some evidence of embedded knowledge and changes in behaviour after a workshop. Filing the handouts is not the best way to ensure you act on new knowledge!
So what will you do to show your participation has been worthwhile?