Hello Fellow HR People:
I agree with Leo--this list of topics he has offered is excellent, and should be required reading for every potential and current Trainer.
I could only add one thing to his post. Leo, I hope you don't mind if I do that.
One thing we have done is "compact" the Training Evaluation to 2 pages--10 questions--and yet gather significantly more information. How? By asking questions that are rated numerically.
I went back to my old undergrad days when we studied test instrument effectiveness, and came up with a plan to use "tried and true" assessment techniques to shorten the assessment and feedback process.
Rating scales from 1 to 5, with 5 being "best" and 1 being "least," put the assessment on a numerical footing and offer a true measure of the trainer's skills. Averages give the trainer a self-performance assessment across the board. I sometimes jokingly "bet" myself on the scores I'll achieve before the results are in, as a way of assessing my own performance.
Here's the gist of the function.
Our objective when we approach an opportunity for training is to be able to communicate the concepts, theories, or applications we need to address-- in a manner positively reinforcing the learning process, right?
With assessments targeted across a numerical range, we can tell whether we communicated what was expected, and the degree to which we communicated it competently. We can also gauge the learner's sense of whether the trainer understood his/her material and was able to communicate it; we can question "style"; and we can question whether the "answers" given addressed the questions asked--or not asked. Many times, the questions not asked are more important than those asked--especially in "soft skills."
Also, let's face it.
The more training you do, the more you realize that as a trainer--you're not always at the top of your game when you do a seminar or a training exercise.
Although we all want to feel we're at 100% when we get up in front of the 1, 10, 25, or even 500 people to whom we're presenting--it's a fallacy to believe that we're always "firing on all cylinders."
I know I'm not, and I'm old enough to realize it, and admit it.
Numerical assessments give us immediate feedback in a quantitative model.
If I know that I presented this material to a similar skills group seventeen times over a period of two years, and have an average Assessment rated at 4.59, and today, my Assessment was only a 4.10, I have an issue that I need to recognize and address.
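That baseline comparison is simple enough to automate. Here's a minimal sketch--the rating figures and the 0.25 flag threshold are illustrative assumptions, not part of our actual form:

```python
# Sketch: compare today's average rating with the historical baseline
# for the same material. Figures and threshold are illustrative only.

def session_average(ratings):
    """Mean of the 1-5 ratings collected from one session."""
    return sum(ratings) / len(ratings)

def needs_attention(historical_avg, today_avg, tolerance=0.25):
    """Flag a session whose average falls notably below the baseline."""
    return today_avg < historical_avg - tolerance

historical_avg = 4.59  # e.g., average across seventeen prior sessions
today = session_average([4, 4, 5, 4, 3, 4, 5, 4])  # one hypothetical session
print(f"today: {today:.2f}, flag: {needs_attention(historical_avg, today)}")
```

The tolerance keeps ordinary session-to-session noise from raising a flag; only a drop you'd actually want to recognize and address gets surfaced.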
The more assessments you have your trainees/seminar clients perform, the more valuable they become as objective measurements of performance.
We have a total of 10 questions in our Assessment and all are critical assessment points. To me, the two most critical components are Question #8--"Will you be able to apply the lessons you've learned to real life?" and Question #9--"Do you feel the time you've invested will pay real rewards?" Both questions are rated on a scale of 1-5.
If the ratings on these two questions are low, all else pales in comparison.
You can be a great presenter, and demonstrate a wonderful vocabulary, tell interesting stories, have the newest PowerPoint templates, excite your audience with your experience and qualifications to present--but if they don't walk away with a positive learning experience that is applicable to their lives--somebody's money and time has been wasted.
Just my two cents' worth. If anyone is interested in our Assessment Form, we'll be happy to post it.
Alan Guinn, Managing Director
The Guinn Consultancy Group, Inc.