by Peter Holtmann
How many times have we looked at flat-pack furniture and thought, “Who dreamed this up?” or looked at children’s toys in their box with the instructions “some assembly required” and wondered if the pieces were ever meant to go together?
I am sure we also have our favorite stories around the collapsible beach shelters from hell that would never really collapse, the new unprogrammable-programmable recorder, and the like.
This train of thought led me to consider all things technical in our industry of training, examining, assessing, and certifying personnel. How do we ensure that our best efforts to impart the knowledge and wisdom of our industry into five days of training, give or take, actually do work?
At what point do we recognize that training won’t deliver the result and that we should perhaps be assessing for inherent capabilities in the person? After all, there are adages from across the eons quipping that leaders are born, not made, or that one has greatness thrust upon one rather than thrusting oneself upon it. So when we do try to train for leadership qualities in people, are we really training at all, or just imparting an opinion or self-belief?
Who is to say that one person’s success model is right for another? I have read scores of leadership, vision, success, and business excellence books, and they all seem to me to be the opus of the “achieved.” How do we take that knowledge and implement it in our own life?
Well, there are many books and philosophies around this area of thought. I won’t drag you through them all, but I will summarize to say that in our industry as complexity increases so, too, does the need to examine for successful completion of tasks versus teaching how to do it.
Many of these complex roles impose training regimes upon the task, but what I notice is that they provide fundamentals of thought followed by a script of work process. This isn’t really teaching, but rather a regurgitation of process followed by a lengthy assessment of skill.
When we move into the areas of communication, leadership, interpersonal relations, logic, and critical thinking, we are moving into the realm of something far harder to teach, let alone test. In my opinion, we are moving more into the need to assess or quantify just how much of the particular attribute a person possesses.
So when we apply these thoughts to the audit profession, just what parameters are we teaching versus assessing?
Let’s take your average, barn-raised international standard, say ISO 19011, and look at the content of the standard.
Much of the standard’s content describes the acts of preparing and delivering the audit process. Good, yes, it can be taught and tested for understanding of theory and its application.
But what about the sections on leading audit teams? Is this really a straightforward exercise of training and testing for demonstration of understanding, or are we now heading into a deeper assessment of attributes?
I would argue that we are assessing, and that delivering training on this topic is more about the coordination of resources and less about actually leading people, making decisions, and communicating outcomes or expectations.
So when we begin to look at the capabilities of professionals in the conformity assessment industry, when should we start to move away from conventional methods of training and testing and start moving into methods of assessment to determine their capabilities?
Under a qualification-based system you would look at their audit logs and work experiences. The logs would demonstrate that they have led audit teams, and their work experiences would show some form of supervisory, management, or even executive function in the workplace. But that isn’t an assessment; rather, it is recognition of a known quality that instills a level of confidence.
In a competency-based system, leadership capability would be assessed using some form of psychometric or attribute analysis, such as the Myers-Briggs assessment. But what this is really telling us is whether you do or do not fit a preconceived model of the square peg for the square hole.
For my money, I would look at a combination of both methods and combine it with an ongoing feedback system from the customer: Did they perform to the customer’s expectations? Would the customer have them return to the site in the future? Was the outcome a demonstration of an effective process?
If we apply these ideas to the flat-packed furniture model, not being able to decipher the instructions and complete the build does not make you incompetent or incapable of doing it in the future. It means you haven’t received the training to translate the instructions and convert them into skills, or it may mean that, on a purely attribute-based level, your cognitive reasoning qualities are not matched to the author’s.
Training, testing, and assessments all have their place, and they should be scrutinized before being applied to a task or outcome. When selecting training for professional development, it may be wise to look for a mix of these methods to get a well-rounded experience, one that delivers a more complete picture of you and your capabilities.
About the author
Peter Holtmann is president and CEO of Exemplar Global and has more than 10 years of experience in the service and manufacturing industries. He received his bachelor’s degree in chemistry from the University of Western Sydney in Australia and has worked in industrial chemicals, surface products, environmental testing, pharmaceutical, and nutritional products. Holtmann has served on various international committees for the National Food Processors Association in the United States and on the Safe Quality Foods auditor certification review board.