WHEN the competency training framework was rolled out in 1987, it met with strong resistance and criticism from training practitioners and educationalists.
For example, in 1996 Richard Bagnall wrote a paper called “Pluralising Continuing Education and Training in a Postmodern World: Whither Competence”, in which he concluded that “from a post-modern perspective it [competency based education] is primitive, simplistic and oppressive”.
Since then, the criticism has become much more muted, perhaps because the framework has changed in response to it, and perhaps because others have simply “learned to live with it”. Or is it that teachers and training providers have been bullied into acquiescence?
But the critique is still relevant. Many problems lurk in the netherworld of competency training. The single biggest is the complexity of the system, which makes it too slow to meet 21st century training needs (after all, the competency-based training system was conceived in a pre-digital age).
At every level we look, there is complexity.
Take the description of a competency. It requires listing, in systematic order, every step necessary to accomplish a task, whether that task is to ‘Conduct interviews’ or ‘Work with crocodiles’. These steps are called ‘Performance Criteria’. On the assessment ledger, a candidate who does not achieve every single performance criterion (20 in interviewing, 22 in the case of crocodiles) is deemed “Not Yet Competent” (in layman’s terms, a fail).
The implementation of the competency framework is understandably rife with misunderstandings and work-arounds.
To enforce the standards, training providers are subject to random external audits to ensure that they comply. These audits are complex, often contradictory, and often fail to take account of shifts in training and assessment practice, particularly where digital technologies are used. (See E-assessment Report)
One primary confusion is the relationship between learning and assessment.
It is not well understood that the competency framework is mainly about assessment. Industry devises national standards against which candidates can be assessed to determine whether they are competent.
Unfortunately, many trainers use the competency standards to structure their teaching materials or, worse, as a pedagogy. Consequently, the training can be very boring (going over ground the trainee already knows), lockstep and perfunctory, which stifles learner versatility, creativity and imagination.
Sure, the training has to cover the assessment requirements, but how a trainee gets there, the pathway they choose to take, should be irrelevant.
Assessment comes off second best in this regime. Assessment activities are often poorly devised, and barely meet the AQTF standards.
The trainer, whose focus is training rather than assessment, plods through material that perhaps half the class already knows. There is a reluctance to engage with personalised learning, which is dismissed as just too difficult. Very little account is taken of differences in learners’ aptitudes or of what they already bring to the training. Recognition of prior learning (RPL) is avoided or discouraged. Training conducted for decades is ‘mapped’ against the competencies, not altered to reflect new learning. No wonder so many students drop out!
A second more fundamental criticism is the slowness of the system to reflect changes in the real world.
Most Training Packages, as they are called, have a shelf life of about 5 years before they are systematically reviewed. The review process is long and detailed, with many “stakeholders” involved. The review has to catch up on the changes that have occurred in industry over the preceding five years, but it also needs to project forward at least 5 years to anticipate changes that may occur. So the review is really covering 7–10 years of industry change.
In many industries, the changes over this time period can be enormous. Many technologies have come and gone in that timespan.
Competency training was conceived in an essentially pre-digital era (i.e. prior to the Internet). Even though Training Packages have sought to bring computer skills within their ambit, the real issue is the rate of transfer of skills and knowledge that the digital era has enabled. There is a serious disconnect between the competency training framework and the open digital world of networks, knowledge sharing, mentoring and personalised learning.