3. Introduction to Competency Assessment System
The Toolkit follows quality management system (QMS) principles. When applied, the Toolkit will fit directly into an existing or a future QMS. The goal is to describe processes and to improve quality over time. Collection of evidence and documentation of results are as critical in competency assessment as they are for any QMS.
3.1 Assessment principles
Assessment principles state that assessments must be valid, reliable, flexible and fair.
Validity refers to the extent to which the interpretation and use of an assessment outcome can be supported by evidence. An assessment is valid if it integrates the required knowledge and skills with the practical application of a workplace task, and if the assessment outcome is fully supported by the evidence gathered.
Reliability refers to the level of consistency and accuracy of the assessment outcomes; that is, the extent to which the assessment provides similar outcomes for candidates with equal competence at different times or places, regardless of the assessor conducting the assessment. It also implies repeatability, in other words, that the candidate can demonstrate competence on more than one occasion, and in more than one context.
Flexibility refers to the opportunity for personnel to negotiate certain aspects of their assessment, such as its timing, with their assessor. All candidates should be fully informed (through the assessment plan) of the purpose of the assessment, the assessment criteria, the methods and tools used, and the context and timing of the assessment.
Fair assessment does not advantage or disadvantage particular candidates or groups of candidates. This may mean that assessment methods are adjusted for particular candidates (such as people with disabilities or from different cultural backgrounds) to ensure that the methods used do not disadvantage them. An assessment should not place unnecessary demands on candidates that might prevent them from demonstrating competence. For example, an assessment should not demand a higher level of English language or literacy than that required to meet the workplace standard outlined in the competencies being assessed. Nor should the assessment process prevent anyone from demonstrating competence, skills or knowledge simply because the assessment is designed differently from the work itself, placing them at a disadvantage.
In summary, the assessment processes used must be valid, reliable, flexible and fair.
3.2 Assessment process
Evidence is the information gathered which, when matched with the requirements of the competency, provides proof of competence. Evidence can take many forms and be gathered from a number of sources. It can be direct, indirect, or third-party supplementary. No single form of evidence is better than another. Indeed, applying direct, indirect and third-party supplementary evidence in combination can be the most effective (and fair) means of assessing an individual’s competence.
Assessment methods are the means of collecting the evidence required to demonstrate satisfactory performance. Whichever methods are used, the evidence collected must be valid, sufficient, current and authentic. The main examples are:
3.3 Assessment Tools and their Descriptions
Once the method has been selected, the means for collecting and analyzing the evidence are then chosen or designed. These means are called assessment tools. In general, the term assessment tool is used to describe a document that contains both the instrument and the instructions for gathering and interpreting evidence.
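To make the idea concrete, an assessment tool, its instrument and its instructions, together with the evidence it yields, can be thought of as a simple record. The following sketch (in Python, purely illustrative and not part of the Toolkit; all class and field names are hypothetical) shows one way an organization might capture that structure in its documentation system.

# Purely illustrative sketch; names are hypothetical and not prescribed by the Toolkit.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentTool:
    name: str          # e.g. a direct-observation checklist or a written quiz
    instrument: str    # the document used to gather the evidence
    instructions: str  # how to administer it and how to interpret the evidence

@dataclass
class EvidenceItem:
    tool_name: str     # which assessment tool produced this evidence
    kind: str          # "direct", "indirect" or "third-party supplementary"
    description: str   # what was observed, answered or submitted
    date: str          # when the evidence was gathered

@dataclass
class AssessmentRecord:
    candidate: str     # the AMP being assessed
    criterion: str     # the performance criterion addressed
    evidence: List[EvidenceItem] = field(default_factory=list)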
(i) Direct observation
Direct observation evaluates the individual performing a task in real time. It enables the assessment of the actual processes employed by individuals undertaking operational activities. Performance assessments examine the AMP's actual application of knowledge to solve problems. The assessment focuses on both the process and the outcome.
Direct observation can be done in a real job environment or in a different environment using simulators (see "Simulations" below). Video recordings can also be used for direct observation where appropriate; they have the advantages of causing minimal disturbance at the work bench and of allowing remote assessment. The assessor cannot assess the thinking process of the forecaster or observer through a video recording, but this can be remedied through follow-up interviews and questioning. Because direct observation is limited to whatever is occurring at the time, it is usually combined with other tools to cover parts of the job that occur less commonly.
Advantages:
Disadvantages:
(ii) Simulations
In a simulation, the forecaster or observer is given a real or hypothetical situation and asked to respond as if he or she were on the job. Simulations range from simple questions, such as experiential questions, case replay and demonstration, to mock briefings or full operational simulators.
Advantage:
Disadvantages:
(iii) Tests
Tests are a more traditional method of assessing knowledge. In some cases in the Toolkit, knowledge is used as a substitute for competence to make the assessment practical and cost-effective. Tests could take the form of interviews, written quizzes or self-evaluations. Many styles of questioning can be used, including multiple choice, short answers and more open-ended questions.
Advantages:
Disadvantage:
(iv) Experiential Questions
Experiential questions are like tests, but the question asked is of the form "What would you do if…?". Experiential questions can be posed in written quizzes or oral interviews. When questions are posed orally, it is very important that the answers and results are documented.
Advantage:
Disadvantage:
(v) Portfolio
A portfolio contains evidence of knowledge, ability or competence based on past experiences. Portfolio evidence can be powerful in demonstrating competency as it provides clear evidence of what an individual has done. Testimonials from employers, peers or customers, as well as self-assessments, can be included in a portfolio. A portfolio could even describe unsuccessful examples, along with the remedial work the AMP has done to address the deficiency.
Advantages:
Disadvantages:
3.4 Using multiple tools to build an assessment framework
Most competency assessments will be done using direct observational techniques. Of the tools available, direct observation is the most authentic and is used for most performance criteria which can be observed frequently. Other techniques, such as tests or experiential questions, will often be needed to provide evidence of competence for seasonal or rare events. A well-developed competency assessment system will use multiple tools and methods to demonstrate that personnel meet the required competencies.
3.5 Competency Assessment Matrix
Once the assessment tools have been established, it is good practice to map them back to the performance criteria. Table 1 shows a competency assessment matrix for aeronautical meteorological observers (AMO), while Table 2 is a matrix for aeronautical meteorological forecasters (AMF). The specific performance criteria are listed in the left column, while each of the other columns represents one tool or type of tool. Ideally, each performance criterion should be assessed with at least two tools to demonstrate that the individual can apply the skills and knowledge in a variety of contexts. More than three assessment tasks for one performance criterion may be considered redundant. In a small organization, using two assessment tools might be difficult unless one of them requires little additional effort, such as existing or extended supervisors' reports.
The check mark indicates that the specific criterion could be assessed using the tool as provided and that this tool is recommended. The lightning icon indicates that a specific performance criterion might be assessed on a given day with the (direct observation) tool. The snowflake icon signifies that this type of tool could be developed to assess the specific criterion.
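The same mapping can also be kept in a simple machine-readable form so that gaps in coverage are easy to spot. The short sketch below (Python, illustrative only; the criterion and tool names are invented examples rather than the Toolkit's actual criteria) flags any performance criterion that is assessed with fewer than two or more than three tools, in line with the guidance above.

# Illustrative only; criterion and tool names are invented examples.
from typing import Dict, List

# Hypothetical matrix: performance criterion -> assessment tools applied to it
matrix: Dict[str, List[str]] = {
    "Monitor the current weather situation": ["direct observation", "experiential questions"],
    "Issue warnings of hazardous phenomena": ["simulation", "test", "portfolio"],
    "Record and report observations": ["direct observation"],  # under-covered
}

def check_coverage(matrix: Dict[str, List[str]]) -> None:
    """Flag criteria assessed by fewer than two or more than three tools."""
    for criterion, tools in matrix.items():
        if len(tools) < 2:
            print(f"{criterion}: only {len(tools)} tool(s) - add a second tool.")
        elif len(tools) > 3:
            print(f"{criterion}: {len(tools)} tools - some may be redundant.")

check_coverage(matrix)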
Table 1: Competency Assessment Matrix for Aeronautical Meteorological Observers. Links lead to the specific tools as well as details of the performance criteria.
Table 2: Competency Assessment Matrix for Aeronautical Meteorological Forecasters. Links lead to the specific tools as well as details of the performance criteria.