3. Introduction to Competency Assessment System

The Toolkit follows quality management system (QMS) principles. When applied, the Toolkit will fit directly into an existing or a future QMS. The goal is to describe processes and to improve quality over time. The collection of evidence and the documentation of results are as critical in competency assessment as they are in any QMS.

3.1 Assessment principles

Assessment principles state that assessments must be valid, reliable, flexible and fair.

Validity refers to the extent to which the interpretation and use of an assessment outcome can be supported by evidence. An assessment is valid if it integrates the required knowledge and skills with the practical application of a workplace task, and if the assessment outcome is fully supported by the evidence gathered.

Reliability refers to the level of consistency and accuracy of the assessment outcomes; that is, the extent to which the assessment provides similar outcomes for candidates with equal competence at different times or places, regardless of the assessor conducting the assessment. It also implies repeatability, in other words, that the candidate can demonstrate competence on more than one occasion, and in more than one context.

Flexibility refers to the opportunity for personnel to negotiate certain aspects of their assessment, timing for example, with their assessor. All candidates should be fully informed (through the assessment plan) of the purpose of assessment, the assessment criteria, the methods and tools used, and the context and timing of the assessment.

Fair assessment does not advantage or disadvantage particular candidates or groups of candidates. This may mean that assessment methods are adjusted for particular candidates (such as people with disabilities or from different cultural backgrounds) to ensure that the methods used do not disadvantage them. An assessment should not place unnecessary demands on candidates that might prevent them from demonstrating competence. For example, an assessment should not demand a higher level of English language or literacy than that required to perform the workplace standard outlined in the competencies being assessed. Nor should the assessment process prevent anybody from demonstrating their competence, skills or knowledge simply because the design of the assessment differs from the actual work and so places them at a disadvantage.

In summary, assessment processes used must:

  • be consistent with the tasks and standards of the service area;
  • comply with the relevant assessment guidelines;
  • use a process that integrates knowledge and skills with their practical application in a workplace task (holistic approach);
  • target the correct qualification level;
  • be customizable.

3.2 Assessment process

Evidence is the information gathered which, when matched with the requirements of the competency, provides proof of competence. Evidence can take many forms and be gathered from a number of sources. It can be direct, indirect, or third-party supplementary. No single form of evidence is better than another. Indeed, applying direct, indirect and third-party supplementary evidence in combination can be the most effective (and fair) means of assessing an individual’s competence.

Assessment methods are the means of collecting the evidence required to demonstrate satisfactory performance. Different methods are used to collect evidence, which must be valid, sufficient, current and authentic. The main examples are:

  • direct observation of the task in real time by the assessor in the workplace;
  • completion of simulated examples, such as a case study;
  • answers to questions on forecast and warning processes;
  • a portfolio of actual forecasts and warnings.

3.3 Assessment tools and their descriptions

Once the method has been selected, the means for collecting and analyzing the evidence are then chosen or designed. These means are called assessment tools. In general, the term assessment tool is used to describe a document that contains both the instrument and the instructions for gathering and interpreting evidence.

(i) Direct observation

Direct observation evaluates the individual performing a task in real time. It enables the assessment of the actual processes employed by individuals undertaking operational activities. Performance assessments examine the AMP's actual application of knowledge to solve problems. The assessment focuses on both the process and the outcome.

Direct observation can be done in a real job environment or in a different environment using simulators (see "Simulations" below). Video recordings can also be used for direct observation where appropriate; they have the advantages of minimal disturbance of the workbench by the assessor and remote accessibility. The assessor cannot assess the thinking process of the forecaster or observer through a video recording, but this can be remedied through follow-up interviews and questioning. Because direct observation is limited to whatever is occurring at the time, it is usually combined with other tools to cover parts of the job that occur less commonly.

Advantages:

  • When combined with other tools, offers a complete picture of the competence of the assessee.
  • Is the most authentic tool.

Disadvantages:

  • Weather conditions are rarely the same from one assessment to the next, making it harder to ensure consistency across many AMPs.
  • Unlikely to be suitable for assessing rare events.

(ii) Simulations

A simulation is where the forecaster or observer is given a real or hypothetical situation and asked to respond as if he or she were on the job. Simulations range from simple exercises, such as experiential questions, case replay and demonstration, to mock briefings or full operational simulators.

Advantage:

  • Can cover aspects not covered through direct observation, or even replace that tool, depending on the complexity of the simulation.

Disadvantages:

  • Creating a realistic scenario is difficult and time consuming.
  • Building a library of case studies is time consuming, and the library may be difficult to maintain.

(iii) Tests

Tests are a more traditional method of assessing knowledge. In some cases in the Toolkit, knowledge is used as a substitute for competence to make the assessment practical and cost-effective. Tests can take the form of interviews, written quizzes or self-evaluations. Many styles of questioning can be used, including multiple choice, short answers and more open-ended questions.

Advantages:

  • Can address unusual aspects of competence that could not be covered through direct observation.
  • Are repeatable.

Disadvantage:

  • Being focused on knowledge, tests can be very difficult to construct as a true test of competence.

(iv) Experiential questions

Experiential questions are like tests, but the question asked is of the form "What would you do if…?". Experiential questions can be posed in written quizzes or oral interviews. In the case of oral questions, it is very important that the answers and results are documented.

Advantage:

  • A very useful tool for completing the picture of competence built through direct observation, covering the aspects that could not be observed.

Disadvantage:

  • Not as authentic as direct observation.

(v) Portfolio

A portfolio contains evidence of knowledge, ability or competence based on past experiences. Portfolio evidence can be powerful in demonstrating competency, as it provides clear evidence of what an individual has done. Testimonials from employers, peers or customers, as well as self-assessments, can be included in a portfolio. A portfolio could even describe unsuccessful examples along with the remedial work that the AMP has done to address the deficiency.

Advantages:

  • Promotes self-evaluation, reflection and critical thinking.
  • Measures performance based on genuine samples of the AMP's work.

Disadvantages:

  • May cover only the aspects the assessee selects, depending on how the portfolio is developed.
  • Can be time consuming for the assessee to collate.
  • Presents challenges if the AMP has not forecast or observed a particular situation before.
  • Can be time consuming for assessors to mark if many people require assessment.

3.4 Using multiple tools to build an assessment framework

Most competency assessments will be done using direct observational techniques. Of the tools available, direct observation is the most authentic, and it is used for most performance criteria that can be observed frequently. Other techniques, such as tests or experiential questions, will often be needed to provide evidence of competence for seasonal or rare events. A well-developed competency assessment system will use multiple tools and methods to demonstrate that personnel meet the required competencies.

3.5 Competency assessment matrix

Once the assessment tools have been established, it is good practice to map them back to the performance criteria. Table 1 shows a competency assessment matrix for aeronautical meteorological observers (AMO), while Table 2 is a matrix for aeronautical meteorological forecasters (AMF). The specific performance criteria are listed in the left column, while each of the other columns represents one tool or type of tool. Ideally, each performance criterion should be assessed with at least two tools to demonstrate that the forecaster can apply the skills and knowledge in a variety of contexts. More than three assessment tasks for one performance criterion may be considered redundant. In a small organization, using two assessment tools might be difficult unless one of them requires little effort, such as existing or extended supervisor's reports.

The check mark (✔) indicates that the specific criterion could be assessed using the tool as provided and that this tool is recommended. The lightning icon (⚡) indicates that the specific performance criterion might be assessed on a given day with the (direct observation) tool. The snowflake icon (❄) signifies that this type of tool could be developed to assess the specific criterion.
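As a worked illustration, the mapping in the matrix can be treated as data and checked against the guideline above (at least two tools per criterion, more than three likely redundant). The sketch below is illustrative only: the example criteria and tool names are drawn from Table 1, but the `review_matrix` function and its thresholds are an assumption for demonstration, not part of the Toolkit.

```python
# Illustrative sketch (hypothetical helper): reviewing a competency
# assessment matrix against the two-to-three-tool guideline.

# Each performance criterion maps to the set of tools marked against it.
# These example rows reflect a few entries from Table 1.
matrix = {
    "2.1 temperature": {"Tool 23", "Tool 21"},
    "2.1 pressure": {"Tool 23"},
    "2.3 issue obs in correct format": {"Tool 23", "Tool 22"},
    "3.2 quality check observations": set(),
}

def review_matrix(matrix):
    """Return criteria that fall outside the two-to-three-tool guideline."""
    findings = {}
    for criterion, tools in matrix.items():
        if len(tools) < 2:
            findings[criterion] = "under-assessed (fewer than two tools)"
        elif len(tools) > 3:
            findings[criterion] = "possibly redundant (more than three tools)"
    return findings

for criterion, note in review_matrix(matrix).items():
    print(f"{criterion}: {note}")
```

Run against the sample rows, this flags the criteria covered by fewer than two tools, which is exactly the gap analysis the matrix is meant to support.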


| Competency                            | Tool 23 | Tool 25 | Tool 22 | Tool 21 | Tool 24 |
|---------------------------------------|:-------:|:-------:|:-------:|:-------:|:-------:|
| 1 Analyze and describe weather        |         |         |         |         |         |
| 2.1 general observing process         | ✔       |         |         |         |         |
| 2.1 wind                              | ✔       |         |         |         |         |
| 2.1 visibility, RVR and vertical vis  | ⚡      |         |         | ✔       |         |
| 2.1 significant wx phenomena          | ⚡      |         |         |         |         |
| 2.1 cloud type, ceiling               | ⚡      |         |         |         |         |
| 2.1 temperature                       | ✔       |         |         | ✔       |         |
| 2.1 pressure                          | ✔       |         |         |         |         |
| 2.1 other phenomena                   | ⚡      |         |         |         |         |
| 2.2 interpret various sensors         | ✔       |         |         |         |         |
| 2.3 issue obs on time                 | ✔       |         |         |         |         |
| 2.3 issue obs in correct format       | ✔       |         | ✔       |         |         |
| 2.3 obs are within amend criteria     | ✔       |         | ✔       |         |         |
| 3.1 apply QMS                         |         |         | ✔       |         |         |
| 3.2 quality check observations        |         |         |         |         |         |
| 3.3 correct errors                    |         |         | ✔       |         |         |
| 3.4 monitor operational systems       |         |         | ✔       |         |         |
| 4.1 ensure dissemination of obs       |         | ✔       |         |         |         |
| 4.2 presentation of met info          |         |         |         |         | ✔       |
| 4.3 alert forecasters to sig wx       |         | ✔       |         |         |         |

    Table 1:  Competency Assessment Matrix for Aeronautical Meteorological Observers.  Links lead to the specific tools as well as details of the performance criteria.


| Competency                              | Tool 3 | Tool 5 | Tool 1 | Tool 2 | Tool 6 | Tool 4 | Tool 7 |
|-----------------------------------------|:------:|:------:|:------:|:------:|:------:|:------:|:------:|
| 1.1 analysis                            | ✔      |        |        |        |        |        |        |
| 1.1 diagnosis                           | ✔      |        |        |        |        |        |        |
| 1.2 monitor situation                   | ✔      |        |        |        |        |        |        |
| 1.3 assess need for amd                 | ✔      |        |        | ✔      | ✔      |        | ✔      |
| 2.1 general forecast process            | ✔      |        |        |        |        |        |        |
| 2.1 temperature/humidity                | ✔      |        |        |        |        |        |        |
| 2.1 wind                                | ✔      |        |        |        | ✔      |        |        |
| 2.1 pressure                            | ✔      |        |        |        |        |        |        |
| 2.1 cloud                               | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 precipitation                       | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 reduced visibility                  | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 obstructions to visibility          | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 thunderstorms                       | ⚡     |        | ❄      |        | ❄      |        | ✔      |
| 2.1 turbulence                          | ⚡     |        | ✔      |        | ❄      |        | ❄      |
| 2.1 icing                               | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 wake vortex                         | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.2 forecast on time                    | ✔      |        |        |        |        |        |        |
| 2.2 correct format                      | ✔      |        |        | ✔      |        | ✔      |        |
| 2.2 within amend criteria               | ✔      |        |        | ✔      |        | ✔      |        |
| 2.3 monitor adjacent forecasts/warnings | ✔      |        |        |        |        | ✔      | ✔      |
| 2.3 liaise with adjacent regions        | ✔      |        |        |        |        | ✔      | ✔      |
| 3.1 severe thunderstorms                | ⚡     |        | ❄      |        | ❄      |        | ✔      |
| 3.1 severe turbulence                   | ⚡     |        | ✔      |        | ❄      |        | ❄      |
| 3.1 severe wind and shear               | ⚡     |        | ❄      |        | ✔      |        | ❄      |
| 3.1 severe icing                        | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 cloud below aerodrome minima        | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 hazardous phenomena                 | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 sand/dust storms                    | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 volcanic ash                        | ⚡     | ✔      | ❄      |        | ❄      |        | ❄      |
| 3.1 tropical cyclones                   | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 radioactive cloud                   | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.2 warnings issued on time             | ✔      |        |        |        |        |        |        |
| 3.2 correct format                      |        |        |        | ✔      |        | ✔      |        |
| 3.2 warnings meet update criteria       |        |        |        | ✔      |        | ✔      |        |
| 3.3 monitor adjacent forecasts/warnings | ✔      |        |        |        |        | ✔      |        |
| 3.3 liaise with adjacent regions        | ✔      |        |        |        |        | ✔      |        |
| 4.1 apply QMS                           |        |        |        | ✔      |        |        |        |
| 4.2 errors/unrepresentative obs         | ✔      |        |        |        |        |        |        |
| 4.3 validates information               |        |        |        |        |        |        |        |
| 4.4 monitor systems                     | ✔      |        |        |        |        |        |        |
| 5.1 ensure dissemination                | ✔      |        |        |        |        |        |        |
| 5.2 briefing                            | ✔      |        |        |        |        | ✔      | ✔      |

    Table 2:  Competency Assessment Matrix for Aeronautical Meteorological Forecasters.  Links lead to the specific tools as well as details of the performance criteria.