Course Content
Prepare to develop an assessment tool
1.1 Clarify tool purpose, target group and context of assessment
1.2 Analyse target group characteristics and identify their needs relevant to assessment tool design and development
1.3 Access and analyse nationally recognised units of competency to identify what is required to demonstrate competence
1.4 Analyse available assessment instruments for their suitability for use, and identify required modifications
Plan and design an assessment tool
2.1 Review own skills and knowledge required to develop the assessment tool and identify gaps in subject matter expertise, industry relevance and industry currency
2.2 Address identified gaps according to organisational procedures
2.3 Determine steps and estimate time needed for the design and development of assessment tool
3. Design assessment tool
3.1 Review and select assessment methods appropriate to purpose, target group, required evidence collection and assessment context
3.2 Check and confirm that combination of assessment methods meets unit of competency requirements and supports principles of assessment and rules of evidence
3.3 Identify instruments required to collect evidence using selected assessment methods and according to organisational requirements
Develop an assessment tool
4.1 Record the context and conditions for assessment
4.2 Develop tasks to be administered to candidates
4.3 Develop outline of evidence to be gathered from candidate
4.4 Develop instruments to be used to collect evidence from candidate in line with universal design principles and according to legislative and regulatory requirements
4.5 Develop criteria to be used to make judgements about whether competence has been achieved
4.6 Develop administration, recording and reporting requirements
4.7 Develop instructions for assessor and for candidate
4.8 Map assessment tool to the nationally recognised training product
4.9 Document draft assessment tool according to organisational procedures
Finalise the assessment tools
5.1 Undertake a systematic review of the assessment tool according to organisational procedures
5.2 Trial assessment tool to validate its content and applicability
5.3 Collect and document feedback on assessment tool and amend tool as required
5.4 Finalise and store assessment tool according to organisational procedures
TAEASS512 – Design and develop assessment tools
Here’s a breakdown of best-practice processes for RTOs across five key areas:

1. Designing, Developing and Documenting Assessment Tools:

  • Subject Matter Expertise: Involve qualified trainers and subject matter experts (SMEs) in the design process to ensure assessments accurately reflect industry standards and learning objectives.
  • Alignment with Standards: Align assessments with the relevant training package requirements and nationally recognised units of competency, and with the Standards for Registered Training Organisations regulated by the Australian Skills Quality Authority (ASQA) https://www.asqa.gov.au/ (a simple mapping sketch follows this list).
  • Validity and Reliability: Ensure assessments are valid (measure what they intend to) and reliable (consistent results across assessors). Pilot test with a sample group and revise based on feedback.
  • Clear Instructions and Formatting: Provide clear instructions, assessment tasks and marking criteria for students and assessors, and ensure the format is accessible and user-friendly.
  • Version Control: Implement a version control system to track changes made to assessment tools, with clear documentation of revisions and dates.
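As a minimal sketch of the mapping idea behind the Alignment with Standards point, each assessment task can be mapped to the performance criteria it evidences, and the map used to check coverage. The task names and criterion codes below are illustrative assumptions only, not prescribed content:

```python
from typing import Dict, List

# Illustrative mapping of assessment tasks to the unit requirements they evidence.
# Task names and criterion codes are assumptions for this sketch.
mapping: Dict[str, List[str]] = {
    "Task 1 - Knowledge questions": ["1.1", "1.2", "1.3"],
    "Task 2 - Design project":      ["3.1", "3.2", "3.3", "4.2", "4.5"],
    "Task 3 - Trial and review":    ["5.1", "5.2", "5.3"],
}

# Quick coverage check: which required criteria are not yet evidenced by any task?
required = {"1.1", "1.2", "1.3", "1.4", "3.1", "3.2", "3.3", "4.2", "4.5", "5.1", "5.2", "5.3"}
covered = {pc for pcs in mapping.values() for pc in pcs}
print("Unmapped criteria:", sorted(required - covered))
```

A mapping matrix like this makes gaps visible before the tool is finalised, rather than during validation or audit.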

2. Gathering, Organising and Recording Assessment Evidence and Decisions:

  • Comprehensive Evidence Collection: Collect a variety of assessment evidence to demonstrate student competency, including assignments, projects, observation records, and witness testimonies.
  • Secure Storage: Store assessment evidence securely, electronically or in hard copy, following data privacy and confidentiality regulations.
  • Detailed Recording: Record assessment decisions clearly, including the justification for each result and any reasonable adjustments applied.
  • Easy Retrieval: Organise and store assessment records so they can be easily retrieved and reviewed (see the sketch after this list).
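As a minimal sketch only, the record-keeping points above could be captured in a simple structured record. The field names, file paths and values below are illustrative assumptions, not prescribed fields:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

# Illustrative structure for a single assessment decision record.
# Field names and values are assumptions for this sketch.
@dataclass
class AssessmentRecord:
    candidate_id: str
    unit_code: str                      # nationally recognised unit assessed
    assessor: str
    decision: str                       # e.g. "Competent" / "Not Yet Competent"
    decision_date: date
    justification: str                  # why the decision was reached
    reasonable_adjustments: str = ""    # any adjustments applied, documented
    evidence_items: List[str] = field(default_factory=list)  # links/paths to stored evidence

records: List[AssessmentRecord] = [
    AssessmentRecord(
        candidate_id="C-1042",
        unit_code="TAEASS512",
        assessor="J. Smith",
        decision="Competent",
        decision_date=date(2024, 5, 17),
        justification="All tasks met the documented performance criteria.",
        evidence_items=["evidence/C-1042/project.pdf", "evidence/C-1042/observation.docx"],
    )
]

# Easy retrieval: filter the register by unit (or candidate) when records are reviewed.
for r in records:
    if r.unit_code == "TAEASS512":
        print(r.candidate_id, r.decision, r.decision_date)
```

Keeping the justification and any reasonable adjustments alongside the evidence links means a reviewer can reconstruct the decision without hunting through separate files.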

3. Accessing Subject Matter Experts (SMEs):

  • Qualifications and Experience: Seek SMEs with relevant industry experience and qualifications aligned with the training package being assessed.
  • Clear Selection Criteria: Develop clear selection criteria for SMEs and maintain a register of qualified individuals.
  • Induction and Training: Provide induction and training for SMEs on RTO assessment practices, including ethical considerations and RPL processes.
  • Conflict of Interest: Manage conflicts of interest and ensure SMEs are independent from students they assess.

4. Recognition of Prior Learning (RPL):

  • Competency-Based Assessment: Base RPL decisions on a thorough assessment of a student’s existing skills and knowledge against the unit requirements.
  • Variety of Evidence: Utilise a variety of evidence sources for RPL, such as work experience records, portfolios, qualifications, and testimonials.
  • Formal Process: Establish a formal RPL process with clear guidelines and application procedures.
  • Appeals Mechanism: Implement a fair and transparent appeals process for students whose RPL applications are rejected.

5. Version Control:

  • Version Tracking System: Utilise a version control system (VCS) to track changes made to all RTO documents, including training materials, assessment tools, and policies.
  • Version Control Procedures: Develop clear procedures for documenting revisions, assigning version numbers and ensuring everyone uses the latest version (see the register sketch after this list).
  • Version Approval: Establish an approval process for new versions of documents, ensuring compliance and quality.
  • User Access Control: Implement user access controls for the VCS to prevent unauthorised modifications.
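For example, a minimal sketch of a document version register, assuming illustrative document names, version numbers, approvers and statuses (none of these are prescribed by the Standards):

```python
from dataclasses import dataclass
from datetime import date
from typing import List

# Illustrative entry in a document version register.
# Document names, version numbers and statuses are assumptions for this sketch.
@dataclass
class DocumentVersion:
    document: str            # e.g. "TAEASS512 Assessment Tool"
    version: str             # e.g. "1.1"
    date_revised: date
    summary_of_changes: str
    approved_by: str         # person who signed off the new version
    status: str              # "Draft", "Approved", "Superseded"

register: List[DocumentVersion] = [
    DocumentVersion("TAEASS512 Assessment Tool", "1.0", date(2023, 8, 1),
                    "Initial release", "RTO Manager", "Superseded"),
    DocumentVersion("TAEASS512 Assessment Tool", "1.1", date(2024, 2, 10),
                    "Updated mapping after validation feedback", "RTO Manager", "Approved"),
]

# Ensuring everyone uses the latest version: take the most recent "Approved" entry.
latest = max((d for d in register if d.status == "Approved"), key=lambda d: d.date_revised)
print(f"Current version: {latest.document} v{latest.version} ({latest.date_revised})")
```

Whether the register lives in a spreadsheet, a document management system or a dedicated VCS, the key is that every revision, its approver and its status are recorded and that superseded versions are clearly marked.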


By following these best practices, RTOs can ensure their assessment processes are fair, reliable, and compliant with regulatory requirements.