Evaluation Plan – Formal Assessment Strategies with Rationale

PIDP 3230 – Evaluation of Learning
Vancouver Community College
Kathryn Truant
July 4, 2017

Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs): Evaluation Plan A – Formal Assessment Strategies with Rationale

Course:

Oral and Maxillofacial Surgery Module for Certified Dental Assistants (CDAs)

Evaluation Plan

Strategies                                             Value
1. Theory Examinations                                  10%
2. Clinical Performance Checklists: Peer-Assessment     30%
3. Clinical Performance Checklists: Self-Assessment     30%
4. Handbook Evaluation                                  30%

Table 1

Course Goals                                        Strategies
                                                    1    2    3    4
Demonstrate perioperative care of a patient
Prevent surgically associated infections
Assist in the extraction of teeth
Assist in oral pathologies
Assist in grafting procedures
Assist in implantology

Table 2

Formal Assessment Strategies Rationale

1. Description of the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs)

This advanced (hypothetical) training program in the dental specialty of Oral and Maxillofacial Surgery (OMS) enables Certified Dental Assistants regulated by the College of Dental Surgeons of BC (CDSBC) to learn and master the skills required for oral surgery. Although the course is optional for employment, successful completion allows a CDA to receive an OMS designation with their licensure. The OMS Module provides the advanced knowledge, skills, and attitudes necessary for a CDA to enter the specialty, and offers a professional development opportunity for CDAs currently working in an OMS clinic (Truant, 2016). The module consists of six units that progress through basic perioperative and asepsis standards before the CDA receives instruction on more advanced OMS procedures (see Table 2).

The course is 30 hours in duration: a 10-hour theory component and a 20-hour clinical component. Both components take place in an OMS clinic using simulated patients. The course is delivered in a series of ten 3-hour lessons, two evenings a week over five weeks. CDAs receive 12 continuing education hours toward their annual license renewal requirement. Learner success in the module is measured with two theory examinations, six clinical performance checklists (self- and peer-assessed), and the completion of a handbook composed of unit handouts (see Table 1).

Stakeholders of the Module and Impact:

Stakeholder – Impact

CDA (Learner): The pursuit of advanced training requires a financial and time commitment. Learning is meaningful with genuine participation in all aspects of the learning process, which relies heavily on self-assessment and peer-assessment. Learners become ambassadors of the module and of their own profession.

CDA (Instructor): The instructor is committed to delivering a curriculum that aligns with what the learners set out to learn, and to providing evaluation that is valid and reliable. The instructor must strive to create a positive learning environment, and represents both their profession and the module.

CDA (Module Coordinator): The module coordinator is responsible for designing a curriculum that reflects what a CDA needs to learn to succeed in the specialty of OMS; the success of the learners reflects on the module. The coordinator is also responsible for assigning skilled instructors, balancing a budget, and promoting the module.

Oral and Maxillofacial Surgeons: Employers trust that the module produces CDAs with advanced knowledge, skills, and attitudes. Employers rely on the module to increase workplace safety and production, and may be motivated to support employees in pursuing the advanced training module.

College of Dental Surgeons of British Columbia (CDSBC): The CDSBC regulates dentistry in the public interest for the purpose of safety. The CDSBC endorses the module and approves the special OMS designation on a CDA’s license. The CDSBC expects the module to produce skilled and professional members who represent the College.

The Citizens of British Columbia: The public relies on a governing body (the CDSBC) to register skilled professionals who meet all requirements for licensure.

Table 3

Grading System:

The grading system of the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs) combines the formal assessment strategies noted in Table 1. The course outline explains that attendance at all sessions is mandatory, and that self- and peer-assessment comprise most of the measurement. Evaluation of the learning that occurs relies on a criterion-referenced approach to grading: “criterion referenced assessment items measure a course, its objectives, content and activities directly” (Vancouver Community College, 2017). Success in the theory component of the module is measured using an absolute grading standard, which aligns with a criterion-referenced approach to assessment:

A = 90% +
B = 80-89%
C = 70-79%
D = 60-69%
F = Below 60%

Using this grading standard, a CDA must achieve a minimum of 70% on the theory exams to demonstrate readiness to advance to the clinical portion of the unit, or to the next unit.
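
To make the standard concrete, the following is a minimal sketch (written in Python purely for illustration; the function names are hypothetical, while the thresholds come from the grading standard above) of how a theory exam percentage maps to a letter grade and to the 70% readiness threshold:

def letter_grade(percent):
    """Map a theory exam percentage to the module's absolute grading standard."""
    if percent >= 90:
        return "A"
    if percent >= 80:
        return "B"
    if percent >= 70:
        return "C"
    if percent >= 60:
        return "D"
    return "F"

def ready_to_advance(percent):
    """A CDA needs a minimum of 70% on a theory exam to advance to the clinical portion."""
    return percent >= 70

print(letter_grade(84), ready_to_advance(84))  # B True
print(letter_grade(65), ready_to_advance(65))  # D False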

The clinical performance checklists and the handbook evaluation use the pass/fail system of grading: the CDA must complete all checklist criteria and handbook items to pass. “In theory, whenever mastery learning is used (learners repeat course material until they have passed) a pass/fail system should be used” (Vancouver Community College, 2017). The evaluation process is summative because it “occurs at the end of a unit or course of study. Its purpose is to summarize what the learner has accomplished and the growth that has taken place” (Fenwick & Parsons, 2009, p. 23).

Time Spent on Formal Assessment Strategies:

The two theory examinations are 45 minutes each, for a total of 1.5 hours spent on cognitive knowledge assessment instruments. The clinical performance checklists are 1 hour each, for a total of 6 hours. The handbook evaluation takes place during the final session and requires 1 hour to complete. In total, 8.5 of the module’s 30 hours are spent on formal assessment strategies, leaving ample time (21.5 hours) for instruction, discussion, informal assessment activities (e.g., Course-Related Self-Confidence Surveys), and practice of clinical performance.
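
The arithmetic behind these totals can be checked with a brief sketch (Python, for illustration only; the labels and variable names are mine):

# Formal assessment time within the 30-hour module, per the course description.
course_hours = 30
assessment_hours = {
    "theory examinations (2 x 45 min)": 2 * 0.75,
    "clinical performance checklists (6 x 1 hr)": 6 * 1.0,
    "handbook evaluation (1 hr)": 1.0,
}
total_assessment = sum(assessment_hours.values())   # 8.5 hours
remaining_hours = course_hours - total_assessment   # 21.5 hours for instruction and practice
print(total_assessment, remaining_hours)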

2. Purpose of Evaluation for the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs)

In The Art of Evaluation, Fenwick & Parsons (2009) outline nine purposes of evaluation. These purposes, and how they relate to the module, are as follows:

  1. Measure learner performance against the goals of the module. A CDA will be evaluated with valid items that align with the course goals and with reliable items that reflect what is consistently taught in the course.
  2. Motivate learners by showing them how they can improve their own learning. For example, do they need extra practice performing a clinical duty? Can CDAs with prior OMS experience assist their peers? Using evaluation for this purpose gives learners control over the actions they need to take to improve; this creates a positive learning environment.
  3. Monitor learners’ ongoing progress. The module is divided into six units that the CDA must master individually before progressing to the next unit.
  4. Assess teaching methods. How effective has instruction been? Are the students learning what the instructor is teaching? Are employers satisfied with the skills of the CDA’s who have taken the module?
  5. Look for ways that the module can or should be revised. Has enough information been provided to the learner prior to a theory exam? Has sufficient time been provided for students to practice prior to a clinical performance evaluation? Can learners share any new or insightful “tips of the trade” with the instructor? Are there new techniques that learners can suggest to improve the curriculum? Using evaluation for this purpose creates a positive learning environment.
  6. Provide information to the stakeholders (see Table 3). Evaluation is a quantitative process that can reassure all parties involved that the module is accountable for what it sets out to produce: CDAs with advanced knowledge, skills, and attitudes in OMS.
  7. Assess learners’ background knowledge. Prior learning, specifically licensure as a CDA with clinical experience, is a prerequisite of the module. Teaching must be efficient; for example, if the module’s goals are to teach advanced skills, an instructor can assess at the onset of a course whether content is redundant.
  8. Assess learner satisfaction. Instructors have an obligation to provide the best service to their learners. Using evaluation for this purpose creates a positive learning environment when learners feel that they have input into improving the instruction process.
  9. Develop self-assessment skills in learners. Self-assessment is a key component of the evaluation process of the module; “learners develop strategies and criteria for monitoring their own ongoing performance” (Fenwick & Parsons, 2009, p. 5).

3. Claims of Reliability and Validity for Each Formal Assessment Strategy in the Evaluation Plan for the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs) (see Table 1)

1. Theory Examinations

The two theory examinations for Units 1 and 2 of the module are reliable knowledge assessment instruments (objective) because they consistently measure exactly what is supposed to be measured.

Examples of Reliability:

Test-Retest Reliability – Testing the same group of students twice using the same exam will produce consistent results. There may be an improvement on the second attempt because students may learn from an error, but overall the results will be similar.

Internal Reliability – Items on the exams are consistent in terms of difficulty.

Inter-Rater Reliability – Exam results will be consistent regardless of instructor. The items are clear and the correct responses do not vary. The items are objective; therefore, measurement is reliable.

The two theory examinations are valid knowledge assessment instruments (objective) because they measure the theory items that are supposed to be measured and reflect instruction.

Examples of Validity:

Consequence Validity – The exams measure the knowledge that the comprehensive theory instruction intends the learner to gain.

Construct Validity – I could not find an example of construct validity for the theory exams because the exams are comprehensive representations of the unit’s objectives and are not task-oriented; I could not see a way to assess learner performance in this manner.

Content Validity – Content of the exams are consistent with instruction.

Process Validity – The exams will obtain consistent results because the items reflect instruction.

Predictive Validity – The exams measure readiness to advance to clinical performance assessment.

2. Clinical Performance Checklists (Peer and Self-Assessment)

The 6 clinical performance checklists in the module are reliable authentic assessment instruments because they measure exactly what is supposed to be measured consistently.

Examples of Reliability:

Test-Retest Reliability – The checklists produce the same results when taken more than once by the same group of students. There is no room for error or omission; therefore, a consistent result is achieved.

Internal Reliability – All the observable criteria items on the checklist are assessed at the same degree of difficulty. The learner simply needs to complete the criteria in sequential order.

Inter-Rater Reliability – The performance checklist results will be consistent provided the learner completes all criteria, regardless of the person assessing (i.e., peer-assessment, self-assessment, or instructor assessment). The items are criterion-referenced; therefore, performance is measured against an absolute standard.

The 6 clinical performance checklists are valid authentic assessment instruments because they measure the observable criteria that are supposed to be measured and they reflect clinical instruction.

Examples of Validity:

Consequence Validity – The checklists measure the learner’s intended skills as performed.

Construct Validity – The checklists directly measure the learner’s performance of criteria items.

Content Validity – The criteria items reflect instruction (clinical demonstration).

Process Validity – The checklists will obtain consistent results. The checklists, as instruments, measure the process involved in assessing performance.

Predictive Validity – The results of the checklists will indicate future performance as a competent OMS CDA.

3. Handbook Evaluation

The handbook evaluation of the module is a reliable student-created product because it measures the content of the module consistently.

Examples of Reliability:

Test-Retest Reliability – Learners are given equal opportunity and resources to compile the handbook; therefore, results will be consistent between learners.

Internal Reliability – The handbook evaluation progresses through the module sequentially; therefore, the degree of difficulty of compiling the handbook increases incrementally (consistently).

Inter-Rater Reliability – The items required for handbook completion are criterion-referenced; therefore, the compilation must adhere to external standards (licensed OMS duties). Handbook evaluation results will be consistent regardless of instructor.

The handbook evaluation of the module is a valid student-created product because it measures what it is supposed to measure (the goals of the module), and it reflects all unit handouts, theory instruction and clinical performance checklists.

Examples of Validity:

Consequence Validity – The handbook evaluation measures the intended knowledge, skills, and attitudes of the module as a whole.

Construct Validity – The attitudes, values and beliefs of the module are measured when the learner completes the handbook.

Content Validity – The content of the handbook evaluation reflects instruction. When complete, the handbook is a comprehensive compilation of the module in its entirety.

Process Validity – The handbook evaluation will produce consistent results. The evaluation is explained at the onset of the module, so learners know what is expected for evaluation.

Predictive Validity – The results of the handbook evaluation will indicate future performance as an OMS CDA. The handbook will become an articulate and comprehensive reference for their professional practice.

4. Decision to Use a Formal Assessment Strategy from the Evaluation Plan for the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs): Clinical Performance Checklists: Peer-Assessment (see Table 1)

a. Why did I choose this formal assessment strategy?

The module supports a formal assessment strategy that facilitates peer-assessment, specifically an instrument whose items are criterion-referenced. In Oral and Maxillofacial Surgery, this matters because “criterion-referenced evaluation compares a learner’s performance to an absolute, external standard or criterion” (Fenwick & Parsons, 2009, p. 39). The absolute criterion is the set of standards established by the College of Dental Surgeons of BC (College of Dental Surgeons of BC, 2017).

Peer-assessment is an instrumental strategy for measuring the performance of a CDA entering the specialty of OMS, and for CDAs wanting to improve their professional practice. Peer-assessment “develops better understanding of the expected standards and evaluation process” (Fenwick & Parsons, 2009, p. 52). Further, because the module allots ample opportunity to practice peer-assessment, CDAs can continue to use it in their future professional practice as active members of a surgical team.

b. Are the peer-assessed clinical performance checklists formative or summative?

The checklists are a formal assessment strategy and therefore provide a summative measure of performance.

c. When will the peer-assessed clinical performance checklists occur and how much time do they require?

The checklists measure clinical performance following instructor demonstration, and take place at the end of each of the 6 units of the module. The time allowed to complete each checklist is one hour.

d. What goals in the module do peer-assessed clinical performance checklists reflect?

All the goals outlined in Table 2 are peer-assessed by the clinical performance checklists.

e. Are the peer-assessed clinical performance checklists cognitive or authentic assessments?

The checklists are authentic assessments. Although the assessment takes place on simulated patients, it provides relevant, authentic clinical learning and offers constructive feedback.

f. What is the value of assessment?

Each peer-assessed clinical performance checklist is worth 5% of the total module evaluation, for a combined total of 30% (see Table 1). I determined the value of each assessment based on the equal importance of each unit, while taking into consideration the values of the three other formal assessment strategies.
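
As a quick illustration (Python; the variable names are hypothetical), the per-checklist value rolls up into the weights in Table 1 as follows:

# Table 1 weights, with the peer-assessed checklists expressed as six checklists at 5% each.
weights = {
    "theory examinations": 10,
    "peer-assessed clinical performance checklists": 6 * 5,
    "self-assessed clinical performance checklists": 30,
    "handbook evaluation": 30,
}
assert weights["peer-assessed clinical performance checklists"] == 30
assert sum(weights.values()) == 100  # the plan accounts for the full module grade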

g. How will the learners be prepared for assessment?

Learner preparation in this strategy also involves peer preparation. Defining clear assessment criteria in the form of unit checklists tells learners and their peers exactly what is expected of them, and what to expect from their peers. “To provide helpful summative assessment for a peer, learners need clear criteria, [and] clear indicators” (Fenwick & Parsons, 2009, p. 256).

h. How and when will feedback be provided?

Feedback will be provided during, and immediately following the clinical performance checklist by the classmate observing and recording performance. In addition, the instructor will circulate among learners during the formal assessments to give quick verbal feedback to everyone.

5. How “Assessment for Learning” Will Be Applied in the Oral & Maxillofacial Surgery Module for Certified Dental Assistants (CDAs):

Assessment for learning occurs during the learning process. “It is not directed toward measuring how much a learner has learned after a unit of study, rather, it is primarily focused on how learners might improve the quality of their work during their study” (Fenwick & Parsons, 2009, p. 157).

This assessment approach is based on the belief that:

  1. Learners will improve if they understand the goals of learning.
  2. Learners will improve when they relate to the goals.
  3. Learners will improve when they engage with the goals enthusiastically.

(Fenwick & Parsons, 2009).

Assessment for learning will be applied by:

– observing and highlighting “aha” or “teaching” moments that arise from learner or instructor errors, working together in a team approach (much like being on a surgical team) to improve procedures without singling a learner out;

– engaging in activities that help learners evaluate their own and others’ work (e.g., self- and peer-assessed checklists);

– finding opportunities to revise instruction based on “learning” moments.

Assessment for learning in the module is a valuable way to create a positive learning environment by helping learners learn how to be better learners.

Please refer to my Resources page for works cited.