Assessment Services is a component of the Education and Training Directorate at RANZCOG and, as such, is responsible for the conduct of the College’s assessment processes and examinations, as overseen by the Education and Assessment Committee (EAC). The focus of the EAC, the Director of Education and Training and the Assessment Services department is to maintain a culture of best practice in RANZCOG examinations to further enhance examination processes.
Standard setting is the process used to select a passing score for an examination that requires a pass/fail decision. For many years, College examinations had a set or fixed pass mark. Several years ago, after trialling alternative methods, the College replaced the set pass mark with the concept of a minimum acceptable passing standard (MAPS). The MAPS method is a form of criterion-referenced standard setting: the pass mark is set to an absolute standard that denotes the level of competence or performance a candidate must demonstrate in order to pass each question and, ultimately, the examination.
Since its initial introduction for the MRANZCOG examinations, the application of the MAPS method has been reviewed and refined, and adopted for all College examinations. The College uses several different types of criterion-referenced standard setting methods, including modified Angoff and Rothmans, as different methods are considered more suited to particular types of examinations and cohort sizes.
Criterion-referenced standard setting is recognised as the method of choice for high-stakes examinations and is now a requirement of the Australian Medical Council (AMC). The College works hard to ensure that the standard is set robustly, objectively and consistently. All examiners now participate in moderation and education sessions before assessing, to ensure that a fair pass mark is derived based on the knowledge expected from candidates and the examination difficulty.
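As an illustration of how a criterion-referenced pass mark can be derived, the arithmetic behind a modified Angoff procedure can be sketched as follows: each judge estimates, question by question, the probability that a minimally competent candidate would answer correctly, and the cut score is derived from the mean of those estimates. The judges, ratings and scaling below are entirely hypothetical and are not the College's actual data or procedure.

```python
# Hypothetical sketch of a modified Angoff calculation.
# Each judge estimates, per question, the probability (0-1) that a
# minimally competent candidate would answer correctly; the pass mark
# is the mean of those estimates, scaled to the exam's total marks.

judge_ratings = {  # hypothetical ratings: judge -> per-question estimates
    "Judge A": [0.60, 0.75, 0.50, 0.80],
    "Judge B": [0.55, 0.70, 0.45, 0.85],
    "Judge C": [0.65, 0.80, 0.55, 0.75],
}

def angoff_pass_mark(ratings: dict[str, list[float]], total_marks: float) -> float:
    """Mean of all judges' per-question estimates, scaled to total marks."""
    per_question = list(zip(*ratings.values()))           # group estimates by question
    question_means = [sum(q) / len(q) for q in per_question]
    cut_proportion = sum(question_means) / len(question_means)
    return cut_proportion * total_marks

print(angoff_pass_mark(judge_ratings, total_marks=100))
```

Averaging across judges in this way is what makes the standard a property of the panel's collective judgement rather than of any single examiner, which is also why the moderation sessions described above matter: they calibrate the judges before their estimates are combined.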
Question writing, review and refinement
Statistical reviews of the MRANZCOG and DRANZCOG multiple choice question (MCQ) data banks are undertaken regularly to identify questions in need of revision owing to poor performance ratings. When such questions are identified, they are temporarily withdrawn from the data bank and kept aside for a workshop where they are further reviewed and revised by Fellows or, if they are considered irreparable or no longer current, they are withdrawn permanently from the bank. Reviewing, refining and writing new questions is time- and resource-intensive. To ensure high-quality questions, the relevant workshops cover areas such as the features of a good question, the principles behind writing a good question, how to ensure validity, the need for peer review and question performance. All MCQs have been blueprinted against the curriculum; areas not sufficiently addressed in existing examination questions are identified and targeted for question development at these workshops.
Feedback to examination candidates
All examination candidates now receive written information and feedback on their performance in the Written and Oral Examinations, regardless of whether they passed or failed the respective examination.
For the MRANZCOG Written MCQ examination, candidates receive the pass mark for the MCQ examination, their performance in relation to the passing mark expressed within a percentage range and their performance in the listed MCQ sub-heading topics.
For the MRANZCOG short answer question (SAQ) paper, candidates receive the pass mark for the SAQ examination, their performance in relation to the passing mark expressed within a percentage range and their performance in each of the 12 questions in relation to the MAPS score (well below, below, at, above or well above).
For the MRANZCOG Oral Examination, candidates receive the pass mark for the examination, their performance in relation to the passing mark expressed within a percentage range and their performance in each of the ten stations in relation to the MAPS score for that station (well below, below, at, above or well above MAPS). Similar written feedback is also provided to all Diploma and Subspecialty Examination candidates.
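The per-question and per-station feedback categories above (well below, below, at, above or well above MAPS) amount to banding a score by its distance from the MAPS value. A minimal sketch of such banding follows; the ±5% and ±15% boundaries are hypothetical assumptions for illustration, not the College's actual cut-offs.

```python
# Hypothetical banding of a score against its MAPS value.
# The five bands mirror the feedback categories described above;
# the +/-5% and +/-15% boundaries are illustrative assumptions.

def maps_band(score: float, maps: float) -> str:
    """Return the feedback band for a score relative to MAPS."""
    diff = (score - maps) / maps * 100  # percentage distance from MAPS
    if diff < -15:
        return "well below"
    if diff < -5:
        return "below"
    if diff <= 5:
        return "at"
    if diff <= 15:
        return "above"
    return "well above"
```

Reporting a band rather than a raw mark gives candidates a sense of how far they sat from the required standard on each question or station without implying spurious precision.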
Access to question banks for candidates
To encourage learning and discussion with peers and ensure a level playing field, 100 MRANZCOG MCQs and 100 DRANZCOG MCQs have been placed on CLIMATE, the College’s eLearning platform. Questions can be accessed by all candidates enrolled in the respective qualifications.
Examination revision courses
Examination revision courses are offered by most of the Regional Offices to assist local candidates in preparing for their examinations. Guidelines have been developed to ensure that the programs offered are uniform in purpose, content and quality; that they are delivered consistently across the states and regions in teaching approach and topics covered; and that they cover the areas core to the practice of obstetrics and gynaecology and those commonly addressed in the Written and Oral examinations. In addition, pass/fail results are now sent to Regional Offices once all results have been released to candidates; Regional Offices can use this information to tailor their programs and/or add revision courses and activities if deemed necessary.
Support for examiners
In order to build the reliability of the examination-marking process, strategies have been implemented to provide FRANZCOG examiners with information about standard setting, calibration and the importance of inter-marker reliability. Three ‘Marking Centres’ have been held for examiners marking the MRANZCOG SAQ paper over the last 12 months. The objectives of the full-day Marking Centre sessions have been to provide: support to the examiner cohort; an opportunity for examiners to confirm assessment principles and marking guidelines face-to-face; a method to encourage inter-marker consistency; and a process to expedite the marking of the many papers involved. These sessions have also allowed trialling of the new online marking portal during its development.
Feedback to examiners
For all College examinations, examiners are now provided with feedback on their performance. For the MRANZCOG and Subspecialty Written Examinations, examiners receive: the average and range of scores they awarded to candidates compared with those of their co-marker(s); the range of scores they assessed as being 'At MAPS' for the question(s) they marked, alongside that of their co-marker(s); and the determined 'At MAPS' score for the relevant question(s). For the DRANZCOG, MRANZCOG and Subspecialty Oral Examinations, examiners receive: the pass mark calculated from the standard-setting process; the examiner pass mark calculated from their own scores submitted during that process; and specific data on the stations they examined, including a comparison of their scores (median, range, mean) and MAPS with those of other examiners for those stations.
Providing individual feedback allows and encourages examiners to review their practice in comparison with that of their peers and is one of a range of steps implemented to ensure best practice.
Online marking and examinations portal
Considerable headway has been made with the design, development and deployment of an online marking and examinations portal located on the College’s Learning Management System. The online marking function is now fully operational, with all written examination papers marked online. The portal enables secure access to examination questions and candidate papers together with allocated grading criteria, the allocation of multiple markers, online submission of marks and written feedback by examiners and better reporting mechanisms.
The online examination function is continuing to be progressively developed and refined. Three online trials have already been held at secure Computer Centres in several states, with candidates volunteering to undertake their MCQ and SAQ examinations online. Following a staged implementation, it is envisaged that online completion of all College written examinations will be in place for all candidates from 2017.
Processes for marking and calculating examination results are continuously examined to ensure fairness to all candidates and to consider areas where improvements could be made. Other recent initiatives include:
- management of third marker marks to improve reliability and fairness;
- removal of discretionary marks in all SAQ papers;
- changes to the results calculations;
- trialling of alternative concurrent standard setting methods;
- SAQ writing workshops;
- updates to examination policy and regulations; and
- processes to address examination security.
The introduction of new, and the enhancement of existing, assessment processes represent significant advances in the quality and reliability of the examinations and assessment processes at RANZCOG. It must be acknowledged, however, that the success of these continuous improvements is only possible because of the many hours, including evenings and weekends, that Fellows generously give as they work with College staff to ensure best practice.