Lifelong learning
Vol. 24 No 2 | Winter 2022
There’s more than one ‘C’ in curriculum
RANZCOG Education Directorate
Curriculum, Evaluation and Accreditation Unit

What does it mean for RANZCOG when we say ‘curriculum review’? The College offers a prevocational pathway, nine training programs, and equivalent specialist international medical graduate (SIMG) pathways for specialist and subspecialist programs. Each program comprises a complex mix of training, learning and assessment experiences, with accompanying processes, policies and regulations. ‘Curriculum review’, to put it mildly, is a broad church.

The Curriculum Review Expert Advisory Panel (CREAP) was formed in 2018 to identify areas of specific focus for the College. Their report of mid-2019 established the main directions for curriculum review activities:

  • Structure and design of College curricula
  • Progression through programs to suit multiple career paths
  • A move to more programmatic forms of assessment

The key College committee currently leading this work is the Curriculum and Assessment Steering Group (CASG), chaired by the College’s Dean of Education, Prof Ian Symonds. They are supported by the Curriculum, Evaluation and Accreditation (CEA) unit of the College’s Education Directorate.

While the challenges of COVID-19 have sometimes slowed the work of curriculum review since early 2020 (dealing with operational needs has often had to take priority), significant progress has nonetheless been made across a range of areas – the ‘many Cs of curriculum’ detailed below.

The timely implementation of the outcomes of curriculum review is vital ahead of the College’s reaccreditation by the Australian Medical Council in 2023.

Contemporary – CanMEDS – Building for the future

The curricula of the College’s Diploma and Fellowship programs (though not subspecialty programs) currently use three ‘domains’ to define competent obstetricians and gynaecologists: Clinical Expertise, Academic Abilities and Professional Qualities. However, three domains cannot fully articulate every key role within a College curriculum: role attributes and requirements become homogenised and are not sufficiently represented.

The College is therefore adopting the CanMEDS Physician Competency Framework for all College training programs. First developed by the Royal College of Physicians and Surgeons of Canada in 1996, CanMEDS has become the most recognised and most widely applied healthcare profession competency framework in the world, used by medical colleges worldwide to identify and describe the abilities physicians require to effectively meet the healthcare needs of the people they serve.

Abilities are grouped thematically under seven roles, enabling a full articulation of what is expected of a qualified practitioner – the ‘whole person’ – and thus a comprehensive exploration of the breadth and depth of each training program:

  • Medical Expert
  • Communicator
  • Collaborator
  • Leader
  • Health Advocate
  • Scholar
  • Professional

In the words of Prof Symonds, ‘Adopting the CanMEDS framework will bring RANZCOG into line with what is recognised internationally in curriculum design for medical education and help us to place more emphasis on a patient-centred approach to professional development.’

Graduate Outcomes Statements

With the CanMEDS framework in place, the next step has been to develop a Graduate Outcomes Statement (GOS) for each training program. The GOS provides a high-level articulation of expectations to answer the following question: ‘What should a qualified practitioner look like as they emerge from the program in ten years’ time?’

The GOS is based around the CanMEDS roles, thus helping to flesh out the real person emerging from a program. It also incorporates a statement about scope of practice to identify the knowledge, skills and attributes a qualified practitioner will be expected to have attained upon satisfactory completion of training, in order to practise independently.

Each relevant College committee is refining a GOS to suit their program. The development of a GOS will ensure that the CanMEDS framework is adapted appropriately for each College training program, and with sufficient future-thinking to maintain its currency for a significant period. The GOS can then be used as the basis for a full review of curriculum detail.

Clarity and consistency – adopting a unified approach to curriculum structure

It is a challenge for any curriculum to explain in full the depth and breadth of required skills and knowledge at particular points in training. It is also a challenge to link skills and knowledge effectively. This becomes an issue when formulating examinations and other assessments and ensuring that they are fit for purpose.

In their July 2019 report, the CREAP identified a number of broad structural issues with current curricula, including:

  • Conflicting information in different documents
  • Basic and advanced learning requirements not defined in the FRANZCOG curriculum
  • Inconsistent structures used for mapping assessments against key competencies, with no such mapping in subspecialty curricula

To address these issues, a new standard curriculum structure design – known as Clinical Skills and Knowledge in Practice (CSKIP) – is to be applied across all College programs.

The CSKIP model provides a workable means of presenting learning outcomes and incorporating the necessary alignment with different stages of training, CanMEDS roles, teaching and learning strategies and assessment. It also shows the connections between each stage of training, and how knowledge is ‘nested’ within each stage – for example, knowledge acquired in early years is retained and extended in later years.

Prof Symonds suggests: ‘Having a common “language” for our different training programs not only reduces duplication of teaching and assessment resources, but also opens up greater possibilities of recognition of prior learning, and facilitates movement between different career pathways in women’s health within RANZCOG.’

‘There is now international recognition that every moment of assessment, no matter how casual, is a powerful tool to aid learning by providing meaningful feedback.’

A/Prof Robert Bryce, RANZCOG Specialist Advisor: Assessment


Connection and cohesion – FRANZCOG Advanced Training

The past decade or more has seen progressive shifts in the approach to the two-year advanced training component of the FRANZCOG program. No specific requirements were in place prior to 2010; from then until 2016, there was a simple requirement that training include at least 50% clinical experience. In 2017, Advanced Training Modules (ATMs) were articulated for the first time.

The future model of FRANZCOG Advanced Training will provide a pathway for all trainees, whatever their specific area of interest, while maintaining a common scope of practice for all those elevated to Fellowship.

The Advanced Training Pathway Framework (Figure 1) has been approved by the Education Standards Committee, and work is in progress to provide or refine the detailed requirements for each pathway. Of particular note:

  • There will be a clearly defined FRANZCOG pathway for each of the five subspecialties
  • The introduction of a Sexual & Reproductive Health Pathway represents a major shift in addressing a key area in women’s health

Critical judgment – examinations and assessment

Enhancing programmatic approaches to assessment

It is accepted in medical education that no individual assessment can determine whether a candidate is truly knowledgeable or competent. We can draw an inference about a trainee’s true knowledge or competence, but this inference may be incorrect in some cases. Therefore, there is always a need to reduce the likelihood that:

  • truly knowledgeable or competent candidates are judged as insufficiently knowledgeable or competent; OR
  • truly unknowledgeable or incompetent candidates are judged as sufficiently knowledgeable or competent.

Modern medical education seeks to consider data from a program or suite of assessments in order to increase the likelihood of drawing a correct inference – hence programmatic assessment.
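As a rough illustration only, the inference logic described above can be sketched in code. The sources, scores, weights and thresholds below are entirely hypothetical assumptions, not College policy; the point is simply that a decision is drawn from a suite of assessments, and deferred when the evidence base is too thin.

```python
# Illustrative sketch of programmatic assessment: a progression inference is
# drawn from several assessment sources, not a single examination result.
# All names, scores and thresholds here are hypothetical.

def progression_decision(results, threshold=0.6, min_sources=3):
    """Infer trainee readiness from a suite of assessment results.

    results: dict mapping assessment source to a score in [0, 1],
             e.g. {"WBA": 0.7, "supervisor_report": 0.65, "written_exam": 0.55}
    A decision is only drawn once enough independent sources are available;
    otherwise further evidence is gathered first.
    """
    if len(results) < min_sources:
        return "gather more evidence"
    mean_score = sum(results.values()) / len(results)
    return "progress" if mean_score >= threshold else "remediate"


# Example: three sources averaging above the (hypothetical) threshold.
decision = progression_decision(
    {"WBA": 0.7, "supervisor_report": 0.65, "written_exam": 0.55}
)
```

A real programmatic model would of course weight sources, track trajectories over time and involve human judgement; the sketch only captures the core idea of aggregating before inferring.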

In reviewing assessment requirements for its programs, the College is investigating:

  • Introducing a wider range of workplace-based assessments (WBAs)
  • Improving the mechanisms behind training supervisor reports
  • Ensuring a reliable and valid suite of examinations is used as a part of the program, though not necessarily as hurdle requirements

Prof Symonds once more: ‘There is increasing recognition that obtaining a more rounded holistic view of trainee performance improves our ability to accurately and reliably judge whether trainees are ready to progress to the next stage of training or independent practice. We can best do this by moving away from a reliance on one single instrument or assessor, to gather feedback over time from different sources and then using this to make decisions on progression.’

Standard setting in examinations – the borderline approach

The concepts outlined above dovetail with a change in approach to examination standard setting.

There is a clear desire within the College to maintain a barrier role for examinations, but concern remains that current standard setting – which results in estimating a single point that is the pass/fail mark (in simplistic terms) – is unrealistic and unfair to some candidates.

2023 will see the phased introduction of a borderline approach to standard setting for College examinations. Clear passes will be identified at a point above borderline and clear fails below borderline. Candidates whose results fall between these boundaries will require further assessment – most likely through the use of increased WBAs and improved Training Supervisor reports as outlined above.
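The three-band logic described above can be sketched as follows. The cut scores are purely illustrative assumptions, not actual College standards:

```python
# Minimal sketch of the borderline approach: results fall into three bands
# rather than a single pass/fail point. Cut scores here are hypothetical.

def classify_result(score, clear_fail_below=55, clear_pass_above=65):
    """Place an examination result into one of three bands."""
    if score >= clear_pass_above:
        return "clear pass"
    if score < clear_fail_below:
        return "clear fail"
    # Borderline candidates are referred for further assessment,
    # e.g. additional WBAs and improved supervisor reports.
    return "borderline: further assessment required"
```

The contrast with a single pass/fail mark is that candidates near the boundary are not forced into a binary judgement on one measurement; the borderline band triggers the collection of further evidence instead.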


The vast majority of training takes place in a hospital setting, where trainees are supported by a range of consultants who provide feedback about their capabilities and competency. To reinforce the future of the College’s training programs, particularly with the move to more programmatic forms of assessment, it is vital that all those involved in training are well skilled in the provision of feedback. This will ensure trainees are supported to identify specific shortfalls in their performance as early as possible, enabling learning development plans and other mechanisms to be put in place to help trainees address them before the assessment point.

Conduct – a note on bullying, harassment and discrimination

The College released the following statement in February 2022 in response to the publication of the report of the Bullying, Harassment and Discrimination (BHD) Advisory Working Group:
‘Any form of bullying, harassment or discrimination is unacceptable, and poses a risk to employee and patient health and safety.

RANZCOG is concerned that so many of our members and trainees have reported gender bias, discrimination, bullying and harassment in the workplace. We regret that this has occurred and are sorry for any adverse impact it may have had on their lives. All our members and trainees deserve to work and train in a safe environment.

The independent Bullying, Harassment and Discrimination (BHD) Advisory Working Group report, which we have published today, commits us to action to help build respectful work environments.’

Through the curriculum review, and accompanying improved accreditation standards for training sites, RANZCOG will continue to try to ensure that zero tolerance for bullying, harassment and discrimination is ingrained in its culture, and that our future professionals are trained to demonstrate and model appropriate behaviours at all times.

