Adv Med Educ Pract. 2024; 15: 207–216.
Published online 2024 Mar 20. doi: 10.2147/AMEP.S454467
PMCID: PMC10961076
PMID: 38525099

Reflecting on Experiences of Senior Medical Students’ External Clinical Teaching Visits in General Practice Placements: A Pilot Study

Abstract

Purpose

Australian general practice training uses external clinical teaching (ECT) visits for formative work-based assessment. During ECT visits, senior general practitioners (GPs) observe trainee GPs’ consultations, provide feedback, and make recommendations to enhance performance. Although ECT visits are among the most established assessment tools in Australian GP training, there is limited evidence of their use in undergraduate teaching. This study aimed to introduce ECT visits, together with the associated assessment tool, into senior medical students’ GP placements and to evaluate them.

Methods

This study included external and internal GP supervisors and twenty-five Chinese and Australian students undertaking GP placements. The supervisors provided structured in-person feedback, and the ECT assessment tool used a standardised, validated feedback proforma to assess every component of a consultation. Students’ feedback was recorded and collected by both internal and external supervisors and then semantically analysed by the external supervisors.

Results

Feedback from twenty-five ECT visits was collected and analysed semantically. All participating students rated the ECT visits as excellent and confirmed that the assessment tool provided relevant prompts for discussions with supervisors, helping them achieve the designed learning outcomes. Chinese students rated the assessment tool as innovative from a cultural perspective and recommended the ECT visit teaching model and assessment tool to their home university, whereas Australian students suggested more ECT visits during GP placements. Time management was a limitation for both students and supervisors.

Conclusion

The ECT visit is an innovative placement teaching model and work-based assessment tool for senior medical students’ GP placements, and it was rated by students as their preferred formative assessment. The limitations of this study include the small number of students and supervisors and the lack of patient feedback; these limitations can be addressed by involving multiple GP clinics in an ongoing larger-scale study. ECT visits can be introduced into students’ GP placement curricula as a quantitative formative assessment to improve clinical reasoning, learning, and quality assurance during clinical placements.

Keywords: external clinical teaching, general practice placement, senior medical students placement learning, clinical reasoning learning, work-based assessment

Introduction

Formative assessment has been well documented to assist and direct learning in both undergraduate and postgraduate work-based teaching.1–4 In the Australian general practice registrar training scheme, external clinical teaching (ECT) visits, as a form of formative assessment, have been applied extensively in work-based training since 1996.5 During these visits, a senior general practitioner observes a registrar’s consultations, provides structured feedback, and makes recommendations to improve clinical performance and competence, as well as providing opportunities for continuing professional development. There is limited evidence evaluating the ECT visit and its assessment tool in postgraduate GP training,6 and no published evidence has documented the introduction and evaluation of this ECT visit assessment tool in undergraduate GP placement teaching with senior medical students. The aim of this pilot study was to reflect on, and evaluate, the introduction of the ECT visit and its assessment tool in GP placements with senior medical students.

Methods

Study Design Including Setting and Participants

This pilot study involved one external GP supervisor, a university GP academic, and two GP clinic supervisors who were in charge of the allocated medical students’ teaching and learning during final-year GP clinical placements at a single GP clinic over four weeks. The twenty-five medical students allocated to the GP clinic comprised 15 Chinese exchange students (2017 to 2019) and ten Australian students (2020 to 2022). One-off ECT visits were booked in the last two weeks of the four-week GP placements. Prior to each ECT visit session, the GP clinic supervisors randomly pre-booked 6–8 patients for the participating student during the 3.5-hour session. The booked patients consented to consultations led by the student in the GP’s role, under the supervision of the GP academic external supervisor in the observer’s role. The observing external supervisor provided structured in-person feedback during the student-led consultation using a standardised, validated consultation feedback proforma to assess the components of a consultation (Table 1): introduction, history, physical examination, investigation, diagnosis, management, and closing summary.7,8 The external supervisor also pre-designed a student feedback questionnaire from a medical education perspective (Table 2) and briefly interviewed the students using this questionnaire at the end of each consultation, collecting their responses for semantic analysis of the qualitative feedback. Structured feedback requires frankness and openness between students and supervisors,2 which are less likely to be achieved if the assessment is summative, with the risk of penalty. All supervisors, both external and clinical, received training in providing structured feedback using the Pendleton rules.9

Table 1

Components of the Quantitative Instrument (Assessment Tool)

Phase of Consultation (Quantitative Instrument) | Below Year-Level Expectation | Meets Year-Level Expectation | Above Year-Level Expectation
Introduction: Students’ self-introduction | Inappropriate wording and body language | Appropriate wording and body language | Appropriate wording and body language
Introduction: Empathy and rapport building | Inappropriate acknowledgement, patient unease | Appropriate acknowledgement, patient at ease | Appropriate acknowledgement, patient at ease
History Taking: Attentively listening | Inadequate and lacking | Adequate but not complete | Adequate and complete
History Taking: Nonverbal clues follow-up | Inadequate and lacking | Adequate but not complete | Adequate and complete
History Taking: Relevant question style/framework | Inadequate and lacking | Adequate but not complete | Adequate and complete
History Taking: Eye contact and avoiding jargon | Lacking and inappropriate | Adequate and appropriate | Adequate and appropriate
History Taking: Psychosocial factor consideration | Inappropriate and not relevant | Appropriate and relevant | Appropriate and relevant
History Taking: Problem list definition | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Physical Exam: Relevancy to history taking | Inappropriate and not relevant | Appropriate and relevant | Appropriate and relevant
Physical Exam: Revised problem list definition | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Investigation: GP clinic bedside investigation | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Investigation: Routine and special investigation | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Investigation: Revised problem list definition | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Diagnosis: Final problem list definition | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Diagnosis: Initial presenting complaint defined | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Management: Action for each defined problem | Inadequate and inappropriate action | Adequate and appropriate action | Adequate and appropriate action
Management: Time and resource allocation | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Management: Patient explanation and involvement | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Management: Illness prevention, health promotion | Inadequate, inappropriate and not relevant | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Closing summary: Timing and follow-up arrangement | Inadequate, inappropriate and lacking | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Closing summary: Empathy demonstration | Inadequate, inappropriate and lacking | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Closing summary: Level of confidence | Inadequate and lacking | Adequate, appropriate and relevant | Adequate, appropriate and relevant
Closing summary: Overall rating of performance | Below expected year-level standards | Meeting expected year-level standards | Above expected year-level standards
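
As a purely illustrative sketch (hypothetical, and not part of the study's actual tooling), the three-level proforma in Table 1 could be represented as a simple data structure for recording one rating per consultation component during an ECT visit; the component names below are an abbreviated subset of Table 1:

```python
# Illustrative sketch only (hypothetical; not the study's actual tooling):
# recording one three-level rating per consultation component from Table 1.
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    BELOW = "Below year-level expectation"
    MEETS = "Meets year-level expectation"
    ABOVE = "Above year-level expectation"

# Abbreviated subset of the consultation components assessed by the proforma.
COMPONENTS = [
    "Introduction: self-introduction",
    "Introduction: empathy and rapport building",
    "History taking: attentive listening",
    "History taking: problem list definition",
    "Physical exam: relevancy to history taking",
    "Investigation: revised problem list definition",
    "Diagnosis: final problem list definition",
    "Management: action for each defined problem",
    "Closing summary: overall rating of performance",
]

@dataclass
class ComponentRating:
    component: str
    rating: Rating
    comment: str = ""  # free-text behavioural descriptor / feedback

def summarise(ratings):
    """Count how many components fall in each rating band for one student."""
    counts = {r: 0 for r in Rating}
    for item in ratings:
        counts[item.rating] += 1
    return counts

# Hypothetical usage for a single student consultation.
example = [ComponentRating(c, Rating.ABOVE) for c in COMPONENTS[:3]] + \
          [ComponentRating(c, Rating.MEETS) for c in COMPONENTS[3:]]
print(summarise(example))
```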

Table 2

The Interview Questionnaire Comprises Seven Pre-Designed Questions

1. How, and in what ways, has this teaching model improved clinical reasoning (CR) learning across its five domains?
2. In what ways has this teaching model improved clinical competence?
3. How has this teaching model broadened salient learning points?
4. In what ways has this teaching model made feedback discussion with supervisors more constructive and interactive?
5. How does this teaching model affect your OSCE exam preparation?
6. Are there any limitations of this teaching model from students’, patients’ and supervisors’ perspectives?
7. What effect has this teaching model had on your readiness to become an intern?

The subsections of the proforma are further divided into behavioural descriptors, an approach highly recommended as an educational tool by the RACGP.10 The ECT visit aims to assess senior-year medical students formatively against the level of competence required for independent practice as a junior doctor in Australia. Students were told that the ECT visit was formative in nature, intended to provide teaching and feedback. The ECT visit supervisor prepared a standard qualitative report at the end of the consultation session and discussed the learning points and take-home messages with the student.

An evaluation questionnaire was distributed to each student by the external supervisor at the conclusion of their ECT visit session (Table 3) and returned separately to the clinic supervisors to reduce the possibility of response bias.

Table 3

Students’ Questionnaire Using 5-Point Likert Scale (“Pilot ECT Visit Feedback Form Evaluation: This Form is Designed to Provide Information to Medical Education Research Team and Your Responses Will Be Treated as Confidential”)

Question (tick one response per row) | 1 | 2 | 3 | 4 | 5*
Do you believe this form gave you useful feedback regarding your clinical reasoning skills? | | | | |
Do you believe this form gave you useful discussion prompts for your supervisors? | | | | |
Would you like to use this form with all your ECT visits, supervised and unsupervised consultations? | | | | |
Do you think the assessment you received on this form was relevant to your year-level learning? | | | | |
Were you able to discuss any perceived learning points with your supervisor? | | | | |
Do you think the assessment you received on this form was beneficial to your ongoing learning? | | | | |
Do you believe this form gave you a self-reflection model for your postgraduate learning? | | | | |

Notes: *1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, 5 = strongly agree

The questionnaire used a 5-point Likert scale to ask students whether the tool provided valuable feedback on their consultation skills (testing validity), whether it provided useful prompts for discussions with the supervisor (testing educational impact), and whether the form should be used for all supervised and unsupervised consultations (testing acceptability). Students were also asked whether they believed their supervisor’s feedback was relevant (testing relevancy) and whether they could discuss any learning points with both the external supervisor and the GP clinic supervisors. The final two items asked about the assessment tool’s benefit to students’ ongoing learning in the postgraduate and career training phases. For summary purposes, the responses were condensed into agree or disagree. This pilot study is an evaluation of a quality assurance modification to the existing Curtin Medical School (CMS) and Sun Yat-sen University (SYSU) clinical placement educational program, and formal ethical approval was not obtained for this study.
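
As a minimal illustrative sketch only (hypothetical, and not the study's actual analysis code), the condensation of 5-point Likert responses into agree/disagree described above could be performed as follows:

```python
# Minimal illustrative sketch (hypothetical; not the study's analysis code):
# condensing 5-point Likert responses into the agree/disagree summary
# described in the Methods.
from collections import Counter

def condense(responses):
    """Collapse 1-5 Likert ratings into disagree (1-2), neutral (3), agree (4-5)."""
    condensed = Counter()
    for rating in responses:
        if rating <= 2:
            condensed["disagree"] += 1
        elif rating == 3:
            condensed["neutral"] += 1
        else:
            condensed["agree"] += 1
    return condensed

# Hypothetical example: one questionnaire item answered by 25 students,
# 23 responding "strongly agree" (5) and 2 responding "agree" (4).
example_item = [5] * 23 + [4] * 2
print(condense(example_item))  # Counter({'agree': 25})
```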

Results

Twenty-five ECT visits were conducted between August 2017 and September 2022. Feedback from all 25 senior medical students was collected by the two supervisors using the assessment tool and the pre-designed interview questionnaire. Of the participating students, fifteen were final-year exchange students from China and ten were final-year students from Australia. All participating students rated the ECT visits as excellent and agreed that the assessment tool provided useful feedback and relevant prompts for discussions with the external supervisors. All students stated that they would like the feedback assessment tool to be used for all future ECT visits.

The outcomes for the 25 students assessed with the feedback assessment tool showed that all students met their corresponding year-level requirements, with 13 students rated above year level in the overall rating (Table 4). Almost all Australian students were rated above year level in the introduction, history taking, physical examination, investigation, and diagnosis components of the assessment tool, while in management their ratings were equal to those of the Chinese students. Although the ratings did not contribute to final marks, all students felt that their ratings were fair. The 15 Chinese students considered the assessment tool innovative from the perspective of the Chinese culture of learning and would recommend the ECT visit and assessment tool to their university’s medical education department. The ten Australian students felt that they could discuss disagreements in ratings with the external supervisors in order to gain more learning points from each consultation, and they also stated that more ECT visit sessions should be planned in their eight-week full-time placements.

All students perceived that the external supervisors had correctly identified their individual strengths and weaknesses from only a small group of patients, and that the advice on how to improve their performance, together with the ability to analyse their own strengths and weaknesses, would bring long-term benefit to their clinical careers and ongoing learning. On the negative side, no student was able to complete the consultations on schedule during the ECT visits, which is a limitation of this workplace assessment model. Brief self-reflective feedback from the external supervisors was consistent with the students’ feedback on the assessment tool, confirming its academic value in delivering the designed specific learning outcomes to the students. The two external supervisors also noted that time management was the most critical issue during ECT visits, especially when both students and supervisors were keen to achieve the medical education goals.

Table 4

Outcomes of ECT Visits Using the Feedback Assessment Tool

Phase of Consultation (Quantitative Instrument) | Students Below Year-Level Expectation | Students Meeting Year-Level Expectation | Students Above Year-Level Expectation
Introduction: Students’ self-introduction | 0 | 14 (2 Australian, 12 Chinese) | 11 (8 Australian, 3 Chinese)
Introduction: Empathy and rapport building | 0 | 14 (2 Australian, 12 Chinese) | 11 (8 Australian, 3 Chinese)
History Taking: Attentively listening | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
History Taking: Nonverbal clues follow-up | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
History Taking: Relevant question style/framework | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
History Taking: Eye contact and avoiding jargon | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
History Taking: Psychosocial factor consideration | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
History Taking: Problem list definition | 0 | 11 (3 Australian, 8 Chinese) | 14 (7 Australian, 7 Chinese)
Physical Exam: Relevancy to history taking | 0 | 11 (2 Australian, 9 Chinese) | 14 (8 Australian, 6 Chinese)
Physical Exam: Revised problem list definition | 0 | 11 (2 Australian, 9 Chinese) | 14 (8 Australian, 6 Chinese)
Investigation: GP clinic bedside investigation | 0 | 11 (1 Australian, 10 Chinese) | 14 (9 Australian, 5 Chinese)
Investigation: Routine and special investigation | 0 | 11 (1 Australian, 10 Chinese) | 14 (9 Australian, 5 Chinese)
Investigation: Revised problem list definition | 0 | 11 (1 Australian, 10 Chinese) | 14 (9 Australian, 5 Chinese)
Diagnosis: Final problem list definition | 0 | 10 (1 Australian, 9 Chinese) | 15 (9 Australian, 6 Chinese)
Diagnosis: Initial presenting complaint defined | 0 | 10 (1 Australian, 9 Chinese) | 15 (9 Australian, 6 Chinese)
Management: Action for each defined problem | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Management: Time and resource allocation | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Management: Patient explanation and involvement | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Management: Illness prevention, health promotion | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Closing summary: Timing and follow-up arrangement | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Closing summary: Empathy demonstration | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Closing summary: Level of confidence | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
Closing summary: Overall rating of performance | 0 | 12 (5 Australian, 7 Chinese) | 13 (5 Australian, 8 Chinese)
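
As a purely illustrative sketch (not the authors' analysis code), the cohort comparison reported in the Results can be derived from a subset of the Table 4 counts, given cohort sizes of 10 Australian and 15 Chinese students:

```python
# Illustrative sketch only (not the authors' analysis code): proportion of each
# cohort rated "above year-level expectation" for a subset of Table 4 components.
TABLE_4_ABOVE = {
    "Introduction: self-introduction":      {"Australian": 8, "Chinese": 3},
    "History taking: attentive listening":  {"Australian": 7, "Chinese": 7},
    "Physical exam: relevancy to history":  {"Australian": 8, "Chinese": 6},
    "Investigation: bedside investigation": {"Australian": 9, "Chinese": 5},
    "Diagnosis: final problem list":        {"Australian": 9, "Chinese": 6},
    "Management: action for each problem":  {"Australian": 5, "Chinese": 8},
    "Closing summary: overall rating":      {"Australian": 5, "Chinese": 8},
}
COHORT_SIZE = {"Australian": 10, "Chinese": 15}

for component, counts in TABLE_4_ABOVE.items():
    rates = {cohort: counts[cohort] / COHORT_SIZE[cohort] for cohort in counts}
    formatted = ", ".join(f"{cohort}: {rate:.0%}" for cohort, rate in rates.items())
    print(f"{component} rated above year level -> {formatted}")
```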

The results of the questionnaire using a 5-point Likert scale showed that all seven questions were rated as “agree” or “strongly agree” (Table 5). These results indicate that the assessment form is valid for clinical skills learning, useful for prompting discussions with supervisors to gain learning points, and widely accepted by students for ongoing learning into postgraduate training.

Table 5

Results of Students’ Questionnaire Using 5-Point Likert Scale (“Pilot ECT Visit Feedback Form Evaluation: This Form is Designed to Provide Information to Medical Educational Research Team and Your Responses Will Be Treated as Confidential”)

Question (numbers of students at each scale point) | 1 | 2 | 3 | 4 | 5*
Do you believe this form gave you useful feedback regarding your clinical reasoning skills? | 0 | 0 | 0 | 2 | 23
Do you believe this form gave you useful discussion prompts for your supervisors? | 0 | 0 | 0 | 1 | 24
Would you like to use this form with all your ECT visits, supervised and unsupervised consultations? | 0 | 0 | 0 | 3 | 22
Do you think the assessment you received on this form was relevant to your year-level learning? | 0 | 0 | 0 | 2 | 23
Were you able to discuss any perceived learning points with your supervisor? | 0 | 0 | 0 | 1 | 24
Do you think the assessment you received on this form was beneficial to your ongoing learning? | 0 | 0 | 0 | 1 | 24
Do you believe this form gave you a self-reflection model for your postgraduate learning? | 0 | 0 | 0 | 3 | 22

Notes: *1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, 5 = strongly agree

Students’ feedback from the post-ECT-visit interviews with the pre-designed questionnaire showed very positive responses, as illustrated by the following quotes:

Clinical Reasoning Learning

Student one: In addition to traditional observer-style learning, I am comfortable to form my own clinical reasoning instead of attempting to remember someone else’s, thus increasing knowledge and clinical skill retention. A focused history and examination in consultant’s chair teach me the concept of “choosing wisely” for investigation. The teaching “On the Go” with supervised student-led consultation consolidate the student’s clinical reasoning learning of creating a comprehensive problem list in every component of clinical assessment, specifically to broaden the focus of differential diagnoses into a structured framework of “common” and “not to be missed”. The ECT visit allowed multi-specialty input into the clinical decision-making in some cases with allied health having significant input in the shared decision-making.

Clinical Competency

Student two: The teaching visit reinforces what we know and highlights gaps in our knowledge. The dedicated supervisor and patients made our clinical learning more relevant and enabled the development of our clinical competency.

Salient Learning Points

Student three: Playing the consultant’s role is an excellent way for students to practice their communication skills, including breaking bad news, motivational interviewing, and navigating a wide range of emotions from patients.

Feedback Discussion with a Supervisor

Student four: The interaction allowed reflection on the positives and negatives of the swapped role consultation, highlighting the most useful aspect where the supervisor was able to critique.

Impact on OSCE Exam Preparation

Student five: Each consultation was essentially like performing an OSCE station with immediate examiner feedback to prolong clinical skill retention.

Intern Readiness

Student six: The visit has prepared me well for becoming an intern – especially interacting with patients more confidently in clinical assessment and management as this will often be driven by consultant or registrar.

Discussion

Formative assessment allows learners to receive structured feedback about their present level of knowledge and skills and reflect on the best ways to improve their weaknesses.1–4 All students in this study embraced the use of this quantitative formative assessment tool during their ECT visits. This tool is based on the student-led consultation model in a consultant’s chair under supervision, and complements other formative assessments used in students’ GP placement. The findings of this study are consistent with other studies demonstrating learners’ positive evaluation of quantitative formative assessment at the undergraduate or postgraduate level of training in general practice.3,11,12

The unique contribution of this study to GP clinical placement teaching for senior medical students lies in the fact that ECT visits deliver consistent learning experiences in a realistic setting at a metropolitan GP clinic. Furthermore, students’ feedback from the post-ECT-visit interviews acknowledged the academic benefits in terms of improved clinical reasoning learning, enhanced clinical competence, OSCE examination preparation, and intern readiness. The first limitation of this study is that our supervising team was unable to link the ECT supervisors’ visit evaluation tool with the other work-based assessments conducted by the practice supervisors; therefore, we could not assess whether the ECT visits and the assessment tool had any impact on the students’ academic scores and rankings. We were also unable to compare the ECT assessment tool with the routine standard qualitative instrument used concurrently.

The second limitation is that the assessment tool lacks descriptive anchors or rubrics describing the criteria for each point on the scale. Both external supervisors tended not to use the lower end of the scale and suggested that a broader range from 1 to 7 (rather than 1–5) would be easier to apply. The lack of descriptive anchors or rubrics makes the tool unsuitable for summative assessment. Both the practice and external supervisors stated that ECT visits require extra time to integrate teaching into assessments and clinical services so that the teaching model benefits both students and patients.

The third limitation is the small sample size of both students and supervisors. The participating students came from two medical schools with different curricula. The fifteen Chinese students had used English for the first four years of their studies but used Chinese during their clinical placements in the final two years in China. The ratings showed that the Australian students’ consultation performance was slightly better in the introduction, physical examination, investigation, and diagnosis components, while the Chinese students’ performance equalled that of the Australian students in the management components and in the overall rating. The different curricula and languages may explain these subtle differences between the Australian and Chinese students. With respect to the ECT visit supervisors, the two supervisors were well trained in using the ECT visit assessment tool for GP registrars; however, they initially had to familiarise themselves with the version modified for undergraduate clinical placement assessment. As the external supervisors completed more ECT visits using the tool, they concluded that the specific learning objectives and clinical reasoning learning were effectively delivered to the students through the assessment tool, although time management remained an issue for efficient clinical service delivery to the patients seen during the ECT visits. A further limitation of this study concerns the formal training of ECT supervisors in using the assessment tool during undergraduate ECT visits; in a previous study,13 experienced ECT supervisors underwent formal training prior to their ECT visits.

Conclusion and Future Perspectives

This study demonstrated that a quantitative formative assessment tool used in senior medical students’ general practice placements was well received by both students and supervisors in an Australian general practice clinic. Further in-depth research is required to establish rubrics for each criterion to improve reliability before applying this tool to summative work-based assessments. A study with a larger sample size, involving more students and ECT supervisors across multiple general practice clinics, with feedback from students, supervisors, and patients, will be required. The ultimate goal is to evaluate the ECT visit teaching model and the associated assessment tool for potential implementation in the formal undergraduate GP placement curriculum as a summative assessment.

Acknowledgments

Shaoting Feng, Daya Yang and Kunsong Zhang are co-first authors for this study. We acknowledge the staff at both Curtin Medical School, Curtin University, Australia and the First Affiliated Hospital, Sun Yat-Sen University, China.

Funding Statement

No external funding was received for this study.

Data Sharing Statement

Data supporting the findings of this study are available from the corresponding author, Dr. Xu, upon reasonable request. The data for this study, which are not publicly available, were securely stored in encrypted, password-protected files on the research drive at Curtin University.

Ethics Approval and Consent to Participate

The research was carried out as a quality assurance process in accordance with the relevant guidelines and regulations of both Curtin University and the First Affiliated Hospital of Sun Yat-sen University. Ethics approval was not required for the participating students, as there is a formal legal agreement between Curtin University and the participating GP clinics, and a memorandum of understanding (MOU) between Curtin University and Sun Yat-sen University.

Author Contributions

All authors made a significant contribution to the reported manuscript in the conception, study design, execution, acquisition of data, analysis and interpretation of data. Every author took part in drafting, revising or critically reviewing the manuscript with final approval of the version to be published in this journal. All authors agree to be accountable for all aspects of the manuscript.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Hays R, Wellard R. In-training assessment in postgraduate training for general practice. Med Educ. 1998;32(5):507–513. doi:10.1046/j.1365-2923.1998.00231.x
2. Holmwood C. Direct observation. A primer for supervisors of doctors in training. Aust Fam Phys. 1998;27(1–2):48–51.
3. Rolfe I, McPherson J. Formative assessment: how am I doing? Lancet. 1995;345(8953):837–839. doi:10.1016/S0140-6736(95)92968-1
4. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945–949. doi:10.1016/S0140-6736(00)04221-5
5. Black F, Faux S, editors. ECT Manual: Becoming an External Clinical Teacher. 1st ed. Melbourne: The Royal Australian College of General Practitioners Training Program; 1996.
6. Fraser J. Registrar clinical teaching visits: evaluation of an assessment tool. Aust Fam Phys. 2007;36(12):1070–1072.
7. Hays RB. Assessment of general practice consultations: content validity of a rating scale. Med Educ. 1990;24(2):110–116. doi:10.1111/j.1365-2923.1990.tb02508.x
8. Hays RB, Jones BF, Adkins PB, McKain PJ. Analysis of videotaped consultations to certify competence. Med J Aust. 1990;152(11):609–611. doi:10.5694/j.1326-5377.1990.tb125395.x
9. Pendleton D, Schofield T, Tate P, Havelock P. The Consultation: An Approach to Learning and Teaching. 1st ed. Oxford: Oxford University Press; 1984.
10. The Royal Australian College of General Practitioners. Companion to the Training Program Curriculum. Melbourne: The RACGP; 1999.
11. McKinley RK, Fraser RC, van der Vleuten C, Hastings AM. Formative assessment of the consultation performance of medical students in the setting of general practice using a modified version of the Leicester assessment package. Med Educ. 2000;34(7):573–579. doi:10.1046/j.1365-2923.2000.00490.x
12. Campbell LM, Murray TS. The effects of the introduction of a system of mandatory formative assessment for general practice trainees. Med Educ. 1996;30(1):60–64. doi:10.1111/j.1365-2923.1996.tb00719.x
13. Hastings A, Cameron D, Preston-Whyte E. Teaching and assessing consultation skills: an evaluation of a South African workshop on using the Leicester assessment package. SA Fam Pract. 2006;48(3):14a–14d. doi:10.1080/20786204.2006.10873349
