Point-of-Care Ultrasound Training for Family Medicine Residents: Examining the outcomes and feasibility of a pilot ultrasound curriculum

Gordon Yao, MD (ipr) BSc*; Taeyoung Peter Hong, MD CFPC*; Philip Lee, MD CFPC (EM); Joseph Newbigging, MD CCFP (EM); Brent Wolfrom, MD CCFP

Queen’s University, Department of Medicine, Kingston, ON, Canada

*The first two authors should be considered as co-first authors for this paper.

POCUS Journal 2019; 4(2):22-26.


It is estimated that 50% of deaths due to abdominal aortic aneurysms (AAA) could be prevented by a national screening program [1, 2, 3]. Thanks to technological advancements and cost reductions, point-of-care ultrasound (POCUS) is becoming more prevalent in family medicine (FM) [4, 5]. Despite its potential utility, of 224 FM residency programs surveyed, only 21% had developed a POCUS curriculum [6]. The main barriers to establishing a POCUS curriculum identified in Canadian FM residency programs were a lack of trained faculty, a lack of adequate equipment, and a lack of time in the curriculum [6].

Our study tested a pilot POCUS curriculum for first-year FM residents, developed to improve competency in screening for AAA using POCUS. To address the imbalance between many learners and few trained faculty, we incorporated a "train-the-trainer" model. The first pair of residents was trained by Canadian POCUS (CPOCUS)-certified faculty members [7] during a week of evening clinics. These two residents subsequently trained the next pair of residents, rolling forward over four weeks until eight residents were trained. This minimized direct faculty teaching time.

Trainees' confidence, knowledge, and clinical competence were measured at several time points to assess the efficacy of our curriculum.


Approval was obtained from the Queen’s University Research Ethics Board. Eight FM residents were recruited on a voluntary basis and surveyed to obtain demographic information and prior exposure to POCUS (Table 1).

Table 1. Prior experience of residents with US.

Resident | Formal US training prior to study | Number of AAA scans performed prior to study
1 | Basic/introductory ultrasound course (e.g. EDE 1) | –
2 | Obstetrical US training (biophysical profile, estimated fetal weight) | –
3 | Ultrasound training at Queen's Family Clinic; ultrasound workshop as a medical student | –
4–8 | None | 0

One week prior to hands-on training, trainees were given access to a 30-minute video teaching basic knobology and physics, POCUS techniques, and clinical integration related to AAA. Subsequently, they were quizzed on this information to assess knowledge and confidence. Confidence was rated on a 5-point Likert scale (1 = "Not confident at all", 5 = "Fully confident").

The first hands-on session was taught by four CPOCUS-certified faculty [7] and senior Emergency Medicine (EM) residents. Trainees practiced on fifteen medical student volunteers and one standardized patient (SP) with an AAA. All trainees obtained sixteen supervised scans during this session.

Further training took place in an after-hours walk-in clinic (AHC), where patients were invited to volunteer to be scanned. Trainees were paired; Group 1 (trainees A and B) was trained by a POCUS-certified instructor for 6 hours. Trainees A and B then trained Group 2 (trainees C and D) for 3 hours each. Group 2 then trained Group 3, and so forth. Using this rolling-forward structure, each trainee received 6 hours of training and provided 3 hours of teaching.
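
This rolling-forward schedule can be sketched in a few lines. The group labels and bookkeeping below are illustrative, not part of the study protocol:

```python
# Sketch of the rolling-forward "train-the-trainer" schedule described above
# (trainee labels A-H are illustrative). Each pair receives 6 hours of
# hands-on training, then delivers 3 hours each to the next pair.
groups = ["AB", "CD", "EF", "GH"]

trained_hours = {t: 6 for g in groups for t in g}       # hours received
taught_hours = {t: 3 for g in groups[:-1] for t in g}   # hours delivered
trainer_of = {t: ("faculty" if i == 0 else groups[i - 1])
              for i, g in enumerate(groups) for t in g}

# Direct faculty contact in the AHC is limited to the first pair's
# 6-hour block; every subsequent pair is taught peer-to-peer.
faculty_contact_hours = trained_hours["A"]
```

Every trainee receives the same 6 hours of hands-on instruction, while faculty contact stays constant regardless of how many additional pairs are appended to the chain.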

Trainees were assessed for competency using an objective structured clinical examination (OSCE) tool targeting patient preparation, image acquisition/optimization, image interpretation, and clinical integration. Knowledge and confidence were assessed in the same domains using an electronic quiz and confidence survey. The same quiz was administered multiple times, but the answers were never provided.

For the OSCEs, two SPs (one with an AAA) were recruited. CPOCUS-certified examiners blinded to the AHC training sessions scored trainees using an OSCE rubric created by the Queen's Department of Emergency Medicine (Table 2).

Table 2. OSCE Rubric with description of scores.

Domain Criteria assessed
Patient Preparation
  • Ergonomics (bed height, arm reach, etc.)
  • Patient position
  • Probe Selection
  • Gel application
  • Draping
  • Initial settings (depth, preset)
  • Patient engagement         
Image Acquisition
  • Starting location
  • Hand and probe position
  • Identify appropriate landmarks
  • Aorta discriminators
  • Timing and economy of movement
  • Measurement
Image Optimization
  • Centers area of interest
  • Appropriate gain
  • Frequency adjustment
  • Focal zone
  • Troubleshooting (gas, umbilicus, fat, artifacts)
Clinical Integration
  • Interpretation (indeterminate, AAA, normal)
  • Understands limitations of US scan
  • Management priorities

Score Description

1 – Inferior: delayed or incomplete performance of all criteria. Entrustment decision: observation only, no execution.

2 – Novice: delayed or incomplete performance of many criteria. Entrustment decision: direct supervision required.

3 – Competent: delayed or incomplete performance of some criteria. Entrustment decision: indirect supervision required.

4 – Advanced: competent performance of most criteria. Entrustment decision: independent performance with remote supervision.

5 – Superior: efficient and rapid performance of all criteria. Entrustment decision: supervision of trainees.

The quiz, confidence survey, and OSCE were administered once all residents completed AHC training, and again four months later to assess retention. The quiz and confidence survey were also administered after trainees had watched the training video but before hands-on training. Formative group feedback was provided after the first OSCE; however, answers were never provided for the quiz or for questions posed during OSCEs.

For quiz and confidence scores, single-factor ANOVA was applied to screen for significant differences among scores obtained after watching the instructional video and at OSCE 1 and OSCE 2. If F > Fcrit, paired t-tests were applied to identify where the significant differences lay (i.e. between Instructional video and OSCE 1, between Instructional video and OSCE 2, and/or between OSCE 1 and OSCE 2). Paired t-tests were also used to assess whether significant differences (p < 0.05) were present in competency scores between OSCE 1 and OSCE 2, and to compare OSCE scores between Group 1 and Group 4.
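
As a rough sketch of this analysis pipeline (the scores below are hypothetical, not the study's data, and the critical values are standard table values at alpha = 0.05):

```python
# Minimal sketch of the ANOVA-then-paired-t screening described above,
# using hypothetical quiz scores for 8 trainees at three time points.
import math

def anova_f(groups):
    """Single-factor ANOVA F statistic across the administrations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def paired_t(a, b):
    """Paired t statistic for the same trainees at two time points."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
    return mean / (sd / math.sqrt(n))

video = [80, 85, 90, 88, 82, 91, 86, 88]   # after instructional video
osce1 = [88, 90, 93, 91, 89, 92, 90, 93]   # at OSCE 1
osce2 = [86, 89, 91, 90, 88, 91, 89, 92]   # at OSCE 2

# Screen with ANOVA; Fcrit(2, 21) is about 3.47 at alpha = 0.05.
f_stat = anova_f([video, osce1, osce2])

# If F > Fcrit, pairwise paired t-tests locate the difference;
# |t| > 2.365 is significant at p < 0.05 with df = 7.
t_video_vs_osce1 = paired_t(osce1, video)
```

With these hypothetical scores the ANOVA screen fires (F is about 6.0), and the follow-up paired t-test localizes the improvement to the video-to-OSCE-1 interval.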
A summary of the study design is provided in Figure 1.

Figure 1. Pilot curriculum design with flow diagram depicting AHC rolling-forward structure of training and teaching.


Knowledge: Quiz scores were 86.3%, 90.8%, and 89.0% at Instructional video (i.e. after watching the video with no hands-on training), OSCE 1, and OSCE 2, respectively. There were no significant differences between scores (Figure 2).

Figure 2. Quiz scores at different times of administration. “Instructional video” indicates quiz administration after watching the instructional video for the first time.

Confidence: Between Instructional video and OSCE 1, all domains showed a significant increase in confidence. Overall confidence after watching the instructional video averaged 1.75, increasing to 4.50 by OSCE 1 and remaining high at 4.33 by OSCE 2. There was no significant difference in confidence in any domain between the two OSCEs (Figure 3).

Figure 3. Confidence scores in each domain of clinical competency. Responses were gauged on a Likert scale where 1 = “Not confident at all” and 5 = “Fully confident.” * p < 0.05.

Competency: There was no significant difference in OSCE scores between the two OSCEs, suggesting competency was retained for at least four months after training. In both OSCEs, all trainees except one received an entrustment decision score of 4, indicating independent performance with remote supervision. One trainee in each OSCE scored a 3, indicating indirect supervision was required; however, this was not the same trainee in both OSCEs, and each was rated a 3 by only one of the two POCUS evaluators. All residents reported performing POCUS less than once a month between OSCEs 1 and 2. The average scores across all domains were 3.75 and 3.70 for OSCE 1 and OSCE 2, respectively (Figure 4).

Figure 4. OSCE scores for each domain of clinical competency. Scores were gauged on a Likert scale where 1 = “Observation only, no execution”, 2 = “Direct supervision required”, 3 = “Indirect supervision required”, 4 = “Independent performance with remote supervision”, 5 = “Supervision of trainees.”

Effect of rolling-forward AHC training: OSCE scores for Group 1 (trainees A and B) were not significantly different from OSCE scores for Group 4 (trainees G and H) in any competency domain (Figure 5).

Figure 5. OSCE scores for each domain of clinical competency between Group 1 (Trainee A and B) and Group 4 (Trainee G and H) in rolling-forward AHC training.


As part of the curriculum at Queen's University, FM residents participate in short horizontal experiences, each an 8-16 hour commitment. The pilot POCUS curriculum required the same time commitment (12.5 hours in total) and produced confidence, knowledge, and clinical competence that were retained four months post-training. Our study had a small sample size, but most of our trainees had no prior US training, which is typical of first-year FM residents.

The rolling-forward "train-the-trainer" curriculum encouraged professionalism and minimized the demand on faculty to provide hands-on training. There were no apparent differences in confidence, knowledge, or competency between trainees taught by faculty and trainees taught by their peers.

During both OSCEs, a CPOCUS-certified examiner directly observed residents measure the aorta and assessed them using an established rubric. One limitation, however, is that the quantitative accuracy of the AAA measurements performed by learners during training was not directly assessed; this could have been achieved by comparing each trainee's measurements with a CPOCUS-certified trainer's findings on the same abdominal aorta. Accuracy matters because: 1) false negative measurements represent missed opportunities to prevent a life-threatening outcome (AAA-related rupture, emergency repair, and mortality), and 2) false positive measurements generate unnecessary confirmatory imaging, follow-up care including emergency transfer and specialist referrals, and undue patient stress. Blois (2012) did demonstrate that a family physician could achieve less than 0.2 mm discrepancy from official measurements, but the physician in that study received considerably more training (i.e. 50 supervised scans) [1].
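
Such an accuracy audit could be operationalized simply. A hypothetical sketch (all measurements below are illustrative; 3.0 cm is the conventional diagnostic threshold for AAA):

```python
# Hypothetical accuracy audit: compare a trainee's aortic diameter
# measurements (cm) against a CPOCUS-certified trainer's measurements
# of the same abdominal aortas. All values below are illustrative.
trainee_cm = [2.1, 3.4, 1.9, 5.2]
trainer_cm = [2.0, 3.5, 1.9, 5.0]

# Mean absolute discrepancy between trainee and reference measurements.
mean_abs_error = sum(abs(a - b) for a, b in zip(trainee_cm, trainer_cm)) / len(trainee_cm)

# Count disagreements at the conventional 3.0 cm AAA threshold, i.e.
# scans the trainee would classify differently than the trainer
# (these drive the false positives and false negatives discussed above).
misclassified = sum((a >= 3.0) != (b >= 3.0)
                    for a, b in zip(trainee_cm, trainer_cm))
```

Tracking both the raw discrepancy and threshold disagreements would distinguish small measurement noise from errors that change the screening decision.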

In 2016, Wilkinson et al. also studied the effectiveness of a condensed POCUS curriculum, in their case for teaching cardiac scans [8]. Trainees were required to diagnose several pathologies, such as severe left ventricular dysfunction and ventricular septal defects. Of note, the curriculum was short, providing only 4 hours of hands-on or simulation-based training taught by senior residents or ultrasound technicians [8]. That study raised a legitimate concern that condensed curricula may have the undesired effect of increasing trainee confidence without the necessary increase in competency: despite a significant increase in trainee confidence, the false positive rate increased after training [8].

In contrast to these findings, our trainees exhibited concomitant increases in confidence, knowledge, and clinical competency. This was likely because our curriculum taught a single application, was longer in duration (12.5 hours), and utilized several teaching modalities, including a didactic online lecture, hands-on practice taught by POCUS-trained physicians, and peer-to-peer training. When incorporating POCUS training into an already overflowing medical curriculum, it is vital that the curricula developed not only increase trainee confidence but also ensure clinical competency and improve patient care.

For FM residents interested in incorporating POCUS into their future practice, the efficacy of this teaching curriculum when applied to an entire FM residency program should be studied. The supplemental curriculum we have developed has the potential to teach a life-saving scan indicated for many patients seen in a typical FM practice [1, 2].


1. Blois B. Office-based ultrasound screening for abdominal aortic aneu­rysm. Can Fam Physician 2012; 58(3): e172-8.

2. Bailey RP, Ault M, Greengold NL, et al. Ultrasonography performed by primary care residents for abdominal aortic aneurysm screening. J Gen Intern Med 2001; 16(12):845-849.

3. Abdominal Aortic Aneurysm [Internet]. Canadian Task Force 2017 [cited 2018 Oct 20]. Available from: https://canadiantaskforce.ca/guidelines/published-guidelines/abdominal-aortic-aneurysm/.

4. Moore CL, Copel JA. Point-of-care Ultrasonography. N Engl J Med 2011; 364(8):749-757.

5. Wordsworth S, Scott A. Ultrasound scanning by general practitioners: is it worthwhile? J Public Health Med 2002; 24(2): 88-94.

6. Canadian national survey of point-of-care ultrasound training in family medicine residency programs. Can Fam Physician. 2018; 64(10): e462–e467.

7. Canadian Point of Care Ultrasound Society [Internet]. Available from www.cpocus.ca.

8. Wilkinson JS, Barake W, Smith C, Thakrar A, Johri A. Limitations of Condensed Teaching Strategies to Develop Hand-Held Cardiac Ultrasonography Skills in Internal Medicine Residents. Can J Cardiol 2016; 32(8):1034-1037.
