A systematic review and meta-analysis on the impact of proficiency-based progression training on trainees’ performance outcomes in medicine.

Research output: Contribution to journal · Article · peer-review


Abstract

Importance: To date, the impact of the proficiency-based progression (PBP) methodology for learning clinical skills, compared with the traditional approach to training, has not been systematically reviewed and analyzed.

Objective: To systematically analyze all published prospective, randomized, and blinded clinical studies of the PBP training methodology.

Data Sources: Comprehensive search of the PubMed, Cochrane Library (CENTRAL), EMBASE, MEDLINE, and Scopus databases from their inception to 1 March 2020. References identified from the bibliographies of key reviews on training were also screened.

Study Selection: Inclusion criteria were studies using objective performance metrics and the PBP methodology.

Data Extraction and Synthesis: Two independent reviewers extracted the data. The Medical Education Research Study Quality Instrument (MERSQI) was used to assess the methodological quality of the included studies. The risk of bias of all studies was assessed by three independent investigators, and their inter-rater reliability (IRR) was calculated.

Main Outcomes and Measures: The primary outcome was the number of procedural errors, comparing PBP and non-PBP training pathways. Secondary outcomes were the number of procedural steps completed and the time to complete the task/procedure. Results were pooled using the bias-corrected standardized mean difference (SMD; Hedges' g) and the ratio of means (ROM, also known as the response ratio, which requires outcomes on a ratio scale with a true zero; error counts, step counts, and time all qualify), each analyzed under random-effects models. The SMD and ROM are effect-size measures; the random-effects model is the method used to pool them across studies, allowing the true effect size to vary from study to study, unlike a fixed-effect model, which assumes all studies share a common effect size.

Results: From an initial pool of 468 studies, 12 studies with a total of 239 participants were included. The mean MERSQI score of the included studies was 15.5, indicating high methodological quality (the maximum MERSQI score is 18). Compared with standard simulation-based training, PBP training reduced the number of errors (SMD -2.68, 95% CI -3.52 to -1.83; p < 0.001) and procedural time (SMD -0.93, 95% CI -1.55 to -0.30; p = 0.003), while the number of steps performed increased (SMD 3.46, 95% CI 2.13 to 4.79; p < 0.001). On the ROM comparison, PBP was estimated to reduce the mean number of errors by 58% and procedural time by 15%, while increasing the number of steps taken by 43% on average, relative to standard training. As a sensitivity analysis, a series of subgroup analyses restricted to intraoperative performance assessments was conducted; these supported the above results. (Maybe something more needs to be said about the subgroup analyses.)

Conclusions and Relevance: Our systematic review and meta-analysis confirms that PBP training improves trainees' performance compared with standard simulation-based training, decreasing procedural errors and procedural time while increasing the number of correct steps taken.
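The pooling approach described above can be made concrete with a small, purely illustrative sketch. This is not the authors' analysis code, the study numbers below are hypothetical, and the DerSimonian-Laird estimator is assumed only because it is a common choice for random-effects pooling; the review does not state which estimator was used. Each study contributes a bias-corrected SMD (Hedges' g) with its variance, and the random-effects model then pools these while allowing the true effect to differ across studies.

    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Bias-corrected SMD (Hedges' g) and its approximate variance."""
        sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp                       # Cohen's d
        j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample bias correction
        g = j * d
        var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
        return g, var_g

    def pool_random_effects(effects, variances):
        """DerSimonian-Laird pooling: the true effect may vary across studies."""
        k = len(effects)
        w = [1 / v for v in variances]                         # fixed-effect weights
        fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
        q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
        w_star = [1 / (v + tau2) for v in variances]           # random-effects weights
        pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
        se = math.sqrt(1 / sum(w_star))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Hypothetical per-study error counts: (PBP mean, SD, n, control mean, SD, n)
    studies = [(4.1, 2.0, 10, 9.8, 3.1, 10),
               (3.5, 1.8, 12, 8.9, 2.7, 12),
               (5.0, 2.4, 9, 10.2, 3.5, 9)]
    gs, vs = zip(*(hedges_g(*s) for s in studies))
    print(pool_random_effects(list(gs), list(vs)))   # pooled SMD with 95% CI

A negative pooled SMD here, as in the review's error outcome, indicates fewer errors in the PBP group.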
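Similarly, the percentage changes quoted in the Results follow directly from the pooled ratio of means: percentage change = (ROM - 1) x 100. In the short sketch below the ROM values are back-calculated from the reported percentages purely for illustration and are not taken from the study data.

    # Illustrative only: mapping a pooled ratio of means (ROM) to percentage change.
    # ROM values are back-calculated from the percentages reported in the Results.
    for outcome, rom in [("errors", 0.42), ("procedural time", 0.85), ("steps completed", 1.43)]:
        pct = (rom - 1.0) * 100
        direction = "increase" if pct > 0 else "reduction"
        print(f"{outcome}: ROM {rom:.2f} -> {abs(pct):.0f}% {direction}")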
Original language: English
Journal: Annals of Surgery
Publication status: Accepted/In press - 13 Nov 2020

Keywords

  • Surgical training
  • proficiency-based progression training
  • proficiency-based metrics
  • objective performance metrics
  • procedural errors
  • procedural steps
  • technology enhanced training
  • simulation-based training

