Endoscopy 2019; 51(11): 1017-1026
DOI: 10.1055/a-0991-0044
Original article
© Georg Thieme Verlag KG Stuttgart · New York

ERCP assessment tool: evidence of validity and competency development during training

Keith Siau 1,2,3, Paul Dunckley 1,4, Mark Feeney 1,5, Gavin Johnson 1,6, on behalf of the Joint Advisory Group on Gastrointestinal Endoscopy

Author affiliations:
1   Joint Advisory Group on Gastrointestinal Endoscopy, Royal College of Physicians, London, United Kingdom
2   College of Medical and Dental Sciences, University of Birmingham, Birmingham, United Kingdom
3   Endoscopy Unit, Dudley Group NHS Foundation Trust, Dudley, United Kingdom
4   Department of Gastroenterology, Gloucestershire Hospitals NHS Foundation Trust, Gloucester, United Kingdom
5   Department of Gastroenterology, Torbay and South Devon NHS Foundation Trust, Torquay, United Kingdom
6   Department of Gastroenterology, University College London Hospitals NHS Foundation Trust, London, United Kingdom

Publication History

submitted 14 March 2019

accepted after revision 8 July 2019

Publication Date:
10 September 2019 (online)

Abstract

Background The endoscopic retrograde cholangiopancreatography (ERCP) direct observation of procedural skills (DOPS) is a 27-item competency assessment tool that was developed to support UK ERCP training. We evaluated the validity of the ERCP DOPS and competency development during training.

Methods This prospective study analyzed ERCP DOPS performed in the UK between July 2016 and October 2018. Reliability was measured using Cronbach’s alpha, and DOPS scores were benchmarked using the contrasting groups method. The percentage of competent scores was averaged for each item, domain, and overall rating, and stratified by lifetime procedure count to evaluate learning curves. Multivariable analyses were performed to identify predictors of DOPS competence.

Results In total, 818 DOPS (109 trainees, 80 UK centers) were analyzed. Overall Cronbach’s alpha was 0.961. Attaining competency in 87 % of assessed DOPS items provided the optimal competency benchmark. This was achieved in the domain sequence of pre-procedure, post-procedure management, endoscopic non-technical skills, cannulation & imaging, and execution of selected therapy, and across all items after 200 – 249 procedures (89 %). After 300 procedures, the benchmark was reached for selective cannulation (89 %), but not for stenting (plastic 73 %; metal 70 %), sphincterotomy (80 %), or sphincteroplasty (56 %). On multivariable analysis, lifetime procedure count (P = 0.002), easier case difficulty (P < 0.001), trainee grade (P = 0.03), and higher lifetime DOPS count (P = 0.01) were predictors of DOPS competence.

Conclusion This study provides novel validity, reliability, and learning curve data for ERCP DOPS. Trainees should have a minimum of 300 hands-on ERCP procedures before undertaking summative assessment for independent practice.

Supplementary material: Figs. 1s, 2s

 