CC BY-NC-ND 4.0 · Indian J Plast Surg 2019; 52(02): 216-221
DOI: 10.1055/s-0039-1695658
Original Article
Association of Plastic Surgeons of India

Objective Assessment of Microsurgery Competency—In Search of a Validated Tool

Sheeja Rajan 1, 2, Ranjith Sathyan 1, L. S. Sreelesh 1, Anu Anto Kallerey 3, Aarathy Antharjanam 4, Raj Sumitha 1, Jinchu Sundar 1, Ronnie Johnson John 1, S. Soumya 1

1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
2   MCI Regional Centre for Medical Education Technology, Kozhikode, Kerala, India
3   Taluk Hospital Nadapuram, Kozhikode, Kerala, India
4   Taluk Hospital Pazhayangadi, Kannur, Kerala, India

Publication History

Publication Date:
16 September 2019 (online)

Abstract

Microsurgical skill acquisition is an integral component of plastic surgery training. Current microsurgical training is based on the subjective Halstedian apprenticeship model. An ideal microsurgery assessment tool should deconstruct the subskills of microsurgery and assess each of them objectively and reliably. To analyze the feasibility, reliability, and validity of microsurgery skill assessment, we chose a video-based objective structured assessment of technical skill. Two blinded experts evaluated 40 videos of six residents performing microsurgical anastomoses during arteriovenous fistula surgery, using the generic Reznick global rating score (GRS) and the University of Western Ontario microsurgical skills acquisition/assessment (UWOMSA) instrument as checklists. Correlation coefficients of 0.75 to 0.80 (UWOMSA) and 0.71 to 0.77 (GRS) for interrater and intrarater reliability showed that the assessment tools were reliable, and convergent validity of UWOMSA against the prevalidated GRS showed good agreement. The mean improvement of scores with years of residency was measured with analysis of variance. Both UWOMSA (p = 0.034) and GRS (p = 0.037) demonstrated significant improvement in scores from postgraduate year 1 (PGY1) to PGY2 and a less marked improvement from PGY2 to PGY3. We conclude that objective assessment of microsurgical skills in an actual clinical setting is feasible. Tools like UWOMSA are valid and reliable for microsurgery assessment and provide feedback to chart the progression of learning. Acceptance and validation of such objective assessments will help improve training and bring uniformity to microsurgery education.
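The statistical approach described in the abstract (correlation coefficients for interrater reliability, one-way ANOVA for score change across residency years) can be sketched as follows. This is a minimal illustration only: the score arrays below are hypothetical values invented for the example, not the study's data, and the study's exact reliability statistic is not specified beyond "correlation coefficient" (Pearson's r is assumed here).

```python
# Illustrative sketch of the abstract's analysis steps, using stdlib only.
# All score values below are HYPOTHETICAL, not data from the study.
from statistics import mean


def pearson_r(x, y):
    """Pearson correlation between two raters' scores for the same videos
    (assumed here as the interrater reliability statistic)."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)


def anova_f(*groups):
    """One-way ANOVA F statistic for score groups (e.g., PGY1/PGY2/PGY3)."""
    all_scores = [s for g in groups for s in g]
    grand = mean(all_scores)
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((s - mean(g)) ** 2 for g in groups for s in g)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)


# Two blinded raters scoring the same six videos (hypothetical checklist totals)
rater1 = [12, 15, 18, 14, 20, 17]
rater2 = [13, 14, 19, 15, 21, 16]
print(round(pearson_r(rater1, rater2), 2))  # interrater reliability

# Checklist scores grouped by residency year (hypothetical values)
pgy1, pgy2, pgy3 = [12, 13, 14], [17, 18, 16], [19, 20, 18]
print(round(anova_f(pgy1, pgy2, pgy3), 2))  # F statistic across PGY groups
```

In practice the F statistic would be converted to a p-value against the F distribution with the corresponding degrees of freedom (e.g., via `scipy.stats.f_oneway`), which is how the reported p-values of 0.034 and 0.037 would be obtained.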
