CC BY-NC-ND 4.0 · Arch Plast Surg 2022; 49(05): 668-675
DOI: 10.1055/s-0042-1756349
Original Article

Watch One, Do One? A Systematic Review and Educational Analysis of YouTube Microsurgery Videos, and a Proposal for a Quality Assurance Checklist

Fateh Ahmad¹ · Claudio Guerra³

Author affiliations:

1   St. Andrew's Centre for Plastic Surgery and Burns, Broomfield Hospital, Chelmsford, United Kingdom
2   Group for Academic Plastic Surgery, Blizard Institute, Queen Mary University of London, London, United Kingdom
3   Plastic and Reconstructive Surgery Section, Surgery Division, School of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
4   Plastic Surgery Department, Royal Victoria Infirmary, Newcastle upon Tyne, United Kingdom
5   Kellogg College, University of Oxford, Oxford, United Kingdom
Funding This study was funded entirely by its authors.
 

Abstract

Background Educational resources on the internet are extensively used to obtain medical information. YouTube is the most accessed video platform containing information to enhance the learning experience of medical professionals. This study systematically analyzed the educational value of microsurgery-related videos on this platform.

Methods A systematic review was conducted on YouTube from April 18 to May 18, 2020, using the following terms: “microsurgery,” “microsurgical,” “microsurgical anastomosis,” “free flap,” and “free tissue transfer.” The search was limited to the first 100 videos, and two independent reviewers screened for eligible entries and analyzed their educational value using validated scales, including a modified version of the DISCERN score (M-DISCERN), the Journal of the American Medical Association (JAMAS) benchmark criteria, and the Global Quality Score (GQS). Video popularity was assessed with the video power index (VPI).

Results Of 356 retrieved videos, 75 (21%) were considered eligible. The educational quality of videos was highly variable, and the mean global scores for the M-DISCERN, JAMAS, and GQS for our sample were consistent with medium to low quality.

Conclusions Few of the microsurgical education videos on YouTube are of high educational quality; the majority scored low on the utilized criteria. Peer-reviewed materials appear to be a more reliable resource. Although the potential of YouTube should not be disregarded, videos should be carefully appraised before being used as an educational resource.



Introduction

Surgical training programs aim to provide their trainees with the knowledge and skills necessary for independent specialist practice.[1] However, the educational resources available for this purpose are in constant evolution. Information technologies have changed how we conduct many of our professional activities, and medical education has not been an exception.[2]

Traditionally, surgical techniques were learned through reading seminal textbooks, performing anatomical dissections, and attending surgical sessions.[3] In this way, a trainee would familiarize themselves with an operation, assisting until ready to perform a given procedure under supervision. This approach is inevitably limited by the clarity of an article or textbook on a given technique, the access to an anatomical laboratory, a trainer's teaching qualities, and the available caseload.[4]

Using multimedia resources to aid surgical training is not new, as videos demonstrating operations can easily be found in hospital libraries, conferences, and online publications.[5] Watching a surgical technique allows trainees to familiarize themselves with the crucial steps of an operation, even if they have not encountered it previously.

YouTube is a free online video-sharing platform owned by Google LLC (California). With 500 hours of video uploaded every minute and an average of 1.9 billion logged-in monthly users, it is the second most popular social media platform on the World Wide Web.[6] Videos showing an interventional procedure can be easily uploaded to YouTube, making them readily accessible to the global surgical community without going through an editorial process or peer review.

The practice of microsurgery requires the acquisition of exact technical skills for both vessel anastomosis and flap raising.[7] Watching a video demonstration of a microsurgical anastomosis is a common starting point for many instructional courses. A particular task can be demonstrated and broken down into multiple steps to facilitate learning. The trainee can then practice what has been seen on a simulated task trainer or under supervision in a clinical setting.[8] Similarly, watching a video demonstrating a particular flap raising technique can help the trainee prepare for a case.

The quality of medical videos available on YouTube has been assessed in other specialties, with specific instruments developed to determine the educational value of this material.[9] [10] Even though appraisal in some areas of plastic surgery has been reported, this is not the case for microsurgery.[5] [11]

This study aims to describe the characteristics of microsurgery-related videos on YouTube and analyze their quality and educational value.



Methods

On April 18, 2020, YouTube searches were performed using the Google Chrome web browser in “Incognito Mode,” after erasing all cookies and browsing history, to avoid user-based video recommendations that could affect the searches. The following terms were searched: “microsurgery,” “microsurgical,” “microsurgical anastomosis,” “free flap,” and “free tissue transfer.” Two independent authors (A.N. and J.E.B.) retrieved the search results, and the first 100 videos for each search, with their links, were stored in an Excel spreadsheet (Microsoft Corporation, WA).

Predefined inclusion and exclusion criteria, based on the instrument described by Azer et al,[12] were used to select eligible videos for further evaluation ([Table 1]). The surgical specialty involved was noted (by reviewer 2) for each publicly available video demonstrating an operation.

Table 1

Video selection criteria

Inclusion

Major criteria

Content

 ● The microsurgical technique is adequate

 ● Topic, creator, and organization producing the video are mentioned

 ● The video has educational value aimed at health professionals

Video

 ● Images are clear

Sound

 ● Sound is clear and the background is free from noise

Minor criteria

 ● Covered topics are identified

 ● Designed at the level of a trainee or consultant in plastic surgery

 ● Educational objectives are stated

 ● The video uses nonliving or living simulation models and/or patients

Exclusion

Content

 ● Nonmedical (advertisement or commercial) microsurgical videos

 ● Patient-oriented educational videos

 ● Information regarding purpose, content, or authorship is not mentioned

Video

 ● Images, graphics, and subtitles are unclear

Sound

 ● Audio is inaudible and/or subtitles are lacking

 ● Excessive background noise is audible

Videos demonstrating basic microsurgical skills (e.g., vessel anastomosis under a microscope) and those involving more complex plastic surgery operations (e.g., raising a free tissue flap) were selected by both reviewers and compared. When there was discordance, a third independent reviewer assessed the video and determined its inclusion.

Metadata regarding the uploading user, operating surgeon, and the numbers of views, comments, likes, and dislikes were noted. These were used to calculate the like ratio [likes × 100 / (likes + dislikes)], the view ratio (number of views per day online), and the video power index (VPI = like ratio × view ratio / 100), as described by Erdem and Karaca.[13]
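As a minimal sketch, the popularity metrics above can be computed as follows; the metadata values in the example are hypothetical, not taken from the study:

```python
def like_ratio(likes: int, dislikes: int) -> float:
    # Like ratio = likes x 100 / (likes + dislikes)
    return likes * 100 / (likes + dislikes)


def view_ratio(views: int, days_online: int) -> float:
    # View ratio = number of views per day since upload
    return views / days_online


def video_power_index(likes: int, dislikes: int, views: int, days_online: int) -> float:
    # VPI = like ratio x view ratio / 100, per Erdem and Karaca
    return like_ratio(likes, dislikes) * view_ratio(views, days_online) / 100


# Hypothetical video: 228 likes, 9 dislikes, 28,303 views, 800 days online
vpi = video_power_index(likes=228, dislikes=9, views=28_303, days_online=800)
```

Note that the VPI rewards both approval (like ratio) and sustained viewership (views per day), so a well-liked but rarely viewed video still scores low.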

Resultant videos were analyzed blindly by two plastic surgeons with microsurgical experience (O.F.F.-D. and A.N.) using three scales adapted for this purpose ([Table 2]): the Global Quality Score (GQS),[14] the Journal of the American Medical Association (JAMAS) benchmark criteria,[15] and a modified version of the DISCERN questionnaire (M-DISCERN)[16] adapted for the present study ([Table 3]). A third author (J.E.B.) resolved disagreements. Descriptive statistics were computed with the Statistical Package for the Social Sciences version 21.0 (SPSS Inc., IL).

Table 2

Educational assessment scales

The Global Quality Score[14]

Domains: three questions, assessing:

 ● The overall quality and flow of information

 ● Accessibility and quality of the content

 ● How useful reviewers consider a video

Outcome measure: 5-point scale for each domain; the scores for each question are summed and divided by 5; possible scores: 1–3

Journal of American Medical Association Score[15]

Domains: four questions assessing the sufficiency of the information provided, relating to:

 ● Authorship

 ● Attribution

 ● Disclosure

 ● Currency

Outcome measure: 4-point scale for each domain; the scores for each question are summed and divided by 4; possible scores: 1–4

Modified-DISCERN[16]

Domains: 16 items that evaluate the quality of health information regarding treatment options; only the first 8 questions plus the last were included, as items 9–15 are patient-related

Outcome measure: 5-point scale for each domain; the scores for each question are summed; possible scores: 9–45

Table 3

Modified DISCERN questionnaire

 ● Are the aims clear?

 ● Does it achieve the aim?

 ● Is it relevant?

 ● Is it clear what sources of information were used to compile the publication (other than the author or producer)?

 ● Is it clear when the information used or reported in the publication was produced?

 ● Is it balanced and unbiased?

 ● Does it provide details of additional sources of support and information?

 ● Does it refer to areas of uncertainty?

 ● Based on the answers to all of the above questions, rate the overall quality of the publication as a source of information about treatment choices.



Results

A total of 356 videos were retrieved. Thirty-seven duplicate entries were identified and removed, resulting in 319 videos. Of these, 29 (9%) were excluded because they were nonmedical, and a further 22 (7%) were medical videos aimed at patients or the industry. Plastic surgery procedures accounted for 26% of the included technical videos, followed by neurosurgery (23%), basic microsurgical skills (14%), urology (10%), and dentistry (7%). The remaining 4% consisted of gynecology, general surgery, otorhinolaryngology, and ophthalmology operations.

Our inclusion criteria identified 83 videos containing flap raising demonstrations and 46 demonstrating basic microsurgical skills. Thirty-six videos were excluded due to poor quality or unorthodox surgical technique (e.g., a clear breach of sterile technique). Detailed full-video analysis following the methodology of Azer et al[12] resulted in the exclusion of another 19 videos, leaving 75 videos for complete educational content evaluation: 28 demonstrating basic microsurgical skills and 47 flap raising techniques. A Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA)-style diagram of the video selection process is shown in [Fig. 1].

Fig. 1 PRISMA flow diagram. PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-analyses.

The mean duration of the basic skills surgical videos was 10.30 minutes, and the mean number of views was 17,475. The mean like ratio was 95.59, and the average number of comments was 11.6. The mean VPI, JAMAS, GQS, and M-DISCERN scores were 9.57, 2.7, 2.53, and 29.61, respectively ([Table 4]). For the flap raising videos, the average duration was 13.78 minutes, and the mean number of views was 16,857. The average like ratio was 91.90, and the average number of comments was 6.70. The mean VPI, JAMAS, GQS, and M-DISCERN scores were 11.87, 2.4, 2.54, and 27.23, respectively ([Table 4]). Video metadata are illustrated in a scatter plot of the number of views against the average obtained scores for the eligible videos ([Fig. 2]).

Fig. 2 Scatter plot showing number of views and obtained average scores for the eligible entries. GQS, global quality score; JAMAS, Journal of American Medical Association Score; M-DISCERN, modified-DISCERN score.

The top five entries in each category were ranked by their average score, calculated as the sum of the three scale scores (GQS, JAMAS, and M-DISCERN) divided by the number of scales. We found no discrepancies in the ranking of the videos, and the scores are presented in [Table 5].
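The ranking procedure just described, the sum of the three scale scores divided by the number of scales, can be sketched as follows; the per-video scores here are illustrative only:

```python
def average_score(scores: dict[str, float]) -> float:
    # Sum of the scale scores divided by the number of scales
    return sum(scores.values()) / len(scores)


# Illustrative per-video scores on the three scales
videos = {
    "video_a": {"GQS": 3.0, "JAMAS": 2.8, "M-DISCERN": 38.5},
    "video_b": {"GQS": 2.4, "JAMAS": 2.0, "M-DISCERN": 27.0},
}

# Rank videos from highest to lowest average score
ranked = sorted(videos, key=lambda v: average_score(videos[v]), reverse=True)
```

Because the three scales have different ranges, the M-DISCERN score dominates an unweighted average of this kind; the study nonetheless reports no ranking discrepancies between the scales.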

Table 4

Meta-data and video score results

Values are mean (SD).

                                  Basic skills videos (n = 28)    Flap raising videos (n = 47)

Mean duration (min)               10.30 (10.70)                   13.78 (13.43)
Mean number of views              17,475 (19,973)                 16,857 (20,899)
Mean views per day (view ratio)   9.92 (11.4)                     12.63 (15.82)
Mean likes/dislikes               95.9 (5.9)/4.1 (5)              84.34 (94.87)/5.02 (6.83)
Mean like ratio                   95.59 (5.85)                    91.90 (14.44)
Mean total comments               11.6 (34.77)                    6.70 (11.5)
Mean positive/negative comments   2.04 (2.6)/0.24 (0.81)          3.65 (5.03)/0.14 (0.41)
Mean VPI                          9.57 (11.48)                    11.87 (14.85)
JAMAS mean score                  2.7 (0.6)                       2.4 (0.53)
GQS mean score                    2.53 (0.39)                     2.54 (0.44)
M-DISCERN mean score              29.16 (5.53)                    27.23 (5.38)

Abbreviations: GQS, global quality score; JAMAS, Journal of American Medical Association score; M-DISCERN, modified DISCERN score; SD, standard deviation; VPI, video power index.




Discussion

Videos demonstrating surgical techniques have proven to be a helpful resource for surgical trainees. Audiovisual material can show the applied anatomy and the necessary steps of any operation. This is reflected in the increasing use of video-based learning, with many surgical trainees preparing before surgery[17] [18] by resorting to repositories such as YouTube.[5]

In this study, we assessed the educational value of YouTube videos on microsurgery using scales previously validated in the literature. Over the years, this platform has refined a search algorithm that takes user preferences and previously watched entries into account, aiming to surface content that might be to users' liking and to extend the time they spend on YouTube.[19] To reduce the risk of bias derived from YouTube's results algorithm, we performed our systematic search in “Incognito Mode,” avoiding the influence of cookies and previous browsing history.

The DISCERN questionnaire[16] was initially designed to evaluate the quality of information regarding treatment options. Although it has been widely used to assess surgical videos since its inception, it was not intended for video content evaluation; thus, its strict scoring criteria may falsely lower scores across a study.[20] Therefore, a modified version of this scale (M-DISCERN) was used in the present study, eliminating the items related to patient choices and perspectives. The GQS directly assesses the educational content of audiovisual material. It provides a general overview of quality, and we believe it was the most straightforward scale for our objective. JAMAS, on the other hand, involves a focused appraisal of the authorship and copyright of eligible videos. Multiple scores were used to assess educational quality because no single score contains all the elements needed to evaluate a video fully. However, we value the GQS for offering a simple, rapid, and concise way to measure content quality. When comparing our scores with those of other studies that assessed surgical education videos, we found similarly low scores and an inadequate informative aspect.[9]

Despite thorough filtering of our results, the overall poor scores reflect the wide disparity of content available on YouTube. While some videos were examples of excellent quality ([Table 5]), there were also entries at the opposite end of the spectrum. Furthermore, the quality of a surgical video does not seem to correlate with the number of views received on YouTube, as illustrated in [Fig. 2]. Editorial leadership and unbiased peer review are fundamental pillars that guarantee the quality of scientific publications. Both are absent on YouTube, where the sole indication of value is views and “likes.” Anyone can upload audiovisual material to this platform, making it readily available to a global audience.

Low-quality videos defying the evidence-based principles of state-of-the-art surgical care pose a risk, easily misguiding trainees and nonspecialists. In contrast to YouTube, audiovisual material available in peer-reviewed forums, such as journal publications, scientific societies, and associations, usually goes through stricter appraisal before publication. New social media platforms also risk producing misleading videographic content, which must be examined for its potential benefit. Independent of its source, critical appraisal is key to identifying which videos possess educational value that translates into safe clinical practice. Given this situation, microsurgical societies have a great opportunity to submit peer-reviewed videographic content to YouTube with specified keywords and titles, which they can then promote. Additionally, substantial social media sharing, interacting with the audience, and partnering with academic organizations can all increase the accessibility of this type of information for the public.

A well-executed and edited recording of the raising of a free flap offers an educational opportunity surpassed only by real life itself. Surgical videos can display applied anatomy, indications, techniques, tips, and pitfalls in a way textbooks cannot. Although considerable effort was put into some eligible entries in our review, the lack of relevant narration or subtitles and suboptimal authorship and copyright information affected their overall appraisal. Therefore, we recommend that authors uploading content to open video repositories comply with minimum publication-level standards ([Table 6]). We believe that this proposed checklist can set the standard for a new scoring system applicable to surgical videos displayed on any electronic platform (including social media and webinars) and counter deceptive metrics (number of views or likes) that do not correlate with highly educational content ([Table 5]).

Table 5

Top 5 videos for basic microsurgery and flap raising

Each entry lists: length (min); views; likes; dislikes; VPI; M-DISCERN; GQS; JAMAS.

Basic microsurgery skills

1. Microsurgical technique for 1 mm vessel end to end anastomosis by Yelena Akelina DVM[21]: 03:10; 689; 3; 1; 0.4; 38.5; 3; 2.8

2. Chang's technique of sequential end-to-side microvascular anastomosis[22]: 02:39; 5,124; 42; 1; 4.7; 36.5; 3; 3.6

3. Microsurgical repair of the rat sciatic nerve[23]: 11:52; 21,482; NA; NA; NA; 35.5; 2.9; 3.4

4. Improving microvascular anastomosis efficiency by combining open-loop and airborne suture techniques[24]: 02:00; 2,156; 12; 0; 3.2; 35.5; 2.8; 2.8

5. Introduction to microsurgery part 1[25]: 06:20; 6,979; 68; 1; 3.0; 35.5; 2.9; 2.4

Flap raising

1. Elevation of thin anterolateral thigh flap on superficial fascia plane by Jp (Joon Pio) Hong[26]: 07:36; 12,837; 113; 3; 8.3; 44; 3; 4

2. Fibular free flap[27]: 20:11; 28,303; 228; 9; 34.6; 31; 3; 3.1

3. Latissimus dorsi free flap[28]: 28:33; 22,687; 101; 10; 28.4; 31; 2.9; 2.8

4. Ulnar fasciocutaneous free flap[29]: 04:32; 7,176; 26; 3; 3.4; 31; 2.9; 2.8

5. Radial forearm flap[30]: 07:01; 7,152; 49; 3; 7.9; 31; 2.8; 2.4

Abbreviations: GQS, global quality score; JAMAS, Journal of American Medical Association score; M-DISCERN, modified-DISCERN score; NA, not applicable; VPI, video power index.


Table 6

Proposed video checklist for assuring quality of surgical educational videos (SURG-ED video checklist)

SURG-ED video checklist

Authorship

 ● Title, authors, affiliation, date, and place of creation are comprehensively outlined.

Content

 ● Aims: topic and educational objectives are clear and scientifically correct.

 ● Surgical technique: technical demonstration is clear and concise.

 ● Video: images are clear.

 ● Audio: sound is clear.

References

 ● Sources should be listed, up to date, and adequately referenced.

 ● Copyright information clearly stated.

Ethics

 ● Patient consent should be obtained and patient dignity respected.

 ● Surgical video production should be overseen by a health professional.

Disclosure

 ● Sponsorship, advertising, and commercial funding should be disclosed, along with any potential conflict of interest.

Multiple limitations can be identified in our study. Even though we pursued a systematic review using the YouTube search engine, the mechanics behind it are concealed. We tried to minimize the risk of bias; however, it is impossible to conduct a fully PRISMA-compliant review within this platform. Inevitably, limiting each search to the first hundred entries has excluded videos with potentially high educational value; however, videos beyond this threshold are less likely to be found by users, given YouTube's ranking mechanics. We chose the DISCERN, JAMAS, and GQS instruments because they are the most widely used in the literature. Like any other measuring system, they have their strengths and weaknesses, and a detailed discussion of these is beyond the scope of this report.

YouTube is an unparalleled repository of audiovisual material that can be useful for microsurgical training. However, its design cannot guarantee the quality of this material. Users should be cautious when accessing YouTube for educational purposes, as each video must be appraised on its own merits. We recommend that trainees use well-established peer-reviewed resources when possible. YouTube contributors could quickly improve the quality of their submissions by following the guidance provided in this report.



Conflict of Interest

None declared.

Ethical Approval

Not required.


Authors' Contributions

O.F.F.-D., J.E.B., and A.N. conceptualized the idea. A.N. and J.E.B. conducted the systematic video review. O.F.F.-D. and A.N. undertook the video analysis. F.A., C.G., and M.R. provided senior support and supervision. All authors contributed to the writing of the manuscript.


Note

This article was presented as a poster at the BAPRAS Winter Scientific Meeting on December 7, 2020.



Address for correspondence

Oscar F. Fernandez-Diaz, MD, MSc
St. Andrew's Centre for Plastic Surgery and Burns, Broomfield Hospital, Mid and South Essex Hospital Services NHS Trust
Chelmsford, Essex CM1 7ET
United Kingdom   

Publication History

Received: 06 December 2021

Accepted: 17 February 2022

Article published online:
23 September 2022

© 2022. The Korean Society of Plastic and Reconstructive Surgeons. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA

