Are Video Recordings Reliable for Assessing Surgical Performance? A Prospective Reliability Study Using Generalizability Theory

Andreas Frithioff*, Martin Frendø, Søren Foghsgaard, Mads Sølvsten Sørensen, Steven Arild Wuyts Andersen

*Corresponding author for this work

Abstract

INTRODUCTION: Reliability is pivotal in surgical skills assessment. Video-based assessment can be used for objective assessment without the physical presence of assessors. However, its reliability for surgical assessments remains largely unexplored. In this study, we evaluated the reliability of video-based versus physical assessments of novices' surgical performances on human cadavers and 3D-printed models, an emerging simulation modality.

METHODS: Eighteen otorhinolaryngology residents performed 2 to 3 mastoidectomies on a 3D-printed model and 1 procedure on a human cadaver. Performances were rated by 3 experts who evaluated the final surgical result using a well-established assessment tool. Performances were rated both hands-on/physically and from video recordings. Interrater and intrarater reliability were explored using κ statistics, and the optimal number of raters and performances required in either assessment modality was determined using generalizability theory.
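For readers unfamiliar with the interrater statistic used here, Cohen's κ compares observed agreement between two raters with the agreement expected by chance from each rater's marginal label frequencies. A minimal unweighted sketch in Python (the rating labels are hypothetical, for illustration only):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa for two raters' categorical ratings."""
    n = len(ratings_a)
    # observed proportion of exact agreement
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from each rater's marginal label frequencies
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    p_chance = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical item-level scores from two raters on the same performances
rater1 = [2, 3, 1, 2, 3, 1, 2, 2]
rater2 = [2, 3, 1, 3, 3, 1, 2, 1]
kappa = cohens_kappa(rater1, rater2)
```

On the conventional Landis-Koch scale, values around 0.58 to 0.60, as reported below, are interpreted as moderate agreement.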

RESULTS: Interrater reliability was moderate, with a mean κ score of 0.58 (range, 0.53-0.62) for video-based assessment and 0.60 (range, 0.55-0.69) for physical assessment. Video-based and physical assessments were equally reliable (G coefficient 0.85 vs 0.80 for 3D-printed models and 0.86 vs 0.87 for cadaver dissections). The interaction between rater and assessment modality contributed 8.1% to 9.1% of the estimated variance. For the 3D-printed models, 2 raters evaluating 2 video-recorded performances or 3 raters physically assessing 2 performances yielded sufficient reliability for high-stakes assessment (G coefficient >0.8).
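The decision-study logic behind these rater-and-performance requirements can be sketched: a relative G coefficient divides person (true-score) variance by the sum of person variance and error variance, where the error term shrinks as scores are averaged over more raters and performances. A minimal sketch with hypothetical variance components (the actual components are not given in this abstract):

```python
def g_coefficient(var_person, var_error, n_raters, n_performances=1):
    """Relative G coefficient for a fully crossed person x rater x performance
    D-study: true-score variance over true-score plus averaged error variance."""
    averaged_error = var_error / (n_raters * n_performances)
    return var_person / (var_person + averaged_error)

# Hypothetical components: doubling the raters halves the error term,
# so the coefficient rises toward 1 as raters/performances are added.
g_one_rater = g_coefficient(var_person=1.0, var_error=1.0, n_raters=1)
g_two_raters = g_coefficient(var_person=1.0, var_error=1.0, n_raters=2)
```

This is the mechanism by which the study can trade raters against performances, e.g., 2 raters on 2 video-recorded performances reaching the G > 0.8 threshold for high-stakes assessment.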

CONCLUSIONS: Video-based and physical assessments were equally reliable. Some raters were affected by changing from physical to video-based assessment; consequently, assessment should be either physical or video based, not a combination.

Original language: English
Journal: Simulation in Healthcare: Journal of the Society for Simulation in Healthcare
Volume: 18
Issue number: 4
Pages (from-to): 219-225
Number of pages: 7
ISSN: 1559-2332
Publication status: Published - 1 Aug 2023

Keywords

  • Cadaver
  • Clinical Competence
  • Computer Simulation
  • Humans
  • Prospective Studies
  • Reproducibility of Results
  • Video Recording
