TY - JOUR
T1 - Are Video Recordings Reliable for Assessing Surgical Performance? A Prospective Reliability Study Using Generalizability Theory
AU - Frithioff, Andreas
AU - Frendø, Martin
AU - Foghsgaard, Søren
AU - Sørensen, Mads Sølvsten
AU - Andersen, Steven Arild Wuyts
N1 - Copyright © 2022 Society for Simulation in Healthcare.
PY - 2023/8/1
Y1 - 2023/8/1
AB - INTRODUCTION: Reliability is pivotal in surgical skills assessment. Video-based assessment can be used for objective assessment without the physical presence of assessors. However, its reliability for surgical assessments remains largely unexplored. In this study, we evaluated the reliability of video-based versus physical assessments of novices' surgical performances on human cadavers and 3D-printed models, an emerging simulation modality. METHODS: Eighteen otorhinolaryngology residents performed 2 to 3 mastoidectomies on a 3D-printed model and 1 procedure on a human cadaver. Performances were rated by 3 experts who evaluated the final surgical result using a well-known assessment tool. Performances were rated both hands-on/physically and from video recordings. Interrater and intrarater reliability were explored using κ statistics, and the optimal number of raters and performances required in either assessment modality was determined using generalizability theory. RESULTS: Interrater reliability was moderate, with a mean κ score of 0.58 (range, 0.53-0.62) for video-based assessment and 0.60 (range, 0.55-0.69) for physical assessment. Video-based and physical assessments were equally reliable (G coefficient 0.85 vs 0.80 for 3D-printed models and 0.86 vs 0.87 for cadaver dissections). The interaction between rater and assessment modality contributed 8.1% to 9.1% of the estimated variance. For the 3D-printed models, 2 raters evaluating 2 video-recorded performances or 3 raters physically assessing 2 performances yielded sufficient reliability for high-stakes assessment (G coefficient >0.8). CONCLUSIONS: Video-based and physical assessments were equally reliable. Some raters were affected by changing from physical to video-based assessment; consequently, assessment should be either physical or video-based, not a combination.
KW - Cadaver
KW - Clinical Competence
KW - Computer Simulation
KW - Humans
KW - Prospective Studies
KW - Reproducibility of Results
KW - Video Recording
UR - http://www.scopus.com/inward/record.url?scp=85166389095&partnerID=8YFLogxK
DO - 10.1097/SIH.0000000000000672
M3 - Journal article
C2 - 36260767
SN - 1559-2332
VL - 18
SP - 219
EP - 225
JO - Simulation in Healthcare: Journal of the Society for Simulation in Healthcare
JF - Simulation in Healthcare: Journal of the Society for Simulation in Healthcare
IS - 4
ER -