TY - JOUR
T1 - Collecting Validity Evidence for Simulation-Based Assessment of Point-of-Care Ultrasound Skills
AU - Jensen, Jesper Kørup
AU - Dyre, Liv
AU - Jørgensen, Mattis Enggaard
AU - Andreasen, Lisbeth Anita
AU - Tolsgaard, Martin Grønnebaek
N1 - © 2017 by the American Institute of Ultrasound in Medicine.
PY - 2017/12
Y1 - 2017/12
N2 - OBJECTIVES: The aim of this study was to examine the validity of a simulator test designed to evaluate focused assessment with sonography for trauma (FAST) skills. METHODS: Participants included a group of ultrasound novices (n = 25) and ultrasound experts (n = 10). All participants had their FAST skills assessed using a virtual reality ultrasound simulator. Procedural performance on the 4 FAST windows was assessed by automated simulator metrics, which received a passing or failing score. The validity evidence for these simulator metrics was examined by a stepwise approach according to the Standards for Educational and Psychological Testing. Metrics with validity evidence were included in a simulator test, and the reliability of test scores was determined. Finally, a pass/fail level for procedural performance was established. RESULTS: Of the initial 55 metrics, 34 (61.8%) had validity evidence (P < .01). A simulator test was constructed based on the 34 metrics with established validity evidence, and test scores were calculated as percentages of the maximum score. The median simulator test scores were 14.7% (range, 0%-47.1%) and 94.1% (range, 94.1%-100%) for novices and experts, respectively (P < .001). The pass/fail level was determined to be 79.7%. CONCLUSIONS: The performance of FAST examinations can be assessed in a simulated setting using defensible performance standards, which have both good reliability and validity.
KW - Adult
KW - Clinical Competence/statistics & numerical data
KW - Computer Simulation
KW - Female
KW - Humans
KW - Male
KW - Middle Aged
KW - Point-of-Care Systems
KW - Reproducibility of Results
KW - Ultrasonography/methods
U2 - 10.1002/jum.14292
DO - 10.1002/jum.14292
M3 - Journal article
C2 - 28646627
SN - 0278-4297
VL - 36
SP - 2475
EP - 2483
JO - Journal of Ultrasound in Medicine
JF - Journal of Ultrasound in Medicine
IS - 12
ER -