
Assessment Tools for Non-Technical Skills

A total of seven different NTS assessment tools were identified in the studies. The Team Emergency Assessment Measure (TEAM) tool was used in four of the studies analysed, and the Non-Technical Skills for Trauma (T-NOTECHS) tool in two; the remaining tools were each employed in a single study. All of the tools were applied across a wide range of clinical scenarios, all of which focused on the management of critically ill patients, whether adults or children, in high-risk environments such as the emergency department or the operating room. In these scenarios, the healthcare teams consisted primarily of physicians, particularly surgeons, paediatricians and anaesthesiologists, as well as nurses. Some scenarios, however, also included multidisciplinary teams with respiratory therapists, paramedics, pharmacists, and students from various healthcare disciplines, reflecting the diverse and collaborative nature of real-world clinical care. The scenarios were conducted either in a simulation laboratory or in a real clinical setting (in-situ simulation), and the most commonly used simulation modality was the high-fidelity simulator.
               Table  lists the NTS skills in the seven assessment tools evaluated. The
             most frequently assessed NTS, as determined by the frequency count, were
             communication (n = 6), situation awareness (n = 5),  cooperation/teamwork
             (n = 5), and leadership (n = 4). While all tools assess three to five NTS domains,
             the number of items in each tool varies considerably, ranging from four items
             (Non-Technical Skills for Surgeons – NOTSS) to 46 items (Team Performance
             During Simulated Crises Instrument – TPDSCI). Each study reported validation
             evidence for its quantitative assessment strategy, either within the same arti-
             cle or by citing a related article that included this information. Seven studies
             used previously established NTS assessment tools, and the remaining three
             studies utilised newly developed or modified assessment strategies. Most of
             the assessment tools included had been validated in terms of scoring and
             generalisation. A smaller number of tools had been validated for extrapola-
             tion inference, such as factor analysis. Internal consistency was reported for
             four assessment tools. The reported rater agreement and internal consisten-
             cy scores were all above .7 (acceptable), with most scores above .8 (good)
             in the case of the TEAM tool. All tools demonstrated satisfactory inter-rater re-
             liability, as indicated by intra-class correlation or kappa coefficients. However,
             only two tools (Team Performance Observation Tool – TPOT and TEAM) were
             subjected to a test-retest reliability assessment. Factor analysis was conduct-
             ed for TEAM (Freytag et al., 19) and T-NONTECHS (Repo et al., 19), reveal-
             ing a robust construct validity in both cases. There were also other validation
             parameters for the included assessment strategies, such as content validity
             using the content validity index (CVI), unidimensional validity or face validity.
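
To make the reported coefficients concrete, the short sketch below shows how internal consistency (Cronbach's alpha), chance-corrected rater agreement (Cohen's kappa), and an item-level content validity index (I-CVI) are typically computed from rating data. The scores and rater values are purely hypothetical and are not drawn from the reviewed studies; the sketch only illustrates the kind of calculation behind the thresholds mentioned above (.7 acceptable, .8 good).

```python
# Illustrative sketch only: all data below are hypothetical, not from the reviewed studies.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal consistency for an (observations x items) score matrix."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def cohen_kappa(rater_a, rater_b) -> float:
    """Chance-corrected agreement between two raters on categorical ratings."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    observed = np.mean(a == b)
    expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (observed - expected) / (1 - expected)

def item_cvi(relevance_ratings) -> float:
    """Item-level CVI: proportion of experts rating the item 3 or 4 on a 4-point scale."""
    return float(np.mean(np.asarray(relevance_ratings) >= 3))

# Hypothetical item scores for six simulated teams on a four-item tool (0-4 scale).
scores = np.array([
    [3, 3, 4, 3],
    [4, 4, 4, 4],
    [2, 2, 3, 2],
    [3, 4, 3, 3],
    [4, 3, 4, 4],
    [2, 3, 2, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

# Hypothetical global ratings of the same six scenarios by two observers.
print(f"Cohen's kappa:    {cohen_kappa([3, 4, 2, 3, 4, 2], [3, 4, 2, 3, 3, 2]):.2f}")

# Hypothetical relevance ratings of one item by eight content experts.
print(f"Item-level CVI:   {item_cvi([4, 4, 3, 4, 3, 4, 4, 3]):.2f}")
```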

