Skill Performance

Instruments to assess or evaluate skill acquisition for the clinical nursing role

Lambton, J., Pauly O’Neill, S., & Dudum, T. (2008). Simulation as a strategy to teach clinical pediatrics within a nursing curriculum. Clinical Simulation in Nursing, 4(3), e79-e87. doi:10.1016/j.ecns.2008.08.001

Murray, D., Boulet, J., Ziv, A., Woodhouse, J., Kras, J., & McAllister, J. (2002). An acute care skills evaluation for graduating medical students: A pilot study using clinical simulation. Medical Education, 36(9), 833-841.

Vincent, M. A., Sheriff, S., & Mellott, S. (2015). The efficacy of high-fidelity simulation on psychomotor clinical performance improvement of undergraduate nursing students. CIN: Computers, Informatics, Nursing, 33(2), 78-84.

Actions, Communication, and Teaching in the Simulation Tool

Sanko, J.S., Shekhter, I., Gattamorta, K.A., & Birnbach, D.J. (2016, November). Development and psychometric analysis of a tool to evaluate confederates. Clinical Simulation in Nursing, 12(11), 475-481. http://dx.doi.org/10.1016/j.ecns.2016.07.006

Check-off tool for critical elements in nursing students

Herm, S., Scott, K., & Copley, D. (2007). Sim’sational revelations. Clinical Simulation in Nursing Education, 3(1), e25-e30. DOI: http://dx.doi.org/10.1016/j.ecns.2009.05.036

Family Care Rubric

Van Gelderen, S., Krumwiede, N., & Christian, A. (2016). Teaching family nursing through simulation: Family-care rubric development. Clinical Simulation in Nursing, 12(5), 159-170. https://dx.doi.org/10.1016/j.ecns.2016.01.002

Facilitator Competence

Leighton, K., Mudra, V., & Gilbert, G. E. (2018). Development and psychometric evaluation of the Facilitator Competency Rubric. Nursing Education Perspectives, 39(6), E3-E9. DOI: 10.1097/01.NEP.0000000000000409 https://tinyurl.com/ya2tepcv

Group Rubrics and Evaluation Tools

Kim, J., Neilipovitz, D., Cardinal, P., Chiu, M., & Clinch, J. (2006). A pilot study using high-fidelity simulation to formally evaluate performance in the resuscitation of critically ill patients: The University of Ottawa Critical Care Medicine, High-Fidelity Simulation, and Crisis Resource Management I Study. Critical Care Medicine, 34(8), 2167-2174.

Malec, J., Torsher, L., Dunn, W., Wiegman, D., Arnold, J., Brown, D., et al. (2007). The Mayo high-performance teamwork scale: Reliability and validity for evaluating key crew resource management skills. Simulation in Healthcare, 2(1), 4-10.

Millward, L., & Jeffries, N. (2001). The team survey: A tool for health care team development. Journal of Advanced Nursing, 35(2), 276-287.

Health Communication Assessment Tool©

Pagano, M. P., O'Shea, E. R., Campbell, S. H., Currie, L. M., Chamberlin, E., & Pates, C. A. (2015). Validating the Health Communication Assessment Tool© (HCAT). Clinical Simulation in Nursing, 11(9), 402-410.

Nurse Competence Scale

Meretoja, R., Isoaho, H., & Leino-Kilpi, H. (2004). Nurse competence scale: Development and psychometric testing. Journal of Advanced Nursing, 47(2), 124-133.

Nurse Resident’s Readiness for Entry into Practice Competence Questionnaire

Beyea, S. C., von Reyn, L., & Slattery, M. J. (2007). A nurse residency program for competency development using human patient simulation. Journal for Nurses in Staff Development, 23(2), 77-82.

Nursing Student Teamwork Skills

Smith, S., Farra, S., Ten Eyck, R., & Bashaw, M. (2015). Development of an instrument to measure nursing student teamwork skills. Clinical Simulation in Nursing, 11(12), 507-512. https://dx.doi.org/10.1016/j.ecns.2015.10.006

Objective Structured Assessment of Technical Skills (OSATS)

Martin, J. A., Regehr, G., Reznick, R., Macrae, H., Murnaghan, J., Hutchison, C., & Brown, M. (1997). Objective structured assessment of technical skills (OSATS) for surgical residents. British Journal of Surgery, 84, 273-278.

Objective Structured Clinical Assessment Tool

Najjar, R. H., Docherty, A., & Miehl, N. (2016). Psychometric properties of an objective structured clinical assessment tool. Clinical Simulation in Nursing, 12(3), 88-95. http://dx.doi.org/10.1016/j.ecns.2016.01.003

Organization-level Evaluation

Leighton, K., Foisy-Doll, C., & Gilbert, G. E. (2018). Development and psychometric evaluation of the Simulation Culture Organizational Readiness Survey (SCORS). Nurse Educator, 43(5), 251-255. DOI: 10.1097/NNE.0000000000000504.

Quint Leveled Clinical Competency Tool

Prion, S. K., Gilbert, G. E., Adamson, K. A., Kardong-Edgren, S., & Quint, S. (2017). Development and testing of the Quint Leveled Clinical Competency Tool. Clinical Simulation in Nursing, 13(3), 106-115. http://dx.doi.org/10.1016/j.ecns.2016.10.008

Safety Outcomes

Shearer, J. E. (2013). High-fidelity simulation and safety: An integrative review. Journal of Nursing Education, 52(1), 39-45.

Simulation-based resuscitation scenario assessment tool

Hall, A. K., Pickett, W., & Dagnone, J. D. (2012). Development and evaluation of a simulation-based resuscitation scenario assessment tool for emergency medicine residents. CJEM: Journal of the Canadian Association of Emergency Physicians, 14(3), 139-146.

Simulation Module for Assessment of Residents Targeted Event Responses (SMARTER)

Rosen, M., Salas, E., Silvestri, S., Wu, T., & Lazzara, E. (2008). A measurement tool for simulation-based training in emergency medicine: The simulation module for assessment of resident targeted event responses (SMARTER) approach. Simulation in Healthcare, 3(3), 170-179.

Teamwork Observation Tool

Curran, V., Casimiro, L., Banfield, V., et al. (2010). Interprofessional Collaborator Assessment Rubric. Academic Health Council of Canada. http://www.med.mun.ca/CCHPE/Faculty-Resources/Interprofessional-Collaborator-Assessment-Rubric.aspx

Curran, V., Hollett, A., Casimiro, L. M., et al. (2011). Development and validation of the interprofessional collaborator assessment rubric (ICAR). Journal of Interprofessional Care, 25, 339-344.

Lineberry, M., Bryan, E., Brush, T., Carolan, T. F., Holness, D., Salas, E., & King, H. (2013). Measurement and training of TeamSTEPPS dimensions using the Medical Team Performance Assessment Tool. The Joint Commission Journal on Quality and Patient Safety, 39, 89-95.

The Seattle University Simulation Evaluation Tool©

Mikasa, A. W., Cicero, T. F., & Adamson, K. A. (2012). An outcome-based evaluation tool to evaluate student performance in high-fidelity simulation. Clinical Simulation in Nursing, 9(9), e361-e367.

The Sweeney-Clark Simulation Performance Evaluation Tool

Clark, M. (2006). Evaluating an obstetric trauma scenario. Clinical Simulation in Nursing Education, 2, e13-e16.

Video Analysis

Kim, S., Brock, D., Prouty, C. D., Odegard, P. S., Shannon, S. E., Robins, L., & Gallagher, T. (2011). A web-based team-oriented medical error communication assessment tool: Development, preliminary reliability, validity, and user ratings. Teaching and Learning in Medicine, 23(1), 68-77.

Sulaiman, N. D., & Hamdy, H. (2013). Assessment of clinical competencies using clinical images and videos "CIVA". BMC Medical Education, 13, 78.

Yoo, M. S., Son, Y. J., Kim, Y. S., & Park, J. H. (2009). Video-based self-assessment: Implementation and evaluation in an undergraduate nursing course. Nurse Education Today, 29, 585-589.

Weighted Scoring Tool for Clinical Objectives

Gore, T., Hunt, C., & Raines, K. (2008). Mock hospital unit simulation: A teaching strategy to promote safe patient care. Clinical Simulation in Nursing, 4(5). doi:10.1016/j.ecns.2008.08.006


Learner Satisfaction

Quantitative or qualitative measures of students’ responses

Abdo, A., & Ravert, P. (2006). Student satisfaction with simulation experiences. Clinical Simulation in Nursing Education, 2(1), e13-e16.

Mole, L., & McLafferty, I. (2004). Evaluating a simulated ward exercise for third-year student nurses. Nurse Education in Practice, 4, 91-99.

Schoening, A., Sittner, B., & Todd, M. (2006). Simulated clinical experience: Nursing students’ perceptions and the educator’s role. Nurse Educator, 31(6), 253-258.

Casey-Fink Graduate Nurse Experience Survey

Fink, R., Krugman, M., Casey, K., & Goode, C. (2008). The graduate nurse experience: Qualitative residency program outcomes. The Journal of Nursing Administration, 38(7-8), 341-348.

Casey, K., Fink, R., Krugman, M., & Propst, J. (2004). The graduate nurse experience. The Journal of Nursing Administration, 34(6), 303-311.

Evaluation of simulation experience

McCausland, L., Curran, C., & Cataldi, P. (2004). Use of a human simulator for undergraduate nurse education. International Journal of Nursing Education Scholarship, 1(1), Article 23.

Simulation Design Scale

Dobbs, C., Sweitzer, V., & Jeffries, P. (2006). Testing simulation design features using an insulin management simulation in nursing education. Clinical Simulation in Nursing Education, 2(1), e17-e22.

Information from the SIRC website: http://sirc.nln.org/mod/page/view.php?id=88

The five design features include: 1) objectives/information; 2) support; 3) problem solving; 4) feedback; 5) fidelity. The instrument has two parts: one asks about the presence of specific features in the simulation, the other asks about the importance of those features to the learner. Content validity was established by ten content experts in simulation development and testing. The instrument's reliability was tested using Cronbach's alpha, which was found to be 0.92 for the presence of features, and 0.96 for the importance of features.
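As background for interpreting the reliability coefficients reported for this and the other instruments on this page (this is the standard psychometric formula, not drawn from the cited articles), Cronbach's alpha for an instrument with k items is

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{i}}{\sigma^{2}_{X}}\right), \]

where \(\sigma^{2}_{i}\) is the variance of item i and \(\sigma^{2}_{X}\) is the variance of the total score; values approaching 1, such as the 0.92 and 0.96 reported here, indicate strong internal consistency.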

Student Perception of Effective Teaching in Simulation Scale

Reese, C. (2012). Measuring effective teaching in simulation: Development of the student perception of effective teaching in simulation scale. Clinical Simulation in Nursing, 8(8), e411.

Student Satisfaction and Self-Confidence in Learning

Information from the SIRC website: http://sirc.nln.org/mod/page/view.php?id=88
The instrument is designed to measure student satisfaction with the simulation activity (five items) and self-confidence in learning (eight items) using a five-point scale. Reliability was tested using Cronbach's alpha: satisfaction = 0.94; self-confidence = 0.87.


Knowledge/Learning

Awareness, understanding, and expertise acquired in a specific domain

Delupis, D. D., Pisanelli, P., Di Luccio, G., Kennedy, M., Tellini, S., Nenci, N., & Franco Gensini, G. (2014). Communication during handover in the pre-hospital/hospital interface in Italy: From evaluation to implementation of multidisciplinary training through high-fidelity simulation. Internal and Emergency Medicine, 9(5), 575-582. doi:10.1007/s11739-013-1040-9

Goode, C. J., Lynn, M. R., Krsek, C., & Bednash, G. D. (2009). Nurse residency programs: An essential requirement for nursing. Nursing Economics, 27(3), 142-147.

McWilliam, P. L., & Botwinski, C. A. (2012). Identifying strengths and weaknesses in the utilization of Objective Structured Clinical Examination (OSCE) in a nursing program. Nursing Education Perspectives, 33(1), 35-39.

Schmitz, C. C., Chipman, J. G., Luxenberg, M. G., & Beilman, G. J. (2008). Professionalism and communication in the intensive care unit: Reliability and validity of a simulated family conference. Simulation in Healthcare, 3(4), 224-238.

Basic Knowledge Assessment Tool

Toth, J. (2008). The Basic Knowledge Assessment Tool (BKAT). Retrieved from http://www.bkat-toth.org/

Hoffman, O'Donnell, & Kim (2007). The effects of human patient simulators on basic knowledge in critical care nursing with undergraduate senior nursing students. Simulation in Healthcare, 2(2), 110-114.

Challenging Acute Nursing Event (CANE)

Walshe, N., O’Brien, S., Murphy, S., & Hartigan, I. (2011). Integrative learning through simulation and problem-based learning. Clinical Simulation in Nursing, 9(2), e47-e54.

Clinical Learning Environment Comparison Survey (used in NCSBN study)

Leighton, K. (2015). Development of the Clinical Learning Environment Comparison Survey. Clinical Simulation in Nursing, 11(1), 44-51.

Communication

Campbell, S. H., Aredes, N. d. A., Bontinen, K., Lim, Y., DuManoir, C., Tharmaratnam, T., & Stephen, L. (2021). Global Interprofessional Therapeutic Communication Scale© Short Form (GITCS©): Feasibility testing in Canada. Clinical Simulation in Nursing, 65, 7-17. https://doi.org/10.1016/j.ecns.2021.12.006

Foronda, C. L., Alhusen, J., Budhathoki, C., Lamb, M., Tinsley, K., & Bauman, E. (2015). A mixed-methods, international, multisite study to develop and validate a measure of nurse-to-physician communication in simulation. Nursing Education Perspectives, 36(6), 383-388.

Manojlovich, M., et al. (2011). Developing and testing a tool to measure nurse/physician communication in the intensive care unit. Journal of Patient Safety, 7(2), 80-84.

Critical Assessment Skills

Gibbons, S., Adamo, G., Padden, D., Ricciardi, R., Graziano, M., Levine, E., et al. (2002). Clinical evaluations in advanced nursing education: Using standardized patients in health assessment. Journal of Nursing Education, 41(5), 215-221.

Educational Practices Questionnaire (Student Version)

Information from the SIRC website: http://sirc.nln.org/mod/page/view.php?id=88

A 16-item instrument using a five-point scale was designed to measure whether four educational practices (active learning, collaboration, diverse ways of learning, and high expectations) are present in the instructor-developed simulation, and the importance of each practice to the learner. Reliability was tested using Cronbach's alpha: presence of specific practices = 0.86; importance of specific practices = 0.91.

IP Implicit Association Test

A test of unconscious biases associated with nursing and physician roles, developed in collaboration with Harvard University's Project Implicit.

http://www.projectimplicit.net/infrastructure.html

https://implicit.harvard.edu/implicit/Study?tid=-1

Multi-source Feedback Tool for Interprofessional Collaborative Practice

Brown, J. M., Lowe, K., Fillingham, J., Murphy, P. N., Bamforth, M., & Shaw, J. F. (2014). An investigation into the use of multi-source feedback (MSF) as a work-based assessment tool. Medical Teacher, 36, 997-1004.

Curran, V., Hollett, A., Casimiro, L. M., McCarthy, Banfield, V., Hall, P., Lackie, K., Oandasan, I., Simmons, B., & Wagner, S. (2011). Development and validation of the interprofessional collaborator assessment rubric (ICAR). Journal of Interprofessional Care, 25, 339-344.

Ushiro, R. (2009). Nurse-Physician Collaboration Scale: Development and psychometric testing. Journal of Advanced Nursing, 65(7), 1497-1508.

Simulation Effectiveness Tool – Modified (original SET 2005)

Leighton, K., Ravert, P., Mudra, V., & Macintosh, C. (2015). Updating the Simulation Effectiveness Tool: Item modifications and reevaluation of psychometric properties. Nursing Education Perspectives, 36(5), 317-323. DOI: 10.5480/15-1671

Simulation Research Rating rubric for published articles

Fey, M. K., Gloe, D., & Mariani, B. (2015). Assessing the quality of simulation-based research articles: A rating rubric. Clinical Simulation in Nursing, 11(12), 496-504. http://dx.doi.org/10.1016/j.ecns.2015.10.005

Simulation Learning Effectiveness

Chen, S.-L., Huang, T.-W., Liao, I.-C., & Liu, C. (2015). Development and validation of the Simulation Learning Effectiveness Inventory. Journal of Advanced Nursing, 71(10). http://dx.doi.org/10.1111/jan.12707

The Creighton Competency Evaluation Instrument (C-CEI)

Todd, M., Manz, J., Hawkins, K., Parsons, M., & Hercinger, M. (2008). The development of a quantitative evaluation tool for simulation in nursing education. International Journal of Nursing Education Scholarship, 5(1). Article 41.

Adamson, K. A., Parsons, M. E., Hawkins, K., Manz, J. A., Todd, M., & Hercinger, M. (2011). Reliability and internal consistency findings from the C-SEI. Journal of Nursing Education, 50(10), 583-586. https://dx.doi.org/10.3928/01484834-20110715-02

Parsons, M. E., Hawkins, K. S., Hercinger, M., Todd, M., Manz, J. A., & Fang, X. (2012). Improvement in scoring consistency for the Creighton Simulation Evaluation Instrument©. Clinical Simulation in Nursing, 8(6), e233-e238.


Critical Thinking/Clinical Judgement

Performance-Based Development System Assessment

Wangensteen, S., Johansson, I. S., & Nordström, G. (2010). The first year as a graduate nurse – an experience of growth and development. Journal of Clinical Nursing, 17(14), 1877-1885.

Fero, L. J., Witsberger, C. M., Wesmiller, S. W., Zullo, T. G., & Hoffman, L. A. (2009). Critical thinking ability of new graduate and experienced nurses. Journal of Advanced Nursing, 65(1), 139-148.

Sorensen, H. A., & Yankech, L. R. (2008). Precepting in the fast lane: Improving critical thinking in new graduate nurses. The Journal of Continuing Education in Nursing, 39(5), 208-216.

The Clinical Simulation Evaluation Tool (CSET)

Radhakrishnan, K., Roche, J., & Cunningham, H. (2007). Measuring clinical practice parameters with human patient simulation: A pilot study. International Journal of Nursing Education Scholarship, 4(1), Article 8.

The Lasater Clinical Judgment Rubric (LCJR©)

Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46(11), 496-503.

Adamson, K. A., Gubrud, P., Sideras, S., & Lasater, K. (2012). Assessing the reliability, validity, and use of the Lasater Clinical Judgment Rubric: Three approaches. Journal of Nursing Education, 51(2), 66-73. https://dx.doi.org/10.3928/01484834-20111130-03

Mariani, B., Cantrell, M. A., Meakim, C., Prieto, P., & Dreifuerst, K. (2012). Structured debriefing and students’ clinical judgment abilities in simulation. Clinical Simulation in Nursing, 9(5), e147-e155.

Reinhardt, A. C., Mullins, I. L., De Blieck, C., & Schultz, P. (2012). IV insertion simulation: Confidence, skill and performance. Clinical Simulation in Nursing, 8(6), e157-e167. http://dx.doi.org/10.1016/j.ecns.2010.09.001

The Simulation Thinking Rubric (STR)

Doolen, J. (2015). Psychometric properties of the simulation thinking rubric to measure higher-order thinking in undergraduate nursing students. Clinical Simulation in Nursing, 11(1), 35-43. http://dx.doi.org/10.1016/j.ecns.2014.10.007.


Self-confidence/Self-efficacy

Emergency Response Performance Tool

Arnold, J. J., Johnson, L. M., Tucker, S. J., Malec, J. F., Henrickson, S. E., & Dunn, W. F. (2009). Evaluation tools in simulation learning: Performance and self-efficacy in emergency response. Clinical Simulation in Nursing, 5(1). doi:10.1016/j.ecns.2008.10.003

Debriefing

Instruments used to evaluate debriefing methods, scales, and experiences

Bradley, C. S., & Dreifuerst, K. T. (2016). Pilot testing the Debriefing for Meaningful Learning Evaluation Scale. Clinical Simulation in Nursing, 12(7), 277-280.

Debriefing Assessment for Simulation in Healthcare (DASH): https://harvardmedsim.org/debriefing-assesment-simulation-healthcare.php

Reed, S. J. (2012). Debriefing experience scale: Development of a tool to evaluate the student learning experience in debriefing. Clinical Simulation in Nursing, 8(6), e211-e217.

Waznonis, A. (2014). Methods and evaluations for simulation debriefing in nursing education. Journal of Nursing Education, 53(8), 459-465. DOI: 10.3928/01484834-20140722-13

Outcome Present State Test (OPT) Model Debriefing Tool

Kuiper, R. A., Heinrich, C., Matthias, A., Graham, M., & Bell-Kotwell, L. (2008). Debriefing with the OPT model of clinical reasoning during high fidelity patient simulation. International Journal of Nursing Education Scholarship, 5(1), Article 17. DOI: 10.2202/1548-923X.1466.

Video Training Tools

From Suzie Kardong-Edgren and Katie Adamson

These three grant-funded videos demonstrate three levels of student performance (poor, average, and high-functioning) in the same scenario. Viewers are blinded to the level of performance by the video labels (circle, square, triangle), although it becomes fairly obvious when watching. The videos can be used to (1) train raters on the use of a new tool, or (2) establish inter-rater reliability among evaluators using a tool. We do not want to tell you which is which; you will have to watch for yourself.
