VALIDITY AND RELIABILITY OF NURSING INTERVENTIONS CLASSIFICATION: SELF-CARE ASSISTANCE IN PATIENTS WITH STROKE

Intansari Nurjannah


DOI: https://doi.org/10.33546/bnj.728

Abstract


Background: Nursing intervention is part of the nursing process. The accuracy of an intervention needs to be examined by measuring its validity and reliability.

Objectives: This study aimed to investigate the validity and reliability of four Nursing Interventions Classification (NIC) Self-Care Assistance (SCA) interventions in patients with stroke.

Methods: Validity testing involved four experts, while reliability testing involved seven samples for each NIC intervention. Validity was analyzed using the content validity index (I-CVI and S-CVI), while reliability was analyzed using kappa and percent agreement.
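
As a rough illustration only (not part of the paper), the analyses named above can be sketched in a few lines of Python. The data below are hypothetical: four experts rate each activity's relevance on the usual 4-point scale for the CVI, and two raters record binary observations of seven samples for kappa and percent agreement.

    # Minimal sketch (hypothetical data, not the study's dataset) of the
    # two analyses named in the Methods: content validity index (I-CVI,
    # S-CVI) and inter-rater reliability (Cohen's kappa, percent agreement).

    def i_cvi(ratings):
        # Item-level CVI: proportion of experts rating the item 3 or 4
        # on a 4-point relevance scale.
        return sum(1 for r in ratings if r >= 3) / len(ratings)

    def s_cvi_ave(item_ratings):
        # Scale-level CVI, averaging method: mean of the item I-CVIs.
        scores = [i_cvi(r) for r in item_ratings]
        return sum(scores) / len(scores)

    def percent_agreement(a, b):
        # Proportion of cases on which two raters give the same judgment.
        return sum(x == y for x, y in zip(a, b)) / len(a)

    def cohens_kappa(a, b):
        # Observed agreement corrected for agreement expected by chance.
        po = percent_agreement(a, b)
        pe = sum((a.count(c) / len(a)) * (b.count(c) / len(b))
                 for c in set(a) | set(b))
        return (po - pe) / (1 - pe)

    # Four hypothetical experts rate three activities on relevance (1-4).
    ratings = [[4, 4, 3, 4], [3, 4, 4, 4], [2, 3, 2, 3]]
    print([round(i_cvi(r), 2) for r in ratings])   # [1.0, 1.0, 0.5]
    print(round(s_cvi_ave(ratings), 2))            # 0.83

    # Two hypothetical raters observe seven samples (1 = performed, 0 = not).
    r1 = [1, 1, 0, 1, 1, 0, 1]
    r2 = [1, 1, 0, 1, 0, 0, 1]
    print(round(percent_agreement(r1, r2), 2))     # 0.86
    print(round(cohens_kappa(r1, r2), 2))          # 0.7

Percent agreement is commonly reported alongside kappa because kappa corrects for chance agreement and can drop sharply when one category dominates the observations.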

Results: Sixteen NIC activities with I-CVI scores below 0.78 were eliminated, and two activities were considered not applicable. Reliability results showed kappa values above 0.85, with percent agreement above 85%.
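
The elimination step can likewise be illustrated with a hypothetical filter over per-activity I-CVI scores, using the 0.78 cutoff reported above; the activity names and scores here are invented for demonstration only.

    # Hypothetical illustration of the elimination rule: activities whose
    # I-CVI falls below the 0.78 cutoff are dropped from the instrument.
    i_cvi_scores = {
        "Assist with oral hygiene": 1.00,
        "Monitor ambient temperature": 0.75,  # below cutoff -> eliminated
        "Assist with dressing": 1.00,
    }
    retained = {name: s for name, s in i_cvi_scores.items() if s >= 0.78}
    eliminated = sorted(set(i_cvi_scores) - set(retained))
    print(retained)    # {'Assist with oral hygiene': 1.0, 'Assist with dressing': 1.0}
    print(eliminated)  # ['Monitor ambient temperature']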

Conclusion: Eliminating invalid activities increased reliability.


Keywords


validity; reliability; nursing intervention classification








Copyright (c) 2019 Belitung Nursing Journal

This work is licensed under a Creative Commons Attribution 4.0 International License.