Article

Assessment of the Human Factors Analysis and Classification System (HFACS): Intra-rater and inter-rater reliability

Journal

SAFETY SCIENCE
Volume 82, Pages 393–398

Publisher

ELSEVIER
DOI: 10.1016/j.ssci.2015.09.028

Keywords

HFACS; Inter-rater reliability; Intra-rater reliability; Classification; Human-error

The Human Factors Analysis and Classification System (HFACS) is a framework for classifying and analyzing human factors associated with accidents and incidents. The purpose of the present study was to examine the inter- and intra-rater reliability of the HFACS data classification process.

Methods: A total of 125 safety professionals from a variety of industries were recruited from a series of two-day HFACS training workshops. Participants classified 95 real-world causal factors (five causal factors for each of the 19 HFACS categories) extracted from a variety of industrial accidents. Inter-rater reliability of the HFACS coding process was evaluated by comparing performance across participants immediately following training, and intra-rater reliability was evaluated by having the same participants repeat the coding process after a two-week delay.

Results: Krippendorff's Alpha was used to evaluate the reliability of the coding process across the various HFACS levels and categories. Results revealed the HFACS taxonomy to be reliable in terms of both inter- and intra-rater reliability, with the latter producing slightly higher Alpha values.

Conclusion: Results support the inter- and intra-rater reliability of the HFACS framework but also reveal additional opportunities for improving HFACS training and implementation.

(C) 2015 Elsevier Ltd. All rights reserved.
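For readers unfamiliar with the statistic named in the abstract, the sketch below shows how Krippendorff's Alpha for nominal data can be computed from scratch. This is a generic illustration of the metric, not the authors' analysis code; the `units` data structure (one list of rater codes per classified causal factor) is an assumption for the example.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's Alpha for nominal codings.

    `units` is a list of lists: one inner list per coded item, holding
    the category each rater assigned (missing raters simply omitted).
    Items with fewer than two codings are skipped, per the standard
    coincidence-matrix formulation.
    """
    # Build the coincidence matrix: each ordered pair of codings within
    # a unit of m codings contributes 1/(m-1).
    o = Counter()
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue
        for c, k in permutations(ratings, 2):
            o[(c, k)] += 1.0 / (m - 1)

    # Marginal totals per category and overall.
    n_c = Counter()
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())

    # Nominal metric: disagreement = 1 whenever categories differ.
    d_o = sum(w for (c, k), w in o.items() if c != k)
    d_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n - 1)
    return 1.0 - d_o / d_e
```

Alpha is 1.0 under perfect agreement and falls toward (and below) 0 as observed disagreement approaches or exceeds chance disagreement, which is what makes it usable across the differing numbers of raters per HFACS category in a study like this one.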
