Human reliability

Human reliability is a discipline that forms part of the field of system reliability, insofar as a human being can be considered an integral part of a system.

The human component is considered to be of far greater complexity than any other component, so the techniques applicable to the study of human reliability or, complementarily, of human error are specific to it and add psychological and organizational aspects to the usual mathematical techniques.

Human reliability analysis techniques

A variety of methods exist for human reliability analysis (HRA),[1][2] divided mainly into two categories: those based on probabilistic risk assessment and those based on the theory of cognitive control.

Techniques based on probabilistic risk assessment

One way to analyze human reliability is as a direct extension of probabilistic risk assessment (PRA): just as machines can fail in an industrial plant, so can a person commit errors. In both cases, analysis by functional decomposition provides the level of detail needed to assign probabilities of error occurrence. This basic idea underlies THERP (Technique for Human Error Rate Prediction),[3] which aims to compute human error probabilities that can be incorporated into a PRA. A simplified form of THERP is ASEP (Accident Sequence Evaluation Program), which was implemented as a software tool, the Simplified Human Error Analysis Code (SHEAN).[4] More recently, the US Nuclear Regulatory Commission published the SPAR (Standardized Plant Analysis Risk) analysis method.[5][6]
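
The arithmetic behind this kind of decomposition can be illustrated with a minimal sketch. Assuming a task broken into steps that fail independently, each with a nominal human error probability (HEP) scaled by a performance shaping factor, the overall failure probability is one minus the product of the per-step success probabilities. The step names and numbers below are hypothetical illustrations, not values from the THERP handbook:

    # Minimal sketch of combining per-step human error probabilities
    # (HEPs), in the spirit of THERP-style task decomposition.
    # All step names and numbers are invented for illustration.

    def adjusted_hep(nominal_hep: float, psf: float) -> float:
        """Scale a nominal HEP by a performance shaping factor, capped at 1."""
        return min(nominal_hep * psf, 1.0)

    def task_failure_probability(steps) -> float:
        """1 minus the product of per-step success probabilities,
        assuming steps fail independently (a strong simplification)."""
        p_success = 1.0
        for _name, hep, psf in steps:
            p_success *= 1.0 - adjusted_hep(hep, psf)
        return 1.0 - p_success

    # Hypothetical decomposition: (step, nominal HEP, shaping factor)
    steps = [
        ("read alarm panel",     0.003, 2.0),  # e.g. stress doubles the HEP
        ("select procedure",     0.001, 1.0),
        ("operate valve switch", 0.005, 1.5),
    ]
    print(f"P(task error) = {task_failure_probability(steps):.4f}")

A real THERP analysis also models dependence between steps and the possibility of error recovery, both of which this sketch omits.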

Techniques based on cognitive control

Erik Hollnagel developed this line of work in the Contextual Control Model (COCOM)[7] and the Cognitive Reliability and Error Analysis Method (CREAM).[8] COCOM models human behavior as a set of control modes and proposes a model of how transitions between the different control modes occur.
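
As a rough illustration of the idea, control modes can be treated as states with context-driven transitions. The four mode names below follow Hollnagel's COCOM; the particular transition rule, driven only by available time and the outcome of the last action, is a simplified assumption rather than Hollnagel's full model:

    # Sketch of control modes as a small state machine. The mode names
    # follow Hollnagel's COCOM; this particular transition rule (based
    # only on available time and the last outcome) is an assumption.

    from enum import Enum

    class ControlMode(Enum):
        SCRAMBLED = 0      # unpredictable, trial-and-error behavior
        OPPORTUNISTIC = 1  # driven by salient cues in the situation
        TACTICAL = 2       # rule- and procedure-following
        STRATEGIC = 3      # planning with a wide time horizon

    def next_mode(mode, time_margin, last_action_ok):
        """Degrade one level of control under time pressure or failure;
        recover one level when there is slack and actions succeed."""
        level = mode.value
        if time_margin < 1.0 or not last_action_ok:
            level = max(level - 1, 0)
        elif time_margin > 2.0 and last_action_ok:
            level = min(level + 1, 3)
        return ControlMode(level)

    mode = ControlMode.TACTICAL
    for margin, ok in [(0.5, True), (0.4, False), (3.0, True), (3.0, True)]:
        mode = next_mode(mode, margin, ok)
        print(f"margin={margin}, ok={ok} -> {mode.name}")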

Human error

Human error has been cited as a cause of, or contributing factor in, disasters and accidents in industries as diverse as nuclear power, aviation, space exploration, and medicine.

Categories of human error

There are several ways of categorizing human error:[9][10]

  • Exogenous / endogenous[11]
  • Situation assessment / response planning[12]
  • By level of analysis; for example: perceptual / cognitive / communicative / organizational.

Human Factors Analysis and Classification System

The Human Factors Analysis and Classification System (HFACS) was initially developed as a framework for understanding human error as a cause of aviation accidents.[13][14] It is based on James Reason's studies of human error in complex systems. HFACS distinguishes between the "active failures" of unsafe acts and "latent failures" such as unsafe supervision and organizational influences.
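
HFACS is commonly presented as four levels of failure, and a minimal sketch of using that taxonomy to tag findings from an investigation might look as follows. The level names follow Shappell and Wiegmann; the findings themselves are invented examples:

    # Sketch of HFACS's four levels as a tagging taxonomy. Level names
    # follow Shappell and Wiegmann (2000); the findings are invented.

    from dataclasses import dataclass
    from enum import Enum

    class HfacsLevel(Enum):
        UNSAFE_ACTS = "unsafe acts (active failures)"
        PRECONDITIONS = "preconditions for unsafe acts"
        UNSAFE_SUPERVISION = "unsafe supervision"
        ORGANIZATIONAL = "organizational influences"

    @dataclass
    class Finding:
        description: str
        level: HfacsLevel

    findings = [
        Finding("checklist item skipped on approach", HfacsLevel.UNSAFE_ACTS),
        Finding("crew fatigued after extended duty", HfacsLevel.PRECONDITIONS),
        Finding("recurrent-training lapses not flagged", HfacsLevel.UNSAFE_SUPERVISION),
        Finding("scheduling policy rewarded overtime", HfacsLevel.ORGANIZATIONAL),
    ]
    for f in findings:
        print(f"{f.level.value}: {f.description}")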

Controversy

Some researchers have argued that categorizing human actions as "correct" or "incorrect" is a harmful oversimplification of a complex phenomenon.[15][16] Instead, they hold that it would be more fruitful to approach the question from the standpoint of the variability of human behavior.

Notes

  1. Kirwan and Ainsworth, 1992
  2. Kirwan, 1994
  3. Swain & Guttmann, 1983
  4. Wilson, 1993
  5. SPAR-H
  6. Gertman et al., 2005
  7. Hollnagel, 1993
  8. Hollnagel, 1998
  9. Jones, 1999
  10. Wallace and Ross, 2006
  11. Senders and Moray, 1991
  12. Roth et al., 1994
  13. Shappell and Wiegmann, 2000
  14. Wiegmann and Shappell, 2003
  15. Hollnagel, E. (1983). Human error. Position Paper for NATO Conference on Human Error, August 1983, Bellagio, Italy.
  16. Hollnagel, E. and Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at the 4th International Workshop on Human Error, Safety and System Development, Linköping, June 11-12, 2001.

References

  • Gertman, D. L. and Blackman, H. S. (2001). Human reliability and safety analysis data handbook. Wiley.
  • Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C. (2005). The SPAR-H human reliability analysis method. NUREG/CR-6883. Idaho National Laboratory, prepared for the U.S. Nuclear Regulatory Commission.
  • Hollnagel, E. (1993). Human reliability analysis: Context and control. Academic Press.
  • Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. Elsevier.
  • Hollnagel, E. and Amalberti, R. (2001). The Emperor's New Clothes, or whatever happened to "human error"? Invited keynote presentation at the 4th International Workshop on Human Error, Safety and System Development, Linköping, June 11-12, 2001.
  • Hollnagel, E., Woods, D. D., and Leveson, N. (Eds.) (2006). Resilience engineering: Concepts and precepts. Ashgate.
  • Jones, P. M. (1999). Human error and its amelioration. In Handbook of Systems Engineering and Management (A. P. Sage and W. B. Rouse, eds.), 687-702. Wiley.
  • Kirwan, B. (1994). A guide to practical human reliability assessment. Taylor & Francis.
  • Kirwan, B. and Ainsworth, L. (Eds.) (1992). A guide to task analysis. Taylor & Francis.
  • Norman, D. (1988). The psychology of everyday things. Basic Books.
  • Reason, J. (1990). Human error. Cambridge University Press.
  • Roth, E. et al. (1994). An empirical investigation of operator performance in cognitively demanding simulated emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center. Report prepared for the Nuclear Regulatory Commission.
  • Sage, A. P. (1992). Systems engineering. Wiley.
  • Senders, J. and Moray, N. (1991). Human error: Cause, prediction, and reduction. Lawrence Erlbaum Associates.
  • Shappell, S. & Wiegmann, D. (2000). The human factors analysis and classification system - HFACS. DOT/FAA/AM-00/7, Office of Aviation Medicine, Federal Aviation Administration, Department of Transportation.
  • Swain, A. D., & Guttmann, H. E. (1983). Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278 (Washington D.C.).
  • Wallace, B. and Ross, A. (2006). Beyond human error. CRC Press.
  • Wiegmann, D. & Shappell, S. (2003). A human error approach to aviation accident analysis: The human factors analysis and classification system. Ashgate.
  • Wilson, J. R. (1993). SHEAN (Simplified Human Error Analysis code) and automated THERP. United States Department of Energy Technical Report Number WINCO-11908.
  • Woods, D. D. (1990). Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg, and B. Huey (Eds.), Human performance models for computer-aided engineering (248-274). Academic Press.

Further reading

  • Autrey, T. D. (2007). Mistake-proofing Six Sigma: How to minimize project scope and reduce human error. Practicing Perfection Institute.
  • Davies, J. B., Ross, A., Wallace, B. and Wright, L. (2003). Safety management: A qualitative systems approach. Taylor and Francis.
  • Dekker, S. W. A. (2005). Ten questions about human error: A new view of human factors and systems safety. Lawrence Erlbaum Associates.
  • Dekker, S. W. A. (2006). The field guide to understanding human error. Ashgate.
  • Dekker, S. W. A. (2007). Just culture: Balancing safety and accountability. Ashgate.
  • Dismukes, R. K., Berman, B. A., and Loukopoulos, L. D. (2007). The limits of expertise: Rethinking pilot error and the causes of airline accidents. Ashgate.
  • Forester, J., Kolaczkowski, A., Lois, E., and Kelly, D. (2006). Evaluation of human reliability analysis methods against good practices. NUREG-1842 Final Report. U.S. Nuclear Regulatory Commission.
  • Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (Eds.) (1988). Tasks, errors, and mental models. Taylor and Francis.
  • Grabowski, M. and Roberts, K. H. (1996). Human and organizational error in large scale systems. IEEE Transactions on Systems, Man, and Cybernetics, Volume 26, No. 1, January 1996, 2-16.
  • Greenbaum, J. and Kyng, M. (Eds.) (1991). Design at work: Cooperative design of computer systems. Lawrence Erlbaum Associates.
  • Harrison, M. (2004). Human error analysis and reliability assessment. Workshop on Human Computer Interaction and Dependability, 46th IFIP Working Group 10.4 Meeting, Siena, Italy, July 3-7, 2004.
  • Hollnagel, E. (1991). The phenotype of erroneous actions: Implications for HCI design. In G. W. R. Weir and J. L. Alty (Eds.), Human-computer interaction and complex systems. Academic Press.
  • Hutchins, E. (1995). Cognition in the wild. MIT Press.
  • Kahneman, D., Slovic, P. and Tversky, A. (Eds.) (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
  • Leveson, N. (1995). Safeware: System safety and computers. Addison-Wesley.
  • Morgan, G. (1986). Images of organization. Sage.
  • Mura, S. S. (1983). Licensing violations: Legitimate violations of Grice's conversational principle. In R. Craig and K. Tracy (Eds.), Conversational coherence: Form, structure, and strategy (101-115). Sage.
  • Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books.
  • Rasmussen, J. (1983). Skills, rules, and knowledge: Signals, signs, and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267.
  • Rasmussen, J. (1986). Information processing and human-machine interaction: An approach to cognitive engineering. Wiley.
  • Silverman, B. (1992). Critiquing human error: A knowledge-based human-computer collaboration approach. Academic Press.
  • Swets, J. (1996). Signal detection theory and ROC analysis in psychology and diagnostics: Collected papers. Lawrence Erlbaum Associates.
  • Tversky, A. and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131.
  • Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. University of Chicago Press.
  • Woods, D. D., Johannesen, L., Cook, R., and Sarter, N. (1994). Behind human error: Cognitive systems, computers, and hindsight. CSERIAC SOAR Report 94-01. Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, Ohio.
