
Research

Human-Automation Trust and Attention Allocation

In many professional domains, automation is implemented to offset the attentional demands imposed by the task. Yet human operators often misuse unreliable automation or disuse reliable automation (Parasuraman & Riley, 1997). Several lines of research suggest that trust is a critical construct driving human-automation interaction (Hoff & Bashir, 2015; Karpinsky et al., 2018; Lee & See, 2004). According to Lee and See (2004), trust in automation is based on three information sources: performance (i.e., the automation’s behavior), process (i.e., the mechanisms underlying the automation), and purpose (i.e., the designer’s intention in developing the system). Previous work indicated that increasing the difficulty of the tracking task (i.e., high task load) degraded performance- and process-based trust, but not purpose-based trust (Karpinsky et al., 2018). This suggests that participants formed their trust in the automation based on what the automation was doing and how it worked. Furthermore, participants fixated less on the semi-automated system monitoring display under the high task load condition, indicating that fewer attentional resources were allocated to monitoring the automation. However, the relationship between trust and attention remains unclear. This research examines the relationship between trust in automation and attention allocation.

References

  • Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors, 57, 407-434.

  • Karpinsky, N. D., Chancey, E. T., Palmer, D. B., & Yamani, Y. (2018). Automation trust and attention allocation in multitasking workspace. Applied Ergonomics, 70, 194-201.

  • Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46, 50-80.

  • Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230-253.

Human-Autonomy Teaming in Advanced Air Mobility (AAM) Operation  

The rationale behind this research is to facilitate human-autonomy teaming in advanced air mobility (AAM) operations. The concept of AAM has emerged from the implementation of advanced technologies in current aviation systems. AAM vehicles are expected to carry consumer products and passengers across urban and rural areas (National Academies of Sciences, Engineering, and Medicine, 2020). Notably, increasingly autonomous systems will be implemented in future AAM vehicles. Increasingly autonomous systems exceed the capabilities of conventional automation: a fully autonomous system possesses all three dimensions of autonomy, namely viability (i.e., the ability to perform basic functions in the environment), independence (i.e., the ability to perform a task without support from human operators), and self-governance (i.e., the ability to freely set goals and develop operational plans; Kaber, 2018). Yet AAM operators may use such automation counterproductively (i.e., misuse unreliable automation or disuse reliable automation) because of poorly calibrated trust (Parasuraman & Riley, 1997). A few studies have identified trust as a critical factor in human-autonomy teaming for multi-vehicle operation (Chancey & Politowicz, 2020; Chancey et al., 2021; Sato et al., 2022), but empirical work on trust in human-autonomy teaming for multi-vehicle operation remains scarce. This research examines the role of trust in human-autonomy teaming in multi-vehicle operation.

References

  • Chancey, E. T., & Politowicz, M. S. (2020). Designing and training for appropriate trust in increasingly autonomous advanced air mobility operations: A mental model approach (Version 1).

  • Chancey, E. T., Politowicz, M. S., & Le Vie, L. (2021). Enabling advanced air mobility operations through appropriate trust in human-autonomy teaming: Foundational research approaches and applications. In AIAA Scitech 2021 Forum (p. 0880).

  • Kaber, D. B. (2018). A conceptual framework of autonomous and automated agents. Theoretical Issues in Ergonomics Science, 19, 406-430.

  • National Academies of Sciences, Engineering, and Medicine. (2020). Advancing aerial mobility: A national blueprint. National Academies Press.

  • Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230-253.

  • Sato, T., Politowicz, M. S., Islam, S., Chancey, E. T., & Yamani, Y. (2022). Attentional considerations in advanced air mobility operations: Control, manage, or assist? In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 66, No. 1, pp. 28-32). Los Angeles, CA: SAGE Publications.
