Sponsor: NSF

PI: Dan Szafir, co-PIs: Christoffer Heckman, Danielle Albers Szafir

Abstract: Robots may augment emergency response teams by collecting information in environments that may be dangerous or inaccessible for human responders, such as in wildfire fighting, search and rescue, or hurricane response. For example, robots might collect critical visual, mapping, and environmental data to inform responders of conditions ahead and improve their awareness of the operational environment. However, response teams currently have little ability to directly access robot-collected information in the field, despite its value for rapidly responding to local conditions, because current systems typically route the data through a central command post. Through collaboration with several local response groups, the project team will develop a better understanding of responders' needs and concerns around robot-collected data; algorithms and visualizations that meet those needs using augmented reality technologies; and systems that integrate well with responders' actual work practices. The team will also develop novel algorithms for 3D scene reconstruction and simultaneous localization and mapping (SLAM) that will be useful for a broad variety of applications. Overall, the project will contribute empirical knowledge of how different factors of augmented reality head-mounted display (ARHMD) visualizations influence data interpretation, novel algorithms for estimating, correcting, and sharing maps between intermittently-networked agents in the field, and information regarding how data from collocated robots can mediate human-robot interactions, particularly within the context of emergency response.
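As a rough illustration of one thread above (sharing maps between intermittently-networked agents), the Python sketch below shows a standard way two agents could fuse log-odds occupancy grids once connectivity is re-established. This is not the project's algorithm; the function name fuse_log_odds_maps and the toy grids are invented for this example, and the fusion assumes pre-aligned grids and independent observations.

    import numpy as np

    def fuse_log_odds_maps(map_a: np.ndarray, map_b: np.ndarray) -> np.ndarray:
        """Combine two log-odds occupancy grids covering the same area.

        Cells near zero are unknown; positive values lean occupied, negative free.
        Summing log-odds is the usual Bayesian update when the two agents'
        measurements are treated as independent.
        """
        assert map_a.shape == map_b.shape, "grids must be pre-aligned to a common frame"
        return map_a + map_b

    # Toy example: agent B's partial map fills in cells agent A never observed.
    rng = np.random.default_rng(0)
    map_a = np.where(rng.random((4, 4)) > 0.5, 1.5, 0.0)   # A saw roughly half the cells as occupied
    map_b = np.where(map_a == 0.0, -0.8, 0.0)              # B observed the remaining cells as free
    print(fuse_log_odds_maps(map_a, map_b))

In a fielded system the interesting work lies in aligning the grids to a common frame and correcting accumulated drift before fusing, which is where the project's SLAM and map-correction algorithms would come in.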

Publications

Chen, Z., C. Heckman, S. Julier, & N. Ahmed. "Weak in the NEES?: Auto-Tuning Kalman Filters with Bayesian Optimization." 21st International Conference on Information Fusion (FUSION 2018), 2018.

Walker, M. E., D. Szafir, & I. Rae. "The Influence of Size in Augmented Reality Telepresence Avatars." IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), 2019.

Szafir, D. "Mediating Human-Robot Interactions with Virtual, Augmented, and Mixed Reality." International Conference on Human-Computer Interaction, v. 11575, 2019.

Elliott, M., C. Xiong, C. Nothelfer, & D. Albers Szafir. "A Design Space of Vision Science Methods for Visualization Research." IEEE Transactions on Visualization and Computer Graphics (TVCG), 2021 (to appear).

Whitlock, M., K. Wu, & D. Albers Szafir. "Designing for Mobile and Immersive Visual Analytics in the Field." IEEE Transactions on Visualization and Computer Graphics (TVCG), 2019.

Whitlock, M., D. Albers Szafir, & K. Gruchalla. "HydrogenAR: Interactive Data-Driven Storytelling for Dispenser Reliability." In the Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR), 2020 (to appear).

Whitlock, M., S. Smart, & D. Albers Szafir. "Graphical Perception for Immersive Analytics." IEEE Virtual Reality, 2020.

Whitlock, M., J. Mitchell, N. Pfeufer, B. Arnot, R. Craig, B. Wilson, B. Chung, & D. Albers Szafir. "MRCAT: In Situ Prototyping of Interactive AR Environments." International Conference on Virtual and Mixed Reality (VAMR), 2020.

Whitlock, M., D. Leithinger, D. Szafir, & D. Albers Szafir. "Toward Effective Multimodal Interaction in Augmented Reality." 4th Workshop on Immersive Analytics: Envisioning Future Productivity for Immersive Analytics at ACM CHI 2020, 2020.

Whitlock, M., & D. Albers Szafir. "Situated Prototyping of Data-Driven Applications in Augmented Reality." Interaction Design and Prototyping for Immersive Analytics at CHI 2019, 2019.