Medical Assistance Systems
Poster Presentation at HCI International 2025
At the 27th International Conference on Human-Computer Interaction, Dennis Dübeler, a student research assistant in the Medical Assistance Systems group, presented his short paper "Humans Follow AI-Advice by (Un-)Trustworthy Virtual Agents in Ethical Healthcare Decisions" as a poster.
The paper presents the results of a study conducted as part of Dennis Dübeler's bachelor's thesis: disclosing the training data of AI advisors has no significant effect on how these advisors are perceived or on how strongly they influence human decision-making, even when the agents appear untrustworthy. In particular, the three Floka agent conditions (trustworthy, untrustworthy, neutral) did not differ significantly in how they were perceived. The only factors found to influence human decision-making were the agents' perceived likability and intelligence, as measured with the Godspeed questionnaire. These findings have important implications for the robustness of human-in-the-loop systems.
Dübeler, D., Schütze, C., Müller, A., Richter, B., Wrede, B. (2025). Humans Follow AI-Advice by (Un-)Trustworthy Virtual Agents in Ethical Healthcare Decisions. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2025 Posters. HCII 2025. Communications in Computer and Information Science, vol 2529. Springer, Cham. https://doi.org/10.1007/978-3-031-94171-9_24