Facilitating Natural Conversational Agent Interactions: Lessons from a Deception Experiment


Keywords

Deception, chat bot, conversational agent, human-computer interaction


This study reports the results of a laboratory experiment exploring interactions between humans and a conversational agent. Using the ChatScript language, we created a chat bot that asked participants to describe a series of images. The two objectives of this study were (1) to analyze the impact of dynamic responses on participants’ perceptions of the conversational agent, and (2) to explore behavioral changes in interactions with the chat bot (i.e., response latency and pauses) when participants engaged in deception. We found that a chat bot providing adaptive responses based on the participant’s input dramatically increases the perceived humanness of, and engagement with, the conversational agent. Deceivers interacting with the dynamic chat bot exhibited consistent response latencies and pause lengths, while deceivers interacting with the static chat bot exhibited longer response latencies and pause lengths. These results offer new insights into social interactions with computer agents during truthful and deceptive exchanges.
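The behavioral measures named in the abstract, response latency and pauses, can be derived from timestamped typing events. The sketch below is illustrative only and is not the paper's actual instrumentation; the function name, the pause threshold, and the event format are all assumptions introduced here.

```python
def response_metrics(prompt_time, keystroke_times, pause_threshold=0.5):
    """Compute response latency and within-message pauses.

    prompt_time: when the chat bot's message appeared (seconds).
    keystroke_times: ascending timestamps of the participant's keystrokes.
    Response latency is the delay from the prompt to the first keystroke;
    a pause is any inter-keystroke gap longer than pause_threshold seconds.
    This is a hypothetical reconstruction, not the study's published code.
    """
    if not keystroke_times:
        return None, []
    latency = keystroke_times[0] - prompt_time
    pauses = [
        later - earlier
        for earlier, later in zip(keystroke_times, keystroke_times[1:])
        if later - earlier > pause_threshold
    ]
    return latency, pauses


# Example with made-up timestamps: latency of 1.2 s and two pauses of ~1.2 s.
latency, pauses = response_metrics(0.0, [1.2, 1.4, 2.6, 2.7, 3.9])
```

Aggregating these per-message values across a conversation would yield the kind of per-participant latency and pause statistics the abstract compares between the dynamic and static chat bot conditions.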

Original Publication Citation

Schuetzler, R. M., Giboney, J. S., Grimes, G. M., & Buckman, J. (2014). Facilitating natural conversational agent interactions: Lessons from a deception experiment. International Conference on Information Systems, Auckland, New Zealand, December 12-16.

Document Type

Conference Paper

Publication Date

2014
Permanent URL



International Conference on Information Systems




Marriott School of Business


Information Systems

University Standing at Time of Publication

Assistant Professor