Man vs. machine: Investigating the effects of adversarial system use on end-user behavior in automated deception detection interviews

Keywords

Deception, Credibility assessment, Adversarial system, Countermeasures, Mandatory technology adoption, Concealed information test (CIT)

Abstract

Deception is an inevitable component of human interaction. Researchers and practitioners are developing information systems to aid in the detection of deceptive communication. Information systems are typically adopted by end users to help accomplish a goal or objective (e.g., increasing the efficiency of a business process). However, end-user interactions with deception detection systems (adversarial systems) are unique because the goals of the system and the user are directly opposed. Prior work investigating systems-based deception detection has focused on identifying reliable deception indicators. This research extends that work by examining how users of deception detection systems alter their behavior in response to the presence of guilty knowledge, relevant stimuli, and system knowledge. An analysis of data collected during two laboratory experiments reveals that guilty knowledge, relevant stimuli, and system knowledge all lead to increased use of countermeasures. The implications and limitations of this research are discussed, and avenues for future research are outlined.

Original Publication Citation

Proudfoot, J. G., Boyle, R., & Schuetzler, R. M. (2016). Man vs. machine: Investigating the effects of adversarial system use on end-user behavior in automated deception detection interviews. Decision Support Systems, 85, 23–33.

Document Type

Peer-Reviewed Article

Publication Date

2016-05

Permanent URL

http://hdl.lib.byu.edu/1877/8392

Publisher

Decision Support Systems

Language

English

College

Marriott School of Business

Department

Information Systems

University Standing at Time of Publication

Assistant Professor