Paper Title

Simulating the Effects of Social Presence on Trust, Privacy Concerns & Usage Intentions in Automated Bots for Finance

Authors

Ng, Magdalene, Coopamootoo, Kovila P. L., Toreini, Ehsan, Aitken, Mhairi, Elliot, Karen, van Moorsel, Aad

Abstract

FinBots are chatbots built on automated decision technology, aimed at facilitating accessible banking and supporting customers in making financial decisions. Chatbots are increasing in prevalence, sometimes even equipped to mimic human social rules, expectations and norms, decreasing the necessity for human-to-human interaction. As banks and financial advisory platforms move towards creating bots that enhance the current state of consumer trust and adoption rates, we investigated the effects of chatbot vignettes with and without socio-emotional features on intention to use the chatbot for financial support purposes. We conducted a between-subjects online experiment with N = 410 participants. Participants in the control group were provided with a vignette describing a secure and reliable chatbot called XRO23, whereas participants in the experimental group were presented with a vignette describing a secure and reliable chatbot that is more human-like and named Emma. We found that the Emma vignette did not increase participants' trust levels nor lower their privacy concerns, even though it increased perceptions of social presence. However, we found that intention to use the presented chatbot for financial support was positively influenced by perceived humanness and trust in the bot. Participants were also more willing to share financially sensitive information such as account number, sort code and payment information with XRO23 than with Emma, revealing a preference for a technical and mechanical FinBot in information sharing. Overall, this research contributes to our understanding of the intention to use chatbots with different features as financial technology, in particular that socio-emotional support may not be favoured when designed independently of financial function.
