Studies on RA adoption have found that certain groups, distinguished by demographic characteristics such as age, income, and education, are more resistant to adoption, largely because of a trust barrier. The Reference Study extended this literature by examining these groups specifically. The results showed that the use of VR enabled lower-income and less financially literate users to develop trust through emotive and psychological drivers, and significantly influenced their intention to sign up. This is an important first step towards improving their financial inclusion.
Current research on RA has identified the lack of human interaction as a key impediment to RA adoption. Accordingly, the Reference Study sought to explore further applications of anthropomorphism, building on recent research in this area. Anthropomorphism has several major dimensions, broadly grouped under the categories of “form” (likeness to a human) and “function” (acting like a human) [20]. Previous studies have explored particular dimensions, for example “naming” [29], showing a named visual avatar [2], or providing chatbot interaction [28]. The Reference Study followed a similar trajectory but intensified the anthropomorphic mechanism in both form and function. By placing VR characters of finance professionals within a virtual bank setting, it sought to create a digital twin of a real-life human advisor. The VR Avatar had a comprehensive visual representation that was animated on demand (satisfying “form”) and was able to “advise” the user (satisfying “function”). The VR characters were also designed to generate “likeability” and “perceived intelligence”, which previous studies had found to be key mediators of anthropomorphism’s effect on usage intentions [11].
Furthermore, the Reference Study showed that the significant impact was driven in particular by the development of trust. As such, it validates earlier literature finding that VR is able to develop some of the main trust constructs [49]. Users felt that the VR characters were devoted and helpful (“benevolence”), responsive and informative (“competence”), transparent (“integrity”), and consistent (“predictability”) [49]. More importantly, the intervention had a significant effect on traditionally resistant groups such as the lower-income, lower-educated, and elderly. This suggests that such trust development may be particularly helpful in contexts of “vulnerability” or lower financial and digital literacy [38].
Quantitative and qualitative results also showed that the emotional experience was largely positive, suggesting that this was due to the “Immersive”, “Interactive”, and “Imaginative” (“I3”) aspects of the VR experience [31][32]. In line with earlier literature on the hedonic element of RA usage, the enhanced user experience appears to have become an intrinsic motivator [30].
Another finding was that the use of VR was not effective on the cognitive engagement aspect (for example, “usefulness in meeting financial objectives”), even though it had a significant impact on users’ intention to sign up. This finding has several implications. First, it challenges the strand of literature that has employed primarily cognitive or utilitarian models of RA adoption, such as the TAM models. Second, it may imply that the anthropomorphism of RA systems is more effective on the emotive and psychological aspects of trust than on the cognitive aspects. A potential inference is that distinct demographic cohorts may display contrasting behavior and necessitate separate adoption models, which calls into question the effectiveness of conducting research from a standardized user perspective. Further studies can be carried out to explore this line of reasoning.
Current RA literature has also tended towards a static characterization of the relationship between user demographics, attributes, and behavior: for example, younger demographics tend to have certain attributes that lead them to behave in a certain way. The application of anthropomorphic techniques may challenge some of these assumptions, because it may radically change the relationship between user demographics and adoption. For example, elderly users generally hold more negative attitudes towards technology use but are more sensitive to the influence of anthropomorphism [11], which may change their attitudes and behavior significantly. In this case, anthropomorphism can drive a clear wedge between demographics and behavior, such that these variables should not be presumed to move in lockstep.
Indeed, the results of the Reference Study support this theory: older age groups were overwhelmingly receptive to the use of VR. It is proposed that older groups may suffer from multiple, inter-related emotional or psychological resistance factors. They may have trust issues (negative attitudes towards robots, NARS) as well as competency issues (low digital literacy), leading to computer anxiety [11]. Because the VR characters actively “pushed” information to users and simplified the navigation process, they may have removed several of these attitudinal obstacles. Future studies could undertake in-depth investigation of the specific resistance and adoption drivers within each demographic group, rather than ascribing behavior solely on the basis of demographics.
With regard to practical implications, the Reference Study suggests that RAs may need to employ clearly differentiated strategies when targeting different consumer segments. RA providers may wish to consider different system designs catering to specific types of users, along dimensions such as socio-economic variables, digital literacy, and competence levels. This accords with previous literature, which contends that many current RA platforms are overly simplistic and generalized. The Reference Study also implies that certain user groups may require more financial literacy education than anticipated for their specific needs, in line with much of the current literature, which finds inadequate financial literacy to be a pressing concern. RA systems therefore need to prioritize providing more educational content and delivering it effectively.
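As a purely illustrative sketch of such segment-differentiated design, the mapping from user attributes to interface configuration could be made explicit in code. All attribute names, thresholds, and configuration keys below are invented for illustration and are not taken from the Reference Study or any specific RA platform.

```python
# Hypothetical sketch: choose an RA interface configuration from coarse
# user attributes. The segmentation rule (low digital literacy or age 65+
# gets an avatar-guided, simplified flow) is an assumption for illustration.
def select_design(age: int, digital_literacy: str) -> dict:
    if digital_literacy == "low" or age >= 65:
        return {
            "mode": "avatar_guided",      # VR character leads the session
            "education": "embedded",      # literacy content pushed proactively
            "steps": "simplified",        # fewer, larger navigation steps
        }
    return {
        "mode": "self_service",           # conventional dashboard flow
        "education": "on_demand",         # literacy content behind links
        "steps": "full",
    }

config = select_design(age=70, digital_literacy="high")
```

Even a rule this simple makes the design choice auditable: each segment's configuration is a named, testable artifact rather than an implicit assumption baked into a single generalized interface.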
The current RA system design literature has produced many useful findings on how to improve system design and architecture, but its approach can be characterized as somewhat incrementalist. The Reference Study suggests that intense anthropomorphism may radically change user attitudes. It sets the stage for future studies to explore transformative models of financial advisory, especially in this era of abundant AI technologies. For example, future studies or commercial applications could explore the use of a full Metaverse setting in which VR Avatars provide real-time financial literacy education and interact actively with users.
Alternatively, RA platforms could integrate generative AI chatbots (such as ChatGPT) with the VR system, enabling users to receive comprehensive financial knowledge on demand. Other design ideas include personalizing the VR characters to mimic the user’s own demographic characteristics and communication style, to enhance engagement.
[1] Abraham, F., Schmukler, S. L., & Tessada, J. (2019), “Robo-advisors: Investing through machines,” World Bank Research and Policy Briefs, (134881).
[2] Adam, M., Toutaoui, J., Pfeuffer, N., & Hinz, O. (2019), “Investment decisions with robo-advisors: the role of anthropomorphism and personalized anchors in recommendations,” in Proceedings of the 27th European Conference on Information Systems (ECIS), Stockholm & Uppsala, Sweden, June 8–14, 2019.
[3] Baker, H. K., & Ricciardi, V. (2014), “How biases affect investor behaviour,” The European Financial Review, 7-10.
[4] Barber, B. M., Lee, Y. T., Liu, Y. J., & Odean, T. (2009), “Just how much do individual investors lose by trading?” The Review of Financial Studies, 22(2), 609-632.
[5] Bauman, A., & Bachmann, R. (2017), “Online consumer trust: Trends in research,” Journal of Technology Management & Innovation, 12(2), 68-79.
[6] Beckett, A., Hewer, P., & Howcroft, B. (2000), “An exposition of consumer behaviour in the financial services industry,” International Journal of Bank Marketing, 18(1), 15-26.
[7] Belanche, D., Casaló, L. V., & Flavián, C. (2019), “Artificial Intelligence in FinTech: understanding robo-advisors adoption among customers,” Industrial Management & Data Systems, 119(7), 1411-1430.
[8] Benartzi, S., & Thaler, R. H. (2007), “Heuristics and biases in retirement savings behavior,” Journal of Economic Perspectives, 21(3), 81-104.
[9] Beyer, M., de Meza, D., & Reyniers, D. (2013), “Do financial advisor commissions distort client choice?” Economics Letters, 119(2), 117-119.
[10] Bianchi, M., & Brière, M. (2021), “Augmenting Investment Decisions with Robo-Advice,” Université Paris-Dauphine Research Paper, 3751620.
[11] Blut, M., Wang, C., Wünderlich, N. V., & Brock, C. (2021), “Understanding anthropomorphism in service provision: a meta-analysis of physical robots, chatbots, and other AI,” Journal of the Academy of Marketing Science, 49, 632-658.
[12] Chalmers, J., & Reuter, J. (2010), “What is the Impact of Financial Advisors on Retirement Portfolio Choices & Outcomes?” National Bureau of Economic Research, no. onb10-05.
[13] Chang, A. (2012), “UTAUT and UTAUT 2: A review and agenda for future research,” The Winners, 13(2), 10-114.
[14] Cheng, X., Guo, F., Chen, J., Li, K., Zhang, Y., & Gao, P. (2019), “Exploring the trust influencing mechanism of robo-advisor service: A mixed method approach,” Sustainability, 11(18), 4917.
[15] Cheng, Y. M. (2020), “Will robo-advisors continue? Roles of task-technology fit, network externalities, gratifications and flow experience in facilitating continuance intention,” Kybernetes, 50(6), 1751-1783.
[16] Christoffersen, S. E., Evans, R., & Musto, D. K. (2013), “What do consumers’ fund flows maximize? Evidence from their brokers’ incentives,” The Journal of Finance, 68(1), 201-235.
[17] D’Acunto, F., Prabhala, N., & Rossi, A. G. (2019), “The promises and pitfalls of robo-advising,” The Review of Financial Studies, 32(5), 1983-2020.
[18] Davis, F. (1989), “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly, 13, 319-340.
[19] De Graaf, M. M., & Allouch, S. B. (2013), “Exploring influencing variables for the acceptance of social robots,” Robotics and Autonomous Systems, 61(12), 1476-1486.
[20] Duffy, B. R. (2003), “Anthropomorphism and the social robot,” Robotics and Autonomous Systems, 42(3-4), 177-190.
[21] Fahruri, A., Hamsal, M., Furinto, A., & Kartono, R. (2022), “Conceptual Model of Technology Acceptance Model Modification on Robo Advisor Acceptance in Indonesia,” Journal of International Conference Proceedings (JICP), vol. 5, no. 1, pp. 467-477.
[22] Fama, E. F. (1970), “Efficient Capital Markets: A Review of Theory and Empirical Work,” Journal of Finance, vol. 25, pp. 383-417.
[23] Fan, L., & Swarn, C. (2020), “The utilization of robo-advisors by individual investors: An analysis using diffusion of innovation and information search frameworks,” Journal of Financial Counseling and Planning, 31(1), 130-145.
[24] Figà-Talamanca, G., Tanzi, P. M., & D’Urzo, E. (2022), “Robo-advisor acceptance: Do gender and generation matter?” PloS One, 17(6), e0269454.
[25] Fisch, J. E., Labouré, M., & Turner, J. A. (2019), “The Emergence of the Robo-advisor,” The Disruptive Impact of FinTech on Retirement Systems, 13.
[26] Gomber, P., Koch, J., & Siering, M. (2017), “Digital finance and FinTech: Current research and future research directions,” Zeitschrift für Betriebswirtschaft, 87(5), 537-580.
[27] Hayes, A. S. (2021), “The active construction of passive investors: roboadvisors and algorithmic ‘low-finance,’” Socio-Economic Review, 19(1), 83-110.
[28] Hildebrand, C., & Bergner, A. (2021), “Conversational robo advisors as surrogates of trust: onboarding experience, firm perception, and consumer financial decision making,” Journal of the Academy of Marketing Science, 49, 659-676.
[29] Hodge, F. D., Mendoza, K. I., & Sinha, R. K. (2021), “The effect of humanizing robo‐advisors on investor judgments,” Contemporary Accounting Research, 38(1), 770-792.
[30] Hohenberger, C., Lee, C., & Coughlin, J. F. (2019), “Acceptance of robo‐advisors: Effects of financial experience, affective reactions, and self‐enhancement motives,” Financial Planning Review, 2(2), e1047.
[31] Huang, H. M., Liaw, S. S., & Lai, C. M. (2016), “Exploring learner acceptance of the use of virtual reality in medical education: a case study of desktop and projection-based display systems,” Interactive Learning Environments, 24(1), 3-19.
[32] Hudson, S., Matson-Barkat, S., Pallamin, N., & Jegou, G. (2019), “With or without you? Interaction and immersion in a virtual reality experience,” Journal of Business Research, 100, 459-468.
[33] Jung, D., Dorner, V., Weinhardt, C., & Pusmaz, H. (2017), “Designing a robo-advisor for risk-averse, low-budget consumers,” Electronic Markets, 28(3), 367-380.
[34] Jung, D., Erdfelder, E., & Glaser, F. (2018), “Nudged to win: Designing robo-advisory to overcome decision inertia,” in Proceedings of the 26th European Conference on Information Systems (ECIS 2018), Portsmouth, UK.
[35] Kilic, M., Heinrich, P., & Schwabe, G. (2015), “Coercing into completeness in financial advisory service encounters,” in Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp. 1324-1335.
[36] Kitces, M. (2017), “Adopting a Two-Dimensional Risk-Tolerance Assessment Process,” The Nerd’s Eye View.
[37] Klass, J. L., & Perelman, E. (2016), The Evolution of Advice: Digital Investment Advisers as Fiduciaries, Morgan Lewis, New York.
[38] Koehn, D. (2003), “The nature of and conditions for online trust,” Journal of Business Ethics, 43, 3-19.
[39] Kostovetsky, L. (2016), “Whom do you trust?: Investor-advisor relationships and mutual fund flows,” The Review of Financial Studies, 29(4), 898-936.
[40] Lourenço, C. J., Dellaert, B. G., & Donkers, B. (2020), “Whose algorithm says so: The relationships between type of firm, perceptions of trust and expertise, and the acceptance of financial robo-advice,” Journal of Interactive Marketing, 49, 107-124.
[41] Maume, P. (2019), “Regulating robo-advisory,” Texas International Law Journal, 55(1), 49-87.
[42] McKnight, D. H., & Chervany, N. L. (2001), “What trust means in e-commerce customer relationships: An interdisciplinary conceptual typology,” International Journal of Electronic Commerce, 6(2), 35-59.
[43] McKnight, D. H., Carter, M., Thatcher, J. B., & Clay, P. F. (2011), “Trust in a specific technology: An investigation of its components and measures,” ACM Transactions on Management Information Systems (TMIS), 2(2), 1-25.
[44] Mesbah, N., Tauchert, C., Olt, C. M., & Buxmann, P. (2019), “Promoting trust in AI-based expert systems,” in 25th Americas Conference on Information Systems (AMCIS) 2019, Cancun, Mexico.
[45] Niszczota, P., & Kaszás, D. (2020), “Robo-investment aversion,” PLOS ONE, 15(9), 1-19.
[46] Nomura, T., Kanda, T., Suzuki, T., & Kato, K. (2004), “Psychology in human-robot communication: An attempt through investigation of negative attitudes and anxiety toward robots,” in RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication, pp. 35-40.
[47] Nourallah, M., Öhman, P., & Amin, M. (2022), “No trust, no use: how young retail investors build initial trust in financial robo-advisors,” Journal of Financial Reporting and Accounting, 21(1), 60-82.
[48] Ostern, N., Schöler, J., & Moormann, J. (2020), “Toward voice-enabled robotic advisory for personalized wealth management,” in Proceedings of the 26th Americas Conference on Information Systems (AMCIS), 2020.
[49] Papadopoulou, P. (2007), “Applying virtual reality for trust-building e-commerce environments,” Virtual Reality, 11(2), 107-127.
[50] Philippon, T. (2019), “On fintech and financial inclusion,” National Bureau of Economic Research, no. w26330.
[51] Phoon, K. F., & Koh, F. (2018), “Robo-Advisors and Wealth Management,” The Journal of Alternative Investments, Winter 2018, pp. 79-95.
[52] Ruf, C., Back, A., Bergmann, R., & Schlegel, M. (2015), “Elicitation of requirements for the design of mobile financial advisory services: instantiation and validation of the requirement data model with a multi-method approach,” in 2015 48th Hawaii International Conference on System Sciences, pp. 1169-1178.
[53] Rühr, A. (2020), “Robo-advisor configuration: an investigation of user preferences and the performance-control dilemma,” in 28th European Conference on Information Systems (ECIS), June 15–17, 2020.
[54] Salo, A. (2017), “Robo advisor, your reliable partner? Building a trustworthy digital investment management service,” Master’s thesis, University of Tampere, Finland.
[55] Sironi, P. (2016), FinTech Innovation: From Robo-Advisory to Goal Based Investing and Gamification, John Wiley & Sons.
[56] Tan, G. K. S. (2020), “Robo-advisors and the financialization of lay investors,” Geoforum, 117, 46-60.
[57] Torno, A., Metzler, D. R., & Torno, V. (2021), “Robo-What?, Robo-Why?, Robo-How? – A Systematic Literature Review of Robo-Advice,” in Pacific Asia Conference on Information Systems (PACIS), 92.
[58] Uhl, M. W., & Rohne, P. R. (2018), “Robo-Advisors versus Traditional Investment Advisors: An Unequal Game,” The Journal of Wealth Management, Summer 2018, pp. 44-50.
[59] Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003), “User Acceptance of Information Technology: Toward a Unified View,” MIS Quarterly, 27(3), pp. 425-478.
[60] Waliszewski, K., & Warchlewska, A. (2020), “Attitudes towards artificial intelligence in the area of personal financial planning: a case study of selected countries,” Entrepreneurship and Sustainability Issues, 8(2), 399.
[61] Woodyard, A. S., & Grable, J. E. (2018), “Insights into the Users of Robo-Advisory Firms,” Journal of Financial Service Professionals, Sep 2018, vol. 72, no. 5, pp. 56-66.
|
|
|