Ni, Y. (2024). The Impact of Explainable AI on Customer Trust and Satisfaction in Banking. Journal of Information, Technology and Policy. https://doi.org/10.62836/jitp.v1i1.165

The Impact of Explainable AI on Customer Trust and Satisfaction in Banking

This study employed a structured questionnaire to gather data from bank customers on their perceptions of explainable AI, Customer Trust (CT), and Customer Satisfaction (CS) in the banking sector. Of 180 questionnaires distributed, 169 valid responses were analyzed. The study selected indicators including Customer Age, Education Level, Risk Appetite, Previous Internet Experience, Previous AI Experience, Engagement, Post-Use Feedback, Personalized Service, Prediction Accuracy, Data Privacy, System Reliability, Service Efficiency, and Information Push to assess their impact on customer trust and satisfaction. Reliability and validity analyses confirmed the robustness of the collected data, and ordered probit analysis revealed significant effects of variables such as Customer Risk Preference, Perceived Innovation, and Perceived Accuracy on customer trust and satisfaction.
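The ordered probit estimation mentioned above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's survey data: the two predictors (labeled here as stand-ins for, e.g., risk preference and perceived accuracy), the coefficients, and the cutpoints are all assumptions chosen for demonstration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Synthetic ordinal data: a 4-point "trust" rating driven by two predictors
# (illustrative stand-ins for survey variables such as risk preference
# and perceived accuracy; not the paper's actual data).
rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))
beta_true = np.array([0.8, -0.5])
latent = X @ beta_true + rng.normal(size=n)   # latent continuous trust
cuts_true = np.array([-1.0, 0.0, 1.0])        # thresholds -> 4 categories
y = np.searchsorted(cuts_true, latent)        # observed ordinal response 0..3

def neg_loglik(params):
    """Negative log-likelihood of the ordered probit model."""
    beta = params[:2]
    # Parameterize cutpoints as a first cut plus positive increments
    # so they remain strictly increasing during optimization.
    cuts = np.cumsum(np.concatenate(([params[2]], np.exp(params[3:]))))
    xb = X @ beta
    upper = np.concatenate((cuts, [np.inf]))[y]
    lower = np.concatenate(([-np.inf], cuts))[y]
    # P(y = k | x) = Phi(cut_k - x'beta) - Phi(cut_{k-1} - x'beta)
    p = norm.cdf(upper - xb) - norm.cdf(lower - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(neg_loglik, x0=np.zeros(5), method="BFGS")
beta_hat = res.x[:2]
print("estimated coefficients:", beta_hat)
```

With enough responses, the maximum-likelihood estimates recover the data-generating coefficients; in practice one would instead feed the Likert-scale survey responses and indicator columns into such a model (or a packaged implementation such as statsmodels' `OrderedModel`).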

Keywords: explainable AI; customer trust; customer satisfaction

References

  1. van der Burgt, J. Explainable AI in banking. Journal of Digital Banking 2020, 4(4), 344–350.
  2. Kuiper, O.; van den Berg, M.; van der Burgt, J.; Leijnen, S. Exploring explainable AI in the financial sector: Perspectives of banks and supervisory authorities. In Proceedings of the Artificial Intelligence and Machine Learning: 33rd Benelux Conference on Artificial Intelligence, BNAIC/Benelearn 2021, Esch-sur-Alzette, Luxembourg, 10–12 November 2021; Revised Selected Papers 33; Springer International Publishing: Cham, Switzerland; pp. 105–119.
  3. De Lange, P.E.; Melsom, B.; Vennerød, C.B.; Westgaard, S. Explainable AI for credit assessment in banks. Journal of Risk and Financial Management 2022, 15(12), 556.
  4. Carter, S.; Hersh, J. Explainable AI helps bridge the AI skills gap: Evidence from a large bank. 2022.
  5. Hanif, A. Towards explainable artificial intelligence in banking and financial services. 2021, arXiv:2112.08441.
  6. Dikmen, M.; Burns, C. The effects of domain knowledge on trust in explainable AI and task performance: A case of peer-to-peer lending. International Journal of Human-Computer Studies 2022, 162, 102792.
  7. Fritz-Morgenthal, S.; Hein, B.; Papenbrock, J. Financial risk management and explainable, trustworthy, responsible AI. Frontiers in Artificial Intelligence 2022, 5, 779799.
  8. Navarro, C.M.; Kanellos, G.; Gottron, T. Desiderata for explainable AI in statistical production systems of the European Central Bank. In Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Bilbao, Spain, 13–17 September 2021; Springer International Publishing: Cham, Switzerland; pp. 575–590.
  9. Adams, J.; Hagras, H. A type-2 fuzzy logic approach to explainable AI for regulatory compliance, fair customer outcomes and market stability in the global financial sector. In Proceedings of the 2020 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Glasgow, UK, 19–24 July 2020; pp. 1–8.
  10. Gramespacher, T.; Posth, J.A. Employing explainable AI to optimize the return target function of a loan portfolio. Frontiers in Artificial Intelligence 2021, 4, 693022.

Supporting Agencies

Funding: Not applicable.