Generative AI Revolutionizes Financial Services

Generative AI, including platforms like ChatGPT, is transforming industries by making processes simpler, more efficient and easier to interact with. However, in the heavily regulated financial services sector, the benefits come with serious risks, so it is vital that this emerging technology is used responsibly in order to maintain stability and trust.

Author

  • Emmanuel Mogaji

    Associate Professor in Marketing, Keele University

Financial services are no strangers to advances in technology, but generative AI presents an uncharted and complex landscape for the industry. Insights into its potential often come from consulting reports or academic opinion pieces, which tend to be speculative and lack real-world data.

This gap in understanding inspired my recent research. Through interviews with bank managers and industry experts, I explored the challenges and opportunities associated with integrating generative AI into financial services. My research also delves into how this transformative tech is reshaping the consumer experience.

Generative AI goes far beyond using ChatGPT to produce text or DALL-E 3 for creating images. It can be used to analyse a consumer's financial history and behaviour to tailor products such as loans, investment plans or insurance policies for them. It can also be used to make quick decisions about loan applications.

A consumer may be accustomed to using their bank's chatbot to get information about a product. Erica, Bank of America's virtual financial assistant, has facilitated more than two billion interactions with 42 million clients. (On average, Erica handles two million queries every day.)

But we don't yet know who bears responsibility for the advice and products that generative AI suggests. Does accountability lie with bank managers, leadership or the AI itself?

For example, if a consumer relies on generative AI for financial advice, it is not clear that this advice will be credible and suitable. Critics point to biases and the technology's lack of nuanced understanding and judgement.

Still, it's seen as a valuable "second pair of eyes" for wealth managers, with potential to evolve into a reliable tool for individual investors too.

AI can make wealth management more accessible and efficient. Robo-investment platforms use AI to create personalised investment strategies, managing portfolios based on goals and risk tolerance. This approach reduces costs and offers 24/7 portfolio monitoring without requiring direct human oversight.

But given the financial sector's high stakes and stringent need for accuracy, AI tools must be both reliable and precise. Yet, the question remains: can this level of trust and assurance ever be fully guaranteed?

Getting personal

Personalisation is set to become a cornerstone of financial services. My earlier research looked at how AI tools are being used to craft tailored marketing emails and advertising campaigns.

With banks now able to access diverse customer data sets and harness the creative power of AI, the potential for personalised adverts and customised financial products is immense. But at the same time, the balance between relevance and privacy becomes increasingly delicate.

And it's not just about banks and institutions using AI legitimately. Generative AI can produce misleading or even fictitious adverts, potentially ushering in an era of deepfakes that trick consumers or leave them doubting what's real.

As this landscape evolves, consumers need to stay alert and critically evaluate marketing messages. AI may make scams look more sophisticated, but the usual safeguards still apply: check that messages come from official websites, email addresses or verified accounts.

Beware of "act now" urgency tactics, poor grammar or altered URLs (like, for example, "paypa1.com" - with the digit 1 - instead of "paypal.com"). Just because an advert online has your name on it does not mean it is meant for you. It could have been generated by AI to convince you to click - with potentially disastrous consequences.

Generative AI is here to stay in every sphere of our lives. This is a new landscape for consumers, so it is vital that they pay attention to how they engage with adverts, tools and technology. While financial services are well regulated, consumers must ensure they are engaging only with the genuine tools provided by their bank.

And while ChatGPT can offer advice, its developer OpenAI won't take responsibility for the recommendations it makes. If you want to use AI, it is far better to use the chatbot provided by your bank. That way, you can be sure you are getting information from a credible source.

The regulated space for financial services providers, including their use of chatbots, places the responsibility on banks to meet legal and compliance obligations. This ensures they protect consumers, provide accurate and reliable information and remain within industry standards and regulations. The same cannot always be said for generative AI more broadly.

Regulators also have a role to play in reassuring and educating consumers about the emerging trends of generative AI. The Financial Conduct Authority and the Advertising Standards Authority must ensure there are flexible frameworks in place that can keep pace with the rapid advances in AI technology.

This will involve creating clear guidelines for the development, use and oversight of generative AI systems, balancing innovation with consumer protection.

The generative AI genie won't be going back in the bottle. It will continue to become an ever-more integral part of everyday life, so consumers need to be proactive in how they engage with this new and quickly evolving technology.

The Conversation

Emmanuel Mogaji does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
