Commonwealth Bank (CBA) is one of the first banks in Australia to use artificial intelligence (AI) and machine learning techniques to detect abusive behaviour in transaction descriptions within the CommBank App and NetBank.
The new model was developed in the CBA AI Labs, and for the first time allows the Bank to proactively identify instances of technology-facilitated abuse, a targeted form of domestic and family violence. The AI model complements the Bank's automatic block filter that was implemented last year across its digital banking channels to stop transaction descriptions that include threatening, harassing or abusive language.
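Conceptually, this is a two-layer defence: a rule-based filter that blocks descriptions containing known offensive terms outright, and a machine learning model that scores the remaining descriptions so likely abuse can be surfaced proactively. The sketch below is purely illustrative and is not CBA's implementation; it assumes a placeholder block list and a hypothetical `abuse_model` object that returns an abuse probability for a piece of text.

```python
# Illustrative sketch only, not CBA's implementation.
# Assumes a simple keyword block list (layer 1) and a pre-trained text
# classifier exposing a hypothetical predict_proba(text) method (layer 2).

BLOCKED_TERMS = {"exampleslur", "examplethreat"}  # placeholder terms

def screen_description(description: str, abuse_model, threshold: float = 0.9):
    """Return (blocked, flagged_for_review) for a transaction description."""
    text = description.lower()

    # Layer 1: automatic block filter. Descriptions containing known
    # offensive terms are stopped before the payment is sent.
    if any(term in text for term in BLOCKED_TERMS):
        return True, False

    # Layer 2: machine learning model. Score the description and flag
    # likely abuse that slips past the keyword filter for follow-up.
    score = abuse_model.predict_proba(description)
    return False, score >= threshold
```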
Justin Tsuei, General Manager Community and Customer Vulnerability, said: "Technology-facilitated abuse is a serious problem, and completely unacceptable behaviour. We want to ensure our customers feel safe when they are using our platforms, and it's our responsibility to do everything we can to provide the right measures of protection across our channels.
"The new model, which uses advanced AI and machine learning techniques, allows us to provide a more targeted and proactive response than ever before. It builds on the work we have already done to fortify our digital channels from being used to perpetrate technology-facilitated abuse, including updating our Acceptable Use Policy and implementing an automatic block on offensive language being used in transaction descriptions."
The new model reflects the Bank's increased focus on using AI to deliver greater innovation across its digital channels.
"As Australia's leading digital bank, we are continuously looking for new and better ways to improve our products, channels and services. The use of AI technology and machine learning techniques to help us address a serious issue like technology-facilitated abuse demonstrates how we can use innovative technology to create a safer banking experience for all customers, especially for those in vulnerable circumstances like victim-survivors of domestic and family violence.
"With this new model in place, not only are we able to proactively detect possible instances of abuse in transaction descriptions, but we can do so at an incredible scale," said Mr Tsuei.
Over a three-month period, from 1 May to 31 July 2021, more than 100,000 transactions were blocked by the automatic filter that prevents offensive language from being used in transaction descriptions on the CommBank App and NetBank. Among those instances, the new AI model detected 229 unique senders of potentially serious abuse, each of which was then manually reviewed to determine severity and the appropriate action required from the Bank.
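As a rough illustration of this kind of triage, and not a description of CBA's actual pipeline, the sketch below assumes flagged transactions arrive as simple (sender, description, score) records and groups high-scoring ones by unique sender, so that a human reviewer can assess the severity of each case and decide on a response.

```python
from collections import defaultdict

# Illustrative sketch only, not CBA's pipeline. Input records and field
# names here are assumptions made for the example.

def triage_by_sender(flagged_transactions, min_score: float = 0.9):
    """Group high-scoring flagged transactions by unique sender for manual review."""
    cases = defaultdict(list)
    for sender_id, description, score in flagged_transactions:
        if score >= min_score:
            cases[sender_id].append((description, score))
    # Each entry is one sender's case file; severity and the appropriate
    # response are decided by a human reviewer, not by the model.
    return dict(cases)

# Example usage with made-up data:
sample = [
    ("sender-001", "threatening message in description", 0.97),
    ("sender-001", "another abusive description", 0.93),
    ("sender-002", "benign note", 0.12),
]
print(triage_by_sender(sample))  # {'sender-001': [(..., 0.97), (..., 0.93)]}
```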
The Bank has introduced a range of interventions to support customers who are impacted by domestic and family violence, and who are recipients of technology-facilitated abuse. Depending on the severity of the abuse, interventions include:
- de-linking the victim-survivor's bank account from PayID so that the perpetrator can no longer use the victim-survivor's email address, mobile number or ABN to send them abusive transactions;
- setting up new safe accounts for victim-survivors;
- referring victim-survivors to external support organisations;
- sending warning letters to perpetrators; and
- introducing a process where, in extreme cases, the Bank will terminate a customer's banking relationship if they continue to breach the Acceptable Use Policy.