Innovating with data to manage fraud risk in the open banking era

The evolving digital era of open banking offers many opportunities for financial institutions, their partners and their customers. But it also brings new fraud risk management challenges for retail and commercial banks.

Financial institutions’ ability to address these challenges rests on their capacity to innovate in the way they manage and use data from multiple sources, requiring a rethinking of IT architecture.

Fraud management processes for financial institutions typically focus on critical workflows such as client onboarding, authentication, transaction screening and payment processing.

Now, as the volume of electronic transactions grows across the globe, the financial services sector must deal with increasingly sophisticated fraud (particularly card fraud and remote banking fraud) and regulations that encourage competition and innovation while simultaneously countering fraudulent activity.


Regulatory challenges

Regulators everywhere are seeking to open the financial services market by enabling third parties to access customers’ financial data (with customers’ permission). At the same time, they have an obligation and mandate to safeguard the interests, and particularly the financial security, of these customers.

In the EU, for instance, the Revised Payment Services Directive (PSD2) aims to further integrate the European payments market while ensuring payments are more secure and consumers are protected. PSD2 is explicitly designed to promote innovative digital payments, through open banking, and improve the safety of cross-border European payments services.

An important component of PSD2 is a requirement for strong customer authentication (SCA) on most electronic payments.

SCA requires two-factor authentication, using at least two of the following factors: something the customer knows, such as a password or PIN; something the customer has, such as a hardware token or mobile phone; or something the customer is, verified through face recognition or a fingerprint.
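In essence, the rule is not "two credentials" but "two distinct factor categories". A minimal sketch of that check, with hypothetical function and credential names:

```python
# Illustrative sketch of the SCA two-factor rule (names are hypothetical).
# Factor categories: knowledge (something the customer knows),
# possession (something the customer has), inherence (something the customer is).

FACTOR_CATEGORY = {
    "password": "knowledge",
    "pin": "knowledge",
    "hardware_token": "possession",
    "mobile_phone": "possession",
    "fingerprint": "inherence",
    "face_recognition": "inherence",
}

def satisfies_sca(presented_credentials):
    """True if the credentials span at least two distinct factor categories."""
    categories = {FACTOR_CATEGORY[c] for c in presented_credentials
                  if c in FACTOR_CATEGORY}
    return len(categories) >= 2

print(satisfies_sca(["password", "pin"]))     # False: both are 'knowledge'
print(satisfies_sca(["pin", "fingerprint"]))  # True: knowledge + inherence
```

Note that a password plus a PIN fails the test, even though two credentials were presented, because both belong to the same category.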

This presents a challenge for banks: the added friction in the purchase experience makes it more likely that customers will abandon transactions.

The regulation does, however, include exemptions that allow certain low-fraud-risk transactions to be authorised without two-factor authentication. It is therefore in the interests of consumers, merchants and banks to maximise the number of transactions that qualify for these exemptions.

The most common exemption under PSD2 is the transaction risk analysis (TRA) exemption. A merchant can request a TRA exemption for transactions that the issuing bank or payment service provider (PSP) considers to be low risk. But the bank or PSP needs to maintain low overall fraud levels to be eligible to apply the TRA exemption.
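The eligibility logic can be sketched as follows. The helper name is hypothetical; the threshold pairs are intended to reflect the reference fraud rates for remote card payments in the PSD2 regulatory technical standards, where a PSP's rolling fraud rate determines the maximum transaction value it may exempt:

```python
# Illustrative TRA exemption check (hypothetical helper; thresholds are
# intended to mirror the PSD2 RTS reference rates for remote card payments).

# (max_transaction_value_eur, max_fraud_rate) pairs: a PSP whose rolling
# fraud rate is at or below the rate may exempt transactions up to that value.
TRA_THRESHOLDS = [
    (500, 0.0001),   # fraud rate at or below 0.01% -> exempt up to EUR 500
    (250, 0.0006),   # at or below 0.06%            -> exempt up to EUR 250
    (100, 0.0013),   # at or below 0.13%            -> exempt up to EUR 100
]

def tra_exemption_available(amount_eur, psp_fraud_rate):
    """True if the PSP may apply the TRA exemption to this transaction."""
    for max_amount, max_rate in TRA_THRESHOLDS:
        if psp_fraud_rate <= max_rate and amount_eur <= max_amount:
            return True
    return False

print(tra_exemption_available(200, 0.0005))  # True: rate band covers EUR 250
print(tra_exemption_available(200, 0.0010))  # False: rate band only covers EUR 100
```

The structure makes the commercial incentive clear: the lower a PSP keeps its overall fraud rate, the larger the share of transactions it can exempt from SCA friction.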

Banks that do not make optimum use of frictionless risk-based authentication stand to lose revenues, as customers switch to using other banks or payment methods that offer a smoother payment experience.


Systems challenges

At the same time, the evolution of open banking and increasing margin pressures mean mainstream retail and commercial banks face growing competition from emerging fintechs and neobanks.

To compete effectively in this environment, banks need to find innovative ways to manage their fraud risk while providing that smooth payment experience. This requires augmenting their simple rules-based systems with machine learning-based risk-analysis systems that draw on fraud trend and transaction data.

Machine learning (ML) can greatly reduce both false positives (legitimate transactions flagged, triggering unnecessary checks) and false negatives (fraudulent transactions missed, raising fraud levels) in risk-analysis systems.
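The trade-off between the two error types comes down to where the risk-score threshold is set. A toy illustration, using fabricated scores from a hypothetical model:

```python
# Toy illustration of the false-positive / false-negative trade-off when
# thresholding a model's risk score (all scores and labels are fabricated).

# (risk_score, is_actually_fraud) pairs from a hypothetical model
labelled = [(0.05, False), (0.20, False), (0.35, False),
            (0.60, True), (0.80, True), (0.95, True)]

def error_counts(threshold):
    """Count (false positives, false negatives) at a given score threshold."""
    fp = sum(1 for s, fraud in labelled if s >= threshold and not fraud)
    fn = sum(1 for s, fraud in labelled if s < threshold and fraud)
    return fp, fn

print(error_counts(0.10))  # low threshold: false positives -> needless friction
print(error_counts(0.90))  # high threshold: false negatives -> fraud gets through
```

A better-calibrated model separates the two score distributions more cleanly, which is what lets banks lower both error counts at once rather than simply trading one for the other.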

Financial institutions also require massive diversification in risk profiling data sources and data points, including behavioural data, purchasing history, issuing bank data, cross-merchant deny lists, transaction velocity and public records.
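In practice, this diversification means combining fields from several sources into a single risk-profiling record before scoring. A sketch of that assembly step, with entirely hypothetical field and source names:

```python
# Sketch of assembling a diversified risk-profiling feature record from
# several sources (all field names and sources here are hypothetical).

def build_risk_features(txn, customer_history, deny_lists):
    """Combine transaction, behavioural and external data into one record."""
    return {
        # transaction data
        "amount": txn["amount"],
        "merchant_category": txn["merchant_category"],
        # behavioural / purchasing-history data
        "avg_spend_30d": customer_history["avg_spend_30d"],
        "txns_last_hour": customer_history["txns_last_hour"],  # velocity
        # cross-merchant deny lists
        "card_on_deny_list": txn["card_id"] in deny_lists["cards"],
    }

features = build_risk_features(
    {"amount": 120.0, "merchant_category": "electronics", "card_id": "c42"},
    {"avg_spend_30d": 45.0, "txns_last_hour": 3},
    {"cards": {"c99"}},
)
print(features["card_on_deny_list"])  # False: c42 is not on the deny list
```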


Data-centric challenges

For banks, these new systems and capabilities mean they must be able to process increasing volumes of data at great speeds.

Their systems need to be able to quickly ingest real-time, historic and unstructured data from many sources. And they need to be able to maximise the performance of ML processes, such as model training and inference, irrespective of customers' locations, for real-time risk decisioning. At the same time, banks must simplify data governance and compliance.

That's not easy, as banks and all other enterprises face the growing challenge of 'Data Gravity': the phenomenon whereby data attracts applications and services such as analytics and ML, which in turn create even more data. As data accumulates across different systems and capture points, it becomes increasingly difficult and expensive to move. Financial institutions are particularly susceptible to this issue as they increase their use of such applications, and of data in general. According to the Data Gravity Index™, banking and financial services are projected to experience a 146% compound annual growth rate (CAGR) in data gravity intensity through 2024, followed by insurance with a 143% CAGR over the same period.

However, the Data Gravity challenge can be addressed with data-centric infrastructure that harnesses connected data communities to form a low-latency data network. In simple terms, this means replicating data across multiple data centres to maximise the speed of data exchange, regardless of geography.
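The core idea is straightforward: with replicas in several regions, each request is served from whichever copy answers fastest. A minimal sketch, with hypothetical region names and latencies:

```python
# Simple sketch of the replication idea: serve each request from the replica
# with the lowest measured latency (regions and latencies are hypothetical).

REPLICA_LATENCY_MS = {"eu-west": 4, "us-east": 85, "ap-south": 140}

def nearest_replica(latencies):
    """Pick the data centre replica with the lowest round-trip latency."""
    return min(latencies, key=latencies.get)

print(nearest_replica(REPLICA_LATENCY_MS))  # -> eu-west
```

For a European customer, the local replica turns a cross-continent round trip into a few milliseconds, which is what makes real-time risk decisioning feasible at the point of payment.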


The solution

PlatformDIGITAL offers financial institutions and their vendors the global data centre and interconnectivity platform they need to support their complete data pipeline. PlatformDIGITAL provides data-centric IT infrastructure, built on the proven Pervasive Datacenter Architecture (PDx), for scaling digital business and efficiently managing data gravity challenges. It draws on the global data centre platform of Digital Realty and Interxion: A Digital Realty Company, with more than 285 facilities in more than 20 countries across six continents.

With this global platform and scalable architecture, financial institutions can optimise data exchange at every stage, from data ingestion to ML model training and inference. What’s more, it can be done in close proximity to both data sources and users accessing that data, minimising latency.

Ultimately, this means banks can better manage their fraud risk, while providing the smooth payment experiences that customers expect.

To learn more about how Interxion can help financial services companies achieve this, see our PDx Solution Brief: Optimising Financial Services Data Exchange. It provides a codified strategy and solution approach to address industry-specific challenges, overcome data gravity barriers, unlock new growth opportunities, and more. Or contact us to see how we can help.