Financial institutions are in a constant balancing act: opening up access to capital for borrowers to promote economic growth and financial inclusion, while mitigating the risk exposure that lending creates, a risk that has sometimes had disastrous implications for the global economy.
Historically, banks have taken a reactive approach to risk; reactionary measures were largely behind financial institutions’ pullback from the small business lending market following the 2008 global financial crisis, for example.
According to Trisha Price, Chief Product Officer at bank technology provider nCino, that reactive approach reflects one of the biggest challenges facing banks and their lending operations today.
“Currently, there is a reactive approach in banking because data is not readily at the fingertips of the front line making everyday risk decisions,” she told PYMNTS in a recent interview.
Loan portfolio data and access to “multidimensional analytics” about lending operations and risk exposures are essential to banks and other financial institutions in their quest to balance access to capital with risk mitigation. Yet having portfolio data at the ready, and being able to analyze that data for actionable insights, is not necessarily the norm in today’s banking system, said Price.
On top of that balancing act is the rising pressure of regulatory compliance.
Increasingly, financial institutions are struggling to manage the more complex demands of global regulators. That challenge has given rise to the RegTech market and enticed banks to turn to technology to manage their compliance needs, but there is no one-size-fits-all solution.
Data, too, is at the heart of lenders’ compliance pain points, particularly as the burden of gathering data about borrowers increases. But data aggregation efforts are guided by current regulatory requirements, and because those requirements often change, lenders can stumble.
According to Price, financial institutions must standardize their data-gathering practices across all units of their business to promote reliability of data and be able to share any necessary information with authorities. At the same time, she said, FIs have to ensure customers have a positive experience with the bank.
She pointed to the Fair Lending Act as one example of both the burden and opportunity in data when it comes to banks’ compliance needs. Gathering data from borrowers can be a key pain point both for the client and the bank, but addressing that friction can make for a better borrowing experience while promoting efficiencies and an elevated level of compliance: Sophisticated data analytics could uncover unintentional biases in a bank’s lending practices, for instance.
With open banking quickly gaining steam globally, banks’ opportunities to access more data on their clients will also grow.
“Open banking will help with analyzing exposure and performance as it allows account aggregation for transparency purposes, and for a true 360-degree view of the customer,” Price said, adding that unlocking customer data also enables a financial institution to more seamlessly aggregate information from a range of platforms and customer account products.
Once again, though, this process is about balance: promoting a painless data collection experience for bank and borrower, while also ensuring data security and privacy to comply with increasingly tough regulations.
According to Price, data is at the heart of all of these balancing acts. In pursuit of that balance, nCino announced this week its acquisition of Visible Equity, whose data analytics technology nCino will integrate to promote compliance and enhance portfolio management capabilities.
“Financial institutions are always striving to better balance the three most important pillars of profitability, growth and soundness,” she said. “In order to balance, it is important to understand the key metrics associated — such as customer profitability, loan loss forecasts including probability of default and loss given default, and optimized application analytics.”