Today’s data-driven digital landscape is transforming and optimizing business processes in ways that were previously unimaginable.
Top executives at Alphabet, Microsoft, Bank of America (BoA) and other companies devoted significant time during their earnings calls this week to highlighting the internal impact of the generative artificial intelligence (AI) tools that have taken the marketplace by storm in recent months.
Alphabet and Google Chief Financial Officer Ruth Porat said the company uses AI “across almost every internal financial task.”
Microsoft Chairman and CEO Satya Nadella recited a litany of use cases and partnerships illustrating how the computing giant is bringing advanced generative AI to clients, emphasizing that he looks “forward to continuing this journey in what is a generational shift.”
For its part, Bank of America underscored that it has been using its own virtual AI assistant, Erica, internally, capturing “the extreme benefits it provides in allowing teammates to work much more quickly and efficiently within our own systems.”
Broadly speaking, AI can help enhance three core organizational needs. The first is automating back-office and financial business processes and activities; the second is providing leaders with real-time insights through data analysis; and the third is offering new relational touchpoints for engaging with both customers and staff.
The countless terabytes of data collected today are what allow organizations to visualize and automate relationships across different locations, departments and systems — and data rests at the heart of the generative AI tools and capabilities that represent the next wave of economic innovation.
But not all data is created equal, and not all data is activated effectively across enterprise systems.
Patrick Murphy, founder and CEO of construction technology company Togal.AI, told PYMNTS earlier this month that his company has “spent the last three years labeling hundreds of thousands of plans and literally millions of objects on blueprints in a very consistent, very accurate manner so our algorithms can read and make sense of them — something critical to putting us in a unique position to leverage this new information.”
PYMNTS has for years been tracking the modernization of payments systems and capabilities as businesses move away from legacy, often manual back-end processes.
This ongoing digitization has given enterprises a treasure trove of data to leverage, but it has also produced redundant records and mountains of information that are difficult to sift through.
Fortunately, the deep learning capabilities of modern AI have reached a technical tipping point where today’s algorithms can transform these large datasets into insights that enhance business processes.
“If you think about it from the supplier standpoint, the organizations receiving the payments are probably the most data-hungry and get the most value from data,” Zachary Lynn, head of revenue operations at Boost Payment Solutions, told PYMNTS Monday (April 24).
Common examples include automating billing and accounting reconciliations, extracting information from legal and contractual documents using natural language processing (NLP), and updating customer relationship management (CRM) and enterprise resource planning (ERP) systems in real time without the need for manual intervention.
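The reconciliation use case above can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s actual implementation; the record structures and field names (`ref`, `amount`) are assumptions chosen for clarity. The core idea is matching incoming payments to open invoices by a shared reference, then routing amount mismatches to human review instead of silently accepting them.

```python
# Illustrative sketch of automated billing reconciliation: match incoming
# payments to open invoices by reference number, flagging amount mismatches.
# All record structures and field names here are hypothetical.

def reconcile(invoices, payments):
    """Return (matched, unmatched_payments, amount_mismatches)."""
    open_invoices = {inv["ref"]: inv for inv in invoices}
    matched, unmatched, mismatches = [], [], []
    for pay in payments:
        inv = open_invoices.get(pay["ref"])
        if inv is None:
            unmatched.append(pay)          # no invoice with this reference
        elif inv["amount"] != pay["amount"]:
            mismatches.append((inv, pay))  # needs human review
        else:
            matched.append((inv, pay))     # fully reconciled
    return matched, unmatched, mismatches

invoices = [{"ref": "INV-001", "amount": 1200.00},
            {"ref": "INV-002", "amount": 450.00}]
payments = [{"ref": "INV-001", "amount": 1200.00},
            {"ref": "INV-003", "amount": 75.00}]

matched, unmatched, mismatches = reconcile(invoices, payments)
```

In practice the matching logic is far fuzzier (partial payments, batched remittances, OCR-extracted references), which is where ML models add value over exact-match rules like these.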
Modern solution providers like Auditoria.AI have already started adding generative AI to their products in order to automate finance workflows and engage with customers, suppliers, vendors and stakeholders through conversational email.
That’s because by using AI and machine learning (ML) technologies, organizations can connect historically disparate and fragmented data to get a more unified picture of their operations, as well as identify previously obscured opportunity areas.
“Data is foundational to building the models, training the AI — the quality and integrity of that data is important,” Michael Haney, head of Cyberbank Digital Core at FinTech platform Galileo, the sister company of Technisys, told PYMNTS in March.
To train an AI model to perform to the necessary standard, many enterprises are relying solely on their own internal data to avoid compromising model outputs.
For example, Bank of America told investors on its April 18 earnings call that its internal AI tools use a predictive language program built specifically on BoA’s own business data sets, keeping the models and solutions captive to company data while allowing employees to surface necessary information in real time.
“There is a lot of value [around generative AI capabilities], but the key question is when can we use it without the fear of bias and where this information is coming from,” BoA’s CEO Brian Moynihan said at the time. “We need to understand how the AI-driven decisions are made…”
The ability to gain key benefits from AI implementations depends almost entirely on an organization’s data preparedness, as these processes may require a modernized infrastructure.
PYMNTS’ latest research collaboration with Banyan, “Meeting the Need for Item-Level Receipt Data,” revealed that the degree to which an organization has modernized its operational infrastructure may pre-determine both its adoption of innovative, future-fit solutions and the effectiveness of those solutions.
“The data that exists within companies is at the heart of everything that drives better decision-making,” Emburse CFO Adriana Carpenter said in a December interview with PYMNTS. “Historically, and very commonly, companies do not have a data strategy or data governance models to harness that data and to be able to layer on the tech that’s available to drive better decision-making.”
For all PYMNTS B2B coverage, subscribe to the daily B2B Newsletter.