
The following is a sponsored post from Ted O’Connor, SVP and Head of Business Development—Sell Side, with global fintech company Arcesium.
Arcesium delivers an advanced data, operations, and analytics platform used by some of the world’s most sophisticated financial institutions, including hedge funds, banks, institutional asset managers, and private equity firms.
Every bank is in a different stage of its data journey. At the recent InvestOps Europe conference in Paris, one presenter suggested that 95% confidence in data integrity is the barometer banking leadership should hold itself to. Ninety-five percent has always been a desirable grade on a paper or in a class, but is it good enough for a multinational bank operating in dozens of jurisdictions?
Like the air we breathe, data is odorless, colorless, silent, and hard to measure. That is, until data is presented next to dollar signs on a disclosure report, balance sheet, or interminable spreadsheet; then it becomes real. The past few years have seen financial institutions grappling with suddenly ballooning volumes of financial data, not an easy ask for legacy data systems and banks that might run on scores of different systems.
The 95% confidence fallacy
While a 95% confidence interval[i] in data is the target, banks really have only 80-90% confidence in their data today. In a 2024 study of sell-side reference data operations, over 90% of respondents reported that poor data quality caused issues in clearing and settlement, risk management, and regulatory reporting, with 80% citing challenges in automated trading and market connectivity stemming from inaccurate data.[ii] Moreover, that 80-90% is a bit of an illusion. Here's the reality: say I am a bank CTO or chief data scientist, and I have 80% confidence in the data coming to me from any type of transaction. I push that data into the clearing or matching process, then into the settlement process, where cash movement goes along with it. The data keeps getting pushed from one process to the next, to the next, and the next, and a little degradation accumulates at every step. By the time I reach the end of my pipeline, I have 50% confidence in my data, and that small anomaly from the first process has become a serious data problem 10 steps later. This is a difficult problem to recognize, much less solve: it depends on the robustness of the institution's existing data and operational infrastructure, the stage of its data transformation journey, and the asset classes and structures involved.
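The compounding effect described above can be sketched with hypothetical numbers. The 95% per-step retention rate and 10-step pipeline below are illustrative assumptions chosen to show how an 80% starting confidence can erode to roughly 50%; they are not measured figures from any institution.

```python
def end_to_end_confidence(start: float, step_retention: float, steps: int) -> float:
    """Confidence remaining after data passes through a chain of processes,
    each of which preserves only `step_retention` of the incoming quality.
    Errors compound multiplicatively, so small per-step losses add up."""
    return start * step_retention ** steps

# Hypothetical: start at 80% confidence; each of 10 downstream steps
# (clearing, matching, settlement, reporting, ...) retains 95% of quality.
final = end_to_end_confidence(0.80, 0.95, 10)
print(f"{final:.0%}")  # roughly 48% -- the "50% by the end" effect
```

The takeaway is that end-to-end confidence is a product of per-step rates, not an average of them, which is why a seemingly tolerable per-process error rate can still leave the final report untrustworthy.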
Meanwhile, the risk of getting it wrong is high. On the undesirable end of the 95% spectrum, Citi shelled out about a billion dollars in fines over the last five years for irregularities in its regulatory reporting data and for governance failures, and responded by spending millions modernizing its technology.[iii] Deutsche Bank, Wells Fargo, and Mitsubishi Bank are among the institutions that have worked through confidential supervisory findings known as Matters Requiring Attention (MRAs) and Matters Requiring Immediate Attention (MRIAs), many of them rooted in data processes. In this context, even 95% (and even if it were a true 95%) isn't enough for global banks. UBS, for instance, has a balance sheet larger than the Swiss economy, and a Swiss bailout of such a bank would be challenging. The risk needs to be near-zero, which means confidence needs to be near-perfect.
Is AI the key?
AI has lit a fire in the bellies of buy-side and sell-side institutions alike, as they know their data house must be in order for the AI house to be in order. According to Deloitte, “Banks’ AI readiness is often slowed by the data foundations that models depend on. Poor infrastructure can result in data sprawl, vulnerability, and limited data-led innovation, limiting model efficacy.”[iv] But once a bank has its AI game in place, AI can play a pivotal role in bringing order to the data chaos. AI agents are already helping with several data quality management functions. For example, one financial institution recently leveraged generative AI to automate data lineage capture and metadata generation, achieving 40% to 70% productivity gains in specific tasks.[v]
AI offers ready assistance with unstructured data in particular. If managing structured data is like sorting pre-labeled packages, managing unstructured data with AI is like instantly reading thousands of handwritten letters, identifying the key facts in each one, and organizing those facts into a searchable spreadsheet, a task impossible for humans at scale. But, again, the art of the possible with AI comes back to data quality; it will require institutions to centralize their data management capabilities, with an emphasis on tools that support strong data lineage and reporting accuracy.
The 100% data confidence paradigm
Having a 95% data confidence barometer presents several pitfalls when executing tech transformations. Regulatory considerations, data governance challenges (especially with unstructured data), surging market volumes, private credit, and the adoption of AI in the financial services industry are forces that cannot be ignored. Realistically, banking leaders need to keep their eyes on the 100% prize for quality data management.[vi] Everybody under the roof will do a better job if they trust that the information they do their jobs with is reliable, timely, and precise.
[i] Investopedia, May 6, 2025. https://www.investopedia.com/terms/c/confidenceinterval.asp#toc-explain-like-im-five
[ii] Acuity Knowledge Partners, November 2024. https://assets.ctfassets.net/cy2jgjrgaerj/5V6yrRfzYZU1LXqUgvulAD/ed8d59627717a3fafe96f36123d36e8e/increasing-efficiency-in-sell-side-reference-data-management-fow.pdf
[iii] Banking Dive, July 11, 2024. https://www.bankingdive.com/news/citi-occ-fed-135-million-penalties-2020-orders-data-quality-risk-management-control-fraser-hsu/721061/
[iv] Deloitte, October 30, 2025. https://www.deloitte.com/us/en/insights/industry/financial-services/financial-services-industry-outlooks/banking-industry-outlook.html
[v] BCG, May 6, 2025. https://www.bcg.com/publications/2025/tech-banking-transformation-starts-with-smarter-tech-investment
[vi] Arcesium, February 2, 2026. https://www.arcesium.com/resources/driving-trusted-data-framework-for-banks?utm_source=one-off&utm_medium=display&utm_campaign=MC-2026-Q1_SS-Data-Quality-To-Do-List&utm_content=finovate-sponsored-article