Data management is evolving into a critical business function, and properly identifying financial instruments is now a top priority for data and risk management professionals. Financial reforms, from the Basel Committee on Banking Supervision (BCBS 239) to the G20 recommendations, now require company officers at financial institutions to attest to the quality and accuracy of their data, portfolio and fund valuations, due diligence around pricing and order handling, and the reliability of risk models.
It can be difficult, if not impossible, to meet these compliance obligations and fiduciary responsibilities without effective data management processes in place. Data management starts with the ability to trace data lineage – that is, the ultimate origin of, and history behind, data. To accomplish that, you need a reliable marker – a unique identifier – that links data as it evolves from beginning to end and never changes or expires. This establishes a reliable framework to distinguish between two similar corporate bonds with different issuers, or between dozens of swap contracts with like terms but based on slightly different curves.
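The idea of identifier-keyed lineage can be sketched in a few lines. This is a minimal illustration, not any vendor's actual data model; the identifiers and instrument details below are made up for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InstrumentRecord:
    identifier: str   # stable key that never changes or expires
    issuer: str
    coupon: float
    maturity: str

# Two similar corporate bonds with identical terms but different issuers,
# unambiguously distinguished by their identifiers (placeholder values).
master = {
    "ID000000001": InstrumentRecord("ID000000001", "Issuer A", 4.25, "2030-06-15"),
    "ID000000002": InstrumentRecord("ID000000002", "Issuer B", 4.25, "2030-06-15"),
}

# Lineage: every downstream event (a price, a valuation, a trade) carries
# the same key, so its history traces back to a single origin record.
lineage = []

def record_event(identifier, event, payload):
    assert identifier in master, "unknown instrument"
    lineage.append((identifier, event, payload))

record_event("ID000000001", "price", 101.32)
record_event("ID000000002", "price", 99.87)

# Reconstructing the history of one bond is a simple filter on the key.
history = [e for e in lineage if e[0] == "ID000000001"]
```

Because every event is tagged with the same immutable key, the two look-alike bonds can never be conflated downstream, which is the property the lineage argument above depends on.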
Managing data quality and tracking data lineage is a challenge for professionals across any financial organization. Everyone from the trading desk to risk management to operations and senior management needs the same data to make critical business decisions daily. Unfortunately, they often get different answers when performing similar calculations because they aren't using equivalent data. The data in use may, on its face, appear to represent the same thing, but it may come from different sources, have broken lineage, or exist in different contexts.
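This "same calculation, different answer" problem only becomes visible when the competing records share a common key. The sketch below, with invented identifiers and prices, shows how joining two internal sources on a shared identifier surfaces a discrepancy that would otherwise go unnoticed.

```python
# Two internal "masters" pricing what appears to be the same instrument.
# Identifiers and values are illustrative placeholders.
desk_prices = {"ID000000001": 101.32, "ID000000002": 99.87}
risk_prices = {"ID000000001": 101.32, "ID000000002": 99.10}

def find_discrepancies(a, b, tol=0.01):
    """Return instruments priced differently by the two sources."""
    common = a.keys() & b.keys()
    return {k: (a[k], b[k]) for k in common if abs(a[k] - b[k]) > tol}

diffs = find_discrepancies(desk_prices, risk_prices)
# The second bond is flagged: same instrument key, different source values.
```

Without a shared identifier the join itself is the hard part – firms fall back on manual mapping tables, which is exactly the cross-referencing overhead the Tabb findings below describe.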
A recent Tabb Group report, “Building a Framework for Innovation and Interoperability,” found that over 53 percent of firms operate with more than one security master, and that nearly 25 percent of asset servicers use more than ten “masters.” Surveyed firms report that this translates directly into poor data quality (95 percent), trade errors and operational issues (78 percent), and an overall lack of interoperability (59 percent).
The lack of a standard framework for reconciling data sets across business functions or asset classes allows poor data quality to persist, disrupting data lineage and hampering efforts to improve data management controls.
To address this problem, some financial and data professionals are considering how an instrument identification framework can support the data management process.
To that end, more than 76 percent of firms surveyed by Tabb called for an instrument identification framework that uses open and freely distributable identifiers. Almost a quarter of asset management firms surveyed said they were embracing the Financial Instrument Global Identifier (FIGI) expressly to address data quality and operational reconciliation issues.
Senior Tabb Group analyst Dayle Scher said her research revealed that an open framework for identifying securities could result in cost savings from enhanced data quality because “operational efficiencies can be achieved through the reduction of manual mapping and cross-referencing activities.”
The Tabb report's results illustrate that the industry wants a new and different solution to these persistent, deeply embedded problems of security identification. There is also near-uniform support for an open standard with the same qualities as the FIGI framework.
Tabb identified lack of vision, legacy infrastructure, and embedded legacy datasets as the primary barriers to positive change. But change is coming, as evidenced by clear uptake of FIGI as a solution, especially among investment managers and hedge funds, traditionally the more innovative and forward-looking sectors.
Lingering misunderstandings of FIGI notwithstanding, the report shines a light on the fact that FIGI has a role to play in the future of data management and of financial services – from blockchain to the metadata management of the many disparate information sources that will continue to exist, and likely grow.
While firms must meet regulatory requirements such as BCBS 239, they must also handle the daily technical challenge of data management. Both are priorities because both are essential to remaining competitive. Universal open data standards, like FIGI, provide the foundational framework that data professionals and business users need, and are asking for.
To download a free copy of the full report, “Building a Framework for Innovation and Interoperability,” please visit Tabb FORUM.