Key challenges in Treasury 4.0 implementation

This article was written by Andre Pereira, Hedge Accounting Business Analyst and David Wiggins, Workflow Specialists Team Lead, at Bloomberg. 

Our clients often ask us about leveraging more forward-looking data in their treasury processes. A practical example relates to counterparty credit analysis, where many tell us that their current process relies predominantly on credit ratings and analysis of a company’s financial statements. While credit ratings are a critical part of the story, they are also ‘rear-view mirror’ analytics – a perfect example of a data set which treasurers are looking to supplement with forward-looking data, whether it comes from credit models, CDS spreads, consensus views, etc.

The potential for this is enticing, but there are looming challenges for such an endeavor.

Fragmentation of the technology stack

One of the main obstacles corporate treasurers face is the current fragmentation of their technology stack: a combination of in-house systems, third-party vendor applications, data sources and manual processes. BCG, a consulting firm, in its digital treasury paper, identifies “the fragmented data and IT landscape” as one of the three most critical challenges to digitization. In its words, “treasury personnel often have to toggle between multiple systems to get needed information, efforts that consume time, cost, and manual labour. The lack of integration also means that functionality like cash flow forecasting or valuation often has to be replicated in different systems, which can create data reconciliation challenges and operational risks.”

PwC’s views appear to confirm this. They list the following challenges when discussing financial risk technology:

  • “Manual business activities [with] multiple handoffs creating inefficient labor intensive risk processes;
  • Poor data quality [that leads] to inaccurate outcomes, poor accuracy and risk monitoring;
  • “Disconnected data: point-to-point data sourcing with few golden sources and an inability to make data joins across multiple domains, making it difficult to provide holistic risk monitoring;
  • Risk analytics: multiple platforms, carried out within data silos; and
  • Segregated systems: duplicate capabilities across functional areas leading to a complex IT landscape, limiting ability for change, collaboration and innovation.”

This is important, given that it is treasury departments who bear the financial and operational responsibility of designing, configuring and maintaining all the different integration points. Such fragmentation can result in the high cost of maintaining, integrating and reconciling multiple databases. In the past, this led many corporate treasury departments to opt for ‘all-in-one’ systems rather than combining ‘best of breed’: ultimately, the convenience of a single system became more important than reaching the desired level of sophistication and accuracy.

Given the rapidly changing regulatory and fintech environment, it is hard to envision any single system being able to both excel and rapidly evolve across all of the required activities.

Delivery applications: Database, engine and interface

Another key challenge is associated with the way an application is built and delivered to the end user. At the risk of oversimplifying, we could define an application as a computer program that contains three elements: a database, an engine and an interface.

The database is the repository for market data and trade information, for example. The engine is the application’s purpose: the calculation through which data is transformed. And the interface, direct or indirect, is how the user interacts with the application – by uploading data or requesting a calculation, say. An example of all three working together could be a tool for fair valuing derivatives.

This bundling is so common that we don’t even think of applications in terms of their components – nor would we buy an application that offered only one or two of them. However, they are extremely difficult to unpack, typically offering only limited, predefined endpoints, or elements of interaction, through which they can be integrated with other systems – for example, an uploader to populate the database, or a .csv file containing the results generated by the application.

These are two very different challenges, but both can create obstacles to a smooth digitization of the overall process. And both have certainly been on treasurers’ minds. But “with so many mission critical items on their plates over the past decade, few treasuries have had the time or resources to invest in strategic matters”, says BCG.

They continue: “but now that markets are largely stabilizing and many regulatory reforms are completed, treasuries have the opportunity to make up for lost time and use digitization to unlock significant long-term value.”

Treasury 4.0 is set to offer the best of both worlds. On the one hand, it will focus on decoupling and modularizing the technological offerings. On the other, it will offer solutions to seamlessly integrate those offerings, so they can communicate in real time.

Let’s explore one of the potential applications of this new paradigm.

A very simple linear example would be the ability to visualize, on the screen of an application provided by vendor A, a valuation sourced from an engine provided by vendor B, with that engine, while making its calculations, requesting the necessary market data from a database provided by vendor C.

Let’s now assume that the application from vendor A is the system where I store all my information about my portfolio of FX forwards, but that it doesn’t have the ability to fair value such deals. Traditionally, a new system would need to be acquired, with the extra work of populating that system’s data warehouse and regularly reconciling it. But, particularly with the advent of APIs, a request for valuation can be made between the systems provided by vendors A and B, with the trade information sent concurrently. Similarly, the engine provided by vendor B won’t store any market data, communicating with the database provided by vendor C only if and when a request for valuation is made. This is a very efficient way to interact with all three vendors:

  • One doesn’t need to maintain market data or unnecessary valuations on the application from vendor A
  • One doesn’t need to maintain a database at all with vendor B
  • One doesn’t need to request unnecessary market data from vendor C
  • And one is not bothered with visual interfaces from B or C; the interaction would be limited to web API requests
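The vendor A/B/C flow above can be sketched in Python, with each vendor modelled as a class exposing only a narrow programmatic surface, in place of real HTTP/web API calls. All class names, payload shapes and figures are hypothetical, and the pricing logic is deliberately simplified:

```python
class VendorC:
    """Market data database: answers only the requests it receives."""
    def get_market_data(self, keys):
        snapshot = {"EURUSD_spot": 1.0850, "USD_rate": 0.045}
        return {k: snapshot[k] for k in keys}

class VendorB:
    """Valuation engine: stateless -- stores neither trades nor market data."""
    def __init__(self, data_source: VendorC):
        self.data_source = data_source

    def value_fx_forward(self, trade: dict) -> float:
        # Pull market data from vendor C only when a valuation is requested.
        data = self.data_source.get_market_data(["EURUSD_spot", "USD_rate"])
        forward = data["EURUSD_spot"] * (1 + data["USD_rate"]) ** trade["years"]
        discount = (1 + data["USD_rate"]) ** -trade["years"]
        return trade["notional"] * (forward - trade["strike"]) * discount

class VendorA:
    """System of record for the FX forward portfolio; no pricing capability.
    Trade details travel alongside each valuation request."""
    def __init__(self, engine: VendorB):
        self.engine = engine
        self.portfolio = [
            {"id": "FWD-001", "notional": 1_000_000, "strike": 1.0900, "years": 1.0},
        ]

    def screen_valuations(self):
        return {t["id"]: round(self.engine.value_fx_forward(t), 2)
                for t in self.portfolio}

app = VendorA(VendorB(VendorC()))
print(app.screen_valuations())
```

Note what each party does not hold: vendor A keeps no market data, vendor B keeps no database at all, and vendor C is queried only for the data a given valuation actually needs – mirroring the four efficiencies listed above.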

So, by way of conclusion: irrespective of the challenges, the digitization process is well under way. Jane Thier, at CFO Dive, a specialized online publication, says that “73% of CFOs say they’re retooling their division with the latest tech”. The main reason for this is that many believe the “necessary data is trapped across systems or organized manually in spreadsheets, which prevents them from drawing insights and value”.

This is consistent with established trends: Accenture – Ms. Thier continues – finds that 60% of traditional finance tasks are automated today, on average, and predicts that this will grow to 80% within five years.
