This interview was first published in the Jul-Sep Issue of Risk & Compliance magazine.
Janos Renz Hotz is the global head of product for Bloomberg Transaction Cost Analysis (BTCA). With a focus on trading, compliance and TCA across asset classes, he has helped build workflows addressing best execution, trade surveillance and analytical needs across functional areas.
Q: Could you provide an overview of how firms trading equities are currently managing their best execution obligation? In general, is there room for improvement?
While we still have a number of firms looking at best execution as a quarterly operation to be done with a lengthy report in hand, the majority are embracing a more process-driven approach. This typically starts with the definition of an order execution policy (OEP), usually grounded in analysis of the firm’s flow over a period of time – ideally one year or more – and a solid understanding of the objectives of funds and their trading patterns. The OEP is then implemented using an exceptions-based workflow that monitors the firm’s flow and raises any deviations from the OEP as actionable tickets.
Typically, this is substantiated with periodic reports demonstrating the OEP and the evolution of its parameters. There is plenty of room for improvement; mainly, how to utilise the insights that transaction cost analysis (TCA) data offers – both from the order flow and the exceptions workflow – to improve decision making at point of trade. While generic TCA models are helping, the ideal scenario is to support trading decisions using insights gained and lessons learned from one’s own order flow – and only when necessary from peer universe analyses. This should also form the basis of the next steps in order flow automation – away from just a round robin broker wheel.
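The exceptions-based workflow described above can be sketched in a few lines. This is a minimal illustration, not a BTCA implementation: the 25 bps slippage limit, the field names and the arrival-price benchmark are all hypothetical stand-ins for whatever a firm's OEP actually defines per instrument class and trading objective.

```python
from dataclasses import dataclass

# Hypothetical OEP threshold for illustration only; a real policy would
# set limits per instrument class, order difficulty and objective.
SLIPPAGE_LIMIT_BPS = 25.0

@dataclass
class Order:
    order_id: str
    benchmark_price: float   # e.g. arrival price, per the OEP's benchmark choice
    avg_exec_price: float
    side: str                # "BUY" or "SELL"

def slippage_bps(order: Order) -> float:
    """Signed slippage vs. the benchmark, in basis points (positive = cost)."""
    raw = (order.avg_exec_price - order.benchmark_price) / order.benchmark_price
    signed = raw if order.side == "BUY" else -raw
    return signed * 10_000

def raise_exceptions(orders: list[Order]) -> list[dict]:
    """Turn deviations from the OEP limit into actionable tickets."""
    tickets = []
    for o in orders:
        cost = slippage_bps(o)
        if cost > SLIPPAGE_LIMIT_BPS:
            tickets.append({"order_id": o.order_id, "slippage_bps": round(cost, 1)})
    return tickets
```

An order bought 30 bps above its arrival price would be ticketed for review; a sale 10 bps below would pass silently, keeping the compliance desk focused on genuine deviations rather than a lengthy quarterly report.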
Q: How would you gauge the move toward best execution across other asset classes?
The adoption of TCA and best execution frameworks in other asset classes, especially foreign exchange (FX) and fixed-income (FI), is much more fragmented. Some firms are barely touching it, while others are proceeding at breakneck speed, at least in comparison to the time it took us to get where we are in equities. For those at the vanguard, lessons learned are proving to be accelerators across the field. However, everybody can agree on lingering doubts about data quality, especially in more illiquid instruments. The consensus is landing on a set of benchmark waterfalls that start with the most granular millisecond data and run all the way to coarser end-of-day prices – depending on what is available for which instrument. This is augmented by contextual data such as order difficulty, spread to benchmarks, index comparison and momentum adjustments. Notwithstanding the fragmented approach, the momentum is clear across the industry. Best execution and TCA are expanding across asset classes.
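The benchmark waterfall idea amounts to a simple fallback chain: take the most granular price source available for the instrument, degrading gracefully to coarser ones. The source names and their ordering below are illustrative assumptions, not any vendor's actual hierarchy.

```python
# Hypothetical waterfall, most granular first; a real one would be
# defined per asset class and instrument liquidity profile.
WATERFALL = ["tick_millisecond", "minute_bar", "dealer_quote", "end_of_day"]

def select_benchmark(available: dict[str, float]) -> tuple[str, float]:
    """Return the (source, price) pair highest in the waterfall."""
    for source in WATERFALL:
        if source in available:
            return source, available[source]
    raise ValueError("no benchmark price available for this instrument")
```

A liquid equity would resolve to millisecond tick data, while an illiquid bond with only an end-of-day mark still gets a usable, if coarser, benchmark – which is exactly the point of the waterfall.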
Q: In your opinion, what do firms need to bear in mind when building a best execution framework? To what extent should this framework make sense of data and automate workflow?
The basics are always the same: data quality is paramount. Without a solid understanding of one’s own workflows, associated availability and resolution of data, even the best order execution policy will fail. Once the basics are out of the way, it is important to form a proper workflow around what exceptions your own data throws out. This should take into consideration the objective of the trade, coupled with the data quality available for that instrument. Eventually, you want your TCA framework to support your trading decisions, using the insights gained from your own orders and, when you have a new or seldom-traded security, the collective insights from your peer universe. This decision support should take the form of insights provided to traders at point of trade in an execution management system (EMS), as well as be part of the mix of data used in order flow automation – eventually moving away from a simple broker wheel to a more intelligent approach, with lessons learned and insights baked into the logic of your automation strategy.
The caveat is to be careful not to have signals from TCA become self-reinforcing to a single destination. An enforced set of randomness should continue to be preserved in any workflow, but especially in automated workflows where the decision cycle can be much more accelerated.
Q: As far as firms’ reporting obligations are concerned, could you outline the current requirements and expectations following the introduction of the second Markets in Financial Instruments Directive (MiFID II)? In light of MiFID II, what can firms do to improve their best execution reviews?
Now in the second year of MiFID II, we are seeing a greater emphasis on the quality of the data in the regulated BestEx reports – RTS 27 and RTS 28 – as well as on the evolution of the OEP in place.
One of the most important improvements firms should consider is making the OEP fit the objective of the underlying fund – tailoring the OEP to what the trade aims to achieve, for example treating opportunistic high-alpha flow differently from low-impact, slower flow. Using a generic OEP on mixed strategies is a recipe for too many exceptions: it muddles the data, yields little to no insight, and is a lost opportunity to maximise alpha.
Q: What considerations should firms make when evaluating available vendors and solutions to achieve best execution?
The key is to choose a vendor that covers the breadth of what you trade, coupled with the economies of scale offered by vendors that do more than BestEx with your data – TCA, decision support, market abuse surveillance, compliance and even trading. You ideally want a vendor that is order management system (OMS) agnostic and has experience with many systems – allowing you to grow and change over time – as well as a large dataset of trades.
Peer universe data sets are especially important at the beginning, where you want to baseline your own best execution policies. These stay important as you look towards insights, and even automation, when you do not have enough data and need a collective view. The more data you have, the stronger the analytics.
Q: What steps should firms take to build a best execution framework that optimises data quality and trading workflow? How important is it to have the right infrastructure in place?
Getting good-quality transactional data into your BestEx framework is as important as the quality of your analytics. This means that your workflows might have to evolve to capture better data points – just as straight-through processing (STP) brought a measure of data quality, you need to consider EMSs, OMSs and even chat-based systems that capture the right timestamps for your trading – especially in asset classes that previously relied on more manual workflows.
The key is to gather as much data as you can and then take a flexible approach to how you classify your trades in your OEP. Think about the difference between measuring an algorithmic order and a request for quote (RFQ) workflow, or even the timestamps offered by a chat-based solution. Select your workflows carefully, balance those that fit best, and capture the data necessary to prove and improve on the cost of your trading.
Q: Looking ahead, what is the future of best execution? Do firms need to continually improve and update their systems and processes to stay on the right path?
The future is around insights, just as much as the present is about exceptions. While the data quality component is key for exceptions, it becomes critical for insights. Only an iterative, continuously improving approach to capturing data around your workflow can keep pace. Without a ‘virtuous cycle’ for improving data capture, measuring trends, setting thresholds for exceptions, gaining insights and applying those insights to your trading, it is difficult to satisfy the fundamental goals of best execution, reduce your trading costs and retain your margins as you grow.