Bloomberg Quant (BBQ) Seminar Series updates
About
The Bloomberg Quant (BBQ) Seminar Series takes place in New York and covers a wide range of topics in quantitative finance. Each session is chaired by Bruno Dupire, head of Quantitative Research at Bloomberg LP, and features a keynote speaker presenting his or her current research. The keynote is followed by several five-minute “lightning talks” in quick succession, a format that exposes the audience to a wider variety of topics.
Past sessions
Keynote: Leif Andersen, Global Head of the Quantitative Strategies Group, Bank of America Merrill Lynch
A canonical problem with a myriad of applications is the computation of expectations of functions of random variables. In finance, many advanced models involve random variables whose distribution laws are best captured in Fourier space, where the regular (uniformly spaced) FFT provides a standard tool for efficient computation. For modern computational methods (e.g., advanced integration techniques with spectral convergence), however, working on regularly spaced grids is far from optimal. In this talk, we introduce the type-3 NUFFT and show how it can be combined with modern quadrature methods (e.g., double-exponential transformations and adaptive methods) to significantly improve standard algorithms, whether executed in the space of densities or of characteristic functions.
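The talk's algorithms are not reproduced here, but the core idea admits a compact illustration: evaluate an inverse-Fourier integral on the nonuniform nodes produced by a double-exponential change of variables, via a direct sum over those nodes (the sum a type-3 NUFFT accelerates). A minimal numpy sketch, assuming a Gaussian characteristic function as the toy model:

```python
import numpy as np

# Toy model: standard normal log-return, characteristic function phi(u) = exp(-u^2/2).
phi = lambda u: np.exp(-0.5 * u**2)

# Double-exponential change of variables for an integral over (-inf, inf):
# u = sinh(c*sinh(t)), du = c*cosh(t)*cosh(c*sinh(t)) dt, trapezoidal rule in t.
c = np.pi / 2
t, h = np.linspace(-3.0, 3.0, 241, retstep=True)
u = np.sinh(c * np.sinh(t))                         # nonuniform frequency nodes
w = h * c * np.cosh(t) * np.cosh(c * np.sinh(t))    # quadrature weights

# Density recovery f(x) = (1/2pi) * int exp(-i*u*x) phi(u) du, evaluated as a
# direct sum over the nonuniform nodes -- the sum a type-3 NUFFT computes in
# quasi-linear rather than O(M*N) time.
x = np.linspace(-4.0, 4.0, 9)
f = (np.exp(-1j * np.outer(x, u)) * (phi(u) * w)).sum(axis=1).real / (2 * np.pi)

exact = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(f - exact)))                    # near machine precision
```

The point of the nonuniform grid is that a couple of hundred well-placed nodes achieve spectral accuracy where a uniform FFT grid would need far more points.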
Keynote: Adil Reghai, Head of Quantitative Research on Equity and Commodity Derivatives, Natixis
In this keynote, we show how exotic Greeks are essential in the PnL-explanation exercise. We show how a well-chosen scenario can be used to compute those exotic Greeks effectively in a production system. We then combine price-adjustment techniques (fudge analysis) with a singular perturbation approach to design exotic scenarios. These are the building blocks for understanding the impact of Local Stochastic Volatility (LSV) on exotic derivatives through a simple and elegant formula. The formula is tested on autocalls under real parameters and gives an excellent fit.
Keynote: Alexandre Antonov, Chief Analyst, Danske Bank
Artificial neural networks have recently been proposed as accurate and fast approximators in various derivatives pricing applications. Their extrapolation behaviour cannot be controlled due to the complex functional forms typically involved. We overcome this significant limitation and develop a new type of neural network that incorporates large-value asymptotics, allowing explicit control over extrapolation.
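Antonov's specific construction is not spelled out in this abstract; the sketch below shows one generic way to impose large-value asymptotics, under assumptions chosen purely for illustration: fit a network to the residual against a known asymptote on the training range, then damp the learned correction outside that range so the model reverts to the asymptote when extrapolating. The Bachelier toy pricer, the exponential damping, and all parameters are placeholders:

```python
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor

def price(x, sigma=0.2):
    """Toy target: undiscounted Bachelier call on forward x, strike 1."""
    d = (x - 1.0) / sigma
    return (x - 1.0) * norm.cdf(d) + sigma * norm.pdf(d)

def asymptote(x):
    """Known large-|x| behaviour: the price tends to intrinsic value."""
    return np.maximum(x - 1.0, 0.0)

rng = np.random.default_rng(0)
x_train = rng.uniform(0.5, 1.5, 2000)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
net.fit(x_train[:, None], price(x_train) - asymptote(x_train))  # learn the residual

def controlled_price(x, lo=0.5, hi=1.5, lam=0.1):
    # Damp the learned correction outside the training range, so the model
    # reverts to the known asymptote instead of extrapolating freely.
    dist = np.maximum(lo - x, 0.0) + np.maximum(x - hi, 0.0)
    return asymptote(x) + np.exp(-dist / lam) * net.predict(x[:, None])

x_test = np.array([1.0, 1.4, 3.0, 10.0])
print(controlled_price(x_test))   # approaches x - 1 far beyond the data
print(price(x_test))              # reference values
```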
Keynote: John Hull, Maple Financial Professor of Derivatives and Risk Management, Joseph L. Rotman School of Management, University of Toronto
A common approach to valuing exotic options involves choosing a model and then determining its parameters to fit the volatility surface as closely as possible. We refer to this as the model calibration approach (MCA). This paper considers an alternative approach where the points on the volatility surface are features input to a neural network. We refer to this as the volatility feature approach (VFA). We conduct experiments showing that VFA can be expected to outperform MCA for the volatility surfaces encountered in practice. Once the upfront computational time has been invested in developing the neural network, the valuation of exotic options using VFA is very fast. VFA is a useful tool for the estimation of model risk. We illustrate this using S&P 500 data for the 2001 to 2019 period.
Keynote: Igor Tulchinsky, Founder, Chairman and CEO, WorldQuant
Amid the rising volume of data and rapid advancements in technology, quantitative methods are reshaping the financial industry and beyond. In this talk, we explore the future of quantitative finance and the analytical power of pairing advanced technologies with human capital, for the benefit of businesses and society. We will also discuss the growing potential of technology to enable meaningful collaboration and partnership between individuals around the world.
Keynote: Jim Gatheral, Presidential Professor of Mathematics, Baruch College, CUNY
In 2018, I presented the exponentiation theorem at BBQ – a result that permits the (formal) computation of the characteristic function for any stochastic volatility model in terms of diamond trees. The proof was a little obscure, even magical. In this talk, I will show how the exponentiation theorem follows straightforwardly from a more general but much easier to understand “broken exponential martingale” expansion. As applications, I will give expressions for the triple joint MGF in affine forward variance models and for the leverage swap under rough Heston.
Keynote: Blanka Horvath, King’s College London and The Alan Turing Institute
Neural-network-based, data-driven market simulation unveils a new and flexible way of modelling financial time series, without imposing assumptions on the underlying stochastic dynamics. Though in this sense generative market simulation is model-free, the concrete modelling choices are nevertheless decisive for the features of the simulated paths. We give a brief overview of currently used generative modelling approaches and performance evaluation metrics for financial time series, and address some of the challenges in achieving good results with the latter. We also contrast some classical approaches to market simulation with simulation based on generative modelling, and highlight some advantages and pitfalls of the new approach. While most generative models tend to rely on large amounts of training data, we present here a generative model that works reliably even in environments where the amount of available training data is notoriously small. Furthermore, we show how a rough-paths perspective, combined with a parsimonious Variational Autoencoder framework, provides a powerful way of encoding and evaluating financial time series in such data-scarce environments. Finally, we also propose a suitable performance evaluation metric for financial time series and discuss some connections of our Market Generator to deep hedging.
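The rough-paths encoding mentioned above rests on path signatures. As a concrete, simplified illustration, here is a numpy sketch of the signature of a piecewise-linear path truncated at level two, one standard way to featurize a price path before it enters an encoder; the talk's actual encoder may differ:

```python
import numpy as np

def signature_level2(path):
    """Signature of a piecewise-linear path, truncated at level 2.

    path: (T, d) array. Returns (level1, level2) with
      level1[i]    = X^i_T - X^i_0,
      level2[i, j] = iterated integral of dX^i dX^j over s < t.
    """
    dx = np.diff(path, axis=0)           # (T-1, d) increments
    level1 = dx.sum(axis=0)
    before = np.cumsum(dx, axis=0) - dx  # increments strictly before each step
    # Strictly-ordered cross terms plus half the diagonal quadratic term
    # (exact for the piecewise-linear interpolation of the data).
    level2 = before.T @ dx + 0.5 * (dx.T @ dx)
    return level1, level2

# Example: a time-augmented log-price path (time as the first channel).
rng = np.random.default_rng(0)
logp = np.cumsum(0.01 * rng.standard_normal(250))
path = np.column_stack([np.linspace(0.0, 1.0, 250), logp])
s1, s2 = signature_level2(path)
print(s1)   # net displacement per channel
print(s2)   # 2x2; the antisymmetric part is the Levy area
```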
Keynote: Jesper Andreasen, Kwantfather, Saxo Bank
We discuss the subject of multi-asset volatility arbitrage and consider a class of multi-asset local volatility models where the local correlation is specified through the local volatility of spreads. For a discrete version of the model, we present a simple and efficient Monte Carlo calibration method. Applications to foreign exchange, interest rates, and equities are discussed.
Keynote: Jean-Philippe Bouchaud, Chairman and Chief Scientist, CFM
Price impact refers to the correlation between an incoming order (to buy or to sell) and the subsequent price change. That a buy (sell) trade should push the price up (down) is intuitively obvious and is easily demonstrated empirically. Such a mechanism must, in fact, be present for private information to be incorporated into market prices. But it is also a sore reality for trading firms, for which price impact induces large (but often overlooked) extra costs, especially when strategies are “crowded.” Monitoring and controlling impact has therefore become one of the most active domains of research in quantitative finance since the mid-nineties. A large body of empirical results has accumulated over the years concerning the dependence of impact on traded quantities, the time evolution of impact, the impact of metaorders, cross-impact, etc. In this mini-review talk, we will present the most salient empirical findings, along with some theoretical ideas that have been proposed to rationalize them. Some remaining puzzles and open problems will be discussed as well.
Keynote: Nassim Nicholas Taleb, NYU Tandon School and Universa Investments
A talk around topics from Statistical Consequences of Fat Tails: Real World Epistemology, Preasymptotics, and Applications (2020). We consider the consequences of fat-tailedness for portfolio theory, risk parity, and various theories of diversification. We show why finite higher moments are required to use tools from modern financial theory (portfolio theory and factor analyses), and how these assumptions are not met in practice. We show the effect of the absence of ellipticality on portfolio construction. We discuss the flaws in the current “patches” used to model conditional correlations and volatility.
Keynote: Marcos López de Prado, Professor of Practice at Cornell University, and CIO at True Positive Technologies
We introduce the nested clustered optimization algorithm (NCO), a method that tackles both sources of the efficient frontier's instability. Monte Carlo experiments demonstrate that NCO can reduce the estimation error by up to 90% relative to traditional portfolio optimization methods (e.g., Black-Litterman).
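A compact sketch of the NCO recipe as the paper describes it: cluster the correlation matrix, solve an intra-cluster minimum-variance problem, then allocate across the reduced cluster covariance. The clustering method, the cluster count, and the unconstrained minimum-variance solver below are simplifications:

```python
import numpy as np
from sklearn.cluster import KMeans

def min_var_weights(cov):
    """Unconstrained minimum-variance weights: w proportional to inv(cov) @ 1."""
    w = np.linalg.solve(cov, np.ones(len(cov)))
    return w / w.sum()

def nco(cov, n_clusters=2, seed=0):
    std = np.sqrt(np.diag(cov))
    corr = cov / np.outer(std, std)
    dist = np.sqrt(0.5 * (1.0 - corr))               # correlation distance
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit(dist).labels_
    w_intra = np.zeros((len(cov), n_clusters))
    for k in range(n_clusters):                      # optimize within each cluster
        idx = np.where(labels == k)[0]
        w_intra[idx, k] = min_var_weights(cov[np.ix_(idx, idx)])
    cov_clusters = w_intra.T @ cov @ w_intra         # reduced covariance
    w_inter = min_var_weights(cov_clusters)          # optimize across clusters
    return w_intra @ w_inter                         # final asset weights

# Two noisy blocks of correlated assets.
rng = np.random.default_rng(1)
blk = lambda n, rho: rho + (1.0 - rho) * np.eye(n)
corr = np.block([[blk(5, 0.7), np.full((5, 5), 0.1)],
                 [np.full((5, 5), 0.1), blk(5, 0.7)]])
vols = rng.uniform(0.1, 0.3, 10)
print(nco(corr * np.outer(vols, vols)).round(3))
```

Splitting the problem along the cluster structure is what shields the optimizer from the error amplification that plagues a single full-matrix inversion.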
Keynote: Alexander Lipton, Chief Technical Officer, Sila Money; Professor and Dean’s Fellow, The Hebrew University of Jerusalem; Connection Science Fellow, MIT
Existing banking and payment systems, while still working, are obsolete and no longer aligned with the constantly changing requirements of the modern world. While open-access Internet protocols have unleashed a wave of creativity and growth in numerous fields, banking is not one of them. The reason stems mostly from the fact that successful open-access protocols for money and identity, while sorely needed, are conspicuously absent at present. We argue that a regulatory-compliant, fiat-backed tokenized medium of exchange can help to fill this gap. The corresponding token is an electronic analog of cash, with all its pluses and minuses. While such tokens can have numerous fintech applications, we consider just a few representative examples. Experience shows that all decentralized crypto coins are inherently unstable, which makes them less than useful for commercial applications. Unfortunately, building a successful stable token is hard.
Contrary to often-made claims, it is not possible to build a truly decentralized stable token, so any potentially successful stable coin has to be partially centralized; the degree of decentralization can vary. We describe three approaches: fully collateralized custodial tokens, partially collateralized custodial tokens, and dynamically stabilized tokens. We conclude that only fully collateralized tokens can be stable, even under extreme circumstances, and argue that asset-backed digital trade coins have a great future as a much-needed counterbalance to fiat currencies.
Keynote: Julien Guyon, Bloomberg L.P.
Since VIX options started trading in 2006, many researchers have tried to build a model that jointly and exactly calibrates to the prices of S&P 500 (SPX) options, VIX futures, and VIX options. So far the best attempts, which used parametric continuous-time jump-diffusion models on the SPX, could only produce approximate fits. In this talk we solve this longstanding puzzle using a completely different approach: a nonparametric discrete-time model. The model is cast as a dispersion-constrained martingale transport problem, which is solved using the Sinkhorn algorithm (a toy version of the iteration is sketched after the list below). We prove by duality that the existence of such a model means that the SPX and VIX markets are jointly arbitrage-free. The algorithm identifies joint SPX/VIX arbitrages, should they arise. Our numerical experiments show that the algorithm performs very well in both low and high volatility environments. Finally, we briefly discuss:
(i) how our technique extends to continuous-time stochastic volatility models;
(ii) a remarkable feature of the SPX and VIX markets: the inversion of convex ordering, and how classical stochastic volatility models can reproduce it;
(iii) why, due to this inversion of convex ordering, and contrary to what has often been stated, among the continuous stochastic volatility models calibrated to the market smile, the Dupire local volatility model does not maximize the price of VIX futures.
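For readers unfamiliar with the Sinkhorn algorithm, the sketch below implements the plain entropic-optimal-transport version on two toy marginals. Guyon's problem adds martingale and dispersion constraints to this same alternating-projection loop, so what follows is the skeleton, not the talk's calibration:

```python
import numpy as np

def sinkhorn(mu, nu, C, eps=0.05, n_iter=500):
    """Entropic optimal transport between discrete marginals mu and nu with
    cost matrix C: alternately rescale rows and columns of the Gibbs kernel
    until the coupling matches both marginals."""
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)                 # fix the column marginal
        u = mu / (K @ v)                   # fix the row marginal
    return u[:, None] * K * v[None, :]     # coupling pi = diag(u) K diag(v)

# Stand-ins for two maturities' risk-neutral laws on a common grid.
x = np.linspace(-2.0, 2.0, 50)
mu = np.exp(-0.5 * (x / 0.5) ** 2); mu /= mu.sum()
nu = np.exp(-0.5 * (x / 0.8) ** 2); nu /= nu.sum()
C = (x[:, None] - x[None, :]) ** 2         # quadratic transport cost

pi = sinkhorn(mu, nu, C)
print(np.abs(pi.sum(1) - mu).max(), np.abs(pi.sum(0) - nu).max())  # tiny
```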
Keynote: Ioannis Karatzas, Columbia University
We develop a mathematical theory for finance based on the following “viability” principle: it should not be possible to fund a nontrivial liability starting with arbitrarily small initial capital. In the context of continuous semimartingale asset prices, we show that proscribing such egregious forms of what is commonly called “arbitrage” (while allowing for the possibility that one portfolio might outperform another) turns out to be equivalent to each of the following conditions:
(i) a portfolio with the local martingale numeraire property exists,
(ii) a growth-optimal portfolio exists,
(iii) a portfolio with the log-optimality property exists,
(iv) a strictly positive local martingale deflator exists (made precise below),
(v) the market has locally finite maximal growth, and
(vi) there is no possibility for making “unbounded profits with bounded risk”.
(Joint work — book of the same title — with Constantinos Kardaras at LSE.)
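Condition (iv) uses the following standard notion, stated here in its usual form; the paper's precise admissibility conventions may differ slightly:

```latex
% Strictly positive local martingale deflator for condition (iv):
\exists\, Z = (Z_t)_{t \ge 0} \ \text{with} \ Z > 0,\ Z_0 = 1, \ \text{such that}\quad
Z\,V \ \text{is a local martingale for every nonnegative wealth process } V.
```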
Keynote: Peter Carr
While the seminal contributions of Black, Scholes, and Merton to option pricing were published in 1973, much was known by them and others in 1969. In this talk, we turn back the clock exactly 50 years and try to determine what was known and not known about pricing options on November 18, 1969.
Keynote: Bruno Dupire
Surprisingly, common market practices leave free convexity on the table. Markets trade thousands of underlyings, each one with tens or even hundreds of options, quoted throughout the day. Needless to say, the quotes are not generated manually: they are automated and derived from a functional form with a few parameters. If we know this parameterization, we know in advance that tomorrow's prices of many traded securities will belong to a low-dimensional (number of parameters) manifold in a high-dimensional (number of securities) space. If the vector of today's prices does not belong to the convex hull of the manifold, there is an arbitrage. We examine several market practices (recalibration of Black-Scholes or of stochastic volatility models, interest rate interpolation by piecewise constant instantaneous forward rates…) and show that many violate the no-arbitrage condition.
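The convex-hull condition can be checked directly as a linear-programming feasibility problem. A toy sketch for the Black-Scholes recalibration example (all instruments, quotes, and grids below are invented for illustration): sample the one-parameter manifold of call-price vectors over σ, then ask whether today's quote vector is a convex combination of its points:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

def bs_call(F, K, sigma, T=1.0):
    s = sigma * np.sqrt(T)
    d1 = np.log(F / K) / s + 0.5 * s
    return F * norm.cdf(d1) - K * norm.cdf(d1 - s)

K = np.array([0.8, 0.9, 1.0, 1.1, 1.2])                    # the traded securities
sigmas = np.linspace(0.05, 0.8, 200)                       # one-parameter manifold
manifold = np.array([bs_call(1.0, K, s) for s in sigmas])  # (200, 5) price vectors

# Today's quotes: a volatility frown, unlikely to be any mixture of BS prices.
today = bs_call(1.0, K, np.array([0.22, 0.25, 0.30, 0.25, 0.22]))

# Feasibility LP: find lambda >= 0 with sum(lambda) = 1, manifold' lambda = today.
A_eq = np.vstack([manifold.T, np.ones(len(sigmas))])
b_eq = np.append(today, 1.0)
res = linprog(np.zeros(len(sigmas)), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * len(sigmas))
print("today's quotes lie in the convex hull" if res.success else
      "no convex combination reproduces today's quotes: free convexity")
```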
Keynote: Jerome Pesenti
Facebook is currently using AI across its family of apps to benefit billions of people around the world. In this talk, VP of AI Jerome Pesenti speaks to how Facebook approaches deploying AI at this scale, the challenges to successfully doing so, and the specific tools and techniques that can help other businesses solve for these issues.
Keynote: Robert F. Engle, New York University Stern School of Business
Geopolitical events may impact the volatilities of all asset classes, sectors, and countries. It is shown that innovations to volatilities are correlated across assets and asset classes, and therefore can be used to forecast or hedge geopolitical risks. The paper develops a statistical model of a Geopolitical Volatility Factor, or GPVF, based on the standardized residuals from a factor model with GARCH-style residuals. A test for the GPVF is developed, along with estimation algorithms. These are applied to 9 US sector ETFs and to 45 MSCI country ETFs. The model has asset-pricing implications for hedging geopolitical risks.
Fabio Mercurio, Bloomberg L.P.
In this talk, we define and model forward risk-free term rates, which appear in the payoff definition of derivatives, and possibly cash instruments, based on the new interest-rate benchmarks that will be replacing IBORs globally. We show that the classic interest-rate modeling framework can be naturally extended to describe the evolution of both the forward-looking (IBOR-like) and backward-looking (setting-in-arrears) term rates using the same stochastic process. We then introduce an extension of the LIBOR Market Model (LMM) to backward-looking rates. This extension, which we call generalized forward market model (FMM), completes the LMM by providing additional information about the rate dynamics between fixing/payment times, and by implying dynamics of forward rates under the classic money-market measure. Our FMM formulation is based on the concept of extended zero-coupon bonds, which proves to be very convenient when dealing with backward-looking setting-in-arrears rates. Thanks to this, not only the bonds themselves, but also the forwards and swap rates, along with their associated forward measures, can be defined at all times, even those beyond their natural expiries.
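For concreteness, the backward-looking (setting-in-arrears) term rate discussed above is, under the new benchmarks, a compounded average of daily fixings that is only known at the end of its accrual period. A minimal sketch with simplified day counts and hypothetical fixings:

```python
import numpy as np

def compounded_in_arrears(daily_rates, tau=1.0 / 360.0):
    """Backward-looking term rate over one accrual period:
    R = (prod(1 + r_i * tau) - 1) / (n * tau),
    known only at the END of the period (setting in arrears)."""
    growth = np.prod(1.0 + np.asarray(daily_rates) * tau)
    return (growth - 1.0) / (len(daily_rates) * tau)

# Hypothetical 90 days of overnight fixings around 3%.
rng = np.random.default_rng(0)
fixings = 0.03 + 0.001 * rng.standard_normal(90)
print(compounded_in_arrears(fixings))   # ~3% plus a small compounding effect
```

Because the rate accrues information daily until payment, the FMM's extended bonds and forward measures must remain well-defined past the fixing date, which is exactly the gap in the classic LMM that the talk addresses.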
Keynote: Gordon Ritter, Ph.D.
Reinforcement learning is a way of training a machine to find an optimal policy for a stochastic optimal control system, without explicitly building a model for the system. In reinforcement learning, the search for optimal policies is organized around the search for the optimal value function (in the sense of the Hamilton-Jacobi-Bellman equation). We show that many problems in finance are special cases of this framework; for example, any derivative that can be priced by replication has the property that its price is given by the value function of the dynamic replicating portfolio strategy. Optimal execution problems, such as the Almgren–Chriss model and its extensions, are also problems of this type, and reinforcement learning techniques can be used to train agents that are capable of executing, or hedging, optimally. This remains true in the presence of market impact, and is insensitive to the type of market impact model that is used.
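The value function at the center of this framework solves the (discrete-time) Bellman fixed-point equation; for a hedging problem, the reward R is the one-step hedging P&L net of costs, and the replication price is the value of the optimal policy:

```latex
% Discrete-time Bellman equation; s' is the successor state, gamma a discount factor:
V(s) \;=\; \max_{a \in \mathcal{A}} \; \mathbb{E}\!\left[\, R(s,a) \;+\; \gamma\, V(s') \;\middle|\; s,\, a \,\right].
```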
Professor Peter Carr, NYU
The buyer of just-in-time portfolio insurance can floor one out of a set of N daily price relatives at some contractually specified positive constant. The decision whether or not to floor a given price relative must be made just after the price relative is realized. We use a little-known branch of mathematics called pseudo-analysis to value the insured portfolio in closed form. Pseudo-analysis allows the addition and/or multiplication binary operations to differ from ordinary addition and multiplication. For example, ordinary addition can be replaced by the maximum operation while ordinary multiplication is unchanged.
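The closed form itself is not reproduced here, but the contract's valuation recursion can be written using exactly the pseudo-operations the abstract names, ⊕ = max and ⊗ = ×. A Monte Carlo backward-induction sketch, with i.i.d. lognormal relatives, zero rates, and all parameters invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.exp(0.2 * rng.standard_normal(100_000) / np.sqrt(252))  # daily relatives
k, N = 1.0, 10          # floor level and number of daily decisions

# Backward induction. U[n]: value multiplier with the floor already spent and
# n days left. V[n]: floor still in hand. After seeing today's relative R, the
# holder picks the better of 'floor now' and 'keep the option':
#   V[n] = E[ max( max(R, k) * U[n-1],  R * V[n-1] ) ]
# The recursion combines values only through max and multiplication, i.e. the
# pseudo-operations (+) = max and (x) = ordinary product.
U = np.mean(R) ** np.arange(N + 1)
V = np.ones(N + 1)
for n in range(1, N + 1):
    V[n] = np.mean(np.maximum(np.maximum(R, k) * U[n - 1], R * V[n - 1]))

print(V[N], U[N])       # insured vs uninsured expected growth factors
```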
Professor Andrew W. Lo, MIT
Financial AI seems so close, yet so far. We have automated trading algorithms, machine-learning models of credit risk, electronic exchanges, robo advisors, and cryptocurrencies, but machines still haven’t replaced portfolio managers, financial advisors, and bankers. So what’s missing? Not artificial intelligence. We have yet to develop an algorithmic understanding of human behavior as it is, rather than as it should be. We need a theory of artificial stupidity.
Charles-Albert Lehalle, Senior Research Advisor, Capital Fund Management
Since academic “factors” have attracted a lot of attention over the last 10 years, it is of paramount importance to understand the difference between “paper portfolios,” i.e. the results of backtests, and real ones. One of the main sources of difference is transaction costs. Being able to compute transaction costs not only enables us to apply a corrective term to the Sharpe ratios of factors like Size, Value, and Momentum; it also allows us to estimate their “break-even,” giving crucial information about the capacity of such factors. This talk will discuss various ways of estimating transaction costs. We will mainly compare a non-parametric and a parametric approach, the latter using a model for transaction costs that includes market impact. To provide real numbers, we reimplemented these three factors and relied on a unique database (ANcerno) containing a very large sample of asset managers' large trades in the U.S. over 15 years.
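A schematic of the parametric route: charge each rebalance a half-spread plus a square-root market-impact term, and subtract the annualized cost from the factor's gross return. Every number below is a placeholder, not an estimate from the talk:

```python
import numpy as np

def cost_per_traded_dollar(participation, daily_vol, spread_bps=2.0, Y=0.5):
    """Half-spread plus square-root impact: Y * sigma * sqrt(Q / ADV)."""
    return 0.5 * spread_bps * 1e-4 + Y * daily_vol * np.sqrt(participation)

# Placeholder factor portfolio: 12% gross return, 15% vol, 20% monthly
# two-way turnover, trading at 1% of daily volume in 2%-daily-vol names.
gross_ret, vol = 0.12, 0.15
annual_traded = 12 * 2 * 0.20                       # buys plus sells, per year
cost = cost_per_traded_dollar(participation=0.01, daily_vol=0.02)
net_sharpe = (gross_ret - annual_traded * cost) / vol
print(cost, net_sharpe)   # ~11 bps per traded dollar; Sharpe 0.80 -> ~0.76
```

Setting the net return to zero in this accounting gives the break-even turnover or capacity the abstract refers to.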
Professor Hélyette Geman
The first part of the talk will review the way the economic assumption of No Arbitrage, combined with powerful results established in probability theory in the fairly recent past, leads to a number of beautiful results in Quantitative Finance: in particular, i) the existence of ‘pricing measures’ under which the prices of primitive securities – in the right numeraire – are martingales; ii) the fact that, under the physical measure P (the one under which big data are accumulated), No Arbitrage implies that normality of asset returns can be recovered through a stochastic time change where the clock is driven by the order flow.
The second part of the talk will discuss some key features of cryptocurrencies observed so far, and the methodology that can be proposed to analyze this particular asset class, stored in a new type of inventory called the blockchain.
Professor Walter Schachermayer
A basic problem when trading in financial markets is to analyze the price movement caused by placing an order. Clearly we expect – ceteris paribus – that placing an order will move the price to the disadvantage of the agent. This price movement is called the market impact. Following the recent work of A. Kyle and A. Obizhaeva, we apply dimensional analysis, a line of argument well known in classical physics, to analyze the extent to which the square-root law applies. This universal law claims that the market impact is proportional to the square root of the size of the order. We also analyze the trading activity on a stock, i.e. the number of trades per unit of time, as a function of suitable explanatory variables. Dimensional analysis leads to a 2/3 law: the number of trades is proportional to the 2/3 power of the exchanged risk. The mathematical tools of this analysis rest on elementary linear algebra. Joint work with Mathias Pohl, Alexander Ristig and Ludovic Tangpi.
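In formulas, the two scaling laws discussed are (with σ the daily volatility, Q the order size, V the daily volume, N the number of trades per unit of time, and W the exchanged risk):

```latex
\frac{\Delta P}{P} \;\propto\; \sigma\,\sqrt{\frac{Q}{V}} \quad\text{(square-root impact law)},
\qquad\qquad
N \;\propto\; W^{2/3} \quad\text{(2/3 law)}.
```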
Jim Gatheral, Presidential Professor, Baruch College, CUNY
We use the Alòs Itô Decomposition Formula to express certain conditional expectations as exponentials of forests of trees. Each tree represents iterated applications of a new diamond operator. As one application, we compute an exact formal expression for the leverage swap for any stochastic volatility model expressed in forward variance form. As another, we show how to extend the Bergomi-Guyon expansion to all orders in volatility of volatility. Finally, we compute exact expressions under rough volatility, obtaining in particular the fractional Riccati equation for the rough Heston characteristic function. As a corollary, we compute a closed-form expression for the leverage swap in the rough Heston model.
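In the authors' notation, the diamond operator acting on two continuous semimartingales A and B is the conditional expectation of their future covariation,

```latex
(A \diamond B)_t(T) \;=\; \mathbb{E}_t\!\left[\,\langle A, B\rangle_T - \langle A, B\rangle_t\,\right],
```

and the forests in the expansion are built by iterating this operator on the model's driving martingales.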
Marcos López de Prado, CEO, True Positive Technologies
The rate of failure in quantitative finance is high, and particularly so in financial machine learning. The few managers who succeed amass a large amount of assets, and deliver consistently exceptional performance to their investors. However, that is a rare outcome, for reasons that will become apparent in this presentation. Over the past two decades, I have seen many faces come and go, firms started and shut down. In my experience, there are 7 critical mistakes underlying most of those failures.
Buy Rough, Sell Smooth
Paul Glasserman, Jack R. Anderson Professor of Business, Decision, Risk, and Operations Division, Columbia Business School, New York
Recent work has documented roughness in the time series of S&P 500 volatility and investigated its implications for option pricing. We study a strategy for trading stocks based on a measure of their implied roughness. A strategy that goes long the roughest stocks and short the smoothest stocks earns statistically significant excess annual returns of 6-8%, depending on the time period. The result is robust to controlling for a broad range of factors, including liquidity. We compare alternative measures of roughness and find that the profitability of the strategy relies on sorting stocks based on implied rather than realized roughness. We discuss possible explanations. This is joint work with Pu He.
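The strategy sorts on implied roughness, but the underlying measurement idea is easiest to see on realized data: estimate the Hurst exponent H from the scaling of q-th absolute moments of (log-)volatility increments, as in Gatheral, Jaisson and Rosenbaum's "Volatility is rough". A numpy sketch on synthetic input:

```python
import numpy as np

def hurst_estimate(series, lags=range(1, 20), q=2.0):
    """Roughness of a (log-volatility) series: regress log m(q, lag) on
    log lag, where m(q, lag) = mean |x[t+lag] - x[t]|^q; slope = q * H."""
    m = [np.mean(np.abs(series[lag:] - series[:-lag]) ** q) for lag in lags]
    slope, _ = np.polyfit(np.log(list(lags)), np.log(m), 1)
    return slope / q

# Sanity check: Brownian increments have H = 0.5; realized log-vol series
# in the rough-volatility literature come out near 0.1.
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(20_000))
print(hurst_estimate(bm))   # approximately 0.5
```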
An Auction-and-Protocol Approach to Converting Legacy LIBOR Derivatives to New Reference Rates
Darrell Duffie, Dean Witter Distinguished Professor of Finance, Stanford Graduate School of Business
This talk outlines an auction-and-protocol method for converting trillions of dollars (notional) of legacy LIBOR-linked derivatives to new reference rates, such as the Secured Overnight Financing Rate (SOFR). Because LIBOR is a significantly higher interest rate than SOFR, the receiver of LIBOR on a legacy contract will require compensation to convert. The key design problem is how to discover a fair rate of compensation, maturity by maturity, and how to obtain the agreement of contract holders. The presentation will include the implications of new theoretical results by Anthony Zhang of Stanford University on how to set the bid-ask-spread fee that will be charged to protocol users and paid to auction participants, so as to encourage robust price discovery in the auction and avoid free riding on the protocol.
Statistics of VIX Futures and Applications to Trading Volatility Exchange-Traded Products
Marco Avellaneda, Professor, Courant Institute of Mathematical Sciences; Finance Concepts LLC
We study the dynamics of VIX futures and ETNs/ETFs. We find that, contrary to classical commodities, VIX and VIX futures exhibit large volatility and skewness, consistent with the absence of cash-and-carry arbitrage. The constant-maturity futures (CMF) term structure can be modeled as a stationary stochastic process in which the most likely state is a contango with VIX ≈ 12% and a long-term futures price V∞ ≈ 20%. We analyze the behavior of ETFs and ETNs based on constant-maturity rolling futures strategies, such as VXX, XIV and VXZ, assuming stationarity and through a multi-factor model calibrated to historical data. We find that buy-and-hold strategies consisting of shorting ETNs that roll long futures, or buying ETNs that roll short futures, will produce theoretically sure profits if CMFs are assumed stationary and ergodic. To quantify further, we estimate a 2-factor lognormal model with mean-reverting factors on VIX and CMF historical data from 2011 to 2016. The results confirm the profitability of buy-and-hold strategies, but also indicate that the latter have modest Sharpe ratios, of the order of SR = 0.5 or less, and high variability over 1-year simulation horizons. This is due to the surges in VIX and CMF backwardation that are experienced sporadically, but also inevitably, in the volatility futures market.
Applications of ML to Finance
Bruno Dupire, Head of Quant Research, Bloomberg L.P.
A major task in finance is to establish a link between observables (structured and unstructured data) and future behavior, a typical learning task. Naturally, the field is embracing Machine Learning and we explore the use of the latest techniques to attack these problems, with special focus on forecasting, sentiment analysis, smart strategies, covariance matrix cleaning, dimension reduction, satellite images analysis and different tools to apply various methods to different data sets. We also show how to use visualizations in order to understand better what the machine understands.
How Much SRISK is Too Much?
Robert Engle, Nobel Prize in Economics, Professor of Finance, NYU Stern
When financial firms are undercapitalized, they are vulnerable to external shocks. This is commonly measured by stress tests or by market-based measures such as SRISK (systemic risk). More importantly, the natural response to this risk is to raise capital, and this can endogenously start a financial crisis. Excessive credit growth can be interpreted as undercapitalization of the financial sector. Hence we can ask how much SRISK an economy can stand and what the probability of a crisis is. Using a crisis intensity variable constructed by the Romers, a Tobit model is estimated for 23 developed economies. The probability of crisis and an SRISK capacity measure can be computed from these estimates. These are plotted for several countries.
Revisiting Sources of Return and Risk in Portfolios
Robert Litterman, Senior Partner, Kepos Capital
Asset allocation has in recent years been enhanced by recognizing the factors that drive return and risk in portfolios. This talk will review the historical evidence on factor returns and risk and the implications for portfolio construction.
The Present of Futures: Calculating Convexity Adjustments in the New Multi-Curve World
Fabio Mercurio, Global Head of Quantitative Analytics, Bloomberg L.P.
Futures convexity adjustments in the multi-curve world depend on: i) the distribution of forward LIBORs, ii) the distribution of OIS rates, and iii) the correlation between LIBORs and OIS rates. In this talk, we introduce a new multi-curve framework for pricing futures convexity adjustments. We assume that forward LIBORs follow a one-factor shifted-lognormal LMM, and that OIS rates evolve according to a general one-factor Cheyette (1992) model. We will consider two particular cases: 1) OIS rates follow a one-factor Hull-White (1990) model; 2) the LIBOR-OIS basis volatility is minimal, which includes the sub-case of a deterministic basis when LIBOR and OIS curves are perfectly correlated. We will derive explicit convexity adjustments in both cases, and analyze the impact of the cap skew on their valuation. We will show that the extent to which a cap smile affects futures pricing depends on the chosen OIS dynamics, or equivalently on the assumed behavior of the LIBOR-OIS basis.
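The object being computed: daily margining makes a futures rate an expectation under the risk-neutral (money-market) measure Q, while the corresponding forward rate is an expectation under the forward measure for the payment date; the convexity adjustment is the gap between the two, driven by the covariance between LIBOR and the discounting (OIS) rates that the model choices above pin down:

```latex
\mathrm{Fut}(0;T,S) \;=\; \mathbb{E}^{Q}\!\left[L(T;T,S)\right], \qquad
\mathrm{Fwd}(0;T,S) \;=\; \mathbb{E}^{Q^{S}}\!\left[L(T;T,S)\right], \qquad
\mathrm{CA} \;=\; \mathrm{Fut} - \mathrm{Fwd}.
```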
Get Informed
Sign up today to receive invitations to future events in this series.