MARKET MICROSTRUCTURE AND HIGH-FREQUENCY DATA

Chicago, June 1-3, 2017


PROGRAM

Thursday June 1 | 5727 S University Ave
Time Speaker Title (abstracts below)
8:30am Registration and breakfast
9:00am Conference opening
9:10am Torben Andersen Intraday Trading Invariance in Foreign Exchange Futures
9:50am Kim Christensen The Drift Burst Hypothesis
10:30am Coffee Break
11:00am Olivier Scaillet High-Frequency Jump Analysis of the Bitcoin Market
11:40am Suzanne Lee The Impact of Jumps on Carry Trade Return
12:20pm Lunch at GCIS
1:50pm George Tauchen Exact Bayesian Moment Based Inference for the Distribution of the Small-Time Movements of an Ito Semimartingale
2:30pm Knut Are Aastveit Bayesian Predictive Density Combinations for Exchange Rate Models
3:10pm Coffee Break
3:40pm Dacheng Xiu When Moving Average Models Meet High-Frequency Data: Uniform Inference on Volatility
4:20pm Ruey Tsay Efficient estimation of Value at Risk by Effective Data Pooling
5:00pm Day 1 concludes
Friday June 2 | 5727 S University Ave
Time Speaker Title (abstracts below)
9:30am Registration and breakfast
10:00am Steve Xu HFT Principles
10:40am Mini-Break
11:00am Paul Besson Main shortcomings of standard volume based trading algorithms and potential improvements of passive-volume indexing
11:40am Aurelien Alfonsi Optimal Execution in a Hawkes Price Model and Calibration
12:20pm Lunch at GCIS
1:50pm Lingjiong Zhu Dark Pool Trading: A Hawkes Process Approach
2:30pm Tzu-Wei Yang A reduced-form model for level-1 limit order books
3:10pm Coffee Break
3:40pm Agostino Capponi Intraday Market Making with Overnight Inventory Costs
4:20pm Andrew Papanicolaou Trading Illiquid Goods: Market Making as a Sequence of Sealed-Bid Auctions, with Analytic Results
5:00pm Reception in the Reading Room (down the hallway from the Lecture Room)
Saturday June 3 | 5727 S University Ave
Time Speaker Title (abstracts below)
9:30am Registration and breakfast
10:00am Ravi Jagannathan Frequent Batch Auctions vs Continuous Trading in Stocks: Effect on Volume, Liquidity & Crash Risk
10:40am Markus Pelger Large-dimensional factor modeling based on high-frequency observations
11:20am Lunch at GCIS
12:50pm Han Xiao Autoregressive Model for Matrix Valued Time Series
1:30pm Zhengjun Zhang Generalized Autoregressive Conditional Frechet Models for Maxima
2:10pm Coffee Break
2:40pm Viktor Todorov Nonparametric Option-based Volatility Estimation
3:20pm Richard Chen Model-Free Approaches to Discern Non-Stationary Microstructure Noise and Time-Varying Liquidity in High-Frequency Data
4:00pm Conference concludes

ABSTRACTS and BIOS (for speakers who do not have web sites)

Bayesian Predictive Density Combinations for Exchange Rate Models
Knut Are Aastveit
Bank of Norway
We study the predictive ability of macroeconomic fundamentals for monthly exchange rates using a Bayesian predictive combination approach. Our combination approach accounts for time-varying uncertainty of several model and data features in order to provide more accurate and complete density forecasts. The combination weights are latent random variables that depend on past history. The combined density scheme is incorporated in a Bayesian Sequential Monte Carlo method which re-balances the set of forecasted densities in each period using updated information on the time-varying weights. In this way, we are able to weight data uncertainty, parameter uncertainty, model uncertainty, including model incompleteness, and uncertainty in the combination weights in a coherent way. In an empirical exercise, we study the forecasting performance of our combination approach relative to other combination approaches and common benchmarks for seven major exchange rates vis-a-vis the US dollar over the period 2000-2014. We find that our combination approach improves point and density forecasts, relative to various benchmark approaches, by magnitudes of 10-20 percent and 30-40 percent, respectively. While accounting for weight uncertainty plays a role in improving the density forecasting performance, the bulk of the gains, both in terms of point and density forecasting performance, stems from allowing for model incompleteness in the combination scheme.
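As background only, a minimal sketch of score-driven density combination, assuming two hypothetical normal predictive densities and weights proportional to cumulative log predictive scores. The paper's actual scheme treats the weights as latent random variables estimated by Sequential Monte Carlo, which is not reproduced here.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    T = 200
    y = rng.normal(0.0, 1.0, T)                # hypothetical monthly exchange-rate returns

    sigmas = np.array([1.0, 2.0])              # two hypothetical predictive densities (narrow, wide)
    log_scores = np.zeros(2)                   # cumulative log predictive scores per model
    weights = np.full(2, 0.5)
    pool_log_score = 0.0

    for t in range(T):
        dens = norm.pdf(y[t], loc=0.0, scale=sigmas)     # each model's density at y_t
        combined = np.dot(weights, dens)                 # period-t combined density forecast
        pool_log_score += np.log(combined)
        log_scores += np.log(dens)
        weights = np.exp(log_scores - log_scores.max())  # softmax of cumulative scores
        weights /= weights.sum()

    print("final combination weights:", weights)
    print("cumulative log score of the combination:", round(pool_log_score, 2))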
Optimal Execution in a Hawkes Price Model and Calibration
Aurelien Alfonsi
CERMICS
We study a linear price impact model including other liquidity takers, whose flow of orders follows a Hawkes process. The optimal execution problem is solved explicitly in this context, and the closed-form optimal strategy describes in particular how one should react to the orders of other traders. We then provide some theoretical extensions and a calibration protocol for our optimal execution model. The Hawkes parameters and the propagator are estimated independently on financial data from stocks of the CAC40, and we backtest the optimal execution strategy on market data.
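The Hawkes specification lends itself to simulation. Below is a generic sketch (parameter values are illustrative, not the paper's calibrated model) of simulating the arrival times of the other liquidity takers' orders as a univariate Hawkes process with an exponential kernel, via Ogata's thinning algorithm.

    import numpy as np

    def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
        """Simulate event times with intensity
        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
        using Ogata's thinning algorithm."""
        rng = np.random.default_rng(seed)
        events, t = [], 0.0
        while True:
            # The current intensity is an upper bound until the next event occurs
            lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
            t += rng.exponential(1.0 / lam_bar)
            if t >= horizon:
                break
            lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
            if rng.uniform() * lam_bar <= lam_t:
                events.append(t)
        return np.array(events)

    times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=100.0)
    print(len(times), "simulated liquidity-taking orders")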
Intraday Trading Invariance in Foreign Exchange Futures
Torben Andersen
Kellogg
Prior work of Andersen, Bondarenko, Kyle and Obizhaeva (2015) establishes that the intraday trading patterns in the E-mini S&P 500 futures contract are consistent with the following invariance relationship: The return variation per transaction is log-linearly related to trade size, with a slope coefficient of -2. This association applies both across the intraday diurnal pattern and across days in the time series. The factor of proportionality deviates sharply from prior hypotheses relating volatility to transactions count or trading volume. This paper documents that a similar invariance relation holds for foreign exchange futures. However, the log-linear association is not fixed, but shifts over time, reflecting, all else equal, a declining trend in the average trade size. The findings are remarkably robust across the full set of currency contracts explored, providing challenges to market microstructure research to rationalize these tight intraday and intertemporal interactions among key market activity variables.
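To make the stated relation concrete, here is a schematic check on synthetic data (the data-generating process and variable names are mine, not the paper's): regress the log of return variation per transaction on the log of average trade size and inspect whether the slope is near -2.

    import numpy as np

    rng = np.random.default_rng(1)
    n_bins = 500                                      # e.g., intraday time bins pooled across days
    log_trade_size = rng.normal(3.0, 0.4, n_bins)
    # Build synthetic data in which the invariance slope of -2 holds, plus noise
    log_var_per_trade = 1.0 - 2.0 * log_trade_size + rng.normal(0, 0.2, n_bins)

    slope, intercept = np.polyfit(log_trade_size, log_var_per_trade, 1)
    print(f"estimated slope: {slope:.2f} (invariance predicts -2)")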
Main shortcomings of standard volume based trading algorithms and potential improvements of passive-volume indexing
Paul Besson
Kepler Cheuvreux
Trading algorithms are mainly split into two types, "time-scheduled" algos (VWAP, TWAP, IS) and "volume-driven" algos (Percentage of volume, In Line volume, etc.). "Volume-driven" algos are among the most popular; in this note they will be referred to as "Standard Volume" algos. Although tremendous research effort has been invested in trading algorithms over the past 20 years, most current trading algos are still widely indexed on past traded volume. This hard-wired indexing is rarely challenged, although we will show that standard "volume-driven" algos' performance could be greatly improved simply by changing the "Standard Volume" indexing to a "Passive Volume" indexing; the passive volume is simply the volume traded passively, according to the side of the transaction. Thus, for a buy order, only the "Bid volume" will be selected, while for a sell order, only the "Ask volume" will be selected. Standard Volume algos induce predictable aggressive trades: large aggressive orders in the market often trigger, in turn, aggressive trades from Standard Volume algos. This phenomenon leads to late aggressive trading, which is particularly predictable and detrimental to execution performance. In contrast, Passive Volume indexing strongly reduces the proportion of aggressive orders. With time, trading imbalances normalise, and the passive volume share becomes steadier. This makes Passive Volume indexing practical for trades with durations above 60 minutes on large caps. Passive Volume algos outperform standard ones by 0.2 spreads. According to our simulations, performance improves markedly when a Passive Volume algo is used, versus Arrival Price as well as VWAP benchmarks. The present article was prepared by the Kepler Cheuvreux Quantitative Research team.
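A minimal sketch of the indexing difference described above, assuming a hypothetical trade tape with a side flag: for a buy order, Passive Volume indexing references only the volume printed on the bid (where a passive buy could have participated) rather than total volume. The array names and participation rate are illustrative.

    import numpy as np

    # Hypothetical trade tape: quantity and the side on which each trade printed
    qty = np.array([100, 250, 80, 300, 120])
    side = np.array(["bid", "ask", "bid", "ask", "bid"])   # aggressor hit the bid / lifted the ask

    total_volume = qty.sum()
    bid_volume = qty[side == "bid"].sum()        # "Passive Volume" reference for a buy order

    participation = 0.10                         # 10% participation target
    print("Standard Volume target:", participation * total_volume, "shares")
    print("Passive Volume target: ", participation * bid_volume, "shares")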
Bio
Paul Besson is Head of Quantitative Research at Kepler Cheuvreux (Europe's largest independent brokerage), and his main areas of research are market microstructure and execution on equities. Since 2012, he has headed up Kepler Cheuvreux's research partnership with the Institut Louis Bachelier (ILB) and the Toulouse School of Economics (TSE) to promote academic research on empirical microstructure concerns. Paul joined Kepler Cheuvreux in January 2012 after 15 years in fund management. He worked as a fund manager in quantitative arbitrage, for both hedge funds and institutional funds. He also has three years' experience as a buy-side Head of Quantitative Research. In addition, Paul was a lecturer in finance for five years at Sciences-Po and HEC in Paris. Paul is a graduate of ENSAE (National School of Statistics Paris).
Intraday Market Making with Overnight Inventory Costs
Agostino Capponi
Columbia
We model a market making HFT that seeks to end the day flat to avoid overnight inventory costs. Although these costs only apply at the end of the day, they impact intraday price dynamics and generate a negative price-inventory relationship. The sensitivity of prices to inventory levels intensifies with time, strengthening price impact and widening bid-ask spreads. These predictions are consistent with U.S. Treasury data. A comparative statics analysis reveals that while inventory costs harm price stability, this effect is attenuated by higher trading activity. A welfare analysis shows that these costs have the greatest negative impact in inactive markets.
Model-Free Approaches to Discern Non-Stationary Microstructure Noise and Time-Varying Liquidity in High-Frequency Data
Richard Chen
Chicago
In this paper, we provide non-parametric statistical tools to test stationarity of microstructure noise in general hidden Ito semimartingales, and discuss how to measure liquidity risk using high-frequency financial data. In particular, we investigate the impact of non-stationary microstructure noise on some volatility estimators, and design three complementary tests by exploiting edge effects, information aggregation of local estimates and high-frequency asymptotic approximation. The asymptotic distributions of these tests are available under both stationary and non-stationary assumptions, thereby enabling us to conservatively control type-I errors while ensuring that the proposed tests enjoy asymptotically optimal statistical power. They also enable us to empirically measure aggregate liquidity risk using these test statistics. As byproducts, functional dependence and endogenous microstructure noise are briefly discussed. A simulation with a realistic configuration corroborates our theoretical results, and our empirical study indicates the prevalence of non-stationary microstructure noise on the New York Stock Exchange.
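Not the paper's tests, but a crude diagnostic in the same spirit: under i.i.d. noise, realized variance at the highest frequency is dominated by twice the noise variance per observation, so comparing the implied noise variance across sub-periods gives a rough feel for (in)stability of the noise. All numbers below are synthetic.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 23400                                        # hypothetical one-second grid over a trading day
    efficient = np.cumsum(rng.normal(0, 1e-4, n))    # latent efficient log-price
    noise_sd = np.where(np.arange(n) < n // 2, 5e-4, 1e-3)   # noise level doubles mid-day
    observed = efficient + rng.normal(0, 1, n) * noise_sd

    def implied_noise_variance(p):
        r = np.diff(p)
        # When noise dominates, E[sum r^2] is approximately 2 * n * Var(noise)
        return np.sum(r ** 2) / (2 * len(r))

    half = n // 2
    print("first half: ", implied_noise_variance(observed[:half]))
    print("second half:", implied_noise_variance(observed[half:]))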
Autoregressive Model for Matrix Valued Time Series
Han Xiao
Rutgers
In finance, economics and many other fields, observations in matrix form are often collected over time. For example, several key economic indicators are reported in different countries every quarter. Various financial characteristics of many companies are reported over time. Import-export figures among a group of countries can also be structured in a matrix form. Although it is natural to turn the matrix observations into a long vector and then use standard vector time series models, it is often the case that the columns and rows of a matrix represent different sets of information that are closely interrelated. We propose a novel matrix autoregressive model that maintains and utilizes the matrix structure to achieve greater dimension reduction as well as more easily interpretable results. The model can be further simplified by a set of reduced-rank assumptions. The estimation procedure and its theoretical properties are investigated and demonstrated with simulated and real examples.
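One natural reading of such a model (notation mine, not necessarily the authors' exact specification) is a one-lag bilinear recursion X_t = A X_{t-1} B' + E_t, which keeps the matrix structure and uses m^2 + n^2 autoregressive parameters instead of the (mn)^2 a vectorized VAR would need. The sketch below only simulates the recursion; estimation, for instance by iterated least squares, is not shown.

    import numpy as np

    rng = np.random.default_rng(3)
    m, n, T = 4, 3, 100                                     # e.g., 4 indicators observed for 3 countries
    A = 0.5 * np.eye(m) + 0.05 * rng.normal(size=(m, m))    # row (indicator) dynamics
    B = 0.5 * np.eye(n) + 0.05 * rng.normal(size=(n, n))    # column (country) dynamics

    X = np.zeros((T, m, n))
    for t in range(1, T):
        X[t] = A @ X[t - 1] @ B.T + rng.normal(scale=0.1, size=(m, n))

    print("last observation:")
    print(X[-1])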
The Drift Burst Hypothesis
Kim Christensen
Aarhus
The drift burst hypothesis postulates the existence of short-lived locally explosive trends in the price paths of financial assets. The recent US equity and treasury flash crashes can be viewed as two high profile manifestations of such dynamics, but we argue that drift bursts of varying magnitude are an expected and regular occurrence in financial markets that can arise through established mechanisms of liquidity provision. At a theoretical level, we show how to build drift bursts into the continuous-time Itô semi-martingale model in such a way that the fundamental arbitrage-free property is preserved. We then develop a non-parametric test statistic that allows for the identification of drift bursts from noisy high-frequency data. We apply this methodology to a comprehensive set of tick data and show that drift bursts form an integral part of the price dynamics across equities, fixed income, currencies and commodities. A majority of the identified drift bursts are accompanied by price reversion and can therefore be regarded as "flash crashes." The reversal is found to be stronger for negative drift bursts with large trading volume, which is consistent with endogenous demand for immediacy during market crashes.
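The paper's test statistic (built on left-sided kernel estimates with a formal asymptotic theory) is not reproduced here; the following stylized local drift-to-volatility ratio only conveys the idea that a drift burst shows up as an unusually large standardized local trend. Window choices are arbitrary.

    import numpy as np

    def drift_burst_ratio(returns, halflife=50):
        """A t-statistic-like ratio of exponentially weighted local drift to
        local volatility; large absolute values flag explosive local trends."""
        w = 0.5 ** (np.arange(len(returns))[::-1] / halflife)   # heavier weight on recent returns
        drift = np.sum(w * returns) / np.sum(w)
        vol = np.sqrt(np.sum(w * returns ** 2) / np.sum(w))
        return np.sqrt(np.sum(w)) * drift / vol

    rng = np.random.default_rng(4)
    r = rng.normal(0, 1e-4, 2000)
    print("no burst:  ", round(drift_burst_ratio(r), 2))
    r[-100:] += 5e-5                       # inject a short-lived local trend at the end
    print("with burst:", round(drift_burst_ratio(r), 2))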
Frequent Batch Auctions vs Continuous Trading in Stocks: Effect on Volume, Liquidity & Crash Risk
Ravi Jagannathan
Kellogg
TBA
The Impact of Jumps on Carry Trade Return
Suzanne Lee
Georgia Tech
This paper investigates how jump risks are priced in currency markets. We find that currencies whose changes are more sensitive to negative market jumps provide significantly higher expected returns. The positive risk premium constitutes compensation for the extreme losses during periods of market turmoil. Using this empirical finding, we propose a jump-modified carry trade strategy, which has approximately 2 percentage points (per annum) higher returns than the regular carry trade strategy. These findings result from the fact that negative jump betas are significantly related to the riskiness of currencies and business conditions.
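Illustrative only: a crude "negative-jump beta" on synthetic data, regressing a currency's returns on a market factor using only observations where the market move is extremely negative. The paper's jump identification and pricing tests are more careful; every number below is made up.

    import numpy as np

    rng = np.random.default_rng(5)
    T = 5000
    market = rng.standard_t(df=3, size=T) * 0.003                  # heavy-tailed market factor
    currency = 0.2 * market + np.where(market < -0.01, 0.6 * market, 0.0) \
               + rng.normal(0, 0.002, T)                           # extra sensitivity to large drops

    neg_jump = market < np.quantile(market, 0.01)                  # proxy for negative market jumps
    beta_all = np.polyfit(market, currency, 1)[0]
    beta_neg = np.polyfit(market[neg_jump], currency[neg_jump], 1)[0]
    print(f"full-sample beta: {beta_all:.2f}, negative-jump beta: {beta_neg:.2f}")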
Trading Illiquid Goods: Market Making as a Sequence of Sealed-Bid Auctions, with Analytic Results
Andrew Papanicolaou
NYU Tandon
We provide analytic results for the optimal control problem faced by a market maker who can only obtain and dispose of inventory via a sequence of sealed-bid auctions. Under the assumption that the best competing response is exponentially distributed around a commonly discerned fair market price, we examine properties of the market maker's optimal behavior. We show that simple adjustments to skew and width accommodate customer arrival imbalance. We derive a straightforward relationship between the market maker's fill probability and direct holding costs. A simple formula for optimal bidding in terms of (non-myopic) inventory cost is presented. We present the results as a perturbation of an improvement to a "constant width, linear skew" (CWLS) market making heuristic.
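A toy Monte Carlo under the stated distributional assumption (here a one-sided exponential competing bid; the parameters are hypothetical): it only illustrates the fill-probability versus edge trade-off a bidder faces in a single auction, not the paper's dynamic control problem or its analytic solution.

    import numpy as np

    rng = np.random.default_rng(6)
    fair = 100.0
    lam = 2.0                                           # competing bid = fair - Exp(lam)
    competing_bids = fair - rng.exponential(1.0 / lam, 100_000)

    for skew in (0.1, 0.3, 0.5):                        # how far below fair value we bid
        our_bid = fair - skew
        fill_prob = np.mean(our_bid > competing_bids)   # analytically exp(-lam * skew)
        print(f"skew {skew:.1f}: fill prob {fill_prob:.2f}, expected edge per auction {fill_prob * skew:.3f}")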
Large-dimensional factor modeling based on high-frequency observations
Markus Pelger
Stanford
This paper develops a statistical theory to estimate an unknown factor structure based on financial high-frequency data. I derive an estimator for the number of factors and consistent and asymptotically mixed-normal estimators of the loadings and factors under the assumption of a large number of cross-sectional and high-frequency observations. The estimation approach can separate factors for continuous and rare jump risk. The estimators for the loadings and factors are based on the principal component analysis of the quadratic covariation matrix. The estimator for the number of factors uses a perturbed eigenvalue ratio statistic. The results are obtained under general conditions that allow for a very rich class of stochastic processes and for serial and cross-sectional correlation in the idiosyncratic components. I apply the approach to the U.S. equity market and find four stable continuous factors and one stable jump factor.
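A bare-bones version of the estimation idea (not the paper's full procedure: no jump separation and no perturbation of the eigenvalue ratio), on synthetic data with three factors: build the realized covariance matrix from high-frequency returns, eigen-decompose it, and look at ratios of consecutive eigenvalues to guess the number of factors.

    import numpy as np

    rng = np.random.default_rng(7)
    n_assets, n_obs, k = 50, 780, 3                       # 780 ~ 30-second returns in one trading day
    factors = rng.normal(0, 1e-3, (n_obs, k))
    loadings = rng.normal(0, 1, (k, n_assets))
    returns = factors @ loadings + rng.normal(0, 5e-4, (n_obs, n_assets))

    realized_cov = returns.T @ returns                    # quadratic covariation estimate
    eigvals = np.sort(np.linalg.eigvalsh(realized_cov))[::-1]
    ratios = eigvals[:-1] / eigvals[1:]
    print("estimated number of factors:", int(np.argmax(ratios[:10])) + 1)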
High-Frequency Jump Analysis of the Bitcoin Market
Olivier Scaillet
Geneva
We use the database leak of the exchange Mt. Gox to analyze the dynamics of the price of bitcoin over the period June 2011 to November 2013. The data set gives us a rare opportunity to observe the emergence of a retail-focused, highly speculative and unregulated market at tick frequency, with trader identifiers at the transaction level. We find 124 jump dates, or about one jump per week, and relate them to market activity and liquidity factors. Jumps are frequent events and they cluster in time. They are predicted by order flow imbalance and the preponderance of aggressive traders, as well as a widening of the bid-ask spread. Jumps have a short-term positive impact on market activity and illiquidity and are associated with a persistent change in the price.
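The paper's jump methodology is not reproduced here; as a generic illustration, a crude threshold rule in the spirit of Lee and Mykland (2008) flags returns that are many local (bipower-style) scale units away from zero. The window and multiplier below are arbitrary and the price path is synthetic.

    import numpy as np

    def flag_jumps(log_prices, window=120, multiplier=5.0):
        r = np.diff(log_prices)
        flags = np.zeros(r.shape, dtype=bool)
        for i in range(window, len(r)):
            local = r[i - window:i]
            # bipower-style local scale, robust to the return being tested
            scale = np.sqrt(np.mean(np.abs(local[1:]) * np.abs(local[:-1])))
            flags[i] = np.abs(r[i]) > multiplier * scale
        return flags

    rng = np.random.default_rng(8)
    p = np.cumsum(rng.normal(0, 1e-4, 5000))              # synthetic log-price path
    p[3000:] += 0.003                                      # inject a single jump
    print("flagged return indices:", np.flatnonzero(flag_jumps(p)))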
Exact Bayesian Moment Based Inference for the Distribution of the Small-Time Movements of an Ito Semimartingale
George Tauchen
Duke
We modify the Gallant-Tauchen (1996) efficient method of moments (EMM) to perform exact Bayesian inference, where exact means no reliance on asymptotic approximations. We use this modification to evaluate the empirical plausibility of recent predictions from high frequency financial theory regarding the small-time movements of an Ito semimartingale. The theory indicates that the probability distribution of the small moves should be locally stable around the origin. It makes no predictions regarding large rare jumps, which get filtered out. Our exact Bayesian procedure imposes support conditions on parameters as implied by this theory. The empirical application uses S&P Index options extending over a wide range of moneyness, including deep out-of-the-money puts. The evidence is consistent with a locally stable distribution valid over most of the support of the observed data while mildly failing in the extreme tails, about which the theory makes no prediction. We undertake diagnostic checks on all aspects of the procedure. In particular, we evaluate the distributional assumptions regarding a semi-pivotal statistic, and we test by Monte Carlo that the posterior distribution is properly centered with short credible intervals. Taken together, our results suggest a more important role than previously thought for pure jump-like models with a diminished, if not absent, diffusive component.
Nonparametric Option-based Volatility Estimation
Viktor Todorov
Kellogg
In this talk we first review the different methods for recovering volatility non-parametrically from high-frequency return data. We then derive analogues of some of these methods for recovering volatility from options written on the underlying asset. The option data is observed with error and we prove the consistency of the option-based volatility estimators. We further derive a Central Limit Theorem for the estimators. The limiting distribution is mixed-Gaussian and depends on the quality of the option data on the given date as well as on the overall state of the economy. We compare the option and return based volatility estimators and present numerical experiments documenting the superior performance of the former.
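As background only (this is the textbook spanning identity, not the talk's estimators, which handle observation error in the option prices and come with a central limit theory), recall why option prices carry volatility information: for a continuous price path, the risk-neutral expected integrated variance over [t, T] can be recovered from out-of-the-money option prices,

    \[
      \mathbb{E}^{\mathbb{Q}}_t\!\left[\int_t^{T}\sigma_s^2\,ds\right]
      \;=\; 2\,e^{r(T-t)}\left(\int_0^{F_t}\frac{P_t(K)}{K^2}\,dK
      \;+\; \int_{F_t}^{\infty}\frac{C_t(K)}{K^2}\,dK\right),
    \]

where F_t is the forward price and P_t(K), C_t(K) are observed put and call prices; price jumps add correction terms ignored here.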
Efficient estimation of Value at Risk by Effective Data Pooling
Ruey Tsay
Chicago Booth
Value at Risk is one of the most important measures used in financial risk management. It is a challenging problem to obtain a reliable estimate of VaR, since there is often a lack of sufficient observations to accurately estimate the tail quantile. In this talk we propose a general approach of pooling data from other 'similar' companies to increase the available data and to obtain more accurate estimates. A similarity measure is proposed to identify the nearest neighbors, and a bandwidth selection procedure is developed for practical guidance. Simulation and real data examples are presented.
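A toy version of the pooling idea, assuming the 'similar' companies have already been identified (the talk's similarity measure and bandwidth selection are not modeled): with only 250 own observations, the 1% quantile rests on two or three order statistics, whereas pooling enlarges the effective sample.

    import numpy as np

    rng = np.random.default_rng(9)
    own = rng.standard_t(df=4, size=250) * 0.01             # one year of daily returns (hypothetical)
    peers = rng.standard_t(df=4, size=(5, 250)) * 0.01      # five hypothetical "similar" companies

    var_own = -np.quantile(own, 0.01)
    var_pooled = -np.quantile(np.concatenate([own, peers.ravel()]), 0.01)
    print(f"1% VaR from own data only: {var_own:.4f}")
    print(f"1% VaR from pooled data:   {var_pooled:.4f}")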
When Moving Average Models Meet High-Frequency Data: Uniform Inference on Volatility
Dacheng Xiu
Chicago Booth
In this paper, we propose a general framework of volatility inference with noisy high-frequency data. The observed transaction price follows a continuous-time Ito semimartingale, contaminated by a discrete-time MA(∞) noise associated with the arrival of trades. Our estimator is obtained by maximizing the likelihood of a misspecified MA model with homoscedastic innovations. We show that this quasi-likelihood estimator is consistent with respect to the quadratic variation of the semimartingale, and that the estimator is asymptotically mixed normal. We propose an AIC/BIC-type criterion for order selection, and establish uniformly valid inference on volatility, while allowing for model selection mistakes. In addition, our estimator is adaptive to the presence of the noise, and its convergence rate varies from n^{-1/4} to n^{-1/2}, depending on the magnitude of the noise. We thereby provide uniform inference on volatility over small and large noise magnitudes. Finally, we present the semiparametric efficiency bound on volatility estimation, from which our estimator deviates only slightly. To implement our likelihood estimator, we adopt a Kalman filter and a state-space representation, which is tuning-free and guarantees a positive estimate in finite samples. In contrast, we show that the classical Whittle approximation is inconsistent under in-fill asymptotics.
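This is not the paper's MA(∞) quasi-likelihood estimator, but a minimal moment-matching illustration of why an MA structure arises: with i.i.d. noise e, observed returns are y_i = r_i + e_i - e_{i-1}, so the first-order autocovariance identifies the noise variance and gamma_0 + 2*gamma_1 recovers the efficient return variance (the classic first-order autocovariance correction). All parameters below are synthetic.

    import numpy as np

    rng = np.random.default_rng(10)
    n = 23400
    efficient_returns = rng.normal(0, 2e-4, n)     # increments of the latent semimartingale
    noise = rng.normal(0, 5e-4, n + 1)             # i.i.d. microstructure noise
    y = efficient_returns + np.diff(noise)         # observed returns: MA(1)-contaminated

    gamma0 = np.var(y)
    gamma1 = np.mean((y[1:] - y.mean()) * (y[:-1] - y.mean()))
    print("noise variance estimate:     ", -gamma1)                     # truth: 2.5e-07
    print("integrated variance estimate:", n * (gamma0 + 2 * gamma1))   # truth: 9.36e-04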
HFT Principles
Steve Xu
Hehmeyer LLC
HFT is a highly debated topic, and one widely misunderstood by the media, the public and even many professional market practitioners. Often structured as proprietary trading businesses, most HFT firms try to keep an ultra-low profile to protect their trade secrets. Unfortunately, this secrecy does not help their public image, even though HFT is, more than any other investment or trading method, the result of combining finance, science and engineering. In this talk, Mr. Xu will demystify what HFT really is, correct some common misconceptions, explain the basic principles behind HFT trading, introduce some common HFT strategy types, and explain why, in his view, HFT has existed, will exist and must exist in one form or another in a free marketplace.
Bio
Mr. Steve Xu received his bachelor's degrees in Geology and Economics from Peking University and his master's degrees in Geophysics and Financial Engineering from the University of Michigan. In 2005, he joined Jump Trading as a trader specializing in index futures trading. In 2009, he joined HTG Capital Partners (now renamed Hehmeyer LLC) as a partner and built a trading group specializing in trading index futures, treasury futures, commodity futures and options. Steve is president of the Chinese Trading and Derivatives Association.
A reduced-form model for level-1 limit order books
Tzu-Wei Yang
Minnesota
One popular approach to modeling the level-1 limit order book dynamics of the best bid and ask is to use reduced-form diffusion approximations. It is well known that the biggest contributing factor to the price movement is the imbalance of the best bid and ask. We investigate level-1 limit order book data for a basket of stocks and study the numerical evidence on drift, correlation, volatility and their dependence on the imbalance. Based on these numerical discoveries, we develop a nonparametric discrete model for the dynamics of the best bid and ask, which can be approximated by a reduced-form model with analytical tractability that can fit the empirical data on correlation, volatilities and the probability of price movement simultaneously. This is joint work with Prof. Lingjiong Zhu.
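Illustrative only, on synthetic data in which the imbalance genuinely drives the next move: the level-1 imbalance bid_size / (bid_size + ask_size), and the empirical frequency of an upward mid-price move by imbalance bucket. Real level-1 data would replace the synthetic rule below.

    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000
    bid_size = rng.integers(1, 100, n)
    ask_size = rng.integers(1, 100, n)
    imbalance = bid_size / (bid_size + ask_size)
    up_move = rng.uniform(size=n) < imbalance          # synthetic rule: P(up move) = imbalance

    for lo, hi in zip(np.arange(0, 1, 0.2), np.arange(0.2, 1.2, 0.2)):
        mask = (imbalance >= lo) & (imbalance < hi)
        print(f"imbalance in [{lo:.1f}, {hi:.1f}): fraction of up moves = {up_move[mask].mean():.2f}")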
Generalized Autoregressive Conditional Frechet Models for Maxima
Zhengjun Zhang
Wisconsin
This talk introduces a novel dynamic framework that integrates the static generalized extreme value (GEV) distribution with a dynamic modeling approach for the modeling of maxima in financial time series. Specifically, generalized autoregressive conditional Fréchet (GACF) models for maxima are proposed. The GACF allows for a time-varying scale parameter (volatility) and shape parameter (tail index) of the Fréchet distribution. The GACF model provides a direct and accurate modeling of the time-varying behavior of maxima and offers a new angle to study tail risk dynamics in financial markets. Probabilistic properties of GACF are fully studied, and an irregular maximum likelihood estimator is used for model estimation, with its statistical properties investigated. A simulation study shows the flexibility of GACF and confirms the reliability of its estimators. The results of two real data examples in which GACF is used for market tail risk monitoring and VaR calculation are presented, where significant improvement over the static GEV has been observed. The empirical results of GACF are consistent with the finding in the dynamic peak-over-threshold (POT) literature that the tail index of financial markets varies through time. (Joint work with Zifeng Zhao and Rong Chen.)
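A stylized dynamic Fréchet simulation, not necessarily the paper's exact GACF recursions (which also let the tail index vary): here the log scale of the Fréchet distribution follows an autoregression fed by the previous maximum, producing clustering in extremes. All coefficients are illustrative.

    import numpy as np

    rng = np.random.default_rng(12)
    T = 1000
    alpha = 3.0                                # tail index, held constant in this sketch
    log_sigma = np.zeros(T)                    # log scale (the "volatility" of maxima)
    Q = np.zeros(T)                            # simulated daily maxima
    for t in range(1, T):
        log_sigma[t] = -0.1 + 0.9 * log_sigma[t - 1] + 0.05 * np.log1p(Q[t - 1])
        # Frechet(alpha, sigma) by inverse transform: sigma * (-log U)^(-1/alpha)
        Q[t] = np.exp(log_sigma[t]) * (-np.log(rng.uniform())) ** (-1.0 / alpha)

    print("last five simulated maxima:", np.round(Q[-5:], 3))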
Dark Pool Trading: A Hawkes Process Approach
Lingjiong Zhu
Florida State
Dark pools are automated trading facilities which do not display bid and ask quotes to the public. In this talk, we use the Hawkes process to model the clustered arrival of trades in a dark pool and analyze various performance metrics, including time-to-first-fill, time-to-complete-fill and the expected fill rate of a resting dark order. This is based on joint work with Xuefeng Gao and Xiang Zhou.

The Stevanovich Center is supported by the generous philanthropy of
University of Chicago Trustee Steve G. Stevanovich, AB '85, MBA '90.