We propose a model to describe stock pinning on option expiration dates. We argue that if the open interest in a particular contract is unusually large, aggregate Delta-hedging by market-makers can impact the stock price and drive it to the strike price of the option. We derive a stochastic differential equation for the stock price with a singular drift that accounts for the price impact of Delta-hedging. According to this model, the stock price has a finite probability of pinning at a strike. We calculate this probability analytically and numerically in terms of the volatility of the stock, the time to maturity, the open interest for the option under consideration, and a "price-elasticity" constant that models price impact. We also present strong evidence for the validity of the model, based on historical data from 1996-2004.
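The pinning mechanism described above can be illustrated with a toy Monte Carlo experiment. The sketch below is not the paper's exact equation: it uses a stylized drift proportional to -d(Delta)/dt of the Black-Scholes delta (one plausible sign choice for hedgers who are net long options), zero rates, an ad-hoc cap on the singular drift near expiry, and made-up parameters for the "price-elasticity" constant and open interest.

```python
import numpy as np
from math import erf, log, sqrt

def bs_delta(S, K, sigma, tau):
    """Black-Scholes call delta with zero rates (a simplifying assumption)."""
    d1 = (log(S / K) + 0.5 * sigma**2 * tau) / (sigma * sqrt(tau))
    return 0.5 * (1.0 + erf(d1 / sqrt(2.0)))

def pinning_probability(impact, S0=100.0, K=100.0, sigma=0.3, T=20 / 252,
                        n_steps=200, n_paths=1000, eps=0.5, seed=0):
    """Fraction of Euler paths ending within eps of the strike under a
    stylized price-impact drift (illustrative parameters throughout)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    pinned = 0
    for _ in range(n_paths):
        S = S0
        for i in range(n_steps - 1):          # stop one step short of expiry
            tau = T - i * dt
            # finite-difference time derivative of the hedgers' delta
            d_delta_dt = (bs_delta(S, K, sigma, max(tau - dt, 1e-6))
                          - bs_delta(S, K, sigma, tau)) / dt
            drift = -impact * d_delta_dt      # pulls S toward K near expiry
            drift = max(min(drift, 2000.0), -2000.0)   # cap the singular drift
            S += drift * dt + sigma * S * sqrt(dt) * rng.standard_normal()
            S = max(S, 1e-3)
        if abs(S - K) < eps:
            pinned += 1
    return pinned / n_paths

p_impact = pinning_probability(impact=200.0)   # with price impact
p_none = pinning_probability(impact=0.0)       # pure lognormal benchmark
```

With the impact term switched on, the estimated pinning probability rises well above the benchmark frequency of the stock ending near the strike by chance.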
Likelihood Estimation of Latent Affine Processes
This article develops a direct filtration-based maximum likelihood methodology for estimating the parameters and realizations of affine processes with latent state variables. Rather than working with probability densities, which are not generally known in continuous-time finance models, a procedure is developed for recursively updating the associated characteristic functions of latent variables conditional upon past discrete-time data. Filtered estimates of latent variable realizations are directly generated within the procedure, while the likelihood function of observed data necessary for parameter estimation can be evaluated numerically by Fourier inversion. An application to daily stock index returns over 1953-96 reveals substantial divergences from EMM-based estimates of latent stochastic volatility and jump risk -- in particular, more substantial and time-varying jump risk. The relevance for pricing stock index options is discussed.
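The Fourier-inversion step that turns a conditional characteristic function into likelihood values can be sketched on a case with a known answer. The example below is a minimal illustration with hypothetical numerical parameters (truncation point, grid size), not the article's filtration algorithm: it recovers a normal density from its characteristic function by a trapezoid-rule inversion.

```python
import numpy as np

def density_from_cf(cf, x, u_max=50.0, n=4000):
    """Recover a density from its characteristic function by Fourier
    inversion: f(x) = (1/pi) * Integral_0^inf Re[exp(-i*u*x) cf(u)] du,
    evaluated with a simple trapezoid rule (illustrative parameters)."""
    u = np.linspace(1e-10, u_max, n)
    du = u[1] - u[0]
    g = np.real(np.exp(-1j * np.outer(x, u)) * cf(u))
    return (g[:, :-1] + g[:, 1:]).sum(axis=1) * du / (2.0 * np.pi)

# sanity check: the N(mu, sig^2) characteristic function
mu, sig = 0.1, 0.2
cf = lambda u: np.exp(1j * u * mu - 0.5 * (sig * u) ** 2)
x = np.array([-0.2, 0.1, 0.4])
f = density_from_cf(cf, x)
f_exact = np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
```

The numerically inverted values agree with the closed-form normal density to within the quadrature error, which is the accuracy one needs when such values feed a likelihood evaluation.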
Alexandre d'Aspremont (Department of Electrical Engineering and Computer Science, University of California, Berkeley) email@example.com
Moment Approach to the Static Arbitrage Problem on Baskets
We consider the problem of computing upper and lower bounds on the price of a European basket call option, given prices on other similar baskets. We focus here on an interpretation of this problem as a generalized moment problem, using results by Berg & Maserick (1984), Putinar & Vasilescu (1999) and Lasserre (2001) on harmonic analysis on semigroups, the K-moment problem and its applications to optimization. These allow us to derive tractable necessary and sufficient conditions for the absence of static arbitrage between basket straddles, hence on basket calls and puts.
Pricing Models from Option Prices: A Statistical Approach
to an Ill-Posed Inverse Problem
Keywords: Bayesian methods, evolutionary algorithms, ill-posed inverse problems, model calibration, particle methods, option pricing.
The inverse problem of recovering an option pricing model (or risk-neutral process) from a set of given market prices of options, known in finance as the model calibration problem, has been treated in the literature either as an exact inversion in the presence of continuum data or as an optimization problem involving non-linear least squares or regularized versions thereof. When applied to a given set of market prices, these methods yield a single set of model parameters calibrated to the market, whereas in principle (infinitely) many solutions can exist. The non-uniqueness of the solution is not simply a mathematical nuisance: it reflects model uncertainty and should not be neglected.
We describe here a statistical approach to the model calibration problem, which allows for incomplete data and takes into account the multiplicity of solutions: we propose a random search algorithm which converges to a random sample from the set of calibrated models. Starting from an IID population of candidate solutions drawn from a prior distribution on the set of model parameters, the population of parameters is updated through cycles of independent random moves followed by "selection" using the calibration criterion. We examine conditions under which such an evolving population converges to a set of calibrated models.
Through an analogy with systems of interacting particles, a "propagation of chaos" result allows us to interpret the result of our algorithm as a random IID sample drawn from the set of calibrated models, whose heterogeneity can be used to quantify the degree of ill-posedness of the inverse problem. Building upon this idea, we propose a minimax measure of model uncertainty for the price of an exotic option which takes into account the value of liquidly traded ("vanilla") options.
Our algorithm yields a computable example of coherent and convex measures of risk, which are compatible with observed prices of benchmark options.
We test this approach both on simulated data and empirical data sets of index and foreign exchange options in the context of diffusion models.
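The mutation-then-selection cycle described above can be sketched on a toy calibration problem. The example below is purely illustrative and not the paper's particle method: the hypothetical "pricing model" depends only on the sum of its two parameters, so the calibrated set is a whole line, and the heterogeneity of the surviving population along that line is exactly the kind of spread that quantifies ill-posedness.

```python
import numpy as np

rng = np.random.default_rng(1)

def model_price(theta):
    """Toy pricing model (hypothetical): the price depends only on the sum
    of the two parameters, so infinitely many parameter vectors calibrate."""
    return np.exp(-theta[..., 0] - theta[..., 1])

market_price = np.exp(-1.0)           # any theta with a + b = 1 fits exactly

# prior: IID uniform population of candidate parameter vectors
pop = rng.uniform(0.0, 1.0, size=(100, 2))
err = (model_price(pop) - market_price) ** 2

scale = 0.1
for gen in range(400):
    prop = pop + scale * rng.standard_normal(pop.shape)   # random move
    perr = (model_price(prop) - market_price) ** 2
    better = perr < err                                   # "selection" step
    pop[better] = prop[better]
    err[better] = perr[better]
    scale *= 0.985                                        # anneal move size

s = pop[:, 0] + pop[:, 1]                 # all close to the calibrated line
spread = np.std(pop[:, 0] - pop[:, 1])    # heterogeneity of calibrated models
```

Every particle ends up (essentially) calibrated, while the population remains spread out along the direction the data cannot pin down — a sample from the set of calibrated models rather than a single point estimate.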
Hedge Fund Performance and Risk Profile Analysis: Non-Linear Statistics and Risk Factor Identification
Hedge fund positional transparency has raised a number of issues between investors and funds-of-funds managers on the one hand, and hedge fund managers on the other. We show that, in general, this controversy is largely irrelevant: appropriate statistical techniques allow one to extract most of the risk information from historical return series. Moreover, a large part of this information cannot be detected merely by knowing the positions of the fund at a given date.
We will, in particular, focus on the importance of taking into account the non-linear relationship between hedge fund returns and market factors. We shall also show the relevance of rolling statistics of the market in the explanation of return series.
Demonstration of Riskdata's Fund Risk Profiling tool FOFiX® (poster session)
We present how Riskdata's fund analyser works in practice to produce the risk profile of hedge funds and of funds of hedge funds, that is, how market factors may impact the returns of the fund or the fund of funds. During the demo, we compare, on actual fund data series, the various statistical methods presented during the talk, mainly through back-testing results of these methods.
Nicole EL KAROUI (Centre de Mathématiques Appliquées Ecole Polytechnique) firstname.lastname@example.org
Joint work with Asma Meziou.
Risk Measures and Robust Optimization Problems
We discuss the structure of convex risk measures and the solution of some related robust optimization problems. The talk will be based on joint work with A. Schied and on recent results by A. Schied and A. Gundel.
We consider stochastic volatility diffusion models where volatility is driven by two factors running on short and long time scales respectively. Perturbation techniques, both singular and regular, are very efficient for approximating option prices. We show that five parameters are needed to capture the main effects due to stochastic volatility. Furthermore we reduce the parametrization to four effective parameters which can easily be calibrated to the implied volatility surface. Finally we explain how to use these parameters to price other exotic derivatives. Joint work with G. Papanicolaou, R. Sircar and K. Solna. Papers available at: www.math.ncsu.edu/~fouque/PubliFM.
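In the one-factor singular-perturbation version of this framework, the first-order implied volatility approximation is affine in the log-moneyness-to-maturity ratio (LMMR), so the effective parameters can be read off the surface by a linear fit. The sketch below is a minimal illustration with made-up market data and parameter values, not the full four-parameter two-scale calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical market data: spot, strikes, maturities, implied vols
S = 100.0
K = np.array([80, 90, 100, 110, 120, 85, 95, 105, 115], dtype=float)
T = np.array([0.25, 0.25, 0.25, 0.25, 0.25, 0.5, 0.5, 0.5, 0.5])
lmmr = np.log(K / S) / T                 # log-moneyness-to-maturity ratio

a_true, b_true = -0.05, 0.22             # skew slope and vol level (made up)
iv = b_true + a_true * lmmr + 0.001 * rng.standard_normal(K.size)

# least-squares fit of the two effective parameters: iv ~ a * LMMR + b
a_hat, b_hat = np.polyfit(lmmr, iv, 1)
```

Because the approximation is linear in the effective parameters, the calibration reduces to an ordinary regression across the whole surface, which is what makes this parametrization so convenient in practice.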
Craig Alan Friedman (Standard and Poor's and New York University's Courant Institute of Mathematical Sciences) email@example.com
We review a coherent, financially based approach for measuring model performance and building probabilistic models that learn from data. We give information theoretic interpretations of our model performance measures and provide new generalizations of entropy and Kullback-Leibler relative entropy. For investors with utility functions in a three-parameter logarithmic family, our model building method leads to a regularized relative entropy minimization. We review applications of this methodology to two credit problems: estimating the conditional probability of default, given side information and estimating the conditional density of recovery rates of defaulted debt, given side information.
Hélyette Geman (DESS 203 "Security Markets, Commodity Markets and Risk Management" University Paris Dauphine & ESSEC) firstname.lastname@example.org
Jump Lévy Processes for Asset Price Modelling
Articles: Pure Jump Lévy Processes for Asset Price Modelling.pdf
Stochastic Volatility for Lévy Processes.pdf
The goal of the paper is to show that some types of Lévy processes, such as the hyperbolic motion and the CGMY, are particularly suitable for asset price modelling and option pricing. We wish to review some fundamental mathematical properties of Lévy processes, such as infinite divisibility, and how they translate into observed features of asset price returns. We explain how these processes are related to Brownian motion, the central process in finance, through stochastic time changes which can in turn be interpreted as a measure of the economic activity. Lastly, we focus on two particular classes of pure jump Lévy processes, the generalized hyperbolic and CGMY models, and report on the goodness of fit obtained both on stock prices and option prices.
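The time-change construction mentioned above can be sketched for one classical pure-jump example: a Brownian motion with drift run on a gamma clock (the variance gamma process). The parameter values below are hypothetical; the point is only the mechanics of subordinating Brownian motion to a random "business time".

```python
import numpy as np

def variance_gamma_path(T=1.0, n=252, theta=-0.1, sigma=0.2, nu=0.2, seed=0):
    """Pure-jump Lévy increments via Brownian motion on a gamma clock:
    X_t = theta * G_t + sigma * W(G_t), with G a gamma subordinator with
    mean t and variance nu * t (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    # gamma time increments with mean dt and variance nu * dt
    dG = rng.gamma(shape=dt / nu, scale=nu, size=n)
    # conditionally normal increments evaluated in business time
    dX = theta * dG + sigma * np.sqrt(dG) * rng.standard_normal(n)
    return np.cumsum(dX)

path = variance_gamma_path()
```

Over a long horizon the sample drift of the path recovers theta, consistent with the subordinator having mean rate one per unit of calendar time.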
Peter W. Glynn (Department of Management Science and Engineering, Stanford University) email@example.com
Estimation Methods for Discretely Observed Markov Processes
When Markov processes are continuously observed, it is generally possible to write down the likelihood explicitly. Given the statistical efficiency of maximum likelihood-based methods, the corresponding maximum likelihood estimators are generally then the method of choice for parameter estimation. However, in many settings, the processes of interest are not continuously observed. The difficulty of computing the corresponding transition density that enters the likelihood then creates a tension between what is statistically efficient and what is computationally tractable. In particular, one may need to consider non-likelihood based methods for computing parameter estimates. In this talk, we will discuss some of the mathematical and computational issues that arise at this interface between computation and statistics.
Yevgeny Goncharov (Department of Mathematics, University of Michigan, Ann Arbor) firstname.lastname@example.org
New Approaches to Valuation of CMO's (poster session)
The popularity of Collateralized Mortgage Obligations declined in recent years due to losses experienced by CMO investors during refinancing waves. The inability to properly hedge CMOs can be partially attributed to the complexity of their valuation. Cash flows in CMOs can be very complex, and this Gordian knot is currently cut with simulations of prepayment that demand a great deal of computational time. I present two new ideas to remove this necessity. Price representations are given as Feynman-Kac expectations, which allows me to calculate the price via a PDE. In one case the PDE is formulated with the terminal condition given on a manifold rather than on a hyperplane "t = maturity."
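The Feynman-Kac-to-PDE step invoked above can be sketched on the simplest possible case. The example below is not a CMO valuation: it solves the zero-rate Black-Scholes pricing PDE for a European call by explicit finite differences in x = log S and checks the result against the closed form, illustrating how a price written as an expectation can instead be computed from its terminal-value PDE.

```python
import numpy as np
from math import erf, log, sqrt

def bs_call(S, K, sigma, T):
    """Closed-form Black-Scholes call price with zero rates."""
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return S * N(d1) - K * N(d2)

def fd_call(S0=100.0, K=100.0, sigma=0.2, T=1.0, nx=241, nt=400):
    """Explicit finite differences for the pricing PDE in x = log S
    (zero rates, a simplifying assumption):
        V_t + 0.5 * sigma^2 * (V_xx - V_x) = 0,  V(T, x) = max(e^x - K, 0)."""
    x = np.linspace(log(S0) - 3.0, log(S0) + 3.0, nx)
    dx = x[1] - x[0]
    dt = T / nt                                     # satisfies stability here
    V = np.maximum(np.exp(x) - K, 0.0)              # terminal condition
    for _ in range(nt):
        Vxx = (V[2:] - 2.0 * V[1:-1] + V[:-2]) / dx**2
        Vx = (V[2:] - V[:-2]) / (2.0 * dx)
        V[1:-1] += dt * 0.5 * sigma**2 * (Vxx - Vx)  # step backward in time
        V[0], V[-1] = 0.0, np.exp(x[-1]) - K         # boundary values
    return float(np.interp(log(S0), x, V))

price_pde = fd_call()
price_exact = bs_call(100.0, 100.0, 0.2, 1.0)
```

The same pattern — expectation replaced by a terminal-value PDE solved on a grid — is what removes the need for brute-force path simulation.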
Victor Isakov (Department of Mathematics and Statistics, Wichita State University, Wichita, KS 67260-0033, USA) email@example.com
We consider the problem of recovering the time-independent volatility from current market data. By using the Dupire equation we reduce this problem to an inverse problem for a parabolic equation with final overdetermination. We review available uniqueness and stability results for this inverse problem and two numerical algorithms, based on the use of the fundamental solution and on a linearization. We discuss the results of their numerical tests and further possibilities and challenges.
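The Dupire equation underlying this reduction can be sketched numerically. The example below uses the zero-rate form of Dupire's formula, sigma_loc^2(K, T) = 2 * (dC/dT) / (K^2 * d2C/dK2), with derivatives taken by finite differences on synthetic Black-Scholes call prices; recovering the constant input volatility is a standard sanity check, not the inverse algorithm of the talk.

```python
import numpy as np
from math import erf, log, sqrt

def bs_call(S, K, sigma, T):
    """Black-Scholes call price with zero rates (a simplification)."""
    d1 = (log(S / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return S * N(d1) - K * N(d2)

def dupire_local_vol(S, K, T, sigma_true=0.25, dK=0.5, dT=1e-3):
    """Local volatility from call prices via Dupire's formula, with the
    derivatives evaluated by central finite differences."""
    C = lambda k, t: bs_call(S, k, sigma_true, t)
    dC_dT = (C(K, T + dT) - C(K, T - dT)) / (2.0 * dT)
    d2C_dK2 = (C(K + dK, T) - 2.0 * C(K, T) + C(K - dK, T)) / dK**2
    return sqrt(2.0 * dC_dT / (K**2 * d2C_dK2))

vol = dupire_local_vol(S=100.0, K=105.0, T=0.5)
```

With a flat input volatility the formula returns that same value, which is the consistency one exploits before feeding real, noisy surfaces into the inverse problem.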
We formulate the optimal hedging problem when the underlying stock price has jumps, especially for insiders who have more information than the general public. The jumps in the underlying price process depend on another diffusion process, which models a sequence of firm-specific information. This diffusion process is observed only by insiders. Nevertheless, the market is incomplete for insiders as well as for the general public. We use the local risk minimization method to find a closed form for an optimal hedging strategy. We also provide a numerical example of the value process of an option based on the local risk minimization approach in this setting.
Wei Li (Department of Finance, Henry B. Tippie College of Business, The University of Iowa, Iowa City, IA 52242-1000) firstname.lastname@example.org
Joint work with Ashish Tiwari email@example.com
Performance Chasing, Mutual Fund Tournaments, and Managerial Incentives (poster session)
Why do mutual fund investors chase past winner funds despite the absence of performance persistence among such funds? In this paper we adopt a tournament framework to analyze the incentives of two fund managers, with unequal performance at an interim stage, who compete for investor cash flows. Our model is characterized by an absence of differential ability among competing fund managers. We show that in equilibrium (a) it is optimal for the fund manager who is trailing behind at the interim stage (i.e., the interim loser) to increase the idiosyncratic risk of her portfolio, and (b) risk-averse investors anticipate the incentives facing losing fund managers and rationally chase winners. Our analysis yields a number of testable predictions. In particular, we show that the increase in the idiosyncratic risk of the interim loser manager's portfolio is directly related to the magnitude of the performance gap at the interim stage, and to the strength of the investor (cash flow) response to the relative performance rankings of the funds (i.e., the strength of the tournament effect). Furthermore, we show that the ex-ante utility of long-term fund investors is decreasing in the strength of the tournament effect. Our results have implications for several aspects of fund design including the optimal fund entry/exit policy, and choice of organizational form (i.e., closed-end vs. open-end).
Keywords: Nash Equilibrium, Portfolio risk, Mutual Fund Tournaments, Delegated Portfolio Management, Performance Chasing, Managerial Incentives.
It is known that the Black-Scholes model does not price all European options quoted on a given market in a consistent way. In reality, the implied volatility generally shows a dependence on both the option maturity and the strike. The aim of this talk is to incorporate the effect of this dependence in the pricing and hedging of structured securities, with particular interest in securities that depend on many assets, each showing a volatility smile/skew.
We start from the formulation of an embryonic stochastic volatility model and of its projection onto the local volatility manifold. Both models have the advantage of being as tractable as Black and Scholes', a combination of simplicity and tractability that makes them extremely appealing to practitioners. We then show that these models can be extended in an intuitive way from the univariate to the multivariate setting. In particular, in the local volatility version, the resulting theory allows one to sample from an entirely new type of dynamics that still enjoys internal consistency with the observed volatility surfaces for the individual securities, but with strong computational implications for the calculation of prices of European options on baskets of securities.
Rituparna Sen (Department of Statistics, University of Chicago) firstname.lastname@example.org
An important aspect of the stock price process, which has often been ignored in the financial literature, is that prices on organized exchanges are restricted to lie on a grid. We consider pure jump models for the stock price process which integrate the randomness of jump times with the discreteness of the jump size. Convergence, estimation, discrete-time approximation, and uniform integrability conditions for this model are studied. The effect of stochastic volatility is studied in this setting. A Bayesian filtering technique is proposed as a tool for risk neutral valuation and hedging. This emphasizes the need for using statistical information for valuation of derivative securities, rather than relying on implied quantities.
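The basic ingredient — random jump times combined with tick-sized jumps — can be sketched directly. The simulator below is an illustrative construction, not the speaker's exact model: jump times arrive as a Poisson process and each jump moves the price by a random integer number of ticks, so every simulated price lies exactly on the exchange grid.

```python
import numpy as np

def simulate_tick_price(S0=100.0, tick=0.01, lam=50.0, T=1.0,
                        p_up=0.5, seed=0):
    """Pure-jump price path restricted to a tick grid (hypothetical
    parameters): Poisson(lam*T) jump times, integer-tick jump sizes."""
    rng = np.random.default_rng(seed)
    n_jumps = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0.0, T, n_jumps))          # jump epochs
    # signed integer number of ticks per jump (at least one tick)
    sizes = rng.choice([-1, 1], n_jumps, p=[1 - p_up, p_up]) \
        * (1 + rng.poisson(0.5, n_jumps))
    prices = S0 + tick * np.cumsum(sizes)
    return times, prices

times, prices = simulate_tick_price()
```

Unlike a diffusion discretized after the fact, every value the process ever takes is a legal quoted price, which is the feature the estimation theory in the talk is built around.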
Michael Stutzer (Burridge Center for Securities Analysis and Valuation, Leeds School of Business) email@example.com
Razor Critique of Investor Objective Functions: Neither Samuelson nor Rabin and Thaler Are Right
Influential early articles by Paul Samuelson advocated use of expected concave utility of wealth criteria in T-repeated betting and investment problems. He and other founders of modern decision theory viewed their work as normative prescriptions for choice under uncertainty; not just as predictive theories of pre-existing behavior. Results of Rabin (Econometrica, 2000), exposited and applied in Rabin and Thaler (Journal of Economic Perspectives, 2001, pp. 219-232), directly challenged both the prescriptive and predictive usefulness of any expected concave utility criterion in these settings. As a predictive alternative, they advocated the use of loss averse preferences as a substitute for expected concave utility.
While different systems of preference axioms have been found that respectively imply the use of expected utility and loss aversion criteria, they are not normatively convincing. Moreover, neither loss aversion criteria nor several other alternatives to expected utility do anything to solve a problem that plagues both the prescriptive and predictive use of expected utility: prescriptive results critically depend on practically unobservable, adjustable preference parameters, and hence on ad-hoc techniques for attempting to identify them indirectly.
As a simpler alternative criterion, this paper proposes the probability of outperforming an observable benchmark the agent wants to beat. This criterion does not suffer from the possible ills of some other probabilistic criteria that were (influentially) critiqued by Samuelson. Large deviations theory is used to show that for suitably large T, this criterion is equivalent to maximizing an expected CRRA (power) habit-formation utility, but with a coefficient of risk aversion that varies endogenously with the alternative evaluated. This eliminates the adjustable curvature parameter used in other expected and non-expected utility (e.g. loss aversion) preference theories, in accord with the scientific principle of parsimonious parameterization called Ockham's Razor.
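The proposed criterion — maximize the probability of outperforming a benchmark — can be sketched by direct Monte Carlo. The example below is a toy with made-up return parameters, not the paper's large-deviations analysis: it scans constant-rebalanced mixes of a risky asset and the benchmark and picks the weight with the highest estimated outperformance probability over the horizon.

```python
import numpy as np

rng = np.random.default_rng(2)

# IID per-period log-returns of a risky asset and a benchmark growth rate
# (all parameters hypothetical)
T_periods, n_sims = 40, 4000
mu, sig, bench = 0.06, 0.20, 0.03
r = rng.normal(mu, sig, size=(n_sims, T_periods))
bench_gross = np.exp(bench) - 1.0

weights = np.linspace(0.0, 1.2, 25)
probs = []
for w in weights:
    # wealth of a constant-rebalanced mix of the asset and the benchmark
    gross = 1.0 + w * (np.exp(r) - 1.0) + (1.0 - w) * bench_gross
    log_wealth = np.log(gross).sum(axis=1)
    # estimated probability of beating the benchmark over T periods
    probs.append(np.mean(log_wealth > T_periods * bench))
probs = np.asarray(probs)
best_w = weights[np.argmax(probs)]
```

Note that no curvature parameter is specified anywhere: the only inputs are the return distribution and the observable benchmark, which is exactly the parsimony the abstract argues for; the large-deviations result then links this criterion, for large T, to an expected power-utility problem with an endogenous risk-aversion coefficient.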
Peter Tankov (Centre de Mathématiques Appliquées, Ecole Polytechnique, Palaiseau, France) firstname.lastname@example.org
Non-Parametric Calibration of Jump-Diffusion Option-Pricing Models
Joint work with Rama Cont.
We present a non-parametric method for calibrating jump-diffusion and, more generally, exponential Lévy models to a finite set of observed option prices. We show that the usual formulations of the inverse problem via nonlinear least squares are ill-posed and propose a regularization method based on relative entropy: we reformulate the calibration problem as that of finding a risk-neutral exponential Lévy model that minimizes a weighted sum of the pricing error and the relative entropy of the pricing measure with respect to a chosen prior model. We discuss the numerical implementation of our method using a gradient-based optimization algorithm and show, both theoretically and via simulation tests on various examples, that the entropy penalty resolves the numerical instability of the calibration problem. Finally, we apply our method to data sets of index options and discuss the empirical results obtained.
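The structure of the regularized objective — pricing error plus a relative-entropy penalty against a prior — can be sketched in a deliberately tiny setting. The example below is not the paper's gradient method for Lévy measures: it calibrates a discrete risk-neutral distribution over a one-parameter exponential-tilt family, with a made-up prior, option quote, and penalty weight.

```python
import numpy as np

# prior model: discrete distribution of terminal stock prices (hypothetical)
S = np.linspace(50.0, 150.0, 201)
prior = np.exp(-0.5 * ((S - 100.0) / 10.0) ** 2)
prior /= prior.sum()

K = 105.0
payoff = np.maximum(S - K, 0.0)
market_price = 3.5                  # observed option quote (made up)
alpha = 0.1                         # weight on the entropy penalty

def tilt(lam):
    """Exponential tilting of the prior: q_i proportional to
    p_i * exp(lam * payoff_i)."""
    w = prior * np.exp(lam * payoff)
    return w / w.sum()

def objective(lam):
    """Weighted sum of squared pricing error and KL(q || prior)."""
    q = tilt(lam)
    price = float(np.dot(q, payoff))
    kl = float(np.sum(q * np.log(q / prior)))
    return (price - market_price) ** 2 + alpha * kl

best_lam = min(np.linspace(-1.0, 1.0, 2001), key=objective)
q_star = tilt(best_lam)
calibrated_price = float(np.dot(q_star, payoff))
```

The entropy term keeps the calibrated measure anchored to the prior, so the fitted price lands strictly between the prior's price and the market quote rather than chasing the quote exactly — the stabilizing effect the abstract describes.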
Thaleia Zariphopoulou (Department of Mathematics, University of Texas at Austin) email@example.com
A class of optimal investment and consumption models in incomplete market environments will be analyzed. The focus will be on a universal characterization of the optimal portfolios (myopic and excess risky demand) in terms of hedging strategies of supporting pseudoclaims. These claims are written on the market price of risk and are priced by indifference. Recent results on indifference prices will be used for the sensitivity and robustness analysis of the optimal investments. Issues related to model specification, and to the interplay between market incompleteness and risk preferences, will be also discussed.
This work incorporates the systematic risk of regime shifts into a general equilibrium model of the term structure of interest rates. The model shows that there is a new source of time-variation in bond term premiums in the presence of regime shifts. This new component is a regime-switching risk premium that depends on the covariations between discrete changes in marginal utility and bond prices across different regimes. A closed-form solution for the term structure of interest rates is obtained under an affine model using log-linear approximation. The model is estimated by Efficient Method of Moments. The regime-switching risk is found to be statistically significant and to affect mostly the long end of the yield curve. This is a joint work with Shu Wu at the University of Kansas.
JEL Classification: G12, E43
Key Words: The Term Structure, General Equilibrium, Markov Regime Shifts
Gady Zohar (Faculty of IE & Management Technion - Israel Institute of Technology) firstname.lastname@example.org
Yields in Bond Hedging (poster session)
Joint work with Haim Reisman.
We explore a dynamic term structure factor model that implicitly allows for arbitrage opportunities, and we estimate it on Treasury data. Using this model we construct instantaneously risk free portfolios, and we write down a formula for their possibly non-zero excess returns. Our model anticipates that such excess returns may be quite large. When testing the performance of these portfolios we find that their returns in practice perfectly match what our model predicts. An important implication of our approach is that hedging against factor risk may involve substantial excess gains or losses, that can be determined by our model.
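The construction of an instantaneously risk-free portfolio in a factor model amounts to finding weights with zero exposure to every factor; in the model above, such a portfolio can still carry a non-zero model-implied excess return. The sketch below uses random, hypothetical factor loadings and expected excess returns, and computes factor-neutral weights from the null space of the loading matrix.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical factor loadings of 5 bonds on 2 term-structure factors
B = rng.normal(size=(5, 2))
mu = rng.normal(0.0, 0.01, size=5)   # made-up expected excess returns

# rows of Vt beyond the rank of B.T span its null space: B.T @ v = 0
_, _, Vt = np.linalg.svd(B.T)
w = Vt[2]                            # one factor-neutral portfolio
w = w / np.abs(w).sum()              # normalize gross exposure to 1

factor_exposure = B.T @ w            # ~ 0 by construction: riskless portfolio
excess = float(mu @ w)               # possibly non-zero model excess return
```

With more bonds than factors the null space is non-trivial, so such portfolios always exist; whether their excess return is zero is exactly the no-arbitrage restriction the model deliberately relaxes and then tests on Treasury data.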