
Abstracts for IMA Workshop

Model Implementation, Algorithms and Software Issues


May 3-7, 2004

**Pierre
Collin-Dufresne** (University of California, Berkeley)
dufresne@andrew.cmu.edu

**Identification
and Estimation of 'Maximal' Affine Term Structure Models: An
Application to Stochastic Volatility**

Slides: pdf

Paper:
pdf

We propose a canonical representation for affine term structure
models where the state vector comprises the first few
Taylor-series components of the yield curve and their quadratic
(co-)variations. With this representation: (i) the state
variables have simple physical interpretations such as level,
slope and curvature, (ii) their dynamics remain affine and tractable,
(iii) the model is by construction `maximal' (i.e., it is the
most general model that is econometrically identifiable), and
(iv) model-insensitive estimates of the state vector process
implied from the term structure are readily available. (Furthermore,
this representation may be useful for identifying the state
variables in a squared-Gaussian framework where typically there
is no one-to-one mapping between observable yields and latent
state variables). We find that the `unrestricted' A_{1}(3)
model of Dai and Singleton (2000) estimated by `inverting' the
yield curve for the state variables generates volatility estimates
that are *negatively* correlated with the time series
of volatility estimated using a standard GARCH approach. This
occurs because the `unrestricted' A_{1}(3) model imposes
the restriction that the volatility state variable is simultaneously
a linear combination of yields (i.e., it impacts the cross-section
of yields), and the quadratic variation of the spot rate process
(i.e., it impacts the time-series of yields). We then investigate
the A_{1}(3) model which exhibits `unspanned stochastic
volatility' (USV). This model predicts that the cross section
of bond prices is independent of the volatility state variable,
and hence breaks the tension between the time-series and cross-sectional
features of the term structure inherent in the unrestricted
model. We find that explicitly imposing the USV constraint on
affine models significantly improves the volatility estimates,
while maintaining a good fit cross-sectionally.
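A GARCH benchmark of the kind used in the volatility comparison above can be sketched as a GARCH(1,1) variance recursion; the parameter values below (`omega`, `alpha`, `beta`) are illustrative placeholders, not estimates from the paper.

```python
import numpy as np

def garch11_variance(returns, omega=1e-6, alpha=0.05, beta=0.90):
    """Filter the GARCH(1,1) conditional variance through a return series:

        h_t = omega + alpha * r_{t-1}**2 + beta * h_{t-1}

    The parameters (omega, alpha, beta) are illustrative, not estimated.
    """
    h = np.empty(len(returns))
    h[0] = np.var(returns)  # initialize at the sample variance
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h
```

In practice the parameters would be fit by quasi-maximum likelihood; the filtered series `sqrt(h)` is the kind of time series against which model-implied volatility estimates are compared.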

**Ron
S. Dembo**
(Founding Chairman, Algorithmics Incorporated) dembo@algorithmics.com

**Risk
Measurement; Risk Architecture and the Bank of the Future**

Slides: html
pdf
ps
ppt

Measuring the risk of a large financial institution is a gargantuan task. There have been major improvements in doing so over the past few years. These have resulted in the ability of institutions to take on more and more complexity, thereby keeping the risk management treadmill alive and well. We discuss how risk is actually measured, describe some major new accomplishments, such as real-time risk based on simulation, and highlight some of the interesting research problems that are being addressed, such as real-time bank-wide optimization.

**Gregory
R. Duffee**
(Haas School of Business, University of California-Berkeley)
duffee@haas.berkeley.edu
http://faculty.haas.berkeley.edu/duffee/

**Estimation
of Dynamic Term Structure Models**

Slides:
duffee_ima_present.pdf

Paper: duffee_stanton.pdf

This talk discusses the finite sample properties of some of the standard techniques used to estimate modern term structure models. For sample sizes similar to those used in most empirical work, I note three surprising conclusions. First, maximum likelihood produces strongly biased parameter estimates. Second, despite having the same asymptotic efficiency as maximum likelihood, the small sample performance of Efficient Method of Moments (a commonly used method for estimating complicated models) is unacceptable even in the simplest term structure settings. Third, the linearized Kalman filter is a tractable and reasonably accurate estimation technique that I recommend in settings where maximum likelihood is impractical.
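In a scalar linear-Gaussian setting, the Kalman filter recommended above reduces to the standard predict/update recursion. The state-space model below is a generic sketch (an AR(1) latent factor observed through a single noisy measurement), not the paper's specification.

```python
import numpy as np

def kalman_filter(y, a, b, q, c, d, r, x0=0.0, p0=1.0):
    """Scalar linear-Gaussian Kalman filter.

    State:       x_t = a + b * x_{t-1} + w_t,  w_t ~ N(0, q)
    Observation: y_t = c + d * x_t + v_t,      v_t ~ N(0, r)

    Returns the filtered state means. In a term structure application the
    observation y_t would be a yield; all parameters here are placeholders.
    """
    x, p = x0, p0
    means = []
    for obs in y:
        # predict
        x_pred = a + b * x
        p_pred = b * b * p + q
        # update
        s = d * d * p_pred + r            # innovation variance
        k = p_pred * d / s                # Kalman gain
        x = x_pred + k * (obs - (c + d * x_pred))
        p = (1.0 - k * d) * p_pred
        means.append(x)
    return np.array(means)
```

The "linearized" variant in the talk applies this same recursion after linearizing an observation equation that is only approximately affine in the state.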

**Philip
H. Dybvig**
(Washington University in Saint Louis) pdybvig@dybfin.wustl.edu

**Mandatory
or Voluntary Retirement**

Slides:
pdf

Saving for retirement is a primary end purpose of many parts of the financial sector, including pension plans, life insurance, and indeed much of retail banking and brokerage. As a step towards understanding these markets, we solve the lifetime consumption and investment problem of a competitive agent who faces voluntary or mandatory retirement. The model includes such realistic features as stochastic age-dependent wage, age-dependent life-table mortality and age-dependent preferences for working as well as a constraint that prevents borrowing against future labor income. The tightly approximated model is solved parametrically in the dual, in closed form up to determination of some constants. The solution uses the technique of Carr (for American Options) and Liu and Loewenstein (for transaction costs) of making the nonstationary problem into a sequence of stationary problems by approximating a fixed horizon by a sequence of stationary random horizons.

**Jean-Pierre
Fouque**
(Department of Mathematics, North Carolina State University)
fouque@math.ncsu.edu
http://www.math.ncsu.edu/~fouque

**Variance
Reduction for MC Methods to Evaluate Option Prices Under Multi-Factor
Stochastic Volatility Models**

Slides:
pdf

We present variance reduction methods for Monte Carlo simulations to evaluate European and Asian options in the context of multiscale stochastic volatility models. European option price approximations, obtained from singular and regular perturbation analysis [J.P. Fouque, G. Papanicolaou, R. Sircar and K. Solna: Multiscale Stochastic Volatility Asymptotics, SIAM Journal on Multiscale Modeling and Simulation 2(1), 2003], are used in importance sampling techniques, and their efficiencies are compared. Then we investigate the problem of pricing arithmetic average Asian options (AAOs) by Monte Carlo simulations. A two-step strategy is proposed to reduce the variance, where geometric average Asian options (GAOs) are used as control variates. Due to the lack of analytical formulas for GAOs, it is then necessary to consider efficient Monte Carlo methods to estimate the unbiased means of GAOs. The second step consists in deriving formulas for approximate prices based on perturbation techniques, and in computing GAOs by using importance sampling. Numerical results illustrate the efficiency of our method.

Joint work with Chuan-Hsiang (Sean) Han.
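A minimal version of the control-variate step can be sketched under constant-volatility geometric Brownian motion, where (unlike in the multiscale stochastic volatility models of the talk) the discretely monitored geometric-average Asian call does admit a closed form. The arithmetic Asian is priced by Monte Carlo with the geometric Asian as control:

```python
import numpy as np
from math import log, exp, sqrt
from statistics import NormalDist

def geometric_asian_call(s0, k, r, sigma, t, n):
    """Closed-form price of a discretely monitored geometric-average Asian
    call under GBM (monitoring at i*t/n, i = 1..n)."""
    dt = t / n
    mu = log(s0) + (r - 0.5 * sigma**2) * dt * (n + 1) / 2   # mean of ln G
    v = sigma**2 * dt * (n + 1) * (2 * n + 1) / (6 * n)      # variance of ln G
    d1 = (mu - log(k) + v) / sqrt(v)
    d2 = d1 - sqrt(v)
    N = NormalDist().cdf
    return exp(-r * t) * (exp(mu + 0.5 * v) * N(d1) - k * N(d2))

def arithmetic_asian_cv(s0, k, r, sigma, t, n, n_paths, seed=0):
    """Monte Carlo price of an arithmetic-average Asian call, using the
    geometric-average Asian payoff as a control variate."""
    rng = np.random.default_rng(seed)
    dt = t / n
    z = rng.standard_normal((n_paths, n))
    log_s = log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                + sigma * sqrt(dt) * z, axis=1)
    s = np.exp(log_s)
    arith = np.exp(-r * t) * np.maximum(s.mean(axis=1) - k, 0.0)
    geo = np.exp(-r * t) * np.maximum(np.exp(log_s.mean(axis=1)) - k, 0.0)
    beta = np.cov(arith, geo)[0, 1] / np.var(geo, ddof=1)  # control coefficient
    cv_price = arith.mean() - beta * (geo.mean()
                                      - geometric_asian_call(s0, k, r, sigma, t, n))
    return cv_price, arith.mean()
```

Because the arithmetic and geometric averages are highly correlated, the control variate typically reduces the standard error by more than an order of magnitude relative to the plain estimate.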

**Paul
Glasserman** (Graduate School of Business, Columbia
University) pg20@columbia.edu

**Monte Carlo Pricing of American Options: Overview
and New Results**

Slides: pdf

Paper: pdf

An American option allows the holder to choose the time of exercise, so valuing such an option entails solving an optimal stopping problem. This "free boundary" problem presents a challenge for Monte Carlo methods. The first part of this talk will be an overview of methods developed in recent years to address this problem. These methods apply weighted backward induction to simulated paths, with weights defined through likelihood ratios, through calibration, or implicitly through regression. The second part of this talk analyzes conditions for convergence as both the number of paths and number of basis functions for regression grow. Using polynomials in the regressions, the number of paths must grow exponentially with the number of basis functions to assure convergence when applied to Brownian motion, faster when applied to geometric Brownian motion. This analysis is based on joint work with Bin Yu.
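The regression-based variant mentioned above is exemplified by least-squares Monte Carlo. The sketch below prices a Bermudan put under geometric Brownian motion with a quadratic polynomial basis; the basis choice is illustrative, and is exactly the kind of choice the convergence analysis in the talk constrains.

```python
import numpy as np

def american_put_lsm(s0, k, r, sigma, t, n_steps, n_paths, seed=0):
    """Least-squares Monte Carlo (regression-based backward induction) for a
    Bermudan put approximating the American put. Quadratic basis in the spot."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    disc = np.exp(-r * dt)
    z = rng.standard_normal((n_paths, n_steps))
    s = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                              + sigma * np.sqrt(dt) * z, axis=1))
    cash = np.maximum(k - s[:, -1], 0.0)        # payoff at maturity
    for i in range(n_steps - 2, -1, -1):
        cash *= disc                            # discount back one step
        itm = k - s[:, i] > 0                   # regress only in-the-money paths
        if itm.sum() > 3:
            x = s[itm, i]
            coef = np.polyfit(x, cash[itm], 2)  # fitted continuation value
            cont = np.polyval(coef, x)
            exercise = (k - x) > cont
            idx = np.where(itm)[0][exercise]
            cash[idx] = k - s[idx, i]           # exercise now on these paths
    return disc * cash.mean()
```

With a fixed, small basis and many paths this gives a low-biased estimate of the American put value; the talk's result concerns how fast the number of paths must grow as the basis grows.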

**David
C. Heath **(Department
of Mathematical Sciences, Center for Computational Finance,
Carnegie Mellon University) heath@red.math.cmu.edu

**Efficient
Option Valuation Using (Non-Recombining) Trees **

Paper: pdf

Joint work with Stefano Herzel.

We propose an algorithm for the discrete approximation of continuous market price processes which uses trees instead of lattices. We show that it is convergent when used to price both European and American options and that it is more efficient, for some models, than the usual recombining schemes.
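For contrast, the "usual recombining scheme" that the tree algorithm is benchmarked against can be sketched as a Cox-Ross-Rubinstein binomial lattice; this is a textbook baseline, not the authors' algorithm.

```python
import numpy as np

def crr_american_put(s0, k, r, sigma, t, n):
    """Cox-Ross-Rubinstein recombining binomial lattice for an American put."""
    dt = t / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)          # risk-neutral up probability
    disc = np.exp(-r * dt)
    j = np.arange(n + 1)
    v = np.maximum(k - s0 * u**j * d**(n - j), 0.0)   # terminal payoffs
    for step in range(n - 1, -1, -1):
        j = np.arange(step + 1)
        s = s0 * u**j * d**(step - j)
        cont = disc * (p * v[1:] + (1 - p) * v[:-1])  # continuation value
        v = np.maximum(cont, k - s)                   # early exercise check
    return v[0]
```

A recombining lattice has O(n^2) nodes; a non-recombining tree has exponentially many, which is why the efficiency claim in the abstract is notable.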

**Dmitry
Kramkov**
(Center for Computational Finance, Carnegie Mellon University)
kramkov@andrew.cmu.edu

**Risk-Tolerance
Wealth Processes and Sensitivity Analysis of Utility Based
Prices**

Slides:
pdf

We present the asymptotic analysis of the marginal utility based prices of contingent claims in incomplete financial models with respect to the number of these claims held in the portfolio. Our main result states that such an approximation preserves a number of important qualitative properties of the original utility based prices if and only if there is a risk-tolerance wealth process. The talk is based on a joint paper with Mihai Sîrbu.

**Joseph
Langsam** (Morgan Stanley) Joseph.Langsam@morganstanley.com

**Changing
Dynamics in the Securities Market**

Slides: html pdf ps ppt

Mathematical finance has evolved since the early days of Black-Scholes with the assumptions of lognormal dynamics, constant interest rates, and constant volatility. The growth of the derivatives market and product innovation in new markets has forced "Wall Street" to confront the complexities of far more generalized dynamics. In this talk, I will review the dynamics, complexities, and apparent conundrums in the modeling of a variety of financial products including interest rate, foreign exchange, equity, electricity, and credit products. Many questions will be asked and many problems posed, but few answers will be given and fewer solutions will be offered.

**Michael
Ludkovski**
(Department of Operations Research and Financial Engineering,
Princeton University) mludkovs@Princeton.EDU

**Convenience
Yield Model with Partial Observations and Exponential Utility**
(poster session)

Joint work with Rene Carmona.

We consider the problem of pricing claims for delivery of crude oil or natural gas to a given location. We work with a three-factor model for the asset spot, the convenience yield, and the locational basis. The convenience yield is taken to be unobserved and must be filtered. Our methodology is indifference pricing with exponential utility. Assuming the basis is independent of the spot, the partially observed stochastic control problem can be expressed as a Feynman-Kac expectation. If the basis is also independent of the convenience yield, the resulting indifference price is trivial. Otherwise, we show how to numerically compute the expectation using a Kalman or particle filter. The basic model may be generalized to include nonlinear dynamics. We finish by performing comparative statics and relating the results to the full information setting.

**Curt
Randall**
( SciComp Inc.) randall@scicomp.com

**Software
Synthesis - Pricing without Programming**

Software synthesis methods applied to the development of derivative pricing and hedging models allow quantitative analysts, researchers, and risk managers to rapidly generate models without programming. The high-level language developed for SciFinance will be used to illustrate how a financial compiler generates source code. This presentation will show how a 10-line specification for a complex financial derivative can generate a ready-to-use model to price the instrument, often comprising thousands of lines of source code. Examples will be shown over several asset classes using both PDE and Monte Carlo methods. Attendees will be given a web link to access papers and a free sample pricing code that illustrates the use of software synthesis.

**Mathias
Rousset **(
Lab. Statistique et Probabilités, Université Paul Sabatier)
rousset@cict.fr

**Sampling
Prescribed Distributions with Interacting Particle Systems**

We present a new class of interacting Metropolis models having a prescribed limiting distribution. In contrast to traditional Monte-Carlo methods, and when the population size is large, the decay to equilibrium does not depend on the target distribution. Some conclusive simulations are presented, focusing on diffusive models and their implementation.
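The traditional non-interacting baseline that the abstract contrasts with is plain random-walk Metropolis, whose convergence rate does depend on the target distribution. A minimal sketch:

```python
import numpy as np

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Plain random-walk Metropolis sampler with Gaussian proposals.

    log_target: log of the (unnormalized) target density.
    """
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_target(x)
    out = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        # accept with probability min(1, target(prop) / target(x))
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out[i] = x
    return out
```

For example, `metropolis(lambda x: -0.5 * x * x, 0.0, 20000)` draws approximately standard-normal samples. For multimodal or high-dimensional targets this single chain can mix slowly, which is the motivation for interacting-particle variants of the kind presented in the talk.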

**Louis
Scott** (Morgan Stanley & Co.) Louis.Scott@morganstanley.com

**Stochastic
Volatility and Jumps: Risk Management and Hedging Strategies**

Slides:
LOS_Slides_SVJ_2004.pdf

The talk covers stochastic volatility and jumps from a risk management perspective. Topics include gap risk in the underlying prices, gap risk in the option implied volatilities, and some analysis of the effects on the greeks (risk exposures). Examples for stock index options are covered. Question: what should you do with the skew in a stress test for stock prices? A typical stress test is to decrease stock prices by 20% and increase the at-the-money implied volatilities by something more than 20%.

1) Overview

- Volatility Risk and Jump Risk (Gap Risk)
- Discipline and Risk Management
- The Role of Models
- Greeks: delta, gamma, kappa/vega, theta, PV01
- Stress Tests and Scenario Analyses: revalue portfolios for extreme, but plausible, market changes
- Valuation and Relative Value Trading
- Additional Tools for Understanding Market Dynamics and Risks
- New Markets: CDOs and Basket Default Swaps
- Elements of gap risk
- Stress Tests: what happens to tranches when one or several names go into financial distress?
- Model Correlation for Relative Value Trading

2) Overview of Stochastic Volatility/Jump Models

3) Hedging Stochastic Volatility Risk

- Measuring and Hedging Kappa/Vega Risk
  - Compute kappa for all options and manage overall kappa exposure
  - Compute kappa from the Black-Scholes model
  - Bucket kappa exposure by type and term
- Hedging Stochastic Volatility
  - Treat stochastic volatility as another random state variable and compute its partial
  - Compute vol exposure from the stochastic volatility model

4) Managing Jump Risk

- Need to balance long/short positions in options
- Brute Force: simulate jumps and revalue all positions, both options and hedges
- Compute a 99% VaR Loss
- May need to use additional measures of tail risk
- Run scenarios which incorporate plausible jump risks, or the ones that could be most damaging
- Equity markets down 30% and increase equity implied volatilities
- Recompute kappa or stochastic volatility risk under each scenario

5) Several Examples for Stressing Equity Option Skew Curves

6) Evaluating the Risks of Option Writing

- Net short equity options, delta hedge using spot or futures
  - This trading strategy receives the equity volatility risk premium.
  - Exposure is volatility risk and jump risk.
  - The risk premium reflects the fact that these risks occur at bad times.
  - These risks are highly correlated with negative returns on market portfolios.
- Sell puts and delta hedge with long spot/futures positions
  - Stocks drop suddenly and implied volatilities increase.
  - Strategy is a double loser under this scenario.
- Sell calls and delta hedge with short spot/futures positions
  - Stock prices drop: gain on short call positions, but lose on hedges.
  - Delta on the calls decreases and the gain on the calls is smaller because of option gamma.
  - The increase in implied volatility reduces the gain on the call position even further.
- Long residential mortgages, short an interest rate option
  - No risk premium for FX volatility and interest rate volatility.

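The kappa/vega computation and stress revaluation in the outline above can be illustrated with Black-Scholes formulas. The stress sizes below echo the talk's example (spot down 20%, implied volatility up by more), but the exact bump sizes are illustrative choices.

```python
from math import log, sqrt, exp
from statistics import NormalDist

cdf = NormalDist().cdf
pdf = NormalDist().pdf

def bs_call(s, k, r, sigma, t):
    """Black-Scholes European call price."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * cdf(d1) - k * exp(-r * t) * cdf(d2)

def bs_vega(s, k, r, sigma, t):
    """Kappa/vega: sensitivity of the option price to volatility."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    return s * pdf(d1) * sqrt(t)

def stress_pnl(s, k, r, sigma, t, ds=-0.20, dvol=0.25):
    """Full revaluation under an example stress: spot down 20%, implied
    volatility up 25% relatively (illustrative bump sizes)."""
    base = bs_call(s, k, r, sigma, t)
    stressed = bs_call(s * (1 + ds), k, r, sigma * (1 + dvol), t)
    return stressed - base
```

A book of long at-the-money calls has positive kappa and loses under this joint scenario, which is the "double loser" dynamic the put-writing example describes from the short side.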

**Steven
E. Shreve** (Department of Mathematical Sciences,
Carnegie Mellon University) shreve@matt.math.cmu.edu

**A Two-Person Game for Pricing Convertible Bonds **

Slides:
pdf
ps

A firm issues a convertible bond. At each subsequent time, the bondholder must decide whether to keep the bond, thereby collecting coupons, or to convert it into stock. The bondholder wishes to choose a conversion strategy to maximize the bond value. Subject to some restrictions, the bond can be called by the issuing firm, which acts to maximize equity value and thus minimize bond value. This creates a two-person game, and we model the bond price as the value of this game. We show, however, that under the assumption that dividends are paid at a lower rate than the short-term interest rate, this game reduces to one of two optimal stopping problems, and which is the relevant problem can be determined a priori.

Because the dividends paid depend on the value of equity, which in turn depends on the value of the bond, the dynamics of the firm value cannot be specified until the bond pricing problem is solved. As a result, the optimal stopping problems which must be solved lead to nonlinear partial differential equations. These can be solved by a fixed-point method.

This work is Mihai Sîrbu's Ph.D. dissertation.

**Mihai
Sîrbu** (Department of Mathematical Sciences,
Carnegie Mellon University) msirbu@andrew.cmu.edu
http://www.math.cmu.edu/users/msirbu/

**Perpetual
Convertible Bonds ** (poster
session)

Slides: pdf

In a model similar to that of Steven E. Shreve's presentation ("A Two-Person Game for Pricing Convertible Bonds"), we consider the problem of pricing a convertible bond that has no maturity date. The problem reduces to solving a nonlinear ODE and to a min-max argument. The perpetual convertible bond represents the asymptotic behavior for the finite maturity case. The presentation is based on joint work with Igor Pikovsky and Steven E. Shreve.

**Srdjan
D. Stojanovic**
(Department of Mathematical Sciences, University of Cincinnati)
srdjan@math.uc.edu
http://math.uc.edu/~srdjan/

**Pricing
Options Under Stochastic Volatility: Complete Solution**

Paper: pdf

We have found, at least from the practical point of view, the complete solution of the option pricing problem for underlying securities obeying stochastic volatility price dynamics. In particular, we have found the exact expression for the "market price of volatility risk." The pricing problem is reduced to solving an uncoupled system of a Monge-Ampère type PDE and a Black-Scholes type PDE. The general problem of hedging in such an environment is solved too. Results of computational experiments will be presented as well.

**Yong
Zeng**
(Department of Mathematics and Statistics, University of Missouri
at Kansas City) zeng@mendota.umkc.edu
http://mendota.umkc.edu

**A
General Equilibrium Model of the Term Structure of Interest
Rates Under Regime-Switching Risk**

Slides:
pdf
ps

Paper: pdf
ps

This work incorporates the systematic risk of regime shifts into a general equilibrium model of the term structure of interest rates. The model shows that there is a new source of time-variation in bond term premiums in the presence of regime shifts. This new component is a regime-switching risk premium that depends on the covariations between discrete changes in marginal utility and bond prices across different regimes. A closed-form solution for the term structure of interest rates is obtained under an affine model using log-linear approximation. The model is estimated by Efficient Method of Moments. The regime-switching risk is found to be statistically significant and mostly affects the long end of the yield curve. This is joint work with Shu Wu at the University of Kansas.

**Yong
Zeng**
(Department of Mathematics and Statistics, University of Missouri
at Kansas City) zeng@mendota.umkc.edu
http://mendota.umkc.edu

**A
Class of Micro-Movement Models of Asset Price with Continuous-Time
Bayesian Inference via Filtering** (poster
session)

A rich class of micro-movement models that describe transactional price behavior is proposed. The model ties the sample characteristics of micro-movement and macro-movement together in a consistent manner. An important feature of the model is that it can be transformed into a filtering problem with counting process observations. Consequently, the complete information of price and trading time is captured and then utilized in Bayesian inference via filtering for parameter estimation and model selection. The evolution equations characterizing likelihoods, posteriors, and Bayes factors are derived. Recursive algorithms are constructed via the Markov chain approximation method to compute likelihoods, posteriors, and Bayes factors. The consistency (or robustness) of the recursive algorithms is proven. Two micro-movement models are studied in detail. One is built on geometric Brownian motion (GBM) and the other on GBM plus jumping stochastic volatility. Simulation results show that the Bayes estimates for time-invariant parameters are consistent, the Bayes estimates for stochastic volatility capture the movement of volatility, and the Bayes factor can effectively select the right model. Real-world applications to Microsoft transaction data are also provided.