Luca Benzoni (Finance Department, University of Minnesota) email@example.com http://legacy.csom.umn.edu/WWWPages/FACULTY/lbenzoni/
Stochastic Volatility, Mean Drift, and Jumps in the Short-Term Interest Rate (poster session)
Joint work with Torben G. Andersen (Northwestern University) and Jesper Lund (Nykredit Bank).
We find that an intuitively appealing and fairly manageable continuous-time model provides an excellent characterization of the U.S. short-term interest rate over the post-World War II period. Our three-factor jump-diffusion model consists of elements embodied in existing specifications, but our approach appears to be the first to successfully accommodate all such features jointly. Moreover, we conduct simultaneous and efficient inference regarding all model components, which include a shock to the interest rate process itself, a time-varying mean-reversion factor, a stochastic volatility factor, and a jump process. Most intriguingly, we find that the restrictions implied by an affine representation of the jump-diffusion system are not rejected by the U.S. short-rate data. This provides a tractable setting for associated asset-pricing applications.
Testing Factor-Model Explanations of Market Anomalies
A number of recent papers have attempted to explain the size and book-to-market anomalies with either (1) factor models based on economically motivated factors or (2) conditional CAPM or CCAPM models with economically motivated conditioning variables. These papers use similar methodologies and similar test assets, and generally fail to reject the proposed models. We argue that these tests may fail to reject because of the low statistical power of the tests against reasonable alternative hypotheses, rather than because the models are consistent with the data. We propose an alternative test methodology with higher power against the proposed alternatives, and show that the new methodology results in the rejection of several of the proposed factor models at high levels of significance.
Michael A.H. Dempster (Centre for Financial Research, Judge Institute of Management, University of Cambridge & Cambridge Systems Associates Limited) firstname.lastname@example.org
This talk reports on work undertaken with the support of HSBC to understand the $1.4 B per day global currency market. After a general introduction, the detailed structure of the global FX market will be described, with a focus on the roles of the major market makers and the EBS and Reuters 3000 electronic interdealer markets. Next, the modelling of individual agents, traders, and market makers with computational learning techniques, based on extensive quote, trade, agent order flow, and order book data seen by a market maker, will be reported. Finally, work in progress to construct realistic agent simulation models of the essence of the global market will be discussed; these models attempt to capture the current mechanisms of price discovery, at least over intervals shorter than those at which macroeconomic fundamentals are thought to dominate market movements.
I present a framework for modeling part of the dynamics of the term structure. The framework can be used to link the term structure to observed variables such as inflation and output. Its partial nature allows us to dispense with yield-based factors (e.g., latent factors) while retaining restrictions associated with no-arbitrage. I apply the model to the joint dynamics of inflation and the term structure. As other research has noted, both short-term and long-term bond yields adjust gradually to a change in inflation. I find that the dynamics of the price of interest rate risk needed to fit this pattern from 1983 through 2003 are implausible. An alternative interpretation is that investors were systematically surprised by the slow adjustment of short-term yields to inflation.
Philip H. Dybvig (Olin School of Business, Washington University in Saint Louis) email@example.com
of Interest Data
Absent unreasonably strong assumptions, financial theory places almost no restriction on interest rates and bond prices. If the short rate process exists (not even an implication of most preferences we study), then bond and interest derivative prices are given by expected discounted values, using the rolled-over spot rate for discounting and risk-neutral ("martingale") probabilities for computing expectations. Absent theoretical guidance, the choice of interest rate process should ideally be dictated by the data. This presentation explores the interest-rate process starting with the sample version of the quadratic variation of the three-year Treasury Bill discount process, using about 50 years' worth of daily data from the Fed's H15 tape. This analysis updates an analysis done in 1990, with an eye toward the impact of what seems to be a unique regulatory and economic environment today, but the major conclusions are unchanged. A final comment relates the analysis to a result on parameter uncertainty from a FAJ paper with Bill Marshall.
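The sample quadratic variation mentioned above is simply the running sum of squared increments of the observed series. A minimal sketch (the function name and the synthetic random-walk input are illustrative, not the H15 data):

```python
import numpy as np

def sample_quadratic_variation(prices):
    """Cumulative sample quadratic variation of a daily series:
    the running sum of squared day-to-day increments."""
    increments = np.diff(np.asarray(prices, dtype=float))
    return np.cumsum(increments ** 2)

# Illustrative synthetic series: a Gaussian random walk of 250 "days".
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(0.0, 0.01, size=250))
qv = sample_quadratic_variation(walk)
```

Plotting `qv` against time shows whether the process accumulates variance at a roughly constant rate, as a homoskedastic diffusion would, or in bursts.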
I will present a variety of empirical results based on a study of the London Stock Exchange. The data set contains about 350M events, including every action by every trader on every stock, making it possible to reconstruct the limit order book at any instant in time. This study has generated a variety of new empirical results, including a characterization of the approximate power-law behavior and long-memory effects associated with price returns, order placement, and the spread. My collaborators and I have shown that price changes are largely driven by fluctuations in liquidity. A model for order flow is developed that, when simulated along with its impact on prices, explains many of the statistical properties of the data very well. Finally, time permitting, I will present some preliminary results on an agent ecology of arbitrageurs who exploit liquidity demanders, and discuss their effect on prices. These results illustrate, first, that there are many strong regularities in market behavior at the microstructure level and, second, that many aspects of these regularities can be understood based on what might be characterized as low-intelligence models of agent behavior.
Joint work with Robert A. Stine.
Almost everyone you talk to claims to have a scheme that "beats the market." How should we test such claims? We created a test (based on Bennett's inequality) that only assumes that CAPM excess returns should be a martingale. But the claimants scoff at our test and say that it doesn't have sufficient power to show the beauty of their scheme.
With tongue firmly in cheek, we will provide a few schemes that will pass any weakening of our test. This has been a wonderful teaching aid, since the schemes are understandable to MBAs. Finally, we will revisit Fama and French's book-to-market ratio as a way of generating excess returns and ask whether it has enough jump to pass our statistical test.
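Bennett's inequality bounds the probability that a sum of bounded, zero-mean, independent variables (here, cumulative excess returns under the martingale null) exceeds a threshold. A minimal sketch of the bound itself (this is the textbook inequality, not the authors' full test procedure):

```python
import math

def bennett_bound(t, total_var, bound):
    """Bennett's inequality: upper bound on P(sum X_i >= t) for independent,
    zero-mean X_i with |X_i| <= bound and total variance total_var.
    Uses h(u) = (1+u)log(1+u) - u."""
    if t <= 0:
        return 1.0
    u = bound * t / total_var
    h = (1.0 + u) * math.log(1.0 + u) - u
    return math.exp(-total_var * h / (bound ** 2))
```

A claimed scheme's realized cumulative excess return can be plugged in as `t`; a tiny bound means the martingale null is hard to reconcile with the track record.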
Xavier Gabaix (Department of Economics, Massachusetts Institute of Technology) firstname.lastname@example.org http://econ-www.mit.edu/faculty/xgabaix/papers.htm
Joint work with Parameswaran Gopikrishnan, Vasiliki Plerou, and H. Eugene Stanley (Center for Polymer Studies and Department of Physics, Boston University).
Insights into the dynamics of a complex system are often gained by focusing on large fluctuations. For the financial system, huge databases now exist which facilitate the analysis of large fluctuations and the characterization of their statistical behavior [1,2]. Power laws appear to describe histograms of relevant financial fluctuations, such as fluctuations in stock price, trading volume, and the number of trades [3-10]. Remarkably, the exponents that characterize these power laws are similar for different types and sizes of markets, for different market trends, and even for different countries, suggesting that a generic theoretical basis may underlie these phenomena. Based on a plausible set of assumptions, we propose a model that provides an explanation for these empirical power laws. In addition, our model explains certain striking empirical regularities that describe the relationship between large fluctuations in prices, trading volume, and the number of trades. In our model, large movements in stock market activity arise from the trades of the large participants. Starting from an empirical characterization of the size distribution of large market participants (mutual funds), we show that their trading behavior, when performed in an optimal way, generates the power laws observed in financial data.
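A standard way to estimate the tail exponent of such power laws from data is the Hill estimator, computed from the largest order statistics. A minimal sketch on synthetic Pareto data (the function and sample are illustrative, not the authors' estimation procedure):

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the tail exponent alpha, based on the k largest
    observations: alpha_hat = k / sum(log(x_(i) / x_(k+1)))."""
    x = np.sort(np.asarray(data, dtype=float))[::-1]  # descending order
    logs = np.log(x[:k]) - np.log(x[k])
    return k / logs.sum()

# Synthetic Pareto sample with tail exponent 3 (the "cubic law" range).
rng = np.random.default_rng(1)
sample = rng.pareto(3.0, size=100_000) + 1.0
alpha_hat = hill_estimator(sample, k=1000)
```

In practice the estimate is examined over a range of `k` to check stability before a tail exponent is reported.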
Rohitha Goonatilake (Department of Mathematical and Physical Sciences, Texas A&M International University) email@example.com
Evaluation and Analysis of a 20-Year Deferred Annuity Product
This project analyzes an annuity product suited to the needs of today's American family under moderate assumptions. It supports the study of pricing accuracy in a mutual life insurance company and a better understanding of the analysis and computations involved in developing a 20-year deferred annuity product designed for a group of 1,000 people aged 30-40 years, each having a 5-year-old child.
Robust Control and Prediction
When confronting a stochastic environment, a decision-maker may not have full confidence in his or her probabilistic assignments and may not observe the full array of state variables that characterize the probabilistic model. Instead, he or she may wish to explore how decision rules perform when the stochastic specification is altered or perturbed. In this paper we consider decision problems in which a class of such perturbations is permitted. By introducing these perturbations, decision rules for prediction and control are made more robust. We develop and explore recursive formulations of the robust control/prediction problem and deduce corresponding risk-sensitive recursions that feature a distinct risk adjustment for predicting the hidden Markov states.
Joint with Marco Cagetti, Thomas J. Sargent and Noah Williams.
Narasimhan Jegadeesh (Emory University) Narasimhan_Jegadeesh@bus.emory.edu
Joint work with Woojin Kim.
This paper examines analyst recommendations in the G7 countries and evaluates the value of these recommendations over the 1993 to 2002 period. We find that the frequencies of sell and strong sell recommendations in all countries are far lower than those of buy and strong buy recommendations. The frequency of sell recommendations is the lowest in the U.S. We also find that stock prices react significantly to recommendation revisions on the revision day and on the following day in all of these countries except Italy. We find the largest price reactions in the U.S., followed by Japan. We also evaluate trading strategies that buy upgraded stocks and sell downgraded stocks. Here again, we find the highest profits in the U.S., followed by Japan.
Narasimhan Jegadeesh is the Dean's Distinguished Professor at the Goizueta Business School, Emory University, and Woojin Kim is a doctoral student at the University of Illinois at Urbana-Champaign. We would like to thank Cliff Green and Michael Weisbach, and the seminar participants at Duke University, the University of Alabama at Tuscaloosa, the University of Illinois at Urbana-Champaign, and Vanderbilt University for helpful comments. We are responsible for any errors.
Contact information: Narasimhan Jegadeesh, Goizueta Business School, 1300 Clifton Road, Atlanta, GA 30322, email: Narasimhan.Jegadeesh@bus.emory.edu; Woojin Kim, 340, Wohlers Hall, University of Illinois at Urbana-Champaign, Champaign, IL 61820, email: firstname.lastname@example.org.
A Tale of Two Growths: Modeling Stochastic Endogenous Growth and Growth Stocks
This paper extends the deterministic endogenous R&D growth model to a stochastic endogenous growth model, which is used to study growth stocks. The model provides an understanding of the links between economic growth, monopolistic competition in R&D, and the valuation of growth stocks. In the presence of stochastic shocks, the model leads to a decomposition of the value of growth stocks. The decomposition implies that the value of growth stocks should be very volatile, while the long-run average return is roughly equal to the growth rate of R&D labor. The model also explains an empirical size-distribution puzzle observed in cross-sectional studies of growth stocks.
Nick Laskin (IsoTrace Lab, Department of Physics, University of Toronto) email@example.com
Jump Dynamics and Stochastic Volatility for Stock Returns (poster session)
We develop an approach to modeling components of the return distribution, which are assumed to be driven by a random news-arrival process. Information arrivals are assumed to be governed by a compound generalized Poisson process, which captures a long-memory effect resulting in a non-exponential distribution of interarrival times. The conditional variance of returns is decomposed into two components: a smoothly evolving component for the standard diffusion of past news impacts, and a component related to the information-arrival process that generates a jump stream with fractional statistics. The model predicts the impact of large changes in stock returns on volatility. Empirical evidence on the impact of jump versus normal return innovations and on the time-series clustering of jumps is presented.
Kiseop Lee (Department of Mathematics, University of Louisville) firstname.lastname@example.org
Estimation of Liquidity Risk by Multiple Change-Point Models (poster session)
Liquidity risk is often defined as the additional risk in the market due to the timing and size of a trade. Based on the pioneering work of Cetin et al., we develop an estimation method that is of practical use. Our method estimates liquidity cost by applying a sequential multiple change-point detection algorithm to a broken-line regression model.
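The single-change-point building block of a broken-line regression can be fit by least squares with a grid search over candidate breakpoints. A minimal sketch (the function and synthetic data are illustrative, not the paper's sequential multiple-change-point algorithm):

```python
import numpy as np

def fit_broken_line(x, y):
    """Least-squares fit of a continuous broken-line model with one change
    point: y = a + b*x + c*max(x - tau, 0), with tau found by grid search
    over interior x values. Returns (tau, (a, b, c))."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = None
    for tau in np.unique(x)[1:-1]:
        X = np.column_stack([np.ones_like(x), x, np.maximum(x - tau, 0.0)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ coef) ** 2)
        if best is None or sse < best[0]:
            best = (sse, tau, coef)
    return best[1], best[2]

# Synthetic data: slope 1 below the kink at 5, slope 3 above it.
x = np.linspace(0.0, 10.0, 101)
y = np.where(x < 5.0, x, 5.0 + 3.0 * (x - 5.0))
tau, coef = fit_broken_line(x, y)
```

Sequential detection of multiple change points then applies such a fit repeatedly over a moving window, flagging a new regime when the fitted kink becomes significant.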
Ding Li (Department of Economics and Finance, Northern State University) Ding.Li@northern.edu
Empirical Study of Investment Behavior in Equity Markets Using Wavelet Methods (poster session)
This empirical study addresses stock-return behavior using wavelet methods in the time-scale domain. Financial market data reveal more complex dynamic patterns than a random walk; the objective of this study is to apply scale analysis to explore the scale-dependent properties of stock-return behavior in support of the reference-dependence theory in behavioral finance. We study eleven years of daily returns for three hundred stocks sampled from the S&P 1500 index. The sample is further categorized into groups according to market capitalization, divided into three time periods, and wavelet-decomposed at level six. Our findings support the reference-dependence argument: the statistical properties of stock returns are scale-dependent. Our results show that stock returns are non-normally distributed and nonstationary at small scales but normal and stationary at relatively larger scales. We find significant market effects on individual assets and mixed results across capitalization groups, and stock returns cannot always be modeled as long-memory processes. Our results support the view that people associate different investment horizons with different mental accounts.
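A level-six wavelet decomposition splits a return series into six detail series (finest to coarsest scale) plus a smooth approximation. A minimal sketch using the Haar wavelet implemented directly in NumPy (the study's actual wavelet family is not specified here; this is for illustration only):

```python
import numpy as np

def haar_dwt(signal, level):
    """Multilevel Haar wavelet decomposition. Returns the final approximation
    and the list of detail coefficient arrays, coarsest scale first."""
    a = np.asarray(signal, dtype=float)
    details = []
    for _ in range(level):
        a = a[: len(a) - len(a) % 2]               # drop an odd trailing sample
        approx = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        detail = (a[0::2] - a[1::2]) / np.sqrt(2.0)
        details.append(detail)
        a = approx
    return a, details[::-1]

# Level-6 decomposition of a synthetic daily-return series of 2048 days.
rng = np.random.default_rng(2)
returns = rng.standard_normal(2 ** 11)
approx, details = haar_dwt(returns, level=6)
```

Scale-dependent statistics (normality tests, variance, memory) can then be computed on each detail series separately, which is the kind of analysis the abstract describes.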
Juyoung Lim (Department of Mathematics, The University of Texas at Austin) email@example.com
Joint work with M. Avellaneda.
We present a statistical method for estimating conditional expectations of a multivariate diffusion process over a short time horizon. The result includes an asymptotic convergence theorem for the estimator and its standard error, based on the Large Deviation Principle. Quantities derived from multivariate diffusion processes are often analytically intractable; this method estimates them effectively without simulation and offers a way to understand their risk profile intuitively.
An application is demonstrated with the relative-value pricing of multi-asset derivatives such as index options and swaptions.
Information, Diversification, and Cost of Capital
We study the pricing implications of information in a noisy rational expectations model with a factor structure for multi-asset payoffs. There are two classes of price-taking investors in our model: informed investors, who receive private signals on systematic and idiosyncratic components of asset payoffs, and uninformed investors, who draw imperfect inferences about those signals from prices. We solve the equilibrium explicitly. We show that only information about systematic factors matters in determining asset risk premiums when the number of risky assets is large. Idiosyncratic risk, as well as the information associated with it, is fully diversifiable.
Return and Dividends (poster session)
Joint work with Andrew Ang (Columbia University and NBER).
We characterize the joint dynamics of expected returns, stochastic volatility, and prices. In particular, given the dividend process, any one of the expected return, the stock volatility, or the price-dividend ratio fully determines the other two. For example, the stock volatility determines the expected return and the price-dividend ratio. By parameterizing one or more of expected returns, volatility, or prices, common empirical specifications place strong implicit, and sometimes inconsistent, restrictions on the dynamics of the other variables. Our results are useful for understanding the risk-return trade-off, as well as the predictability of stock returns.
Information in Option Volume for Future Stock Prices
Joint work with Allen M. Poteshman (University of Illinois at Urbana-Champaign).
We find strong evidence that option trading volume contains information about future stock price movements. Taking advantage of a unique dataset from the Chicago Board Options Exchange, we construct put-to-call ratios for underlying stocks, using volume initiated by buyers to open new option positions. Performing daily cross-sectional analyses from 1990 to 2001, we find that buying stocks with low put/call ratios and selling stocks with high put/call ratios generates an expected return of 40 basis points per day and 1 percent per week. This result is present during each year of our sample period, and is not affected by the exclusion of earnings announcement windows. Moreover, the result is stronger for smaller stocks, indicating more informed trading in options on stocks with less efficient information flow. Our analysis also sheds light on the type of investors behind the informed option trading. Specifically, we find that option trading from customers of full-service brokers provides the strongest predictability, while that from firm proprietary traders is not informative. Finally, in contrast to the equity option market, we do not find any evidence of informed trading in the index option market.
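The long-short sorting step described above can be sketched in a few lines. The ratio definition P/(P+C) and the quintile cutoffs below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def put_call_signal(put_open_buy, call_open_buy, low_q=0.2, high_q=0.8):
    """Cross-sectional put/call signal: +1 (buy) for stocks in the lowest
    quintile of the buyer-initiated open-position put/call ratio,
    -1 (sell) for the highest quintile, 0 otherwise."""
    ratio = put_open_buy / (put_open_buy + call_open_buy)
    lo, hi = np.quantile(ratio, [low_q, high_q])
    return np.where(ratio <= lo, 1, np.where(ratio >= hi, -1, 0))

# Hypothetical one-day cross-section of five stocks.
puts = np.array([10, 80, 30, 5, 60])
calls = np.array([90, 20, 70, 95, 40])
signal = put_call_signal(puts, calls)
```

Repeating this sort each day and holding the resulting long-short portfolio is the daily cross-sectional strategy whose returns the abstract reports.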
Monika Piazzesi (Graduate School of Business, University of Chicago) firstname.lastname@example.org http://gsbwww.uchicago.edu/fac/monika.piazzesi/research/
Prices as Risk-Adjusted Forecasts of Monetary Policy
Many researchers have used federal funds futures rates as measures of financial markets' expectations of future monetary policy. However, to the extent that federal funds futures reflect risk premia, these measures require some adjustment. In this paper, we document that excess returns on federal funds futures have been positive on average. We also document that expected excess returns are strongly countercyclical. In particular, excess returns are surprisingly predictable by employment growth and other business-cycle indicators such as Treasury yields and corporate bond spreads. Excess returns on eurodollar futures display similar patterns. We document that simply ignoring these risk premia has important consequences for measures of the expected future path of monetary policy. We also investigate whether risk premia matter for conventional measures of monetary policy surprises.
Michael Tehranchi (Department of Mathematics, University of Texas at Austin) email@example.com
Optimal Portfolio Choice in Bond Markets (poster session)
We consider the Merton problem of optimal portfolio choice when the traded instruments are the set of zero-coupon bonds. Working within an infinite-factor Markovian Heath-Jarrow-Morton model of the interest rate term structure, we find conditions for the existence and uniqueness of optimal trading strategies. When there is uniqueness, we provide a characterization of the optimal portfolio.
Ruey S. Tsay (Graduate School of Business, University of Chicago) firstname.lastname@example.org
This talk is concerned with estimating stochastic diffusion models with leverage effects, with or without jumps. Several methods have been proposed in the literature to estimate such models, including the efficient method of moments (EMM) and Markov chain Monte Carlo (MCMC). Most existing MCMC methods either cannot deal with leverage effects or require intensive computation. We discuss the difficulties of the estimation problem and propose a modified method that estimates the model efficiently. Simulation and real examples are used to compare the estimation results of various methods.
Diane Louise Wilcox (Department of Mathematics and Applied Mathematics, University of Cape Town) email@example.com
Periodicity and Scaling of Eigenmodes in an Emerging Market (poster session)
Joint work with Tim Gebbie.
We investigate periodic, aperiodic and scaling behaviour of eigenmodes, i.e. daily price fluctuation time-series derived from eigenvectors, of correlation matrices of shares listed on the Johannesburg Stock Exchange (JSE) from January 1993 to December 2002. Periodic, or calendar, components are investigated by spectral analysis. We demonstrate that calendar effects are limited to eigenmodes which correspond to eigenvalues outside the Wishart range. Aperiodic and scaling behaviour of the eigenmodes is investigated using rescaled-range methods and detrended fluctuation analysis (DFA). We find that the eigenmodes which correspond to eigenvalues within the Wishart range are dominated by noise effects. In particular, we find that interpolating missing data or illiquid trading days with a zero-order hold introduces high-frequency noise and leads to the overestimation of Hurst exponents uncorrected for serial correlation. DFA exponents of the eigenmodes suggest an absence of long-term memory.
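The Wishart range referred to above is given by the Marchenko-Pastur law: for a correlation matrix of N pure-noise series over T observations, eigenvalues concentrate in [(1-sqrt(N/T))^2, (1+sqrt(N/T))^2]. A minimal sketch of computing eigenmodes and these bounds (function names and the synthetic data are illustrative):

```python
import numpy as np

def wishart_bounds(n_assets, n_obs):
    """Marchenko-Pastur eigenvalue bounds for a pure-noise correlation
    matrix, with q = n_assets / n_obs."""
    q = n_assets / n_obs
    return (1.0 - np.sqrt(q)) ** 2, (1.0 + np.sqrt(q)) ** 2

def eigenmodes(returns):
    """Eigenvalues (descending) of the correlation matrix of a T x N
    return panel, and the eigenmode time series (returns projected
    onto the eigenvectors)."""
    corr = np.corrcoef(returns, rowvar=False)
    vals, vecs = np.linalg.eigh(corr)
    order = np.argsort(vals)[::-1]
    return vals[order], returns @ vecs[:, order]

# Pure-noise panel: T = 1000 days, N = 50 shares.
rng = np.random.default_rng(3)
noise = rng.standard_normal((1000, 50))
vals, modes = eigenmodes(noise)
lo, hi = wishart_bounds(50, 1000)
```

Eigenvalues escaping [lo, hi] indicate genuine correlation structure (e.g. a market mode); the corresponding eigenmode series are the ones the abstract subjects to spectral and DFA analysis.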
Shu Wu (Department of Economics, The University of Kansas) firstname.lastname@example.org
Interest Rate Risk and the Forward Premium Anomaly in Foreign Exchange Markets (poster session)
premiums implied by the yield curves across countries, uncovered interest rate parity (UIP) is still strongly rejected by the data. Moreover, factors that predict excess bond returns are found to have no significant power in predicting foreign exchange returns. These results reject the joint restrictions on the exchange rate and interest rates imposed by dynamic term structure models, suggesting that foreign exchange markets and bond markets may not be fully integrated and that we must look beyond interest rate risk in order to understand the exchange rate anomaly.
Yong Zeng (Department of Mathematics and Statistics, University of Missouri at Kansas City) email@example.com
Filtering with a Marked Point Process Observation: Applications to the Econometrics of Ultra-High-Frequency Data (poster session)
Ultra-high-frequency (UHF) data are naturally modeled as a marked point process (MPP), because of the random arrival times as well as the associated marks such as price, volume, and bid and ask quotes at each arrival time. Even though econometricians model UHF data as an MPP, they view the data as an irregularly spaced time series. Here, we take the probabilist's angle and view UHF data as an observed sample path of an MPP. We then propose a general filtering model for UHF data in which the signals are latent processes with time-varying parameters and the observations live in a generic mark space with other observable factors. The latent process and parameters are jointly modeled by a martingale problem, and the observable factors are allowed in the stochastic intensity kernel of the MPP. In this way, we obtain a unified framework for many existing models of UHF data.
The powerful tools of stochastic filtering are introduced to develop the statistical foundations of the proposed model. The likelihoods, posterior, likelihood ratios, and Bayes factors are studied; all are continuous-time, infinite-dimensional objects characterized by stochastic differential equations such as the filtering equations. To calculate, for example, the likelihood or posterior of a proposed model, consistent algorithms are required, and mathematical foundations for consistent, efficient algorithms are established. There are two general approaches to constructing recursive algorithms: Kushner's Markov chain approximation method, and the sequential Monte Carlo (particle filtering) method. The latter is more attractive in that it can mitigate, and even avoid, the "curse of dimensionality" in complex models. In particular, Bayesian inference (estimation and model selection) via filtering is developed for the proposed model.
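The particle-filtering approach mentioned above can be sketched in a toy setting: a latent AR(1) log-intensity observed through Poisson event counts per interval. This is a generic bootstrap (SIR) filter under assumed dynamics, not the paper's continuous-time MPP filter:

```python
import numpy as np

def particle_filter(counts, n_particles=2000, phi=0.95, sigma=0.2, rng=None):
    """Bootstrap particle filter for a latent log-intensity x_t following
    x_t = phi * x_{t-1} + sigma * eps_t, observed through counts
    y_t ~ Poisson(exp(x_t)). Returns the filtered posterior mean of x_t."""
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, 1.0, n_particles)      # initial particle cloud
    means = []
    for y in counts:
        x = phi * x + sigma * rng.standard_normal(n_particles)  # propagate
        logw = y * x - np.exp(x)               # Poisson log-likelihood (up to a constant)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))     # weighted posterior mean
        idx = rng.choice(n_particles, size=n_particles, p=w)    # resample
        x = x[idx]
    return np.array(means)

# Hypothetical event counts per interval (e.g. trades per minute).
counts = np.array([1, 0, 2, 5, 8, 6, 3, 1])
post_mean = particle_filter(counts, rng=0)
```

The filtered mean tracks the latent intensity: it rises when counts cluster and falls in quiet intervals, which is the qualitative behavior one wants when filtering trading-activity signals from UHF arrival data.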