Monday, June 10, 2002 - 4:30pm - 5:30pm
- Accelerating Moment Release in Modified Stress Release Models of Regional Seismicity
Mark Bebbington (Massey University) and Steven Jaume (College of Charleston)
Joint work with Mark S. Bebbington, IIS&T, Massey University, Private Bag 11222, Palmerston North, New Zealand, m.bebbington@massey.ac.nz
We show how the stress-release process, by making the distribution of assigned magnitudes dependent on the stress, can produce earthquake sequences characterized by accelerating moment release (AMR). The magnitude distribution is a modified Gutenberg-Richter power law, which is equivalent to the square root of the energy released having a tapered Pareto distribution. The mean of this distribution is controlled by the location of the tail-off. In the limit as the tail-off point becomes large, so does the mean magnitude, corresponding to an acceleration to criticality of the system. Synthetic earthquake catalogs were produced by simulating different variants of the system. The factors examined were how the event rate and mean magnitude vary with the level of the process, and whether this underlying variable should itself correspond to strain or seismic moment. Those models where the stress drop due to an earthquake is proportional to seismic moment produce AMR sequences, whereas the models with stress drop proportional to Benioff strain do not. These results suggest the occurrence of AMR is strongly dependent upon how large earthquakes affect the dynamics of the fault system in which they are embedded. We have also demonstrated a means of simulating multiple AMR cycles and sequences, which may assist investigation of parameter estimation and hazard forecasting using AMR models.
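As a sketch of the key sampling step: because the tapered Pareto survival function S(x) = (x/x0)^(-alpha) * exp((x0 - x)/theta) factorizes into a Pareto part and an exponential taper, one draw can be taken as the minimum of one draw from each factor. The parameter names (`x0`, `alpha`, `theta` for the tail-off location) are illustrative, not the paper's notation.

```python
import math
import random

def tapered_pareto(x0, alpha, theta, rng=random):
    """Draw one sample with survival S(x) = (x/x0)**(-alpha) * exp((x0-x)/theta),
    x >= x0. Since S is a product of a Pareto survival and an exponential
    survival, min(Pareto draw, exponentially tapered draw) has exactly this law."""
    u = 1.0 - rng.random()                       # uniform in (0, 1]
    x_pareto = x0 * u ** (-1.0 / alpha)          # pure Pareto tail
    x_taper = x0 + rng.expovariate(1.0 / theta)  # exponential tail-off
    return min(x_pareto, x_taper)
```

Raising `theta` (the tail-off location) raises the mean of the square-root-of-energy samples, which is the mechanism by which a stress-dependent `theta` produces acceleration toward criticality.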
- Renewal Processes for Great Events: Bayesian Nonparametric Interevent Time Density Estimation
Renata Rotondi (Italian Research Council (CNR))
The renewal process is one of the simplest history-dependent point processes after the stationary Poisson process; its conditional intensity depends on the time elapsed since the last occurrence, and this dependence is expressed through the probability distribution of the time T between consecutive events. I think that more meaningful results could be obtained by using more general distributions than the ones proposed in the literature: the Lognormal, Gamma, Weibull and Doubly exponential distributions. The choice of these distributions forces certain assumptions, e.g. concerning unimodality, that can be unjustified by the real data. To avoid this difficulty I have assumed that the distribution to be estimated is itself random, distributed according to a stochastic process called a Polya tree (Lavine, Ann. Stat. (1992)). The inferential procedure followed is fundamentally based on building a binary, recursive partition of the support of the distribution and on updating, through the observations, the prior probabilities that the variable T belongs to each of the subsets of the partition. This method has been applied to the set of strong events which occurred in the seismic zones of Southern Italy; the results obtained have been compared, on the basis of the Bayes factor, with the ones provided by the most popular parametric distributions for T.
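A minimal sketch of the Polya tree construction described above, assuming the common Beta(c·j², c·j²) branch weights at level j (an illustrative default, not necessarily the prior used here): the support (0, t_max] is split by a recursive binary partition, each branch probability is updated by the counts of observations falling in the two halves, and the posterior-mean predictive density at a point t is the product of branch probabilities along the path to t, divided by the leaf width.

```python
def polya_tree_density(data, t_max, levels=4, c=1.0):
    """Posterior-mean predictive density on (0, t_max] under a Polya tree
    prior with Beta(c*j**2, c*j**2) branch weights at level j (illustrative)."""
    def density(t):
        lo, hi = 0.0, t_max
        p = 1.0
        for j in range(1, levels + 1):
            mid = 0.5 * (lo + hi)
            left = sum(1 for x in data if lo < x <= mid)    # counts in each half
            right = sum(1 for x in data if mid < x <= hi)
            a = c * j * j
            if t <= mid:                                    # follow t's branch
                p *= (a + left) / (2 * a + left + right)    # posterior-mean prob
                hi = mid
            else:
                p *= (a + right) / (2 * a + left + right)
                lo = mid
        return p / (hi - lo)                                # density on the leaf
    return density
```

By construction the leaf probabilities sum to one, so the density integrates to one over (0, t_max]; deeper trees allow multimodal shapes that the parametric families above would rule out.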
- Multiple Infrasound Arrays Processing
Sung-Eun Kim (University of Cincinnati)
Joint work with Robert H. Shumway, Dept. of Statistics, University of California, Davis.
Integrating or fusing array data from various sources will be extremely important in making the best use of networks for detecting signals and for estimating their velocities and azimuths. In addition, studying the size and shape of location ellipses that use velocity, azimuth and travel-time information from an integrated collection of small arrays to locate the event will be critical in evaluating our overall capability for monitoring a Comprehensive Test Ban Treaty (CTBT). We have developed a small-array theory that characterizes the uncertainty in estimated velocities and azimuths for different infrasonic array configurations and levels of signal correlation. We have compared the performance of simple beamforming with that of a generalized likelihood beam that is optimal under signal correlation.
We have developed an integrated approach to using wavenumber parameters and their covariance properties from a collection of local arrays for estimating location, along with an uncertainty ellipse. Hypothetical wavenumber estimators and their uncertainties are used as input to a Bayesian nonlinear regression that produces fusion ellipses for event locations using probable configurations of detecting stations in the proposed global infrasound array.
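A toy version of the beamforming step, assuming frequency-domain delay-and-sum over a small planar array (function and variable names are ours, not the authors'): each sensor's spectrum is phase-shifted by the plane-wave delay implied by a trial slowness vector and the shifted spectra are stacked; maximizing the beam power over a slowness grid yields the velocity (1/|s|) and azimuth estimates whose uncertainties feed the fusion ellipses.

```python
import numpy as np

def beam_power(traces, positions, dt, slowness):
    """Delay-and-sum beam power for one trial slowness vector (s/km),
    given sensor positions (km) and traces sampled at interval dt (s).
    Multiplying spectra by exp(+2j*pi*f*(p . s)) aligns a plane wave
    travelling with that slowness before stacking."""
    n_sensors, n_samp = traces.shape
    freqs = np.fft.rfftfreq(n_samp, dt)
    spec = np.fft.rfft(traces, axis=1)
    delays = positions @ slowness                     # arrival delay per sensor
    beam = (spec * np.exp(2j * np.pi * np.outer(delays, freqs))).mean(axis=0)
    return float(np.sum(np.abs(beam) ** 2))
```

In practice `beam_power` would be evaluated on a grid of slowness vectors; the curvature of the power surface near its peak gives the wavenumber covariance used as input to the location regression.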
- Bath's Law and the Gutenberg-Richter Relation
Maura Murru (Istituto Nazionale di Geofisica e Vulcanologia)
We revisit the issue of the so-called Bath's law concerning the difference D1 between the magnitude of the mainshock, M0, and that of the second largest shock, M1, in the same sequence, which various authors in the past have taken to be approximately equal to 1.2. Feller demonstrated in 1966 that the expected value of D1 is about 0.5, given that the difference between the two largest of N exponentially distributed random variables in a sample is itself a random variable with the same distribution. Feller's proof leads to the assumption that the mainshock comes from a sample different from that of its aftershocks.
A mathematical formulation of the problem is developed with the only assumption being that all the events belong to the same self-similar set of earthquakes following the Gutenberg-Richter magnitude distribution. This model shows a substantial dependence of D1 on the magnitude thresholds chosen for the mainshocks and the aftershocks, and in this way partly explains the large D1 values reported in the past. Analysis of the New Zealand and PDE catalogs of shallow earthquakes demonstrates a rough agreement between the average D1 values predicted by the theoretical model and those observed. Limiting our attention to the average D1 values, Bath's law doesn't seem to strongly contradict the Gutenberg-Richter law. Nevertheless, a detailed analysis of the observed D1 distribution shows that the Gutenberg-Richter hypothesis with a constant b-value doesn't fully explain the experimental observations. The theoretical distribution has a larger proportion of low D1 values and a smaller proportion of high D1 values than the experimental observations. Thus Bath's law and the Gutenberg-Richter law cannot be completely reconciled, although based on this analysis the mismatch is not as great as has sometimes been supposed.
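Feller's observation is easy to check numerically: for N i.i.d. Gutenberg-Richter magnitudes, i.e. exponential with rate b·ln 10, the difference between the largest and second largest is again exponential with the same rate, so E[D1] = 1/(b ln 10) ≈ 0.43 for b = 1, consistent with the "about 0.5" quoted above. A short simulation sketch (our notation):

```python
import math
import random

def d1_sample(n, b, rng):
    """Difference between the largest and second-largest of n magnitudes
    drawn from a Gutenberg-Richter (exponential, rate b*ln 10) distribution."""
    beta = b * math.log(10.0)
    mags = sorted(rng.expovariate(beta) for _ in range(n))
    return mags[-1] - mags[-2]
```

Note the sample mean is independent of n, which is Feller's point: under a single self-similar population, D1 stays well below the 1.2 of Bath's law unless mainshocks and aftershocks are selected with different magnitude thresholds.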
- A Stochastic Two-node Stress Transfer Model Reproducing Omori's Law
Mark Bebbington (Massey University)
Joint work with K. Borovkov (Department of Mathematics and Statistics, University of Melbourne, Victoria 3052, Australia)
We present an alternative to the epidemic-type aftershock sequence (ETAS) model of Ogata (1988). One node (denoted A) is loaded by external tectonic forces at a constant rate, with events (mainshocks) occurring randomly according to a hazard which is a function of the stress level at the node. Each event is a random negative jump in the stress level, and transfers a random amount of stress to the second node (B). Node B experiences events (aftershocks) in a similar way, with hazard a function of the stress level at that node only. When that hazard function satisfies certain simple conditions, the frequency of events at node B, in the absence of any new events at node A, follows Omori's law. When node B is allowed tectonic input, which may be negative (i.e., aseismic slip), the frequency of events takes on a decay form that parallels the constitutive law derived by Dieterich (1994), which fits the modified Omori law very well. We illustrate the model by fitting it to aftershock data from the Valparaiso earthquake of March 3, 1985.
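The node-B mechanism can be sketched as follows, under illustrative assumptions (an exponential hazard in the stress level and exponential stress drops; these parameter choices are ours, not the fitted model): with no tectonic input at B, the stress is constant between events, so each interevent time is exponential at the current hazard, and the event rate decays in an Omori-like fashion as stress is worked off.

```python
import math
import random

def simulate_aftershocks(x0, nu, rho, mean_drop, t_max, rng):
    """Event times at node B after a mainshock leaves it with stress x0.
    Hazard exp(nu + rho*x) is constant between events (no tectonic input
    at B in this sketch), and each event removes an Exp(mean_drop) amount
    of stress. All parameter values are illustrative placeholders."""
    t, x, times = 0.0, x0, []
    while True:
        rate = math.exp(nu + rho * x)        # current hazard
        t += rng.expovariate(rate)           # exponential wait at that hazard
        if t > t_max:
            return times
        times.append(t)
        x -= rng.expovariate(1.0 / mean_drop)  # random stress drop
```

The declining stress makes successive waits stochastically longer, reproducing the aftershock-rate decay described above; adding a (possibly negative) drift term to `x` between events would give the Dieterich-like variant.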
- Multidimensional Wavelet Analysis of Point Processes
Alexey Lyubushin (Russian Academy of Sciences)
Methodologically, analysis of seismic catalogs is more difficult than processing of time series. This is because the analysis of point processes, including earthquake sequences, does not allow the direct application of the vast variety of methods, parametric models, and fast algorithms developed in signal theory. Application of these methods requires a preliminary conversion of seismic catalogs to time series, i.e., sequences of values with a given constant time step. Formally, this conversion is not difficult and can be realized by calculating either average values of a certain catalog parameter (for example, the energy released during an earthquake) in successive non-overlapping time windows of constant width, or cumulative values of these characteristics with a constant time step (cumulative curves). However, the resulting time series are essentially non-Gaussian and include either outliers or step-like features (in cumulative curves), due to the temporal non-uniformity of seismic catalogs (gaps and groups of events such as swarms and aftershocks) and the concentration of most of the seismic energy in rare but strong events. Although classical methods of signal analysis, based on the Fourier transform and the calculation of covariances, are formally applicable to the processing of these time series, they are ineffective because of the large biases in estimates caused by outliers (or steps).
In the report, to avoid this limitation, the signal is expanded in orthogonal, compactly supported functions: Haar wavelets. The compactness of the basis functions involved in the signal expansion makes it possible to analyze not only Gaussian but also essentially non-stationary time series, which allows the application of non-parametric methods for the analysis of multidimensional time series to non-Gaussian signals, including series obtained from seismic catalogs. A method of joint analysis of seismic regimes is proposed for recognizing collective-behavior phenomena of seismicity in a group of areas that form a large seismically active region. The method is based on robust multidimensional wavelet analysis of the square roots of the earthquake energies released in each of the areas (the so-called cumulative Benioff curves, proportional to the elastic stresses accumulated and released in an earthquake source). This method is a further development of the method of wavelet-aggregated signals previously proposed by the author to analyze multidimensional time series of geophysical monitoring. It is based on robust multidimensional analysis of canonical and principal components of wavelet coefficients. The method is exemplified by applying it to a number of seismically active regions.
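A minimal illustration of why Haar wavelets suit such step-like cumulative curves: one level of the orthonormal Haar transform localizes a step (a strong event in a cumulative Benioff curve) in a single large detail coefficient, while the smooth background accumulation contributes only small, uniform details. This is a generic sketch, not the author's multidimensional procedure.

```python
import numpy as np

def haar_details(x):
    """One level of the orthonormal Haar wavelet transform:
    returns (approximation, detail) coefficient arrays."""
    x = np.asarray(x, float)
    if len(x) % 2:            # drop a trailing sample if length is odd
        x = x[:-1]
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # local averages
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # local differences
    return a, d
```

Because the transform is orthonormal it preserves energy, and the position of the dominant detail coefficient pinpoints the step in time, which is exactly the information a Fourier expansion smears across all frequencies.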
Key words: time series, seismic process, earthquake prediction, collective behavior, wavelet analysis, Benioff curves.
Lyubushin A.A. (2000). Wavelet-Aggregated Signal and Synchronous Peaked Fluctuations in Problems of Geophysical Monitoring and Earthquake Prediction. Izvestiya, Physics of the Solid Earth, vol. 36, pp. 204-213.
- Do You Live in a Bad Neighborhood?: Maybe Site-Specific PSHA is an Oxymoron
Dan O'Connell (U.S. Bureau of Reclamation)
For low annual exceedance probabilities, PSHA results are dominated by the extreme tail behavior of empirical peak horizontal acceleration (PHA) distributions. Three-dimensional elastic finite-difference calculations were used to assess the influence of shallow 3D velocity variations, comparing lower-velocity sites with higher-velocity sites (> 1.1 * mean shear-wave velocity). Median PHAs were 1.7 times larger for the lower-velocity sites relative to the higher-velocity sites. Thus, a significant fraction of observed PHA dispersion may be related to shallow 3D velocity variations. 3D site responses may resolve the PSHA versus precariously-balanced-rock enigma: the ergodic hypothesis is probably statistically correct over a large area, but makes little sense for site-specific estimation of peak ground motion scaling, particularly at rock sites. Rock sites tend to have the highest velocities, and the lowest peak amplitudes and peak amplitude dispersions, in their neighborhoods. Diminished directivity > 10 km from strike-slip faults, and directivity's limited extent as a function of area for strike-slip earthquakes, are also significant factors. Site-specific PSHA requires 3D site-response investigations because local 3D velocity structure produces biases in both PHA scaling and PHA dispersion.
- Bayesian Analysis of a Marked Point Process: Application in Seismic Hazard Assessment
Renata Rotondi (Italian Research Council (CNR))
Joint work with E. Varini (Università L. Bocconi, Milano, Italy).
Point processes are the stochastic models most suitable for describing physical phenomena that occur at irregularly spaced times, such as earthquakes. These processes are uniquely characterized by their conditional intensity, that is, the probability that at least one event occurs in the infinitesimal interval (t, t + dt) given the history of the process up to t. The seismic phenomenon shows different behaviours at different time and size scales; in particular, the occurrence of destructive shocks over some centuries in a seismogenic region may be explained by the elastic rebound theory. This theory has inspired the so-called stress release models; their conditional intensity translates the idea that an earthquake produces a sudden decrease of the amount of strain accumulated gradually over time along a fault, and that the subsequent event occurs when the stress exceeds the strength of the medium. This work has a double objective: the formulation of these models in the Bayesian framework, and the addition of a mark to each event, namely its magnitude, modelled through a distribution that, at time t, depends on the stress level accumulated up to that instant. The parameter space then turns out to be constrained and dependent on the data, which makes Bayesian computation and analysis complicated. We have resorted to Monte Carlo methods to solve these problems.
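A sketch of the conditional intensity of a simple stress release model, with X(t) the stress level (tectonic loading at rate rho minus the accumulated stress drops of past events) and an exponential hazard; the parameter values and names are illustrative placeholders, not the fitted Bayesian model:

```python
import math

def conditional_intensity(t, history, rho=1.0, mu=-3.0, nu=1.0):
    """Stress release conditional intensity exp(mu + nu*X(t)), where
    X(t) = rho*t - sum of stress drops of events before t.
    history is a list of (event_time, stress_drop) pairs."""
    released = sum(s for (ti, s) in history if ti < t)
    x = rho * t - released
    return math.exp(mu + nu * x)
```

The intensity rises steadily between events and drops abruptly at each one, which is the elastic-rebound behaviour described above; a stress-dependent magnitude mark would draw each event's size from a distribution parameterized by the same X(t).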
- Demonstrations of Space-time Seismicity Analysis (Poster)
Yosihiko Ogata (The Institute of Statistical Mathematics)
The hierarchical space-time ETAS (HIST-ETAS) model is proposed to estimate regional characteristics of seismic activity through an objective Bayesian method. I would like to show several outcomes analyzed by applying the HIST-ETAS model to Japanese datasets to discuss their geophysical implications.