Results 1–10 of 28
Seismic gaps and earthquakes, 2003, 108(B10)
Abstract

Cited by 16 (4 self)
[1] McCann et al. [1979] published a widely cited "seismic gap" model ascribing earthquake potential categories to 125 zones surrounding the Pacific Rim. Nishenko [1991] published an updated and revised version including probability estimates of characteristic earthquakes with specified magnitudes within each zone. These forecasts are now more than 20 and 10 years old, respectively, and sufficient data now exist to test them rather conclusively. For the McCann et al. forecast, we count the number of qualifying earthquakes in the several categories of zones. We assume a hypothetical probability consistent with the gap model (e.g., red zones have twice the probability of green zones) and test against the null hypothesis that all zones have equal probability. The gap hypothesis can be rejected at a high confidence level. Contrary to the forecast of McCann et al., the data suggest that the real seismic potential is lower in the gaps than in other segments, and plate boundary zones are not made safer by recent earthquakes. For the 1991 Nishenko hypothesis, we test the number of filled zones, the likelihood scores of the observed and simulated catalogs, and the likelihood ratio of the gap hypothesis to a Poissonian null hypothesis. For earthquakes equal to or larger than the characteristic magnitude, the new seismic gap hypothesis failed at the 95% confidence level in both the number and ratio tests. If we lower the magnitude threshold by 0.5 for qualifying earthquakes, the new gap hypothesis passes the number test but fails in both the likelihood and likelihood ratio tests at the 95% confidence level.
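The zone-counting test described in this abstract can be illustrated with a hedged sketch. This is not the authors' actual procedure, and the counts below are made-up placeholders: the idea is simply to simulate qualifying earthquakes under the equal-probability null and ask how often the flagged gap zones receive at least the observed number of events.

```python
import random

def gap_number_test(n_zones, gap_zones, observed_in_gaps, n_events,
                    n_sim=20000, seed=0):
    """One-sided Monte Carlo p-value for the claim that gap zones receive
    MORE qualifying earthquakes than an equal-probability null predicts.

    n_zones          -- total number of forecast zones (125 in McCann et al.)
    gap_zones        -- how many zones were flagged as high-potential 'gaps'
    observed_in_gaps -- qualifying earthquakes that actually hit gap zones
    n_events         -- total qualifying earthquakes in the test period
    """
    rng = random.Random(seed)
    p_gap = gap_zones / n_zones  # null hypothesis: every zone equally likely
    exceed = 0
    for _ in range(n_sim):
        hits = sum(1 for _ in range(n_events) if rng.random() < p_gap)
        if hits >= observed_in_gaps:
            exceed += 1
    return exceed / n_sim

# Hypothetical illustration: 125 zones, 25 flagged as gaps, 50 events total.
p_more = gap_number_test(125, 25, observed_in_gaps=20, n_events=50)
```

A small simulated p-value would mean the gap zones were hit more often than equal-probability chance allows; the paper's finding is the opposite pattern, with gaps hit less often than other segments.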
An Empirical Model for Earthquake Probabilities in the San Francisco Bay Region, California, 2002–2031
Abstract

Cited by 4 (0 self)
The moment magnitude M 7.8 earthquake in 1906 profoundly changed the rate of seismic activity over much of northern California. The low rate of seismic activity in the San Francisco Bay region (SFBR) since 1906, relative to that of the preceding 55 yr, is often explained as a stress-shadow effect of the 1906 earthquake. However, existing elastic and viscoelastic models of stress change fail to fully account for the duration of the lowered rate of earthquake activity. We use variations in the rate of earthquakes as a basis for a simple empirical model for estimating the probability of M ≥ 6.7 earthquakes in the SFBR. The model preserves the relative magnitude distribution of sources predicted by the Working Group on California Earthquake Probabilities' (WGCEP, 1999; WGCEP, 2002) model of characterized ruptures on SFBR faults and is consistent with the occurrence of the four M ≥ 6.7 earthquakes in the region since 1838. When the empirical model is extrapolated 30 yr forward from 2002, it gives a probability of 0.42 for one or more M ≥ 6.7 earthquakes in the SFBR. This result is lower than the probability of 0.5 estimated by WGCEP (1988), lower than the 30-yr Poisson probability of 0.60 obtained by WGCEP (1999) and WGCEP (2002), and lower than the 30-yr time-dependent probabilities of 0.67, 0.70,
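The 30-yr Poisson probability of 0.60 quoted in this abstract follows from a constant annual rate. A minimal sketch of the rate-to-probability conversion (the rate value below is back-solved for illustration, not taken from the paper):

```python
import math

def prob_one_or_more(annual_rate, years=30.0):
    """Poisson probability of at least one qualifying event in `years`,
    assuming events arrive at a constant `annual_rate`."""
    return 1.0 - math.exp(-annual_rate * years)

# Back-solving: a 30-yr probability of 0.60 corresponds to a rate of
# -ln(1 - 0.60) / 30 ≈ 0.0305 events per year.
rate = -math.log(1.0 - 0.60) / 30.0
```

The empirical model's lower figure of 0.42 would correspond, under the same Poisson conversion, to a lower effective annual rate.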
Measuring and Modeling Viscoelastic Relaxation of the Lithosphere with Application to the Northern Volcanic
, 2010
Abstract

Cited by 3 (0 self)
In thinking about who has played a role in my arrival at this point, I have come to realize just how many people it took to get here. Mark Simons has been my research adviser during my time at Caltech and has had a hand in all of the work presented in this thesis. He has allowed and encouraged me to pursue my scientific interests and guided my growth as a scientist. I have found his insight and disciplined approach to science to be invaluable. In addition, I have enjoyed working with members of our research group. Current members are Ravi Kanda, Nina Lin, Belle Philibosian, Sarah Minson, and Francisco Ortega. Matt Pritchard, Rowena Lohman, and Brian Savage were there to guide me in the early stages of my graduate school career. Special thanks must go to Eric Hetland for his help with a large portion of this work. I have also benefited from working with Shelley Kenner, Charles Williams, and Paul Rosen. I would also like to thank the members of my thesis committee, Mike Gurnis, Jean-Philippe Avouac, and Tom Heaton, who has also served as my academic adviser. Hans Fleischmann, Professor Emeritus of Applied and Engineering Physics at Cornell University, was my adviser during my undergraduate study and I owe him a large debt of gratitude for his encouragement during my time at Cornell. The Gordon
Recurrence and interoccurrence behavior of self-organized complex phenomena
 NONLINEAR PROCESSES IN GEOPHYSICS
, 2007
Statistical forecasts and tests for small interplate repeating earthquakes along the Japan Trench, Earth Planets Space, 2011 (in press).
Research group “Earthquake Forecast System based on Seismicity of Japan”, Earthquake forecast testing experiment for
Abstract

Cited by 2 (0 self)
Earthquake predictability is a fundamental problem of seismology. Using a sophisticated model, a Bayesian approach with lognormal distribution on the renewal process, we theoretically formulated a method to calculate the conditional probability of a forthcoming recurrent event and forecast the probabilities of small interplate repeating earthquakes along the Japan Trench. The numbers of forecast sequences for 12 months were 93 for
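The conditional probability of a forthcoming recurrent event in a renewal model can be sketched as follows, assuming, for illustration only, a lognormal recurrence distribution with hand-picked parameters rather than the paper's Bayesian fit:

```python
from math import erf, log, sqrt

def lognormal_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-interval distribution
    (mu, sigma are the mean and std of the log interval length)."""
    return 0.5 * (1.0 + erf((log(t) - mu) / (sigma * sqrt(2.0))))

def conditional_probability(t_elapsed, horizon, mu, sigma):
    """P(next event in (t_elapsed, t_elapsed + horizon] | quiet through t_elapsed)."""
    survivor = 1.0 - lognormal_cdf(t_elapsed, mu, sigma)
    if survivor <= 0.0:
        return 1.0
    gained = lognormal_cdf(t_elapsed + horizon, mu, sigma) - lognormal_cdf(t_elapsed, mu, sigma)
    return gained / survivor

# Illustrative values only: median recurrence 5 yr, log-std 0.5, 4 yr elapsed.
p_1yr = conditional_probability(4.0, 1.0, log(5.0), 0.5)
```

The renewal structure is what makes the forecast time-dependent: the longer the elapsed quiet period relative to the recurrence distribution, the more the conditional probability departs from the Poisson value.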
Updating Seismic Hazard at Parkfield
, 2008
Abstract

Cited by 1 (0 self)
The occurrence of the September 28, 2004 M=6.0 earthquake at Parkfield, California, has significantly modified the mean and aperiodicity of the series of time intervals between the big events in this segment of the San Andreas fault. Using the Minimalist Model of characteristic earthquakes, the Brownian Passage Time Model, and other standard statistical schemes as renewal models, we fit the new data series and recalculate the hazard parameters for the new seismic cycle. The differences resulting from these various renewal models are emphasized.
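How the 2004 event shifts the renewal-model parameters can be sketched from the interval series itself. The event years below are the commonly quoted Parkfield mainshock dates, used here as a simplification of the catalog the paper actually analyzes:

```python
from statistics import mean, stdev

def renewal_params(event_years):
    """Mean recurrence interval and aperiodicity (coefficient of variation)
    of the inter-event times -- the two quantities that renewal models
    (Brownian Passage Time, lognormal, ...) are calibrated against."""
    intervals = [b - a for a, b in zip(event_years, event_years[1:])]
    mu = mean(intervals)
    return mu, stdev(intervals) / mu

# Commonly quoted Parkfield mainshock years; the 2004 event closes a 38-yr interval.
mu, alpha = renewal_params([1857, 1881, 1901, 1922, 1934, 1966, 2004])
```

Including 2004 both lengthens the mean interval and raises the aperiodicity relative to the pre-2004 series, which is exactly the update the abstract describes.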
Probability gains expected for renewal process models
Earth Planets Space, 56, 563–571, 2004
Abstract
We usually use the Brownian distribution, lognormal distribution, Gamma distribution, Weibull distribution, and exponential distribution to calculate long-term probability for the distribution of time intervals between successive events. The values of the two parameters of these distributions are determined by the maximum likelihood method. The difference in log likelihood between the proposed model and the stationary Poisson process model, which scores both the periods with no events and the instances of each event, is taken as the index for evaluating the effectiveness of the earthquake probability model. First, we show that the expected value of the log-likelihood difference equals the expected value of the logarithm of the probability gain. Next, by converting the time unit into the expected value of the interval, the hazard is made to represent a probability gain. This conversion reduces the degrees of freedom of the model parameters to 1. We then demonstrate that the expected value of the probability gain for observed parameter values ranges between 2 and 5. Therefore, we can conclude that the long-term probability calculated before an earthquake may become several times larger than that of the Poisson process model.
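The identity this abstract describes, that the expected log-likelihood difference equals the expected log probability gain, can be checked numerically with a hedged sketch: a lognormal renewal model against a Poisson (exponential-interval) model with the same mean rate, with illustrative parameter values rather than any fitted from data:

```python
import math
import random

def lognormal_logpdf(t, mu, sigma):
    return (-math.log(t * sigma * math.sqrt(2.0 * math.pi))
            - (math.log(t) - mu) ** 2 / (2.0 * sigma ** 2))

def expected_log_gain(mu, sigma, n=50000, seed=0):
    """Monte Carlo estimate of the expected per-event log-likelihood
    difference between a lognormal renewal model and a stationary Poisson
    (exponential-interval) model with the same mean recurrence time."""
    rng = random.Random(seed)
    mean_t = math.exp(mu + sigma ** 2 / 2.0)  # lognormal mean interval
    rate = 1.0 / mean_t                       # matched Poisson rate
    total = 0.0
    for _ in range(n):
        t = rng.lognormvariate(mu, sigma)
        poisson_logpdf = math.log(rate) - rate * t
        total += lognormal_logpdf(t, mu, sigma) - poisson_logpdf
    return total / n

log_gain = expected_log_gain(0.0, 0.3)  # fairly regular recurrence (log-std 0.3)
```

For a fairly regular recurrence process like this, exp(log_gain) lands in the low single digits, consistent with the 2-to-5 range of probability gains the paper reports.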
Geophys. J. Int. (2014), doi: 10.1093/gji/ggu157
Abstract
Deterministic chaos in a simulated sequence of slip events on a single isolated asperity