LOCALLY STATIONARY LONG MEMORY ESTIMATION
"... Abstract. Spectral analysis of strongly dependent time series data has a long history in applications in a variety of fields, such as, e.g., telecommunication, meteorology, hydrology or, more recently, financial and economical data analysis. There exists a wide literature on parametrically or semip ..."
Abstract

Cited by 7 (0 self)
Abstract. Spectral analysis of strongly dependent time series data has a long history of applications in a variety of fields, such as telecommunication, meteorology, hydrology or, more recently, financial and economic data analysis. There exists a wide literature on parametrically or semiparametrically modelling such processes using a long-memory parameter d, including more recent work on wavelet estimation of d. As a generalization of these latter approaches, in this work we allow the long-memory parameter d to vary over time. Hence, we give up the somewhat restrictive assumption of second-order stationarity of the observed process (or of its increments after differencing a finite number of times). We embed our approach into the framework of locally stationary processes which, over the past decade, has been developed for weakly dependent time series with a time-varying spectral structure. In this paper we adopt a semiparametric approach for estimating the time-varying parameter d in order to avoid fitting a parametric model, such as ARFIMA, to the observed data. We show weak consistency and a central limit theorem for our log-regression wavelet estimator of the time-dependent d in a Gaussian context. Both simulations and a real data example complete our work on providing a fairly general approach.
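The log-regression wavelet idea behind such estimators can be illustrated with a minimal sketch. This is not the authors' estimator: it assumes, for illustration only, a Haar transform and the Abry-Veitch-style scaling Var(d_j) ~ 2^(2jd) for the detail coefficients at scale j, and the function names (`estimate_d`, `estimate_d_local`) are hypothetical.

```python
import numpy as np

def haar_detail_variances(x, levels):
    """Variances of Haar wavelet detail coefficients at scales j = 1..levels."""
    variances = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        n = len(a) // 2 * 2
        d = (a[0:n:2] - a[1:n:2]) / np.sqrt(2.0)  # detail coefficients
        a = (a[0:n:2] + a[1:n:2]) / np.sqrt(2.0)  # approximation, carried to next scale
        variances.append(np.var(d))
    return np.array(variances)

def estimate_d(x, levels=5):
    """Log-regression of log2(wavelet variance) on scale j; slope/2 estimates d
    under the scaling Var(d_j) ~ 2^(2*j*d) assumed here for illustration."""
    v = haar_detail_variances(x, levels)
    j = np.arange(1, levels + 1)
    slope, _ = np.polyfit(j, np.log2(v), 1)
    return slope / 2.0

def estimate_d_local(x, window=1024, levels=5):
    """Time-varying estimate: apply the global estimator on sliding blocks,
    mimicking local stationarity over short windows."""
    return [estimate_d(x[s:s + window], levels)
            for s in range(0, len(x) - window + 1, window)]
```

For white noise (d = 0) the per-scale variances are flat, so the fitted slope, and hence the estimate of d, is close to zero; the blockwise version returns one estimate per window, giving a crude time-varying profile.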
Long-memory stable Ornstein-Uhlenbeck processes
Electron. J. Probab., 2003
Fractional Brownian motion with Hurst index H=0 and the Gaussian Unitary Ensemble
"... The goal of this paper is to establish a relation between characteristic polynomials of N ×N GUE random matrices H as N → ∞, and Gaussian processes with logarithmic correlations. First, we introduce a regularized version of fractional Brownian motion with zero Hurst index, which is a Gaussian proces ..."
Abstract

Cited by 3 (0 self)
The goal of this paper is to establish a relation between characteristic polynomials of N × N GUE random matrices H as N → ∞ and Gaussian processes with logarithmic correlations. First, we introduce a regularized version of fractional Brownian motion with zero Hurst index, which is a Gaussian process with stationary increments and a logarithmic increment structure. Then we prove that this process appears as a limit of D_N(z) := −log |det(zI − H)| on mesoscopic scales as N → ∞. By employing a Fourier integral representation, we show how this implies a continuous analogue of a result by Diaconis and Shahshahani [18]. On the macroscopic scale, D_N(x) gives rise to yet another type of Gaussian process with logarithmic correlations. We give ...
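The central object D_N(z) is straightforward to simulate. A minimal sketch, assuming one common GUE normalization convention (H = (A + A*)/2 with iid standard complex Gaussian entries; conventions differ by scaling):

```python
import numpy as np

def sample_gue(n, rng):
    """Sample an n x n GUE matrix as H = (A + A^*) / 2 with iid standard
    complex Gaussian entries A_jk (one common normalization convention)."""
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (a + a.conj().T) / 2.0

def log_char_poly(h, z):
    """D_N(z) = -log |det(zI - H)|, evaluated stably via the eigenvalues."""
    eigs = np.linalg.eigvalsh(h)  # H is Hermitian, so the eigenvalues are real
    return -np.sum(np.log(np.abs(z - eigs)))
```

Evaluating `log_char_poly` along a mesoscopic window of z values for growing N would give a numerical glimpse of the logarithmically correlated limit the abstract describes.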
LONG STRANGE SEGMENTS, RUIN PROBABILITIES AND THE EFFECT OF MEMORY ON MOVING AVERAGE PROCESSES
"... Abstract. We obtain the rate of growth of long strange segments and the rate of decay of infinite horizon ruin probabilities for a class of infinite moving average processes with exponentially light tails. The rates are computed explicitly. We show that the rates are very similar to those of an i.i. ..."
Abstract

Cited by 1 (1 self)
Abstract. We obtain the rate of growth of long strange segments and the rate of decay of infinite-horizon ruin probabilities for a class of infinite moving average processes with exponentially light tails. The rates are computed explicitly. We show that the rates are very similar to those of an i.i.d. process as long as the moving average coefficients decay fast enough. If they do not, then the rates are significantly different. This demonstrates the change in the length of memory in a moving average process associated with certain changes in the rate of decay of the coefficients.
Examination committee chair (Prüfungsausschussvorsitzender): Prof. Dr. Alfred Mertins
"... inTeilenderZeitreihe)dientdazu,CharakteristikenderzuGrundeliegendenDynamik DieVerteilungordinalerMusterineinerZeitreihe(bzw. zuberechnen,oderzwischenderDynamikinunterschiedlichenTeilenderZeitreihezu unterscheiden. durchdieShannonEntropiederVerteilungordinalerMustergegebenistundalsMa EinBeispielsolc ..."
Abstract
 Add to MetaCart
(Show Context)
The distribution of ordinal patterns in a time series (or in parts of the time series) serves to compute characteristics of the underlying dynamics, or to distinguish between the dynamics in different parts of the time series. One example of such a characteristic is the permutation entropy, which is given by the Shannon entropy of the distribution of ordinal patterns and can be regarded as a measure of the complexity of time series. One application of permutation entropy is the analysis of epileptic activity in EEG time series. The context of the investigations in this thesis is a parametric family of stochastic processes with stationary, non-degenerate, centered Gaussian increments. This class of processes includes equidistant discretizations of fractional Brownian motion as well as integrated ARFIMA(0,d,0) and AR(1) processes. In Chapter 3 we show that the distribution of ordinal patterns in such processes is stationary, and that every ordinal pattern has a strictly positive probability of occurrence. Given a finite number of observations of ordinal patterns, the relative frequency of a pattern is an unbiased estimator of the corresponding occurrence probability. Since the distribution of stationary, centered Gaussian processes is invariant under reversal of the space and time axes, certain ordinal patterns have the same occurrence probability. By averaging the relative frequencies of these patterns, one obtains estimators with lower variance. A sufficient condition for weak consistency of the estimators is that the autocovariances of the increment process tend to zero as the time lag grows.
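The permutation entropy described above can be sketched in a few lines. This is a generic textbook-style implementation, not the thesis's code; tie-breaking via stable argsort is an assumption.

```python
import numpy as np
from collections import Counter
from math import log

def ordinal_pattern(window):
    """Ordinal pattern of a window: the permutation that sorts it."""
    return tuple(np.argsort(window, kind="stable"))

def permutation_entropy(x, order=3):
    """Shannon entropy of the empirical distribution of ordinal patterns
    of length `order`, taken over all sliding windows of the series."""
    patterns = Counter(ordinal_pattern(x[i:i + order])
                       for i in range(len(x) - order + 1))
    total = sum(patterns.values())
    return -sum((c / total) * log(c / total) for c in patterns.values())
```

A monotone series produces a single ordinal pattern and hence zero entropy, while a series that alternates between the possible patterns approaches the maximal entropy log(order!).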
Operator fractional Brownian motion as limit of polygonal line processes in Hilbert space
Detecting Markov Random Fields Hidden in White Noise
"... Motivated by change point problems in time series and the detection of textured objects in images, we consider the problem of detecting a piece of a Gaussian Markov random field hidden in white Gaussian noise. We derive minimax lower bounds and propose nearoptimal tests. 1 ..."
Abstract
 Add to MetaCart
Motivated by change-point problems in time series and the detection of textured objects in images, we consider the problem of detecting a piece of a Gaussian Markov random field hidden in white Gaussian noise. We derive minimax lower bounds and propose near-optimal tests.
Greedy Algorithms for Prediction
Submitted to Bernoulli
"... In many prediction problems, it is not uncommon that the number of variables used to construct a forecast is of the same order of magnitude as the sample size, if not larger. We then face the problem of constructing a prediction in the presence of potentially large estimation error. Control of the e ..."
Abstract
 Add to MetaCart
In many prediction problems, it is not uncommon that the number of variables used to construct a forecast is of the same order of magnitude as the sample size, if not larger. We then face the problem of constructing a prediction in the presence of potentially large estimation error. Control of the estimation error is achieved either by selecting variables or by combining all the variables in some special way. This paper considers greedy algorithms to solve this problem. It is shown that the resulting estimators are consistent under weak conditions. In particular, the derived rates of convergence are either minimax or improve on those given in the literature, allowing for dependence and unbounded regressors. Some versions of the algorithms provide fast solutions to problems such as the Lasso.
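The flavor of such greedy variable selection can be shown with a minimal orthogonal-matching-pursuit-style sketch. This is a generic illustration, not the paper's exact algorithms, and `greedy_select` is a hypothetical name.

```python
import numpy as np

def greedy_select(X, y, steps):
    """Greedy selection: at each step pick the column most correlated with
    the current residual, then refit by least squares on the selected set."""
    support = []
    residual = y.copy()
    for _ in range(steps):
        scores = np.abs(X.T @ residual)
        scores[support] = -np.inf          # never pick a column twice
        support.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
    return support, coef
```

With orthonormal columns the correlation scores equal the true coefficients, so the greedy pass recovers the active variables exactly; with correlated, high-dimensional designs one would stop after a data-driven number of steps to control estimation error, which is where the paper's analysis comes in.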
Power laws and Self-Organized Criticality in Theory and Nature
"... Power laws and distributions with heavy tails are common features of many complex systems. Examples are the distribution of earthquake magnitudes, solar flare intensities and the sizes of neuronal avalanches. Previously, researchers surmised that a single general concept may act as an underlying gen ..."
Abstract
 Add to MetaCart
(Show Context)
Power laws and distributions with heavy tails are common features of many complex systems. Examples are the distribution of earthquake magnitudes, solar flare intensities and the sizes of neuronal avalanches. Previously, researchers surmised that a single general concept may act as an underlying generative mechanism, with the theory of self-organized criticality being a weighty contender. The power-law scaling observed in the primary statistical analysis is an important, but by far not the only, feature characterizing experimental data. The scaling function, the distribution of energy fluctuations, the distribution of inter-event waiting times, and other higher-order spatial and temporal correlations have seen increased consideration in recent years. This has led to the realization that basic models, like the original sandpile model, are often insufficient to adequately describe the complexity of real-world systems with power-law distributions. Consequently, a substantial amount of effort has gone into developing new and extended models and, hitherto, three classes of models have emerged. The first line of ...
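The "original sandpile model" mentioned above is the Abelian (Bak-Tang-Wiesenfeld) sandpile, which is simple enough to sketch. A minimal, unoptimized version with open boundaries, assuming the standard threshold of 4 grains on a square grid:

```python
import numpy as np

def sandpile_avalanche_sizes(size=20, grains=2000, seed=0):
    """Drop grains at random sites of a size x size grid; any cell holding
    >= 4 grains topples, sending one grain to each neighbor (grains leaving
    the grid are lost). Returns the toppling count per drop (avalanche size)
    and the final stable grid."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=int)
    sizes = []
    for _ in range(grains):
        i, j = rng.integers(0, size, 2)
        grid[i, j] += 1
        topplings = 0
        while True:
            unstable = np.argwhere(grid >= 4)
            if len(unstable) == 0:
                break
            for ui, uj in unstable:
                grid[ui, uj] -= 4
                topplings += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = ui + di, uj + dj
                    if 0 <= ni < size and 0 <= nj < size:
                        grid[ni, nj] += 1  # boundary grains fall off the table
        sizes.append(topplings)
    return np.array(sizes), grid
```

After a transient, a histogram of the recorded avalanche sizes exhibits the heavy-tailed, approximately power-law behavior that motivated self-organized criticality; the limitations discussed in the abstract concern precisely how far such a basic model can be pushed to match real data.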