| Title | Self-Normalization for Heavy-Tailed Time Series with Long Memory |
|---|---|
| Authors | McElroy, Tucker; Politis, Dimitris |
| Journal | Statistica Sinica |
| Volume/Issue | 17:1, January 2007 |
| Pages | 199-220 |
| Classification | 319.5 |
| Keywords | Heavy-tailed data; Infinite variance; Long-range dependence; Studentization |
| Language | English |
| Abstract | Many time series data sets have heavy tails and/or long memory, both of which are well known to greatly influence the rate of convergence of the sample mean. Typically, time series analysts consider models with either heavy tails or long memory; we consider both. The paper is essentially a theoretical case study that explores the growth rate of the sample mean for a particular heavy-tailed, long memory time series model. An exact rate of convergence, which displays the competition between memory and tail thickness in fostering sample mean growth, is obtained in our main theorem. An appropriate self-normalization is used to produce a studentized sample mean statistic, computable without prior knowledge of the tail and memory parameters. This paper presents a novel heavy-tailed time series model that also has long memory in the sense of sums of well-defined autocovariances; we explicitly show the role that memory and tail thickness play in determining the sample mean's rate of growth, and we construct an appropriate studentization. Our model is a natural extension of long memory Gaussian models to data with infinite variance, and therefore pertains to a wide range of applications, including finance, insurance, and hydrology. |
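The abstract's central idea, a studentized sample mean that needs no prior knowledge of tail or memory parameters, can be illustrated with the classic self-normalized sum: divide the sum of the observations by the square root of the sum of their squares. This is a minimal sketch of the general technique only; the paper's actual normalizer for long-memory heavy-tailed series differs, and the function name and Cauchy simulation below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def self_normalized_mean(x):
    """Classic self-normalized sum T_n = S_n / V_n, where
    S_n = sum of observations and V_n = sqrt(sum of squares).
    T_n is invariant to rescaling x by any positive constant,
    so no tail or memory parameter is needed to compute it."""
    x = np.asarray(x, dtype=float)
    s_n = x.sum()                      # sample sum S_n
    v_n = np.sqrt(np.sum(x ** 2))      # self-normalizer V_n
    return s_n / v_n

# Heavy-tailed sample: standard Cauchy draws have infinite variance,
# yet the self-normalized statistic remains well behaved.
rng = np.random.default_rng(0)
x = rng.standard_cauchy(1000)
t = self_normalized_mean(x)
print(t)
```

By the Cauchy-Schwarz inequality, |T_n| never exceeds sqrt(n), so the statistic stays bounded at a known scale even when the raw sample mean diverges.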