This series has been discontinued as of Dec. 2001. The references are kept here for archival purposes. The "Beiträge" are papers in mathematical statistics, made accessible by StatLab Heidelberg, the statistical laboratory at the Institut für Angewandte Mathematik, Universität Heidelberg. "Reports" are contributed papers first published elsewhere, or technical papers.

StatLab Heidelberg: Dec. 2001 Abstracts by Date (most recent changes first)


Eichler, M.: Granger causality graphs for multivariate time series.
Beitrag 64 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.64.pdf>
Submitted: June 01
Abstract: In this paper, we discuss the properties of mixed graphs which visualize causal relationships between the components of multivariate time series. In these Granger-causality graphs, the vertices, representing the components of the time series, are connected by arrows according to the Granger-causality relations between the variables whereas lines correspond to contemporaneous conditional association. We show that the concept of Granger-causality graphs provides a framework for the derivation of general noncausality relations relative to reduced information sets by performing sequences of simple operations on the graphs. We briefly discuss the implications for the identification of causal relationships. Finally we provide an extension of the linear concept to strong Granger-causality.

Ladneva, A.; Piterbarg, V.: On Double Extremes of Gaussian Stationary Processes.
Beitrag 63 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.63.ps>
Submitted: September 00
Abstract: We consider a Gaussian stationary process satisfying Pickands' conditions and evaluate the exact asymptotic behavior of the probability of two high extremes on two disjoint intervals.

Beran, R.: REACT Trend Estimation in Correlated Noise.
Beitrag 62 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.62.pdf>
Submitted: September 99.
Abstract: Suppose that the data is modeled as replicated realizations of a p-dimensional random vector whose mean µ is a trend of interest and whose covariance matrix sigma is unknown, positive definite. REACT estimators for the trend involve transformation of the data to a new basis, estimating the risks of a class of candidate linear shrinkage estimators, and selecting the candidate estimator with smallest estimated risk. For Gaussian samples and quadratic loss, the maximum risks of REACT estimators proposed in this paper undercut that of the classically efficient sample mean vector. The superefficiency of the proposed estimators relative to the sample mean is most pronounced when the new basis provides an economical description of the vector sigma^(-1/2) µ, dimension p is not small, and sample size is much larger than p. A case study illustrates how vague prior knowledge may guide choice of a basis that reduces risk substantially.
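The selection step can be illustrated in a stripped-down setting. The sketch below assumes a single observed vector, a known noise variance sigma2, an orthonormal discrete cosine basis and nested 1-0 shrinkage; the function react_nested_fit and the plug-in risk formula are illustrative only and do not reproduce the paper's monotone shrinkage or its handling of the unknown covariance.

    import numpy as np
    from scipy.fft import dct, idct  # orthonormal discrete cosine basis

    def react_nested_fit(x, sigma2):
        """Illustrative REACT-style fit with nested 1-0 shrinkage.
        x      : observed p-dimensional vector (trend plus noise)
        sigma2 : assumed known noise variance per component
        Keeps the first k basis coefficients, where k minimizes an
        unbiased estimate of quadratic risk."""
        z = dct(x, norm='ortho')                 # coefficients in the new basis
        p = len(z)
        # Risk of keeping the first k coefficients and zeroing the rest:
        # variance k*sigma2 plus estimated squared bias of the dropped ones.
        bias2 = np.maximum(z ** 2 - sigma2, 0.0)
        tail = np.concatenate(([bias2.sum()], bias2.sum() - np.cumsum(bias2)))
        risks = sigma2 * np.arange(p + 1) + tail
        k = int(np.argmin(risks))
        return idct(np.where(np.arange(p) < k, z, 0.0), norm='ortho')

    # Example: recover a smooth trend from one noisy realization.
    t = np.linspace(0.0, 1.0, 200)
    y = np.sin(2 * np.pi * t) + np.random.default_rng(0).normal(0.0, 0.5, 200)
    trend_hat = react_nested_fit(y, sigma2=0.25)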

Beran, R.: REACT Scatterplot Smoothers: Superefficiency through Basis Economy.
Beitrag 49 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.49.ps>
Submitted: September 98 Revised: July 99, to appear in JASA (2000).
Abstract: REACT estimators for the mean of a linear model involve three steps: transforming the model to a canonical form that provides an economical representation of the unknown mean vector, estimating the risks of a class of candidate linear shrinkage estimators, and adaptively selecting the candidate estimator that minimizes estimated risk. Applied to one- or higher-way layouts, the REACT method generates automatic scatterplot smoothers that compete well on standard data sets with the best fits obtained by alternative techniques. Historical precursors to REACT include nested model selection, ridge regression, and nested principal component selection for the linear model. However, REACT's insistence on working with an economical basis greatly increases its superefficiency relative to the least squares fit. This reduction in risk and the possible economy of the discrete cosine basis, of the orthogonal polynomial basis, or of a smooth basis that generalizes the discrete cosine basis are illustrated by fitting scatterplots drawn from the literature. Flexible monotone shrinkage of components rather than nested 1-0 shrinkage achieves a secondary decrease in risk that is visible in these examples. Pinsker bounds on asymptotic minimax risk for the estimation problem express the remarkable role of basis economy in reducing risk.

Sawitzki, G.: Keeping Statistics Alive in Documents
Report 18 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.18.pdf>
Submitted: November 99
Abstract: We identify some of the requirements for document integration of software components in statistical computing, and try to give a general idea how to cope with them in an implementation.

Dahlhaus, R.; Hainz, G.: Spectral Domain Bootstrap Tests for Stationary Time Series.
Beitrag 61 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.61.ps>
Submitted: November 99
Abstract: For stationary linear processes, Kolmogorov-Smirnov type goodness-of-fit tests for compound hypotheses based on frequency domain bootstrap methods are proposed. Similar bootstrap tests for comparing the spectral distributions of two time series are suggested. The small sample performance of the tests is investigated by simulation, and a real data example is given for illustration.

Dahlhaus, R.; Neumann, M.: Locally Adaptive Fitting of Semiparametric Models to Nonstationary Time Series.
Beitrag 60
Published in: Stochastic Processes & Their Applications, to appear.
Abstract: We fit a class of semiparametric models to a nonstationary process. This class is parametrized by a mean function mu(·) and a p-dimensional function theta(·) = (theta_1(·), ..., theta_p(·))' that parametrizes the time-varying spectral density f_theta(·)(lambda). Whereas the mean function is estimated by a usual kernel estimator, each component of theta(·) is estimated by a nonlinear wavelet method. According to a truncated wavelet series expansion of theta_i(·), we define empirical versions of the corresponding wavelet coefficients by minimizing an empirical version of the Kullback-Leibler distance. In the main smoothing step, we perform nonlinear thresholding on these coefficients, which finally provides a locally adaptive estimator of theta_i(·). This method is fully automatic and adapts to different smoothness classes. It is shown that usual rates of convergence in Besov smoothness classes are attained up to a logarithmic factor.

Dahlhaus, R.: Graphical Interaction Models for Multivariate Time Series.
Beitrag 59 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.59.ps>
Submitted: June 99 Revised: December 99
Abstract: In this paper we extend the concept of graphical models for multivariate data to multivariate time series. We define a partial correlation graph for time series and use the partial spectral coherence between two components given the remaining components to identify the edges of the graph. As an example we consider multivariate autoregressive processes. The method is applied to air pollution data.

Maercker, G.; Moser, M.: Yule-Walker Type Estimators in GARCH(1,1) Models: Asymptotic Normality and Bootstrap.
Beitrag 58 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.58.ps>
Submitted: June 99
Abstract: We investigate GARCH(1,1) processes and first prove their stability. Using the representation of the squared GARCH model as an ARMA model we then consider Yule-Walker type estimators for the parameters of the GARCH(1,1) model and derive their asymptotic normality. We use a residual bootstrap to define bootstrap estimators for the Yule-Walker estimates and prove the consistency of this bootstrap method. Some simulation results will demonstrate the small sample behaviour of the bootstrap procedure.
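For orientation: the squared GARCH(1,1) process has an ARMA(1,1) representation with AR coefficient alpha + beta and MA coefficient -beta, and a Yule-Walker flavoured moment estimator can be sketched from that fact. The function moment_garch11 below, the grid search for beta and the use of the first two autocorrelations are assumptions made for illustration; this is not the estimator or the residual bootstrap analysed in the paper.

    import numpy as np

    def sample_acf(x, lag):
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        return np.dot(x[:n - lag], x[lag:]) / np.dot(x, x)

    def moment_garch11(returns):
        """Rough moment estimator for GARCH(1,1) via the ARMA(1,1)
        representation of the squared process (illustration only)."""
        y = np.asarray(returns, float) ** 2
        r1, r2 = sample_acf(y, 1), sample_acf(y, 2)
        phi = r2 / r1                              # estimates alpha + beta
        # ARMA(1,1) lag-one autocorrelation with MA coefficient theta = -beta:
        def rho1(theta):
            return (1 + phi * theta) * (phi + theta) / (1 + theta ** 2 + 2 * phi * theta)
        thetas = np.linspace(-0.99, -0.01, 99)     # grid search, theta in (-1, 0)
        theta = thetas[np.argmin((rho1(thetas) - r1) ** 2)]
        beta = -theta
        alpha = phi - beta
        omega = np.mean(y) * (1 - alpha - beta)    # from the unconditional variance
        return omega, alpha, beta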

Linton, O.; Mammen, E.; Nielsen, J.; Tanggaard, C.: Estimating Yield Curves by Kernel Smoothing Methods.
Beitrag 57 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.57.ps>
Submitted: January 99
Abstract: We introduce a new method for the estimation of discount functions, yield curves and forward curves from government issued coupon bonds. Our approach is nonparametric and does not assume a particular functional form for the discount function, although we do show how to impose various restrictions in the estimation. Our method is based on kernel smoothing and is defined as the minimum of some localized population moment condition. The solution to the sample problem is not explicit and our estimation procedure is iterative, rather like the backfitting method of estimating additive nonparametric models. We establish the asymptotic normality of our methods using the asymptotic representation of our estimator as an infinite series with declining coefficients. The rate of convergence is standard for one-dimensional nonparametric regression.

Dahlhaus, R.: A Likelihood Approximation for Locally Stationary Processes.
Beitrag 56 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.56.ps>
Submitted: January 99
Abstract: A new approximation to the Gaussian likelihood of a multivariate locally stationary process is introduced. It is based on an approximation of the inverse of the covariance matrix of such processes. The new quasi-likelihood is a generalisation of the classical Whittle-likelihood for stationary processes. For parametric models asymptotic normality and efficiency of the resulting estimator are proved. Since the likelihood has a special local structure it can be used for nonparametric inference as well. This is briefly sketched for different estimates.

Sawitzki, G.: The Excess Mass Approach and the Analysis of Multi-Modality
Report 17 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.17.pdf>
Published in: The Excess Mass Approach and Analysis of Multi-Modality. In: W. Gaul, D. Pfeifer (eds.): From data to knowledge: Theoretical and practical aspects of classification, data analysis and knowledge organization. Proc. 18th Annual Conference of the GfKl, Univ. of Oldenburg, 1996. Springer Verlag, Heidelberg Berlin, ISBN 3-540-60354-9, pp. 203 - 211.
Abstract: The excess mass approach is a general approach to statistical analysis. It can be used to formulate a probabilistic model for clustering and can be applied to the analysis of multi-modality. Intuitively, a mode is present where an excess of probability mass is concentrated. This intuitive idea can be formalized directly by means of the excess mass functional. There is no need for intervening steps like initial density estimation. The excess mass measures the local difference of a given distribution to a reference model, usually the uniform distribution. The excess mass defines a functional which can be estimated efficiently from the data and can be used to test for multi-modality.
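In one dimension, with intervals as the reference class, the empirical excess mass at level lambda can be written down directly: the largest value of (fraction of sample points in an interval) minus lambda times the interval's length. The brute-force helper below is only an illustration of the functional; its name and O(n^2) search are not from the paper.

    import numpy as np

    def excess_mass_intervals(x, lam):
        """Empirical excess mass over the class of intervals:
        max over intervals [x_(i), x_(j)] of
        (fraction of sample points inside) - lam * interval length."""
        xs = np.sort(np.asarray(x, float))
        n = len(xs)
        best = 0.0
        for i in range(n):
            for j in range(i, n):
                best = max(best, (j - i + 1) / n - lam * (xs[j] - xs[i]))
        return best

Comparing the excess mass over single intervals with the excess mass over pairs of intervals is the basis of the test for multimodality mentioned in the abstract.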

Franke, J.; Kreiss, J.-P.; Moser, M.: Bootstrap Autoregressive Order Selection.
Beitrag 55 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.55.ps>
Submitted: December 98
Abstract: In this paper we deal with the problem of fitting an autoregression of order p to given data coming from a stationary autoregressive process of infinite order. The paper is mainly concerned with the selection of an appropriate order of the autoregressive model. Based on the so-called final prediction error (FPE), a bootstrap order selection can be proposed, because it turns out that one relevant expression occurring in the FPE is ready for the application of the bootstrap principle. Some asymptotic properties of the bootstrap order selection are proved. To carry through the bootstrap procedure, an autoregression with increasing but non-stochastic order is fitted to the given data. The paper is concluded by some simulations.
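As a point of reference, the classical (non-bootstrap) FPE order selection that the bootstrap procedure starts from can be sketched as follows; the least-squares AR fits and the exact FPE normalization below are simplifying assumptions, not the authors' bootstrap selection.

    import numpy as np

    def fpe_order_selection(x, max_order):
        """Select an AR order by the final prediction error (FPE),
        using plain least-squares AR fits; illustration only."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        fpe = []
        for p in range(1, max_order + 1):
            Y = x[p:]                                                       # regress x_t ...
            X = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])  # ... on its lags
            coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
            sigma2 = np.mean((Y - X @ coef) ** 2)
            fpe.append(sigma2 * (n + p) / (n - p))
        return int(np.argmin(fpe)) + 1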

Chen, Z.-G.; Dahlhaus, R.; Wu, K. H. : Hidden Frequency Estimation with Data Tapers.
Beitrag 54
Submitted: November 98
Abstract: Detecting and estimating hidden frequencies have long been recognized as an important problem in time series. This paper studies the asymptotic theory for two methods of high-precision estimation of hidden frequencies (secondary analysis method and maximum periodogram method) under the premise of using a data taper. In ordinary situations, a data taper may reduce the estimation precision slightly. However, when there are high peaks in the spectral density of the noise or other strong hidden periodicities with frequencies close to the hidden frequency of interest, the procedures of detection of the existence and the estimation for the hidden frequency of interest fail if data are non-tapered, whereas they may work well if the data are tapered. The theoretical results are verified by some simulated examples.
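A minimal sketch of the tapered maximum periodogram method: multiply the data by a taper, compute the periodogram of the tapered data, and take the frequency at which it peaks. The Hanning taper and the normalization below are illustrative choices, not those of the paper.

    import numpy as np

    def tapered_periodogram_peak(x):
        """Maximum-periodogram estimate of a hidden frequency from data
        multiplied by a Hanning taper (illustration only)."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        taper = 0.5 * (1 - np.cos(2 * np.pi * np.arange(n) / (n - 1)))
        h = x * taper
        freqs = np.fft.rfftfreq(n)                       # frequencies in cycles per step
        I = np.abs(np.fft.rfft(h)) ** 2 / np.sum(taper ** 2)
        k = 1 + int(np.argmax(I[1:]))                    # skip the zero frequency
        return freqs[k], I[k]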

Härdle, W.; Huet, S.; Mammen, E.; Sperlich, S. : Semiparametric Additive Indices for Binary Response and Generalized Additive Models.
Beitrag 53 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.53.ps>
Submitted: October 98.
Abstract: Models are studied where the response Y and covariates X, T are assumed to fulfill E(Y | X,T) = G{X^T beta + alpha + m_1(T_1) + ... + m_d(T_d)}. Here G is a known (link) function, beta is an unknown parameter, and m_1, ..., m_d are unknown functions. In particular, we consider additive binary response models where the response Y is binary. In these models, given X and T, the response Y has a Bernoulli distribution with parameter G{X^T beta + alpha + m_1(T_1) + ... + m_d(T_d)}. The paper discusses estimation of beta and m_1, ..., m_d. Procedures are proposed for testing linearity of the additive components m_1, ..., m_d. Furthermore, bootstrap uniform confidence intervals for the additive components are introduced. The practical performance of the proposed methods is discussed in simulations and in two economic applications.

Franke, J.; Kreiss, J.-P.; Mammen, E.; Neumann, M.H. : Properties of the Nonparametric Autoregressive Bootstrap.
Beitrag 52 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.52.ps>
Submitted: October 98.
Abstract: We prove geometric ergodicity and absolute regularity of the nonparametric autoregressive bootstrap process. To this end, we revisit this problem for nonparametric autoregressive processes and give some quantitative conditions (i.e., with explicit constants) under which the mixing coefficients of such processes can be bounded by some exponentially decaying sequence. This is achieved by using well-established coupling techniques. Then we apply the result to the bootstrap process and propose some particular estimators of the autoregression function and of the density of the innovations for which the bootstrap process has the desired properties. Moreover, by using some "decoupling" argument, we show that the stationary density of the bootstrap process converges to that of the original process. As an illustration, we use the proposed bootstrap method to construct simultaneous confidence bands and supremum-type tests for the autoregression function as well as to approximate the distribution of the least squares estimator in a certain parametric model.

Polonik, W. : Concentration and Goodness-of-Fit in Higher Dimensions: (Asymptotically) Distribution-Free Methods.
Beitrag 33
Published in: Annals of Statistics (1999), 27, 1210-1229
Abstract: A novel approach for constructing goodness-of-fit techniques in arbitrary (finite) dimensions is presented. Testing problems are considered as well as the construction of diagnostic plots. The approach is based on some new notion of mass concentration, and in fact, our basic testing problems are formulated as problems for "goodness-of-concentration". It is this connection to concentration of measure that makes the approach conceptually simple. The presented test statistics are continuous functionals of certain processes which behave like the standard one-dimensional uniform empirical process. Hence, the test statistics behave like classical test statistics for goodness-of-fit. In particular, for single hypotheses they are asymptotically distribution free with well known asymptotic distribution. The simple technical idea behind the approach may be called a generalized quantile transformation, where the role of one-dimensional quantiles in classical situations is taken over by so-called minimum volume sets.

Mammen, E.; Marron, J.S.; Turlach, B.A.; Wand, M.P. : A General Framework for Constrained Smoothing.
Beitrag 51 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.51.ps>
Submitted: October 98.
Abstract: There are a wide array of smoothing methods available for finding structure in data. A general framework is developed which shows that many of these can be viewed as a projection of the data, with respect to appropriate norms. The underlying vector space is an unusually large product space, which allows inclusion of a wide range of smoothers in our setup (including many methods not typically considered to be projections). We give several applications of this simple geometric interpretation of smoothing. A major payoff is the natural and computationally frugal incorporation of constraints. Our point of view also motivates new estimates and it helps to understand the finite sample and asymptotic behaviour of these estimates.

Carroll, R. J.; Härdle, W.; Mammen, E.: Estimation in an Additive Model when the Components are Linked Parametrically.
Beitrag 50 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.50.ps>
Submitted: October 98.
Abstract: Motivated by a nonparametric GARCH model we consider nonparametric additive regression and autoregression models in the special case that the additive components are linked parametrically. We show that the parameter can be estimated with parametric rate and give the normal limit. Our procedure is based on two steps. In the first step, nonparametric smoothers are used for the estimation of each additive component without taking into account the parametric link of the functions. In a second step, the parameter is estimated by using the parametric restriction between the additive components. Interestingly, our method needs no undersmoothing in the first step.

Konakov, V.; Mammen, E.: Local Limit Theorems for Transition Densities of Markov Chains Converging to Diffusions.
Beitrag 48 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.48.ps>
Submitted: August 98.
Abstract: We consider triangular arrays of Markov chains that converge weakly to a diffusion process. Local limit theorems for transition densities are proved.

Beran, R.: Superefficient Estimation of Multivariate Trend.
Beitrag 47 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.47.ps>
Published in: Mathematical Methods of Statistics 8 (1999) 166--180.
Submitted: July 98.
Abstract: The question of recovering a multiband signal from noisy observations motivates a model in which the multivariate data points consist of an unknown deterministic trend Xi observed with multivariate Gaussian errors. A cognate random trend model suggests two affine shrinkage estimators for the deterministic trend, which are related to an extended Efron-Morris estimator. When represented canonically, the one affine shrinkage estimator performs componentwise James-Stein shrinkage in a coordinate system that is determined by the data. Under the original deterministic trend model, this affine shrinkage estimator and its relatives are asymptotically minimax in Pinsker's sense over certain classes of subsets of the parameter space. In such fashion, the affine shrinkage estimator and its cousins dominate the classically efficient least squares estimator. We illustrate their use to improve on the least squares fit of the multivariate linear model.

Polonik, W.; Yao, Q.: Asymptotics of set-indexed conditional empirical processes based on dependent data.
Report 16 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.16.ps>
Submitted: July 98.
Abstract: Based on observation vectors (Xt,Yt) from a strong mixing stochastic process, we estimate the conditional distribution of Y given X = x by means of a Nadaraya-Watson-type estimator. Using this, we study the asymptotics of a conditional empirical process indexed by classes of sets. Under assumptions on the richness of the indexing class in terms of metric entropy with bracketing, we establish uniform convergence and asymptotic normality. The key technical result gives rates of convergence for the sup-norm of the conditional empirical process over a sequence of indexing classes with decreasing maximum Lt-norm. The results are then applied to derive Bahadur-Kiefer type approximations for a generalized conditional quantile process which is closely related to minimum volume sets. The potential applications in the areas of estimation of level sets and testing for unimodality of conditional distributions are discussed.

Polonik, W.; Yao, Q.: Conditional Minimum Volume Predictive Regions For Stochastic Processes.
Report 15
Submitted: January 98, to appear in JASA 2000
Abstract: Motivated by interval/region prediction in nonlinear time series, we propose a minimum volume predictor (MV-predictor) for a strictly stationary process. The MV-predictor varies with respect to the current position in the state space and has the minimum Lebesgue measure among all regions with the nominal coverage probability. We have established consistency, convergence rates, and asymptotic normality for both coverage probability and Lebesgue measure of the estimated MV-predictor under the assumption that the observations are taken from a strong mixing process. Applications with both real and simulated data sets illustrate the proposed methods.

Brockwell, P.J.; Dahlhaus, R.: Generalized Durbin-Levinson and Burg Algorithms.
Beitrag 35 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.35.ps>
Submitted: January 98
Abstract: We develop recursive algorithms for subset modelling and prediction which generalize the well-known Durbin-Levinson and Burg algorithms and include the univariate version of the subset Whittle algorithm of Penm and Terrell (1982). The results are derived using a basic property of orthogonal projections which leads to very simple derivations of the standard versions of the algorithms. As an application of the results, we obtain new and easily applied algorithms for the recursive calculation of the best linear h-step predictors (for any fixed h > 0) of an arbitrary process with known mean and covariance function.
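For readers who want the classical special case as a reference, the standard one-step, full-subset Durbin-Levinson recursion can be written in a few lines; the subset and h-step generalizations of the paper are not reproduced here, and the autocovariances fed to the routine (sample or theoretical) are the caller's choice.

    import numpy as np

    def durbin_levinson(gamma):
        """Classical Durbin-Levinson recursion: from autocovariances
        gamma(0), ..., gamma(p) compute the coefficients of the best linear
        one-step predictor of order p and the prediction error variance."""
        gamma = np.asarray(gamma, float)
        p = len(gamma) - 1
        phi = np.zeros(p)
        v = gamma[0]
        for m in range(1, p + 1):
            # Reflection coefficient (partial autocorrelation) at lag m.
            acc = gamma[m] - np.dot(phi[:m - 1], gamma[1:m][::-1])
            k = acc / v
            phi_new = phi.copy()
            phi_new[m - 1] = k
            phi_new[:m - 1] = phi[:m - 1] - k * phi[:m - 1][::-1]
            phi = phi_new
            v = v * (1 - k ** 2)
        return phi, v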

Mammen, E.; Linton, O.; Nielsen, J.: The Existence and Asymptotic Properties of a Backfitting Projection Algorithm under Weak Conditions.
Beitrag 46 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.46.ps>
Submitted: January 98.
Abstract: We derive the asymptotic distribution of a new backfitting procedure for estimating the closest additive approximation to a nonparametric regression function. The procedure employs a recent projection interpretation of popular kernel estimators provided by Mammen et al. (1997), and the asymptotic theory of our estimators is derived using the theory of additive projections reviewed in Bickel et al. (1995). Our procedure achieves the same bias and variance as the oracle estimator based on knowing the other components, and in this sense improves on the method analyzed in Opsomer and Ruppert (1997). We provide 'high level' conditions independent of the sampling scheme. We then verify that these conditions are satisfied in a time series autoregression under weak conditions.

Mammen, E. : Resampling Methods for Curve Estimation.
Beitrag 45
Submitted: January 1998. To appear in Smoothing and Regression. Approaches, Computation and Application (M. G. Schimek, edit.), Wiley, New York.
Abstract: This article gives an introduction to resampling methods for non- and semiparametric regression. In this field bootstrap approaches have been proposed e.g. for model choice, data adaptive choice of the smoothing parameter (e.g. bandwidth choice), testing and the construction of confidence intervals and bands. We do not aim to give a full overview of all these applications. The aim of this article is to give a first impression of the power of resampling methods in this field.

Mammen, E.; Tsybakov, A. B. : Smooth Discrimination Analysis.
Beitrag 44
Submitted: January 1998.
Abstract: Discriminant analysis for two data sets in R^d with probability densities f and g can be based on the estimation of the set G = { x : f(x) >= g(x) }. We consider applications where it is appropriate to assume that the region G has a smooth boundary. In particular, this assumption makes sense if discriminant analysis is used as a data analytic tool. We discuss optimal rates for estimation of G.

Mammen, E.; Thomas-Agnan, C. : Smoothing Splines and Shape Restrictions.
Beitrag 43
Submitted: January 98. Discussion paper, Sonderforschungsbereich 373, Berlin.
Abstract: Consider a partial linear model, where the expectation of a random variable Y depends on covariates (x,z) through F(theta_0 x + m_0(z)), with theta_0 an unknown parameter, and m_0 an unknown function. We apply the theory of empirical processes to derive the asymptotic properties of the penalized quasi-likelihood estimator.

Franke, J.; Kreiss, J.-P.; Mammen, E. : Bootstrap of Kernel Smoothing in Nonlinear Time Series.
Beitrag 42
Submitted: January 98. Discussion paper, Sonderforschungsbereich 373, Berlin.
Abstract: Kernel smoothing in nonparametric autoregressive schemes offers a powerful tool in modelling time series. In this paper it is shown that the bootstrap can be used for estimating the distribution of kernel smoothers. This can be done by mimicking the stochastic nature of the whole process in the bootstrap resampling or by generating a simple regression model. Consistency of these bootstrap procedures will be shown.

Gijbels, I.; Park, B. U. ; Mammen, E.; Simar, L. : On Estimation of Monotone and Concave Frontier Functions.
Beitrag 41
Submitted: January 98. Discussion paper, Institut de Statistique and CORE, Louvain-la-Neuve.
Abstract: A way for measuring the efficiency of enterprises is via the estimation of the so-called production frontier, which is the upper boundary of the support of the population density in the input and output space. It is reasonable to assume that the production frontier is a concave monotone function. Then, a famous estimator is the data envelopment analysis (DEA) estimator, which is the lowest concave monotone increasing function covering all sample points. This estimator is biased downwards since it never exceeds the true production frontier. In this paper we derive the asymptotic distribution of the DEA estimator, which enables us to assess the asymptotic bias and hence to propose an improved bias corrected estimator. This bias corrected estimator involves consistent estimation of the density function as well as of the second derivative of the production frontier. We also discuss briefly the construction of asymptotic confidence intervals. The finite sample performance of the bias corrected estimator is investigated via a simulation study and the procedure is illustrated for a real data example.

Konakov, V.; Mammen, E. : The Shape of Kernel Density Estimates in Higher Dimensions.
Beitrag 40
Published in: Mathematical Methods of Statistics 6, 440 - 464.
Abstract: Inference on the shape of a density in higher dimensions may be based on shape characteristics of kernel density estimates. In this paper asymptotic theory is offered for the distribution of the number M_n of local extremes. We show that M_n converges in distribution to the number of local extremes of a Gaussian field. Formulas for the asymptotic moments are available. The mathematical analysis is complicated by the fact that the number of local extremes is a discontinuous functional. This is the typical case for shape characteristics. Our mathematical approach is based on Edgeworth expansions of densities of kernel estimates and strong approximations.

Fan,J.; Härdle, W.; Mammen, E. : Direct Estimation of Low Dimensional Components in Additive Models.
Beitrag 38
Submitted: January 98. To appear in Ann. Statist.
Abstract: Additive regression models have turned out to be a useful statistical tool in analyses of high dimensional data sets. Recently, an estimator of additive components has been introduced by Linton and Nielsen (1994) which is based on marginal integration. The explicit definition of this estimator makes possible a fast computation and allows an asymptotic distribution theory. In this paper a modification of this procedure is introduced. We propose to introduce a weight function and to use local linear fits instead of kernel smoothing. These modifications have the following advantages: (i) We demonstrate that with an appropriate choice of the weight function, the additive components can be efficiently estimated: an additive component can be estimated with the same asymptotic bias and variance as if the other components were known. (ii) Application of local linear fits reduces the design related bias.

Mammen, E.; Marron, J.S.: Mass Recentered Kernel Smoothers.
Beitrag 37
Published in: Biometrika 84, 765 - 778
Abstract: The local linear smoother usually has better bias properties than the Nadaraya-Watson smoother. An exception is the case of data sparsity. Here we discuss a modification of the Nadaraya-Watson smoother, due to Müller and Song, based on a horizontal shift of the kernel weights towards the local center of mass of the design points. This gives performance similar to the local linear smoother when that works well, and better performance when it does not. The new smoother also preserves monotonicity. Shifting towards the center of mass is also used to develop a modified kernel density estimate which cancels the well known peak spreading effect.
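One possible reading of the recentering idea, for illustration only: compute the local centre of mass of the design points under the kernel weights at the target point, and recentre the kernel weights there before averaging the responses. The Gaussian kernel and the exact form of the shift in recentered_nw below are assumptions, not the construction of Müller and Song or of the paper.

    import numpy as np

    def recentered_nw(x, y, grid, h):
        """Nadaraya-Watson smoother with kernel weights shifted towards the
        local centre of mass of the design points (illustrative sketch)."""
        x, y, grid = map(np.asarray, (x, y, grid))
        fitted = np.empty(len(grid))
        for i, t in enumerate(grid):
            w = np.exp(-0.5 * ((x - t) / h) ** 2)
            centre = np.sum(w * x) / np.sum(w)                  # local centre of mass
            w_shift = np.exp(-0.5 * ((x - centre) / h) ** 2)    # kernel recentred there
            fitted[i] = np.sum(w_shift * y) / np.sum(w_shift)
        return fitted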

Mammen, E.; Park, B.: Optimal Smoothing in Adaptive Location Estimation.
Beitrag 36
Published in: J. Stat. Plann. Inference 58, 333-348.
Abstract: In this paper the higher order performance of kernel based adaptive location estimators is considered. Optimal choice of smoothing parameters is discussed and it is shown how much is lost in efficiency by not knowing the underlying translation density.

Erlenmaier, U.: A New Criterion for Tightness of Stochastic Processes and an Application to Markov Processes.
Report 14 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.14.ps>
Submitted: October 97. Revised: November 97.
Abstract: We prove a stochastic inequality for the modulus of continuity of a stochastic process U on the real line. It requires certain tail inequalities for the increments of U, refining a criterion of Billingsley (1968). Then this result is used to prove weak convergence of a goodness-of-fit test statistic for simple hypotheses about the conditional median function of a stationary Markovian time series.

Dümbgen; L.; Tyler, D.: On the Breakdown Properties of Two M-Functionals of Scatter.
Report 13 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.13.ps>
Submitted: September 97.
Abstract: The breakdown properties of two M-functionals of scatter are treated. The first functional is Tyler's (1987) M-functional for distributions on q-dimensional space, centered at zero. The second functional is Tyler's M-functional applied to symmetrized distributions. While deriving explicit formulas for the breakdown points, we also investigate the causes of breakdown in detail.

Dümbgen, L.: Symmetrization and Decoupling of Combinatorial Random Elements.
Report 12
Published in: Statistics & Probability Letters 39 (1998), 355-361.
Abstract: New symmetrization and decoupling inequalities are derived for the combinatorial stochastic processes treated in Dümbgen (1994, Beitrag 12).

Härdle, W.; Mammen, E.; Müller, M. : Testing Parametric versus Semiparametric Modelling in Generalized Linear Models.
Beitrag 39 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.39.ps>
Submitted: January 1998. Discussion paper, Sonderforschungsbereich 373, Berlin.
Abstract: We consider a generalized partially linear model E(Y | X,T) = G{ X^T beta + m(T) } where G is a known function, beta is an unknown parameter vector, and m is an unknown function. The paper introduces a test statistic which allows one to decide between a parametric and a semiparametric model: (i) m is linear, i.e. m(t) = t^T gamma for a parameter vector gamma; (ii) m is a smooth (nonlinear) function. Under linearity (i) it is shown that the test statistic is asymptotically normal. Moreover, it is proved that the bootstrap works asymptotically. Simulations suggest that (in small samples) the bootstrap outperforms the calculation of critical values from the normal approximation. The practical performance of the test is shown in applications to data on East-West German migration and credit scoring.

Falguerolles, A. de; Friedrich, F.; Sawitzki, G.: A Tribute to J. Bertin's Graphical Data Analysis.
Beitrag 34 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.34.pdf>
Published in: W. Bandilla, F. Faulbaum (eds.): Advances in Statistical Software 6. Lucius & Lucius, Stuttgart 1997, ISBN 3-8282-0032-X, pp. 11 - 20.
Submitted: March 97.
Abstract: Bertin's permutation matrices give simple and effective tools for the graphical analysis of data matrices or tables. We discuss some abstractions which help understanding Bertin's strategies and can be used in an interactive system.

Dümbgen, L.; Zerial, P.: Remarks on Low-Dimensional Projections of High-Dimensional Distributions
Report 11 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.11.ps>
Submitted: December 96
Abstract: Let P be a probability distribution on q-dimensional space. Necessary and sufficient conditions are derived under which a random d-dimensional projection of P converges weakly to a fixed distribution Q as q tends to infinity, while d is an arbitrary fixed number. This complements a well-known result of Diaconis and Freedman (1984). Further we investigate d-dimensional projections of ^P, where ^P is the empirical distribution of a random sample from P of size n. We prove a conditional Central Limit Theorem for random projections of ^P - P given the data ^P, as q and n tend to infinity.
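A random d-dimensional projection of an n x q sample, the object studied here, can be generated by projecting onto the column space of a QR-orthonormalized Gaussian matrix; the helper below is a plain sketch of that construction, with illustrative names.

    import numpy as np

    def random_projection(sample, d, rng=None):
        """Project an (n x q) data matrix onto a uniformly distributed
        random d-dimensional subspace."""
        rng = np.random.default_rng(rng)
        sample = np.asarray(sample, float)
        q = sample.shape[1]
        G = rng.standard_normal((q, d))
        Q, _ = np.linalg.qr(G)          # orthonormal basis of a random d-subspace
        return sample @ Q               # n x d projected sample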

Dümbgen, L. : New Goodness-of-Fit Tests and their Application to Nonparametric Confidence Sets
Beitrag 32
Published in: Ann. Stat. 1998, Vol. 26, No. 1, 288-314.
Abstract: Suppose one observes a process V on the unit interval, where dV(t) = f(t) dt + dW(t) with an unknown function f and standard Brownian motion W. We propose a particular test of one-point hypotheses about f which is based on suitably standardized increments of V. This test is shown to have desirable consistency properties if, for instance, f is restricted to various Hölder smoothness classes of functions. The test is mimicked in the context of nonparametric density estimation, nonparametric regression and interval censored data. Under shape restrictions on the parameter f such as monotonicity or convexity, we obtain confidence sets for f adapting to its unknown smoothness.

Beran, R.; Dümbgen, L. : Modulation Estimators and Confidence Sets.
Beitrag 31 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.31.ps>
Published in: Annals of Statistics 26 (1998), pp. 1826-1856
Abstract: An unknown signal plus white noise is observed at n discrete time points. Within a large convex class of linear estimators of the signal, we choose the one which minimizes estimated quadratic risk. By construction, the resulting estimator is nonlinear. This estimation is done after orthogonal transformation of the data to a reasonable coordinate system. The procedure adaptively tapers the coefficients of the transformed data. If the class of candidate estimators satisfies a uniform entropy condition, then our estimator is asymptotically minimax in Pinsker's sense over certain ellipsoids in the parameter space and dominates the James-Stein estimator asymptotically. We describe computational algorithms for the modulation estimator and construct confidence sets for the unknown signal. These confidence sets are centered at the estimator, have correct asymptotic coverage probability, and have relatively small risk as set-valued estimators of the signal.

Sawitzki, G.: New Directions in Programming Environments: Extensible Software.
Report 10 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.10.pdf>
Published in: New Directions in Programming Environments: Extensible Software. in: L. Billard, N. I.Fisher (eds.) Computing Science and Statistics. Interface '96. Proceedings of the 28th Symposium on the Interface. The Interface Foundation of North America, Inc., Fairfax Station, VA 22039-7460.1997, ISBN 1-886658-02-1 pp. 317 - 325.
Submitted: June 96
Abstract: If we want software that can be adapted to our needs on the long run, extensibility is a main requirement. For a long time, extensibility has been in conflict with stability and/or efficiency. This situation has changed with recent software technologies. The tools provided by software technology however must be complemented by a design which exploits their facilities for extensibility. We illustrate this using Voyager, a portable data analysis system based on Oberon.

Dahlhaus, R.; Neumann, M.H.; Sachs, R.v.: Nonlinear Wavelet Estimation of Time-Varying Autoregressive Processes.
Report 9
Abstract: We consider nonparametric estimation of the coefficients of a time-varying autoregressive process. Choosing an orthonormal wavelet basis representation of the coefficient functions, the empirical wavelet coefficients are derived from the time series data as the solution of a least squares minimization problem. In order to allow the coefficient functions to be of inhomogeneous regularity, we apply nonlinear thresholding to the empirical coefficients and obtain locally smoothed estimates of the coefficient functions. We show that the resulting estimators attain the usual minimax L_2-rates up to a logarithmic factor, simultaneously in a large scale of Besov classes.

Thumfart, A. : Discrete Evolutionary Spectra and their Application to a Theoryof Pitch Perception.
Beitrag 30 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.30.ps>
Submitted: November 95.
Abstract: A definition of discrete evolutionary spectra is given that complements the notion of evolutionary spectral density given by Dahlhaus (Dahlhaus, R.: Fitting time series models to nonstationary processes. Preprint, Univ. Heidelberg, 1992). For processes that have a discrete evolutionary spectrum, the asymptotic behaviour of linear functionals of the periodogram is investigated. The results are applied in a mathematical analysis of Licklider's theory of pitch perception. A pitch estimator based on this theory is investigated with respect to the shift of the pitch of the residue described by Schouten et al. (Schouten, J.F., Ritsma, R.J., Lopes Cardozo: Pitch of the residue, J. Acoust. Soc. Am. Vol. 34, No. 8, 1962, 1418-1424).

Sawitzki, G. : Extensible Statistical Software: On a Voyage to Oberon.
Report 6 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.06.pdf>
Published in: Journal of Computational and Graphical Statistics Vol. 5, No. 3 (1996). (Replaces G. Sawitzki: An Object-Oriented Portable Extensible Statistical Programming Environment Based on Oberon.)
Abstract: Recent changes in software technology have opened new possibilities for statistical computing. Conditions for creating efficient and reliable extensible systems have been largely improved by programming languages and systems which provide dynamic loading and type-safety across module boundaries, even at run time. We introduce Voyager, an extensible data analysis system based on Oberon, which tries to exploit some of these possibilities.

Müller, D.W. : A Backward-Induction Algorithm for Computing the best Convex Contrast of two Bivariate Samples.
Beitrag 29
Submitted: October 95, to appear in Journal of Computational & Graphical Statistics.
Abstract: For real-valued x(1), x(2), ..., x(n) with real-valued "responses" y(1), y(2), ..., y(n) and "scores" s(1), s(2), ..., s(n) we solve the problem of computing the maximum of C(k) = s(1) I{y(1) >= k(x(1))} + ... + s(n) I{ ... } over all convex functions k on the line. The article describes a recursive relation and an algorithm based on it to compute this value and an optimal k in O(n^3) steps. For a special choice of scores, max C(k) can be interpreted as a generalized (one-sided) Kolmogorov-Smirnov statistic to test for treatment effect in nonparametric analysis of covariance.

Dümbgen, L. : Simultaneous Confidence Sets for Functions of a Scatter Matrix.
Beitrag 19
Published in: J. Multivariate Anal. 1998, Vol 65, No. 1, 19-35.
Abstract: Let Sigma be an unknown covariance matrix. Perturbation (in)equalities are derived for various scale-invariant functionals of Sigma such as correlations (including partial, multiple and canonical correlations) and others in connection with principal component analysis. These results show that a particular confidence set for Sigma is canonical if one is interested in simultaneous confidence bounds for these functionals. The confidence set is based on the ratio of the extreme eigenvalues of Sigma^(-1) S, where S is an estimator for Sigma. Asymptotic considerations for the classical Wishart model show that the resulting confidence bounds are substantially smaller than those obtained by inverting likelihood ratio tests.

Mammen, E. : Bootstrap, Wild Bootstrap and Generalized Bootstrap.
Beitrag 11 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.11.ps>
Submitted: August 93. Revised: June 95.
Abstract: Some modifications and generalizations of the bootstrap procedure have been proposed. In this note we will consider the wild bootstrap and the generalized bootstrap and we will give two arguments why it makes sense to use these modifications instead of the original bootstrap. The first argument is that there exist examples where generalized and wild bootstrap work, but where the original bootstrap fails and breaks down. The second argument will be based on higher order considerations. We will show that the class of generalized and wild bootstrap procedures offers a broad spectrum of possibilities for adjusting higher order properties of the bootstrap.

Giraitis, L.; Robinson, P.M.; Samarov, A.: Rate Optimal Semiparametric Estimation of the Memory Parameter of the Gaussian Time Series with Long Range Dependence.
Beitrag 28 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.28.ps>
Submitted: May 95.
Abstract: There exist several estimators of the memory parameter in long-memory time series models with mean mu and the spectrum specified only locally near zero frequency. In this paper we give a lower bound for the rate of convergence of any estimator of the memory parameter as a function of the degree of local smoothness of the spectral density at zero. The lower bound allows one to evaluate and compare different estimators by their asymptotic behavior, and to claim the rate optimality for any estimator attaining the bound. A log-periodogram regression estimator, analysed by Robinson (1992), is then shown to attain the lower bound, and is thus rate optimal.
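The log-periodogram regression estimator mentioned at the end admits a very short sketch: regress the log periodogram at the first m Fourier frequencies on -2 log(frequency) and read off the slope. The choice of the bandwidth m and the plain least-squares form below are simplifications, not the version analysed in the paper.

    import numpy as np

    def log_periodogram_memory(x, m):
        """Log-periodogram regression estimate of the memory parameter d
        from the first m Fourier frequencies (illustration only)."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        freqs = 2 * np.pi * np.arange(1, m + 1) / n
        I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
        X = np.column_stack([np.ones(m), -2 * np.log(freqs)])
        coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
        return coef[1]                   # slope = estimated memory parameter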

Eichler, M. : Empirical Spectral Processes and their Applications to Stationary Point Processes.
Beitrag 26 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.26.ps>
Published in: Annals of Applied Probability 5 (1995), 1161-1176.
Abstract: We consider empirical spectral processes indexed by classes of functions for the case of stationary point processes. Conditions for the measurability and equicontinuity of these processes and a weak convergence result are established. The results can be applied to the spectral analysis of point processes. In particular, we discuss the application to parametric and nonparametric spectral density estimation.

Sawitzki, G. : Diagnostic Plots for One-Dimensional Data.
Report 8 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.08.pdf>
Published in: Computational Statistics. Papers collected on the Occasion of the 25th Conference on Statistical Computing at Schloss Reisensburg. (Edited by P. Dirschedl & R. Ostermann for the Working Groups ... ...) Heidelberg, Physica, 1994, ISBN 3-7908-0813-x, p. 237-258.
Abstract: How do we draw a distribution on the line? We give a survey of some well known and some recent proposals to present such a distribution, based on sample data. We claim: a diagnostic plot is only as good as the hard statistical theory that is supporting it. To make this precise, one has to ask for the underlying functionals, study their stochastic behaviour and ask for the natural metrics associated to a plot. We try to illustrate this point of view for some examples.

Dümbgen, L. : The Asymptotic Behavior of Tyler's M-Estimator of Scatter in High Dimension.
Beitrag 23 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.23.ps>
Submitted: December 94. Revised: May 97. To appear (partly) in Ann. Inst. Statist. Math. 50 (1998), pp. 471-491
Abstract: It is shown that Tyler's (1987) M-functional of scatter, which is a robust surrogate for the covariance matrix of a distribution on R^p, is Fréchet-differentiable with respect to the weak topology. This property is derived in an asymptotic framework, where the dimension p may tend to infinity. If applied to the empirical distribution of n i.i.d. random vectors with elliptically symmetric distribution, the resulting estimator has the same asymptotic behavior as the sample covariance matrix in a normal model, provided that p tends to infinity and p/n tends to zero.

Dahlhaus, R. : Maximum Likelihood Estimation and Model Selection for Nonstationary Processes.
Report 7
Published in: J. Nonparam. Statist. 6 (1996), 171 - 191.
Abstract: The Gaussian maximum likelihood estimate is investigated for time series models that have locally a stationary behaviour (e.g. for time varying autoregressive models). The asymptotic properties are studied in the case where the fitted model is either correct or misspecified. For example the behaviour of the maximum likelihood estimate is explained in the case where a stationary model is fitted to a nonstationary process. As a general model selection criterion the AIC is considered. It can for example automatically select between stationary models, nonstationary models and deterministic trends.

Dahlhaus, R. : On the Kullback-Leibler Information Divergence of Locally Stationary Processes.
Beitrag 27
Published in: Stochastic Processes and their Applications 62 (1996), 139-168.
Abstract: A class of processes with a time varying spectral representation is introduced. A time varying spectral density is defined and a uniqueness property of this spectral density is established. As an example we study time varying autoregressions. Several results on the asymptotic norm and trace behaviour of covariance matrices of such processes are derived. As a consequence we prove a Kolmogorov formula for the local prediction error and calculate the asymptotic Kullback-Leibler information divergence.

Dahlhaus, R.; Janas, D. : Efron's Bootstrap for Ratio Statistics in Time Series Analysis.
Beitrag 13 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.13.ps>
Published in: The Annals of Statistics (1996), Vol. 24, No. 5, p. 1934-1963.
Abstract: We prove that Efron's bootstrap applied to the sample of studentized periodogram ordinates works quite well for ratio statistics, e.g. estimates for the autocorrelations. The bootstrap approximation for the distribution of these statistics is accurate to the order o(1/SQRT(T)) a.s. As a consequence this result carries over to the Whittle estimates. Some simulation studies are reported for a medium-sized stretch of a time series.

Dahlhaus, R. : Fitting Time Series Models to Nonstationary Processes.
Beitrag 4 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.04.ps>
Published in: The Annals of Statistics (1997), Vol. 25, No. 1, 1-37.
Abstract: A general minimum distance estimation procedure is presented for nonstationary time series models that have an evolutionary spectral representation. The asymptotic properties of the estimate are derived under the assumption of possible model misspecification. For autoregressive processes with time varying coefficients the estimate is compared to the least squares estimate. Furthermore, the behaviour of estimates is explained when a stationary model is fitted to a nonstationary process.

Giraitis, L.; Leipus, R.; Surgailis, D. : The Change-point Problem for Dependent Observations.
Beitrag 25 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.25.ps>
Submitted: December 94.
Abstract: We consider the change-point problem for the marginal distribution function of a strictly stationary time series. Asymptotic behavior of Kolmogorov-Smirnov type tests and estimators of the change point is studied under the null-hypothesis and converging alternatives. The discussion is based on a general empirical-process approach which enables a unified treatment of both short memory (weakly dependent) and long memory time series. In particular, the case of a long memory moving average process is studied, using recent results of Giraitis and Surgailis (1994).

Giraitis, L.; Surgailis, D. : A Central Limit Theorem for the Empirical Process of a Long Memory Linear Sequence.
Beitrag 24 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.24.ps>
Submitted: December 94.
Abstract: A central limit theorem for the normalized empirical process, based on a (non-Gaussian) moving average sequence X_t, t in Z, with long memory, is established, generalizing the results of Dehling and Taqqu (1989). The proof is based on the (Appell) expansion 1(X_t <= x) = F(x) + f(x) X_t + ... of the indicator function, where F(x) = P[X_t <= x] is the marginal distribution function, f(x) = F'(x), and the covariance of the remainder term decays faster than the covariance of X_t. As a consequence, the limit distribution of M-functionals and U-statistics based on such long memory observations is obtained.

Beran, R. : Bootstrap Variable-Selection and Confidence Sets.
Beitrag 22 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.22.ps>
Submitted: November 94.
Abstract: This paper analyzes estimation by bootstrap variable-selection in a simple Gaussian model where the dimension of the unknown parameter may exceed that of the data. A naive use of the bootstrap in this problem produces risk estimators for candidate variable-selections that have a strong upward bias. Resampling from a less overfitted model removes the bias and leads to bootstrap variable-selections that minimize risk asymptotically. A related bootstrap technique generates confidence sets that are centered at the best bootstrap variable-selection and have two further properties: the asymptotic coverage probability for the unknown parameter is as desired; and the confidence set is geometrically smaller than a classical competitor. The results suggest a possible approach to confidence sets in other inverse problems where a regularization technique is used.

Dahlhaus, R.; Wefelmeyer, W.: Asymptotically Optimal Estimation in Misspecified Time Series Models.
Beitrag 21 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.21.ps>
Published in: Ann. Statist. 24, 952-974.
Abstract: A concept of asymptotically efficient estimation is presented when a misspecified parametric time series model is fitted to a stationary process. Efficiency of several minimum distance estimates is proved and the behavior of the Gaussian maximum likelihood estimate is studied. Furthermore, the behavior of estimates that minimize the h-step prediction error is discussed briefly. The paper answers to some extent the question what happens when a misspecified model is fitted to time series data and one acts as if the model were true.

Dümbgen, L. : Likelihood Ratio Tests for Principal Components.
Report 4
Published in: J. Multivariate Anal. 52 (1995), p. 245-258
Abstract: A particular class of tests for the principal components of a scatter matrix Sigma is proposed. In the simplest case one wants to test whether a given vector is an eigenvector of Sigma corresponding to its largest eigenvalue. The test statistics are likelihood ratio statistics for the classical Wishart model, and critical values are obtained parametrically as well as nonparametrically without making any assumptions on the eigenvalues of Sigma. Still the tests have similar asymptotic properties as classical procedures and are asymptotically admissible and optimal in some sense.

Sawitzki, G. : Testing Numerical Reliability of Data Analysis Systems.
Report 1 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.01.pdf>
Published in: Computational Statistics and Data Analysis 18.2(1994), p. 269-301.
Abstract: From 1990 to 1993, a series of tests on numerical reliability of data analysis systems has been carried out. The tests are based on L. Wilkinson's "Statistics Quiz". Systems under test included BMDP, Data Desk, Excel, GLIM, ISP, SAS, SPSS, S-PLUS, STATGRAPHICS. The results show considerable problems even in basic features of well-known systems. For all our test exercises, the computational solutions are well known. The omissions and failures observed here give some suspicions of what happens in less well-understood problem areas of computational statistics. We cannot take results of data analysis systems at face value, but have to submit them to a large amount of informed inspection. Quality awareness still needs improvement.
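A small check in the spirit of the quiz (not one of the original test data sets): the sample variance of large, nearly equal numbers separates systems that use a naive one-pass formula from those that compute it stably.

    import numpy as np

    # Sample variance of large, nearly equal numbers. The naive one-pass
    # formula loses the true value 2.5 to catastrophic cancellation;
    # the two-pass formula recovers it.
    x = 1e9 + np.arange(1.0, 6.0)

    naive = (np.dot(x, x) - x.sum() ** 2 / len(x)) / (len(x) - 1)
    twopass = np.sum((x - x.mean()) ** 2) / (len(x) - 1)

    print(naive, twopass)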

Polonik, W. : Minimum Volume Sets and Generalized Quantile Processes.
Beitrag 20
Published in: Stochastic Processes and Appl. (1997), 69, 1-24.
Abstract: Minimum volume sets in classes C of subsets of the d-dimensional Euclidean space can be used as estimators of level sets of a density. By using empirical process theory, consistency results and rates of convergence for minimum volume sets are given which depend on entropy conditions on C. The volume of the minimum volume sets itself, which can be used for robust estimation of scale, can be considered as a generalized quantile process in the sense of Einmahl and Mason (1992). Bahadur-Kiefer approximations for generalized quantile processes are given which generalize classical results on the one-dimensional quantile process. Rates of convergence of minimum volume sets can be used to obtain Bahadur-Kiefer approximations and vice versa. A generalization of the minimum volume approach to regression problems and spectral analysis is presented.
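In one dimension, with C the class of intervals, the empirical minimum volume set at level alpha is simply the shortest interval containing at least a fraction alpha of the sample; the helper below is this special case only, with illustrative names.

    import numpy as np

    def minimum_volume_interval(x, alpha):
        """Shortest interval containing at least a fraction alpha of the
        sample (one-dimensional minimum volume set over intervals)."""
        xs = np.sort(np.asarray(x, float))
        n = len(xs)
        k = int(np.ceil(alpha * n))               # points the interval must cover
        widths = xs[k - 1:] - xs[:n - k + 1]
        i = int(np.argmin(widths))
        return xs[i], xs[i + k - 1]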

Dümbgen, L. : A Simple Proof and Refinement of Wielandt's Eigenvalue Inequality.
Report 5
Published in: Statistics & Probability Letters 25 (1995), 113-115.
Abstract: Wielandt (1967) proved an eigenvalue inequality for partitioned symmetric matrices, which turned out to be very useful in statistical applications. A simple proof yielding sharp bounds is given.

Hainz, G. : The Asymptotic Properties of Burg Estimators.
Beitrag 18 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.18.ps>
Submitted: January 94.
Abstract: There are estimators for multivariate autoregressive models which are regarded as multivariate versions of Burg's univariate estimator. For two of these multivariate Burg estimators the asymptotic equivalence with the Yule-Walker estimator is established in this paper, so central limit theorems for the Yule-Walker estimator extend to these estimators. Furthermore, the asymptotic bias of the univariate Burg estimator to terms of 1/n is shown to be the same as the bias of the least-squares estimator; n is the number of observations. The main results are true even for mis-specified models.

Giraitis, L.; Leipus, R. : A Generalized Fractionally Differencing Approach in Long-Memory Modelling.
Beitrag 17 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.17.ps>
Submitted: November 93.
Abstract: We extend the class of known fractional ARIMA models to the class of generalized ARIMA models which allows the generation of long-memory time series with long-range periodical behaviour at a finite number of spectrum frequencies. The exact asymptotics of the covariance function and the spectrum at the points of peaks and zeroes are given. For obtaining asymptotic expansions, Gegenbauer polynomials are used. Consistent parameter estimating is discussed using Whittle's estimate.

Dümbgen, L. : Minimax Tests for Convex Cones.
Beitrag 16
Published in: Ann. Inst. Statist. Math. 47 (1995), p. 155-165.
Abstract: Let (P_t : t in R^p) be a simple shift family of distributions on R^p, and let K be a convex cone in R^p. Within the class of nonrandomized tests of K versus R^p \ K, whose acceptance region A satisfies A = A + K, tests with minimal bias are constructed. They are compared to likelihood ratio type tests, which are optimal with respect to a different criterion. The minimax tests are mimicked in the context of linear regression and one-sided tests for covariance matrices.

Sawitzki, G. : The NetWork Project: Asynchronous Distributed Computing on Personal Workstations.
Report 3 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.03.pdf>
Published in: develop 11 (Aug. 1992), p. 82-105.
Abstract: NetWork is an experiment in distributed computing. The idea is to make use of idle time on personal workstations while retaining their advantages of immediate and guaranteed availability. NetWork wants to make use of otherwise idle resources only. The performance criterion of NetWork is the net work done per unit time - not computing time or other measures of resource utilization. The NetWork model provides corresponding programming primitives for distributed computing. An implementation of a distributed asynchronous neural net serves as test application.

Polonik, W. : Density Estimation under Qualitative Assumptions in Higher Dimensions.
Beitrag 15 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.15.ps>
Published in: J. Multivariate Anal. 55, No. 1 (1995), 61-81.
Abstract: We study a method for estimating a density f in the d-dimensional Euclidean space under assumptions which are of qualitative nature. The resulting density estimator can be considered as a generalization of the Grenander estimator for monotone densities. The assumptions on f are given in terms of the density contour clusters. We assume that the density contour clusters lie in a given class of measurable subsets of the d-dimensional Euclidean space. By choosing this class appropriately it is possible to model, for example, monotonicity, symmetry or multimodality. The main mathematical tool for proving consistency and rates of convergence of the density estimator is empirical process theory.
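In the notation usually used for this setting (a sketch, not quoted from the paper), the density contour clusters are the level sets

  \Gamma_f(\lambda) = \{x \in \mathbb{R}^d : f(x) \ge \lambda\}, \qquad \lambda > 0,

and the qualitative assumption is that \Gamma_f(\lambda) belongs to a prespecified class \mathcal{C} of sets for every \lambda; choosing \mathcal{C} appropriately (for example convex sets, or unions of at most k convex sets) encodes shape constraints such as unimodality or bounded multimodality.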

Polonik, W. : Measuring Mass Concentration and Estimating Density Contour Clusters - an Excess Mass Approach.
Beitrag 7 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.07.ps>
Published in: Annals of Statistics, 1995, Vol. 23, No. 3, 855-881.
Abstract: By using empirical process theory we study a method addressed to testing for multimodality and estimating density contour clusters in higher dimensions. The method is based on the so-called excess mass. Given a probability measure F and a class of sets in the d-dimensional Euclidean space, the excess mass is defined as the maximal difference between the F-measure and lambda times the Lebesgue measure of sets in the given class. The excess mass can be estimated by replacing F by the empirical measure. The corresponding maximizing sets can be used for estimating density contour clusters. Comparing excess masses over different classes yields information about the modality of the underlying probability measure. This can be used to construct tests for multimodality. The asymptotic behaviour of the considered estimators and test statistics is studied for different classes of sets, including the classes of balls, ellipsoids and convex sets.
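Written out (my notation, consistent with the abstract's description), the excess mass of a class \mathcal{C} at level \lambda > 0 and its empirical counterpart are

  E_{\mathcal{C}}(\lambda) = \sup_{C \in \mathcal{C}} \{ F(C) - \lambda\,\mathrm{Leb}(C) \},
  \qquad
  E_{n,\mathcal{C}}(\lambda) = \sup_{C \in \mathcal{C}} \{ F_n(C) - \lambda\,\mathrm{Leb}(C) \},

with F_n the empirical measure. If F has density f, the first supremum is attained at the contour cluster \{f \ge \lambda\} whenever that set lies in \mathcal{C}, which is why the maximizing sets estimate density contour clusters.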

Beran, R. : Seven Stages of Bootstrap.
Report 2 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/report.02.ps>
Published in: "Computational Statistics" Papers collected on the Occasionof the 25th Conference on Statistical Computing at Schloss Reisensburg.(Edited by P.Dirschedl & R.Ostermann for the Working Groups ... )Heidelberg, Physica, 1994, isbn 3-7908-0813-x, p. 143-157.
Abstract: This essay is organized around the theoretical and computational problem of constructing bootstrap confidence sets, with forays into related topics. The seven section headings are: Introduction; The Bootstrap World; Bootstrap Confidence Sets; Computing Bootstrap Confidence Sets; Quality of Bootstrap Confidence Sets; Iterated and Two-step Bootstrap; Further Resources.
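As a reminder of the computational task the essay is organized around, here is a minimal sketch of a basic percentile bootstrap interval (illustrative only, not code from the paper):

  import numpy as np

  rng = np.random.default_rng(0)

  def percentile_ci(x, stat=np.mean, B=2000, alpha=0.05):
      """Percentile bootstrap confidence interval for stat(x) (illustrative)."""
      x = np.asarray(x)
      reps = np.array([stat(rng.choice(x, size=len(x), replace=True))
                       for _ in range(B)])
      return tuple(np.quantile(reps, [alpha / 2, 1 - alpha / 2]))

  x = rng.exponential(size=50)          # toy data
  print(percentile_ci(x))               # approximate 95% interval for the mean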

Janas, D.; Sachs, R.v.: Consistency for Non-Linear Functions of the Periodogram of Tapered Data.
Beitrag 14 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.14.ps>
Published in: Journal of Time Series Analysis 16 (1995), 585-606.
Abstract: We investigate the merits of using a data taper in non-linear functionals of the periodogram of a stationary time series. We show consistency for a general class of statistics by the use of Edgeworth expansion theory.
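For reference (standard notation, not quoted from the paper), the tapered periodogram with taper h is usually defined as

  I_n^h(\omega) = \frac{1}{2\pi \sum_{t=1}^{n} h(t/n)^2}\,
                  \Big| \sum_{t=1}^{n} h(t/n)\, X_t\, e^{-i\omega t} \Big|^2,

and the statistics considered in the paper are non-linear functionals built from it.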

Dümbgen, L. : Combinatorial Stochastic Processes.
Beitrag 12
Published in: Stoch. Proc. Appl. 52 (1994), p. 75-92.
Abstract: Well-known results for sums of independent stochastic processes are extended to processes f_{1,P(1)} + f_{2,P(2)} + ... + f_{n,P(n)}, where f = (f_{i,j} : 1 <= i,j <= n) is a collection of independent stochastic processes f_{i,j} on some set T, and P is a random permutation of {1, 2, ..., n} such that f and P are independent. The general results, a uniform Law of Large Numbers and a functional Central Limit Theorem, are applied to permutation processes and randomized trials.

Beran, R.: Stein Estimation in High Dimensions and the Bootstrap.
Beitrag 1 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.01.ps>
Submitted: December 92. Revised: August 93.
Abstract: The Stein estimator and the better positive-part Stein estimator both dominate the sample mean, under quadratic loss, in the standard multivariate model of dimension q. Standard large sample theory does not explain this phenomenon well. Plausible bootstrap estimators for the risk of the Stein estimator do not converge correctly at the shrinkage point as sample size n increases. By analyzing a submodel exactly, with the help of results from directional statistics, and then letting dimension q go to infinity, we find:
a) In high dimensions, the Stein and positive-part Stein estimators are approximately admissible and approximately minimax on large compact balls about the shrinkage point. The sample mean is neither.
b) A new estimator, asymptotically equivalent as dimension q tends to infinity, appears to dominate the positive-part Stein estimator slightly for finite q.
c) Resampling from a fitted standard multivariate normal distribution in which the length of the fitted mean vector estimates the length of the true mean vector well is the key to consistent bootstrap risk estimation for Stein estimators.
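For concreteness (a sketch of the simplest known-covariance case, not necessarily the exact model of the paper): with \bar X the mean of n observations from N_q(\mu, I), the Stein and positive-part Stein estimators are

  \hat\mu_{S} = \Big(1 - \frac{q-2}{n\,\lVert \bar X \rVert^2}\Big)\,\bar X,
  \qquad
  \hat\mu_{S+} = \Big(1 - \frac{q-2}{n\,\lVert \bar X \rVert^2}\Big)_{+}\,\bar X,

where (\cdot)_+ denotes the positive part; the "shrinkage point" referred to in the abstract is \mu = 0, toward which both estimators shrink.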

Geer, S. van de; Mammen, E. : Locally Adaptive Regression Splines.
Beitrag 10
Submitted: July 93.
Abstract: In this paper least squares penalized regression estimates with total variation penalties are considered. It is shown that these estimators are least squares splines with knot points placed in a locally data-adaptive way. Algorithms and asymptotic properties are discussed.
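A sketch of the type of estimator meant (my notation; the paper may state it differently): for a penalty on the total variation of the k-th derivative,

  \hat f = \operatorname*{argmin}_{f}\; \sum_{i=1}^{n} \big(Y_i - f(x_i)\big)^2 + \lambda\, \mathrm{TV}\big(f^{(k)}\big),

and the point is that the minimizer is a spline whose knots are not fixed in advance but chosen by the data, so the estimator can adapt to locally varying smoothness.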

Janas, D. : Edgeworth Expansions for Spectral Mean Estimates with Applications to Whittle Estimates.
Beitrag 9 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.09.ps>
Submitted: July 93.
Abstract: We prove that the distributions of spectral mean estimates from linear processes admit Edgeworth expansions. As a consequence, Edgeworth expansions are valid for Whittle estimates.
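As a reminder of the objects involved (standard definitions, not quoted from the paper), a spectral mean of a process with spectral density f and its estimate based on the periodogram I_n are

  A(\phi) = \int_{-\pi}^{\pi} \phi(\omega)\, f(\omega)\, d\omega,
  \qquad
  \hat A(\phi) = \int_{-\pi}^{\pi} \phi(\omega)\, I_n(\omega)\, d\omega;

Whittle estimates minimize a spectral contrast built from integrals of this form over a parametric family, which is one way the expansions can carry over.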

Dahlhaus, R. : Statistical Methods in Spectral Estimation.
Beitrag 2 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.02.ps>
Submitted: December 92. Revised: July 93.
Abstract: The paper gives an overview of the work of the Teilprojekt B2 on spectral estimation for time series during the period 1988-1992. High resolution spectral estimates are introduced and the role of data tapers is discussed. Parametric models such as ARMA models are fitted and judged by frequency domain methods. Furthermore, a method for the detection of hidden frequencies is discussed. The methods are illustrated by simulations.

Ehm, W.; Mammen, E.; Müller, D.W. : Power Robustification of Approximately Linear Tests.
Beitrag 8
Submitted: June 93.
Abstract: We present a general method of improving the power of linear and approximately linear tests when deviations from a translation family of distributions have to be taken into account. It consists of combining a linear statistic measuring location with a quadratic statistic measuring change of shape of the underlying distribution. The resulting tests ("funnel tests") in general gain a sizeable amount of power over the linear tests adapted to the translation family. This can be understood qualitatively by an analytic argument and visualized quantitatively by Monte Carlo simulations. In a simulation study the funnel tests are also compared with other non-linear tests.

Grahn, T. : A Conditional Least Squares Approach to Bilinear Time Series Estimation.
Beitrag 6 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.06.ps>
Submitted: April 93.
Abstract: In this paper we develop a Conditional Least Squares (CLS) procedure for estimating bilinear time series models. We apply this method to two general types of bilinear models. A model of type I is a special superdiagonal bilinear model which includes the linear ARMA model as a submodel. A model of type II is a standardized version of the popular bilinear BL(p,0,p,1) model (see e.g. Liu and Chen (1990), Sesay and Subba Rao (1991)). For both models we show that the limiting distribution of the resulting CLS estimates is Gaussian and that the law of the iterated logarithm holds.
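For orientation (one common parametrization, not necessarily the standardized version used in the paper), a BL(p,0,p,1) model and the CLS criterion can be written as

  X_t = \sum_{j=1}^{p} a_j X_{t-j} + \sum_{j=1}^{p} b_j X_{t-j}\,\varepsilon_{t-1} + \varepsilon_t,
  \qquad
  \hat\theta_{CLS} = \operatorname*{argmin}_{\theta}\; \sum_{t} \big( X_t - E_\theta[X_t \mid \mathcal{F}_{t-1}] \big)^2,

with \mathcal{F}_{t-1} the past of the series; a pure AR(p) model is recovered when all bilinear coefficients b_j vanish.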

Hjellvik, V.; Tjostheim, D. : Nonparametric Tests for Linearity for Time Series.
Beitrag 5
Published in: Biometrika 82 (1995), 351-368.
Abstract: We introduce tests of linearity for time series based on nonparametric estimates of the conditional mean and the conditional variance. The tests are compared to a number of parametric tests and to nonparametric tests based on the bispectrum. Asymptotic expressions give poor approximations, so the null distribution under linearity is constructed using resampling of the best linear approximation. The new tests perform well on the examples tested.

Müller, D.W. : The Excess Mass Approach in Statistics.
Beitrag 3 <ftp://statlab.uni-heidelberg.de/pub/reports/by.series/beitrag.03.ps>
Submitted: December 92.
Abstract: The basic idea of the excess mass approach is to measure the amount of probability mass not fitting a given statistical model. It first came up in the context of testing for a treatment effect, and was later applied to inference about the modality of a distribution and to density estimation. Recently the framework has been extended to regression problems. In this survey article we describe the idea and summarize the main results.



===================================================
FTP access: Anonymous ftp to
  Server:    statlab.uni-heidelberg.de  [129.206.113.100]
  Directory: ~/pub/reports
To retrieve a compressed copy, switch to binary file transfer
and add .Z to the file name (e.g. report01.Z instead of report01).
===================================================
File name conventions:
  ~.ps    Postscript file. Download to a Postscript printer, or use Ghostscript to view.
  ~.pdf   Portable document format. Use Adobe Reader version 4 or better.
  ~.Z     compressed with compress
  ~.gz    compressed with gzip
===================================================