Title

Asymptotic properties of generalized kernel density estimators

Date of Completion

January 2008

Keywords

Mathematics|Statistics

Degree

Ph.D.

Abstract

The best mean squared error that the classical kernel density estimator can achieve when the kernel is non-negative and the density $f$ has only two continuous derivatives is of order $n^{-4/5}$. If kernels taking negative values are allowed, this rate can be improved, depending on the smoothness of $f$ and the order of the kernel.

Abramson and others modified the classical kernel estimator, with the kernel assumed non-negative, by allowing the bandwidth $h_n$ to depend on the data. The sharpest result in the literature is due to Hall, Hu and Marron, who show that under suitable assumptions on a non-negative kernel $K$ and the density $f$, $|\hat f_n(t) - f(t)| = O_P(n^{-4/9})$ for fixed $t$. The main result of this thesis states that $\sup_{D_n} |\hat f_n(t) - f(t)| = O_P\big((\log n / n)^{4/9}\big)$, where $D_n$ and $\hat f_n(t)$ are purely data driven and $D_n$ can be taken as close as desired to the set $\{t : f(t) > 0\}$. This rate is the best possible for estimating a density in the sup norm. The data-driven $\hat f_n(t)$ and $D_n$ have 'ideal' counterparts that depend on $f$, and for the ideal estimator slightly sharper results are proven.
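For orientation, a sketch of the estimators referred to above is given below in standard notation; the i.i.d. sample $X_1, \dots, X_n$ and the square-root form of Abramson's variable bandwidth are the conventional choices in this literature and are supplied here for context rather than quoted from the thesis.

% Classical (fixed-bandwidth) kernel density estimator: kernel K, bandwidth h_n,
% i.i.d. sample X_1, ..., X_n drawn from the density f.
\[
  \hat f_n(t) \;=\; \frac{1}{n h_n} \sum_{i=1}^{n} K\!\left(\frac{t - X_i}{h_n}\right).
\]
% With K >= 0 and f twice continuously differentiable, the mean squared error is
% of order n^{-4/5} for the optimal bandwidth choice h_n of order n^{-1/5}.

% Abramson-type variable-bandwidth estimator: the bandwidth at each observation
% is inflated where the density is low (square-root law, bandwidth h_n / f(X_i)^{1/2}).
\[
  \tilde f_n(t) \;=\; \frac{1}{n h_n} \sum_{i=1}^{n} f(X_i)^{1/2}\,
      K\!\left(\frac{(t - X_i)\, f(X_i)^{1/2}}{h_n}\right).
\]
% In the data-driven version, f is replaced by a pilot estimate; constructions of
% this type underlie the n^{-4/9} pointwise rate of Hall, Hu and Marron.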
