On robust estimation via pseudo-additive information
by Dr Davide Ferrari
Abstract: In this talk, I discuss a parameter estimator that minimizes an empirical version of the q-entropy, a generalized information measure obtained by relaxing the additivity axiom of Shannon entropy. I show that the estimator is equivalent to minimizing the family of power divergences, given a simple transformation of the target parametric model. The impact of anomalous data on the accuracy of the estimator depends on the constant q, which is adjusted to tune the trade-off between efficiency and robustness. If q is close to 1, the procedure corresponds to minimizing the Kullback-Leibler divergence; if q = 1/2, we obtain a fully parametric version of Hellinger distance minimization. An upper bound on the estimator's mean squared error in the presence of contamination is obtained by computing a multiparameter generalization of the change-of-variance function, which serves as a min-max criterion for selecting q. This approach improves on classical methods based on power divergences because it avoids kernel density smoothing, thus permitting the treatment of multidimensional problems. Extensions to high-dimensional settings using sparsity-inducing penalization strategies are also presented. The procedure is illustrated with examples concerning generalized linear mixed models and time-series models.
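One common way to realize this kind of q-entropy estimator is via an Lq-likelihood, in which the log-density of the usual likelihood is replaced by the transform L_q(u) = (u^(1-q) - 1)/(1 - q), recovering log(u) as q tends to 1. The sketch below, for a normal location parameter with known scale, is purely illustrative and not taken from the talk: the function names, the fixed scale, the bounded optimizer, and the contamination setup are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lq(u, q):
    """Lq transform: (u**(1-q) - 1)/(1 - q); tends to log(u) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

def mlq_location(x, q, scale=1.0):
    """Maximum Lq-likelihood estimate of a normal location (scale held fixed)."""
    def neg_lq_lik(mu):
        dens = np.exp(-0.5 * ((x - mu) / scale) ** 2) / (scale * np.sqrt(2 * np.pi))
        return -np.sum(lq(dens, q))
    med = np.median(x)  # start the bounded search near the bulk of the data
    res = minimize_scalar(neg_lq_lik,
                          bounds=(med - 5 * scale, med + 5 * scale),
                          method="bounded")
    return res.x

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 200)
contaminated = np.concatenate([clean, np.full(20, 10.0)])  # ~10% outliers at 10

mle = mlq_location(contaminated, q=1.0)     # q -> 1: ordinary MLE, dragged by outliers
robust = mlq_location(contaminated, q=0.8)  # q < 1 downweights low-density points
print(f"q=1.0 estimate: {mle:.3f}, q=0.8 estimate: {robust:.3f}")
```

The estimating equation weights each observation's score by f(x; theta)^(1-q), so with q < 1 outlying (low-density) points are downweighted, while q = 1 reproduces the maximum likelihood estimate; this is the efficiency-robustness trade-off that the constant q tunes.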
For more information: please contact Prof Richard Huggins (email: firstname.lastname@example.org).