Some simple approximations to Kullback-Leibler Divergence with applications to equivalence testing
by Professor Robert G. Staudte
Abstract: Quite generally for one-parameter families, the mean of a variance-stabilized test statistic is approximately equal to the signed square root of the Kullback-Leibler divergence between the null and alternative distributions.
This result strengthens the case that a statistic whose variance is stabilized to one is a measure of evidence for the alternative with known standard normal error, which facilitates calibration and interpretation.
Its mean function factors into the product of the square root of the sample size and the Key Inferential Function, from which accurate power functions and reliable confidence intervals are readily obtained. Moreover, such statistics lend themselves to meta-analyses with stable weights. Sometimes it is possible to derive a simple formula for the Kullback-Leibler divergence, but when it is not, such as for the non-central Chi-squared or non-central F models, the Key Inferential Function provides a useful and simple approximation. Applications to equivalence testing illustrate these results.
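As a rough numerical sketch (not part of the abstract), the stated relationship can be checked in the normal location model, where both the variance-stabilized statistic and the Kullback-Leibler divergence have closed forms. One caveat on conventions: with the standard definition of the divergence, the mean matches the signed square root of twice the n-sample divergence; the abstract's phrasing presumably absorbs that factor into its notion of Kullback-Leibler distance. The Poisson comparison below uses the square-root stabilization, a standard textbook example chosen here for illustration, and compares its Key Inferential Function K(λ) = 2(√λ − √λ0) against the signed root of twice the per-observation divergence.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Normal location model: X_i ~ N(theta, 1), null value theta0 ---
theta0, theta, n = 0.0, 0.5, 50

# Variance-stabilized statistic T = sqrt(n) * (xbar - theta0);
# in this model T is exactly N(sqrt(n) * (theta - theta0), 1).
xbar = rng.normal(theta, 1.0, size=(200_000, n)).mean(axis=1)
T = np.sqrt(n) * (xbar - theta0)

# Per-observation KL divergence KL(N(theta,1) || N(theta0,1)).
kl = (theta - theta0) ** 2 / 2
# Signed square root of the n-sample divergence (factor-2 convention).
signed_root = np.sign(theta - theta0) * np.sqrt(2 * n * kl)

print(T.mean(), signed_root)  # both close to sqrt(50)*0.5 = 3.5355...

# --- Poisson model: Key Inferential Function vs signed sqrt(2*KL) ---
lam0 = 1.0
lams = np.linspace(0.6, 1.8, 7)
K = 2 * (np.sqrt(lams) - np.sqrt(lam0))          # sqrt stabilization
kl_pois = lams * np.log(lams / lam0) - lams + lam0
root_pois = np.sign(lams - lam0) * np.sqrt(2 * kl_pois)
print(np.max(np.abs(K - root_pois)))             # small over this range
```

In the normal case the agreement is exact up to simulation noise; in the Poisson case the Key Inferential Function tracks the signed root of the divergence to within a few hundredths over moderate alternatives, which is the sense in which it serves as a simple approximation.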
For More Information: Contact Farshid Jamshidi, e-mail: firstname.lastname@example.org