A Divergence Formula for Randomness and Dimension (Short Version)

By: Jack H. Lutz

Format: Article
Published: Open Publishing Association, 2009-06-01

Description

If $S$ is an infinite sequence over a finite alphabet $\Sigma$ and $\beta$ is a probability measure on $\Sigma$, then the {\it dimension} of $S$ with respect to $\beta$, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension $\dim(S)$ when $\beta$ is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\mathrm{Dim}^\beta(S)$, the {\it strong dimension} of $S$ with respect to $\beta$, can be used in conjunction with randomness to measure the similarity of two probability measures $\alpha$ and $\beta$ on $\Sigma$. Specifically, we prove that the {\it divergence formula} \[ \dim^\beta(R) = \mathrm{Dim}^\beta(R) = \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha) + \mathcal{D}(\alpha \| \beta)} \] holds whenever $\alpha$ and $\beta$ are computable, positive probability measures on $\Sigma$ and $R \in \Sigma^\infty$ is random with respect to $\alpha$. In this formula, $\mathcal{H}(\alpha)$ is the Shannon entropy of $\alpha$, and $\mathcal{D}(\alpha \| \beta)$ is the Kullback-Leibler divergence between $\alpha$ and $\beta$.
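
To make the formula concrete, here is an illustrative calculation (not from the paper; the specific measures are chosen only for the example). Take $\Sigma = \{0,1\}$, let $\alpha$ be the uniform (fair-coin) measure, and let $\beta(0) = 1/4$, $\beta(1) = 3/4$. With logarithms to base 2, \[ \mathcal{H}(\alpha) = 1, \qquad \mathcal{D}(\alpha \| \beta) = \tfrac{1}{2}\log\frac{1/2}{1/4} + \tfrac{1}{2}\log\frac{1/2}{3/4} = 1 - \tfrac{1}{2}\log 3 \approx 0.2075, \] so the divergence formula gives $\dim^\beta(R) = \mathrm{Dim}^\beta(R) = 1/(1 + 0.2075) \approx 0.83$ for every sequence $R$ that is random with respect to $\alpha$. The closer $\beta$ is to $\alpha$, the smaller the divergence term and the closer these dimensions come to 1.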