This was originally a question on Cross Validated.
Are there any (nontrivial) lower bounds on the Kullback–Leibler divergence $KL(f\Vert g)$ between two measures/densities?
Informally, I am trying to study problems where $f$ is some target density, and I want to show that if $g$ is chosen “poorly”, then $KL(f\Vert g)$ must be large. Examples of “poor” behaviour could include a mismatch in means, in higher moments, and so on.
Example: If $f=\sum_k a_k f_k$ and $g=\sum_j b_j g_j$ are two mixture distributions, is there a lower bound on $KL(f\Vert g)$ in terms of the $KL(f_k\Vert g_j)$ (and the convex weights $a_k, b_j$)? Intuitively, we’d like to say that if $\inf_{k\ne j} KL(f_k\Vert g_j)$ is “big”, then $KL(f\Vert g)$ cannot be small.
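For concreteness, here is a minimal Python sketch that probes this intuition numerically by Monte Carlo estimating $KL(f\Vert g)$ for two Gaussian mixtures whose components are pairwise far apart; the particular components, weights, and the helper `mixture_logpdf` are arbitrary illustrative choices, and the sketch proves nothing about a general bound.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# f = sum_k a_k f_k : two well-separated unit-variance Gaussians (illustrative choice)
a = np.array([0.5, 0.5])
f_means, f_sds = np.array([-5.0, 5.0]), np.array([1.0, 1.0])

# g = sum_j b_j g_j : components placed far from every component of f (illustrative choice)
b = np.array([0.3, 0.7])
g_means, g_sds = np.array([20.0, 30.0]), np.array([1.0, 1.0])

def mixture_logpdf(x, w, means, sds):
    """log of sum_k w_k * N(x; mean_k, sd_k), computed stably via logsumexp."""
    comp = norm.logpdf(x[:, None], loc=means, scale=sds) + np.log(w)
    return logsumexp(comp, axis=1)

# Draw from f: pick a component by its weight, then sample from that component.
n = 200_000
ks = rng.choice(len(a), size=n, p=a)
x = rng.normal(f_means[ks], f_sds[ks])

# KL(f || g) = E_f[log f(X) - log g(X)], estimated by a sample average.
kl_est = np.mean(mixture_logpdf(x, a, f_means, f_sds)
                 - mixture_logpdf(x, b, g_means, g_sds))
print(f"Monte Carlo estimate of KL(f || g): {kl_est:.1f} nats")
```

With these components the estimate is huge, as the intuition suggests; the open question is whether that can be turned into an explicit lower bound in terms of the $KL(f_k\Vert g_j)$ and the weights.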
Anything along these lines (for mixtures or for arbitrary measures) would be useful; you can of course make additional assumptions about the quantities involved. References to any papers that study these kinds of problems, directly or indirectly, would also be helpful!