Suppose $X$ is a random variable that we can observe, and we want to take an action $Y$ in order to minimize $$E[(Y-X)^{2}].$$ Given the density $p(x)$ of $X$, what is the optimal choice of the conditional density $q(y|x)$ of $Y\mid X$?
Formally: $$\min_{q}\Big\{E[(Y-X)^{2}]=\int(y-x)^{2}q(y|x)p(x)\,dy\,dx\Big\}$$
subject to $$(\forall x)\ \int q(y|x)\,dy=1$$ $$-E\Big[\log_{2}\Big(\int q(Y|x)p(x)\,dx\Big)\Big]-\Big\{-E\big[E[\log_{2}q(Y|X)\mid X]\big]\Big\} \leq C$$ The intuition behind the last constraint comes from the notion of entropy in information theory:
Before observing $X$, the entropy of $Y$ (computed under its marginal density $\int q(y|x)p(x)\,dx$) is: $$-E\Big[\log_{2}\Big(\int q(Y|x)p(x)\,dx\Big)\Big]$$
After observing $X=x$, the entropy is: $$-E[\log_{2}q(Y|X)\mid X=x]$$
So the reduction in entropy is: $$-E\Big[\log_{2}\Big(\int q(Y|x)p(x)\,dx\Big)\Big]-\Big\{-E[\log_{2}q(Y|X)\mid X=x]\Big\}$$
Averaging over $X$, the expected reduction in entropy (the mutual information between $X$ and $Y$) is: $$-E\Big[\log_{2}\Big(\int q(Y|x)p(x)\,dx\Big)\Big]-\Big\{-E\big[E[\log_{2}q(Y|X)\mid X]\big]\Big\}$$
The constraint then imposes an upper bound on this entropy reduction: $$-E\Big[\log_{2}\Big(\int q(Y|x)p(x)\,dx\Big)\Big]-\Big\{-E\big[E[\log_{2}q(Y|X)\mid X]\big]\Big\} \leq C$$
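Once $p$ and $q$ are discretized, the entropy reduction on the left-hand side can be computed directly from the definitions above. A minimal sketch with a hypothetical two-point example (the particular pmfs `p` and `q` below are illustrative assumptions, not part of the problem):

```python
import numpy as np

# Hypothetical discretization: X takes 2 values with probabilities p,
# and q[i, j] = q(y_j | x_i) is a row-stochastic conditional pmf for Y.
p = np.array([0.5, 0.5])
q = np.array([[0.9, 0.1],
              [0.2, 0.8]])

m = p @ q                                 # marginal of Y: m(y) = sum_x q(y|x) p(x)
H_Y = -np.sum(m * np.log2(m))             # entropy before observing X
H_Y_given_X = -np.sum(p[:, None] * q * np.log2(q))  # average entropy after observing X

reduction = H_Y - H_Y_given_X             # left-hand side of the constraint
print(reduction)
```

The reduction is nonnegative (it is a mutual information) and equals zero exactly when the rows of `q` are identical, i.e. when $Y$ is independent of $X$.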
The trivial case is $C=\infty$: the constraint is then vacuous, and choosing $q(y|x)=\delta(y-x)$ (i.e., $Y=X$) attains zero squared error.
The question is:
When the distribution of $X$ is Gaussian, what is the optimal $q(y|x)$?
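For context (a sketch, not a full derivation): this is the classical Gaussian rate-distortion setting, where for $X\sim N(0,\sigma^{2})$ with squared-error distortion and an information budget of $C$ bits, the standard result is that distortion $D=\sigma^{2}2^{-2C}$ is achievable by a jointly Gaussian test channel $Y\mid X=x \sim N\big((1-D/\sigma^{2})x,\ D(1-D/\sigma^{2})\big)$. A Monte Carlo check of that candidate channel (the values of `sigma2` and `C` are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: source variance sigma2 and information budget C bits.
sigma2, C = 1.0, 1.0
D = sigma2 * 2.0 ** (-2 * C)              # candidate distortion level

n = 1_000_000
x = rng.normal(0.0, np.sqrt(sigma2), n)

# Gaussian test channel: Y | X=x ~ N((1 - D/sigma2) x, D (1 - D/sigma2)).
a = 1.0 - D / sigma2
y = a * x + rng.normal(0.0, np.sqrt(D * a), n)

mse = np.mean((y - x) ** 2)               # should be close to D

# For jointly Gaussian (X, Y), the entropy reduction is -0.5 * log2(1 - rho^2).
rho = np.corrcoef(x, y)[0, 1]
info = -0.5 * np.log2(1.0 - rho ** 2)     # should be close to the budget C

print(mse, info)
```

The sketch checks both sides of the tradeoff at once: the channel spends (approximately) exactly $C$ bits of entropy reduction and achieves mean squared error near $\sigma^{2}2^{-2C}$.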