Assume $X$ and $Y$ are conditionally independent given $Z$:
$$p(x, y \mid z) = p(x \mid z)\, p(y \mid z) \quad \text{for all } x, y, z.$$
We are interested in the mutual information between $(X,Y)$ and $Z$: $I(X,Y;Z)$. Can we decompose this mutual information into two components reflecting the contributions of $X$ and $Y$ separately?
What I have tried:
$$I(X,Y;Z) = H(X,Y) - H(X,Y \mid Z) = H(X,Y) - H(X \mid Z) - H(Y \mid Z).$$
The second equality is due to the additivity of entropies of independent variables (it holds for each $z$). I am left with $H(X,Y)$, which doesn't easily decompose ($X$ and $Y$ are not necessarily independent when not conditioned on $Z$).
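As a sanity check on the identity above, here is a small numerical sketch (a toy discrete example, not part of the question): it builds a joint $p(x,y,z) = p(z)\,p(x\mid z)\,p(y\mid z)$, so $X \perp Y \mid Z$ holds by construction, and verifies that $I(X,Y;Z)$ computed directly equals $H(X,Y) - H(X\mid Z) - H(Y\mid Z)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete distributions (sizes are arbitrary choices).
nx, ny, nz = 3, 4, 2
pz = rng.dirichlet(np.ones(nz))              # p(z)
px_z = rng.dirichlet(np.ones(nx), size=nz)   # px_z[z, x] = p(x | z)
py_z = rng.dirichlet(np.ones(ny), size=nz)   # py_z[z, y] = p(y | z)

# Joint p(x, y, z) = p(z) p(x|z) p(y|z): conditional independence by construction.
pxyz = np.einsum('z,zx,zy->xyz', pz, px_z, py_z)

def H(p):
    """Shannon entropy (bits) of a possibly multi-dimensional distribution."""
    p = p.ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

pxy = pxyz.sum(axis=2)   # p(x, y)
pxz = pxyz.sum(axis=1)   # p(x, z)
pyz = pxyz.sum(axis=0)   # p(y, z)
Hz = H(pz)

# Direct: I(X,Y;Z) = H(X,Y) + H(Z) - H(X,Y,Z)
I_direct = H(pxy) + Hz - H(pxyz)

# Via the decomposition, using H(X|Z) = H(X,Z) - H(Z), H(Y|Z) = H(Y,Z) - H(Z):
I_decomp = H(pxy) - (H(pxz) - Hz) - (H(pyz) - Hz)

print(np.isclose(I_direct, I_decomp))  # → True
```

Note that the check only confirms the identity when $X \perp Y \mid Z$; for a generic joint, $H(X,Y\mid Z)$ would not split into $H(X\mid Z) + H(Y\mid Z)$.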
Note: this question is superficially very similar to this one, but it is not equivalent to it.