Wednesday, October 27, 2010

mutual information for linearly dependent rvs

i can't remember if i've recorded this anywhere, but i want to make sure i have it because it took me a little while to derive. if x1 and x2 are independent uniform random variables on [0,1] and y = a*x1 + (1-a)*x2 with a >= 1/2, then the mutual information between x1 and y is ln(a/(1-a)) + (1-a)/(2*a) nats, where ln is the natural log (the expression assumes a is the larger coefficient; for a < 1/2 the same argument gives a/(2*(1-a)) instead). this is a useful result for testing mutual information estimators because everything lives on a bounded domain, so estimates might converge more quickly than they do for the usual test distributions with unbounded support, like exponentials or gaussians. the key to the derivation is realising that the marginal pdf of y is a trapezoid: conditional on x1, y is uniform on an interval of width 1-a, so h(y|x1) = ln(1-a), and integrating -f*ln(f) over the trapezoid gives h(y) = ln(a) + (1-a)/(2*a), so the mutual information is the difference h(y) - h(y|x1). be careful about the log base when using the chain rule/integration by parts.
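here's a quick numerical sanity check (this is my own sketch, not from the original derivation; the function names and the monte carlo approach are mine): it compares the closed form against h(y) - h(y|x1), where h(y) is estimated by sampling y and evaluating its exact trapezoidal pdf, assuming a >= 1/2.

import numpy as np

def mi_closed_form(a):
    # closed-form I(x1; y) in nats, valid for a >= 1/2
    return np.log(a / (1 - a)) + (1 - a) / (2 * a)

def mi_monte_carlo(a, n=1_000_000, seed=0):
    # I(x1; y) = h(y) - h(y|x1).  given x1, y is uniform on an interval
    # of width 1-a, so h(y|x1) = ln(1-a).  h(y) = E[-ln f(y)] is estimated
    # by sampling y and evaluating its exact trapezoidal pdf f.
    rng = np.random.default_rng(seed)
    x1 = rng.uniform(size=n)
    x2 = rng.uniform(size=n)
    y = a * x1 + (1 - a) * x2
    b = 1 - a  # short side of the trapezoid (requires a >= 1/2)
    # trapezoidal marginal pdf: rises on [0, b], flat on [b, a], falls on [a, 1]
    f = np.where(y < b, y / (a * b),
                 np.where(y < a, 1 / a, (1 - y) / (a * b)))
    h_y = -np.mean(np.log(f))
    return h_y - np.log(1 - a)

for a in (0.6, 0.75, 0.9):
    print(a, mi_closed_form(a), mi_monte_carlo(a))

the two numbers should agree to a few decimal places for a million samples; the closed form can then serve as ground truth for whatever mutual information estimator you actually want to test, applied to the same samples of x1 and y.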
