the mutual information: detecting and evaluating dependencies between variables
r. steuer, j. kurths, c. o. daub, j. weise, and j. selbig
bootstrap confidence intervals on mutual information estimates. good practical info.
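a minimal sketch of the bootstrap idea as I understand it (my own illustration with a plug-in histogram estimator, not necessarily the paper's exact procedure): resample (x, y) pairs with replacement, recompute MI each time, and read off percentiles.

    import numpy as np

    def hist_mi(x, y, bins=16):
        # plug-in MI from a 2-d histogram (biased upward for small N)
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def bootstrap_mi_ci(x, y, n_boot=1000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        stats = []
        for _ in range(n_boot):
            idx = rng.integers(0, len(x), len(x))  # resample (x, y) pairs jointly
            stats.append(hist_mi(x[idx], y[idx]))
        lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
        return hist_mi(x, y), (lo, hi)

note this only bounds the variability; it does not remove the plug-in bias, and significance against the independence null is a separate (shuffle/surrogate) question.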
eq 32: what about -1/N \sum_i \log( \hat{f}(x_i) \hat{f}(y_i) N )? there's no relative scaling of x vs. y, and it could be minimized with respect to h_x, h_y (?)
fig 8: the region with pearson correlation ≈ 0 but mutual information > 0 is exactly the nonlinear dependence that a linear test cannot detect.
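to see that region synthetically (my example, not the paper's figure): y = x^2 with symmetric x has cov(x, y) = E[x^3] = 0, so the pearson correlation vanishes while the MI stays clearly positive. reusing hist_mi from the sketch above:

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(1)
    x = rng.normal(size=5000)
    y = x**2 + 0.1 * rng.normal(size=5000)  # purely nonlinear dependence

    r, _ = pearsonr(x, y)
    print(r, hist_mi(x, y))                 # r ~ 0, MI well above 0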
a minimax mutual information scheme for supervised feature extraction and its application to eeg-based brain-computer interfacing
farid oveisi and abbas erfanian
adaptive histogram pdf estimate; reportedly better than a fixed histogram and a gram-charlier polynomial expansion (?)
not very useful to me
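for my own reference anyway, the simplest adaptive scheme I know (equal-frequency bins from empirical quantiles; an assumption on my part, the paper's partitioning may differ):

    import numpy as np

    def adaptive_bin_entropy(x, n_bins=16):
        # bin edges at quantiles, so every bin holds ~N/n_bins samples
        edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
        counts, _ = np.histogram(x, bins=edges)
        keep = counts > 0
        p = counts[keep] / counts.sum()
        w = np.diff(edges)[keep]
        # differential entropy estimate: -sum p log(p / bin width)
        return float(-np.sum(p * np.log(p / w)))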
examining methods for estimating mutual information in spiking neural systems
christopher j. rozell, don h. johnson
the data processing inequality proves that I(X,\hat{X}) \le I(X,Y) regardless of the estimator used to obtain \hat{X}, since X \to Y \to \hat{X} forms a Markov chain. useful for checking the validity of a coding model if you know a correct upper bound on I(X,Y): an estimated I(X,\hat{X}) above that bound means the model (or the estimate) is wrong.
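a toy check of that logic (my own construction, not from the paper): send X through a binary symmetric channel to get Y, "decode" by flipping again, and compare the two informations computed exactly from the joints.

    import numpy as np

    def mi_from_joint(pxy):
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

    def bsc(eps):
        # binary symmetric channel: flips the bit with probability eps
        return np.array([[1 - eps, eps], [eps, 1 - eps]])

    px = np.diag([0.5, 0.5])
    p_xy = px @ bsc(0.1)                # joint of (X, Y)
    p_xxhat = px @ bsc(0.1) @ bsc(0.2)  # joint of (X, \hat{X}); \hat{X} depends on Y only

    assert mi_from_joint(p_xxhat) <= mi_from_joint(p_xy)  # the DPI in action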
fast algorithm for estimating mutual information, entropies and score functions
dinh tuan pham
bias-canceling estimator; O(N) vs. the O(N^2) of density-functional (cubic spline) approaches; emphasis on gradients (score functions)
blind-source separation for linear mixtures and nonlinear maps
Prichard and Theiler. Generalized redundancies for time series analysis.
http://arxiv.org/pdf/comp-gas/9405006.pdf
good info on advantages of Renyi MI.
independence with Renyi MI requires that it be zero for all q > 1, or for all 0 < q < 1; a single value of q is not enough. maybe that makes it more robust for estimation, since I could evaluate it at multiple values of q. apparently q = 2 is best statistically?
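writing the pieces out so the note is self-contained (the entropy is the standard Renyi definition; the MI-as-redundancy combination is how I read the paper):

    H_q(X) = \frac{1}{1-q} \log \sum_i p_i^q ,
    \qquad
    I_q(X, Y) = H_q(X) + H_q(Y) - H_q(X, Y) ,

with H_q recovering the Shannon entropy in the limit q \to 1.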
I found that I could reduce a Taylor series in q down to a collection of terms containing expected values of log(p) under the probability measure defined by p_i^q. but I'm not sure where to go from there, or whether that form is more useful.
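possibly the closed form of that reduction (just calculus on the definition above, with the escort measure P_i(q) = p_i^q / \sum_j p_j^q):

    \frac{dH_q}{dq}
    = \frac{(1-q)\, \mathrm{E}_{P(q)}[\log p] + \log \sum_i p_i^q}{(1-q)^2}
    = -\frac{D_{\mathrm{KL}}\big( P(q) \,\|\, p \big)}{(1-q)^2} \le 0 ,

so the expected values of \log p under the p_i^q measure collapse into a KL divergence, and the whole entropy spectrum is non-increasing in q.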
Parzen. Time series, statistics, and information.
http://www.ima.umn.edu/preprints/July90Series/663.pdf
kind of old, but still looks like good info
Dongxin Xu and Deniz Erdogmus. Renyi's entropy, divergence and their nonparametric estimators
book chapter with good info on interpreting and approximating the spectrum of renyi entropies, especially the quadratic case (q=2).
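the q = 2 case is special computationally: H_2 = -\log \int f^2, and the squared Parzen estimate integrates in closed form, giving a plain double sum over samples (the "information potential"). a minimal sketch with Gaussian kernels:

    import numpy as np

    def renyi2_entropy(x, sigma=0.5):
        # int fhat^2 = (1/N^2) sum_ij G(x_i - x_j; var = 2*sigma^2),
        # because convolving two Gaussian kernels adds their variances
        d = x[:, None] - x[None, :]
        s2 = 2.0 * sigma**2
        k = np.exp(-d**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
        return float(-np.log(k.mean()))

the mean includes the i = j diagonal, which is correct here: it is the exact integral of the squared kernel estimate, not a leave-one-out variant.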