Friday, June 14, 2013

mutual information and signal processing

some interesting articles:

Massoud Babaie-Zadeh, Christian Jutten, and Kambiz Nayebi. Differential of the Mutual Information
Kenneth E. Hild, II, Deniz Erdogmus, and José Príncipe. Blind Source Separation Using Renyi’s Mutual Information
George Atia and Venkatesh Saligrama. A Mutual Information Characterization for Sparse Signal Processing
Liam Paninski. Estimation of Entropy and Mutual Information (recommended by others)
Janett Walters-Williams and Yan Li. Estimation of Mutual Information: A Survey
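
Several of the articles above (Paninski; Walters-Williams and Li) are about estimating mutual information from data, and Paninski in particular analyzes the bias of simple estimators. As a reference point, here is a minimal plug-in (histogram) estimator in Python; the bin count, the function name, and the synthetic Gaussian example are my own illustrative choices, not anything taken from those papers.

import numpy as np

def mutual_information_plugin(x, y, bins=16):
    """Plug-in MI estimate I(X;Y) in nats from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), row vector
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# quick check: correlated Gaussians should give an estimate well above zero
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)
print(mutual_information_plugin(x, y))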


Hanchuan Peng, Fuhui Long, and Chris Ding. Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy (2005)
lots of good definitions and a comparison of results (a sketch of the mRMR selection rule follows below)
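
Since the Peng, Long, and Ding paper is mostly about the mRMR greedy selection rule (pick the feature maximizing relevance to the class minus mean redundancy with the features already chosen), here is a minimal Python sketch of that rule, assuming integer-coded (already discretized) features; the crude discrete MI estimate and the toy data are my own simplifications, not the authors' implementation.

import numpy as np

def discrete_mi(a, b):
    """MI in nats between two integer-coded vectors."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])))

def mrmr_select(X, y, k):
    """Greedy mRMR: maximize I(x_j; y) - mean_{i in S} I(x_j; x_i)."""
    n_features = X.shape[1]
    relevance = [discrete_mi(X[:, j], y) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]        # start with the most relevant feature
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            redundancy = np.mean([discrete_mi(X[:, j], X[:, i]) for i in selected])
            if relevance[j] - redundancy > best_score:
                best_j, best_score = j, relevance[j] - redundancy
        selected.append(best_j)
    return selected

# toy usage: column 0 equals the label, columns 1-4 are noise
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500)
X = np.column_stack([y, rng.integers(0, 2, size=(500, 4))])
print(mrmr_select(X, y, k=2))   # column 0 should be picked first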

http://vserver1.cscs.lsa.umich.edu/~crshalizi/notabene/information-theory.html
lots of information theory links


L. Diambra and A. Plastino. Modelling time series using information theory
L. Zunino, M. C. Soriano, I. Fischer, O. A. Rosso, and C. R. Mirasso. Permutation-information-theory approach to unveil delay dynamics from time-series analysis
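
The permutation-information-theory approach in the last entry builds on Bandt-Pompe ordinal patterns. Below is a minimal permutation-entropy sketch in Python for orientation; the embedding dimension, delay, and test signals are arbitrary illustrative choices, and the delay-identification analysis in the paper goes well beyond this.

import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D series (0 = regular, 1 = random)."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = x[i:i + order * delay:delay]     # embedded vector of length `order`
        counts[tuple(np.argsort(window))] += 1    # its ordinal pattern
    probs = np.array([c for c in counts.values() if c > 0], dtype=float) / n
    return float(-np.sum(probs * np.log(probs)) / math.log(math.factorial(order)))

# quick check: noise should come out near 1, a slow sine wave much lower
rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=2000)))
print(permutation_entropy(np.sin(0.05 * np.arange(2000))))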
