Tuesday, March 2, 2010

Python for Information Theoretic Analysis of Neural Data

an excellent article comparing mutual information (mi) bias correction methods also discusses reasons for switching from matlab to python. one of the more interesting ideas is the shuffled information estimator, which can be combined with any of the other correction techniques. i only read through it quickly, but there seems to be a connection to the bootstrap and other resampling methods: resample the data to estimate the bias of the conditional entropy term, then subtract it off. more on that in 'tight data-robust bounds to mutual information combining shuffling and model selection techniques', which claims robust upper and lower bounds and precise estimates even for highly correlated data.
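to make that resampling idea concrete, here is a minimal sketch of my own (not the paper's shuffled estimator; plugin_mi and shuffle_corrected_mi are names i made up): after randomly permuting y relative to x the true mi is zero, so the mean plugin estimate over permutations is approximately pure bias, which can then be subtracted.

import numpy as np

def plugin_mi(x, y, mx, my):
    """plugin (maximum likelihood) mutual information in bits,
    for integer-coded x in 0..mx-1 and y in 0..my-1."""
    joint = np.histogram2d(x, y, bins=(mx, my),
                           range=[[0, mx], [0, my]])[0]
    p = joint / joint.sum()
    px = p.sum(axis=1)
    py = p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / np.outer(px, py)[nz]))

def shuffle_corrected_mi(x, y, mx, my, n_shuffles=100, seed=0):
    """subtract a permutation estimate of the plugin bias:
    after shuffling y the true mi is zero, so the mean
    shuffled estimate is (approximately) pure bias."""
    rng = np.random.RandomState(seed)
    raw = plugin_mi(x, y, mx, my)
    bias = np.mean([plugin_mi(x, rng.permutation(y), mx, my)
                    for _ in range(n_shuffles)])
    return raw - bias

as i understand it, the paper's shuffled estimator is subtler (it shuffles responses across trials within each stimulus, destroying noise correlations while preserving the marginals), but the estimate-bias-by-resampling logic is the same.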
the python code used in the paper to estimate entropies and mi is hosted on google code. i installed it from the svn checkout by going into ~/checkinstall/pyentropy/pyentropy-read-only/pyentropy and running
checkinstall python setup.py install
the section 'A python library for information theoretic estimates' describes the library and how to use it. the only problem for me is that it assumes a finite alphabet; some sensible histogram binning should let me work with continuous data, though. a rough sketch of both steps is below.
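for the record, the basic usage pattern looks roughly like this (names taken from the pyentropy docs; details may differ across versions, and the data here is made up), with a naive equal-width binning step bolted on front to map continuous values into the finite alphabet:

import numpy as np
from pyentropy import DiscreteSystem

# hypothetical data: 1000 trials of a scalar response and 4 stimuli
x_cont = np.random.randn(1000)
y = np.random.randint(0, 4, 1000)

# naive equal-width binning into an m-letter alphabet;
# pyentropy only accepts integer-coded finite alphabets
m = 8
edges = np.linspace(x_cont.min(), x_cont.max(), m + 1)
x = np.digitize(x_cont, edges[1:-1])   # values in 0..m-1

# one response variable with alphabet size m, stimulus alphabet 4
s = DiscreteSystem(x, (1, m), y, (1, 4))
s.calculate_entropies(method='plugin', calc=['HX', 'HXY'])
print(s.I())   # mutual information I(X;Y) = H(X) - H(X|Y), in bits

equal-width bins are the crudest choice; equal-occupancy bins (quantiles) would probably behave better for skewed responses.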
reasons for migrating their code from matlab to python are interspersed throughout the article (for example, they find cython easier to use than mex), and they mention using mlabwrap to keep calling the remaining matlab code from python during the transition.
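for reference, mlabwrap makes that bridge almost transparent; something like this (hedged from the mlabwrap docs, not from the article itself):

import numpy as np
from mlabwrap import mlab

# matlab functions appear as attributes of the mlab proxy object;
# numpy arrays are converted to matlab arrays automatically
result = mlab.sort(np.array([3.0, 1.0, 2.0]))
mlab.plot([1, 2, 3], '-o')   # opens a matlab figure window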
