Tuesday, December 22, 2009
inverse problems
this month's issue of inverse problems has a number of articles on imaging and inverse scattering.
math for people
blog post in response to 'math for programmers'. for those of you with an interest in math education.
the pragmatic programmer
book on programming recommended by travis oliphant. promises to take me from journeyman to master, if i decide to become a master someday.
interactive brokers news pages
ib has news pages on interest rates in various countries, fx markets (including an rss feed), and futures and options (also with a commentary rss feed on a few individual tickers). the futures and options page includes a summary at the top explaining what things like implied vs. historical volatility and futures arbitrage can tell you about what derivatives traders believe will happen in the market short-term.
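as a note to self on the historical side of that comparison: historical (realized) volatility is just the annualized standard deviation of log returns. a minimal sketch (the closing prices are made up for illustration):

```python
import math

def historical_volatility(prices, periods_per_year=252):
    """annualized standard deviation of log returns from a price series."""
    logret = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    mean = sum(logret) / len(logret)
    var = sum((r - mean) ** 2 for r in logret) / (len(logret) - 1)
    return math.sqrt(var * periods_per_year)

# made-up daily closes, purely for illustration
closes = [100.0, 101.2, 100.5, 102.3, 101.8, 103.1]
annualized_vol = historical_volatility(closes)
```

implied volatility, by contrast, is backed out of option prices, so the gap between the two is what the commentary pages are reading tea leaves from.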
the eagle and the lion
partway through reading a very good (if a bit dry and scholarly) book by james a. bill on us-iran relations, covering the installation and deposing of the shah and the aftermath of the islamic revolution. only through chapter 7 so far, and the shah has just been overthrown. given the prominence of iran in world news these days, i think this history is important to understand.
cleaning up the office
found an old paper in my office: 'analytical parametric analysis of the contact problem of human buttocks and negative poisson's ratio foam cushions'. dead serious research, and i'm not sure, but i think i did cite it once. it has the only account i could ever find of an analytical solution to hertz-like contact with a finite-thickness plate.
'game theory, maximum entropy, minimum discrepancy and robust bayesian decision theory', peter d. grünwald and a. philip dawid. shows the connection between maximizing entropy and minimizing worst-case expected loss. hefty tome, but relatively readable.
the book, 'the statistical mechanics of financial markets', j. voit, 2005, looks like an interesting read. goes into some of the real areas of interest in quantitative finance from a physics perspective.
Monday, December 21, 2009
scipy india 2009
some interesting stuff to check out from the scipy conference in india.
chandrashekhar kaushik is using cython to do sph (smoothed particle hydrodynamics) simulations, apparently using python down to a fairly low level.
senthil kumaran gave a talk explaining how the gil affects code speed. i will probably need to understand this at some point.
chris burns talked about the fmri project, nipype. i'm curious to see if they have written some ica code in there.
david cournapeau is coming out with a new package distribution system. might look at that as a possible alternative to distutils.
akshay srinivasan showed how to use python with an avr microcontroller to make cheap and easy instrumentation.
asokan pichai of fossee showed how they use open source software to make web video tutorials.
travis oliphant very briefly alludes to ultrasound imaging as a scipy application area. i wonder if i can get more detail on that.
stefan van der walt mentioned his github repository with scikit addons, with goodies like image processing and gpu algorithms for python.
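a side note on the gil talk above: cpu-bound pure-python threads don't run in parallel under cpython, because the gil serializes bytecode execution. a minimal sketch of the kind of workload that shows this (the counting task is made up):

```python
import threading

def count_up(n, out, idx):
    # pure-python cpu-bound loop; holds the gil while executing bytecode
    total = 0
    for i in range(n):
        total += i
    out[idx] = total

results = [0, 0]
threads = [threading.Thread(target=count_up, args=(200_000, results, i))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# both threads finish correctly, but wall time is about the same as running
# them one after the other, because only one can execute bytecode at a time
```

i/o-bound threads are a different story, since the gil is released while blocking.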
Thursday, December 17, 2009
more estimating mutual information
the mutual information: detecting and evaluating dependencies between variables
r. steuer, j. kurths, c. o. daub, j. weise, and j. selbig
bootstrap confidence on mutual information
good practical info
eq 32: what about -\frac{1}{N} \sum_i \log(\hat{f}(x_i) \hat{f}(y_i) N)? no relative scaling in x vs. y; minimize wrt h_x, h_y (?)
fig 8: pearson correlation ~= 0 with mutual information > 0 marks the nonlinear-dependence region that a linear test can't detect
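to play with the eq 32 idea, here's a quick plug-in sketch: gaussian kernel density estimates for the marginals and joint, then the sample average of log(f̂(x,y) / (f̂(x) f̂(y))). the bandwidths and test data are made up, and there's no minimization over h_x, h_y here:

```python
import numpy as np

def kde1(data, h):
    """1-d gaussian kernel density estimate, evaluated at the sample points."""
    z = (data[:, None] - data[None, :]) / h
    return np.exp(-0.5 * z**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def kde2(x, y, hx, hy):
    """2-d product-kernel density estimate, evaluated at the sample points."""
    zx = (x[:, None] - x[None, :]) / hx
    zy = (y[:, None] - y[None, :]) / hy
    return np.exp(-0.5 * (zx**2 + zy**2)).mean(axis=1) / (2 * np.pi * hx * hy)

def mi_kde(x, y, hx=0.3, hy=0.3):
    """plug-in mi estimate: sample mean of log( f(x,y) / (f(x) f(y)) )."""
    return float(np.mean(np.log(kde2(x, y, hx, hy) / (kde1(x, hx) * kde1(y, hy)))))

rng = np.random.default_rng(0)
x = rng.normal(size=400)
y_dep = x + 0.3 * rng.normal(size=400)   # strongly dependent pair
y_ind = rng.normal(size=400)             # independent pair
```

the dependent pair should come out with a much larger estimate than the independent one, even though a pearson-style check would also flag this particular case (it's linear); the point of fig 8 is the nonlinear cases where only the mi does.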
a minimax mutual information scheme for supervised feature extraction and its application to eeg-based brain-computer interfacing
farid oveisi and abbas erfanian
adaptive histogram pdf; better than fixed histogram and gram-charlier polynomial (?)
not very useful to me
examining methods for estimating mutual information in spiking neural systems
christopher j. rozell, don h. johnson
the data processing inequality proves that I(X,\hat{X}) \le I(X,Y) regardless of the estimator used to obtain \hat{X}. useful for checking the validity of a coding model if you know a correct upper bound for I(X,Y).
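the inequality is easy to check numerically for a toy discrete channel; the joint pmf and decoder below are made up for illustration:

```python
import numpy as np

def mi_discrete(pxy):
    """mutual information (nats) of a discrete joint pmf given as a 2-d array."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# made-up joint pmf p(x, y): a 2-state source seen through 3 output symbols
pxy = np.array([[0.30, 0.10, 0.05],
                [0.05, 0.10, 0.40]])

# any deterministic decoder xhat = g(y) gives a markov chain x -> y -> xhat
g = [0, 0, 1]                      # map each y symbol to a decoded state
p_x_xhat = np.zeros((2, 2))
for yi, xhat in enumerate(g):
    p_x_xhat[:, xhat] += pxy[:, yi]

# data processing inequality: I(X, Xhat) <= I(X, Y)
dpi_holds = mi_discrete(p_x_xhat) <= mi_discrete(pxy) + 1e-12
```

any estimated I(X, Xhat) that exceeds a known bound on I(X, Y) signals a broken coding model, which is the validity check the paper uses.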
fast algorithm for estimating mutual information, entropies and score functions
dinh tuan pham
bias-canceling estimator, O(N) vs O(N^2) density functional (cubic spline), emphasis on gradients (scores)
blind-source separation for linear mix and nonlinear maps
http://arxiv.org/pdf/comp-gas/9405006.pdf
Prichard and Theiler. Generalized redundancies for time series analysis.
good info on advantages of Renyi MI.
independence with Renyi MI requires that it be zero for all q > 1, or for all 0 < q < 1. maybe it's more robust for estimation, since i could evaluate it at multiple values of q. apparently q=2 is best, statistically?
i found that i could reduce a taylor series in q down to a bunch of terms containing expected values of log(p) in a probability measure defined by p_i^q. but i'm not sure where to go from there or if that's more useful.
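for reference, the renyi entropy spectrum behind all of this is H_q = log(\sum_i p_i^q)/(1-q), with shannon entropy as the q -> 1 limit; the q=2 case is just -log of a collision probability, which is presumably why it's statistically convenient. a sketch with a made-up pmf:

```python
import math

def renyi_entropy(p, q):
    """renyi entropy H_q(p) = log(sum_i p_i^q) / (1 - q), in nats."""
    p = [pi for pi in p if pi > 0]
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit recovers the shannon entropy
        return -sum(pi * math.log(pi) for pi in p)
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

p = [0.5, 0.25, 0.125, 0.125]
h_half = renyi_entropy(p, 0.5)
h_one = renyi_entropy(p, 1.0)
h_two = renyi_entropy(p, 2.0)   # quadratic entropy: -log(collision probability)
```

H_q is non-increasing in q, so evaluating at several q values (as in the robustness idea above) gives a whole curve rather than a single number.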
http://www.ima.umn.edu/preprints/July90Series/663.pdf
Parzen. Time series, statistics, and information
kind of old, but still looks like good info
Dongxin Xu and Deniz Erdogmus. Renyi's entropy, divergence and their nonparametric estimators
book chapter with good info on interpreting and approximating the spectrum of renyi entropies, especially quadratic (q=2).
Tuesday, December 15, 2009
python documentation generation
a few tools out there for generating docs for/from python code. docutils has a lot of core support and no dependencies, but it seems to be more of a library: the front-end tools are a little bare-bones, and many of the other projects just use its ReST parser.
sphinx is a very impressive doc generator, with commensurately impressive packages using it. but they say it's not really designed for auto-api docs, more for stand-alone rst files written alongside the code.
epydoc, otoh, is an auto-api generator that analyzes .py source to build the docs. it can use its own format, or it can read rst (i think the better choice). can make dependency graphs, etc. i think i'll try this first.
happydoc is another auto-api tool. i don't think it's used as much as epydoc.
Monday, December 14, 2009
probability density estimation
'density estimation by dual ascent of the log-likelihood' by tabak and vanden-eijnden shows an interesting method for joint pdf estimation by mapping to gaussians. i wonder if the technique is related to 'whitening as a tool for estimating mutual information in spatiotemporal data sets' by galka, ozaki, bayard, and yamashita. they also cite the fact that innovations that are the sum of gaussian and poisson rvs can represent any continuous-time markov process. for continuous dynamics, only the gaussian noise term will be present.
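the simplest 1-d version of a mapping-to-gaussians idea can be sketched with the stdlib: send each sample to the inverse normal cdf of its normalized rank. this is plain rank gaussianization, not the dual-ascent scheme from the paper, just the underlying intuition (test data made up):

```python
import random
from statistics import NormalDist, fmean, stdev

def gaussianize(samples):
    """map a 1-d sample to approx. standard normal via rank -> inverse normal cdf."""
    n = len(samples)
    order = sorted(range(n), key=lambda i: samples[i])
    nd = NormalDist()
    z = [0.0] * n
    for rank, i in enumerate(order, start=1):
        # rank / (n + 1) keeps the argument strictly inside (0, 1)
        z[i] = nd.inv_cdf(rank / (n + 1))
    return z

random.seed(0)
x = [random.expovariate(1.0) for _ in range(2000)]   # very non-gaussian input
z = gaussianize(x)
```

once the data look gaussian, entropies and mutual informations have closed forms, which is presumably the connection to the whitening paper.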
Thursday, December 10, 2009
new youtube api
apparently youtube has discontinued their old api. i'll need to look here to update things.
or i could just use this. it doesn't do the part selection like my script did, but if you can search well enough to get the first hit right, it's easy enough:
youtube-dl "ytsearch:hampster dance part 1" -o 1.flv
Wednesday, December 9, 2009
python coverage testing
coverage.py: actively developed, simple command line execution for html output. integration into my test framework would require some work with the api, though it already has a nose plugin i could look at.
figleaf is based on coverage.py but runs faster because it ignores python builtins by default. better separation between code analysis and reporting, so you can more easily combine results from multiple runs. maybe not quite as much spit and polish.
the coverage langlet module (?) takes a different approach to coverage monitoring by inserting sensor objects into block entry points of the compiled code. the module seems to be more alpha quality than ned batchelder's, but perhaps an interesting alternative if that one doesn't work out for some reason.
i think i'll try coverage and/or figleaf to see if they are helpful beyond the builtin trace capability.
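for reference, the builtin trace capability i mean is the stdlib trace module; a minimal sketch of driving it programmatically (the traced function is made up):

```python
import trace

def branchy(n):
    # trivial function with a branch, so one line should go unexecuted
    if n > 0:
        return "positive"
    return "non-positive"

tracer = trace.Trace(count=1, trace=0)    # count line hits, don't echo lines
result = tracer.runfunc(branchy, 5)
counts = tracer.results().counts          # dict: (filename, lineno) -> hits
executed_lines = sorted(line for (_fname, line) in counts)
```

the dedicated tools mostly add nicer reporting (html, annotated source) on top of this kind of raw line-count data.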