Histogram-based mutual information.
Syntax
i=mi(x)
i=mi(x1,...,xN)
i=mi(x1,...,xN,l)
i=mi(x1,...,xN,k,l)
[i s]=mi(...)
mi(...)
mi(...,'param')
Computes the mutual information between the vectors x1,...,xN. The auto mutual information can be computed by passing only a single vector. The arguments can be multi-column vectors. The result i is an N x N matrix.
[i s]=mi(...) computes the mutual information and its standard error (available only for one or two arguments).
mi(...) without any output arguments opens a GUI for interactively changing the parameters.
By using the GUI, the mutual information can be stored into the workspace. If the standard errors are available, they are appended to the mutual information matrix as the last two columns.
mi without any arguments calls a demo (the same as the example below).
The parameters number of bins k and maximal lag l are optional. If the number of bins is not given, a default of 10 bins is used.
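The underlying technique can be sketched as follows. This is an illustrative Python/NumPy sketch of a generic histogram-based mutual information estimate over a lag range, not the toolbox implementation; the function names hist_mi and auto_mi are hypothetical, and details (bin placement, logarithm base, treatment of empty bins) may differ from mi.

```python
import numpy as np

def hist_mi(x, y, k=10):
    """Plug-in histogram estimate of I(X;Y) in nats.

    Both series are binned into k bins and
    I(X;Y) = sum_ij p_ij * log(p_ij / (p_i * p_j))
    is computed from the empirical bin frequencies.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=k)
    pxy = pxy / pxy.sum()            # joint bin probabilities
    px = pxy.sum(axis=1)             # marginal of x
    py = pxy.sum(axis=0)             # marginal of y
    nz = pxy > 0                     # skip empty bins (log 0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

def auto_mi(x, k=10, l=40):
    """Auto mutual information of x for lags 0..l, analogous in
    spirit to calling mi(x, k, l) with a single vector."""
    return np.array([hist_mi(x[:len(x) - t], x[t:], k)
                     for t in range(l + 1)])
```

The auto mutual information is maximal at lag 0 (where it reduces to the binned entropy of x) and decays with lag, which is why the first minimum of this curve is a popular choice of embedding delay.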
Additional parameters to control the GUI:
'gui'    - Creates the GUI.
'nogui'  - Suppresses the GUI.
'silent' - Suppresses all output.
x=sin(0:.2:8*pi)'+.1*randn(126,1);
y=randn(5000,1);
mi(x,10,40)
Please note that the mutual information derived with mi differs slightly from the results derived with migram. The reason is that mi also accounts for estimation errors; a full explanation can be found in the reference below.
Roulston, M. S.: Estimating the errors on measured entropy and mutual information, Physica D, 125, 1999.
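As an illustration of why the two results differ, the following Python/NumPy sketch combines the plug-in histogram estimate with a Miller-Madow-style bias correction and a standard-error estimate in the spirit of Roulston (1999). This is an assumption-laden sketch: the function name mi_with_error is hypothetical, and the exact correction and error formulas used by mi may differ.

```python
import numpy as np

def mi_with_error(x, y, k=10):
    """Bias-corrected histogram MI with a standard-error estimate.

    Sketch only: bias and variance formulas follow the general
    finite-sample arguments of Roulston (1999), not necessarily
    the toolbox's exact implementation.
    """
    n = len(x)
    pxy, _, _ = np.histogram2d(x, y, bins=k)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    terms = np.log(pxy[nz] / np.outer(px, py)[nz])
    i_obs = np.sum(pxy[nz] * terms)       # naive plug-in estimate
    # The naive estimate overestimates MI by roughly
    # (B_xy - B_x - B_y + 1) / (2N), with B_* the occupied bin counts.
    bias = (np.count_nonzero(pxy) - np.count_nonzero(px)
            - np.count_nonzero(py) + 1) / (2 * n)
    # A common variance estimate for the plug-in estimator:
    # var ~ (E[term^2] - I^2) / N under the empirical distribution.
    se = np.sqrt(max(np.sum(pxy[nz] * terms ** 2) - i_obs ** 2, 0.0) / n)
    return i_obs - bias, se
```

For two independent series the naive estimate is positive purely through binning noise; subtracting the bias term pulls it back toward zero, which is the effect that makes mi differ from an uncorrected estimator.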