Pointwise Mutual Information
Text::NSP::Measures::2D::MI::pmi is a Perl module that implements Pointwise Mutual Information. SYNOPSIS Basic Usage use Text::NSP::Measures::2D::MI::pmi; my $npp = 60; my $n1p = 20; my $np1 = 20; my $n11 = 10; $pmi_value = calculateStatistic( n11=>$n11, n1p=>$n1p, np1=>$np1,...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (93): Text::NSP::Measures::2D::MI::pmi Download |
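The statistic suggested by the synopsis parameters above can be sketched in Python. This is a hedged illustration of the standard PMI definition, not a transcription of the module's source; the base-2 log and the expected-count formula are assumptions:

```python
import math

def pmi(n11, n1p, np1, npp):
    """Pointwise mutual information from a 2x2 contingency table:
    n11 = joint bigram count, n1p/np1 = marginal counts, npp = total.
    Base-2 log is an assumption; check the module for its log base."""
    expected = n1p * np1 / npp          # expected joint count under independence
    return math.log2(n11 / expected)

# Values from the synopsis above
print(round(pmi(10, 20, 20, 60), 4))
```

A PMI above zero means the word pair co-occurs more often than independence would predict.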
Text::NSP::Measures::3D::MI::pmi is a Perl module that implements Pointwise Mutual Information for trigrams. SYNOPSIS Basic Usage use Text::NSP::Measures::3D::MI::pmi; $pmi_value = calculateStatistic( n111=>10, n1pp=>40, np1p=>45, npp1=>42, n11p=>20, n1p1=>23, np11=>21, nppp=>100);...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (92): Text::NSP::Measures::3D::MI::pmi Download |
This function estimates mutual information between two variables using a kernel density estimator.

Platforms: Matlab
License: Freeware | Size: 10 KB | Download (42): Mutual Information -2 variablle Download |
Text::NSP::Measures::3D::MI is a Perl module that provides error checks and a framework for implementing Loglikelihood, Total Mutual Information, Pointwise Mutual Information, and the Poisson-Stirling Measure for trigrams. SYNOPSIS Basic Usage use Text::NSP::Measures::3D::MI::ll; $ll_value =...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (98): Text::NSP::Measures::3D::MI Download |
Normalized mutual information is often used for evaluating clustering results, information retrieval, feature selection, etc. This is an optimized implementation of the function that contains no for loops.
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (44): Normalized Mutual Information Download |
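A vectorized (loop-free over the data) Python sketch of normalized mutual information between two label vectors. The normalization by sqrt(H(X)·H(Y)) is one common convention and an assumption here; toolboxes differ (some divide by the max or the mean of the entropies):

```python
import numpy as np

def nmi(x, y):
    """Normalized mutual information between two label vectors,
    NMI = I(X;Y) / sqrt(H(X) * H(Y)) -- one common normalization."""
    x = np.asarray(x); y = np.asarray(y)
    n = x.size
    # joint distribution via a flattened 2-D histogram over label pairs
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    pxy = np.bincount(xi * ys.size + yi, minlength=xs.size * ys.size)
    pxy = pxy.reshape(xs.size, ys.size) / n
    px = pxy.sum(axis=1); py = pxy.sum(axis=0)
    nz = pxy > 0                              # 0*log(0) = 0 convention
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return mi / np.sqrt(hx * hy)

# Identical clusterings (up to label permutation) give NMI = 1
print(round(nmi([0, 0, 1, 1], [1, 1, 0, 0]), 4))
```

NMI is invariant to label permutation, which is why it suits clustering evaluation.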
Calculates windowed mutual information between two signals up to a pre-defined lag. The estimation of the (joint) probabilities is optimized, so the entire computation is very fast.
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (40): Windowed mutual information (migram) Download |
The definition of mutual information can be found on Wikipedia: http://en.wikipedia.org/wiki/Mutual_information For marginal mutual information: I(A,B) = sum sum P(A,B) log[P(A,B)/(P(A)P(B))]. For conditional mutual information: I(A,B|C) = sum sum P(A,B|C)...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (51): Mutual Information In probability theory and information theory Download |
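The marginal formula quoted above can be checked with a few lines of Python over a discrete joint distribution (the dict representation and function name are illustrative, not from the package):

```python
import math

def mutual_information(pab):
    """I(A,B) = sum_a sum_b P(a,b) * log[P(a,b) / (P(a)P(b))],
    with pab a dict {(a, b): probability} and 0*log(0) taken as 0."""
    pa, pb = {}, {}
    for (a, b), p in pab.items():
        pa[a] = pa.get(a, 0.0) + p          # marginal of A
        pb[b] = pb.get(b, 0.0) + p          # marginal of B
    return sum(p * math.log(p / (pa[a] * pb[b]))
               for (a, b), p in pab.items() if p > 0)

# Two independent fair coins -> I(A,B) = 0
joint = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
print(mutual_information(joint))
```

For perfectly correlated variables the same function returns H(A), the full entropy of one variable.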
Very fast implementation of average mutual information. Usage: [v,lag]=ami(x,y,lag) Calculates the average mutual information of x and y with a possible lag. v is the average mutual information (relative units; see below). x & y are the time series (column vectors). lag is a vector of time lags....
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (42): Average mutual information Download |
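The lag mechanic described in the usage above can be sketched in Python with a histogram-based estimator. This is an assumption-laden sketch: the Matlab ami's binning and "relative units" normalization are not reproduced here:

```python
import numpy as np

def ami(x, y, lags, bins=16):
    """Average mutual information between x and y at each non-negative
    time lag, estimated from binned joint histograms."""
    x = np.ravel(x); y = np.ravel(y)
    out = []
    for lag in lags:
        a, b = x[:x.size - lag], y[lag:]      # align x(t) with y(t+lag)
        pab, _, _ = np.histogram2d(a, b, bins=bins)
        pab /= pab.sum()                      # joint probabilities
        pa = pab.sum(axis=1, keepdims=True)
        pb = pab.sum(axis=0, keepdims=True)
        nz = pab > 0                          # 0*log(0) = 0 convention
        out.append(float(np.sum(pab[nz] * np.log2(pab[nz] / (pa @ pb)[nz]))))
    return np.array(out)

rng = np.random.default_rng(1)
x = rng.random(5000)
v = ami(x, np.roll(x, 3), [0, 3])
print(v[1] > v[0])   # dependence peaks at the true lag of 3
```

Scanning the lag axis this way is a standard trick for finding the delay at which two signals are most strongly coupled.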
Nowadays there are heaps of articles on the theory of fuzzy entropy and fuzzy mutual information. However, there is a clear lack of a Matlab implementation of these concepts. Based on numerous requests from students and researchers, I have prepared this code to simplify such concepts...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (44): Fuzzy Entropy and Mutual Information Download |
Mutual information I(X,Y) measures the degree of dependence (in terms of probability theory) between two random variables X and Y. It is non-negative and equal to zero when X and Y are mutually independent. Conditional mutual information I(X,Y|Z) is the expected value of I(X,Y) given the value of...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (48): Kernel estimate for (Conditional) Mutual Information Download |
Text::NSP::Measures::2D::MI is a Perl module that provides error checks for the Loglikelihood, Total Mutual Information, Pointwise Mutual Information, and Poisson-Stirling measures. SYNOPSIS Basic Usage use Text::NSP::Measures::2D::MI::ll; my $npp = 60; my $n1p = 20; my $np1 = 20; my $n11 = 10;...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (97): Text::NSP::Measures::2D::MI Download |
Usage: I=mi(A,B), where A and B are equally sized images/signals. Function hist2 (included) is used to determine the joint histogram of the images/signals. All histograms use 256 bins. Assumptions: 1) 0*log(0)=0, 2) mutual information is obtained on the intersection between the supports of partial...
Platforms: Matlab
License: Freeware | Size: 10 KB | Download (42): Mutual information of two images or signals Download |
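Assuming the joint-histogram estimator described above (with numpy.histogram2d standing in for the bundled hist2 helper), a minimal Python sketch might look like:

```python
import numpy as np

def mi(a, b, bins=256):
    """Mutual information of two equally sized signals/images, estimated
    from their joint histogram, with the 0*log(0) = 0 convention."""
    a = np.ravel(a); b = np.ravel(b)
    pab, _, _ = np.histogram2d(a, b, bins=bins)
    pab /= pab.sum()                       # joint probabilities
    pa = pab.sum(axis=1, keepdims=True)    # marginal of a
    pb = pab.sum(axis=0, keepdims=True)    # marginal of b
    nz = pab > 0                           # 0*log(0) = 0
    return float(np.sum(pab[nz] * np.log2(pab[nz] / (pa @ pb)[nz])))

# A signal carries more information about itself than about noise
rng = np.random.default_rng(0)
x = rng.random(10_000)
print(mi(x, x, bins=16) > mi(x, rng.random(10_000), bins=16))
```

Note that histogram estimators are biased upward for finite samples, which is one reason image-registration code often compares MI values rather than treating them as absolute.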
Text::NSP::Measures::2D::MI::tmi is a Perl module that implements True Mutual Information. SYNOPSIS Basic Usage use Text::NSP::Measures::2D::MI::tmi; my $npp = 60; my $n1p = 20; my $np1 = 20; my $n11 = 10; $tmi_value = calculateStatistic( n11=>$n11, n1p=>$n1p, np1=>$np1, npp=>$npp);...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (91): Text::NSP::Measures::2D::MI::tmi Download |
Text::NSP::Measures::3D::MI::tmi is a Perl module that implements True Mutual Information for trigrams. SYNOPSIS Basic Usage use Text::NSP::Measures::3D::MI::tmi; $tmi_value = calculateStatistic( n111=>10, n1pp=>40, np1p=>45, npp1=>42, n11p=>20, n1p1=>23, np11=>21, nppp=>100); if(...
Platforms: *nix
License: Freeware | Size: 952.32 KB | Download (87): Text::NSP::Measures::3D::MI::tmi Download |
Usage: I=mi(A,B), where A and B are equally sized images/signals. Function hist2 (included) is used to determine the joint histogram of the images/signals. Assumptions: 1) 0*log(0)=0, 2) mutual information is obtained on the intersection between the supports of partial histograms. Example (in...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (39): Fast mutual information of two images or signals Download |
This package is the mRMR (minimum-redundancy maximum-relevancy) feature selection method in (Peng et al, 2005 and Ding & Peng, 2005, 2003), whose better performance over the conventional top-ranking method has been demonstrated on a number of data sets in recent publications. This version uses...
Platforms: Matlab
License: Freeware | Size: 532.48 KB | Download (43): mRMR Feature Selection (using mutual information computation) Download |
Functions for information theory, such as entropy, mutual information, KL divergence, etc. This toolbox contains functions for discrete random variables to compute the following quantities: 1) Entropy, 2) Joint entropy, 3) Conditional entropy, 4) Relative entropy (KL divergence), 5) Mutual information, 6) Normalized...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (44): Information Theory Toolbox Download |
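Two of the quantities listed above, entropy and relative entropy, can be sketched in a few lines of Python for discrete distributions (function names are illustrative, not the toolbox's API):

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log2(p_i) in bits, 0*log(0) = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum p_i log2(p_i / q_i); requires
    q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(entropy([0.5, 0.5]))                    # fair coin: 1 bit
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # positive: the coins differ
```

The remaining toolbox quantities (joint, conditional, and mutual information) are all combinations of entropies over joint and marginal distributions.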
NSB entropy and mutual information estimator; applications to the analysis of neural code.
Platforms: Mac, BSD, Linux
License: Freeware | Size: 48.22 KB | Download (45): NSB Entropy Estimation Download |
The Measures of Analysis of Time Series (MATS) toolkit computes a number of different measures for the analysis of scalar time series (linear, nonlinear, and other statistical measures). It also contains pre-processing tools (transformations and standardizations), a data-splitting facility, resampled data...
Platforms: Matlab
License: Freeware | Size: 3.41 MB | Download (63): Measures of Analysis of Time Series toolkit (MATS) Download |
To find the channel capacity one must maximize the mutual information with respect to the discrete set of source-symbol probabilities 'c', for a given transition probability matrix. An efficient algorithm for finding the channel capacity Cc was derived in 1972 by Blahut and independently...
Platforms: Matlab
License: Shareware | Cost: $0.00 USD | Size: 10 KB | Download (39): Channel capacity using Arimoto-Blahut Algorithm Download |
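The alternating-maximization iteration described above can be sketched in Python. This is a textbook form of the Blahut-Arimoto recursion (base-2 logs, fixed iteration count), not the Matlab package's code:

```python
import numpy as np

def blahut_arimoto(P, iters=200):
    """Channel capacity (bits/use) of a discrete memoryless channel.
    P[x, y] = transition probability p(y|x); rows must each sum to 1."""
    P = np.asarray(P, dtype=float)
    p = np.full(P.shape[0], 1.0 / P.shape[0])    # start from uniform input
    for _ in range(iters):
        q = p @ P                                # induced output distribution
        # per-symbol gain c_x = 2**D(P(.|x) || q), with 0*log(0) = 0
        with np.errstate(divide='ignore', invalid='ignore'):
            d = np.where(P > 0, P * np.log2(P / q), 0.0).sum(axis=1)
        c = 2.0 ** d
        p = p * c / np.sum(p * c)                # reweight input distribution
    return float(np.log2(np.sum(p * c)))

# Binary symmetric channel, crossover 0.1: C = 1 - H2(0.1) ~ 0.531 bits
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(round(blahut_arimoto(bsc), 3))
```

Each iteration tightens a lower bound on capacity, and the bound converges to C for any channel matrix with positive row sums.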