Abstract
The dependence between random variables may be measured by mutual information. However, estimating mutual information is difficult, since estimating the joint probability density function (PDF) of non-Gaussian data is a hard problem. Copulas offer a natural approach: by Sklar's theorem, the joint PDF of random variables can be expressed as the product of the associated copula density function and the marginal PDFs, so the mutual information depends only on the copula density. Experiments demonstrate that the proposed copula-based mutual information is much more accurate than conventional estimators such as the joint-histogram and Parzen-window methods widely used in image processing.
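The factorisation the abstract describes means the marginal PDFs cancel out of the mutual information, leaving only the entropy of the copula density. A minimal sketch of this idea, assuming a Gaussian copula fitted to rank-transformed data (the letter itself may use a different copula family or density estimator; the function name and sample sizes here are illustrative):

```python
import numpy as np
from scipy.stats import norm

def copula_mutual_information(x, y):
    """Estimate I(X;Y) under a Gaussian-copula assumption.

    Sklar's theorem factors the joint density as
    f(x, y) = c(F_X(x), F_Y(y)) * f_X(x) * f_Y(y),
    so I(X;Y) reduces to the negative entropy of the copula
    density c alone -- the marginals cancel.  For a Gaussian
    copula with correlation rho this is -0.5 * log(1 - rho^2).
    """
    n = len(x)
    # Rank-transform to pseudo-observations on (0, 1); this
    # removes any dependence on the (unknown) marginal PDFs.
    u = (np.argsort(np.argsort(x)) + 1) / (n + 1)
    v = (np.argsort(np.argsort(y)) + 1) / (n + 1)
    # Map to normal scores and estimate the copula correlation.
    z1, z2 = norm.ppf(u), norm.ppf(v)
    rho = np.corrcoef(z1, z2)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

# Sanity check on bivariate Gaussian data, where the true MI
# is known in closed form.
rng = np.random.default_rng(0)
rho_true = 0.8
cov = [[1.0, rho_true], [rho_true, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=20000).T
mi_hat = copula_mutual_information(x, y)
mi_true = -0.5 * np.log(1 - rho_true ** 2)  # about 0.51 nats
```

Because only ranks enter the estimator, the same estimate is obtained after any monotone transformation of the marginals, which is exactly the robustness to non-Gaussian marginals that motivates the copula approach.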
| Original language | English |
|---|---|
| Pages (from-to) | 493-494 |
| Number of pages | 2 |
| Journal | Electronics Letters |
| Volume | 47 |
| Issue number | 8 |
| DOIs | |
| Publication status | Published - 14 Apr 2011 |
Keywords
- estimation
- mutual information
- copula
- density function
- Gaussian processes
- signal processing
- information theory