Estimation of mutual information using copula density function

X. Zeng, T. S. Durrani

Research output: Contribution to journal › Article › peer-review

41 Citations (Scopus)


The dependence between random variables may be measured by mutual information. However, estimating mutual information is difficult, since it requires estimating the joint probability density function (PDF) of the data, which is a hard problem for non-Gaussian distributions. Copulas offer a natural approach: the joint PDF of a set of random variables can be expressed as the product of the associated copula density function and the marginal PDFs, so the mutual information reduces to a functional of the copula density alone. Experiments demonstrate that the proposed copula-based mutual information estimator is considerably more accurate than conventional methods, such as the joint-histogram and Parzen-window estimators widely used in image processing.
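The factorisation described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' estimator: it assumes a Gaussian copula, for which the mutual information has the closed form -0.5·log(1 − ρ²), where ρ is the correlation of the normal scores. The function name `copula_mi_gaussian` and the simulated data are hypothetical.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copula_mi_gaussian(x, y):
    """Estimate mutual information via a Gaussian copula (illustrative
    sketch, not the paper's method). Since the joint PDF factorises as
    c(u, v) * f(x) * f(y), the MI depends only on the copula density c;
    for a Gaussian copula it equals -0.5 * log(1 - rho**2)."""
    n = len(x)
    # Empirical marginal CDFs (rank transform) map each variable to
    # approximately uniform [0, 1] values, stripping out the marginals
    # exactly as the copula factorisation requires.
    u = rankdata(x) / (n + 1)
    v = rankdata(y) / (n + 1)
    # Normal scores: all remaining dependence lives in the copula.
    zx, zy = norm.ppf(u), norm.ppf(v)
    rho = np.corrcoef(zx, zy)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)  # correlation 0.8 by construction
mi = copula_mi_gaussian(x, y)
print(mi)  # close to -0.5*log(1 - 0.8**2) ~ 0.51 for large samples
```

Because only the ranks of the data enter the estimate, the result is invariant to monotone transformations of each variable, which is one reason copula-based estimators are attractive compared with histogram or Parzen-window approaches that must model the full joint PDF.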
Original language: English
Pages (from-to): 493-494
Number of pages: 2
Journal: Electronics Letters
Issue number: 8
Publication status: Published - 14 Apr 2011


  • estimation
  • mutual information
  • copula
  • density function
  • Gaussian processes
  • signal processing
  • information theory


