Probability density decomposition for conditionally dependent random variables modeled by Vines

T.J. Bedford, R. Cooke

Research output: Contribution to journal › Article

331 Citations (Scopus)

Abstract

A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.
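To make the density decomposition concrete, the following is a minimal sketch (not code from the paper) of a three-variable canonical-vine density. It assumes standard-normal margins and Gaussian pair-copulas throughout; the function names (`cvine_density`, the conditional-CDF helper `h`) and the correlation parameters are illustrative choices, not notation from the article.

```python
# Sketch: density of a 3-variable canonical vine with standard-normal
# margins and Gaussian pair-copulas c12, c13, and c23|1.
import math
from scipy.stats import norm

def gauss_copula_density(u, v, rho):
    # Bivariate Gaussian copula density c(u, v; rho).
    x, y = norm.ppf(u), norm.ppf(v)
    det = 1.0 - rho * rho
    return math.exp(-(rho * rho * (x * x + y * y) - 2.0 * rho * x * y)
                    / (2.0 * det)) / math.sqrt(det)

def h(u, v, rho):
    # Conditional CDF F(u | v) under a Gaussian pair-copula (the "h-function").
    return norm.cdf((norm.ppf(u) - rho * norm.ppf(v)) / math.sqrt(1.0 - rho * rho))

def cvine_density(x1, x2, x3, r12, r13, r23_1):
    # Canonical-vine factorization: marginals times tree-1 pair-copulas
    # (both conditioned on nothing) times the tree-2 copula conditioned on x1.
    u1, u2, u3 = norm.cdf(x1), norm.cdf(x2), norm.cdf(x3)
    margins = norm.pdf(x1) * norm.pdf(x2) * norm.pdf(x3)
    tree1 = gauss_copula_density(u1, u2, r12) * gauss_copula_density(u1, u3, r13)
    tree2 = gauss_copula_density(h(u2, u1, r12), h(u3, u1, r13), r23_1)
    return margins * tree1 * tree2
```

With all pair-copula correlations set to zero every copula density is 1, so the vine density reduces to the product of the three standard-normal marginals, as the decomposition requires.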
Language: English
Pages: 245-268
Number of pages: 23
Journal: Annals of Mathematics and Artificial Intelligence
Volume: 32
Issue number: 1
DOI: 10.1023/A:1016725902970
Publication status: Published - 2001


Keywords

  • probability
  • statistics
  • vine dependence
  • markov tree
  • management theory

Cite this

@article{34b7dcdf43bc4dd0b2ffa092221a84f1,
title = "Probability density decomposition for conditionally dependent random variables modeled by Vines",
abstract = "A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.",
keywords = "probability, statistics, vine dependence, markov tree, management theory",
author = "T.J. Bedford and R. Cooke",
year = "2001",
doi = "10.1023/A:1016725902970",
language = "English",
volume = "32",
pages = "245--268",
journal = "Annals of Mathematics and Artificial Intelligence",
issn = "1012-2443",
number = "1",

}

Probability density decomposition for conditionally dependent random variables modeled by Vines. / Bedford, T.J.; Cooke, R.

In: Annals of Mathematics and Artificial Intelligence, Vol. 32, No. 1, 2001, p. 245-268.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Probability density decomposition for conditionally dependent random variables modeled by Vines

AU - Bedford, T.J.

AU - Cooke, R.

PY - 2001

Y1 - 2001

N2 - A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.

AB - A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.

KW - probability

KW - statistics

KW - vine dependence

KW - markov tree

KW - management theory

UR - http://dx.doi.org/10.1023/A:1016725902970

U2 - 10.1023/A:1016725902970

DO - 10.1023/A:1016725902970

M3 - Article

VL - 32

SP - 245

EP - 268

JO - Annals of Mathematics and Artificial Intelligence

T2 - Annals of Mathematics and Artificial Intelligence

JF - Annals of Mathematics and Artificial Intelligence

SN - 1012-2443

IS - 1

ER -