### Abstract

A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.

| | |
|---|---|
| Language | English |
| Pages | 245-268 |
| Number of pages | 23 |
| Journal | *Annals of Mathematics and Artificial Intelligence* |
| Volume | 32 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1023/A:1016725902970 |
| Publication status | Published - 2001 |
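For orientation (an illustrative sketch, not part of the publication record): for three variables, a vine density decomposition of the kind the abstract describes takes the pair-copula form below, where the $f_i$ and $F_i$ are marginal densities and CDFs and each $c$ is a bivariate copula density (notation assumed, following the vine literature rather than quoted from the paper):

```latex
f(x_1, x_2, x_3)
  = f_1(x_1)\, f_2(x_2)\, f_3(x_3)\,
    c_{12}\bigl(F_1(x_1), F_2(x_2)\bigr)\,
    c_{23}\bigl(F_2(x_2), F_3(x_3)\bigr)\,
    c_{13\mid 2}\bigl(F_{1\mid 2}(x_1 \mid x_2),\, F_{3\mid 2}(x_3 \mid x_2)\bigr)
```

Each tree of the vine contributes one layer of (conditional) copula factors; with all copula densities identically 1 this collapses to the independent case, and restricting which conditional copulas are non-trivial recovers the Markov-tree factorization.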

### Keywords

- probability
- statistics
- vine dependence
- Markov tree
- management theory

### Cite this

Bedford, T.J., & Cooke, R. (2001). Probability density decomposition for conditionally dependent random variables modeled by Vines. *Annals of Mathematics and Artificial Intelligence*, 32(1), 245-268. https://doi.org/10.1023/A:1016725902970

Research output: Contribution to journal › Article

TY - JOUR

T1 - Probability density decomposition for conditionally dependent random variables modeled by Vines

AU - Bedford, T.J.

AU - Cooke, R.

PY - 2001

Y1 - 2001

N2 - A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.

AB - A vine is a new graphical model for dependent random variables. Vines generalize the Markov trees often used in modeling multivariate distributions. They differ from Markov trees and Bayesian belief nets in that the concept of conditional independence is weakened to allow for various forms of conditional dependence. A general formula for the density of a vine dependent distribution is derived. This generalizes the well-known density formula for belief nets based on the decomposition of belief nets into cliques. Furthermore, the formula allows a simple proof of the Information Decomposition Theorem for a regular vine. The problem of (conditional) sampling is discussed, and Gibbs sampling is proposed to carry out sampling from conditional vine dependent distributions. The so-called ‘canonical vines’ built on highest degree trees offer the most efficient structure for Gibbs sampling.

KW - probability

KW - statistics

KW - vine dependence

KW - Markov tree

KW - management theory

UR - http://dx.doi.org/10.1023/A:1016725902970

U2 - 10.1023/A:1016725902970

DO - 10.1023/A:1016725902970

M3 - Article

VL - 32

SP - 245

EP - 268

JO - Annals of Mathematics and Artificial Intelligence

T2 - Annals of Mathematics and Artificial Intelligence

JF - Annals of Mathematics and Artificial Intelligence

SN - 1012-2443

IS - 1

ER -
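The abstract proposes Gibbs sampling for conditional vine-dependent distributions. As a minimal, generic sketch of the Gibbs technique itself — a standard bivariate normal stands in for a vine distribution, and `gibbs_bivariate_normal` and all parameters are illustrative assumptions, not from the paper:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=10000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Both full conditionals are univariate normal:
    X | Y = y ~ N(rho * y, 1 - rho**2), and symmetrically for Y | X,
    so each Gibbs step is a one-dimensional draw.
    """
    rng = random.Random(seed)
    s = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    x = y = 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, s)    # refresh X from its full conditional
        y = rng.gauss(rho * x, s)    # refresh Y from its full conditional
        samples.append((x, y))
    return samples

# Usage: draw a chain, discard burn-in, estimate the correlation.
chain = gibbs_bivariate_normal(rho=0.8, n_iter=20000)[2000:]
xs, ys = zip(*chain)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
print(round(cov / (sx * sy), 2))   # empirical correlation, near the target 0.8
```

In the paper's setting the full conditionals would come from the vine's copula specification rather than a Gaussian, and the abstract notes that canonical vines (built on highest-degree trees) make these conditionals cheapest to evaluate.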