### Abstract

The majority of work in similarity search focuses on the efficiency of threshold and nearest-neighbour queries. Similarity join has been less well studied, although efficient indexing algorithms have been shown. The multi-way similarity join, extending similarity join to multiple spaces, has received relatively little treatment.

Here we present a novel metric designed to assess a notion of mutual similarity over multiple vectors, thus extending pairwise distance to a more general notion taken over a set of values. In outline, given a set of values X, our function gives a single numeric outcome D(X) rather than calculating some compound function over all pairwise distances d(x, y) for x, y ∈ X. D(X) is strongly correlated with various compound functions, but costs only a little more than a single distance to evaluate. It is derived from an information-theoretic distance metric; it correlates strongly with this metric, and also with other metrics, in high-dimensional spaces. Although we are at an early stage in its investigation, we believe it could potentially be used to help construct more efficient indexes, or to construct indexes more efficiently.

The contribution of this short paper is simply to identify the function, to show that it has useful semantic properties, and to show also that it is surprisingly cheap to evaluate. We expect uses of the function in the domain of similarity search to follow.
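The abstract does not reproduce the paper's defining formula. As a purely illustrative sketch, a generalized Jensen-Shannon divergence is one well-known information-theoretic measure with the cost profile described above: for n vectors it requires only n + 1 entropy evaluations, rather than the O(n²) pairwise comparisons of a compound function. The function names below are hypothetical and this is not claimed to be the authors' construction.

```python
import math

def entropy(p):
    """Shannon entropy (natural log) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def multiway_divergence(vectors):
    """Generalized Jensen-Shannon divergence over n probability vectors:
    H(mean) minus the mean of the individual entropies. This needs n + 1
    entropy evaluations in total, i.e. only a little more work than the
    two evaluations underlying a single pairwise divergence."""
    n = len(vectors)
    dim = len(vectors[0])
    mean = [sum(v[i] for v in vectors) / n for i in range(dim)]
    return entropy(mean) - sum(entropy(v) for v in vectors) / n

a = [0.5, 0.5, 0.0]
b = [0.0, 0.5, 0.5]
c = [0.25, 0.5, 0.25]
print(multiway_divergence([a, a, a]))  # 0.0 for identical vectors
print(multiway_divergence([a, b, c]))  # positive once the vectors diverge
```

The single outcome D(X) here is zero exactly when all vectors coincide and grows as the set spreads out, mirroring the behaviour a compound of pairwise distances would report at much higher cost.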


Original language | English |
---|---|
Title of host publication | Similarity Search and Applications |
Subtitle of host publication | 6th International Conference, SISAP 2013, A Coruña, Spain, October 2-4, 2013, Proceedings |
Editors | Nieves Brisaboa, Oscar Pedreira, Pavel Zezula |
Pages | 169-174 |
Number of pages | 6 |
DOIs | https://doi.org/10.1007/978-3-642-41062-8_17 |
Publication status | Published - 2 Oct 2013 |
Event | 6th International Conference on Similarity Search and Applications, SISAP 2013 - Hotel Riazor, A Coruña, Spain. Duration: 2 Oct 2013 → 4 Oct 2013 |

### Publication series

Name | Lecture Notes in Computer Science |
---|---|
Publisher | Springer Berlin Heidelberg |
Volume | 8199 |
ISSN (Print) | 0302-9743 |

### Conference

Conference | 6th International Conference on Similarity Search and Applications, SISAP 2013 |
---|---|
Country | Spain |
City | A Coruña |
Period | 2/10/13 → 4/10/13 |

### Keywords

- distance metric
- multi-way divergence
- similarity join
- mutual similarity
- multiple vectors
- indexing algorithms

## Cite this

Moss, R., & Connor, R. (2013). A multi-way divergence metric for vector spaces. In N. Brisaboa, O. Pedreira, & P. Zezula (Eds.), *Similarity Search and Applications: 6th International Conference, SISAP 2013, A Coruña, Spain, October 2-4, 2013, Proceedings* (pp. 169-174). (Lecture Notes in Computer Science; Vol. 8199). https://doi.org/10.1007/978-3-642-41062-8_17