Recognizing semantic relations by combining transformers and fully connected models

Dmitri Roussinov, Serge Sharoff, Nadezhda Puchnina

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution book


Abstract

Automatically recognizing an existing semantic relation (e.g. "is a", "part of", "property of", "opposite of", etc.) between two words (phrases, concepts, etc.) is an important task affecting many NLP applications and has been the subject of extensive experimentation and modeling. Current approaches to automatically determining whether a relation exists between two given concepts X and Y can be grouped into two types: 1) those modeling the word paths connecting X and Y in text, and 2) those modeling the distributional properties of X and Y separately, not necessarily in proximity to each other. Here, we investigate how both types can be improved and combined. We suggest a distributional approach based on an attention-based transformer. We have also developed a novel word-path model that combines useful properties of a convolutional network with a fully connected language model. While our transformer-based approach performs better, both of our models significantly outperform the state of the art within their respective classes of approaches. We also demonstrate that combining the two approaches yields additional gains, since they draw on somewhat different data sources.
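
The abstract gives no implementation details; as a rough illustration of the distributional, transformer-based type of approach it describes, the sketch below encodes a concept pair (X, Y) as a sentence pair with a pretrained transformer and scores candidate relations with a fully connected head. The encoder name (bert-base-uncased), the relation label set, and the pooling choice are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch (assumed setup, not the paper's exact architecture):
# classify the semantic relation between a concept pair (X, Y) by encoding
# the pair with a pretrained transformer and applying a fully connected head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

# Assumed label set for illustration.
RELATIONS = ["is a", "part of", "property of", "opposite of", "no relation"]

class PairRelationClassifier(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased", num_labels=len(RELATIONS)):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Fully connected classification head over the pooled pair representation.
        self.head = nn.Sequential(
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_labels),
        )

    def forward(self, input_ids, attention_mask, token_type_ids=None):
        out = self.encoder(input_ids=input_ids,
                           attention_mask=attention_mask,
                           token_type_ids=token_type_ids)
        cls = out.last_hidden_state[:, 0]   # [CLS] vector as the pair representation
        return self.head(cls)               # unnormalized scores over relation labels

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = PairRelationClassifier()

# Encode the concept pair (X, Y), e.g. X = "sparrow", Y = "bird", as a sentence pair.
batch = tokenizer("sparrow", "bird", return_tensors="pt")
with torch.no_grad():
    logits = model(**batch)
print(RELATIONS[logits.argmax(dim=-1).item()])  # untrained head, so the prediction is arbitrary
```

In practice such a classifier would be trained on labelled concept pairs; the paper additionally combines this distributional view with a word-path model, which is not sketched here.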

Original language: English
Title of host publication: LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings
Editors: Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Place of publication: Luxembourg
Pages: 5838-5845
Number of pages: 8
ISBN (Electronic): 9791095546344
Publication status: Published - 31 May 2020
Event: 12th International Conference on Language Resources and Evaluation, LREC 2020 - Marseille, France
Duration: 11 May 2020 - 16 May 2020

Publication series

Name: LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings

Conference

Conference: 12th International Conference on Language Resources and Evaluation, LREC 2020
Country/Territory: France
City: Marseille
Period: 11/05/20 - 16/05/20

Keywords

  • semantic relations
  • combining transformers
  • fully connected models
  • natural language processing
  • NLP
  • word path model
  • distributional approach
  • attention-based transformer
  • convolutional network
