Aligning brain activity with advanced transformer models: exploring the role of punctuation in semantic processing

Research output: Working Paper/Preprint


Abstract

This research examines the congruence between neural activity and advanced transformer models, emphasizing the semantic significance of punctuation in text understanding. Using an approach originally proposed by Toneva and Wehbe, we evaluate four advanced transformer models (RoBERTa, DistilBERT, ALBERT, and ELECTRA) against neural activity data. Our findings indicate that RoBERTa exhibits the closest alignment with neural activity, surpassing BERT in accuracy. Furthermore, we investigate the impact of punctuation removal on model performance and neural alignment, revealing that BERT's accuracy improves in the absence of punctuation. This study contributes to our understanding of how neural networks represent language and of the influence of punctuation on semantic processing in the human brain.
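The evaluation style described above, in the Toneva and Wehbe tradition, fits a linear encoding model from a language model's word representations to recorded brain responses and scores it on held-out data. The sketch below illustrates that general idea with entirely synthetic data and ridge regression; the array shapes, the regularization value, and the data itself are illustrative assumptions, not the paper's actual pipeline or results.

```python
import numpy as np

# Illustrative encoding-model sketch (synthetic data, not the paper's pipeline):
# fit a linear map from word embeddings to voxel responses with ridge
# regression, then score held-out prediction accuracy per voxel. In the study,
# the embeddings would come from a transformer (with or without punctuation
# tokens in the input); here they are random stand-ins.

rng = np.random.default_rng(0)

n_words, emb_dim, n_voxels = 200, 32, 50
X = rng.standard_normal((n_words, emb_dim))        # stand-in for transformer embeddings
true_w = rng.standard_normal((emb_dim, n_voxels))  # synthetic ground-truth mapping
Y = X @ true_w + 0.1 * rng.standard_normal((n_words, n_voxels))  # synthetic "fMRI" data

# Train/test split
X_tr, X_te = X[:150], X[150:]
Y_tr, Y_te = Y[:150], Y[150:]

# Ridge regression in closed form: W = (X'X + alpha*I)^(-1) X'Y
alpha = 1.0
W = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(emb_dim), X_tr.T @ Y_tr)

# Per-voxel Pearson correlation between predicted and held-out responses
pred = X_te @ W
r = np.array([np.corrcoef(pred[:, v], Y_te[:, v])[0, 1] for v in range(n_voxels)])
mean_r = float(r.mean())
print(f"mean held-out correlation: {mean_r:.3f}")
```

Comparing such held-out correlations across models, or across inputs with and without punctuation, is one common way to quantify which representation aligns best with neural activity.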
Original language: English
Place of publication: Ithaca, NY
Number of pages: 15
DOIs
Publication status: Published - 16 Jan 2025

Keywords

  • neural activity
  • advanced transformer models
  • semantic processing
