Scalable cross-lingual transfer of neural sentence embeddings

Hanan Aldarmaki, Mona Diab

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

We develop and investigate several cross-lingual alignment approaches for neural sentence embedding models, such as the supervised inference classifier, InferSent, and sequential encoder-decoder models. We evaluate three alignment frameworks applied to these models: joint modeling, representation transfer learning, and sentence mapping, using parallel text to guide the alignment. Our results support representation transfer as a scalable approach for modular cross-lingual alignment of neural sentence embeddings, where we observe better performance compared to joint models in intrinsic and extrinsic evaluations, particularly with smaller sets of parallel data.
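As a rough illustration of the "sentence mapping" framework mentioned above, one can learn a linear projection that maps source-language sentence embeddings onto their target-language counterparts, supervised by embeddings of parallel sentences. The sketch below uses synthetic data in place of real encoder outputs; the dimensions, variable names, and least-squares formulation are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

# Hypothetical sketch of sentence mapping: learn a linear map W such that
# X_src @ W approximates X_tgt, where the rows are sentence embeddings of
# parallel sentences. Real usage would encode parallel text with a
# pretrained sentence encoder (e.g. InferSent); here we use synthetic data.
rng = np.random.default_rng(0)
d = 300            # embedding dimensionality (assumed)
n_parallel = 1000  # number of parallel sentence pairs (assumed)

# Stand-ins for source- and target-language sentence embeddings.
X_src = rng.normal(size=(n_parallel, d))
W_true = rng.normal(size=(d, d)) / np.sqrt(d)
X_tgt = X_src @ W_true + 0.01 * rng.normal(size=(n_parallel, d))

# Fit the mapping by ordinary least squares: min_W ||X_src W - X_tgt||^2
W, *_ = np.linalg.lstsq(X_src, X_tgt, rcond=None)

# Project source embeddings into the target space and check alignment.
mapped = X_src @ W
err = np.linalg.norm(mapped - X_tgt) / np.linalg.norm(X_tgt)
print(f"relative alignment error: {err:.3f}")
```

Because the projection is learned post hoc from a pretrained encoder's outputs, this kind of mapping needs only a modest amount of parallel text, which is the scalability argument the abstract makes for transfer-style alignment over joint training.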

Original language: English
Title of host publication: *SEM@NAACL-HLT 2019 - 8th Joint Conference on Lexical and Computational Semantics
Publisher: Association for Computational Linguistics (ACL)
Pages: 51-60
Number of pages: 10
ISBN (Electronic): 9781948087933
Publication status: Published - 2019
Externally published: Yes
Event: 8th Joint Conference on Lexical and Computational Semantics, *SEM@NAACL-HLT 2019 - Minneapolis, United States
Duration: Jun 6, 2019 - Jun 7, 2019

Publication series

Name: *SEM@NAACL-HLT 2019 - 8th Joint Conference on Lexical and Computational Semantics

Conference

Conference: 8th Joint Conference on Lexical and Computational Semantics, *SEM@NAACL-HLT 2019
Country/Territory: United States
City: Minneapolis
Period: 6/6/19 - 6/7/19

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
