Semantic Importance-Aware Communications Using Pre-Trained Language Models

Shuaishuai Guo, Yanhu Wang, Shujing Li, Nasir Saeed

Research output: Contribution to journal › Article › peer-review

36 Citations (Scopus)

Abstract

This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT, BERT, etc.). Specifically, we propose a cross-layer design in which a pre-trained language model is embedded in, or connected to, a cross-layer manager. The pre-trained language model quantifies the semantic importance of data frames, and based on the quantified importance we investigate semantic importance-aware power allocation. Unlike existing deep joint source-channel coding (Deep-JSCC)-based semantic communication schemes, SIAC can be embedded directly into current communication systems by introducing only a cross-layer manager. Experimental results show that the proposed SIAC scheme achieves lower semantic loss than existing equal-priority communications.
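The letter does not publish code, but the idea in the abstract, scoring semantic importance with a pre-trained language model and weighting transmit power by that score, can be sketched. The snippet below is a minimal illustration under stated assumptions, not the authors' method: it uses masked-token predictability under bert-base-uncased as a stand-in importance metric (per token rather than per data frame) and a simple importance-proportional allocation under a total-power budget P_total. The function names and the metric itself are hypothetical.

```python
# Illustrative sketch (not the letter's published method): score semantic
# importance with a pre-trained masked language model, then allocate transmit
# power proportionally. Requires: pip install torch transformers
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()


def semantic_importance(text: str) -> list[tuple[str, float]]:
    """Assumed metric: 1 - p(original token | context) with the token masked.
    Tokens the model cannot predict from context carry more information."""
    enc = tokenizer(text, return_tensors="pt")
    ids = enc["input_ids"][0]
    scores = []
    with torch.no_grad():
        for i in range(1, len(ids) - 1):          # skip [CLS] and [SEP]
            masked = enc["input_ids"].clone()
            masked[0, i] = tokenizer.mask_token_id
            logits = model(input_ids=masked,
                           attention_mask=enc["attention_mask"]).logits
            p = torch.softmax(logits[0, i], dim=-1)[ids[i]].item()
            scores.append((tokenizer.convert_ids_to_tokens(int(ids[i])),
                           1.0 - p))
    return scores


def allocate_power(scores, P_total: float):
    """Toy allocation: p_i = P_total * s_i / sum_j s_j (a real scheme would
    also account for channel state and an error-rate target)."""
    total = sum(s for _, s in scores) or 1.0
    return [(tok, P_total * s / total) for tok, s in scores]


if __name__ == "__main__":
    frame = "the fire alarm in building three is active"
    for tok, p in allocate_power(semantic_importance(frame), P_total=1.0):
        print(f"{tok:>12s}  power = {p:.3f}")
```

In the scheme described by the abstract, this scoring would live in the cross-layer manager, which attaches frame-level importance to outgoing data so the physical layer can scale power per frame; the letter's exact importance measure and allocation rule are given in the paper itself.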

Original language: English
Pages (from-to): 2328-2332
Number of pages: 5
Journal: IEEE Communications Letters
Volume: 27
Issue number: 9
Publication status: Published - Sept 1 2023

Keywords

  • data importance
  • power allocation
  • pre-trained language model
  • semantic communications

ASJC Scopus subject areas

  • Modelling and Simulation
  • Computer Science Applications
  • Electrical and Electronic Engineering
