TY - GEN
T1 - A Comparative Study of Transformer Based Pretrained AI Models for Content Summarization
AU - Rasheed, Ashika Sameem Abdul
AU - Masud, Mohammad Mehedy
AU - Abduljabbar, Mohammed
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - In this study, we examine different transformer-based pretrained Artificial Intelligence (AI) models on their ability to summarize text content from different sources. AI has emerged as a powerful tool in this context, offering the potential to automate and improve the process of content summarization. We focus mainly on pretrained transformer models such as Pegasus, T5, BART, and ProphetNet for key-point summarization of textual content. We assess the effectiveness of these models in summarizing different content types, including articles, instructions, and conversational dialogues, and compare and analyze their performance across different datasets. We use the ROUGE metric to evaluate the quality of the generated summaries. Facebook's BART model performed best across the different textual datasets. We believe our findings will offer valuable insights into the capabilities and limitations of transformer-based AI models for extracting essential points from large articles, making them useful as assistive tools for summarizing course content in educational environments.
AB - In this study, we examine different transformer-based pretrained Artificial Intelligence (AI) models on their ability to summarize text content from different sources. AI has emerged as a powerful tool in this context, offering the potential to automate and improve the process of content summarization. We focus mainly on pretrained transformer models such as Pegasus, T5, BART, and ProphetNet for key-point summarization of textual content. We assess the effectiveness of these models in summarizing different content types, including articles, instructions, and conversational dialogues, and compare and analyze their performance across different datasets. We use the ROUGE metric to evaluate the quality of the generated summaries. Facebook's BART model performed best across the different textual datasets. We believe our findings will offer valuable insights into the capabilities and limitations of transformer-based AI models for extracting essential points from large articles, making them useful as assistive tools for summarizing course content in educational environments.
KW - Artificial Intelligence
KW - Key Point Summarization
KW - Natural Language Processing
KW - Pretrained Language Models
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=85182937049&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85182937049&partnerID=8YFLogxK
U2 - 10.1109/IIT59782.2023.10366411
DO - 10.1109/IIT59782.2023.10366411
M3 - Conference contribution
AN - SCOPUS:85182937049
T3 - 2023 15th International Conference on Innovations in Information Technology, IIT 2023
SP - 79
EP - 84
BT - 2023 15th International Conference on Innovations in Information Technology, IIT 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 15th International Conference on Innovations in Information Technology, IIT 2023
Y2 - 14 November 2023 through 15 November 2023
ER -