Show simple item record

dc.contributor.author: Muia, Charles M.
dc.contributor.author: Oirere, Aaron M.
dc.contributor.author: Ndungu, Rachel N.
dc.date.accessioned: 2024-04-24T13:06:05Z
dc.date.available: 2024-04-24T13:06:05Z
dc.date.issued: 2024-04
dc.identifier.citation: International Journal of Advanced Trends in Computer Science and Engineering, 13(2), March-April 2024, 37-43
dc.identifier.issn: 2278-3091
dc.identifier.uri: https://doi.org/10.30534/ijatcse/2024/011322024
dc.identifier.uri: http://www.warse.org/IJATCSE/static/pdf/file/ijatcse011322024.pdf
dc.identifier.uri: http://repository.mut.ac.ke:8080/xmlui/handle/123456789/6437
dc.description.abstract: Transformer-based models such as GPT, T5, BART, and PEGASUS have made substantial progress in text summarization, a sub-domain of natural language processing that entails extracting important information from lengthy texts. The main objective of this research was to conduct a comparative analysis of these four transformer-based models based on their performance in text summarization of news articles. To achieve this objective, the transformer models, pre-trained on extensive datasets, were fine-tuned on the CNN/DailyMail dataset using a low learning rate to preserve the learned representations. The T5 transformer recorded the highest scores of 35.12, 22.75, 32.82, and 28.59 in ROUGE-1, ROUGE-2, ROUGE-L, and ROUGE-Lsum respectively, surpassing GPT, BART, and PEGASUS across all ROUGE metrics. The findings of this study establish the proficiency of encoder-decoder models such as T5 in summary generation. Furthermore, the findings demonstrate that fine-tuning a pre-trained model is more effective when its pre-training objective closely aligns with the downstream task.
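The abstract evaluates summaries with ROUGE metrics, which score the n-gram overlap between a generated summary and a reference. As a minimal illustration (not the authors' evaluation pipeline, which would typically use a library such as `rouge_score`), ROUGE-1 F1 over unigrams can be sketched as:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Compute ROUGE-1 F1: unigram overlap between candidate and reference.

    A simplified sketch: whitespace tokenization, no stemming or
    stopword handling, unlike full ROUGE implementations.
    """
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each unigram counts at most as often as it
    # appears in either text.
    overlap = sum((ref_counts & cand_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

# Illustrative (hypothetical) reference/candidate pair:
score = rouge1_f1("the cat sat on the mat", "the cat sat")
```

ROUGE-2 follows the same pattern over bigrams, while ROUGE-L measures the longest common subsequence rather than fixed-size n-grams.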
dc.language.iso: en
dc.publisher: International Journal of Advanced Trends in Computer Science and Engineering
dc.subject: Natural Language Processing, ROUGE Metrics, Text Summarization, Transformers
dc.title: A Comparative Study of Transformer-based Models for Text Summarization of News Articles
dc.type: Article


