
    A Comparative Study of Transformer-based Models for Text Summarization of News Articles

    Full Text Article (529.9Kb)
    Date
    2024-04
    Author
    Muia, Charles M.
    Oirere, Aaron M.
    Ndungu, Rachel N.
    Abstract
    Transformer-based models such as GPT, T5, BART, and PEGASUS have made substantial progress in text summarization, a sub-domain of natural language processing that entails extracting important information from lengthy texts. The main objective of this research was to conduct a comparative analysis of these four transformer-based models based on their performance in text summarization of news articles. To achieve this objective, the transformer models, pre-trained on extensive datasets, were fine-tuned on the CNN/DailyMail dataset using a low learning rate to preserve the learned representations. The T5 transformer recorded the highest scores of 35.12, 22.75, 32.82, and 28.59 in ROUGE-1, ROUGE-2, ROUGE-L, and ROUGE-Lsum respectively, surpassing GPT, BART, and PEGASUS across all ROUGE metrics. The findings of this study establish the proficiency of encoder-decoder models such as T5 in summary generation. Furthermore, they demonstrate that fine-tuning a pre-trained model is more effective when the pre-training objective closely aligns with the downstream task.
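    The ROUGE scores reported in the abstract measure n-gram overlap between a generated summary and a reference summary. As a minimal illustration (not the authors' evaluation code, which is not given here), ROUGE-1 F1 can be sketched in pure Python using clipped unigram counts:

    ```python
    # Minimal ROUGE-1 F1 sketch: unigram overlap between a candidate
    # summary and a reference summary, with counts clipped to the
    # reference (a simplified illustration of the metric, not the
    # paper's evaluation pipeline).
    from collections import Counter

    def rouge1_f1(candidate: str, reference: str) -> float:
        cand = Counter(candidate.lower().split())
        ref = Counter(reference.lower().split())
        overlap = sum((cand & ref).values())  # clipped unigram matches
        if overlap == 0:
            return 0.0
        precision = overlap / sum(cand.values())
        recall = overlap / sum(ref.values())
        return 2 * precision * recall / (precision + recall)

    # Example: 5 of 6 candidate unigrams also appear in the reference.
    print(round(rouge1_f1("the cat sat on the mat",
                          "the cat is on the mat"), 4))  # → 0.8333
    ```

    In practice, reported scores such as those above are computed with a standard implementation (e.g. the `rouge-score` package), which also handles stemming and the longest-common-subsequence variants ROUGE-L and ROUGE-Lsum.
    
    
    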
    URI
    https://doi.org/10.30534/ijatcse/2024/011322024
    http://www.warse.org/IJATCSE/static/pdf/file/ijatcse011322024.pdf
    http://repository.mut.ac.ke:8080/xmlui/handle/123456789/6437
    Collections
    • Journal Articles (CI) [118]

    MUT Library copyright © 2017-2024  MUT Library Website
    Contact Us | Send Feedback