Drug and Vaccine Extractive Text Summarization Insights Using Fine-Tuned Transformers
DOI: https://doi.org/10.37965/jait.2024.0559

Keywords: BART, BERT, extractive text summarization, LexRank, TextRank

Abstract
Text representation is a key factor in the success of text summarization techniques. Summarization with pretrained transformer models has produced encouraging results, yet the applicability of these models to the medical and drug-discovery domains has not been examined in sufficient depth. To address this gap, this article performs extractive summarization using transformers fine-tuned on text from the drug and medical domain, and also aims to enhance sentence representation. Exploring extractive text summarization for medical and drug discovery is challenging because datasets are limited; this research therefore assembles a corpus of 1,370 abstracts collected from PubMed, covering several medical and drug-discovery topics such as drugs and COVID. Detailed experiments with BART (Bidirectional and Auto-Regressive Transformers), T5 (Text-to-Text Transfer Transformer), LexRank, and TextRank are carried out on this dataset to perform extractive text summarization.
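The graph-based rankers evaluated here (LexRank, TextRank) score sentences by how central they are in a sentence-similarity graph and extract the top-ranked ones. The following is a minimal, illustrative sketch of that idea — a word-overlap similarity graph ranked by power iteration — and is not the paper's implementation or a substitute for the LexRank/TextRank libraries it uses.

```python
import re
from collections import Counter

def textrank_summarize(text, num_sentences=2, damping=0.85, iters=50):
    """Toy TextRank-style extractive summarizer (illustrative sketch only)."""
    # Naive sentence splitting on terminal punctuation
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text.strip()) if s.strip()]
    if len(sentences) <= num_sentences:
        return sentences
    # Bag-of-words representation per sentence
    bags = [Counter(re.findall(r'\w+', s.lower())) for s in sentences]
    n = len(sentences)
    # Normalized word-overlap similarity between every sentence pair
    sim = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                overlap = sum((bags[i] & bags[j]).values())
                denom = sum(bags[i].values()) + sum(bags[j].values())
                sim[i][j] = overlap / denom if denom else 0.0
    # Power iteration: PageRank-style scores over the similarity graph
    scores = [1.0 / n] * n
    for _ in range(iters):
        new_scores = []
        for i in range(n):
            rank = sum(
                sim[j][i] / (sum(sim[j]) or 1.0) * scores[j]
                for j in range(n) if j != i
            )
            new_scores.append((1 - damping) / n + damping * rank)
        scores = new_scores
    # Keep the top-scoring sentences, restored to document order
    top = sorted(range(n), key=lambda i: scores[i], reverse=True)[:num_sentences]
    return [sentences[i] for i in sorted(top)]
```

In practice, LexRank replaces raw word overlap with TF-IDF cosine similarity and thresholds the graph edges, but the centrality-ranking step is the same.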
License
Copyright (c) 2024 Authors
This work is licensed under a Creative Commons Attribution 4.0 International License.