Drug and Vaccine Extractive Text Summarization Insights Using Fine-Tuned Transformers

Authors

  • Rajesh Bandaru Department of Computer Science Engineering, GITAM (Deemed to be University), Visakhapatnam, India https://orcid.org/0009-0008-5661-6599
  • Y. Radhika Department of Computer Science Engineering, GITAM (Deemed to be University), Visakhapatnam, India

DOI:

https://doi.org/10.37965/jait.2024.0559

Keywords:

BART, BERT, extractive text summarization, LexRank, TextRank

Abstract

Text representation is a key factor in determining the success of various text summarization techniques. Summarization using pretrained transformer models has produced encouraging results, yet the application of these models to the medical and drug-discovery domains has not been examined to a proper extent. To address this gap, this article performs extractive summarization using transformers fine-tuned on the drug and medical domains. This research also aims to enhance sentence representation. Exploring extractive text summarization for medical and drug discovery is a challenging task because datasets are limited. Hence, this research concentrates on a collection of abstracts gathered from PubMed across various medical and drug-discovery domains, such as drugs and COVID, totaling 1,370 abstracts. A detailed experimental analysis of the dataset using BART (Bidirectional and Auto-Regressive Transformer), T5 (Text-to-Text Transfer Transformer), LexRank, and TextRank is carried out in this research to perform extractive text summarization.
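The graph-based ranking idea behind LexRank and TextRank, which the abstract names alongside the fine-tuned transformers, can be illustrated with a minimal self-contained sketch. This is not the authors' pipeline or dataset: the word-overlap similarity and damping factor follow common TextRank defaults, and the sentence splitter, function names, and iteration count are illustrative assumptions.

```python
import math
import re

def sentence_similarity(a: str, b: str) -> float:
    # Word-overlap similarity normalized by log sentence lengths (TextRank-style).
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if len(wa) < 2 or len(wb) < 2:
        return 0.0
    return len(wa & wb) / (math.log(len(wa)) + math.log(len(wb)))

def textrank_summary(text: str, n_sentences: int = 2,
                     d: float = 0.85, iters: int = 50) -> list[str]:
    # Naive sentence split on terminal punctuation (illustrative, not robust).
    sents = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    n = len(sents)
    if n <= n_sentences:
        return sents
    # Similarity matrix over every sentence pair (diagonal left at zero).
    sim = [[0.0 if i == j else sentence_similarity(sents[i], sents[j])
            for j in range(n)] for i in range(n)]
    row_sum = [sum(row) for row in sim]
    scores = [1.0] * n
    # Power-iteration update with damping, as in PageRank.
    for _ in range(iters):
        scores = [(1 - d) + d * sum(sim[j][i] / row_sum[j] * scores[j]
                                    for j in range(n) if row_sum[j] > 0)
                  for i in range(n)]
    top = sorted(range(n), key=scores.__getitem__, reverse=True)[:n_sentences]
    return [sents[i] for i in sorted(top)]  # preserve original sentence order
```

Sentences that share vocabulary with many others accumulate centrality and are extracted verbatim, which is what makes the method extractive rather than abstractive like BART or T5.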

Published

2024-10-11

How to Cite

Bandaru, R., & Radhika, Y. (2024). Drug and Vaccine Extractive Text Summarization Insights Using Fine-Tuned Transformers. Journal of Artificial Intelligence and Technology, 4(4), 351–362. https://doi.org/10.37965/jait.2024.0559

Section

Research Articles