Anaphora-solved Ad-DL-BERT model for text summarization with autoencoding using the topic description and several priors (ATDS) approach
DOI: https://doi.org/10.7494/csci.2025.26.3.6352

Abstract
Owing to the large amount of digital text content in articles, novels, stories, and so on, Automatic Text Summarization (ATS) has become a significant task. Abstractive and extractive summaries of single and multi-document collections have been generated by various researchers. Although several models have been developed, limitations such as the anaphora problem still arise during summarization. To overcome such limitations, this paper proposes the Added dropout-Deleted Layer norm-Bidirectional Encoder Representations from Transformers (Ad-DL-BERT)-based Extractive Text Summarization (ETS). First, the input document's sentences are pre-processed for accurate summarization, and unwanted sentences are removed. Afterward, with the Autoencoding using the Topic Description and Several priors (ATDS) approach, sentences belonging to the same topic are clustered. Keywords for summarization are then extracted with an Anaphora-POS (An-POS) extractor. Thereafter, to remove redundant sentences, ranking with an Exponential Linear Unit-Generative Adversarial Network (ELU-GAN) and saliency score assignment are performed. In addition, sentences are assigned coherence, sorting, and cosine similarity scores to improve summary coherence. Lastly, the summary is generated with Ad-DL-BERT, and the proposed technique's performance is evaluated on the Document Understanding Conference (DUC2002) dataset. In terms of clustering time, execution time, and Recall-Oriented Understudy for Gisting Evaluation (ROUGE-1) recall, F-measure, and precision scores, the experimental results demonstrate that the proposed technique outperforms conventional approaches.
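As a rough illustration of the pipeline stages the abstract outlines (sentence pre-processing, topic clustering, saliency scoring via cosine similarity, and ordered sentence selection), the sketch below wires together a generic extractive summarizer in Python. The TF-IDF vectors, K-means clustering, and the split_sentences/summarize helpers are stand-ins introduced purely for illustration; they are not the paper's ATDS, An-POS, ELU-GAN, or Ad-DL-BERT components.

# Minimal extractive-summarization sketch of the stages named in the abstract:
# pre-processing, topic clustering, cosine-similarity-based saliency scoring,
# and sentence selection. The components below are generic stand-ins, NOT the
# paper's ATDS, An-POS, ELU-GAN, or Ad-DL-BERT modules.
import re

import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def split_sentences(document: str) -> list[str]:
    """Naive pre-processing: sentence splitting and whitespace cleanup."""
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    return [s.strip() for s in sentences if s.strip()]


def summarize(document: str, n_topics: int = 2, sentences_per_topic: int = 1) -> str:
    sentences = split_sentences(document)
    if len(sentences) <= n_topics:
        return " ".join(sentences)

    # Vectorize sentences; TF-IDF is a placeholder for learned representations.
    vectorizer = TfidfVectorizer(stop_words="english")
    X = vectorizer.fit_transform(sentences)

    # Cluster sentences into topics (stand-in for ATDS topic clustering).
    labels = KMeans(n_clusters=n_topics, n_init=10, random_state=0).fit_predict(X)

    # Saliency: cosine similarity of each sentence to its topic centroid,
    # echoing the cosine-similarity scoring mentioned in the abstract.
    summary_idx = []
    for topic in range(n_topics):
        members = np.where(labels == topic)[0]
        centroid = np.asarray(X[members].mean(axis=0))
        scores = cosine_similarity(X[members], centroid).ravel()
        top = members[np.argsort(scores)[::-1][:sentences_per_topic]]
        summary_idx.extend(top.tolist())

    # Preserve the original sentence order to keep the summary coherent.
    return " ".join(sentences[i] for i in sorted(summary_idx))


if __name__ == "__main__":
    doc = ("Automatic text summarization condenses long documents. "
           "Extractive methods select salient sentences directly. "
           "Anaphora resolution helps keep pronoun references clear. "
           "Transformer encoders such as BERT provide sentence representations.")
    print(summarize(doc, n_topics=2, sentences_per_topic=1))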
License
Copyright (c) 2025 Computer Science

This work is licensed under a Creative Commons Attribution 4.0 International License.