
N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization.
Comput Intell Neurosci. 2022; 2022:6241373.

Abstract

The extractive summarization approach builds a summary by selecting the salient sentences of the source document. One of the most important aspects of extractive summarization is learning and modelling cross-sentence associations. Inspired by the success of the Bidirectional Encoder Representations from Transformers (BERT) pretrained language model and the graph attention network (GAT), whose sophisticated structure captures intersentence associations, this work proposes a novel neural model, N-GPETS, that combines a heterogeneous graph attention network with the BERT model and a statistical approach based on TF-IDF values for the extractive summarization task. Apart from sentence nodes, N-GPETS also works with semantic word nodes of varying granularity levels that serve as links between sentences, improving intersentence interaction. Furthermore, N-GPETS is made more feature-rich by initializing the graph layer with a BERT encoder rather than other neural encoders such as CNN or LSTM. To the best of our knowledge, this work is the first attempt to combine a BERT encoder and document-level TF-IDF values with a heterogeneous attention graph structure for extractive summarization. Empirical results on the benchmark CNN/DM news data set show that N-GPETS compares favorably with other heterogeneous graph structures that employ the BERT model as well as with graph structures that do not.
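The TF-IDF component mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function names are my own, and for simplicity IDF is computed over sentences rather than over the whole document collection as in the paper.

```python
import math
import re
from collections import Counter

def tfidf_sentence_scores(sentences):
    """Score each sentence by the mean TF-IDF weight of its words.

    Each sentence is treated as a "document" for the IDF statistic,
    a common simplification for single-document summarization.
    """
    tokenized = [re.findall(r"[a-z]+", s.lower()) for s in sentences]
    n = len(tokenized)
    # Document frequency: number of sentences containing each word.
    df = Counter(w for toks in tokenized for w in set(toks))
    scores = []
    for toks in tokenized:
        if not toks:
            scores.append(0.0)
            continue
        tf = Counter(toks)
        # Smoothed IDF keeps weights nonnegative and finite.
        weight = sum(
            (tf[w] / len(toks)) * math.log((1 + n) / (1 + df[w]))
            for w in tf
        )
        scores.append(weight / len(tf))
    return scores

def extract_summary(sentences, k=2):
    """Return the top-k sentences by TF-IDF score, in original order."""
    scores = tfidf_sentence_scores(sentences)
    top = sorted(range(len(sentences)),
                 key=lambda i: scores[i], reverse=True)[:k]
    return [sentences[i] for i in sorted(top)]
```

In N-GPETS such statistical weights supplement the learned BERT and graph-attention features rather than driving selection on their own.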

Authors and Affiliations

Muhammad Umair, Iftikhar Alam, Atif Khan, Inayat Khan, Niamat Ullah, Mohammad Yusuf Momand

Department of Computer Science, City University of Science and Information Technology, Peshawar 25000, Pakistan.
Department of Computer Science, Islamia College, Peshawar 25000, Pakistan.
Department of Computer Science, University of Engineering and Technology, Mardan, Pakistan.
Department of Computer Science, University of Buner, Buner 19290, Pakistan.
Faculty of Computer Science, University of Nangarhar, Jalalabad 2600, Afghanistan.

Pub Type(s)

Journal Article

Language

eng

PubMed ID

36458230

Citation

Umair, Muhammad, et al. "N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization." Computational Intelligence and Neuroscience, vol. 2022, 2022, p. 6241373.
Umair M, Alam I, Khan A, et al. N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization. Comput Intell Neurosci. 2022;2022:6241373.
Umair, M., Alam, I., Khan, A., Khan, I., Ullah, N., & Momand, M. Y. (2022). N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization. Computational Intelligence and Neuroscience, 2022, 6241373. https://doi.org/10.1155/2022/6241373
Umair M, et al. N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization. Comput Intell Neurosci. 2022;2022:6241373. PubMed PMID: 36458230.