(This post was originally a collection of notes on single-document summarization, but such a broad title served no real purpose, so I repurposed it for long-document summarization.
If any earlier posts still contain links or titles that were not updated to match, please contact me and I will fix them.)
I have written about related material in several earlier posts; I will gradually get to decoupling and revising those later.
Last updated: 2023.5.9
First posted: 2023.5.9
1. Extractive
- (2019) Exploiting discourse-level segmentation for extractive summarization: discourse-level segmentation + an adapted contextual representation model (RNN or BERT)
- (2020) Discourse-Aware Neural Extractive Text Summarization
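The two papers above share a pipeline: segment the long document into discourse units, encode and score each unit, then extract the top-scoring ones. Below is a minimal sketch of that segment-score-select loop, with toy stand-ins of my own (sentence splitting instead of real discourse segmentation, term-frequency salience instead of a learned RNN/BERT scorer):

```python
import re
from collections import Counter

def segment(doc: str) -> list[str]:
    # Toy stand-in for discourse-level segmentation: split on sentence
    # boundaries. The papers use real discourse units, not sentences.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip()]

def score(seg: str, doc_tf: Counter) -> float:
    # Toy salience score: mean document-level term frequency of the
    # segment's words, standing in for a learned contextual scorer.
    words = re.findall(r"\w+", seg.lower())
    return sum(doc_tf[w] for w in words) / max(len(words), 1)

def extract_summary(doc: str, k: int = 2) -> list[str]:
    segs = segment(doc)
    doc_tf = Counter(re.findall(r"\w+", doc.lower()))
    ranked = sorted(segs, key=lambda s: score(s, doc_tf), reverse=True)
    keep = set(ranked[:k])
    # Emit the selected segments in original document order.
    return [s for s in segs if s in keep]

doc = ("Long documents are hard to summarize. "
       "Segmenting long documents helps extractive models. "
       "The weather was nice that day. "
       "Models then score each segment and keep the most salient ones.")
print(extract_summary(doc, k=2))
```

The off-topic "weather" sentence gets a low salience score and is dropped; the real systems differ mainly in how the segments are produced and how the scorer is trained.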
2. Extractive (haven't read how these work yet)
- (2021) Globalizing BERT-based Transformer Architectures for Long Document Summarization
- (2021) Sliding Selector Network with Dynamic Memory for Extractive Summarization of Long Documents
3. Abstractive: divide and conquer
- (2018) Deep Communicating Agents for Abstractive Summarization: reinforcement learning; separate agents each process one subsection and exchange information with each other
- (2020) A Divide-and-Conquer Approach to the Summarization of Long Documents
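The divide-and-conquer recipe in this section boils down to: split the source into chunks that fit the model, summarize each chunk independently, and concatenate the partial summaries. A minimal sketch, where both stages are toy stand-ins (greedy sentence packing as the chunker, first-sentence extraction in place of an actual abstractive model):

```python
import re

def chunk(doc: str, max_words: int = 20) -> list[str]:
    # Greedily pack whole sentences into chunks under a word budget;
    # a toy stand-in for the section-level splitting the papers use.
    sents = [s.strip() for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip()]
    chunks, cur, cur_len = [], [], 0
    for s in sents:
        n = len(s.split())
        if cur and cur_len + n > max_words:
            chunks.append(" ".join(cur))
            cur, cur_len = [], 0
        cur.append(s)
        cur_len += n
    if cur:
        chunks.append(" ".join(cur))
    return chunks

def summarize_chunk(text: str) -> str:
    # Hypothetical stand-in for an abstractive seq2seq model:
    # just keep the chunk's first sentence.
    return re.split(r"(?<=[.!?])\s+", text)[0]

def divide_and_conquer_summary(doc: str, max_words: int = 20) -> str:
    # Summarize each chunk independently, then concatenate.
    return " ".join(summarize_chunk(c) for c in chunk(doc, max_words))
```

The appeal of this decomposition is that no single model call ever sees more than `max_words` of input, so a fixed-length summarizer can cover an arbitrarily long document.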
4. Extractive → abstractive
- (2021) Long Document Summarization in a Low Resource Setting using Pretrained Language Models
- (2021) Long-Span Summarization via Local Attention and Content Selection
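One half of Long-Span Summarization via Local Attention and Content Selection is restricting encoder self-attention to a fixed window, so cost grows roughly linearly with source length instead of quadratically. A NumPy sketch of windowed attention (toy dimensions, no learned parameters; an illustration of the general idea, not the paper's actual implementation):

```python
import numpy as np

def local_attention(q, k, v, window: int):
    # Each query position attends only to keys within `window` positions
    # of it; everything farther away is masked out before the softmax.
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores[mask] = -np.inf  # masked pairs get zero attention weight
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v

# Tiny demo with random projections.
rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(8, 4)) for _ in range(3))
print(local_attention(q, k, v, window=2).shape)
```

With `window >= n` this reduces to ordinary full attention; the content-selection half of the paper then keeps the generator's input short in the first place, much like the extract-then-generate setup of the low-resource paper above.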
5. Abstractive (haven't read how these work yet)
- (2021) Hierarchical Learning for Generation with Long Source Sequences
- (2021) Efficient attentions for long document summarization
- (2022) Long Document Summarization with Top-Down and Bottom-Up Representation Inference