
Natural Language Processing and Information Retrieval Sharing Platform

A Selectable and Interactive Model for Abstractive Summarization

NLPIR SEMINAR Y2019#4


In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Monday. Each time, a keynote speaker will share his or her understanding of papers related to his or her research.


This week’s seminar is organized as follows:

1. The seminar starts at 1 p.m. on Monday, at Zhongguancun Technology Park, Building 5, Room 1306.
2. The lecturer is Xi Zhang; the paper’s title is A Selectable and Interactive Model for Abstractive Summarization.
3. The seminar will be hosted by Qinghong Jiang.
4. The paper for this seminar is attached; please download it in advance.

Anyone interested in this topic is welcome to join us.
The following is the abstract of this week’s paper.

A Selectable and Interactive Model for Abstractive Summarization

Xi Zhang


Sequence-to-sequence neural networks with attention have been widely used in text summarization as the amount of textual data has exploded in recent years. Traditional approaches to automatic summarization are based only on word attention, and most of them focus on generating a single-sentence summary. In this work, we propose a novel model with dual attention that considers both sentence and word information and then generates a multi-sentence summary word by word. Additionally, we enhance our model with a copy-generator network to solve the out-of-vocabulary (OOV) problem. The model shows significant performance gains on the CNN/DailyMail corpus compared with the baseline model. Experimental results demonstrate that our method obtains ROUGE-1 of 37.48, ROUGE-2 of 16.40, and ROUGE-L of 34.36. Our work shows that several features of our proposed model contribute to further improvements in performance.
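The dual attention idea in the abstract can be sketched in a few lines of numpy. This is a minimal illustration under our own assumptions — dot-product scoring and a simple multiplicative combination of sentence-level and word-level weights — not the paper’s exact formulation; all names and shapes here are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def dual_attention(word_hidden, sent_of_word, sent_hidden, dec_state):
    """Sketch of dual (sentence + word) attention.

    word_hidden : (n_words, d)  encoder states for each word
    sent_of_word: (n_words,)    index of the sentence containing each word
    sent_hidden : (n_sents, d)  one encoder state per sentence
    dec_state   : (d,)          current decoder state

    Each word's attention weight is rescaled by the weight of the
    sentence it belongs to, then renormalized (an assumption; the
    paper may combine the two levels differently).
    """
    alpha_word = softmax(word_hidden @ dec_state)   # word-level attention
    alpha_sent = softmax(sent_hidden @ dec_state)   # sentence-level attention
    combined = alpha_word * alpha_sent[sent_of_word]
    combined /= combined.sum()                      # renormalize to a distribution
    context = combined @ word_hidden                # weighted sum of word states
    return combined, context
```

A generator conditioned on `context` and `dec_state` would then emit the next summary word; the copy mechanism mentioned in the abstract would additionally let `combined` serve as a copy distribution over source words for OOV handling.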

