-
Publication No.: US20190278835A1
Publication Date: 2019-09-12
Application No.: US15915775
Filing Date: 2018-03-08
Applicant: Adobe Inc.
Inventor: Arman Cohan, Walter W. Chang, Trung Huu Bui, Franck Dernoncourt, Doo Soon Kim
Abstract: Techniques are disclosed for an abstractive summarization process for summarizing documents, including long documents. A document is encoded using an encoder-decoder architecture with attentive decoding. In particular, an encoder for modeling documents generates both word-level and section-level representations of a document. A discourse-aware decoder then captures the information flow from all discourse sections of a document. To improve the robustness of the generated summarization, a neural attention mechanism considers both word-level and section-level representations of a document. The neural attention mechanism may utilize a set of weights that are applied to the word-level representations and section-level representations.
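
The abstract describes an encoder that produces representations at two granularities: word-level and section-level. The following is a minimal PyTorch sketch of that idea; the class name HierarchicalEncoder, the choice of LSTMs, and all dimensions are illustrative assumptions, not the implementation specified in the patent.

import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    """Hypothetical two-level encoder: a word RNN runs within each
    section, and a section RNN runs over per-section summaries."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.word_rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.sec_rnn = nn.LSTM(hid_dim, hid_dim, batch_first=True)

    def forward(self, doc):
        # doc: (n_sections, n_words) token ids for one document,
        # with sections treated as the batch dimension of the word RNN
        word_out, (h_n, _) = self.word_rnn(self.embed(doc))
        # word_out: (n_sections, n_words, hid_dim) word-level representations
        sec_in = h_n[-1].unsqueeze(0)        # (1, n_sections, hid_dim)
        sec_out, _ = self.sec_rnn(sec_in)    # run over section summaries
        return word_out, sec_out.squeeze(0)  # section-level representations

enc = HierarchicalEncoder(vocab_size=10000)
doc = torch.randint(0, 10000, (4, 50))       # 4 sections, 50 words each
word_reprs, sec_reprs = enc(doc)             # (4, 50, 256) and (4, 256)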
-
Publication No.: US11170158B2
Publication Date: 2021-11-09
Application No.: US15915775
Filing Date: 2018-03-08
Applicant: Adobe Inc.
Inventor: Arman Cohan, Walter W. Chang, Trung Huu Bui, Franck Dernoncourt, Doo Soon Kim
IPC: G06F40/14, G06F40/146, G06N3/04, G06F16/93, G06F16/34, G06F40/30, G06F40/56, G06F40/274, G06F40/289
Abstract: Techniques are disclosed for an abstractive summarization process for summarizing documents, including long documents. A document is encoded using an encoder-decoder architecture with attentive decoding. In particular, an encoder for modeling documents generates both word-level and section-level representations of a document. A discourse-aware decoder then captures the information flow from all discourse sections of a document. To improve the robustness of the generated summarization, a neural attention mechanism considers both word-level and section-level representations of a document. The neural attention mechanism may utilize a set of weights that are applied to the word-level representations and section-level representations.
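
The "set of weights" applied to word-level and section-level representations can be sketched as follows in NumPy, building on representations like those produced by the encoder sketch above. Treating a word's joint weight as its within-section attention scaled by its section's attention is an assumption made for illustration, not the claimed method, and all names are hypothetical.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def discourse_aware_attention(dec_state, word_reprs, sec_reprs):
    # dec_state:  (d,) current decoder hidden state
    # word_reprs: (n_sections, n_words, d) word-level representations
    # sec_reprs:  (n_sections, d) section-level representations
    sec_weights = softmax(sec_reprs @ dec_state)             # (n_sections,)
    word_weights = softmax(word_reprs @ dec_state, axis=1)   # within each section
    # Joint weight of each word: its within-section attention scaled by
    # its section's attention; the result already sums to 1 over the document.
    combined = sec_weights[:, None] * word_weights           # (n_sections, n_words)
    context = (combined[..., None] * word_reprs).sum(axis=(0, 1))  # (d,)
    return context, combined

d = 256
context, weights = discourse_aware_attention(
    np.random.randn(d), np.random.randn(4, 50, d), np.random.randn(4, d))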
-