-
Publication Number: US12217185B2
Publication Date: 2025-02-04
Application Number: US17332464
Application Date: 2021-05-27
Inventors: Hyun Woo Kim , Jeon Gue Park , Hwa Jeon Song , Yoo Rhee Oh , Byung Hyun Yoo , Eui Sok Chung , Ran Han
Abstract: A knowledge increasing method includes calculating the uncertainty of knowledge obtained from a neural network using an explicit memory, determining whether the knowledge is insufficient on the basis of the calculated uncertainty, obtaining additional data (learning data) to supplement the insufficient knowledge, and training the neural network with the additional data so that it autonomously increases its knowledge.
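The abstract describes an uncertainty-driven acquisition loop. The sketch below is a minimal, hypothetical illustration of that loop in Python: a toy memory-backed classifier estimates uncertainty as predictive entropy, and when the entropy exceeds a threshold it obtains additional labeled data and retrains. The ToyMemoryModel class, the acquire function, and the threshold value are assumptions made for this sketch and are not specified by the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyMemoryModel:
    """Toy stand-in for a neural network paired with an explicit key-value
    memory; keys are feature vectors and values are class labels."""

    def __init__(self, dim=8, n_classes=3):
        self.n_classes = n_classes
        self.keys = np.zeros((0, dim))
        self.values = np.zeros(0, dtype=int)

    def predict(self, x):
        """Predictive distribution from similarity-weighted votes over memory."""
        if len(self.keys) == 0:
            return np.full(self.n_classes, 1.0 / self.n_classes)
        sims = self.keys @ x
        weights = np.exp(sims - sims.max())
        scores = np.zeros(self.n_classes)
        for w, v in zip(weights, self.values):
            scores[v] += w
        return scores / scores.sum()

    def train(self, xs, ys):
        """'Training' here simply writes new entries into the explicit memory."""
        self.keys = np.vstack([self.keys, xs])
        self.values = np.concatenate([self.values, ys])

def predictive_entropy(p):
    """Entropy of a predictive distribution, used as the uncertainty measure."""
    return float(-(p * np.log(p + 1e-12)).sum())

def knowledge_increasing_step(model, query, acquire_fn, threshold=0.9):
    """If the uncertainty indicates insufficient knowledge, obtain additional
    learning data and retrain the model with it."""
    uncertainty = predictive_entropy(model.predict(query))
    if uncertainty > threshold:                 # knowledge judged insufficient
        xs, ys = acquire_fn(query)              # obtain additional learning data
        model.train(xs, ys)                     # increase knowledge
    return uncertainty

# Toy usage: the (hypothetical) acquisition function returns class-0 examples
# drawn near the query point.
model = ToyMemoryModel()
acquire = lambda q: (q + 0.1 * rng.normal(size=(5, 8)), np.zeros(5, dtype=int))
query = rng.normal(size=8)
print("uncertainty before:", knowledge_increasing_step(model, query, acquire))
print("uncertainty after: ", knowledge_increasing_step(model, query, acquire))
```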
-
Publication Number: US11423238B2
Publication Date: 2022-08-23
Application Number: US16671773
Application Date: 2019-11-01
Inventors: Eui Sok Chung , Hyun Woo Kim , Hwa Jeon Song , Ho Young Jung , Byung Ok Kang , Jeon Gue Park , Yoo Rhee Oh , Yun Keun Lee
IPC: G06F40/56 , G06F40/30 , G06F40/289
Abstract: Provided are a sentence embedding method and apparatus based on subword embedding and skip-thoughts. To apply intra-sentence contextual information to subword embedding learning, the method integrates skip-thought sentence embedding learning with a subword embedding technique and learns the two jointly, that is, through multitask learning. This makes it possible to apply a sentence embedding approach to agglutinative languages such as Korean in a bag-of-words form. The proposed model minimizes the number of additional training parameters introduced for sentence embedding, so that most training results accumulate in the subword embedding parameters.
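As a rough illustration of the multitask idea, the following Python sketch represents a sentence embedding as the mean of fastText-style character n-gram (subword) embeddings and trains it with a skip-thought-like objective that ranks a true neighboring sentence above a non-adjacent one; the patent's encoder-decoder skip-thought formulation is replaced here by a simple margin ranking loss for brevity. The class name SubwordSkipThought, the n-gram range, and the toy corpus are assumptions for illustration only.

```python
import torch
import torch.nn as nn

def char_ngrams(word, n_min=2, n_max=4):
    """fastText-style character n-grams with word boundary markers."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

class SubwordSkipThought(nn.Module):
    """A sentence embedding is the mean of its subword embeddings; that same
    embedding is trained with a skip-thought-like neighbor-ranking objective,
    so nearly all parameters live in the shared subword embedding table."""

    def __init__(self, subword_vocab, dim=64):
        super().__init__()
        self.index = {sw: i for i, sw in enumerate(subword_vocab)}
        self.emb = nn.Embedding(len(subword_vocab), dim)

    def embed_sentence(self, sentence):
        ids = [self.index[g] for w in sentence.split() for g in char_ngrams(w)
               if g in self.index]
        if not ids:
            return self.emb.weight.new_zeros(self.emb.embedding_dim)
        return self.emb(torch.tensor(ids)).mean(dim=0)

    def skip_thought_loss(self, center, neighbor, negative, margin=1.0):
        """Score the true neighboring sentence above a non-adjacent one."""
        c = self.embed_sentence(center)
        pos = torch.dot(c, self.embed_sentence(neighbor))
        neg = torch.dot(c, self.embed_sentence(negative))
        return torch.relu(margin - pos + neg)

# Toy usage on a tiny corpus of consecutive sentences.
corpus = ["the cat sat down", "it purred softly", "then it fell asleep",
          "rain hit the window"]
vocab = sorted({g for s in corpus for w in s.split() for g in char_ngrams(w)})
model = SubwordSkipThought(vocab)
loss = model.skip_thought_loss(center=corpus[1], neighbor=corpus[2],
                               negative=corpus[3])
loss.backward()   # gradients flow only into the shared subword embedding table
print("loss:", float(loss))
```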
-