NATURAL LANGUAGE PROCESSING USING CONTEXT-SPECIFIC WORD VECTORS

    Publication number: US20180373682A1

    Publication date: 2018-12-27

    Application number: US15982841

    Filing date: 2018-05-17

    Abstract: A system is provided for natural language processing. In some embodiments, the system includes an encoder for generating context-specific word vectors for at least one input sequence of words. The encoder is pre-trained using training data for performing a first natural language processing task. A neural network performs a second natural language processing task on the at least one input sequence of words using the context-specific word vectors. The first natural language processing task is different from the second natural language processing task, and the neural network is trained separately from the encoder. In some embodiments, the first natural language processing task can be machine translation, and the second natural language processing task can be one of sentiment analysis, question classification, entailment classification, and question answering.
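The abstract above describes a transfer pattern: an encoder pre-trained on one task (e.g. machine translation) is frozen, and its context-specific word vectors feed a separately trained downstream network. A minimal sketch of that pattern, with a toy recurrent encoder and hypothetical dimensions standing in for the real models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-trained encoder (stands in for an MT-trained recurrent
# encoder); its weights are frozen when reused for the downstream task.
W_enc = rng.normal(size=(8, 8))

def encode(word_vectors):
    """Produce context-specific word vectors: each output depends on the
    running context of the sequence, not just the current word."""
    h = np.zeros(8)
    outputs = []
    for v in word_vectors:
        h = np.tanh(W_enc @ (h + v))  # context carried through h
        outputs.append(h)
    return np.stack(outputs)

# Separately trained downstream network (e.g. a sentiment classifier).
W_clf = rng.normal(size=(2, 8))

def classify(word_vectors):
    ctx = encode(word_vectors)      # context-specific word vectors
    pooled = ctx.mean(axis=0)       # simple pooling over the sequence
    return int(np.argmax(W_clf @ pooled))

sentence = rng.normal(size=(5, 8))  # 5 words, 8-dim embeddings
label = classify(sentence)
```

Only `W_clf` would receive gradient updates during downstream training; `W_enc` stays fixed, matching the "separately trained" wording of the abstract.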

    DOMAIN SPECIFIC LANGUAGE FOR GENERATION OF RECURRENT NEURAL NETWORK ARCHITECTURES

    Publication number: US20180336453A1

    Publication date: 2018-11-22

    Application number: US15953265

    Filing date: 2018-04-13

    Abstract: A system automatically generates recurrent neural network (RNN) architectures for performing specific tasks, for example, machine translation. The system represents RNN architectures using a domain-specific language (DSL). The system generates candidate RNN architectures. The system predicts the performance of the generated candidate RNN architectures, for example, using a neural network. The system filters the candidate RNN architectures based on their predicted performance. The system generates code for a selected candidate architecture. The generated code represents an RNN that is configured to perform the specific task. The system executes the generated code, for example, to evaluate the RNN or to use the RNN in an application.
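The pipeline in the abstract (generate candidates in a DSL, predict their performance, filter, emit code) can be sketched with a toy expression DSL. The operator set, the heuristic predictor, and the code generator below are all illustrative stand-ins, not the patent's actual DSL:

```python
import math
import random

random.seed(0)

# Toy DSL for RNN cell updates over current input x and hidden state h.
OPS = ["add", "mul", "tanh"]

def random_arch(depth=2):
    """Generate a candidate architecture as a DSL expression tree."""
    if depth == 0:
        return random.choice(["x", "h"])
    op = random.choice(OPS)
    if op == "tanh":
        return (op, random_arch(depth - 1))
    return (op, random_arch(depth - 1), random_arch(depth - 1))

def predicted_performance(arch):
    """Stand-in for the learned performance predictor; here a simple
    heuristic that rewards nonlinearities in the expression."""
    return str(arch).count("tanh")

def to_code(arch):
    """Generate executable code for a candidate architecture."""
    if isinstance(arch, str):
        return arch
    op, *args = arch
    if op == "add":
        return f"({to_code(args[0])} + {to_code(args[1])})"
    if op == "mul":
        return f"({to_code(args[0])} * {to_code(args[1])})"
    return f"math.tanh({to_code(args[0])})"

# Generate, score, filter, and compile candidates.
candidates = [random_arch() for _ in range(20)]
best = max(candidates, key=predicted_performance)
cell = eval(f"lambda x, h: {to_code(best)}")  # the generated RNN cell
new_h = cell(0.5, 0.1)                        # one recurrent step
```

Executing the compiled `cell` corresponds to the abstract's last step: running the generated code to evaluate or use the RNN.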

    Dynamic Memory Network
    Type: Invention application
    Status: Pending (published)

    Publication number: US20160350653A1

    Publication date: 2016-12-01

    Application number: US15170884

    Filing date: 2016-06-01

    CPC classification number: G06N5/04 G06N3/0445

    Abstract: A novel unified neural network framework, the dynamic memory network, is disclosed. This unified framework reduces every task in natural language processing to a question answering problem over an input sequence. Inputs and questions are used to create and connect deep memory sequences. Answers are then generated based on dynamically retrieved memories.

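The abstract's loop (inputs and questions create connected memories; answers are generated from dynamically retrieved memories) can be sketched as an attention-based episodic memory pass. The dimensions, softmax attention, and update rule here are illustrative assumptions:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def dmn_answer(facts, question, hops=2):
    """Toy episodic-memory pass: the question attends over the input
    facts, the retrieved episode updates the memory over several hops,
    and the final memory would feed an answer decoder."""
    memory = question
    for _ in range(hops):
        scores = facts @ memory             # relevance of each fact
        attn = softmax(scores)
        episode = attn @ facts              # dynamically retrieved memory
        memory = np.tanh(memory + episode)  # update memory state
    return memory

facts = np.eye(4)                    # four one-hot "facts"
question = np.array([1.0, 0, 0, 0])  # question aligned with fact 0
m = dmn_answer(facts, question)
```

Framing every task as question answering, as the abstract states, means only `facts` and `question` change per task; the retrieval loop stays the same.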

    Systems and methods for mutual information based self-supervised learning

    Publication number: US12198060B2

    Publication date: 2025-01-14

    Application number: US17006570

    Filing date: 2020-08-28

    Abstract: Embodiments described herein combine both masked reconstruction and predictive coding. Specifically, unlike contrastive learning, the mutual information between past states and future states is directly estimated. Context information is also captured directly via shifted masked reconstruction: unlike standard masked reconstruction, the target reconstructed observations are shifted slightly toward the future to incorporate more predictability. The estimated mutual information and the shifted masked reconstruction loss can then be combined as the loss function used to update the neural model.
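A rough sketch of the combined objective, with toy stand-ins for both terms: the real system would use a learned critic for the mutual-information estimate and model outputs for the shifted reconstruction targets.

```python
import numpy as np

rng = np.random.default_rng(0)

def shifted_masked_reconstruction_loss(preds, targets, mask, shift=1):
    """Reconstruct masked positions, but with targets shifted toward the
    future so the model must be predictive, not merely interpolative."""
    p = preds[:-shift]
    t = targets[shift:]          # targets shifted slightly to the future
    m = mask[:-shift]
    return float(((p - t) ** 2 * m[:, None]).sum() / max(m.sum(), 1.0))

def mutual_information_estimate(past, future):
    """Crude stand-in for a critic-based MI lower bound between past
    and future states: joint similarity minus marginal similarity."""
    joint = np.mean(np.sum(past * future, axis=1))
    marginal = np.mean(past @ future.T)
    return float(joint - marginal)

T, D = 10, 4
states = rng.normal(size=(T, D))              # model states
targets = states + 0.1 * rng.normal(size=(T, D))
mask = (rng.random(T) < 0.5).astype(float)    # masked positions

# Combined loss: minimize reconstruction error, maximize MI.
loss = (shifted_masked_reconstruction_loss(states, targets, mask)
        - mutual_information_estimate(states[:-1], states[1:]))
```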

    Proposal learning for semi-supervised object detection

    Publication number: US11669745B2

    Publication date: 2023-06-06

    Application number: US17080276

    Filing date: 2020-10-26

    CPC classification number: G06F18/2178 G06F18/2155 G06N3/082

    Abstract: A method for generating a neural network for detecting one or more objects in images includes generating one or more self-supervised proposal learning losses based on the one or more proposal features and corresponding proposal feature predictions. One or more consistency-based proposal learning losses are generated based on noisy proposal feature predictions and the corresponding proposal predictions without noise. A combined loss is generated using the one or more self-supervised proposal learning losses and one or more consistency-based proposal learning losses. The neural network is updated based on the combined loss.
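The loss construction in the abstract can be sketched as follows; the proposal features, predictions, and noise are random stand-ins, and the detector update itself is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def self_supervised_loss(features, predictions):
    """Self-supervised proposal loss: predictions should match the
    proposal features they were derived from."""
    return float(np.mean((features - predictions) ** 2))

def consistency_loss(noisy_preds, clean_preds):
    """Consistency loss: predictions from noise-perturbed proposals
    should agree with the noise-free proposal predictions."""
    return float(np.mean((noisy_preds - clean_preds) ** 2))

props = rng.normal(size=(16, 8))                  # proposal features
preds = props + 0.05 * rng.normal(size=(16, 8))   # feature predictions
noisy = preds + 0.10 * rng.normal(size=(16, 8))   # predictions under noise

# Combined loss used to update the detection network.
combined = self_supervised_loss(props, preds) + consistency_loss(noisy, preds)
```

Because both terms need no box labels, unlabeled images can contribute to the combined loss, which is the semi-supervised point of the method.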

    Systems and methods for unifying question answering and text classification via span extraction

    Publication number: US11657233B2

    Publication date: 2023-05-23

    Application number: US17673709

    Filing date: 2022-02-16

    CPC classification number: G06F40/30 G06F40/284 G06F16/3329 G06N3/08

    Abstract: Systems and methods for unifying question answering and text classification via span extraction include a preprocessor for preparing a source text and an auxiliary text based on a task type of a natural language processing task, an encoder for receiving the source text and the auxiliary text from the preprocessor and generating an encoded representation of a combination of the source text and the auxiliary text, and a span-extractive decoder for receiving the encoded representation and identifying a span of text within the source text that is a result of the NLP task. The task type is one of entailment, classification, or regression. In some embodiments, the source text includes one or more of text received as input when the task type is entailment, a list of classifications when the task type is entailment or classification, or a list of similarity options when the task type is regression.
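The unification idea (recast classification as picking a span over label text appended to the source) can be sketched as below; the cue-word scorer is a hypothetical stand-in, not the patent's span-extractive decoder:

```python
def span_extract_classification(input_text, class_labels, score):
    """Recast classification as span extraction: the candidate labels are
    treated as spans of the prepared source text, and the 'answer' is the
    span covering the best-scoring label. (Sketch; the real decoder
    scores start/end token positions in an encoded representation.)"""
    return max(class_labels, key=lambda label: score(input_text, label))

def toy_score(text, label):
    """Hypothetical span scorer keyed on cue words for each label."""
    cues = {"positive": ("wonderful", "great"), "negative": ("awful", "bad")}
    return sum(cue in text for cue in cues.get(label, ()))

answer = span_extract_classification(
    "this movie was wonderful", ["positive", "negative"], toy_score)
# answer -> "positive"
```

With this framing, question answering, classification, entailment, and regression all reduce to the same operation: extract the right span from a suitably prepared source text.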
