SYSTEM FOR CREATING A TEMPORAL PREDICTIVE MODEL

    Publication No.: US20240143984A1

    Publication Date: 2024-05-02

    Application No.: US18491261

    Filing Date: 2023-10-20

    Abstract: A system is provided including a data pipeline and a model pipeline. The data pipeline includes: an input that receives a first dataset representing categorical features and a second dataset representing numerical features; a feature ingestion block that generates an output corresponding to a sum of the first dataset with the second dataset; an output that provides training labels based on a processing of the summed datasets to predict a temporally isolated and discrete event; and a label creation block that receives the output and generates labels for date features in the first dataset. The model pipeline includes one or more neural networks that: receive a first input corresponding to a summation of a non-learned date embedding with a learned feature embedding; and contextualize the summation by date-embedding historical patient data into it. The model pipeline also includes a prediction block that receives the contextualized summation and predicts one or more outcomes.
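    The core input construction described in the abstract, summing a fixed (non-learned) date embedding with a learned feature embedding, can be sketched as follows. The sinusoidal encoding, dimension sizes, and embedding-table initialization are illustrative assumptions, not details taken from the filing.

    ```python
    import numpy as np

    def sinusoidal_date_embedding(day_index: int, dim: int) -> np.ndarray:
        """Fixed (non-learned) embedding of a date, indexed as days since an epoch."""
        positions = np.arange(dim // 2)
        freqs = 1.0 / (10000 ** (2 * positions / dim))
        angles = day_index * freqs
        return np.concatenate([np.sin(angles), np.cos(angles)])

    rng = np.random.default_rng(0)
    dim = 16
    # Learned feature-embedding table; a random stand-in here, since the
    # actual table would be trained end-to-end with the model.
    feature_table = rng.normal(size=(100, dim))

    def embed_event(feature_id: int, day_index: int) -> np.ndarray:
        """Sum of the non-learned date embedding and the learned feature embedding."""
        return feature_table[feature_id] + sinusoidal_date_embedding(day_index, dim)

    vec = embed_event(feature_id=42, day_index=365)
    ```

    The sum yields a single vector per clinical event that carries both what the event was and when it occurred, which the downstream network can then contextualize against historical patient data.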

    SYSTEMS AND METHODS FOR A MULTI-MODEL MESSAGE GENERATION ARCHITECTURE

    Publication No.: US20230394290A1

    Publication Date: 2023-12-07

    Application No.: US18204797

    Filing Date: 2023-06-01

    Inventor: Roman Lutsyshyn

    IPC Class: G06N3/0499

    CPC Class: G06N3/0499

    Abstract: In some aspects, the disclosure is directed to methods and systems for a multi-model message response generation system. A computing device may identify a first message. The computing device may determine that the first message is an initial message of a thread or includes an error log. Responsive to determining that the first message is the initial message of the thread or includes an error log, the computing device generates a response to the first message from a solution associated with a problem matching the first message. The computing device may identify a second message. The computing device may input the second message into a response generation model in response to determining the second message is not an initial message of a thread. The computing device may generate a second response from output data of the response generation model.
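    The routing described above, initial messages and error logs go to a solution-matching path, everything else to a generative model, can be sketched as below. The error-log heuristic and the two stand-in model functions are hypothetical placeholders for the components the abstract names.

    ```python
    import re

    # Crude heuristic for detecting an error log; an assumption for this sketch.
    ERROR_LOG_PATTERN = re.compile(r"(Traceback|ERROR|Exception)", re.IGNORECASE)

    def retrieve_solution(message: str) -> str:
        """Stand-in for matching the message against known problems/solutions."""
        return "Known issue: see knowledge-base entry"

    def generate_response(message: str) -> str:
        """Stand-in for the response generation model."""
        return f"Generated reply to: {message}"

    def route_message(message: str, is_initial: bool) -> str:
        # Initial thread messages and messages containing error logs take the
        # solution-matching path; all other messages go to the generative model.
        if is_initial or ERROR_LOG_PATTERN.search(message):
            return retrieve_solution(message)
        return generate_response(message)
    ```

    The design choice here is that structured, diagnosable inputs (new threads, error logs) are cheaper and more reliable to answer from a curated solution set, while free-form follow-ups need a generative model.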

    ATTENTION NEURAL NETWORKS WITH PARALLEL ATTENTION AND FEED-FORWARD LAYERS

    Publication No.: US20230316055A1

    Publication Date: 2023-10-05

    Application No.: US18130335

    Filing Date: 2023-04-03

    Applicant: Google LLC

    IPC Class: G06N3/0499

    CPC Class: G06N3/0499

    Abstract: Methods, systems, and apparatus, including computer programs encoded on computer storage media, for performing a machine learning task on a network input to generate a network output. One of the systems comprises an attention neural network configured to perform the machine learning task, the attention neural network comprising a plurality of attention layers, each attention layer comprising an attention sub-layer that is arranged in parallel with a feed-forward sub-layer.
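    A parallel arrangement of this kind typically computes attention and feed-forward outputs from the same normalized input and adds both to the residual in one step, i.e. y = x + Attn(LN(x)) + FF(LN(x)), rather than chaining them sequentially. The minimal single-head sketch below assumes that formulation and illustrative dimensions; it is not the claimed implementation.

    ```python
    import numpy as np

    def layer_norm(x, eps=1e-5):
        mu = x.mean(-1, keepdims=True)
        return (x - mu) / np.sqrt(x.var(-1, keepdims=True) + eps)

    def softmax(x):
        e = np.exp(x - x.max(-1, keepdims=True))
        return e / e.sum(-1, keepdims=True)

    def attention(x, Wq, Wk, Wv):
        # Single-head self-attention over a (seq, dim) input.
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        return softmax(q @ k.T / np.sqrt(q.shape[-1])) @ v

    def feed_forward(x, W1, W2):
        return np.maximum(x @ W1, 0.0) @ W2  # ReLU MLP

    def parallel_block(x, params):
        # Attention and feed-forward both read the same normalized input,
        # and their outputs are summed into the residual in one step.
        h = layer_norm(x)
        return x + attention(h, *params["attn"]) + feed_forward(h, *params["ff"])

    rng = np.random.default_rng(0)
    d, seq = 8, 4
    params = {
        "attn": [rng.normal(size=(d, d)) * 0.1 for _ in range(3)],
        "ff": [rng.normal(size=(d, 4 * d)) * 0.1, rng.normal(size=(4 * d, d)) * 0.1],
    }
    y = parallel_block(rng.normal(size=(seq, d)), params)
    ```

    Because the two sub-layers no longer depend on each other's output, their matrix multiplications can be fused or executed concurrently, which is the usual motivation for the parallel layout.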

    TRANSFORMER NETWORK WITH NORMALIZATION INCLUDING SCALING PARAMETER

    Publication No.: US20240320482A1

    Publication Date: 2024-09-26

    Application No.: US18176037

    Filing Date: 2023-02-28

    Abstract: A computing system is provided, including a processor configured to receive a training data set. Based at least in part on the training data set, the processor is further configured to train a transformer network that includes a plurality of layers. The plurality of layers each respectively include a plurality of sub-layers including an attention sub-layer, a feed-forward sub-layer, and a plurality of normalization sub-layers. The plurality of normalization sub-layers are downstream from corresponding sub-layers of the plurality of sub-layers. Each of the plurality of normalization sub-layers is configured to apply layer normalization to a sum of: a first scaling parameter multiplied by an input vector of the sub-layer; and an output vector of the sub-layer.
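    The normalization step described in the abstract, layer normalization applied to (scaling parameter × sub-layer input) + sub-layer output, can be sketched as below. The choice of alpha and the toy dimensions are assumptions for illustration only.

    ```python
    import numpy as np

    def layer_norm(x, eps=1e-5):
        mu = x.mean(-1, keepdims=True)
        return (x - mu) / np.sqrt(x.var(-1, keepdims=True) + eps)

    def scaled_residual_norm(x, sublayer_out, alpha):
        """LayerNorm over (alpha * sub-layer input + sub-layer output)."""
        return layer_norm(alpha * x + sublayer_out)

    rng = np.random.default_rng(1)
    x = rng.normal(size=(2, 4))            # sub-layer input vectors
    sub = rng.normal(size=(2, 4))          # sub-layer output vectors
    y = scaled_residual_norm(x, sub, alpha=0.5)
    ```

    Scaling the residual branch before normalization changes how much of the sub-layer input survives each layer, which is a known lever for stabilizing training in very deep transformer stacks.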