-
Publication No.: US11354565B2
Publication Date: 2022-06-07
Application No.: US15853530
Filing Date: 2017-12-22
Applicant: salesforce.com, inc.
Inventor: Alexander Rosenberg Johansen , Bryan McCann , James Bradbury , Richard Socher
Abstract: The technology disclosed proposes using a combination of a computationally cheap, less-accurate bag-of-words (BoW) model and a computationally expensive, more-accurate long short-term memory (LSTM) model to perform natural language processing tasks such as sentiment analysis. The use of the cheap, less-accurate BoW model is referred to herein as “skimming”. The use of the expensive, more-accurate LSTM model is referred to herein as “reading”. The technology disclosed presents a probability-based guider (PBG), which combines the BoW model and the LSTM model. The PBG uses a probability-thresholding strategy to determine, based on the results of the BoW model, whether to invoke the LSTM model to reliably classify a sentence as positive or negative. The technology disclosed also presents a deep neural network-based decision network (DDN) that is trained to learn the relationship between the BoW model and the LSTM model and to invoke only one of the two models.
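The skim-or-read decision described in this abstract can be sketched as follows. This is a minimal illustrative stand-in, not the patent's implementation: the BoW scorer is a toy word-count model, the "expensive" model is any callable, and the names `guided_classify` and `tau` are assumptions for illustration.

```python
# Sketch of a probability-based guider (PBG): a cheap BoW classifier
# "skims" each sentence; only low-confidence cases are routed to a more
# expensive model for a full "read".

def bow_probability(sentence, pos_words, neg_words):
    """Toy bag-of-words scorer: returns P(positive) from word counts."""
    tokens = sentence.lower().split()
    pos = sum(t in pos_words for t in tokens)
    neg = sum(t in neg_words for t in tokens)
    total = pos + neg
    return 0.5 if total == 0 else pos / total

def guided_classify(sentence, expensive_model, tau=0.2,
                    pos_words=frozenset({"great", "good", "love"}),
                    neg_words=frozenset({"bad", "awful", "hate"})):
    """Invoke expensive_model only when BoW confidence is within tau of 0.5."""
    p = bow_probability(sentence, pos_words, neg_words)
    if abs(p - 0.5) > tau:               # BoW is confident: skim
        return "positive" if p > 0.5 else "negative"
    return expensive_model(sentence)     # otherwise: read

# usage: the "LSTM" here is just a stand-in callable
label = guided_classify("great movie , I love it", lambda s: "positive")
```

The threshold `tau` trades accuracy for compute: a larger `tau` routes more sentences to the expensive model.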
-
Publication No.: US20200380213A1
Publication Date: 2020-12-03
Application No.: US16996726
Filing Date: 2020-08-18
Applicant: salesforce.com, inc.
Inventor: Bryan McCann , Nitish Shirish Keskar , Caiming Xiong , Richard Socher
IPC: G06F40/30 , G06N3/08 , G06N5/04 , G06N3/04 , G06F40/56 , G06F16/242 , G06F16/33 , G06F16/332
Abstract: Approaches for multitask learning as question answering include an input layer for encoding a context and a question, a self-attention-based transformer including an encoder and a decoder, a first bi-directional long short-term memory (biLSTM) for further encoding an output of the encoder, a long short-term memory (LSTM) for generating a context-adjusted hidden state from the output of the decoder and a hidden state, an attention network for generating first attention weights based on an output of the first biLSTM and an output of the LSTM, a vocabulary layer for generating a distribution over a vocabulary, a context layer for generating a distribution over the context, and a switch for generating a weighting between the distributions over the vocabulary and the context, generating a composite distribution based on the weighting, and selecting a word of an answer using the composite distribution.
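The switch at the end of this abstract mixes two distributions with a learned gate. A minimal NumPy sketch, with simplifying assumptions: the gate is given as a scalar, and the context distribution is assumed to be already mapped onto vocabulary indices (the patent's networks produce both).

```python
# Sketch of a pointer/generator-style switch: a gate g in [0, 1] mixes a
# distribution over the vocabulary with a distribution over context tokens,
# and the next answer word is chosen from the composite distribution.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def composite_distribution(vocab_logits, context_logits, gate):
    """gate is the weighting placed on the vocabulary distribution."""
    p_vocab = softmax(vocab_logits)
    p_context = softmax(context_logits)
    return gate * p_vocab + (1.0 - gate) * p_context

# selecting the next answer word = argmax over the composite distribution
p = composite_distribution(np.array([2.0, 0.1, 0.1]),
                           np.array([0.1, 0.1, 2.0]), gate=0.7)
next_word_index = int(np.argmax(p))
```

Because both inputs are valid probability distributions and the gate is convex, the composite also sums to one.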
-
Publication No.: US20180349359A1
Publication Date: 2018-12-06
Application No.: US16000638
Filing Date: 2018-06-05
Applicant: salesforce.com, inc.
Inventor: Bryan McCann , Caiming Xiong , Richard Socher
Abstract: A system includes a neural network for performing a first natural language processing task. The neural network includes a first rectified linear unit capable of executing an activation function on a first input related to a first word sequence, and a second rectified linear unit capable of executing an activation function on a second input related to a second word sequence. A first encoder is capable of receiving the result from the first rectified linear unit and generating a first task-specific representation relating to the first word sequence, and a second encoder is capable of receiving the result from the second rectified linear unit and generating a second task-specific representation relating to the second word sequence. A biattention mechanism is capable of computing, based on the first and second task-specific representations, an interdependent representation related to the first and second word sequences. In some embodiments, the first natural language processing task performed by the neural network is either sentiment classification or entailment classification.
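The biattention step can be sketched as below. This is a simplified NumPy stand-in, not the patent's exact formulation: the two task-specific representations are plain matrices `X` and `Y`, and the affinity is a dot product.

```python
# Minimal sketch of biattention over two task-specific representations
# X (m x d) and Y (n x d): an affinity matrix A = X @ Y.T is softmax-
# normalized along each axis so each sequence attends to the other,
# yielding an interdependent representation of the pair.
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def biattention(X, Y):
    A = X @ Y.T                          # (m, n) token-pair affinities
    attend_y = softmax(A, axis=1) @ Y    # each row of X summarizes Y
    attend_x = softmax(A, axis=0).T @ X  # each row of Y summarizes X
    return attend_y, attend_x

rng = np.random.default_rng(0)
Cx, Cy = biattention(rng.standard_normal((3, 4)),
                     rng.standard_normal((5, 4)))
```

Each output keeps the sequence length of one input and the feature width of the other's summary, so downstream layers can concatenate them with the originals.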
-
Publication No.: US20180268287A1
Publication Date: 2018-09-20
Application No.: US15853530
Filing Date: 2017-12-22
Applicant: salesforce.com, inc.
Inventor: Alexander Rosenberg Johansen , Bryan McCann , James Bradbury , Richard Socher
CPC classification number: G06N3/0454 , G06F15/76 , G06F17/241 , G06F17/2785 , G06K9/6262 , G06K9/6267 , G06K9/6268 , G06K9/6271 , G06N3/0445 , G06N3/0472 , G06N3/0481 , G06N3/084 , G06N5/04 , G06N20/00
Abstract: The technology disclosed proposes using a combination of a computationally cheap, less-accurate bag-of-words (BoW) model and a computationally expensive, more-accurate long short-term memory (LSTM) model to perform natural language processing tasks such as sentiment analysis. The use of the cheap, less-accurate BoW model is referred to herein as “skimming”. The use of the expensive, more-accurate LSTM model is referred to herein as “reading”. The technology disclosed presents a probability-based guider (PBG), which combines the BoW model and the LSTM model. The PBG uses a probability-thresholding strategy to determine, based on the results of the BoW model, whether to invoke the LSTM model to reliably classify a sentence as positive or negative. The technology disclosed also presents a deep neural network-based decision network (DDN) that is trained to learn the relationship between the BoW model and the LSTM model and to invoke only one of the two models.
-
Publication No.: US11436481B2
Publication Date: 2022-09-06
Application No.: US16134957
Filing Date: 2018-09-18
Applicant: salesforce.com, inc.
Inventor: Govardana Sachithanandam Ramachandran , Michael Machado , Shashank Harinath , Linwei Zhu , Yufan Xue , Abhishek Sharma , Jean-Marc Soumet , Bryan McCann
Abstract: A method for natural language processing includes receiving, by one or more processors, an unstructured text input. An entity classifier is used to identify entities in the unstructured text input. Identifying the entities includes generating, using a plurality of sub-classifiers of a hierarchical neural network classifier of the entity classifier, a plurality of lower-level entity identifications associated with the unstructured text input. Identifying the entities further includes generating, using a combiner of the hierarchical neural network classifier, a plurality of higher-level entity identifications associated with the unstructured text input based on the plurality of lower-level entity identifications. Identified entities are provided based on the plurality of higher-level entity identifications.
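The two-level arrangement in this abstract can be sketched as follows. The sub-classifiers here are crude keyword detectors and the combiner a fixed aggregation; both are toy stand-ins for the patent's trained neural networks, and all class names are hypothetical.

```python
# Sketch of a hierarchical entity classifier: sub-classifiers emit
# lower-level entity scores, and a combiner maps them to higher-level
# entity identifications.

def sub_classifiers(text):
    """Lower level: keyword detectors standing in for neural sub-classifiers."""
    t = text.lower()
    return {
        "person_name": float("alice" in t or "bob" in t),
        "city": float("paris" in t or "tokyo" in t),
        "country": float("france" in t or "japan" in t),
    }

def combiner(lower):
    """Higher level: aggregate lower-level signals into coarse entity types."""
    return {
        "PERSON": lower["person_name"],
        "LOCATION": max(lower["city"], lower["country"]),
    }

def identify_entities(text, threshold=0.5):
    """Return the higher-level entity types detected in the text."""
    higher = combiner(sub_classifiers(text))
    return sorted(k for k, v in higher.items() if v >= threshold)

entities = identify_entities("Alice flew to Paris last week")
```

The benefit of the hierarchy is that fine-grained detectors can be added or retrained without changing the coarse labels downstream consumers see.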
-
Publication No.: US11409945B2
Publication Date: 2022-08-09
Application No.: US17027130
Filing Date: 2020-09-21
Applicant: salesforce.com, inc.
Inventor: Bryan McCann , Caiming Xiong , Richard Socher
IPC: G06F40/126 , G06N3/08 , G06N3/04 , G06F40/30 , G06F40/47 , G06F40/205 , G06F40/289 , G06F40/44 , G06F40/58
Abstract: A system is provided for natural language processing. In some embodiments, the system includes an encoder for generating context-specific word vectors for at least one input sequence of words. The encoder is pre-trained using training data for performing a first natural language processing task. A neural network performs a second natural language processing task on the at least one input sequence of words using the context-specific word vectors. The first natural language processing task is different from the second natural language processing task, and the neural network is trained separately from the encoder. In some embodiments, the first natural language processing task can be machine translation, and the second natural language processing task can be one of sentiment analysis, question classification, entailment classification, and question answering.
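The transfer setup in this abstract can be sketched as follows. As a deliberate simplification, the pretrained encoder is a frozen random linear map and the task-2 network a single pooled linear layer; the real patent uses a translation-pretrained recurrent encoder, so every weight and name here is an assumption.

```python
# Sketch of reusing a pretrained encoder: an encoder trained on task 1
# produces context-specific word vectors, and a separately trained
# network consumes them for a different task 2.
import numpy as np

rng = np.random.default_rng(42)
D_IN, D_CTX = 8, 6

# pretend these weights came from pretraining on task 1 (e.g. translation)
W_enc = rng.standard_normal((D_IN, D_CTX))

def encode(word_vectors):
    """Frozen pretrained encoder: word vectors -> context-specific vectors."""
    return word_vectors @ W_enc

# the task-2 head is trained separately, on top of the frozen encoder output
W_task2 = rng.standard_normal((D_CTX, 2))

def classify(word_vectors):
    ctx = encode(word_vectors)           # reuse pretrained representations
    logits = ctx.mean(axis=0) @ W_task2  # pool over the sequence, then score
    return int(np.argmax(logits))

label = classify(rng.standard_normal((5, D_IN)))
```

The key design point is the separation: only `W_task2` would be updated when training task 2, leaving the pretrained `W_enc` untouched.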
-
Publication No.: US20210141865A1
Publication Date: 2021-05-13
Application No.: US16680323
Filing Date: 2019-11-11
Applicant: salesforce.com, inc.
Inventor: Michael Machado , James Douglas Harrison , Caiming Xiong , Xinyi Yang , Thomas Archie Cook , Roojuta Lalani , Jean-Marc Soumet , Karl Ryszard Skucha , Juan Manuel Rodriguez , Manju Vijayakumar , Vishal Motwani , Tian Xie , Bryan McCann , Nitish Shirish Keskar , Armen Abrahamyan , Zhihao Zou , Chitra Gulabrani , Minal Khodani , Adarsha Badarinath , Rohiniben Thakar , Srikanth Kollu , Kevin Schoen , Qiong Liu , Amit Hetawal , Kevin Zhang , Kevin Zhang , Victor Brouk , Johnson Liu , Rafael Amsili
Abstract: A multi-tenant system performs custom configuration of a tenant-specific chatbot to process and act upon natural language requests. The multi-tenant system configures the tenant-specific chatbots without requiring tenant-specific training. The multi-tenant system provides a user interface for configuring a tenant-specific set of permitted actions. The multi-tenant system determines a set of example phrases for each of the selected permitted actions. The multi-tenant system receives a natural language request from a user and identifies the action that the user wants to perform. The multi-tenant system uses a neural network to compare the natural language request with example phrases to identify an example phrase that matches the natural language request. The multi-tenant system performs the action corresponding to the matching example phrase.
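The request-to-example matching step can be sketched as below. The patent compares phrases with a neural network; as an assumption for illustration, a bag-of-words cosine similarity stands in for the learned embedding, and the action names are hypothetical.

```python
# Sketch of routing a natural language request to a tenant's permitted
# action by finding the most similar configured example phrase.
from collections import Counter
import math

def embed(text):
    """Stand-in for a neural embedding: a bag-of-words count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def route_request(request, action_examples):
    """Return the permitted action whose example phrase best matches."""
    req = embed(request)
    best = max(((cosine(req, embed(ex)), action)
                for action, examples in action_examples.items()
                for ex in examples), key=lambda p: p[0])
    return best[1]

actions = {
    "reset_password": ["reset my password", "forgot password"],
    "check_order":    ["where is my order", "order status"],
}
action = route_request("i forgot my password", actions)
```

Because matching is against per-tenant example phrases rather than per-tenant model weights, new tenants need configuration but no retraining, which is the point the abstract makes.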
-
Publication No.: US10817650B2
Publication Date: 2020-10-27
Application No.: US15982841
Filing Date: 2018-05-17
Applicant: salesforce.com, inc.
Inventor: Bryan McCann , Caiming Xiong , Richard Socher
IPC: G06F40/126 , G06N3/08 , G06N3/04 , G06F40/30 , G06F40/47 , G06F40/205 , G06F40/289 , G06F40/44 , G06F40/58
Abstract: A system is provided for natural language processing. In some embodiments, the system includes an encoder for generating context-specific word vectors for at least one input sequence of words. The encoder is pre-trained using training data for performing a first natural language processing task. A neural network performs a second natural language processing task on the at least one input sequence of words using the context-specific word vectors. The first natural language processing task is different from the second natural language processing task, and the neural network is trained separately from the encoder. In some embodiments, the first natural language processing task can be machine translation, and the second natural language processing task can be one of sentiment analysis, question classification, entailment classification, and question answering.
-
Publication No.: US20200334334A1
Publication Date: 2020-10-22
Application No.: US16518905
Filing Date: 2019-07-22
Applicant: salesforce.com, inc.
Inventor: Nitish Shirish Keskar , Bryan McCann , Richard Socher , Caiming Xiong
IPC: G06F17/27
Abstract: Systems and methods for unifying question answering and text classification via span extraction include a preprocessor for preparing a source text and an auxiliary text based on a task type of a natural language processing (NLP) task, an encoder for receiving the source text and the auxiliary text from the preprocessor and generating an encoded representation of a combination of the source text and the auxiliary text, and a span-extractive decoder for receiving the encoded representation and identifying a span of text within the source text that is the result of the NLP task. The task type is one of entailment, classification, or regression. In some embodiments, the source text includes one or more of text received as input when the task type is entailment, a list of classifications when the task type is entailment or classification, or a list of similarity options when the task type is regression.
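The span-extractive decoding step can be sketched as below. In the patent the start/end scores come from the encoder's representation; here they are given directly as arrays, which is an assumption made to keep the sketch self-contained.

```python
# Sketch of span extraction: score each token position as a possible
# start and end of the answer span, then pick the best valid (start, end)
# pair with start <= end.
import numpy as np

def extract_span(start_logits, end_logits, max_len=None):
    """Return (i, j), i <= j, maximizing start_logits[i] + end_logits[j]."""
    n = len(start_logits)
    best, span = -np.inf, (0, 0)
    for i in range(n):
        j_hi = n if max_len is None else min(n, i + max_len)
        for j in range(i, j_hi):
            score = start_logits[i] + end_logits[j]
            if score > best:
                best, span = score, (i, j)
    return span

tokens = ["the", "answer", "is", "forty", "two", "."]
start = np.array([0.1, 0.2, 0.1, 2.0, 0.3, 0.0])
end   = np.array([0.0, 0.1, 0.2, 0.3, 2.5, 0.1])
i, j = extract_span(start, end)
answer = " ".join(tokens[i:j + 1])
```

Framing classification as span extraction works by appending the label list to the source text, so the "answer span" is simply the chosen label.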
-
Publication No.: US10699060B2
Publication Date: 2020-06-30
Application No.: US16000638
Filing Date: 2018-06-05
Applicant: salesforce.com, inc.
Inventor: Bryan McCann , Caiming Xiong , Richard Socher
IPC: G06F40/126 , G06N3/08 , G06N3/04 , G06F40/30 , G06F40/47 , G06F40/205 , G06F40/289 , G06F40/44 , G06F40/58
Abstract: A system includes a neural network for performing a first natural language processing task. The neural network includes a first rectified linear unit capable of executing an activation function on a first input related to a first word sequence, and a second rectified linear unit capable of executing an activation function on a second input related to a second word sequence. A first encoder is capable of receiving the result from the first rectified linear unit and generating a first task-specific representation relating to the first word sequence, and a second encoder is capable of receiving the result from the second rectified linear unit and generating a second task-specific representation relating to the second word sequence. A biattention mechanism is capable of computing, based on the first and second task-specific representations, an interdependent representation related to the first and second word sequences. In some embodiments, the first natural language processing task performed by the neural network is either sentiment classification or entailment classification.
-