-
1.
Publication Number: US20210374349A1
Publication Date: 2021-12-02
Application Number: US17444693
Filing Date: 2021-08-09
Inventor: Jiachen LIU , Xinyan XIAO , Hua WU , Haifeng WANG
IPC: G06F40/295 , G06N20/00 , G06N5/02
Abstract: A method for text generation relates to the field of natural language processing and includes: obtaining corpus data; labeling the corpus data to obtain a first constraint element; obtaining a first generation target; and generating a first text matching the first generation target by inputting the corpus data and the first constraint element into a generation model.
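A minimal Python sketch of the flow described in this abstract, under heavy assumptions: `ConstraintElement`, `label_corpus`, and the echo-style `generate` function are hypothetical stand-ins for the labeling step and the generation model, not the patented components.

```python
from dataclasses import dataclass

@dataclass
class ConstraintElement:
    """A constraint obtained by labeling the corpus (e.g. an entity span)."""
    text: str
    label: str

def label_corpus(corpus: str) -> list[ConstraintElement]:
    # Placeholder labeler: treat capitalized tokens as constrained entities.
    return [ConstraintElement(tok, "ENTITY") for tok in corpus.split() if tok.istitle()]

def generate(corpus: str, constraints: list[ConstraintElement], target: str) -> str:
    # Placeholder "generation model": echo the target and the constrained spans.
    kept = ", ".join(c.text for c in constraints)
    return f"[{target}] text conditioned on constraints: {kept}"

corpus = "Baidu Research released a new translation system in Beijing"
constraints = label_corpus(corpus)        # first constraint element(s)
first_generation_target = "headline"      # first generation target
print(generate(corpus, constraints, first_generation_target))
```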
-
2.
Publication Number: US20180321995A1
Publication Date: 2018-11-08
Application Number: US16039151
Filing Date: 2018-07-18
Inventor: Yu MA , Weide ZHANG , Wei HE , Haifeng WANG , Yibing LIANG , Zhuo CHEN
CPC classification number: G06F9/546 , G06F11/3024 , G06F11/3055 , G06F2209/508 , Y10S901/50
Abstract: This disclosure discloses a method and apparatus for monitoring a message transmission frequency in a robot operating system. A specific implementation of the method includes: writing to-be-transmitted messages into a pre-allocated memory; obtaining the time points when the to-be-transmitted messages are written into the memory, and recording the time points in a preset time point list; determining a message transmission frequency within a preset time interval based on the time points in the time point list; and comparing the message transmission frequency with a preset message transmission frequency threshold, and generating monitoring information based on the comparison result. This implementation monitors the message transmission frequency of a process, which avoids adding monitoring-related code to each application, thereby reducing the program debugging cost and improving the monitoring efficiency.
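A minimal sketch of the monitoring steps listed above, assuming a hypothetical `FrequencyMonitor` class; the in-memory list standing in for the pre-allocated memory and the chosen interval and threshold are illustrative, not the actual implementation.

```python
import time
from collections import deque

class FrequencyMonitor:
    """Records write time points and reports when the message rate exceeds a threshold."""

    def __init__(self, interval_s: float, max_rate_hz: float):
        self.interval_s = interval_s
        self.max_rate_hz = max_rate_hz
        self.timestamps = deque()      # preset time point list
        self.buffer = []               # stands in for the pre-allocated memory

    def write_message(self, message: bytes) -> None:
        self.buffer.append(message)                    # write the to-be-transmitted message
        self.timestamps.append(time.monotonic())       # record its time point

    def check(self) -> str:
        now = time.monotonic()
        # Keep only the time points inside the preset time interval.
        while self.timestamps and now - self.timestamps[0] > self.interval_s:
            self.timestamps.popleft()
        rate = len(self.timestamps) / self.interval_s   # message transmission frequency
        status = "OK" if rate <= self.max_rate_hz else "TOO HIGH"
        return f"rate={rate:.1f} msg/s ({status})"      # monitoring information

monitor = FrequencyMonitor(interval_s=1.0, max_rate_hz=100.0)
for _ in range(250):
    monitor.write_message(b"pose_update")
print(monitor.check())
```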
-
3.
Publication Number: US20220414691A1
Publication Date: 2022-12-29
Application Number: US17622950
Filing Date: 2021-06-02
Inventor: Jizhou HUANG , Haifeng WANG , Miao FAN , Yibo SUN
Abstract: Technical solutions relate to the field of big data technologies. A technical solution includes: pre-training a time series prediction model using first historical regional heat data; and taking second historical regional heat data as a second support set and further training the time series prediction model using the second support set to adjust model parameters, so as to obtain the regional heat prediction model. The regional heat prediction model is configured to predict a second query set, and the second query set includes regional heat at a prediction time.
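A toy sketch of the pre-train-then-adapt flow in this abstract, assuming the time series prediction model is a simple AR(1) coefficient; the data, learning rate, and step count are illustrative, not the patented model.

```python
from typing import Optional

import numpy as np

def fit_ar1(series: np.ndarray, w: Optional[float] = None, lr: float = 1e-3) -> float:
    """Least-squares AR(1) coefficient; warm-started from w when fine-tuning."""
    x, y = series[:-1], series[1:]
    if w is None:                                   # pre-training: closed-form fit
        return float(x @ y / (x @ x))
    for _ in range(50):                             # fine-tuning: small gradient steps from w
        w -= lr * float(2 * x @ (w * x - y) / len(x))
    return w

first_history = 5 + np.sin(np.linspace(0, 20, 200))        # first historical regional heat data
second_support = 3 + 0.5 * np.sin(np.linspace(0, 5, 50))   # second support set

w = fit_ar1(first_history)                # pre-train the time series prediction model
w = fit_ar1(second_support, w=w)          # adjust parameters on the second support set
print("predicted regional heat at the next time step:", round(w * second_support[-1], 3))
```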
-
4.
Publication Number: US20220398465A1
Publication Date: 2022-12-15
Application Number: US17620820
Filing Date: 2021-06-02
Inventor: Jizhou HUANG , Jingbo ZHOU , An ZHUO , Ji LIU , Haoyi XIONG , Dejing DOU , Haifeng WANG
IPC: G06N5/02
Abstract: A technical solution relates to a big data technology in the field of artificial intelligence technologies. The technical solution includes: acquiring training data including annotation results of a risk grade of each sample region and a risk grade of a district to which each sample region belongs; and training an initial model including an encoder, a discriminator and a classifier using the training data, and obtaining the risk prediction model using the encoder and the classifier after the training process. The encoder performs a coding operation using region features of the sample regions to obtain a feature representation of each sample region; the discriminator identifies the risk grade of the district to which the sample region belongs according to the feature representation of the sample region; and the classifier identifies the risk grade of the sample region according to the feature representation of the sample region.
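A simplified PyTorch sketch of the encoder/discriminator/classifier layout in this abstract; all dimensions, grade counts, and the joint (non-adversarial) training of the discriminator head are assumptions, since the abstract does not specify them.

```python
import torch
from torch import nn

n_features, n_region_grades, n_district_grades = 16, 4, 4   # assumed sizes

encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU())   # region features -> representation
classifier = nn.Linear(32, n_region_grades)          # risk grade of the sample region
discriminator = nn.Linear(32, n_district_grades)     # risk grade of the region's district

params = [*encoder.parameters(), *classifier.parameters(), *discriminator.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
ce = nn.CrossEntropyLoss()

# Toy annotated training data: region features plus region-level and district-level grades.
x = torch.randn(8, n_features)
region_grade = torch.randint(0, n_region_grades, (8,))
district_grade = torch.randint(0, n_district_grades, (8,))

for _ in range(100):
    h = encoder(x)                                    # feature representation of each sample region
    loss = ce(classifier(h), region_grade) + ce(discriminator(h), district_grade)
    opt.zero_grad(); loss.backward(); opt.step()

risk_prediction_model = nn.Sequential(encoder, classifier)   # encoder + classifier kept after training
print(risk_prediction_model(x).argmax(dim=1))
```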
-
5.
Publication Number: US20210374359A1
Publication Date: 2021-12-02
Application Number: US17133381
Filing Date: 2020-12-23
Inventor: Wei LI , Xinyan XIAO , Hua WU , Haifeng WANG
Abstract: The disclosure may provide a method for obtaining a document layout, an electronic device, and a storage medium. The method may include: obtaining a plurality of pieces of first sample data; extracting structured information from each of the plurality of pieces of first sample data as target structured information corresponding to each of the plurality of pieces of first sample data; inputting the plurality of pieces of first sample data into an initial text generation model to generate predicted structured information corresponding to each of the plurality of pieces of first sample data; generating a first loss value based on a difference between the predicted structured information corresponding to each of the plurality of pieces of first sample data and the corresponding target structured information; and training the phrase generation ability of the initial text generation model based on the first loss value to generate the text generation model.
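A much-simplified sketch of the first-loss training signal described here: the `STRUCTURE_TAGS` list, the rule-based extractor, the bag-of-character features, and the linear layer standing in for the initial text generation model are all assumptions made for illustration.

```python
import torch
from torch import nn

STRUCTURE_TAGS = ["title", "heading", "list", "table"]   # hypothetical layout fields

def extract_structured_info(doc: str) -> torch.Tensor:
    """Rule-based extraction of target structured information (toy multi-hot vector)."""
    return torch.tensor([float(tag in doc.lower()) for tag in STRUCTURE_TAGS])

def featurize(doc: str) -> torch.Tensor:
    # Crude bag-of-character features standing in for the real document encoding.
    return torch.tensor([doc.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"], dtype=torch.float)

docs = ["Title: report\nHeading 1 ...", "A table of results with a list of items"]
targets = torch.stack([extract_structured_info(d) for d in docs])   # target structured information
features = torch.stack([featurize(d) for d in docs])

model = nn.Linear(26, len(STRUCTURE_TAGS))   # stand-in for the initial text generation model
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(200):
    predicted = model(features)                  # predicted structured information
    first_loss = loss_fn(predicted, targets)     # difference between prediction and target
    opt.zero_grad(); first_loss.backward(); opt.step()

print(torch.sigmoid(model(features)).round())
```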
-
6.
Publication Number: US20210192284A1
Publication Date: 2021-06-24
Application Number: US16901940
Filing Date: 2020-06-15
Inventor: Hao XIONG , Zhongjun HE , Zhi LI , Hua WU , Haifeng WANG
IPC: G06K9/62 , G06F40/117
Abstract: The present disclosure provides an end-to-end model training method and apparatus, which relates to a field of artificial intelligence technologies. The method includes: obtaining training data containing a plurality of training samples, in which the plurality of training samples include an original sequence, a target sequence and a corresponding tag list, the tag list including importance tags in the target sequence and avoidance tags corresponding to the importance tags, the avoidance tags being irrelevant tags associated with the importance tags; and adopting the training data to train a preset end-to-end model until a value of a preset optimization target function is smaller than a preset threshold.
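A toy sketch of training until the optimization target drops below a preset threshold. How the tag list enters the loss is not described in the abstract; the per-position weighting of importance and avoidance tags below, and the embedding-plus-linear stand-in model, are assumptions.

```python
import torch
from torch import nn

vocab, dim, threshold = 20, 16, 0.1
# One toy training sample: original sequence, target sequence, and its tag list.
original = torch.randperm(vocab)[:6]
target = torch.randint(0, vocab, (6,))
importance_pos = torch.tensor([0, 2])    # positions of importance tags in the target (assumed)
avoidance_pos = torch.tensor([1, 3])     # positions of their avoidance (irrelevant) tags (assumed)

model = nn.Sequential(nn.Embedding(vocab, dim), nn.Linear(dim, vocab))   # stand-in end-to-end model
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
ce = nn.CrossEntropyLoss(reduction="none")

for step in range(5000):
    per_token = ce(model(original), target)
    weights = torch.ones_like(per_token)
    weights[importance_pos] = 2.0        # emphasize importance tags (assumed weighting)
    weights[avoidance_pos] = 0.5         # de-emphasize avoidance tags (assumed weighting)
    loss = (weights * per_token).mean()  # preset optimization target function
    opt.zero_grad(); loss.backward(); opt.step()
    if loss.item() < threshold:          # train until the target is below the preset threshold
        break
print(f"stopped at step {step}, optimization target {loss.item():.3f}")
```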
-
7.
Publication Number: US20210182498A1
Publication Date: 2021-06-17
Application Number: US16885358
Filing Date: 2020-05-28
Inventor: Yu SUN , Haifeng WANG , Shuohuan WANG , Yukun LI , Shikun FENG , Hao TIAN , Hua WU
Abstract: The present disclosure provides a method, apparatus, electronic device and storage medium for processing a semantic representation model, and relates to the field of artificial intelligence technologies. A specific implementation solution is: collecting a training corpus set including a plurality of training corpuses; and training the semantic representation model using the training corpus set based on at least one of lexicon, grammar and semantics. By building unsupervised or weakly-supervised training tasks at three different levels, namely lexicon, grammar and semantics, the semantic representation model is enabled to learn knowledge at the lexical, grammatical and semantic levels from massive data, which enhances its capability of universal semantic representation and improves the processing effect of NLP tasks.
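A compact sketch of multi-level pre-training with a shared encoder and one head per level. The three example tasks (masked-word, sentence-order, sentence-relation), the toy data, and all shapes are assumptions; the abstract names the levels but not the concrete tasks.

```python
import torch
from torch import nn

# Shared encoder with one head per pre-training level (all shapes are assumed).
encoder = nn.Sequential(nn.Embedding(100, 32), nn.Flatten(), nn.Linear(8 * 32, 64), nn.ReLU())
heads = {
    "lexicon":   nn.Linear(64, 100),   # e.g. predict a masked word
    "grammar":   nn.Linear(64, 2),     # e.g. is the sentence order correct?
    "semantics": nn.Linear(64, 3),     # e.g. relation between two sentences
}
params = list(encoder.parameters()) + [p for h in heads.values() for p in h.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
ce = nn.CrossEntropyLoss()

tokens = torch.randint(0, 100, (4, 8))                      # a toy training corpus batch
labels = {"lexicon": torch.randint(0, 100, (4,)),
          "grammar": torch.randint(0, 2, (4,)),
          "semantics": torch.randint(0, 3, (4,))}

for _ in range(10):
    h = encoder(tokens)
    loss = sum(ce(head(h), labels[name]) for name, head in heads.items())
    opt.zero_grad(); loss.backward(); opt.step()
print("joint lexicon/grammar/semantics loss:", loss.item())
```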
-
8.
Publication Number: US20190384810A1
Publication Date: 2019-12-19
Application Number: US16176783
Filing Date: 2018-10-31
Inventor: Jizhou HUANG , Yaming SUN , Wei ZHANG , Haifeng WANG
Abstract: The present disclosure provides a method of training a descriptive text generating model, and a method and apparatus for generating a descriptive text. The method of training a descriptive text generating model comprises: obtaining training data, the training data comprising a notional word, a first descriptive text and a second descriptive text of the notional word, wherein the second descriptive text is a concise expression of the first descriptive text; regarding the notional word and the first descriptive text of the notional word as input of a seq2seq model, regarding the second descriptive text of the notional word as output of the seq2seq model, and training the seq2seq model to obtain a descriptive text generating model. The descriptive text generating model according to the present disclosure can generate a concise descriptive text for the notional word in a deep understanding manner.
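A toy sketch of the input/output arrangement this abstract describes: the input concatenates the notional word with its first descriptive text, and the target is the second (concise) descriptive text. The bag-of-words encoding and the tiny non-autoregressive encoder-decoder below are stand-ins for a real seq2seq model, not the disclosed one.

```python
import torch
from torch import nn

notional_word = "transformer"
first_text = "a neural network architecture based on self attention introduced in 2017"
second_text = "attention based neural architecture"   # concise expression of the first text

vocab = sorted(set((notional_word + " " + first_text + " " + second_text).split()))
idx = {w: i for i, w in enumerate(vocab)}

def encode(text: str) -> torch.Tensor:
    bow = torch.zeros(len(vocab))
    for w in text.split():
        bow[idx[w]] += 1
    return bow

source = encode(notional_word + " " + first_text)            # input: notional word + first text
target = torch.tensor([idx[w] for w in second_text.split()]) # output: second descriptive text

# Tiny non-autoregressive encoder-decoder standing in for the seq2seq model.
model = nn.Sequential(nn.Linear(len(vocab), 32), nn.ReLU(), nn.Linear(32, len(target) * len(vocab)))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
ce = nn.CrossEntropyLoss()

for _ in range(200):
    logits = model(source).view(len(target), len(vocab))
    loss = ce(logits, target)
    opt.zero_grad(); loss.backward(); opt.step()

prediction = model(source).view(len(target), len(vocab)).argmax(dim=1)
print("generated concise description:", " ".join(vocab[i] for i in prediction))
```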
-
9.
Publication Number: US20220171941A1
Publication Date: 2022-06-02
Application Number: US17348104
Filing Date: 2021-06-15
Inventor: Xuan OUYANG , Shuohuan WANG , Chao PANG , Yu SUN , Hao TIAN , Hua WU , Haifeng WANG
Abstract: The present disclosure provides a multi-lingual model training method, apparatus, electronic device and readable storage medium, and relates to the technical field of deep learning and natural language processing. A technical solution of the present disclosure for training the multi-lingual model is: obtaining training corpuses comprising a plurality of bilingual corpuses and a plurality of monolingual corpuses; training a multi-lingual model with a first training task by using the plurality of bilingual corpuses; training the multi-lingual model with a second training task by using the plurality of monolingual corpuses; and completing the training of the multi-lingual model upon determining that the loss functions of the first training task and the second training task converge. In the present disclosure, the multi-lingual model is enabled to achieve semantic interaction between different languages, and its accuracy in learning semantic representations is improved.
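A minimal sketch of alternating the two training tasks until both losses converge. The concrete tasks (a translation-style regression on bilingual pairs and a reconstruction objective on monolingual data), the linear stand-in model, and the convergence tolerance are assumptions; the abstract does not specify them.

```python
import torch
from torch import nn

model = nn.Linear(16, 16)                    # stand-in for the multi-lingual encoder
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

bilingual = (torch.randn(32, 16), torch.randn(32, 16))   # aligned sentence-pair embeddings (toy)
monolingual = torch.randn(32, 16)                         # single-language embeddings (toy)

def first_task_loss() -> torch.Tensor:
    # First task: map a sentence onto its translation (uses the bilingual corpuses).
    src, tgt = bilingual
    return ((model(src) - tgt) ** 2).mean()

def second_task_loss() -> torch.Tensor:
    # Second task: reconstruct the monolingual input (uses the monolingual corpuses).
    return ((model(monolingual) - monolingual) ** 2).mean()

prev = (float("inf"), float("inf"))
for step in range(5000):
    l1, l2 = first_task_loss(), second_task_loss()
    opt.zero_grad(); (l1 + l2).backward(); opt.step()
    # Training is complete once both task losses have converged.
    if abs(prev[0] - l1.item()) < 1e-6 and abs(prev[1] - l2.item()) < 1e-6:
        break
    prev = (l1.item(), l2.item())
print(f"stopped at step {step}: loss1={l1.item():.4f}, loss2={l2.item():.4f}")
```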
-
10.
Publication Number: US20220083949A1
Publication Date: 2022-03-17
Application Number: US17194129
Filing Date: 2021-03-05
Inventor: Haiwei WANG , Haifeng WANG , Wei HE , Ying LI , Jie WANG , Yong ZHU
Abstract: A method and apparatus for pushing information, a device and a storage medium are provided. An implementation of the method may include: acquiring a user identifier of a target user; in response to determining that the user identifier satisfies a preset condition, acquiring a work intention of the target user; determining target information according to the user identifier, the work intention and a pre-established knowledge graph; and pushing the target information to the target user.
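A minimal sketch of the described control flow, with an in-memory dict standing in for the pre-established knowledge graph; the user identifier, the preset condition, and the graph relations are illustrative assumptions.

```python
# Toy knowledge graph: (entity, relation) -> related information.
knowledge_graph = {
    ("user:42", "industry"): "software",
    ("software", "openings"): ["backend engineer", "data engineer"],
}

def satisfies_preset_condition(user_id: str) -> bool:
    # Assumed condition, e.g. the user recently updated a job-seeking profile.
    return user_id == "user:42"

def get_work_intention(user_id: str) -> str:
    return knowledge_graph[(user_id, "industry")]

def determine_target_information(intention: str) -> list[str]:
    return knowledge_graph.get((intention, "openings"), [])

def push_information(user_id: str) -> None:
    if not satisfies_preset_condition(user_id):
        return
    intention = get_work_intention(user_id)              # work intention of the target user
    for item in determine_target_information(intention):
        print(f"push to {user_id}: {item}")              # stand-in for the real push channel

push_information("user:42")
```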