Translation method, apparatus and storage medium

    Publication No.: US12210956B2

    Publication Date: 2025-01-28

    Application No.: US18074853

    Filing Date: 2022-12-05

    Abstract: The present disclosure provides a translation method and apparatus, an electronic device, and a non-transitory storage medium. An implementation includes: determining, by an encoding module, an encoded feature of a sentence to be translated; determining, by a graph network module, a knowledge fusion feature of the sentence to be translated based on a preset graph network, wherein the preset graph network is constructed based on a polysemous word in a source language corresponding to the sentence to be translated and a plurality of translated words corresponding to the polysemous word in a target language; and determining, by a decoding network, a translated sentence corresponding to the sentence to be translated based on the encoded feature and the knowledge fusion feature.
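
    A minimal Python sketch of the flow described in the abstract above, not the patented implementation: the preset graph network is reduced to a dictionary linking a polysemous source word to its candidate target-language translations, and the encoding, graph-fusion, and decoding modules are stand-in functions; the names (POLYSEMY_GRAPH, knowledge_fusion_feature, etc.) and the averaging-based fusion are illustrative assumptions.

        # Illustrative sketch only: a toy "graph network" as a dict from a
        # polysemous source word to its candidate target translations.
        import numpy as np

        EMB_DIM = 8
        rng = np.random.default_rng(0)

        # Hypothetical polysemy graph (assumption, not taken from the patent).
        POLYSEMY_GRAPH = {
            "bank": ["riverbank", "financial_institution"],
            "spring": ["season", "coil", "water_source"],
        }

        EMBED = {}  # toy embedding table standing in for learned embeddings
        def embed(token):
            if token not in EMBED:
                EMBED[token] = rng.normal(size=EMB_DIM)
            return EMBED[token]

        def encode(sentence):
            """Stand-in for the encoding module: mean of token embeddings."""
            return np.mean([embed(t) for t in sentence], axis=0)

        def knowledge_fusion_feature(sentence):
            """Stand-in for the graph network module: aggregate the embeddings of
            the target-language translations linked to any polysemous source word."""
            nodes = [t for w in sentence for t in POLYSEMY_GRAPH.get(w, [])]
            if not nodes:
                return np.zeros(EMB_DIM)
            return np.mean([embed(t) for t in nodes], axis=0)

        def decode(encoded, fused):
            """Stand-in for the decoding network: combine the two features."""
            return np.concatenate([encoded, fused])

        sentence = ["deposit", "money", "at", "the", "bank"]
        decoder_input = decode(encode(sentence), knowledge_fusion_feature(sentence))
        print(decoder_input.shape)  # (16,)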

    Translation method, electronic device and storage medium

    Publication No.: US12197882B2

    Publication Date: 2025-01-14

    Application No.: US17885152

    Filing Date: 2022-08-10

    Abstract: A translation method, an electronic device and a storage medium, which relate to the field of artificial intelligence technologies such as machine learning technologies and information processing technologies, are disclosed. An implementation includes: acquiring an intermediate translation result generated by each of multiple pre-trained translation models for a to-be-translated specified sentence in a same iteration of a translation process, so as to obtain multiple intermediate translation results; acquiring a co-occurrence word based on the multiple intermediate translation results; and acquiring a target translation result of the specified sentence based on the co-occurrence word.
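
    A minimal Python sketch of one plausible reading of the abstract above, not the claimed algorithm: tokens that co-occur across the intermediate results of several translation models are collected, and the hypothesis covering the most of them is kept; the min_models threshold and the selection rule are assumptions.

        # Illustrative sketch only: co-occurrence words across multiple
        # intermediate translation results from different models.
        from collections import Counter

        def co_occurrence_words(intermediate_results, min_models=2):
            """Tokens appearing in at least `min_models` of the intermediate results."""
            counts = Counter()
            for tokens in intermediate_results:
                counts.update(set(tokens))  # count each token once per model
            return {tok for tok, c in counts.items() if c >= min_models}

        def pick_target_result(intermediate_results):
            """Assumed rule: keep the hypothesis covering the most co-occurrence words."""
            shared = co_occurrence_words(intermediate_results)
            return max(intermediate_results, key=lambda toks: len(shared & set(toks)))

        hypotheses = [
            ["the", "cat", "sat", "on", "the", "mat"],
            ["a", "cat", "sat", "on", "a", "mat"],
            ["the", "cat", "is", "sitting", "on", "the", "mat"],
        ]
        print(sorted(co_occurrence_words(hypotheses)))
        print(pick_target_result(hypotheses))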

    Method and apparatus for training semantic retrieval network, electronic device and storage medium

    Publication No.: US12293300B2

    Publication Date: 2025-05-06

    Application No.: US17930221

    Filing Date: 2022-09-07

    Abstract: The disclosure provides a method for training a semantic retrieval network, an electronic device and a storage medium. The method includes: obtaining a training sample including a search term and n candidate documents corresponding to the search term, where n is an integer greater than 1; inputting the training sample into a ranking model to obtain n first correlation degrees output by the ranking model, wherein each first correlation degree represents a correlation between a candidate document and the search term; inputting the training sample into a semantic retrieval model to obtain n second correlation degrees output by the semantic retrieval model, wherein each second correlation degree represents a correlation between a candidate document and the search term; and training the semantic retrieval model and the ranking model jointly based on the n first correlation degrees and the n second correlation degrees.
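
    The following PyTorch sketch shows one common way such joint training could look, not the claimed procedure: both models score the n candidate documents, and the two score distributions are pulled together with a symmetric KL term; the TinyScorer class, the distillation-style loss, and all dimensions are assumptions.

        # Illustrative sketch only: one joint update of a ranking model and a
        # semantic retrieval model on the same (search term, n candidates) sample.
        import torch
        import torch.nn.functional as F

        class TinyScorer(torch.nn.Module):
            """Stand-in for either model: scores each candidate against the query."""
            def __init__(self, dim=16):
                super().__init__()
                self.proj = torch.nn.Linear(2 * dim, 1)
            def forward(self, query, docs):
                q = query.expand(docs.size(0), -1)                          # (n, dim)
                return self.proj(torch.cat([q, docs], dim=-1)).squeeze(-1)  # (n,)

        ranking_model, retrieval_model = TinyScorer(), TinyScorer()
        optimizer = torch.optim.Adam(
            list(ranking_model.parameters()) + list(retrieval_model.parameters()), lr=1e-3)

        query = torch.randn(1, 16)   # encoded search term (toy features)
        docs = torch.randn(5, 16)    # n = 5 encoded candidate documents (toy features)

        first_degrees = ranking_model(query, docs)     # n first correlation degrees
        second_degrees = retrieval_model(query, docs)  # n second correlation degrees

        p = F.log_softmax(first_degrees, dim=-1)
        q = F.log_softmax(second_degrees, dim=-1)
        loss = 0.5 * (F.kl_div(p, q, log_target=True, reduction="batchmean")
                      + F.kl_div(q, p, log_target=True, reduction="batchmean"))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()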

    Method and apparatus for acquiring pre-trained model

    Publication No.: US12277401B2

    Publication Date: 2025-04-15

    Application No.: US17502108

    Filing Date: 2021-10-15

    Abstract: The present disclosure discloses a method and apparatus for acquiring a pre-trained model, and relates to natural language processing and deep learning technologies in the field of artificial intelligence technologies. An implementation includes: acquiring training data, the training data including a single-modal language material and a multi-modal language material, and the multi-modal language material including a language material pair formed by a first-modal language material and a second-modal language material; and performing a multi-task training operation on a pre-trained model using the training data, the multi-task training operation including at least one cross-modal contrastive learning task and at least one single-modal learning task. The pre-trained language model obtained in the present disclosure may learn from different forms of language materials, i.e., the single-modal language material and the multi-modal language material, such that it may effectively process information in various modalities.
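
    A minimal PyTorch sketch of the kind of multi-task step described above, not the disclosed model: an InfoNCE-style cross-modal contrastive task on paired first-modal/second-modal materials is combined with a single-modal task on text-only material; the linear encoders, the MSE stand-in for the single-modal task, the temperature, and the equal loss weighting are assumptions.

        # Illustrative sketch only: one multi-task step mixing a cross-modal
        # contrastive loss with a single-modal (text-only) loss.
        import torch
        import torch.nn.functional as F

        dim = 32
        text_encoder = torch.nn.Linear(dim, dim)    # stand-in first-modal encoder
        image_encoder = torch.nn.Linear(dim, dim)   # stand-in second-modal encoder
        optimizer = torch.optim.Adam(
            list(text_encoder.parameters()) + list(image_encoder.parameters()), lr=1e-3)

        # Multi-modal language material: aligned (text, image) feature pairs.
        text_feats, image_feats = torch.randn(8, dim), torch.randn(8, dim)
        # Single-modal language material: text-only features.
        mono_text = torch.randn(8, dim)

        t = F.normalize(text_encoder(text_feats), dim=-1)
        v = F.normalize(image_encoder(image_feats), dim=-1)
        logits = t @ v.T / 0.07                     # cosine similarity / temperature
        targets = torch.arange(logits.size(0))
        contrastive = 0.5 * (F.cross_entropy(logits, targets)
                             + F.cross_entropy(logits.T, targets))

        # Single-modal task stand-in: reconstruct the text features from the encoding.
        single_modal = F.mse_loss(text_encoder(mono_text), mono_text)

        loss = contrastive + single_modal
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()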

    Method for processing information

    Publication No.: US12265842B2

    Publication Date: 2025-04-01

    Application No.: US18817035

    Filing Date: 2024-08-27

    Abstract: A method for processing information is provided. The method includes obtaining input information to be processed. The method further includes determining execution information associated with processing of the input information. The execution information includes at least one of memory information to be retrieved or tool information to be invoked. The method further includes obtaining, by using the execution information, at least one piece of processing result information corresponding to the processing of the input information. The method further includes using the at least one piece of processing result information to generate output information for feedback.
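
    A minimal Python sketch of one way to read the abstract above, not the claimed method: execution information (memory keys to retrieve, tools to invoke) is determined from the input, and the gathered processing results are folded into the output; the keyword routing and the MEMORY/TOOLS registries are assumptions.

        # Illustrative sketch only: determine execution information, then use it
        # to obtain processing results and generate output for feedback.
        MEMORY = {"user_name": "Alice", "last_topic": "patent search"}
        TOOLS = {"word_count": lambda text: str(len(text.split()))}

        def determine_execution_info(input_info):
            """Assumed keyword routing: pick memory keys and tools for this input."""
            plan = {"memory": [], "tools": []}
            if "name" in input_info:
                plan["memory"].append("user_name")
            if "count" in input_info:
                plan["tools"].append(("word_count", input_info))
            return plan

        def process(input_info):
            plan = determine_execution_info(input_info)
            results = [f"memory[{key}] = {MEMORY[key]}" for key in plan["memory"]]
            results += [f"{name} -> {TOOLS[name](arg)}" for name, arg in plan["tools"]]
            return "; ".join(results) if results else "no execution information needed"

        print(process("what is my name, and count the words in this sentence"))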

    METHOD AND DEVICE FOR TRAINING SPEECH TRANSLATION MODEL, AND STORAGE MEDIUM

    Publication No.: US20250054494A1

    Publication Date: 2025-02-13

    Application No.: US18930081

    Filing Date: 2024-10-29

    Abstract: A method for training a speech translation model includes: obtaining a trained first text translation model and a speech recognition model, and constructing a candidate speech translation model to be trained based on the first text translation model and the speech recognition model; obtaining at least one of a first sample source language speech or a first sample source language text as a training sample of the candidate speech translation model; and training the candidate speech translation model based on the training sample until the training is completed, to obtain a trained target speech translation model.
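
    A minimal PyTorch sketch of the cascade construction described above, not the patented training method: a candidate speech translation model chains a speech recognition model into a text translation model and is trained on whichever sample kind is available (source-language speech and/or text); the classes, the MSE losses, and the toy feature shapes are assumptions.

        # Illustrative sketch only: build a candidate speech translation model
        # from an ASR model and a text translation model, then run one update.
        import torch
        import torch.nn.functional as F

        class SpeechRecognitionModel(torch.nn.Module):
            def __init__(self, dim=16):
                super().__init__()
                self.net = torch.nn.Linear(dim, dim)   # speech features -> text features
            def forward(self, speech):
                return self.net(speech)

        class TextTranslationModel(torch.nn.Module):
            def __init__(self, dim=16):
                super().__init__()
                self.net = torch.nn.Linear(dim, dim)   # source text -> target features
            def forward(self, text):
                return self.net(text)

        class CandidateSpeechTranslationModel(torch.nn.Module):
            """Constructed from the trained text translation model and the ASR model."""
            def __init__(self, asr, mt):
                super().__init__()
                self.asr, self.mt = asr, mt
            def forward(self, speech=None, text=None):
                if speech is not None:
                    text = self.asr(speech)            # speech path goes through ASR first
                return self.mt(text)

        model = CandidateSpeechTranslationModel(SpeechRecognitionModel(), TextTranslationModel())
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

        # Training sample: sample source-language speech and/or text, with a
        # target-language reference (all random stand-in features here).
        sample_speech, sample_text, target = torch.randn(4, 16), torch.randn(4, 16), torch.randn(4, 16)
        loss = (F.mse_loss(model(speech=sample_speech), target)
                + F.mse_loss(model(text=sample_text), target))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()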

    METHOD AND APPARATUS FOR TRAINING SEMANTIC RETRIEVAL NETWORK, ELECTRONIC DEVICE AND STORAGE MEDIUM

    Publication No.: US20230004819A1

    Publication Date: 2023-01-05

    Application No.: US17930221

    Filing Date: 2022-09-07

    Abstract: The disclosure provides a method for training a semantic retrieval network, an electronic device and a storage medium. The method includes: obtaining a training sample including a search term and n candidate documents corresponding to the search term, where n is an integer greater than 1; inputting the training sample into a ranking model to obtain n first correlation degrees output by the ranking model, wherein each first correlation degree represents a correlation between a candidate document and the search term; inputting the training sample into a semantic retrieval model to obtain n second correlation degrees output by the semantic retrieval model, wherein each second correlation degree represents a correlation between a candidate document and the search term; and training the semantic retrieval model and the ranking model jointly based on the n first correlation degrees and the n second correlation degrees.
