VARIANCE PROPAGATION FOR QUANTIZATION
    Invention Application

    Publication (Announcement) No.: US20190354865A1

    Publication (Announcement) Date: 2019-11-21

    Application No.: US16417430

    Filing Date: 2019-05-20

    Abstract: A neural network may be configured to receive, during a training phase of the neural network, a first input at an input layer of the neural network. The neural network may determine, during the training phase, a first classification at an output layer of the neural network based on the first input. The neural network may adjust, during the training phase and based on a comparison between the determined first classification and an expected classification of the first input, weights for artificial neurons of the neural network based on a loss function. The neural network may output, during an operational phase of the neural network, a second classification determined based on a second input, the second classification being determined by processing the second input through the artificial neurons using the adjusted weights.
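    The training and inference flow described in this abstract is the standard supervised loop: classify a first input, compare the result to the expected classification via a loss function, adjust the weights, then classify a second input with the adjusted weights. The sketch below is a hypothetical illustration in PyTorch; the layer sizes, optimizer, and loss function are illustrative assumptions and are not taken from the patent.

    import torch
    import torch.nn as nn

    # Illustrative model; the patent does not specify an architecture.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Training phase: determine a classification for the first input,
    # compare it to the expected classification, and adjust the weights.
    first_input = torch.randn(8, 16)
    expected = torch.randint(0, 4, (8,))
    logits = model(first_input)          # classification at the output layer
    loss = loss_fn(logits, expected)     # comparison via the loss function
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()                     # weights adjusted

    # Operational phase: classify a second input using the adjusted weights.
    second_input = torch.randn(1, 16)
    with torch.no_grad():
        second_classification = model(second_input).argmax(dim=-1)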

    SPARSITY-INDUCING FEDERATED MACHINE LEARNING

    Publication (Announcement) No.: US20230169350A1

    Publication (Announcement) Date: 2023-06-01

    Application No.: US18040111

    Filing Date: 2021-09-28

    CPC classification number: G06N3/098

    Abstract: Aspects described herein provide techniques for performing federated learning of a machine learning model, comprising: for each respective client of a plurality of clients and for each training round in a plurality of training rounds: generating a subset of model elements for the respective client based on sampling a gate probability distribution for each model element of a set of model elements for a global machine learning model; transmitting to the respective client: the subset of model elements; and a set of gate probabilities based on the sampling, wherein each gate probability of the set of gate probabilities is associated with one model element of the subset of model elements; receiving from each respective client of the plurality of clients a respective set of model updates; and updating the global machine learning model based on the respective set of model updates from each respective client of the plurality of clients.
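    A minimal server-side sketch of the round structure described in this abstract follows: for each client, sample a gate per model element, transmit the resulting subset of elements together with their gate probabilities, collect the clients' updates, and update the global model. The sigmoid gate parameterization, the stub client, and averaging as the aggregation rule are illustrative assumptions, not details from the patent.

    import numpy as np

    rng = np.random.default_rng(0)

    # Global model as named elements, each with an associated gate logit.
    global_model = {f"w{i}": rng.normal(size=(4, 4)) for i in range(6)}
    gate_logits = {name: 0.0 for name in global_model}

    def gate_probability(logit):
        # Illustrative sigmoid gate; the patent's gate distribution may differ.
        return 1.0 / (1.0 + np.exp(-logit))

    class StubClient:
        def train(self, subset, gate_probs):
            # Placeholder local training: return a small update per received element.
            return {n: 0.01 * rng.normal(size=w.shape) for n, w in subset.items()}

    def run_round(clients):
        all_updates = []
        for client in clients:
            # Sample a gate for each model element; keep the elements whose gate is on.
            probs = {n: gate_probability(l) for n, l in gate_logits.items()}
            kept = [n for n, p in probs.items() if rng.random() < p]
            subset = {n: global_model[n] for n in kept}
            sent_probs = {n: probs[n] for n in kept}
            # Transmit the subset and its gate probabilities; receive the client's updates.
            all_updates.append(client.train(subset, sent_probs))
        # Update the global model from the clients' updates (simple averaging here).
        for name in global_model:
            deltas = [u[name] for u in all_updates if name in u]
            if deltas:
                global_model[name] += np.mean(deltas, axis=0)

    run_round([StubClient(), StubClient()])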

    PRIVACY-AWARE PRUNING IN MACHINE LEARNING

    Publication (Announcement) No.: US20220318412A1

    Publication (Announcement) Date: 2022-10-06

    Application No.: US17223946

    Filing Date: 2021-04-06

    Abstract: Certain aspects of the present disclosure provide techniques for improved machine learning using private variational dropout. A set of parameters of a global machine learning model is updated based on a local data set, and the set of parameters is pruned based on pruning criteria. A noise-augmented set of gradients is computed for a subset of parameters remaining after the pruning, based in part on a noise value, and the noise-augmented set of gradients is transmitted to a global model server.
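    The client-side sequence in this abstract (local update, pruning, noise-augmented gradients for the remaining parameters, transmission to the server) can be sketched as below. The quadratic toy loss, magnitude-based pruning criterion, and Gaussian noise are illustrative assumptions, not the patent's specific choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def toy_grad(params, local_data):
        # Illustrative gradient of the quadratic loss ||params - local_data||^2 / 2.
        return params - local_data

    def local_round(global_params, local_data, prune_fraction=0.5,
                    noise_scale=0.1, lr=0.05):
        # 1) Update the set of parameters based on the local data set.
        params = global_params - lr * toy_grad(global_params, local_data)
        # 2) Prune the parameters based on a pruning criterion (magnitude here).
        threshold = np.quantile(np.abs(params), prune_fraction)
        keep_mask = np.abs(params) >= threshold
        # 3) Compute noise-augmented gradients for the subset remaining after pruning.
        grads = toy_grad(params, local_data) * keep_mask
        noisy_grads = grads + noise_scale * rng.normal(size=grads.shape) * keep_mask
        # 4) Transmit the noise-augmented gradients (and the mask) to the global server.
        return keep_mask, noisy_grads

    mask, update = local_round(rng.normal(size=64), rng.normal(size=64))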
