METHODS AND SYSTEMS USING IMPROVED TRAINING AND LEARNING FOR DEEP NEURAL NETWORKS

    Publication Number: US20200026988A1

    Publication Date: 2020-01-23

    Application Number: US16475075

    Filing Date: 2017-04-07

    Abstract: Methods and systems are disclosed using improved training and learning for deep neural networks. In one example, a deep neural network includes a plurality of layers, and each layer has a plurality of nodes. For each layer L in the plurality of layers, the nodes of layer L are randomly connected to nodes in layer L+1. For each layer L+1 in the plurality of layers, the nodes of layer L+1 are connected to nodes in the subsequent layer in a one-to-one manner. Parameters related to the nodes of each layer L are fixed, while parameters related to the nodes of each layer L+1 are updated, where L is an integer starting at 1. In another example, a deep neural network includes an input layer, an output layer, and a plurality of hidden layers. Inputs for the input layer and labels for the output layer are determined for a first sample. The similarity between input-label pairs of a second sample and those of the first sample is estimated using Gaussian process regression.
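
    A minimal PyTorch sketch of the two ideas in this abstract follows: a block whose random layer-L-to-L+1 connections are frozen while its one-to-one L+1 parameters are trained, plus an RBF kernel of the kind used in Gaussian process regression to score sample similarity. All names, sizes, and the connection probability are illustrative assumptions, not the patent's implementation.

    import torch
    import torch.nn as nn

    class RandomFixedBlock(nn.Module):
        """One stage of the described scheme: random, fixed connections from
        layer L into layer L+1, then a trainable one-to-one mapping out of L+1."""
        def __init__(self, in_features, out_features, connect_prob=0.5):
            super().__init__()
            # Random sparse connections from layer L to layer L+1; registered as
            # a buffer (not a Parameter), so they receive no gradient updates.
            weight = torch.randn(out_features, in_features)
            mask = (torch.rand(out_features, in_features) < connect_prob).float()
            self.register_buffer("fixed_weight", weight * mask)
            # One-to-one (elementwise) connections out of layer L+1; these are updated.
            self.scale = nn.Parameter(torch.ones(out_features))
            self.bias = nn.Parameter(torch.zeros(out_features))

        def forward(self, x):
            h = torch.relu(x @ self.fixed_weight.t())  # fixed random projection
            return h * self.scale + self.bias          # trainable one-to-one mapping

    def rbf_similarity(x1, x2, length_scale=1.0):
        """RBF kernel, as used in Gaussian process regression, scoring how
        similar a second sample is to a first (applies to inputs or labels)."""
        return torch.exp(-((x1 - x2) ** 2).sum() / (2 * length_scale ** 2))

    # Stacking blocks yields the alternating fixed/updated structure for L = 1, 2, ...
    net = nn.Sequential(RandomFixedBlock(784, 256), RandomFixedBlock(256, 128))

    Note that only scale and bias appear in net.parameters(), so an optimizer built over them updates exactly the one-to-one connections while the random projections stay fixed.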

    METHODS AND SYSTEMS FOR BUDGETED AND SIMPLIFIED TRAINING OF DEEP NEURAL NETWORKS

    Publication Number: US20200026965A1

    Publication Date: 2020-01-23

    Application Number: US16475078

    Filing Date: 2017-04-07

    Abstract: Methods and systems for budgeted and simplified training of deep neural networks (DNNs) are disclosed. In one example, a trainer trains a DNN using a plurality of training sub-images derived from a down-sampled training image, and a tester tests the trained DNN using a plurality of testing sub-images derived from a down-sampled testing image. In another example, in a recurrent deep Q-network (RDQN) having a local attention mechanism located between a convolutional neural network (CNN) and a long short-term memory (LSTM), a plurality of feature maps is generated by the CNN from an input image. Hard attention is applied by the local attention mechanism to the generated feature maps by selecting a subset of them. Soft attention is then applied to the selected subset by assigning weights to the selected feature maps, yielding weighted feature maps. The weighted feature maps are stored in the LSTM, and a Q value is calculated for different actions based on the weighted feature maps stored in the LSTM.
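
    A sketch of both examples follows, assuming PyTorch: a helper that tiles a down-sampled image into fixed-size sub-images, and an RDQN-style module that applies hard attention (top-k selection over CNN feature maps) followed by soft attention (softmax weighting) before an LSTM and a Q-value head. The layer shapes, the top-k selection rule, and the per-map global-average summary are illustrative assumptions; the abstract does not specify them.

    import torch
    import torch.nn as nn

    def sub_images(image, size):
        """Tile a down-sampled image (C, H, W) into non-overlapping sub-images."""
        C, H, W = image.shape
        return [image[:, i:i + size, j:j + size]
                for i in range(0, H - size + 1, size)
                for j in range(0, W - size + 1, size)]

    class LocalAttentionRDQN(nn.Module):
        """CNN -> hard attention (select k feature maps) -> soft attention
        (weight the selection) -> LSTM -> Q values, per the second example."""
        def __init__(self, num_maps=32, k=8, hidden=128, num_actions=4):
            super().__init__()
            self.cnn = nn.Conv2d(3, num_maps, kernel_size=8, stride=4)
            self.k = k
            self.score = nn.Linear(num_maps, num_maps)  # one score per feature map
            self.lstm = nn.LSTM(input_size=k, hidden_size=hidden, batch_first=True)
            self.q_head = nn.Linear(hidden, num_actions)

        def forward(self, frames):                        # frames: (B, T, 3, H, W)
            qs, state = [], None
            for t in range(frames.shape[1]):
                maps = torch.relu(self.cnn(frames[:, t]))    # (B, M, H', W')
                summary = maps.mean(dim=(2, 3))              # global average per map
                scores = self.score(summary)                 # (B, M)
                # Hard attention: keep only the k highest-scoring feature maps.
                top_scores, idx = scores.topk(self.k, dim=1)
                selected = summary.gather(1, idx)            # (B, k)
                # Soft attention: weight the selection, then store it in the LSTM.
                weighted = selected * torch.softmax(top_scores, dim=1)
                out, state = self.lstm(weighted.unsqueeze(1), state)
                qs.append(self.q_head(out[:, -1]))           # Q value per action
            return torch.stack(qs, dim=1)                    # (B, T, num_actions)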

    Methods and apparatus for multi-task recognition using neural networks

    Publication Number: US11106896B2

    Publication Date: 2021-08-31

    Application Number: US16958542

    Filing Date: 2018-03-26

    Abstract: Methods and apparatus for multi-task recognition using neural networks are disclosed. An example apparatus includes a filter engine to generate a facial identifier feature map based on image data, the facial identifier feature map to identify a face within the image data. The example apparatus also includes a sibling semantic engine to process the facial identifier feature map to generate an attribute feature map associated with a facial attribute. The example apparatus also includes a task loss engine to calculate a probability factor for the attribute, the probability factor identifying the facial attribute. The example apparatus also includes a report generator to generate a report indicative of a classification of the facial attribute.
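
    A minimal PyTorch sketch of the described pipeline follows: a shared "filter engine" producing a facial identifier feature map, per-attribute "sibling" branches producing attribute feature maps, sigmoid outputs standing in for the probability factors, and a report generator classifying each attribute. The layer shapes and the two example attributes are illustrative assumptions, not the patent's implementation.

    import torch
    import torch.nn as nn

    class MultiTaskFaceNet(nn.Module):
        def __init__(self, attributes=("smiling", "wearing_glasses")):
            super().__init__()
            # Filter engine: shared convolutional stack over the input image,
            # producing the facial identifier feature map.
            self.filter_engine = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(8),
            )
            # Sibling semantic engine: one small branch per facial attribute,
            # each turning the shared map into an attribute feature map.
            self.siblings = nn.ModuleDict({
                name: nn.Sequential(nn.Conv2d(64, 16, 1), nn.ReLU(),
                                    nn.Flatten(), nn.Linear(16 * 8 * 8, 1))
                for name in attributes
            })

        def forward(self, image):                  # image: (B, 3, H, W)
            fid_map = self.filter_engine(image)    # facial identifier feature map
            # Probability factor per attribute (task-head output).
            return {name: torch.sigmoid(branch(fid_map)).squeeze(1)
                    for name, branch in self.siblings.items()}

    def generate_report(probs, threshold=0.5):
        """Report generator: classify each attribute for a single image (B = 1)."""
        return {name: ("present" if p.item() >= threshold else "absent")
                for name, p in probs.items()}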

    METHODS AND APPARATUS FOR MULTI-TASK RECOGNITION USING NEURAL NETWORKS

    Publication Number: US20210004572A1

    Publication Date: 2021-01-07

    Application Number: US16958542

    Filing Date: 2018-03-26

    Abstract: Methods and apparatus for multi-task recognition using neural networks are disclosed. An example apparatus includes a filter engine to generate a facial identifier feature map based on image data, the facial identifier feature map to identify a face within the image data. The example apparatus also includes a sibling semantic engine to process the facial identifier feature map to generate an attribute feature map associated with a facial attribute. The example apparatus also includes a task loss engine to calculate a probability factor for the attribute, the probability factor identifying the facial attribute. The example apparatus also includes a report generator to generate a report indicative of a classification of the facial attribute.
