METHOD FOR ESTIMATING PERCEPTUAL SEMANTIC CONTENT BY ANALYSIS OF BRAIN ACTIVITY

    Publication No.: US20180092567A1

    Publication Date: 2018-04-05

    Application No.: US15564071

    Filing Date: 2016-04-05

    IPC Classes: A61B5/0484 G06F3/01 A61B5/16

    Abstract: A perceptual semantic content estimation method includes: (A) inputting, to data processing means, brain activity induced in a subject by a training stimulation and detected as an output of a brain activity detection means, together with an annotation of a perceptual content; (B) associating a semantic space representation of the training stimulation with the output of the brain activity detection means in a stored semantic space, and storing the association in a training result information storage means; (C) inputting, to the data processing means, an output produced when the brain activity detection means detects brain activity induced by a novel stimulation, and obtaining, on the basis of the association, a probability distribution in the semantic space that represents perceptual semantic contents for that output; and (D) estimating a highly probable perceptual semantic content on the basis of the probability distribution.
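Steps (A)-(D) can be sketched in code. This is a minimal illustration, not the patented implementation: the ridge-regression association, the Gaussian-style likelihood over candidate semantic vectors, and all dimensions and data are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# (A) training: detected brain activity X paired with semantic-space
# representations S of the annotated training stimulations (synthetic here).
X_train = rng.normal(size=(100, 50))           # trials x voxels
W_true = rng.normal(size=(50, 8))              # hidden ground-truth mapping
S_train = X_train @ W_true                     # trials x semantic dims

# (B) learn and store the association between brain activity and the
# semantic space (ridge regression is one common choice, assumed here).
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(50),
                    X_train.T @ S_train)

# (C) map novel-stimulation brain activity into the semantic space and form
# a probability distribution over candidate semantic contents (a Gaussian
# likelihood around the predicted point is an assumption of this sketch).
x_novel = rng.normal(size=50)
s_pred = x_novel @ W
candidates = rng.normal(size=(20, 8))          # candidate semantic contents
d2 = ((candidates - s_pred) ** 2).sum(axis=1)  # squared distances
logp = -0.5 * d2
p = np.exp(logp - logp.max())                  # stabilised before normalising
p /= p.sum()                                   # probability distribution

# (D) estimate the most probable perceptual semantic content.
best = int(np.argmax(p))
```

The candidate set stands in for a discretised semantic space; in practice the distribution could equally be evaluated over a continuous space or a vocabulary of annotated concepts.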

    DEEP NEURAL NETWORK LEARNING METHOD AND APPARATUS, AND CATEGORY-INDEPENDENT SUB-NETWORK LEARNING APPARATUS
    Invention Application (In Force)

    Publication No.: US20160110642A1

    Publication Date: 2016-04-21

    Application No.: US14787903

    Filing Date: 2014-05-15

    IPC Classes: G06N3/08 G06N3/04

    Abstract: Provided is a DNN learning method that can reduce DNN learning time using data belonging to a plurality of categories. The method includes the steps of training a language-independent sub-network 120 and language-dependent sub-networks 122 and 124 with training data of Japanese and English. This step includes: a first step of training, with Japanese training data, a DNN obtained by connecting neurons in an output layer of the sub-network 120 with neurons in an input layer of the sub-network 122; a step of forming a DNN by connecting the sub-network 124, in place of the sub-network 122, to the sub-network 120, and training it with English data; repeating these steps alternately until all training data is exhausted; and, after completion, separating the sub-network 120 from the other sub-networks and storing it as a category-independent sub-network in a storage medium.
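The alternating scheme above can be sketched with a toy model: a shared (category-independent) sub-network whose output feeds one of two language-dependent heads, swapped in turn during training. The layer sizes, synthetic data, squared-error objective, and gradient-descent updates are all assumptions of this sketch, not the patent's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Shared sub-network (corresponds to sub-network 120) and one head per
# language (corresponding to sub-networks 122 and 124), as linear layers.
W_shared = rng.normal(scale=0.1, size=(10, 6))
heads = {"ja": rng.normal(scale=0.1, size=(6, 3)),
         "en": rng.normal(scale=0.1, size=(6, 3))}

# Synthetic per-language training data (inputs X, targets Y).
data = {lang: (rng.normal(size=(30, 10)), rng.normal(size=(30, 3)))
        for lang in ("ja", "en")}

lr = 0.01
losses = []
for epoch in range(50):
    total = 0.0
    # Alternate: connect one language head to the shared sub-network,
    # train on that language's data, then swap in the other head.
    for lang in ("ja", "en"):
        X, Y = data[lang]
        H = X @ W_shared                       # shared hidden representation
        err = H @ heads[lang] - Y              # prediction error
        total += float((err ** 2).mean())
        # Gradient steps on the head and the shared weights.
        heads[lang] -= lr * H.T @ err / len(X)
        W_shared -= lr * X.T @ (err @ heads[lang].T) / len(X)
    losses.append(total)

# After training, W_shared alone is the category-independent sub-network
# that would be separated and stored.
```

Because both languages update `W_shared` while each head sees only its own data, the shared weights are pushed toward features useful across categories, which is the point of separating and reusing sub-network 120.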
