- Patent title: Activation layers for deep learning networks
- Application number: US14954646
- Filing date: 2015-11-30
- Publication number: US09892344B1
- Publication date: 2018-02-13
- Inventors: Son Dinh Tran, Raghavan Manmatha
- Applicant: A9.com, Inc.
- Applicant address: Palo Alto, CA, US
- Assignee: A9.COM, INC.
- Current assignee: A9.COM, INC.
- Current assignee address: Palo Alto, CA, US
- Agent: Hogan Lovells US LLP
- Primary classification: G06K9/66
- IPC classifications: G06K9/66; G06N3/08; G06K9/62
Abstract:
Tasks such as object classification from image data can take advantage of a deep learning process using convolutional neural networks. These networks can include a convolutional layer followed by an activation layer, or activation unit, among other potential layers. Improved accuracy can be obtained by using a generalized linear unit (GLU) as an activation unit in such a network, where a GLU is linear for both positive and negative inputs, and is defined by a positive slope, a negative slope, and a bias. These parameters can be learned for each channel or a block of channels, and stacking those types of activation units can further improve accuracy.
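The sketch below illustrates one way such an activation could be expressed, assuming the GLU takes the form f(x) = p·max(x, 0) + n·min(x, 0) + b with a learnable positive slope p, negative slope n, and bias b per channel, as described in the abstract. It is a minimal PyTorch-style example; the class and parameter names are illustrative and not the patent's reference implementation.

```python
import torch
import torch.nn as nn


class GLUActivation(nn.Module):
    """Generalized linear unit: linear on both sides of zero.

    Illustrative sketch of f(x) = pos_slope * max(x, 0) + neg_slope * min(x, 0) + bias,
    with one learnable (pos_slope, neg_slope, bias) triple per channel.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        # Initialize near a ReLU: slope 1 for positive inputs, 0 for negative.
        self.pos_slope = nn.Parameter(torch.ones(num_channels))
        self.neg_slope = nn.Parameter(torch.zeros(num_channels))
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x is assumed to be (batch, channels, height, width); reshape the
        # per-channel parameters so they broadcast over spatial dimensions.
        shape = (1, -1, 1, 1)
        p = self.pos_slope.view(shape)
        n = self.neg_slope.view(shape)
        b = self.bias.view(shape)
        return p * x.clamp(min=0) + n * x.clamp(max=0) + b


# Usage: a convolutional layer followed by the activation unit,
# mirroring the conv-then-activation layering mentioned in the abstract.
layer = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3, padding=1), GLUActivation(16))
out = layer(torch.randn(2, 3, 32, 32))  # -> shape (2, 16, 32, 32)
```

Sharing one parameter triple across a block of channels, or stacking several such units, would follow the same pattern with the parameter tensors sized per block rather than per channel.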