Invention Grant
- Patent Title: Parameter server and method for sharing distributed deep learning parameter using the same
- Application No.: US15984262
- Application Date: 2018-05-18
- Publication No.: US10990561B2
- Publication Date: 2021-04-27
- Inventor: Shin-Young Ahn, Eun-Ji Lim, Yong-Seok Choi, Young-Choon Woo, Wan Choi
- Applicant: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Applicant Address: KR Daejeon
- Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Current Assignee: ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
- Current Assignee Address: KR Daejeon
- Priority: KR10-2017-0068445 20170601
- Main IPC: G06F15/16
- IPC: G06F15/16 ; G06F15/173 ; G06N3/08 ; H04L29/08 ; G06F9/50

Abstract:
Disclosed herein are a parameter server and a method for sharing distributed deep-learning parameters using the parameter server. The method includes: initializing a global weight parameter in response to an initialization request from a master process; performing an update by receiving a learned local gradient parameter from each worker process, which performs deep-learning training after updating its local weight parameter using the global weight parameter; accumulating the gradient parameters in response to a request from the master process; and performing an update by receiving, from the master process, the global weight parameter calculated using the accumulated gradient parameters of the one or more worker processes.
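The abstract describes a synchronous parameter-server cycle: the master initializes a global weight parameter, workers pull it into their local copies and train, the server accumulates the workers' gradients, and the master computes the new global weight from the accumulation. A minimal sketch of that cycle is below; all class and method names are illustrative assumptions, not terms from the patent, and the gradient-averaging SGD step is one common choice for the master's update rule.

```python
import numpy as np

class ParameterServer:
    """Illustrative sketch of the cycle in the abstract: initialize global
    weights, let workers pull them and push local gradients, accumulate the
    gradients, and apply the master's update. Names are hypothetical."""

    def __init__(self, dim, lr=0.1):
        self.lr = lr
        self.global_w = None
        self.accum_grad = np.zeros(dim)
        self.num_pushed = 0

    def initialize(self, dim):
        # Initialization request from the master process.
        self.global_w = np.zeros(dim)

    def pull_weights(self):
        # A worker updates its local weight parameter from the global one.
        return self.global_w.copy()

    def push_gradient(self, grad):
        # A worker sends its learned local gradient; the server accumulates it.
        self.accum_grad += grad
        self.num_pushed += 1

    def apply_update(self):
        # The master computes the new global weight from the accumulated
        # gradients (here: averaged, then one SGD step).
        self.global_w -= self.lr * (self.accum_grad / self.num_pushed)
        self.accum_grad[:] = 0.0
        self.num_pushed = 0

# One simulated synchronous step with two workers.
ps = ParameterServer(dim=3)
ps.initialize(dim=3)
for worker_grad in (np.array([1.0, 0.0, -1.0]), np.array([3.0, 2.0, 1.0])):
    local_w = ps.pull_weights()    # worker refreshes its local weights
    ps.push_gradient(worker_grad)  # worker pushes its learned gradient
ps.apply_update()
print(ps.global_w)  # -> [-0.2 -0.1  0. ]
```

The accumulation step decouples workers from one another: each worker only talks to the server, which matches the master/worker separation in the claims.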
Public/Granted literature
- US20180349313A1 PARAMETER SERVER AND METHOD FOR SHARING DISTRIBUTED DEEP LEARNING PARAMETER USING THE SAME, Public/Granted Day: 2018-12-06