Coordinated learning using distributed average consensus
Abstract:
A distributed computing device generates a gradient descent matrix based on data received by the distributed computing device and a model stored on the distributed computing device. The distributed computing device calculates a sampled gradient descent matrix based on the gradient descent matrix and a random matrix. The distributed computing device iteratively executes a process to determine a consensus gradient descent matrix in conjunction with a plurality of additional distributed computing devices connected by a network to the distributed computing device. The consensus gradient descent matrix is based on the sampled gradient descent matrix and a plurality of additional sampled gradient descent matrices calculated by the plurality of additional distributed computing devices. The distributed computing device updates the model stored on the distributed computing device based on the consensus gradient descent matrix.
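The following is a minimal sketch of the workflow the abstract describes, not the patented implementation. It assumes a linear model with a mean-squared-error gradient, a random projection for the sampled gradient, and a synchronous gossip step with Metropolis weights as the distributed-average-consensus process; all function and variable names (local_gradient, sample_gradient, consensus_step) are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def local_gradient(model_weights, X, y):
    """Gradient of mean squared error for a linear model (illustrative stand-in
    for the gradient descent matrix each device computes from its own data)."""
    predictions = X @ model_weights
    return X.T @ (predictions - y) / len(y)

def sample_gradient(gradient, random_matrix):
    """Project the local gradient with a random matrix, producing a smaller
    'sampled' gradient that is cheaper to exchange over the network."""
    return random_matrix @ gradient

def consensus_step(own_value, neighbor_values, own_degree, neighbor_degrees):
    """One synchronous distributed-average-consensus update using
    Metropolis-Hastings mixing weights; repeated across the network, every
    device's value converges toward the network-wide average."""
    updated = own_value.astype(float).copy()
    for value, degree in zip(neighbor_values, neighbor_degrees):
        weight = 1.0 / (1 + max(own_degree, degree))
        updated += weight * (value - own_value)
    return updated

# Example of one device's round (assumed shapes and learning rate):
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 8)), rng.normal(size=32)
weights = np.zeros(8)
R = rng.normal(size=(4, 8)) / np.sqrt(4)          # shared random matrix
g_sampled = sample_gradient(local_gradient(weights, X, y), R)
# neighbor_values would come from peer devices over the network;
# after enough consensus_step iterations the result approximates the
# average sampled gradient, which the device uses to update its model.
```

In this sketch, each device only ever exchanges its (sampled) gradient with directly connected neighbors; iterating the consensus step drives every device toward the same averaged gradient without any central aggregator, after which each device applies that consensus gradient to its locally stored model.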