- Patent Title: DATA PARALLELISM IN DISTRIBUTED TRAINING OF ARTIFICIAL INTELLIGENCE MODELS
- Application Number: US16588402; Filing Date: 2019-09-30
- Publication Number: US20210019152A1; Publication Date: 2021-01-21
- Inventors: Bharadwaj Pudipeddi, Marc Tremblay, Sujeeth Subramanya Bharadwaj, Devangkumar Patel, Jinwen Xi, Maral Mesmakhosroshahi
- Applicant: Microsoft Technology Licensing, LLC
- Applicant Address: Redmond, WA, US
- Assignee: Microsoft Technology Licensing, LLC
- Current Assignee: Microsoft Technology Licensing, LLC
- Current Assignee Address: Redmond, WA, US
- Primary Classification: G06F9/38
- IPC Classifications: G06F9/38; H04L29/08; G06N3/08
Abstract:
Methods, systems, apparatuses, and computer program products are described herein that enable execution of a large AI model on a memory-constrained target device that is communicatively connected to a parameter server, which stores a master copy of the AI model. The AI model may be dissected into smaller portions (e.g., layers or sub-layers), and each portion may be executed as efficiently as possible on the target device. After execution of one portion of the AI model is finished, another portion of the AI model may be downloaded and executed at the target device. To improve efficiency, the input samples may be divided into microbatches, and a plurality of microbatches executing in sequential order may form a minibatch. The size of the microbatch group, or minibatch, can be adjusted to reduce the communication overhead. Multi-level parallel parameter reduction may be performed at the parameter server and the target device.
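The execution flow described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: all names (`make_parameter_server`, `run_minibatch`, the scalar "weights") are hypothetical stand-ins. It mimics the described scheme: fetch one model portion (layer) at a time from a parameter server, run every microbatch of the minibatch through that portion, then evict it and download the next portion.

```python
def make_parameter_server(layers):
    """Master copy of the model held off-device: layer name -> weights.

    Weights are plain scalars here purely for illustration.
    """
    return dict(layers)

def split_into_microbatches(samples, microbatch_size):
    """Divide the input samples into microbatches of the given size."""
    return [samples[i:i + microbatch_size]
            for i in range(0, len(samples), microbatch_size)]

def run_minibatch(server, layer_order, samples, microbatch_size):
    """Execute the model one portion at a time on a memory-limited device.

    Only one layer's weights are 'resident' at any moment; each layer is
    applied to every microbatch before the next layer is downloaded, so
    device memory needs stay bounded by a single portion plus activations.
    """
    microbatches = split_into_microbatches(samples, microbatch_size)
    activations = [list(mb) for mb in microbatches]
    for name in layer_order:
        weight = server[name]          # "download" this portion's weights
        for mb in activations:         # run all microbatches through it
            for i, x in enumerate(mb):
                mb[i] = x * weight     # stand-in for the layer's compute
        # this portion's weights may now be evicted before the next fetch
    return [x for mb in activations for x in mb]

# Two-layer model, four samples, microbatch size 2 (one minibatch).
server = make_parameter_server({"layer0": 2.0, "layer1": 3.0})
out = run_minibatch(server, ["layer0", "layer1"], [1.0, 2.0, 3.0, 4.0], 2)
```

Enlarging `microbatch_size` (or the number of microbatches per minibatch) amortizes each weight download over more samples, which is the communication-overhead trade-off the abstract refers to.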