-
Publication No.: US20250028887A1
Publication Date: 2025-01-23
Application No.: US18768459
Filing Date: 2024-07-10
Applicant: MEDIATEK INC.
Inventor: CHIEH-WEN CHEN, Yao-Sheng Wang, Yu-Cheng LO, WeiLing YU
IPC: G06F30/33, G06F119/06
Abstract: A computing device collects a plurality of data samples, each representing the signal activities of a plurality of signals of a chip. The computing device selects a subset of the signals as proxies, where the proxies correlate with the actual power consumption of the chip according to a criterion. The computing device trains a power model using the signal activities of the plurality of signals as inputs and the actual power consumption as the output, and then fine-tunes the coefficients of the proxies in the power model to compensate for the estimation error between the power consumption estimated by the power model and the actual power consumption.
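As a rough illustration of this flow (not the patented implementation), the sketch below assumes the signal activities are toggle counts stored in a NumPy matrix, uses absolute Pearson correlation as the proxy-selection criterion, fits a linear power model by least squares, and fine-tunes only the proxy coefficients by gradient descent; all of these specifics, and every function name, are assumptions made for the example.

```python
import numpy as np

def select_proxies(activities, power, num_proxies=8):
    """Pick the signals whose activity correlates most strongly with the
    measured power (assumed criterion: absolute Pearson correlation)."""
    corr = np.array([
        abs(np.corrcoef(activities[:, i], power)[0, 1])
        for i in range(activities.shape[1])
    ])
    return np.argsort(corr)[::-1][:num_proxies]

def fit_power_model(activities, power):
    """Least-squares fit of a linear power model over all signal activities,
    plus a constant term for activity-independent (leakage-like) power."""
    X = np.hstack([activities, np.ones((activities.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(X, power, rcond=None)
    return coeffs

def fine_tune_proxies(coeffs, activities, power, proxy_idx, lr=1e-4, epochs=200):
    """Gradient-descent fine-tuning of only the proxy coefficients, so the
    estimated power tracks the measured power more closely."""
    X = np.hstack([activities, np.ones((activities.shape[0], 1))])
    for _ in range(epochs):
        err = X @ coeffs - power                    # estimation error per sample
        grad = X.T @ err / len(power)               # mean-squared-error gradient
        coeffs[proxy_idx] -= lr * grad[proxy_idx]   # update proxy terms only
    return coeffs

# Toy data: 1000 samples of 0/1 toggle activity for 64 signals, plus measured power.
rng = np.random.default_rng(0)
acts = rng.integers(0, 2, size=(1000, 64)).astype(float)
power = acts @ rng.random(64) + 0.1 * rng.standard_normal(1000)

proxies = select_proxies(acts, power)
w = fit_power_model(acts, power)
w = fine_tune_proxies(w, acts, power, proxies)
```

The point of the sketch is only that proxy selection, model fitting, and coefficient fine-tuning are separable stages; the actual criterion, model form, and fine-tuning procedure would be whatever the claims specify.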
-
Publication No.: US20240160934A1
Publication Date: 2024-05-16
Application No.: US18450571
Filing Date: 2023-08-16
Applicant: MEDIATEK INC.
Inventor: Hao CHEN, Po-Hsiang YU, Yu-Cheng LO, Cheng-Yu YANG, Peng-Wen CHEN
IPC: G06N3/082
CPC classification number: G06N3/082
Abstract: A method for removing branches from trained deep learning models is provided. The method includes steps (i)-(v). In step (i), a trained model is obtained. The trained model has a branch structure involving one or more original convolutional layers and a shortcut connection. In step (ii), the shortcut connection is removed from the branch structure. In step (iii), a reparameterization model is built by linearly expanding each of the original convolutional layers into a reparameterization block in the reparameterization model. In step (iv), parameters of the reparameterization blocks are optimized by training the reparameterization model. In step (v), each of the optimized reparameterization blocks is transformed into a reparameterized convolutional layer to form a branchless structure that replaces the branch structure in the trained model.
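For intuition only, the sketch below illustrates steps (iii) and (v) under assumed specifics: the reparameterization block is a parallel 3x3/1x1 convolution pair (one possible linear expansion), and folding it back yields a single branchless 3x3 convolution. The shortcut removal and training of steps (i), (ii), and (iv) are omitted, and the class and method names are illustrative, not taken from the patent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReparamBlock(nn.Module):
    """Linear expansion of one 3x3 convolution into two parallel branches
    (3x3 and 1x1) whose outputs are summed.  Because both branches are
    linear, they can later be folded into a single 3x3 convolution."""
    def __init__(self, channels):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv1 = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        return self.conv3(x) + self.conv1(x)

    def to_single_conv(self):
        """Fold the two branches into one branchless 3x3 convolution."""
        fused = nn.Conv2d(self.conv3.in_channels, self.conv3.out_channels,
                          3, padding=1)
        # Pad the 1x1 kernel to 3x3 (centered) and add it to the 3x3 kernel.
        fused.weight.data = self.conv3.weight.data + F.pad(self.conv1.weight.data,
                                                           [1, 1, 1, 1])
        fused.bias.data = self.conv3.bias.data + self.conv1.bias.data
        return fused

# Sanity check: after (hypothetical) training, the fused convolution
# reproduces the block's output exactly.
block = ReparamBlock(8)
x = torch.randn(1, 8, 16, 16)
fused = block.to_single_conv()
assert torch.allclose(block(x), fused(x), atol=1e-5)
```

The closing assertion is the key property: because the expansion is linear, folding it back in step (v) is lossless, so the branchless model keeps the accuracy gained by training the reparameterization blocks.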
-