MODEL ASSEMBLY WITH KNOWLEDGE DISTILLATION

    Publication No.: US20250086493A1

    Publication Date: 2025-03-13

    Application No.: US18243259

    Filing Date: 2023-09-07

    Abstract: In one implementation, a device receives, via a user interface, one or more constraint parameters for each of a plurality of machine learning models that perform different analytics tasks. The device computes, based on the one or more constraint parameters, a set of weights for the plurality of machine learning models. The device generates a unified model by performing knowledge distillation on the plurality of machine learning models using the set of weights. The device deploys the unified model for execution by a particular node in a network.
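The abstract does not specify how the constraint parameters are mapped to weights or how the distillation objective is formed. The following is a minimal sketch, assuming a PyTorch setting, of one common way to perform weighted multi-teacher knowledge distillation consistent with the abstract's description; the constraint_weights function, the "priority" field, the shared output space across teachers, and all other names are illustrative assumptions rather than the patented method.

    # Illustrative sketch only: weighted multi-teacher knowledge distillation.
    # Assumes the teachers' outputs have been aligned to a shared logit space.
    import torch
    import torch.nn.functional as F

    def constraint_weights(constraints: list[dict[str, float]]) -> torch.Tensor:
        """Map per-model constraint parameters to normalized distillation weights.
        Here we simply softmax a hypothetical 'priority' field; the actual
        computation is not specified in the abstract."""
        scores = torch.tensor([c.get("priority", 1.0) for c in constraints])
        return torch.softmax(scores, dim=0)

    def distillation_loss(student_logits: torch.Tensor,
                          teacher_logits: list[torch.Tensor],
                          weights: torch.Tensor,
                          temperature: float = 2.0) -> torch.Tensor:
        """Weighted sum of KL divergences between the student and each teacher."""
        loss = torch.zeros((), device=student_logits.device)
        log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
        for w, t_logits in zip(weights, teacher_logits):
            p_teacher = F.softmax(t_logits / temperature, dim=-1)
            loss = loss + w * F.kl_div(log_p_student, p_teacher,
                                       reduction="batchmean") * temperature ** 2
        return loss

In such a setup, the weights produced from the constraint parameters determine how strongly each teacher model influences the unified student during training; the trained student is then the single model deployed to the target network node.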
