Method and System for Dynamic Compositional General Continual Learning

    Publication number: US20230385644A1

    Publication date: 2023-11-30

    Application number: US17853682

    Application date: 2022-06-29

    CPC classification number: G06N3/082 G06N3/0481

    Abstract: A computer-implemented method for general continual learning combines rehearsal-based methods with dynamic modularity and compositionality. Concretely, the method aims at three objectives: a dynamic, sparse, and compositional response to inputs; competent application performance; and reduced catastrophic forgetting. The proposed method works without knowledge of task identities at test time, does not rely on task boundaries, and maintains bounded memory even when training on long sequences.
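The first objective above, a dynamic, sparse, and compositional response to inputs, can be illustrated with a minimal top-k module-gating sketch. The gating rule, function names, and score function here are illustrative assumptions, not the patented mechanism.

```python
import numpy as np

def sparse_modular_response(x, module_keys, k=2):
    """Score each module against the input and activate only the top-k,
    yielding a dynamic, sparse, compositional gate per input.
    (Illustrative sketch; not the claimed implementation.)"""
    scores = module_keys @ x          # one relevance score per module
    top = np.argsort(scores)[-k:]     # indices of the k best-matching modules
    mask = np.zeros(len(module_keys))
    mask[top] = 1.0                   # sparse binary gate over modules
    return mask
```

Under this sketch, each input activates only k modules, and different inputs compose different module subsets, which is one way to read "dynamic, sparse, and compositional."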

    METHOD AND SYSTEM FOR RELATIONAL GENERAL CONTINUAL LEARNING WITH MULTIPLE MEMORIES IN ARTIFICIAL NEURAL NETWORKS

    Publication number: US20240119304A1

    Publication date: 2024-04-11

    Application number: US18180719

    Application date: 2023-03-08

    CPC classification number: G06N3/096 G06N3/045

    Abstract: A computer-implemented method including the step of formulating a continual learning algorithm with both element similarity and relational similarity between the stable and plastic models in a dual-memory setup with rehearsal. While the method uses only two memories to simplify the analysis of the impact of relational similarity, it can be trivially extended to more than two memories. Specifically, the plastic model learns on the data stream as well as on memory samples, while the stable model maintains an exponential moving average of the plastic model, resulting in a more generalizable model. Simultaneously, to mitigate forgetting and to enable forward transfer, the stable model distills instance-wise and relational knowledge to the plastic model on memory samples. Instance-wise knowledge distillation preserves element similarities, while the relational similarity loss preserves relational similarities. The memory samples are kept in a small constant-sized memory buffer that is updated using reservoir sampling. The method of the current invention was tested under multiple evaluation protocols, demonstrating the efficacy of relational similarity for continual learning in a dual-memory setup with rehearsal.
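Three ingredients named in this abstract — the exponential-moving-average stable model, a relational similarity loss on paired features, and a constant-sized reservoir-sampled buffer — can be sketched in NumPy. The function names, the MSE form of the relational loss, and the decay rate are illustrative assumptions, not the claimed implementation.

```python
import random
import numpy as np

def ema_update(stable, plastic, alpha=0.999):
    """Stable-model weights track an exponential moving average of the plastic model."""
    return {k: alpha * stable[k] + (1.0 - alpha) * plastic[k] for k in stable}

def relational_similarity_loss(z_stable, z_plastic):
    """Match the pairwise (relational) similarity structure of two feature batches.
    Here: mean squared difference of cosine-similarity matrices (an assumed form)."""
    def pairwise_sim(z):
        zn = z / np.linalg.norm(z, axis=1, keepdims=True)
        return zn @ zn.T
    return float(np.mean((pairwise_sim(z_stable) - pairwise_sim(z_plastic)) ** 2))

def reservoir_add(buffer, sample, capacity, seen):
    """Reservoir sampling: the buffer stays a uniform sample of the stream
    at constant size. Returns the updated stream count."""
    if len(buffer) < capacity:
        buffer.append(sample)
    else:
        j = random.randrange(seen + 1)
        if j < capacity:
            buffer[j] = sample
    return seen + 1
```

In a training loop, the plastic model would take gradient steps on stream and buffer samples, `ema_update` would refresh the stable model after each step, and the instance-wise and relational losses would be computed on features the two models produce for the same memory samples.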
