
CL-LSG: Continual Learning via Learnable Sparse Growth


Basic Information

DOI: --
Publication year: 2022
Journal: --
Impact factor: --
Corresponding author: Deliang Fan
CAS journal tier: --
Document type: --
Authors: Li Yang; Sen Lin; Junshan Zhang; Deliang Fan
Research area: --
MeSH terms: --
Keywords: --
Source: PubMed detail page

Abstract

Continual learning (CL) aims to learn new tasks sequentially and to transfer knowledge from old tasks to new ones without forgetting the old ones; such forgetting is well known as catastrophic forgetting. While recent structure-based learning methods can alleviate the forgetting problem, they require a complex learning process that gradually grows and prunes a full-size network for each task, which is inefficient. To address this problem and enable efficient network expansion for new tasks, we develop, to the best of our knowledge, the first learnable sparse growth (LSG) method, which explicitly optimizes model growth so that only important and necessary channels are selected for growing. Building on LSG, we then propose CL-LSG, a novel end-to-end CL framework that grows the model for each new task dynamically and sparsely. Unlike all previous structure-based CL methods, which start from a full-size network and then prune it (i.e., two steps), our framework starts from a compact seed network of much smaller size and grows it to the necessary model size for each task (i.e., one step), eliminating the additional pruning step required by previous structure-based growing methods.
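The abstract does not spell out how growth is made learnable, so the following is only a minimal sketch of one plausible reading: candidate output channels gated by per-channel learnable masks, trained with a straight-through estimator plus a sparsity penalty so that only useful channels survive. GrowableConv, its gate parameterization, and the penalty weight are hypothetical illustrations, not the authors' implementation.

# Minimal sketch of learnable sparse growth (assumptions noted above),
# NOT the CL-LSG implementation from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GrowableConv(nn.Module):
    """Conv layer with base channels plus gated candidate growth channels."""

    def __init__(self, in_ch, base_out, candidate_out, k=3):
        super().__init__()
        self.base = nn.Conv2d(in_ch, base_out, k, padding=k // 2)
        self.candidate = nn.Conv2d(in_ch, candidate_out, k, padding=k // 2)
        # One learnable logit per candidate channel; sigmoid > 0.5 keeps it.
        self.gate_logits = nn.Parameter(torch.zeros(candidate_out))

    def channel_gates(self):
        probs = torch.sigmoid(self.gate_logits)
        hard = (probs > 0.5).float()
        # Straight-through: hard 0/1 gates forward, sigmoid gradient backward.
        return hard + probs - probs.detach()

    def forward(self, x):
        g = self.channel_gates().view(1, -1, 1, 1)
        return torch.cat([self.base(x), self.candidate(x) * g], dim=1)

    def sparsity_loss(self):
        # Penalize the expected number of grown channels.
        return torch.sigmoid(self.gate_logits).sum()

# Toy usage: growth channels are selected jointly with task training,
# so no separate pruning pass is needed afterwards.
layer = GrowableConv(in_ch=3, base_out=8, candidate_out=8)
head = nn.Linear(16, 10)
opt = torch.optim.Adam(list(layer.parameters()) + list(head.parameters()), lr=1e-2)

x = torch.randn(4, 3, 32, 32)
y = torch.randint(0, 10, (4,))
for _ in range(5):
    feats = layer(x).mean(dim=(2, 3))  # global average pooling
    loss = F.cross_entropy(head(feats), y) + 1e-2 * layer.sparsity_loss()
    opt.zero_grad()
    loss.backward()
    opt.step()
print("kept channels:", int((torch.sigmoid(layer.gate_logits) > 0.5).sum()))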
References (2)
Cited by (0)
KSM: Fast Multiple Task Adaption via Kernel-wise Soft Mask Learning
DOI: 10.1109/cvpr46437.2021.01363
Publication year: 2021
Venue: CVPR 2021
Impact factor: --
Authors: Yang, Li; He, Zhezhi; Zhang, Junshan; Fan, Deliang
Corresponding author: Fan, Deliang


Deliang Fan
Address: --
Affiliation: --
Email: --