自动语音识别模型压缩算法综述

Compression Algorithms for Automatic Speech Recognition Models: A Survey

【Author】 SHI Xiaohu (时小虎); YUAN Yuping (袁宇平); LÜ Guilin (吕贵林); CHANG Zhiyong (常志勇); ZOU Yuanjun (邹元君)

【Corresponding Author】 ZOU Yuanjun (邹元君)

【Affiliations】 College of Computer Science and Technology, Jilin University; Management Center of Big Data and Network, Jilin University; Intelligent Network Development Institute, R&D Institute of China FAW Group Co., Ltd.; College of Biological and Agricultural Engineering, Jilin University; School of Medical Information, Changchun University of Chinese Medicine

【Abstract】 With the development of deep learning, the number of parameters in automatic speech recognition (ASR) models has grown rapidly, steadily increasing their computational cost, storage requirements, and power consumption and making them difficult to deploy on resource-constrained devices. Compressing deep-learning-based ASR models so as to reduce model size while preserving as much of the original performance as possible is therefore of significant value. This survey comprehensively reviews the main work in the field in recent years, organizing it into knowledge distillation, model quantization, low-rank decomposition, network pruning, parameter sharing, and combined methods, and systematically examines each category to provide candidate solutions for deploying models on resource-constrained devices.
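To make two of the method families named in the abstract concrete, below is a minimal generic sketch in PyTorch. It illustrates the general techniques only, not code from the surveyed paper; the names (distillation_loss, teacher_logits, the temperature T, the mixing weight alpha, the toy model) are hypothetical.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
        # Knowledge distillation: match the student's tempered output
        # distribution to the teacher's, blended with the usual hard
        # cross-entropy against the ground-truth labels.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)  # rescale so gradient magnitudes stay comparable across temperatures
        hard = F.cross_entropy(student_logits, targets)
        return alpha * soft + (1 - alpha) * hard

    # Model quantization (post-training, dynamic): store the weights of
    # Linear layers in int8 and dequantize them on the fly at inference time.
    model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 128))
    quantized = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

In practice a smaller student network is first trained against the teacher's outputs and quantization is applied afterwards; composing techniques in this way is the kind of combination the abstract's "combined methods" category refers to.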

【Funding】 National Natural Science Foundation of China (Grant No. 62272192); Science and Technology Development Program of Jilin Province (Grant No. 20210201080GX); Jilin Province Development and Reform Commission Project (Grant No. 2021C044-1); Scientific Research Project of the Education Department of Jilin Province (Grant No. JJKH20200871KJ)
  • 【Source】 吉林大学学报(理学版) (Journal of Jilin University (Science Edition)), 2024, No. 01
  • 【CLC Classification Codes】 TN912.34; TP18
  • 【Downloads】 165