
Fast incremental learning method for one-class support vector machine


【Author】 WANG Hong-bo, ZHAO Guang-zhou, QI Dong-lian, LU Da (College of Electrical Engineering, Zhejiang University, Hangzhou 310027, China)

【Institution】 College of Electrical Engineering, Zhejiang University

【Abstract】 A fast incremental learning method for the one-class support vector machine (OCSVM) was proposed. A delta function was added to the initial OCSVM classifier to form a new decision function, which realizes the incremental learning step. By analyzing the geometric properties of the delta function, an objective function similar in form to that of OCSVM was constructed to solve for the parameters of the delta function. The optimization problem can be further converted into a standard quadratic programming (QP) problem, but the Karush-Kuhn-Tucker (KKT) conditions change substantially in the process. A modified sequential minimal optimization (SMO) method was therefore proposed for the QP problem according to the new KKT conditions. The whole learning process operates directly on the initial classifier and trains only the newly added samples, avoiding repeated training on the original samples, which saves a large amount of learning time and storage space. Experimental results show that the proposed fast incremental learning method outperforms other incremental learning methods in both time and accuracy.
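
For readers unfamiliar with OCSVM, the following is a minimal LaTeX sketch of the standard one-class SVM formulation (Schölkopf et al.) that the method starts from, together with the delta-augmented decision function described in the abstract. The concrete form of the delta function δ(x) and its training objective are defined in the paper itself; the last display below only shows the assumed shape of the augmentation.

% Standard OCSVM: separate the mapped data from the origin with maximum margin.
\begin{align}
  \min_{w,\,\xi,\,\rho}\quad & \frac{1}{2}\lVert w\rVert^{2}
      + \frac{1}{\nu\ell}\sum_{i=1}^{\ell}\xi_{i} - \rho \\
  \text{s.t.}\quad & \langle w,\Phi(x_{i})\rangle \ge \rho - \xi_{i},
      \qquad \xi_{i} \ge 0,\quad i=1,\dots,\ell
\end{align}

% Decision function of the initial (batch-trained) classifier:
\[
  f_{0}(x) = \operatorname{sgn}\bigl(\langle w,\Phi(x)\rangle - \rho\bigr)
\]

% Incremental step as described in the abstract (assumed shape): a delta term
% \delta(x), learned from the newly added samples only, corrects the initial
% decision value before the sign is taken.
\[
  f_{\mathrm{new}}(x) = \operatorname{sgn}\bigl(\langle w,\Phi(x)\rangle - \rho + \delta(x)\bigr)
\]

Under this shape, only δ(x) has to be estimated when new samples arrive; according to the abstract, the resulting optimization reduces to a standard QP whose KKT conditions differ from those of the batch problem, which is why a modified SMO solver is introduced.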

【Fund】 Supported by the National Natural Science Foundation of China (60872070), the Zhejiang Provincial Science and Technology Program (2008C21141, 2010C33044), and the Zhejiang Provincial Major Science and Technology Project (2010C11069)
  • 【Source】 Journal of Zhejiang University (Engineering Science), 2012, No. 7
  • 【CLC Number】 TP181
  • 【Cited by】 14
  • 【Downloads】 508