
Least Squares Hidden Space Support Vector Machines


【Author】 WANG Ling 1), BO Lie-Feng 1), LIU Fang 2), JIAO Li-Cheng 1); 1) Institute of Intelligent Information Processing, Xidian University, Xi'an 710071; 2) School of Computer, Xidian University, Xi'an 710071

【Affiliations】 Institute of Intelligent Information Processing, Xidian University, Xi'an 710071; School of Computer, Xidian University, Xi'an 710071

【Abstract】 By adopting a least squares loss function in the hidden space, least squares hidden space support vector machines (LSHSSVMs) are proposed in this paper. As in hidden space support vector machines (HSSVMs), the kernel functions used in LSHSSVMs need not satisfy the positive definiteness condition, so they can be chosen from a wider range. Owing to the least squares loss function, LSHSSVMs lead to an unconstrained convex quadratic program, which is easier to solve than the constrained convex quadratic program yielded by HSSVMs. A conjugate gradient algorithm is designed to solve LSHSSVMs efficiently, and an analysis of the computation time is also given. Comparative experiments on pattern recognition and function regression show that LSHSSVMs have some advantages over HSSVMs in computational complexity and generalization performance.
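The procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sigmoid kernel, parameter values, and function names are assumptions, and where the paper uses a conjugate gradient method, this sketch solves the small linear system directly.

```python
import numpy as np

def sigmoid_kernel(X, Z, a=1.0, r=-1.0):
    # Sigmoid kernel tanh(a<x,z> + r): generally NOT positive definite,
    # which is exactly the kind of kernel hidden-space methods permit.
    return np.tanh(a * X @ Z.T + r)

def lshssvm_fit(X, y, C=1.0):
    """Hypothetical least-squares fit in the hidden space (a sketch).

    Each sample is mapped to its vector of kernel responses against all
    training points; a regularized least-squares problem is then solved in
    that space, i.e. an unconstrained convex quadratic program whose
    optimality condition is a single linear system.
    """
    H = sigmoid_kernel(X, X)               # hidden-space features, N x N
    N = H.shape[0]
    Hb = np.hstack([H, np.ones((N, 1))])   # append a bias column
    # Normal equations of min_p 0.5*||p||^2 + 0.5*C*||Hb @ p - y||^2
    # (for simplicity the bias weight is regularized along with the rest).
    A = Hb.T @ Hb + np.eye(N + 1) / C
    return np.linalg.solve(A, Hb.T @ y)

def lshssvm_predict(p, Xtrain, Xnew):
    # Map new points into the same hidden space and apply the linear model.
    Hn = sigmoid_kernel(Xnew, Xtrain)
    return np.hstack([Hn, np.ones((Hn.shape[0], 1))]) @ p
```

In a full implementation the linear system would be solved iteratively with conjugate gradients, as the abstract states, which avoids forming and factoring `A` explicitly for large N.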

【Fund】 Supported by the National Natural Science Foundation of China (60372050, 60133010) and the National High-Tech Research and Development Program of China (863 Program, 2002AA135080)
  • 【Source】 Chinese Journal of Computers, 2005, No. 08
  • 【CLC Number】 TP18
  • 【Cited By】 34
  • 【Downloads】 674