The form of the solution to the optimization problem of the support vector machine (SVM) under a quadratic loss function is studied and compared with the ordinary least-squares (LS) estimation problem; the two optimization problems are shown to have almost identical forms. Because the SVM optimization problem under the quadratic loss function corresponds to an underdetermined system, the associated least-squares estimation problem admits a minimum-norm solution. If the SVM parameters are chosen appropriately, it can be proved theoretically that SVM function fitting with the quadratic loss function is in fact a constrained least-squares estimation problem, and that its solution coincides with the minimum-norm least-squares solution. Since the minimum-norm solution is a special case of the SVM at particular parameter values, automatically tuning these parameters yields a class of minimum-norm solutions. This motivates the idea of solving least-squares problems with the SVM; owing to the advantages of the SVM, the resulting solutions better reflect practical situations.
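The central claim, that the minimum-norm least-squares solution of an underdetermined system is recovered as a limiting case of a quadratic-loss (ridge-style) SVM solution when the regularization parameter is pushed to an extreme, can be illustrated numerically. The sketch below is not the paper's exact formulation; it assumes a linear kernel and uses the dual-form expression x(C) = Aᵀ(AAᵀ + I/C)⁻¹b, where C plays the role of the SVM penalty parameter.

```python
import numpy as np

# Underdetermined system A x = b: more unknowns than equations,
# so infinitely many exact solutions exist; least squares picks
# the one of minimum Euclidean norm.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 8))   # 3 equations, 8 unknowns
b = rng.standard_normal(3)

# Minimum-norm least-squares solution via the pseudoinverse.
x_min_norm = np.linalg.pinv(A) @ b

def quadratic_loss_solution(A, b, C):
    """Dual-form quadratic-loss solution with regularization 1/C
    (a hypothetical stand-in for the SVM's penalty parameter)."""
    m = A.shape[0]
    return A.T @ np.linalg.solve(A @ A.T + np.eye(m) / C, b)

# As C grows, the regularized solution approaches the minimum-norm one,
# matching the abstract's claim that the minimum-norm LS solution is an
# SVM special case for suitable parameter choices.
x_large_C = quadratic_loss_solution(A, b, C=1e10)
print(np.allclose(x_large_C, x_min_norm, atol=1e-6))  # → True
```

For small C the regularization term I/C dominates and the solution is shrunk away from the minimum-norm point, which is why the equivalence only holds for an appropriate (here, large) choice of the parameter.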