Hard-margin support vector machines (HM-SVMs) carry a risk of overfitting, while soft-margin SVMs carry a high computational cost. This paper first introduces a new iterative SVM training algorithm, the Segmented Greedy Algorithm (GS-SVMs). The method handles the overfitting problem of HM-SVMs without introducing a regularization parameter, thereby improving training speed. A sample-reduction strategy is then applied to further improve GS-SVMs, yielding a new algorithm, NGS-SVMs. Experimental results demonstrate that the method is feasible and effective, and is particularly well suited to large-sample data.
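The abstract does not specify the details of GS-SVMs, but the core idea it names, iterative training driven by a greedy per-step choice of sample, can be illustrated with a minimal sketch. The code below is a hypothetical, simplified stand-in (a perceptron-style update applied to the worst-margin sample at each step), not the paper's actual GS-SVMs; the function names, learning rate, and update rule are assumptions for illustration only.

```python
# Hypothetical sketch of greedy iterative training in the spirit of the
# abstract: at each step, pick the single worst-margin sample (greedy
# selection) and update the separating hyperplane using it alone.
# This is NOT the paper's GS-SVMs; the update rule is illustrative.

def margin(w, b, x, y):
    """Signed functional margin of sample (x, y) under hyperplane (w, b)."""
    return y * (sum(wi * xi for wi, xi in zip(w, x)) + b)

def greedy_train(samples, lr=0.1, max_iters=100):
    """samples: list of (x, y) pairs with y in {-1, +1}; returns (w, b)."""
    dim = len(samples[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(max_iters):
        # Greedy step: select the sample with the smallest margin.
        x, y = min(samples, key=lambda s: margin(w, b, s[0], s[1]))
        if margin(w, b, x, y) > 0:   # every sample correctly separated: stop
            break
        w = [wi + lr * y * xi for wi, xi in zip(w, x)]
        b += lr * y
    return w, b

# Toy linearly separable data: class +1 on the right, class -1 on the left.
data = [([2.0, 1.0], 1), ([3.0, 2.0], 1),
        ([-2.0, -1.0], -1), ([-3.0, 0.5], -1)]
w, b = greedy_train(data)
print(all(margin(w, b, x, y) > 0 for x, y in data))
```

Because each iteration touches only one sample, a scheme of this shape avoids solving the full quadratic program at every step, which is consistent with the abstract's emphasis on training speed for large-sample data.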