To address overfitting and poor generalization in neural networks, this paper studies local maximum entropy density estimation of the mixed input-output probability density function of the sample data, and proposes a class-wise, batch-based self-correction method for the sample parameters using the Chebyshev inequality. The estimated density is used to stretch the sample set, yielding a new randomly expanded training set with higher estimation quality and better results. Simulation results show that feedforward neural networks trained with this method achieve good generalization performance.
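The abstract does not spell out the implementation details of the density estimation or the self-correction step. The following is a minimal sketch, under the assumption that each class is summarized by per-feature mean and standard deviation, that the Chebyshev bound |x − μ| ≤ kσ is used to screen sample parameters class by class, and that the screened samples are perturbed to expand the training set. The functions `chebyshev_screen` and `expand_class`, the bound factor `k = 3`, and the noise scale `0.1 * sigma` are illustrative assumptions, not the paper's actual estimator.

```python
import numpy as np

def chebyshev_screen(X, k=3.0):
    """Keep samples of one class whose features satisfy the Chebyshev-style
    bound |x - mu| <= k * sigma; return the kept samples and the statistics."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0) + 1e-12          # avoid division/clipping by zero
    mask = np.all(np.abs(X - mu) <= k * sigma, axis=1)
    return X[mask], mu, sigma

def expand_class(X, n_new, k=3.0, seed=None):
    """Generate n_new synthetic samples for one class by perturbing screened
    samples with noise scaled to the per-feature standard deviation, then
    clipping back into the Chebyshev band around the class mean."""
    rng = np.random.default_rng(seed)
    X_kept, mu, sigma = chebyshev_screen(X, k)
    base = X_kept[rng.integers(0, len(X_kept), size=n_new)]
    noise = rng.normal(0.0, 0.1 * sigma, size=base.shape)
    return np.clip(base + noise, mu - k * sigma, mu + k * sigma)

# Toy usage: expand each class of a two-class dataset before training
# a feedforward network on the enlarged set.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(50, 4))    # class 0 samples
X1 = rng.normal(3.0, 1.0, size=(50, 4))    # class 1 samples
aug0 = expand_class(X0, n_new=100, seed=1)
aug1 = expand_class(X1, n_new=100, seed=2)
X_train = np.vstack([X0, X1, aug0, aug1])
y_train = np.array([0] * 50 + [1] * 50 + [0] * 100 + [1] * 100)
print(X_train.shape, y_train.shape)        # (300, 4) (300,)
```

The intent of such an expansion is the one stated in the abstract: the enlarged, class-consistent training set gives the feedforward network more varied samples drawn from (an estimate of) the underlying density, which is what is credited with reducing overfitting and improving generalization.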