Abstract:
Kernel methods have been developed to handle nonlinear classification problems in online classification. In recent years, budget maintenance algorithms have been developed to prevent the number of support vectors from growing without bound as data streams in. However, the classification performance of existing fixed-budget kernel classification algorithms is severely degraded by label noise. To address this issue, a label-noise-tolerant online kernel learning method based on kNCN (k nearest centroid neighbors) is proposed. When the buffer reaches the budget size, the method uses the kNCN rule to find the k nearest centroid neighbors of each support vector in the buffer and computes the local label inconsistency between each support vector and its neighbors, from which a deletion candidate set and an anchor set are constructed. A trial model is then built for each instance in the deletion candidate set, and its classification accuracy is evaluated on the anchor set to determine which support vector should be removed from the buffer, thereby maintaining the fixed budget. Experimental results on synthetic and real datasets show that applying this method to the fixed-budget perceptron and passive-aggressive algorithms effectively improves classification performance under label noise, and its overall ranking across six datasets is better than that of the other compared algorithms.
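To make the budget-maintenance step described above concrete, the following is a minimal Python sketch under stated assumptions: the function names (kncn, budget_maintain), the predict_without callback standing in for a trial model with one support vector removed, the Euclidean distance, and the zero-inconsistency threshold used to split the buffer into candidates and anchors are all illustrative choices not specified in the abstract, not the paper's actual implementation.

```python
import numpy as np

def kncn(query, X, k):
    """k nearest centroid neighbors: greedily add the point whose inclusion
    keeps the running centroid of the chosen set closest to the query."""
    remaining = list(range(len(X)))
    chosen = []
    centroid_sum = np.zeros(query.shape, dtype=float)
    for _ in range(k):
        # distance from the query to the centroid formed by adding each candidate
        dists = [np.linalg.norm((centroid_sum + X[i]) / (len(chosen) + 1) - query)
                 for i in remaining]
        best = remaining.pop(int(np.argmin(dists)))
        chosen.append(best)
        centroid_sum += X[best]
    return chosen

def budget_maintain(X, y, k, predict_without):
    """One budget-maintenance step: score each support vector (SV) by the
    local label inconsistency among its kNCN neighbors, split the buffer
    into a deletion candidate set and an anchor set, then evict the SV
    whose trial model (that SV removed) classifies the anchors best."""
    n = len(X)
    incons = np.empty(n)
    for i in range(n):
        others_X = np.delete(X, i, axis=0)
        others_y = np.delete(y, i)
        nbrs = kncn(X[i], others_X, k)
        # fraction of kNCN neighbors whose label disagrees with SV i
        incons[i] = np.mean(others_y[nbrs] != y[i])
    candidates = np.where(incons > 0)[0]   # assumed split: any disagreement
    anchors = np.where(incons == 0)[0]     # locally consistent SVs
    if len(candidates) == 0 or len(anchors) == 0:
        return None                        # fall back to another removal rule
    best_idx, best_acc = None, -1.0
    for c in candidates:
        preds = predict_without(c, X[anchors])  # hypothetical trial model
        acc = np.mean(preds == y[anchors])
        if acc > best_acc:
            best_idx, best_acc = c, acc
    return best_idx                        # SV to remove from the buffer
```

In this sketch the anchor set doubles as a validation set: since anchors agree with their local neighborhoods, they are presumed clean, so the candidate whose removal yields the highest anchor accuracy is treated as the most likely noisy support vector.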