Abstract:
Kernel methods have been developed to handle nonlinear classification problems in online learning. In recent years, budget maintenance algorithms have been proposed to prevent the number of support vectors used in kernel evaluations from growing without bound as data stream in. However, the classification performance of existing fixed-budget kernel classification algorithms degrades severely under label noise. To address this issue, a label-noise-tolerant online kernel learning method based on k nearest centroid neighbors (kNCN) is proposed. When the buffer reaches the budget size, the method uses the kNCN principle to find the k nearest centroid neighbors of each support vector in the buffer. By computing the local label inconsistency between each support vector and its neighbors, a deletion candidate set and an anchor set are constructed. A trial model is then built for each instance in the deletion candidate set, and its classification accuracy is evaluated on the anchor set to determine which support vector should be removed from the buffer, thereby maintaining the fixed budget. Experimental results on synthetic and real data sets show that applying this method to fixed-budget Perceptron and Passive-Aggressive algorithms effectively improves classification performance under label noise. The method's overall ranking across six data sets is better than that of the other compared algorithms.
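The removal step described in the abstract can be sketched roughly as follows. This is a hypothetical simplification, not the paper's implementation: a plain 1-NN classifier stands in for the trial model, points with any local label inconsistency form the deletion candidate set while fully consistent points form the anchor set, and ties in anchor accuracy are broken by inconsistency. All function names, thresholds, and the tie-breaking rule are assumptions.

```python
import numpy as np

def kncn_indices(X, i, k):
    """Greedy kNCN selection: pick k neighbours of X[i] so that the
    centroid of the selected set stays as close as possible to X[i]."""
    candidates = [j for j in range(len(X)) if j != i]
    selected = []
    for _ in range(k):
        best_j, best_d = None, np.inf
        for j in candidates:
            d = np.linalg.norm(np.mean(X[selected + [j]], axis=0) - X[i])
            if d < best_d:
                best_j, best_d = j, d
        selected.append(best_j)
        candidates.remove(best_j)
    return selected

def select_removal(X, y, k=3):
    """Return the index of the support vector to evict from the buffer."""
    n = len(X)
    # local label inconsistency: fraction of kNCN neighbours with a
    # different label than the point itself
    incons = np.array([np.mean(y[kncn_indices(X, i, k)] != y[i])
                       for i in range(n)])
    cand = [i for i in range(n) if incons[i] > 0]       # suspected noise
    anchors = [i for i in range(n) if incons[i] == 0]   # locally consistent
    if not cand:            # nothing looks noisy: fall back to oldest point
        return 0

    def anchor_acc(removed):
        # trial model: 1-NN on the buffer with `removed` taken out,
        # scored on the anchor set (leave-one-out per anchor)
        keep = [j for j in range(n) if j != removed]
        correct = 0
        for a in anchors:
            pool = [j for j in keep if j != a]
            nearest = min(pool, key=lambda j: np.linalg.norm(X[j] - X[a]))
            correct += (y[nearest] == y[a])
        return correct / len(anchors)

    # evict the candidate whose removal preserves anchor accuracy best;
    # break ties in favour of the most locally inconsistent point
    return max(cand, key=lambda c: (anchor_acc(c), incons[c]))

# toy buffer: two clean clusters plus one mislabelled point (index 8)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1],   # class 0 cluster
              [5, 5], [5, 6], [6, 5], [6, 6],   # class 1 cluster
              [1.5, 1.5]], dtype=float)          # labelled 1, sits in class 0
y = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1])
print(select_removal(X, y, k=3))  # picks the mislabelled point, index 8
```

In this toy setting the mislabelled point has the highest local label inconsistency, so it is the one evicted; in a streaming setting the same selection would run each time a new instance pushes the buffer past the budget.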