Training Support Vector Machines can become very challenging on large-scale problems. Training several lower-complexity SVMs on local subsets of the training set can significantly reduce the training complexity and also improve classification performance. To obtain efficient multiple-classifier systems, the classifiers need to be both diverse and individually accurate. In this paper we propose an algorithm for training ensembles of SVMs that takes the diversity between the parallel classifiers into account. For this, we use an information-theoretic criterion that expresses a trade-off between individual accuracy and diversity. The parallel SVMs are trained jointly using an adaptation of the Kernel-Adatron algorithm for learning multiple SVMs online. The results are compared to standard multiple-SVM techniques on reference large-scale datasets.
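The abstract does not detail the joint multi-SVM adaptation or the information-theoretic diversity term, so the following is only a minimal sketch of the standard, bias-free Kernel-Adatron update that such joint training builds on. The RBF kernel, the learning rate `lr`, the box constraint `C`, and the function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian RBF kernel matrix between the rows of X and Y (assumed kernel choice)."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_adatron(X, y, C=10.0, lr=0.1, n_epochs=100, tol=1e-3):
    """Train a binary (labels in {-1, +1}) kernel SVM with the Kernel-Adatron update:
    alpha_i <- clip(alpha_i + lr * (1 - y_i * f(x_i)), 0, C)."""
    n = X.shape[0]
    K = rbf_kernel(X, X)
    alpha = np.zeros(n)
    for _ in range(n_epochs):
        max_change = 0.0
        for i in range(n):
            # Current (bias-free) decision value at x_i: f(x_i) = sum_j alpha_j y_j K(x_j, x_i)
            f_i = np.dot(alpha * y, K[:, i])
            new_alpha = np.clip(alpha[i] + lr * (1.0 - y[i] * f_i), 0.0, C)
            max_change = max(max_change, abs(new_alpha - alpha[i]))
            alpha[i] = new_alpha
        if max_change < tol:  # stop once the multipliers have stabilised
            break
    return alpha

def predict(alpha, X_train, y_train, X_test):
    """Sign of the kernel expansion on new points."""
    K = rbf_kernel(X_test, X_train)
    return np.sign(K @ (alpha * y_train))
```

In the ensemble setting described in the paper, several such learners would be trained on local subsets and their updates coupled through the accuracy-diversity criterion; that coupling is not reproduced in this sketch.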