Class posterior distributions can be used directly for classification or as intermediate features that can be further exploited in different classifiers (e.g., Gaussian Mixture Models, GMMs) to improve speech recognition performance. In this paper we examine the possibility of using a kNN classifier to perform local phonetic classification of class posterior distributions extracted from acoustic vectors. In that framework, we also propose and evaluate a new kNN metric based on the relative angle between feature vectors to define the nearest neighbors. This idea is inspired by the orthogonality property of posterior features. To fully exploit this property, kNN is applied in two main steps: (1) the distance is computed as the cosine of the relative angle between the test vector and the training vector, and (2) the nearest neighbors are defined as the samples lying within a specific relative angle of the test data; test samples that do not have enough labels within such a hyper-cone are treated as uncertainties and left undecided. This approach is evaluated on the TIMIT database and compared to other metrics already used in the literature for measuring the similarity between posterior probabilities. Based on our experiments, the proposed approach yields 78.48% frame-level accuracy while marking 15.17% of the feature space as uncertain.
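A minimal sketch of the two-step rule described above, assuming a fixed cone half-angle `max_angle_deg` and a minimum neighbor count `min_neighbors` (both hypothetical parameter names; the paper's exact thresholds and voting rule may differ):

```python
import numpy as np

def cone_knn_classify(x_test, X_train, y_train, max_angle_deg=30.0, min_neighbors=5):
    """Angle-based kNN classification of a posterior feature vector.

    Step 1: the similarity is the cosine of the relative angle between the
    test vector and each training vector.
    Step 2: neighbors are the training samples inside a hyper-cone of
    half-angle `max_angle_deg` around the test vector; if fewer than
    `min_neighbors` labelled samples fall inside the cone, the test sample
    is left undecided (returned as None).
    """
    # Cosine of the relative angle between the test vector and each training vector
    cos_sim = (X_train @ x_test) / (
        np.linalg.norm(X_train, axis=1) * np.linalg.norm(x_test) + 1e-12
    )
    # Keep only the samples whose relative angle is at most max_angle_deg
    inside = cos_sim >= np.cos(np.deg2rad(max_angle_deg))
    if inside.sum() < min_neighbors:
        return None  # uncertainty: not enough labelled neighbors in the cone
    labels, counts = np.unique(y_train[inside], return_counts=True)
    return labels[np.argmax(counts)]  # majority vote among in-cone neighbors

# Usage example with random posterior-like vectors (illustration only)
rng = np.random.default_rng(0)
X_train = rng.dirichlet(np.ones(40), size=200)   # 200 training posteriors over 40 classes
y_train = rng.integers(0, 40, size=200)          # their phonetic labels
x_test = rng.dirichlet(np.ones(40))              # one test posterior
print(cone_knn_classify(x_test, X_train, y_train))
```

Because near-orthogonal posterior vectors have cosine similarity close to zero, the cone threshold naturally separates samples belonging to other classes from genuine neighbors, and the minimum-neighbor check is what produces the "undecided" fraction reported in the experiments.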