Online music databases have grown significantly as a consequence of the rapid expansion of the Internet and digital audio, requiring the development of faster and more efficient tools for music content analysis. Musical genres are widely used to organize music collections. In this paper, the problem of automatic music genre classification is addressed by exploring rhythm-based features obtained from a complex network representation of rhythm. A Markov model is built in order to analyse the temporal sequence of rhythmic notation events. Feature analysis is performed by using two multivariate statistical approaches: principal component analysis (unsupervised) and linear discriminant analysis (supervised). Similarly, two classifiers are applied in order to identify the category of rhythms: a parametric Bayesian classifier under the Gaussian hypothesis (supervised), and agglomerative hierarchical clustering (unsupervised). Results measured by the Kappa coefficient and the obtained clusters corroborate the effectiveness of the proposed method.
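As a rough illustration of the pipeline described above, the sketch below builds a first-order Markov transition matrix over a sequence of rhythmic notation events and projects the resulting feature vectors with PCA. The event vocabulary, toy sequences, and helper function are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: Markov transition features from rhythmic events + PCA.
# Event labels and vocabulary are assumed for illustration only.
import numpy as np
from sklearn.decomposition import PCA

def markov_features(events, vocab):
    """Return the flattened row-normalized transition matrix of an event sequence."""
    idx = {e: i for i, e in enumerate(vocab)}
    n = len(vocab)
    counts = np.zeros((n, n))
    for a, b in zip(events[:-1], events[1:]):
        counts[idx[a], idx[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Rows with no outgoing transitions stay zero.
    probs = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    return probs.ravel()

# Toy example: two pieces described by sequences of note-duration symbols.
vocab = ["quarter", "eighth", "half"]
pieces = [
    ["quarter", "eighth", "eighth", "quarter", "half"],
    ["half", "quarter", "quarter", "eighth", "quarter"],
]
X = np.array([markov_features(p, vocab) for p in pieces])

# Unsupervised feature analysis with PCA (LDA would be the supervised analogue).
pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)  # (2, 2)
```

In a full experiment, the reduced feature vectors would then be passed to a classifier such as the Bayesian classifier or the hierarchical clustering mentioned in the abstract.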