Abstract
Many music composition algorithms attempt to compose music in a particular style. The resulting music is often impressive and indistinguishable from the style of the training data, but it tends to lack significant innovation. In an effort to increase innovation in the selection of pitches and rhythms, we present a system that discovers musical motifs by coupling machine learning techniques with an inspirational component. The inspirational component allows for the discovery of musical motifs that are unlikely to be produced by a generative model, while the machine learning component harnesses innovation. Candidate motifs are extracted from non-musical media such as images and audio. Machine learning algorithms select the motifs that best comply with patterns learned from training data. This process is validated by extracting motifs from real music scores, identifying themes in the piece according to a theme database, and measuring the probability of discovering thematic motifs versus non-thematic motifs. We examine the information content of the discovered motifs by comparing the entropy of the discovered motifs, candidate motifs, and training data. We measure innovation by comparing the probability of the training data and the probability of the discovered motifs given the model. We also compare the probabilities of media-inspired motifs with random motifs and find that media inspiration is more efficient than random generation.
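The core selection step described above (candidate motifs scored against patterns learned from training data) can be sketched as follows. This is a minimal illustration, not the thesis's actual model: it assumes motifs are represented as pitch-interval sequences, uses a simple first-order Markov model as the stand-in for the learned generative model, and uses toy data throughout.

```python
import math
from collections import defaultdict

def train_markov(sequences):
    """Learn first-order transition probabilities from training motifs
    (an illustrative stand-in for the learned model in the abstract)."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def log_prob(motif, model, floor=1e-6):
    """Log-probability of a candidate motif under the model; unseen
    transitions receive a small floor probability."""
    return sum(math.log(model.get(a, {}).get(b, floor))
               for a, b in zip(motif, motif[1:]))

def select_motifs(candidates, model, k=1):
    """Keep the k candidates that best comply with the learned patterns."""
    return sorted(candidates, key=lambda m: log_prob(m, model), reverse=True)[:k]

# Toy training data: pitch-interval sequences from real scores (hypothetical values).
training = [[0, 2, 2, -4], [0, 2, -4, 2], [2, 2, -4, 0]]
model = train_markov(training)

# Toy candidates as if extracted from non-musical media (images, audio).
candidates = [[2, 2, -4, 0], [7, -11, 13, 5], [0, 2, 2, 2]]
best = select_motifs(candidates, model, k=1)  # most stylistically probable candidate
```

The inspirational component supplies candidates the model itself would rarely generate; the model then acts as a filter, which is why the abstract can compare the probability of discovered motifs against random motifs under the same model.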
College and Department
Physical and Mathematical Sciences; Computer Science
BYU ScholarsArchive Citation
Johnson, Daniel S., "Musical Motif Discovery in Non-Musical Media" (2014). All Theses and Dissertations. 4081.
Keywords
music composition, machine learning