Date of Award
Doctor of Philosophy (PhD)
Computational and Data Sciences
Dr. Erik Linstead
Dr. Elizabeth Stevens
Dr. Grace Fong
Recently, there has been a tremendous increase in generating and synthesizing music and art using various computational techniques. An area that remains under-researched, however, is how one medium can be converted into the other while maintaining its overall aesthetics. Over the last few centuries, artists, composers, and scholars have attempted to substitute one form of art for another: by proposing techniques where musical notes correspond to colors, by inventing instruments that combine the aesthetics of music and visual art, and by incorporating the two media in live performances. A widely accepted computational approach to this conversion has yet to be introduced, leaving room for experimentation: to develop and polish a mechanism that can be adopted globally. This dissertation explores different computational techniques for converting one medium into the other, with varying degrees of human input. First, a simple computational approach to converting piano compositions into paintings is discussed, one that relies heavily on human-encoded metrics to perform the conversion. Next, the embedding layer of a simple neural network is explored to provide information about artists and their artworks. The idea of using embedding layers for data compression is then extended into a purely computational approach, in which the latent spaces of two trained variational autoencoder networks are interchanged. Finally, the earlier approaches are merged to explore human-encoded metrics mapped to a generative model to interpolate music synthesis. For all approaches, results are derived to further the commentary on the interchange of music and art.
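The latent-space interchange mentioned above can be illustrated with a minimal sketch. This is not the dissertation's implementation; the class and variable names are hypothetical, the "models" are untrained random linear maps standing in for trained VAE encoders and decoders, and the dimensions are arbitrary. The only requirement the sketch demonstrates is that the two models share a latent dimensionality, so a code produced by the music-domain encoder can be fed to the art-domain decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyAutoencoder:
    """Toy linear encoder/decoder standing in for a trained VAE (hypothetical)."""
    def __init__(self, input_dim, latent_dim):
        # Random weights stand in for learned parameters.
        self.enc = rng.normal(size=(latent_dim, input_dim)) / np.sqrt(input_dim)
        self.dec = rng.normal(size=(input_dim, latent_dim)) / np.sqrt(latent_dim)

    def encode(self, x):
        return self.enc @ x  # project input into the latent space

    def decode(self, z):
        return self.dec @ z  # reconstruct a sample from a latent code

# Two domains that share one latent dimensionality (assumed, e.g. 16).
music_vae = TinyAutoencoder(input_dim=128, latent_dim=16)  # e.g. a piano-roll slice
art_vae = TinyAutoencoder(input_dim=256, latent_dim=16)    # e.g. a flattened image patch

music_sample = rng.normal(size=128)
z = music_vae.encode(music_sample)  # latent code from the music model
painting = art_vae.decode(z)        # decoded by the art model: the interchange
print(painting.shape)
```

With trained models, the decoded vector would be reshaped and rendered as an image; here it only shows the mechanics of routing one domain's latent code through the other domain's decoder.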
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.
R. H. Ali, "Computational approaches to facilitate automated interchange between music and art," Ph.D. dissertation, Chapman University, Orange, CA, 2022. https://doi.org/10.36837/chapman.000359
Available for download on Wednesday, May 31, 2023