Introduction

Last winter, I worked on a personal project I call ongaku (from the Japanese for ‘music’). It was an attempt to use manifold learning to create a metric space for music.
The internet has allowed any artist to share their work with the world, and in many cases to be paid for it. This is a relatively recent development, made possible by the creation of online marketplaces and patronage services.
Preliminary Analysis

This post documents preliminary analysis of Bandcamp album covers. It is part of an ongoing project, which can be found here and on GitHub. The goal of this exploration is to assess the viability of machine learning methods in this setting.
This project looks into the question:
How do indie musicians use visual signs to indicate their subgenre?
The main functions live in bandcamp_webtools.py and handle basic web scraping.
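The scraping code itself isn't reproduced here, but the core task (pulling a cover-image URL out of an album page) can be sketched with only the standard library. The `tralbumArt` container id below is an assumption about Bandcamp's page markup, and the function names are illustrative rather than taken from bandcamp_webtools.py:

```python
from html.parser import HTMLParser


class AlbumArtParser(HTMLParser):
    """Collects candidate album-cover image URLs from an album page.

    Assumes the cover sits inside <div id="tralbumArt">; this id is a
    guess at Bandcamp's markup and may need updating if the site changes.
    """

    def __init__(self):
        super().__init__()
        self.in_art_div = False
        self.cover_urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("id") == "tralbumArt":
            self.in_art_div = True
        elif self.in_art_div and tag == "img" and "src" in attrs:
            self.cover_urls.append(attrs["src"])

    def handle_endtag(self, tag):
        # Crude for a sketch: any closing div ends the art section.
        if tag == "div":
            self.in_art_div = False


def extract_cover_urls(page_html):
    """Return all cover-image URLs found in the given album-page HTML."""
    parser = AlbumArtParser()
    parser.feed(page_html)
    return parser.cover_urls


sample = '<div id="tralbumArt"><img src="https://example.com/cover.jpg"></div>'
print(extract_cover_urls(sample))  # → ['https://example.com/cover.jpg']
```

In a real run, `page_html` would come from fetching an album URL; parsing is kept separate from fetching so it can be tested on saved pages.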
This exploration works by gathering album covers from Bandcamp.
Overview

Ongaku is a method for creating playlists programmatically, using only the content of the songs themselves. It uses gammatone cepstral analysis to create a unique matrix representing each song. A gammatone cepstrum is similar to the spectra more commonly used for audio analysis.
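The post doesn't show ongaku's feature-extraction code, but the general cepstral pipeline it describes (frame the signal, take filterbank energies of each frame's power spectrum, then apply a cosine transform to the log energies) can be sketched with NumPy. A log-spaced triangular filterbank stands in for a true ERB-spaced gammatone filterbank, and every name below is illustrative, not taken from the project:

```python
import numpy as np


def frame_signal(x, frame_len=512, hop=256):
    """Split a 1-D signal into overlapping frames."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])


def triangular_filterbank(n_bins, n_bands):
    """Triangular filters on a log-frequency grid.

    A crude stand-in: a real gammatone filterbank uses ERB-spaced
    fourth-order gammatone filters instead.
    """
    pts = np.logspace(0, np.log10(n_bins - 1), n_bands + 2)
    bins = np.arange(n_bins)
    fb = np.zeros((n_bands, n_bins))
    for b in range(n_bands):
        lo, c, hi = pts[b], pts[b + 1], pts[b + 2]
        up = (bins - lo) / (c - lo)
        down = (hi - bins) / (hi - c)
        fb[b] = np.clip(np.minimum(up, down), 0, None)
    return fb


def gtcc_like(x, n_bands=32, n_coeffs=13, frame_len=512, hop=256):
    """Cepstral feature matrix: one row of coefficients per frame."""
    frames = frame_signal(x, frame_len, hop) * np.hanning(frame_len)
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2      # (frames, bins)
    fb = triangular_filterbank(power.shape[1], n_bands)
    log_e = np.log(power @ fb.T + 1e-10)                  # (frames, bands)
    # DCT-II over the band axis gives the cepstral coefficients.
    i = np.arange(n_coeffs)[:, None]
    j = np.arange(n_bands)[None, :]
    dct = np.cos(np.pi / n_bands * (j + 0.5) * i)
    return log_e @ dct.T                                  # (frames, coeffs)


sr = 22050
t = np.arange(sr) / sr
song = np.sin(2 * np.pi * 440 * t)   # one second of A440 as a stand-in
features = gtcc_like(song)
print(features.shape)                # one row of 13 coefficients per frame
```

The resulting matrix is the kind of per-song representation the overview describes; comparing songs then reduces to comparing these matrices.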