Week 10: Data Sonification

Gravestone inscriptions are full of data. If we represented some of our class graveyard data in sound, what patterns might we hear? What might those patterns reflect about the lived experiences of those communities?

In your workbench, in the week 10 folder, is a Jupyter notebook called sonification.ipynb. Give that a whirl and use it to represent your gravestone data. Then import the resulting .midi files into something like GarageBand or other music-editing software to assign instrumentation, or remix as appropriate.
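If you're curious what a notebook like sonification.ipynb is doing under the hood, here is a minimal sketch of the general idea: map one column of gravestone data onto MIDI pitches and write the result to a .mid file using the midiutil library. The file name, column name, pitch range, and tempo below are illustrative assumptions, not the notebook's actual code.

```python
# A sketch of data-to-MIDI sonification. The column names and scaling
# choices are assumptions for illustration only.
import pandas as pd
from midiutil import MIDIFile  # pip install midiutil

df = pd.read_csv("graveyard.csv")       # assumed file name
ages = df["age_at_death"].dropna()      # assumed column name

# Map ages onto a playable MIDI pitch range (48-84)
low, high = ages.min(), ages.max()
span = max(high - low, 1)
pitches = [int(48 + (a - low) / span * 36) for a in ages]

midi = MIDIFile(1)                      # one track
midi.addTempo(track=0, time=0, tempo=120)
for beat, pitch in enumerate(pitches):
    # one note per burial, half a beat long
    midi.addNote(track=0, channel=0, pitch=pitch,
                 time=beat * 0.5, duration=0.5, volume=100)

with open("gravestones.mid", "wb") as outfile:
    midi.writeFile(outfile)
```

The resulting gravestones.mid can then be dragged into GarageBand or similar software for instrumentation and remixing, as described above.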

Or, have a play around with TwoTone: you load a CSV of data into it, then add ‘voices’ for each column of data. You can assign instrumentation, range, and tempo to each voice, and download the result from the site as an mp3 file. You might get an error about ‘midi’ requiring a site permission; hit the note icon, add another row of data, and you can safely ignore the error.
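TwoTone simply wants a tidy CSV with one column per prospective voice. The sketch below shows one possible way to shape gravestone data into that form with pandas, grouping burials by decade; the file and column names are assumptions about your data, not requirements of the tool.

```python
# Shape gravestone data into a CSV that TwoTone can read:
# one row per time step, one column per prospective 'voice'.
# Column names and the decade grouping are illustrative assumptions.
import pandas as pd

df = pd.read_csv("graveyard.csv")                     # assumed file name
df["decade"] = (df["year_of_death"] // 10) * 10       # assumed column name

summary = (
    df.groupby("decade")
      .agg(burials=("year_of_death", "size"),
           mean_age=("age_at_death", "mean"))         # assumed column name
      .reset_index()
)

# Each column (burials, mean_age) can then be assigned its own voice in TwoTone
summary.to_csv("twotone_input.csv", index=False)
```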

Some Examples

An example of a project that uses sound, sonification, and art to interpret an ancient site is the wonderful Soundmarks project by Rose Ferraby and Rob St. John.

A piece I made with Eric Kansa and Andrew Reinhard, ‘Reflexivity’, used TwoTone to generate the original soundlines, which we then remixed. Our rationale for how we turned archaeological survey data into sound and then remixed it can be read at Epoiesen, a journal for creative engagement in history and archaeology, here. (Incidentally, Epoiesen publishes undergraduate researchers too.)

An MA project at Carleton by Cristina Wood, Songs of the Ottawa (now archived at the Internet Archive), is another example of sonification used for research.
