The 2016 version of Big Ears was facilitated and curated by the same people as the previous year, but the design brief was slightly different. This time around, designers/makers would work on their own to create an interface for a particular musician associated with the Drake Music Project. Instead of having just three days to complete the project, we were given almost three months, which allowed for a lot more possibilities to be explored.
Another difference was that the project was not sponsored by Ableton, so I could process the audio in an environment I was used to: Csound. This also meant that I could build something far more user friendly and accessible than the previous year's effort. One of the issues with the previous year's project was that the interface needed a laptop running Live. Another was that, as it was designed to be examined with its guts out, it wasn't very portable. Combined, these two factors meant that the musician could not take the interface home and play with it after the performance in SARC. I solved these issues by building an interface/instrument that required no specialist knowledge to use, was robust, and operated independently of any other technology.
The musician I was paired with was limited in his physical interaction with the world to using a single joystick, which he used to control his wheelchair. Almost every other task was made possible through the aid of a care assistant. With this in mind, I decided to design something that would not only use his already established abilities but also offer the possibility of improving them. The result was an instrument that used a joystick, very similar to the one he was already familiar with, to create music. I once again used an Intel Galileo running Csound as the brain of the instrument, which gave me much more control over how the interface behaved and the audio output it produced.
The basic operation of the interface was that it was activated by touch, and the x- and y-axes of the joystick controlled the resonance and cutoff of an emulation of a Moog filter. The joystick itself had two layers of copper foil separated by a layer of Velostat, which together acted as a force sensor. This was used in the instrument by increasing the modulation index of the synth as the joystick was squeezed tighter, resulting in more harmonics and a brighter sound. Again, the idea behind this instrument was to create something that was on the one hand accessible, but on the other had scope to be learned, granting the musician the possibility of becoming a better performer on it. Below is an example of what it sounds like.
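To give a sense of the control logic, here is a minimal sketch of the mapping described above: two joystick axes scaled to filter cutoff and resonance, and the squeeze pressure from the Velostat sensor scaled to a modulation index. The actual instrument did this inside Csound on the Galileo; this is a standalone illustration, and the parameter ranges are assumptions for the sake of the example, not the values used in the real instrument.

```python
def scale(value, lo, hi):
    """Map a normalised sensor reading (0.0 to 1.0) into the range [lo, hi]."""
    return lo + value * (hi - lo)

def map_controls(x, y, pressure):
    """Map joystick position and squeeze pressure to synth parameters.

    x, y, pressure are normalised 0.0-1.0 readings. The ranges below
    are illustrative only.
    """
    cutoff = scale(x, 100.0, 8000.0)       # filter cutoff in Hz
    resonance = scale(y, 0.0, 0.9)         # Moog-style filter resonance
    mod_index = scale(pressure, 0.0, 5.0)  # squeezing harder -> brighter sound
    return cutoff, resonance, mod_index

# Joystick at rest, no pressure: dark, unmodulated sound
print(map_controls(0.0, 0.0, 0.0))  # (100.0, 0.0, 0.0)
# Fully deflected and squeezed hard: open filter, maximum brightness
print(map_controls(1.0, 1.0, 1.0))  # (8000.0, 0.9, 5.0)
```

In practice these values would be read from the sensors each control cycle and written to the running synth, so the player hears a continuous response to both where the joystick is and how hard it is being gripped.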