The Final Product and Evaluation

Here are the three songs performed in full:

Given that the brief required the final performance to provide a visual element that complemented the music, in this respect the project was successful. Writing the music with the knowledge that it would be performed in this way did indeed inform how the pieces were composed, and provides an intriguing commentary on the relationship between music, visuals, technology and mixed media in general. Although certain obstacles, such as the relay drawing too much power from the Arduino (necessitating the removal of the lamps) and the use of only one Arduino (removing the visual feedback from the controller), limited the project, the aesthetic aspect of the final product works well. It is also interesting to note that because many of the lighting patches within Max/MSP are random, a little unpredictability is added to the performance; no two renditions of a song can ever be the same, which augments the performance aspect.

With regards to what could be improved, using the lamps as intended would have provided more variation to the lighting, yet a concern noticed when filming the performance was that too much light can overpower the rest, resulting in a less subtle performance. Although the Leap Motion was utilised, it could have been integrated into the project a little more creatively, manipulating the audio material itself as opposed to just affecting it. In addition, the Max/MSP patch could perhaps have been built so that the samples only played on the beat, similar to quantisation; such a device is used in Ableton Live so that samples are played in time, and this would have aided the live performance. In conclusion, while certain technological shortcomings limited the scope of the project, the final product exhibits the exact qualities that were intended at the beginning:
To be an engaging piece of visual and auditory media in which sound and light exist in both a complementary and contrasting relationship, while making efficient use of custom-built components.

The Performance

Having completed the following, the project is ready to be performed:

  • The construction and wiring of the controller and lighting system
  • The composition, mixing and mastering of three songs (totalling around 10 minutes and 40 seconds)
  • The creation of a Max patch which integrates the aforementioned parts, along with the implementation of the Leap Motion controller to manipulate effects

After various trials, however, it has been determined that performing the pieces live over the course of ten minutes is a difficult prospect. This is because if a single sample is even marginally out of time with another, both the lighting system and the song itself are thrown noticeably out of sync.

With this in mind, the decision has been made to create polished videos that showcase the technology. This will be done by performing the pieces as intended, without an audience and from multiple angles. The takes where the songs are played correctly will then enter the final cut and, coupled with the mastered versions of the full songs, will create an aesthetically pleasing and dynamic demonstration of the project.

Max/MSP Patch

The final step in integrating all the parts together is the Max/MSP patch that interprets the data from the box, plays the sounds and activates the LEDs.

This is the front end of the patch, which contains the graphical user interface for the maxuino object. From here the mode for each pin can be set (digital I/O, analogue in, PWM, etc.). A subpatch has been created so that simply pressing the button near the top left sets the correct pin modes for the purposes of my project. In the centre at the top is a umenu object; when the relevant song is selected, it activates certain subpatches so that the right samples, lighting system and BPM are selected. To the upper right are the sample and lighting subpatches, which contain the audio material and lighting sequences for each song.

Frontend

This is a look inside the sub patch which sets the correct pin mode:

inputselect

This sub patch translates a BPM value into milliseconds which is used for both the delay effect utilised by the Leap Motion and the lighting systems:

bpm
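The conversion this subpatch performs can be sketched in a few lines. The following is an illustrative Python version, not the Max objects themselves; the `division` parameter is a hypothetical extra for note subdivisions:

```python
def bpm_to_ms(bpm, division=1.0):
    """Convert a tempo in BPM to a delay time in milliseconds.

    One beat lasts 60,000 ms divided by the tempo. `division` is a
    hypothetical parameter for subdivisions (1.0 = quarter note,
    0.5 = eighth note, etc.).
    """
    if bpm <= 0:
        raise ValueError("BPM must be positive")
    return 60000.0 / bpm * division

# A quarter note at 120 BPM lasts 500 ms.
```

The resulting millisecond value can then drive both the delay time and the lighting clock, keeping them in step with the song's tempo.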

This subpatch details how, when the buttons are pressed, the values are routed to select certain samples:

route

This subpatch demonstrates how each sample is turned on and off. As the digital pins (the buttons on the controller) are constantly streaming data, a switch has to be employed so that when a ‘1’ is received (meaning that a button has been pressed) it immediately switches off the flow of data for 400 milliseconds so that no further ‘1’s are received. If this system is not set up and a toggle is simply used, the sample switches on multiple times when the button is pressed. It also shows how audio is transmitted to the lighting patches that require audio data to light the LEDs.
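The gating logic described above can be modelled outside of Max. This Python sketch is only an illustration of the behaviour (the names and structure are mine, not from the patch):

```python
class ButtonGate:
    """Mimics the gate described above: the first '1' toggles the
    sample, then the incoming data stream is ignored for a hold
    period so the constantly streaming pin cannot retrigger it."""

    def __init__(self, hold_ms=400):
        self.hold_ms = hold_ms
        self.blocked_until = 0.0
        self.sample_on = False

    def receive(self, value, now_ms):
        # While the gate is closed, ignore everything.
        if now_ms < self.blocked_until:
            return self.sample_on
        if value == 1:  # a button press was detected
            self.sample_on = not self.sample_on
            self.blocked_until = now_ms + self.hold_ms
        return self.sample_on
```

A stream of ‘1’s arriving within the 400 ms window leaves the sample state unchanged; only a press after the window has elapsed toggles it again.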

button

Next is the subpatch that deals with the LEDs that turn on and off in a random sequence. At first the ‘random’ object was used, but this yielded results that were too unpredictable; for example, a light could switch itself on and off multiple times because the values could repeat. By using the ‘urn’ object instead, a sequence of numbers is generated that, while random, is non-repeating.

arpeggio
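The difference between ‘random’ and ‘urn’ is essentially drawing with versus without replacement. A minimal Python sketch of the ‘urn’ behaviour (a shuffled permutation, so no LED index is repeated until all have fired):

```python
import random

def urn_sequence(n, seed=None):
    """Return the numbers 0..n-1 in a random but non-repeating order,
    analogous to Max's 'urn' object: each value is drawn at most once
    until the urn is refilled."""
    rng = random.Random(seed)
    values = list(range(n))
    rng.shuffle(values)  # random order, but every value appears exactly once
    return values
```

With seven LEDs, each pass through `urn_sequence(7)` lights every LED once before any can repeat, avoiding the flicker of one light being chosen several times in a row.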

Next is a subpatch that translates the audio volume of a sample into a value that the maxuino object can interpret, fading an LED in accordance with it. By using the ‘function’ object, custom ramps can be created so that, for example, if a sample is too quiet the LED can still reach full brightness (a value of 1) even if the volume only ever reaches 0.5.

audio
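The ramp drawn in the ‘function’ object amounts to a rescaled, clamped mapping from volume to brightness. A simple linear version in Python (the breakpoint value is illustrative):

```python
def volume_to_brightness(volume, full_at=0.5):
    """Map an audio volume (0.0-1.0) to an LED brightness (0.0-1.0).

    `full_at` plays the role of the breakpoint drawn in Max's
    'function' object: a quiet sample peaking at `full_at` still
    drives the LED to full brightness.
    """
    brightness = volume / full_at
    return min(max(brightness, 0.0), 1.0)  # clamp to the 0-1 range
```

In practice the ‘function’ object allows arbitrary multi-segment ramps rather than a single straight line, but the principle of compensating for quiet material is the same.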

Finally, there is the delay/filter subpatch that interprets data from the Leap Motion controller into an effect. The two effects can be engaged separately or together, and the length of the delay (which is in time with the BPM) may be selected for each channel independently. The values from the Leap Motion (such as palm rotation and hand position in x/y/z space) are scaled and used to create a dry/wet signal for the delay or a cut-off frequency for the filter.

leap
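The scaling step is a standard linear range mapping. A Python sketch is below; the input range (palm height in millimetres) and output range are illustrative values, not the ones used in the actual patch:

```python
def scale(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly scale a raw Leap Motion reading (e.g. palm height)
    into a parameter range such as a delay dry/wet amount or a
    filter cut-off frequency."""
    value = min(max(value, in_min), in_max)      # clamp the input
    span = (value - in_min) / (in_max - in_min)  # normalise to 0-1
    return out_min + span * (out_max - out_min)

# e.g. a palm height of 100-400 mm mapped to a 0-1 dry/wet signal
```

The same function works for the filter by passing a frequency range (say 200 to 8000 Hz) as the output bounds.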

The Mastering Process

Once all the songs had been mixed so that they were tonally similar and stylistically appropriate, it was time for mastering.

To ensure that all the songs were mastered cohesively and could appear as a single piece of work, they would need to be mastered using a single reference track to inform decisions on how to shape them tonally, their dynamic quality and their overall loudness. With this in mind, I chose Olafur Arnalds – Only The Winds as the reference track. This is because it contains instrumentation that is present in at least one, if not all three, of the songs: piano, strings, electronic drums and synthesizers. In addition, it is stylistically similar to the music for the project, in that it is built on conflict and resolution, with parts that are introduced and swell to a crescendo.

All the songs were mastered using the same signal path (in a daisy chain, in an effort to reduce analogue-to-digital and digital-to-analogue conversions) to help promote uniformity. Firstly, they were routed through a Drawmer 1960, where light compression was introduced to help glue the music together. Next in the signal path was the Focusrite ISA 430 MkII, where the warmth and air of each piece could be manipulated using low and high shelf equalisation respectively until it was a decent reflection of the reference material. Minor adjustments were then made using the parametric section of the equaliser to reduce boxiness, emphasise certain elements (such as the strings at 6 kHz, which was needed in all cases), and so on. The songs were then routed through the UAD 1176, where gain (but no compression) was added before reintroducing the signal back into the DAW; this reduces the amount of digital gain applied before limiting and imparts some more analogue warmth. Finally, any corrective processing, such as notches to remove resonant frequencies, was applied (inserted before the hardware insert so that any problem frequencies were not amplified by the increase in gain), along with any needed multiband compression.

Lastly, the L1 Ultramaximizer limiter plugin was applied, while constantly referring back to the reference material to ensure all the songs were at a comparable volume level.

Composition and How It Relates To The Visual

The compositional aspect of the project is arguably the most important for three reasons.

Firstly, the music has to utilise the lighting system, controller and Leap Motion to their fullest potential.

Secondly, the pieces of music should be constructed so that they are predominantly loop-based and simple to perform using the controller.

Finally, they should be varied enough that the songs contrast one another, which again ties into utilising the lighting system properly.

With this in mind, the compositional approaches for the three songs are as follows:

The Low Ground

The Low Ground begins with a swell of ambient noise which, when connected to the various PWM output pins, creates light that fades in accordance with the audio volume using the live.slider object.

An arpeggiated synthesizer part is then introduced, which involves the random switching on and off of LEDs to contrast with the fluctuation of the LEDs being controlled by the noise.

Towards the middle of the song when it reaches a climax the instrumentation is stripped back to percussive elements. It is at this point the lit up LEDs will reduce in number and certain lights will correspond to different parts of the drum kit.

Then, when the instrumentation is filled out once again, the drum LEDs remain flashing while the rest of the lights switch on and off randomly to complement the rhythm of the bass instrument. Having the LEDs flash at a faster rate emphasises the increased energy in this section of the song. This is referred to by Brown and Kerr (2009) in their research on adaptive audio (in particular, how to effectively convey mood) as an increase in ‘rhythmic density’: more notes or musical content is played to convey a sense of heightened mood or intensity, and I feel the same approach works well with regards to the visual aspect of the piece.

Then, finally, the song ends as it began, with some LEDs following the audio volume of the ambient noise as it eventually fades out.

Almost Always At The End

This song is composed in direct contrast to the first, in that it contains no percussive elements, has a slower tempo and contains elongated, flowing phrases as opposed to the impactful crescendos of the previous piece (such as where the drums come in towards the middle). It is composed in such a way as to showcase the expressiveness of the project’s visual aspect.

The song begins with a single note being struck which will be reflected with a single LED.

As the instrumentation fills out, more lights are introduced so that they follow the audio volume of certain instruments; this is to exhibit the more nuanced capabilities of the lighting system in comparison to the first song. It is interesting to note that the lighting system is so effective at replicating the expressiveness of the cello parts that you can actually see the light fluctuate with the vibrato. This demonstrates how, even using technology and prerecorded material, subtleties can be communicated, providing an intriguing commentary on the relationship between technology and acoustic instruments.

The song then strips back to piano and a single cello, which is reflected as such in the LEDs.

The song then returns to full instrumentation, with the piano playing an arpeggiated part which, as in the first song, is imitated by the lights. Again like the first song, the rate at which the lights flash is increased halfway through the later section to elevate the energy as the song reaches its peak, before winding down as it did in the middle, except this time with a single cello.

Closure

This song is composed to be somewhat of a combination of the previous two approaches, using a mixture of nuanced and impactful lighting.

The song begins with a single kick drum that fades, showcasing how the LEDs can perform in a percussive yet subtle manner.

Alternate lights begin to flash randomly in time with the arpeggiated piano while a reverse piano part, which establishes the chord sequence, fades in. This way, the lights flashing on and off contrast with the ones in between fading in; this is designed to highlight the textural differences within the music.

When all the instruments but the arpeggiated piano drop out, the lights turn off and fade in to accentuate the build-up to the energetic final section.

As in the previous songs, the rate at which the lights switch on and off is doubled to highlight the increased energy in the latter part of the song, while the lights fading in accordance with the reverse piano remain to provide contrast.

Finally, when the song reaches its peak, like the middle section, only the three fading lights remain, dimming in accordance with the ambient noise that slowly fades out.

Wiring

Now that the entire project is run from a single Arduino, the wires connecting the controller’s buttons and those attached to the LEDs will all be housed in the controller itself. To combat having innumerable wires everywhere, speaker wire (two wires housed in one casing) will be used to connect the 7 Arduino pins and 1 ground pin to a breadboard housed in a separate box; using speaker wire cuts the amount of cable leaving the casing in half. From the lighting system box, 7 speaker wires will connect the LEDs to the ground terminal and their corresponding Arduino pins. With regards to the controller’s wiring, each of the 18 buttons will have one terminal wired to ground (using a daisy chain) and one terminal wired to its corresponding digital pin. Then, by engaging the internal pull-up resistors in the Arduino software, no external resistors or power wiring are needed to detect whether a button’s state changes. Here is a wiring diagram to better illustrate what I have described:

Untitled Sketch_bb
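One consequence of wiring the buttons between ground and a pulled-up pin is that the logic is inverted. The following Python sketch simply models that behaviour for illustration; the real project reads the pins via Maxuino rather than any code like this:

```python
# With the internal pull-up enabled, an open (unpressed) button reads
# HIGH (1) because the pin is tied to the supply through the pull-up
# resistor; pressing the button connects the pin to ground, so it
# reads LOW (0).
PRESSED = 0
RELEASED = 1

def button_states(pin_readings):
    """Translate raw digital readings from pull-up inputs into
    pressed/released booleans. The logic is inverted: 0 means
    pressed, 1 means released."""
    return [reading == PRESSED for reading in pin_readings]
```

This inversion is why the Max patch treats a ‘0’ on a digital pin as a press when pull-ups are in use (or re-inverts the stream so that a press arrives as a ‘1’).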

Below are pictures detailing the actual wiring of the controller and the box containing the breadboard which the LEDs are attached to:

WP_20150504_009

WP_20150504_011

WP_20150504_023

Problems, Solutions and A New Feature

In building the new controller and a prototype of the lighting system, a few problems have been encountered.

Firstly, the Maxuino object, which deals with data transmitted both to and from the Arduino units, can only handle data from one Arduino. What this means in terms of the project is that everything has to be run from a single Arduino. Unfortunately, this means that the original plan of having the buttons on the controller light up in accordance with the BPM of the samples is no longer feasible, as there are not enough pins on a single board to cover all of the functionality required.

To show that this was a possibility before this problem was encountered here is a video of the aforementioned feature working:

The second problem encountered is that the current draw from the relay (which deals with switching the mains-powered lamps on and off) is too large for the Arduino to handle, resulting in the Arduino resetting itself after a short time. Having experimented with external power supplies and powering a separate Arduino to no avail, the lighting system will have to consist solely of LEDs.

Again, just to showcase the system somewhat working, here is a test of the lamps shortly before they switch off:

Having upgraded to Max/MSP 7, however, I have been able to utilise the Maxuino external object to its fullest potential. As my previous version of Max did not contain the live.slider object, I was unable to use the Pulse Width Modulation outputs on the Arduino (where a digital signal is switched on and off to approximate analogue values) to make the LEDs fade. This new feature will drastically change and improve my compositional approach: instead of having a binary response to the audio material, the LEDs can now perform in a more nuanced manner and even modulate in accordance with the volume of the audio material, making the lighting system somewhat self-operating.
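The idea behind PWM can be made concrete with a small sketch. This Python illustration (my own, purely for explanation) builds the on/off pattern for one period; the hardware does this thousands of times a second so the eye perceives the average as a steady brightness:

```python
def pwm_waveform(level, period=10):
    """Illustrates Pulse Width Modulation: an analogue-looking level
    (0.0-1.0) is approximated by switching a digital output on for a
    fraction of each period. `period` is just the number of time
    slices used for the illustration."""
    on_slices = round(level * period)
    # The output is on for `on_slices` slices, then off for the rest.
    return [1] * on_slices + [0] * (period - on_slices)
```

The average of the waveform approximates the requested level, which is exactly what lets a digital pin fade an LED smoothly.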

The Controller – Design #2

After the first design was built there were a number of things I felt could be improved upon in terms of design and functionality.

  • The pre-made box itself was unnecessarily deep in comparison to similar professionally made products.
  • As the box was so high, it was uncomfortable to play, because the interface is horizontal rather than facing the user.
  • I felt the single colour of the controller left something to be desired visually.
  • After considering how the box would function in relation to the lights and the Leap Motion controller, there seemed to be room for optimisation in terms of controls.

To address these issues with the second design I decided to implement the following changes:

  • Build my own box and minimise the dimensions as much as possible to give it a lower profile.
  • Angle the interface so that it faces the user and is therefore easier to play.
  • Fabricate a piece of metal for the top to house the buttons.
  • Remove the rotary potentiometers, as the Leap Motion has far more potential for producing numerous streams of control data using a single hand.
  • Removing the rotary potentiometers would also free up a place to actually put the Leap Motion controller (something that was overlooked in the previous design).

With these changes I decided on the definitive functionality of each of the buttons:

  • The top three rows (12 buttons) would be responsible for the triggering of samples.
  • The fourth row of buttons would engage an effect, the parameters of which can be controlled by the Leap Motion controller.
  • The Player One and Player Two buttons located at the bottom would be used to transition into the next composition (by allocating the top 12 buttons a different set of samples).

After implementing all of the above changes this is the final design:

Design 2 Dimensions Top


Design 2 Dimensions Side

Below is a 3D model of what the box will look like:

Untitled-1

The Controller – Design #1

When first designing the controller, I looked at comparable examples such as the Native Instruments Traktor. I chose the Traktor as it has similar elements to my own design: illuminated buttons, rotary potentiometers and trigger pads for playing samples.

I decided to emulate this design in that the buttons for playing the samples should be the most easily accessible, and therefore located closest to the user.

In terms of measurements, I would have to account for the diameter of the mounting holes required for each button (28mm), the depth of the plungers (65mm) and how much room would be needed to house the buttons, wires and Arduino comfortably.

With this in mind this is the first design for my controller:

Design 1 Dimensions Top

As you can see, although the buttons themselves are only 28mm in diameter, extra room is needed to allow for the translucent rings on the top of the buttons (coincidentally, this also helped to make the wiring less problematic, as there was more room).

The box was bought pre-made (its full dimensions being 280x200x180mm), made from a lightweight wood with a natural finish (later, for purely aesthetic reasons, I varnished it).

Here can be seen the completed build of the first design of my controller:

_MG_5529


_MG_5539

Additional Information

The buttons were purchased from Arcade World UK and their technical specifications can be found here:

http://www.arcadeworlduk.com/products/Chrome-Effect-Illuminated-Arcade-Button.html

Both the rotary potentiometers and the box were purchased from eBay, but the listings are no longer accessible.