Sunday 30 November 2008

Crucial point (part 1)

I am now at the point in my project where I have to decide what sounds I am going to use. How I am going to use them was more or less figured out during the sequencer-programming stage (and still is being figured out). This point is crucial because my decisions now will bring the composition alive, or not. Bad quality and a bad combination of sounds will of course produce a bad composition. In addition to that, I have to consider how the sound effects in Ableton Live will change the final outcome. It is very similar to cooking a Christmas turkey dinner with the stuffing. If the turkey is not good quality, whatever you do with the rest of the dish, the result will be mediocre. On the other hand, if the turkey is free range and good quality, fresh from a butcher, then you are more likely to have an unforgettable Christmas dinner. There are two ways to do these tests. The first option, which I don't have, is to test each sound at the rehearsals to see (hear) whether it is appropriate for this composition. In other words, buy ten turkeys from different places and put them in the oven to find out which one tastes best, so that you can buy it for the Christmas dinner. Rehearsals in my case are tricky because the performers are doing this for free, so they are not so happy to rehearse just so I can figure out what type of sounds I will use. The other part that is also closely related to the final outcome of the composition is the Christmas stuffing (the sound effects). Ableton Live has a wide range of sound effects that I can use, but which of those will taste the best?
In addition to all this I have to use eight different sound files (two for each performer) whose qualities and character will make a pleasant mixture. Again, like a four-course meal with two ingredients for each course. The reason for this decision to have eight different sound files, two for each performer, is that the composition is for an 8-speaker surround set-up. Each performer will have a pair of speakers, with the ability to pan and choose the sound file on the fly. The only sounds that I know so far are the alto saxophone and the electric bass.
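
To keep the layout straight in my head, here is a small sketch (in Python, just for illustration — the actual routing will happen in Max/MSP and Live) of how the eight speakers pair up with the four performers. The performer names are placeholders, not the final line-up.

```python
# Hypothetical sketch of the 8-speaker set-up: four performers, each
# with two sound files and a dedicated pair of speakers to pan between.
# Performer names below are placeholders for illustration only.

performers = ["sax", "bass", "laptop1", "laptop2"]

def speaker_pair(performer_index):
    """Speakers are numbered 1-8; performer i gets the pair (2i+1, 2i+2)."""
    return (2 * performer_index + 1, 2 * performer_index + 2)

for i, name in enumerate(performers):
    print(name, speaker_pair(i))
# e.g. sax gets speakers (1, 2), bass gets (3, 4), and so on
```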

My approach here will be a combination of experiments with sounds. I am aiming for the fastest and most efficient way to choose between them. I know that I will have bass and saxophone, so with those two sounds as a given I will start choosing my sounds around that area. I am approaching the sounds as a saxophone quartet, with low, middle and high frequency ranges. I will update this as soon as I decide what sounds I am going to use.

Wednesday 26 November 2008

After the meeting at VRU 25/11/2008

Today’s meeting was very helpful. I was able to formulate my aims for this composition-project in a clearer and more directed way. As I’ve mentioned previously in my blog, the main purpose of this project is to involve the audience in a way that lets them transform and interact with the music in a live performance. Beyond that is the idea, and the fact, that audiences can influence the performance consciously and subconsciously. In the first section of my programming with Max/MSP I was using cameras to track colour from the audience, and therefore a conscious interaction. I have decided to drop this part completely, since the other two parts are quite complicated. Another reason for dropping it is that, since I was going to use cameras, I had to do some tests to see how they would behave in a performance situation. How far away can the camera track colour? How do the lights in the performance affect the colour? Am I going to use LEDs or the audience's mobile phones? All these are questions that will produce even more questions once they are answered. And even if the tests were done, the final outcome of section one was going to be unstable and would negatively influence the other two sections. It occurred to me, after the meeting, that what I was trying to do in section one with the cameras was to track the position of a particular colour within the frame of the camera. This could all work in a much simpler manner: I will track the amount of each colour (red, green, blue) within the frame of the camera. I will still need to do some tests if I am going to include this, but this way it might work more reliably.
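
The simpler idea boils down to this: instead of asking where a colour is in the frame, only ask how much of it there is. Here is a rough Python sketch of that measurement, assuming a hypothetical frame given as a list of (r, g, b) pixels — in the real patch this would be done on the camera matrix inside Max/MSP.

```python
# Sketch of the simpler section-one idea: measure HOW MUCH of each
# colour the camera frame contains, not WHERE the colour is.
# The frame here is a hypothetical list of 8-bit (r, g, b) pixels.

def colour_amounts(frame):
    """Return the average red, green and blue levels (0.0 to 1.0) of a frame."""
    totals = [0, 0, 0]
    for r, g, b in frame:
        totals[0] += r
        totals[1] += g
        totals[2] += b
    # normalise each channel by the pixel count and the 8-bit maximum
    return [t / (len(frame) * 255) for t in totals]

# A frame where half the pixels are pure red and half are black:
frame = [(255, 0, 0)] * 50 + [(0, 0, 0)] * 50
print(colour_amounts(frame))  # → [0.5, 0.0, 0.0]
```

Three steady numbers like these should be much more stable to map onto the music than jumping position coordinates, even with twenty light sources in the audience.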

The next video shows how the, for now abandoned, idea of colour tracking works. It works fine with only one colour received from the camera. When there are two colours, in this case two red sources, it automatically connects the lines from the outer X to the outer Y. Watch the cursors in the video on the right. Imagine this with an audience of 20 people; the incoming data would be very unstable.





Knowing that the first section is not included, I will have to make a few modifications to the second part for it to be completed as a compositional piece. I will add another Wii remote as the master Wii, or in musical terms a conductor. I will conduct the other four Wiis with the master Wii. The button on the master Wii will act as a gate for the effects in Ableton Live. I am working on this modification and I don't know the actual outcome of this process yet. Hopefully I will be able to control the final musical outcome from the master Wii. I am still working on the third section. I am waiting for the bass player and the saxophone player to let me know if they can do the performance.
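
The "conductor" idea is simple enough to sketch, even though the real version will live in the Max/MSP patch. Assuming a master button state and the four performers' control values as inputs, the gate would work something like this (all names here are hypothetical):

```python
# Minimal sketch of the master-Wii "conductor" gate: the performer
# Wiis only reach the Ableton Live effects while the master Wii's
# button is held down. Names and values are illustrative only.

def gate(master_button_down, performer_values):
    """Pass the four performers' control values through only while the gate is open."""
    if master_button_down:
        return performer_values            # forwarded to the Live effects
    return [0.0] * len(performer_values)   # gate closed: effects receive zero

print(gate(True,  [0.2, 0.8, 0.5, 0.1]))  # → [0.2, 0.8, 0.5, 0.1]
print(gate(False, [0.2, 0.8, 0.5, 0.1]))  # → [0.0, 0.0, 0.0, 0.0]
```

Whether the gate should cut the values to zero or freeze them at their last level is exactly the kind of musical decision I still have to make.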

Monday 24 November 2008

Three sections for the project

I have decided that I will need six people to participate in this project, apart from the audience and me. Four of them will perform with laptops and Wiis, and the other two will perform on traditional instruments, probably bass and sax. So far I have found two people who are willing to collaborate with me.

Project construction

I mentioned earlier that I had to break my programming down into three sections. Thanks to Jonathan Green, who has already made a few patches in Max/MSP for colour tracking (similar to my section-one programming), I decided it was better to work on the second section to see how I was doing. In this second section I have done more or less half of the work I need to do. I have connected the Wii with Max and routed different buttons to the sequencer. Using ReWire in Max I have connected it to Ableton Live, which will host all the audio effects for real-time processing. Have a look at the video.
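
The button routing is conceptually just a lookup from a button press to a sequencer action. In the patch this is Max/MSP routing objects feeding the sequencer; the Python sketch below only illustrates the idea, and the button names and actions are hypothetical, not my final mapping:

```python
# Illustrative sketch of routing Wii buttons to sequencer actions.
# In the actual project this happens inside the Max/MSP patch;
# the mapping below is a made-up example of the principle.

BUTTON_MAP = {
    "A":     "start_sequence",
    "B":     "stop_sequence",
    "Plus":  "next_sound_file",
    "Minus": "previous_sound_file",
}

def route_button(button):
    """Translate an incoming Wii button press into a sequencer message."""
    return BUTTON_MAP.get(button, "ignore")

print(route_button("A"))     # → start_sequence
print(route_button("Home"))  # → ignore
```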



Sunday 9 November 2008

Putting things together 9/11/2008

I have decided that it will be easier and more effective if I break my project into three parts, so that I can manage each one separately. It will also make testing the project much easier.

So I have three sections. The first is tracking colour via the camera or cameras. This section will deal with the audience as a group. The idea here is to use their mobile phones as a source of light, so that I can track them through the cameras into Max/MSP. Another easy way to get light information from the audience is to give them small coloured LEDs with a small battery attached. It will also be visually interesting.






From there the audience will realise that their actions are having an effect on the music. Another part of the audience-as-a-group idea will be the use of Bluetooth to send pictures. I am thinking of having a projector showing the images that they send, which will also manipulate and affect the music.

The second is the data from the Wii remotes. I will have four Wii remotes, each one acting like an instrument. Yesterday, 8/11/2008, I finished a few handmade infrared devices to use with the infrared camera that is attached to the Wii remote. So far I am only forming my ideas on how it will work.

The third section deals with the live performers and the stage set-up. I haven't done anything for this section yet. It will take shape eventually, along with the other sections.

Wednesday 5 November 2008

Something solid 05/11/08

Trying to answer the questions I mentioned earlier in my blog, I ended up with more questions, and things started to drift away. At that point I knew that I had to arrange a meeting with Greg to put things in order. After a conversation with Greg (28-10-2008) about my new ideas and the questions I had in mind, I realised that it was not clear to me what I was going to do for my collaborative project. So, to make my project clear, I have to be able to explain it.


In my collaborative project I am focusing on how the audience can influence and transform a live musical performance.

WHY?

It is fairly accepted, at least by me, that the audience is able to contribute to the final musical outcome. If there is no audience then music cannot exist. There is a connection between the audience and the performer(s), as there is between the composer and the performer(s). To make things clearer, here is a small drawing of how they connect with each other.




Every element can affect the others in any direction.
To create what we can label as a live musical outcome, we need all three of the elements above. All of these factors can influence and transform the music in a live performance.


HOW?

I have to find ways to engage the audience so that they participate in the music. I mentioned earlier the conscious and the unconscious influence of the audience. In most of the Western musical tradition the audience contributes to performances/compositions from the unconscious point of view. Their influence on the performers is mostly psychological. They cannot control this mental influence, since they don't know how the performer is going to react.

From the conscious point of view the audience is very restricted. There are few concerts in which the audience is allowed to interact with the music. The next videos are from the concert Rock Music, Rock Art, held at Birmingham Conservatoire on 24 October 2008. During that concert the musicians wanted the audience to clap their hands as part of the concert. At the end of the concert the musicians from the Uganda Dance Academy asked the audience to participate by dancing with the rest of the group. Here are two videos taken with my mobile phone.






Furthermore, in the Western tradition it is practically forbidden to influence the performance/composition. In concerts they give you the programme notes on stiff paper so that you don't accidentally make any sound during the performance. They also supply you with throat drops so that you will not disturb the concert and the other listeners. Moreover, you feel so uncomfortable because of the unwritten rules that you wait for the section to finish, you cough, and then you prepare yourself mentally to sit still for the next 22 minutes.

So for my collaborative project I want the audience to be able to choose to participate consciously or unconsciously. For the hardware I am planning to use mobile phones, the Wii remote and other portable devices as part of the conscious participation of the audience. Mobile phones, because now almost everyone has one and they will feel comfortable enough to use it during a performance. I am also assuming that they already know how to use Bluetooth or take a picture. The Wii remote, because it is a small wireless device from which I can get lots of data. For the software I will use Max/MSP and Ableton Live.