
My Content/My Space/My Music

Published online by Cambridge University Press:  26 March 2013

Alexandros Kontogeorgakopoulos
Affiliation:
School of Art & Design, Cardiff Metropolitan University, Llandaff Campus, Western Avenue, Cardiff, CF5 2YB, UK E-mail: akontogeorgakopoulos@cardiffmet.ac.uk
Olivia Kotsifa
Affiliation:
School of Art & Design, Cardiff Metropolitan University, Howard Gardens Campus, Howard Gardens, Cardiff, CF24 0SP, UK E-mail: okotsifa@cardiffmet.ac.uk

Abstract

This paper presents an interactive sound design and interactive composition aesthetic. Three projects are presented as case studies and underline the importance of audience involvement: From snow [to space to movement] to sound (2011), Melodic walk (2012) and Points… (2012). All three projects have been designed, implemented and put into practice, and outline the aesthetic vision and approach of the authors. In these works, elements of interactive performance, sound installation and architectural design are blended together in order to deliver a sonic result in which the audience plays a central role. The members of the audience interact directly with the artworks and, as a result, become part of the installation. Moreover, by bringing their own content into the interactive scenario, they also become contributors. The architectural space is an important parameter, as the spatial design is key to audience interaction with the music. Technical and aesthetic aspects are presented alongside the experiences of the audience/participants/contributors.

Type
Articles
Copyright
Copyright © Cambridge University Press 2013

1. Introduction

One of the key challenges of contemporary art, music and design is the incorporation of interactivity and the introduction of computer-mediated technology for the creation of systems that offer participants a deeper engagement (Chadabe 1997; Dixon 2007; Fox and Kemp 2009; Noble 2009). Interactivity in all these disciplines, and particularly in the art world, not only defines a communication between humans and machines but also impacts aspects of cultural life. For instance, a cultural break occurred in the late 1960s when ‘the proscenium between performers and the public was lowered’ (Morse 2003: 16). The audience became an essential part of the discourse of the artwork. The simplest case within media arts is where the audience controls some part of an essentially reactive system. In the field of computer music, more sophisticated systems have been developed over the years that bring the audience even closer to the artwork. Installation works have created conditions in which the audience, the systems that communicate the intention of the artist, and the performers themselves interact in several ways or modalities (Bongers 2000).

In this paper three projects are presented as case studies. All were conceived, designed and implemented by the authors, and together they articulate an interactive sound-design and interactive composition aesthetic whereby audience members experience the sonic artwork as both receivers and contributors. In section two we introduce this approach theoretically. Some brief technical and technological insight is given in the third section. Section four describes the three works that demonstrate the suggested paradigm in a practical way.

2. Audience/Participants/Contributors

In an interactive artwork, the responsiveness of the designed system is critical to its overall effectiveness. Joel Chadabe, a pioneer in interactive music systems, introduced the term ‘interactive composition’ in the early 1980s. Chadabe often uses a conversational metaphor to describe the nature of his work and to convey a strong image of the responsiveness of the system. He describes some of his compositions as ‘Conversing with a clever friend’. The metaphor clearly indicates his approach (Drummond 2009). Most of the interactive compositions presented over the last three decades ask ‘how clever is this friend?’, and the advance of technology has nourished this type of musical exploration. Other questions remain that we address in this paper: ‘how close to us is this friend?’ and ‘how much does the conversation topic intrigue the participants?’. When a person introduces a topic of conversation, it is likely that they will feel a deeper investment in the discussion than they would had someone else introduced the topic. In other words, in order to engage a person fully in a conversation, it is important to find a subject of discussion that is interesting and familiar to the others involved. In interactive art projects where the audience is invited to interact with artworks, such as sound installations, this argument is even more relevant.

Chadabe also suggests that ‘the challenge for computer music composers in the near future will be to use their elite knowledge and skill to create situations in which members of the public without that knowledge and skill can participate meaningfully in a musical process’ (Chadabe 2000: 91). Furthermore, Hahn and Bahn emphasise that one of the challenges facing the creators of interactive works is ‘the incorporation of technology into the “look” and “feel” of the work’ (Hahn and Bahn 2002: 230). In response to these statements, we believe that interactive architectural space can be highly engaging for participants with no prior skills, enabling them to experience the artwork directly in a creative way.

In 2004, M. Krueger published an interesting essay on interactive aesthetics in the field of media arts (Krueger 2004). According to Krueger, system responses should be obvious and understandable. In contrast, Drummond warns that ‘a system consistently providing precise and predictable interpretation of gesture to sound would most likely be perceived as reactive rather than interactive’ (Drummond 2009: 128). The importance of gestural nuance is emphasised by Garnett, who argues that the subtleties of phrasing and articulation are among the most important elements the performer brings to interactive computer music (Garnett 2001). In this paper we embrace the qualities of Steve Reich's ‘Music as a Gradual Process’, in which the compositional process is intended to be perceptible to the audience (Reich 2002). This work also attempts to marry subtle gestural control with very clear, perceptible compositional processes.

The paper suggests an interactive sound design and interactive composition aesthetic based on three inter-related points:

  • The participants deliver simple and original sonic material, which enters into a dialogue with the interactive system prior to or during the experience of the artwork.

  • The participants are in constant engagement with simple and accessible music compositional devices and perceptible musical processes, which define the evolution and the form of the artwork.

  • The participants alter and generate constituents of the artwork by exploring the space.

The argument put forward here is that the image and identity of the participants are important ingredients of the interactive artwork. Regarding our third point, some similarities can be found in the Partial Space installation (Rebelo 2003: 184): ‘Partial Space is an interactive sound installation, presented and in ongoing development by the author since 1998. It consists of an environment in which inhabitants perform a resonant space. By moving in the installation space, the audience triggers sine tones of frequencies that correspond to the natural resonant modes of that architectural, physical space. Sound becomes the medium for experiencing architecture.’ However, we believe that in our approach, by letting the participants offer their own personal sonic content, their role is transformed into that of contributor and therefore their experience is altered. They are no longer merely audience members, or even merely participants – they are co-contributors and creators. To illustrate the suggested approach in practice, the paper refers to three works, all of which blend the elements of space, movement, interactive performance, music and compositional processes in order to deliver a sonic result where the participants play a central role in the creative dramaturgy.

3. Technology

In order to sense the motion of the participants and realise an expressive dialogue between the elements of the presented pieces, a variety of software and hardware tools and off-the-shelf technologies were considered (O'Sullivan and Igoe 2004; Noble 2009). As in our previous works (Kontogeorgakopoulos, Kotsifa and Erichsen 2011), key motion features such as velocity, presence, position, orientation and acceleration were detected and tracked; in this work, we limited the use of sensors to a single camera-based motion-tracking system. Camera-based motion tracking is not uncommon in interactive music, art and design (Levin 2006; Schacher 2010; Wechsler, Weiss and Dowling 2004; Winkler 1997). Among several existing technological solutions (MAX MSP Jitter with the cv.jit library, the Eyesweb platform, the Processing programming language with the openCV library and the openFrameworks framework with the ofxOpenCV addon), the Eyecon system was utilised.
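
As an illustration of the low-level motion features mentioned above, the following sketch (a hypothetical example in Python, not part of the systems actually used) estimates velocity and acceleration from successive positions reported by a camera tracker at an assumed fixed frame rate:

```python
# Hypothetical sketch: estimating velocity and acceleration features from the
# last three tracked 2D positions, assuming a constant camera frame rate.
from dataclasses import dataclass


@dataclass
class MotionFeatures:
    position: tuple
    velocity: tuple
    acceleration: tuple


def motion_features(positions, fps=25.0):
    """Finite-difference estimates from the last three frames of a position history."""
    if len(positions) < 3:
        raise ValueError("need at least three tracked frames")
    dt = 1.0 / fps
    (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
    v_prev = ((x1 - x0) / dt, (y1 - y0) / dt)
    v_curr = ((x2 - x1) / dt, (y2 - y1) / dt)
    accel = ((v_curr[0] - v_prev[0]) / dt, (v_curr[1] - v_prev[1]) / dt)
    return MotionFeatures(position=(x2, y2), velocity=v_curr, acceleration=accel)
```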

Eyecon is a commercially available computer vision system developed specifically for interactive performances by the Palindrome Inter-Media Performance Group. It offers a highly intuitive graphical user interface that allows the user to graphically define lines, zones and fields wherein the conceived interaction will take place. For example, audience members in our projects can touch one of these virtual line segments, which are drawn according to the architecture of the performance space, and trigger or modulate elements of the musical composition.
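
The virtual-line behaviour described above can be approximated with a standard segment-crossing test. The sketch below is only an assumed illustration of this idea in Python; it does not reproduce Eyecon's internal algorithm, and the coordinates and trigger message are hypothetical.

```python
# Assumed illustration of a "virtual line" trigger: fire an event when the
# tracked point moves from one side of a user-drawn segment to the other.
def _side(p, a, b):
    """Signed area test: which side of segment a-b the point p lies on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])


def crossed_line(prev_pos, curr_pos, a, b):
    """True if the motion from prev_pos to curr_pos properly crosses segment a-b."""
    d1, d2 = _side(prev_pos, a, b), _side(curr_pos, a, b)
    d3, d4 = _side(a, prev_pos, curr_pos), _side(b, prev_pos, curr_pos)
    return d1 * d2 < 0 and d3 * d4 < 0


# Hypothetical usage: a vertical line across a doorway triggers a musical event.
if crossed_line((0.2, 0.5), (0.4, 0.5), a=(0.3, 0.0), b=(0.3, 1.0)):
    print("trigger: play section A")
```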

The digital signal conditioning and the gesture analysis were carried out using the MAX MSP visual programming language. Useful digital-signal-processing operations such as smoothing, scaling, averaging, debouncing and edge detection, the mapping of given input ranges to a desired output range, and low-level motion feature extraction were programmed graphically. The computer vision algorithms ran on a separate computer from the main audio workstation running the interactive composition algorithms in MAX MSP. The two computers were linked by ethernet and shared data through the OSC protocol. Some control events were also transmitted via an internal MIDI bus to the Ableton Live digital audio workstation, which performed sound synthesis and processing as well as synchronised and unsynchronised audio-clip triggering.
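
As a rough indication of the signal-conditioning operations listed above, the following Python helpers sketch smoothing, range mapping and debounced edge detection. They are generic illustrations rather than the authors' MAX MSP patches, and all parameter values are assumptions.

```python
# Generic signal-conditioning helpers (assumed values, not the authors' patches):
# one-pole smoothing, range mapping and a debounced rising-edge detector.
class Smoother:
    """Exponential moving average for noisy control data."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None

    def step(self, x):
        self.value = x if self.value is None else self.alpha * x + (1 - self.alpha) * self.value
        return self.value


def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Map x from an input range to a desired output range."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)


class EdgeDetector:
    """Report a rising edge only after `hold` consecutive active frames (debounce)."""
    def __init__(self, hold=3):
        self.hold = hold
        self.count = 0
        self.latched = False

    def step(self, active):
        self.count = self.count + 1 if active else 0
        fired = not self.latched and self.count >= self.hold
        self.latched = self.count >= self.hold
        return fired
```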

4. The Works

4.1. Constrained space: from snow [to space to movement] to sound (2011)

This project is based on real-time interaction and creates a responsive environment for snowboarders that aims to change the experience of play (Kontogeorgakopoulos et al. 2011). It is an interactive design focused mainly on the interaction between the installation and the participants (Bullivant 2006; Fox and Kemp 2009).

The main concept was to use interactive technologies to produce music through the movements of snowboarders in a purpose-built snowpark. The snowboarders were asked to provide music and audio tracks that they could relate to, which were then edited and used as part of the interaction. In so doing, the snowboarders experienced an immediate engagement with the space as their own movements generated and transformed elements of music familiar to them.

4.1.1. The space and movement

Many interactive art/architecture projects focus on interior spaces (Bullivant 2006; Fox and Kemp 2009; Freyer, Noel and Rucki 2008). The beauty of the project, and also its challenge, was the freedom of movement allowed to each user/participant/contributor. Other factors, such as varying weather conditions that affected the performance of the technology, also had to be contended with. Snow was the primary construction material, and although it is often used in vernacular architecture there are very few contemporary architectural projects that use this material (Fung and Debany 2005). Snow was used to reshape the landscape, creating interesting shapes (modules) within a snowpark in a French ski resort.

Collaborating with HO5, a snowpark development company, the authors were able to observe snowboarders live in action while using snowpark modules (Figure 1). After several module designs had been built and tested, specific modules that led to movements in tune with the aesthetics and the functionality of the project were identified.

Figure 1 Modules used in snowparks. Of these, a kicker, a wall, a bonk and a rail were used and placed carefully in the designed space composition, giving choice to the snowboarders.

The snowpark was designed to allow the snowboarders to obtain the speeds required for their performance. Each module had to have a specific angle/slope and dimension, as well as enough surrounding space to allow the snowboarders to move freely. The fact that there was no single-path restriction allowed the snowboarders to create a variety of sounds through different body gestures (Figure 2). They were not restricted by convention, and this new freedom enabled them to explore and develop different new moves and styles.

Figure 2 The module arrangement start and end points and the different paths each snowboarder can choose to take.

This project was an installation composed of the improvised and unchoreographed movements of the snowboarders within the designed space. As the snowboarders moved freely within the installation, they came to realise that the number of tricks they did, the way they performed them and the point where they were within the space all determined the sound they produced. (For further information regarding the interaction design, please refer to Kontogeorgakopoulos et al. 2011.)

4.1.2. The culture and music

Careful research on snowboarding culture, movements and tricks was carried out in order to attain a better understanding and to help create a successful interaction. Examples can be seen on the HO5 and Pirates websites (HO5 Park 2012; Pirate Movie Production 2012). Moreover, the authors spent time talking to the HO5 employees who were part of this culture, observing their gestures and collecting their preferred music. The latter was an important aspect of the project, as, by providing their own music, each individual snowboarder became a contributor as well as a participant, user, and member of the audience.

The music, the sound samples and the sound effects collected displayed cultural characteristics similar to those of street and urban culture. Both are characterised by particular musical genres such as dub, hip-hop, rap and electronica. The familiarity of the snowboarders with the preferred musical genres helped stimulate curiosity and encouraged exploration of the designed space. This is important for the understanding of space which, as suggested by Lefebvre (1992: 294), defines the ‘inhabitant’ as a full participant, a user, a performer of space.

The final interactive composition was based exclusively on the collected material. It may be described as a dynamic musical dialogue, balanced towards experimental popular music forms and a DJ remix aesthetic, and often shifting towards a more sound-based musical idiom. In the first case, electroacoustic techniques transformed and arranged the collected material in a synchronised, beat-driven way, while in the second case the sonic material was organised and treated in a more experimental way, sometimes masking the sound sources and creating a more fluent rhythmic development.

The interactive composition paradigm followed that of performer improvisation and predetermined computer sequences (Winkler 1998). Aspects of indeterminacy were used only to choose specific samples from the collected audio database and to set cue looping and triggering points. Explicit, comprehensible mappings of performance gestures to the parameters of digital signal processing (granulation effects, pitch shifting, delay-based effects, distortion, etc.) and to the interactive composition algorithms were employed regularly in the piece.
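
To make this kind of explicit mapping concrete, the sketch below (an illustrative assumption in Python, not the authors' patch) maps a single motion feature – the snowboarder's speed – to a few effect parameters, and uses a simple random choice, as described above, to select the next clip from the collected material. The clip names and parameter ranges are hypothetical.

```python
# Illustrative (assumed) mapping from one motion feature to effect parameters,
# plus an indeterminate choice of the next audio clip to cue.
import random

AUDIO_CLIPS = ["dub_loop_1.wav", "hiphop_break_2.wav", "vocal_cut_3.wav"]  # hypothetical names


def map_speed_to_effects(speed, max_speed=15.0):
    """Faster riding -> shorter grains, more delay feedback, less pitch drop."""
    t = max(0.0, min(1.0, speed / max_speed))
    return {
        "grain_size_ms": 250 - 200 * t,         # 250 ms down to 50 ms
        "delay_feedback": 0.2 + 0.6 * t,        # 0.2 up to 0.8
        "pitch_shift_semitones": -12 * (1 - t)  # one octave down when almost still
    }


def next_clip():
    """Indeterminate element: pick the next clip from the collected material."""
    return random.choice(AUDIO_CLIPS)
```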

The feedback given by the snowboarders was positive, and during contact time with them, both inside and outside of the snowpark, the authors continuously attempted to improve the installation by experimenting with several different design solutions. The initial idea was to focus on aerial moves, but this proved to be unsatisfactory: such moves did not last long enough for the snowboarders and non-participants to engage properly with the generated material. However, the tricks that involved stalling on modules or carving the slope offered many interaction possibilities that directly influenced the snowboarders’ performance: for example, sustaining a sound using granulation techniques. Most importantly, the snowboarders’ familiarity with the sonic material was a significant factor in the success of the project. They managed to explore it musically using their motion and tried to perform tricks that would create a clear impact on the sonic output. Even during the design phase they actively contributed to decisions regarding the original music content, how to distribute it spatially, and how to make the interactions more straightforward. The snowboarders indicated that they preferred clear, sharp, event-based interactions over smooth ones. An unpredicted issue appeared when testing the installation: the camera was a disturbing factor for younger participants, who became acutely aware of the fact that they were being filmed and did not respond well to the interactive qualities of the snowpark. An improvement for future projects might be to hide the technology and video-tracking system, or to place them intentionally within the artistic-performance dialogue.

4.2. Unconstrained space: Melodic Walk (2012)

This interactive composition was partly created at Bauhaus University in Weimar during the MotionComposer Workshop and Symposium, coordinated by Robert Wechsler in 2012. The MotionComposer project (MotionComposer 2012) works to give people with a wide range of disabilities the opportunity to interact with music through movement. MotionComposer aims to achieve this by developing tailor-made tools, software prototypes, processes, digital musical instruments and interactive compositions. Using motion-tracking technology, a computer music language and digital signal-processing algorithms, a simple interactive music work was conceived and implemented digitally by Alexandros Kontogeorgakopoulos. The design of the interactive system was informed by observations of, and feedback from, a group of people with disabilities who tested the prototype systems daily throughout the workshop. During the last day of the workshop, the participants were asked to perform in and explore the interactive environment creatively, without rehearsal or preparation time.

Based on the three points presented in the beginning of the article, an open-form musical piece was composed and revised several times during the Symposium. The participants initiated the work by individually improvising and playing a short melody or motif of their preference. The composition had as many sections as it did motifs provided by the participants. A common keyboard controller was used as an input device.

Selected musical segments were then triggered by the motion-tracking software via a set of lines through the space defined by the software (Figure 3). These lines were not visible to the participants and defined the timeline of the music. The participants were able to move back and forth in space and perform their melodies and rhythms.

Figure 3 Example of interaction.

The music phrases composed by the participants became the basic cells of the interactive work. Ideas and concepts from the total serialism and minimalism movements influenced the design. The basic cell served as the material from which the harmony, the texture and the variations of the pre-given note sequence were created. The line segments containing the melody could be arranged, spatially organised and composed in a way that defined the musical architecture of the work.

Musical compositional devices such as retrograde, inversion, sequencing, the canon and phasing may naturally be expressed by designing and carefully positioning line segments: for example, two lines containing the same melody translated in space become a canon. If the same lines have slightly different lengths, they produce a phasing effect. By changing the direction of a line we obtain the retrograde of the motif. Simple harmonic content or a drone-type effect can be generated by superimposing and transposing a line that controls a different musical instrument or sound synthesis algorithm on top of itself. By using delay-based effects, such as echo and reverberation, sustained pitch centres were obtained, which give the piece a modal character. The participants were free to find zones on their phrase and undulate within them or locally explore their pitch properties, accompanied by a sustained note derived from the same musical material. The musical phrase or phrases can change according to an event sequencer or by employing simple random processes.
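
A minimal sketch of these devices, using assumed MIDI note numbers rather than the workshop material, shows how each one can be derived from a single participant motif:

```python
# Assumed MIDI note numbers standing in for a participant motif; each function
# mirrors one of the spatial devices described in the text.
motif = [62, 65, 67, 70, 67]              # the basic cell entered on the keyboard


def retrograde(notes):                    # reversing the direction of a line
    return list(reversed(notes))


def inversion(notes, axis=None):          # mirroring intervals around the first note
    axis = notes[0] if axis is None else axis
    return [2 * axis - n for n in notes]


def transpose(notes, interval):           # a superimposed, transposed line -> simple harmony
    return [n + interval for n in notes]


def canon(notes, offset_beats=2):         # the same line translated in space = delayed copy
    return [(beat + offset_beats, n) for beat, n in enumerate(notes)]


def phasing(notes, loops=4, extra=1):     # two loops of slightly different length drift apart
    return notes * loops, (notes + notes[:extra]) * loops
```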

At the same time, electroacoustic techniques helped craft the interactive performance. A version of the composition conceived during the workshop was based on a simple narrative. An eastern folkloric ‘magical’ atmosphere influenced the sound design and the pitch-class set of the music. In this version of the composition, motion features such as the barycentre velocity altered the timbre of the played notes and slightly detuned them, thus intensifying the illusionary aspects of the narrative. The quantity of motion controlled a subtractive sound synthesis algorithm in order to simulate a wind sound. The performers, by slightly swinging the lower parts of their bodies, could expressively control this environmental sound effect. Moreover, the pitched tones were gradually transformed into impulsive grains by enveloping each sound, and hence the performed melody gave the impression of a rainy soundscape. Since only the attack onset of the notes was perceived and a network of delay lines circulated the signals in a feedback topology, the temporal order and relationships became unclear. Finally, the granular texture was further transformed into a sustained inharmonic chord, which eventually faded out.
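
The wind effect described above can be suggested in code as filtered noise whose brightness and level follow the quantity of motion. The sketch below is a crude Python approximation under these assumptions, not the subtractive synthesis patch used in the piece:

```python
# Crude approximation (assumed parameters): quantity of motion drives the
# brightness and level of low-pass filtered noise to suggest wind.
import random


class WindVoice:
    def __init__(self, sample_rate=44100):
        self.sr = sample_rate
        self.lp = 0.0  # one-pole low-pass state

    def render(self, quantity_of_motion, n_samples):
        """More motion -> brighter, louder noise; stillness -> near silence."""
        q = max(0.0, min(1.0, quantity_of_motion))
        cutoff_hz = 100 + 1900 * q
        alpha = min(1.0, 6.2832 * cutoff_hz / self.sr)
        gain = 0.05 + 0.45 * q
        out = []
        for _ in range(n_samples):
            noise = random.uniform(-1.0, 1.0)
            self.lp += alpha * (noise - self.lp)
            out.append(gain * self.lp)
        return out
```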

The space had no physical constraints but was defined by invisible, yet audible, clear straight lines containing the timeline of the melody and the structure of the piece. The shapes created by the lines defined the musical form. By walking and moving in the space, the participants created their own music. Since the initial material was given by the participants and shaped and articulated dynamically by their own body movements, the participants engaged with the work on a very personal level.

The installation was tested initially with a set of familiar melodies chosen by the participants rather than motifs or complete melodies played by them. At the outset, participants explored the interactive space with great curiosity but soon seemed to lose interest, especially once they had discovered and learned how to perform the given melodies correctly. It was observed, on the other hand, that it was more stimulating for them to recall and interact with their own melodic material. Often, however, because they were not trained musicians, their melodic choices did not serve the purpose of intuitive and engaging music performance. A scale mapping would improve the effectiveness of the motivic construction and development, and this is a priority for future improvement; one possible form of such a mapping is sketched below. In contrast to the previous project, the participants preferred smoother and finer control of the interactive environment. Pure note triggering limited the possibilities for interesting sonic articulation; participant movements were in most cases very gentle and delicate. Enhancing the interaction with sonic treatment and sound design gave a basis for more expressive performance. Moreover, the narrative character of the composition strengthened the effectiveness and coherence of the perceived musical structure.
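
One possible form of the scale mapping mentioned above (purely an assumption about a future implementation) is a simple quantiser that snaps any incoming note to the nearest degree of a chosen scale:

```python
# Hypothetical scale mapping: snap any incoming MIDI note to the nearest tone
# of a chosen scale so that untrained melodic input stays musically usable.
D_MINOR = [62, 64, 65, 67, 69, 70, 72]  # one octave of D natural minor


def quantize_to_scale(note, scale=D_MINOR):
    """Return the scale tone (in any nearby octave) closest to the incoming note."""
    candidates = [s + 12 * octave for s in scale for octave in range(-4, 5)]
    return min(candidates, key=lambda c: abs(c - note))


print([quantize_to_scale(n) for n in [61, 63, 66, 71]])  # snapped to nearby scale tones
```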

Clearly this work stands between an interactive installation and an interactive digital musical instrument. Although the system responses were mostly straightforward, they did not limit the expressivity of the participants. As the possibility of aesthetic success was high, audience members were not afraid to explore the system creatively by producing and performing small musical pieces. Because the target users/audience were people with impaired muscle coordination – covering a wide range of disabilities, such as autism and cerebral palsy – the complexity of the system design was kept to a minimum. This work is in ongoing development, aiming to offer the audience/participants/contributors, with or without disabilities, the possibility to express themselves musically and to engage with an aesthetic exploration in which they feel they have ownership over the artistic outcome.

4.3. Semiconstrained space: Points… (2012)

Points… is a work-in-progress for acoustic guitar and live electronics. As an artwork it stands between an interactive music performance and an interactive installation, and it offers the performer, whether a trained guitarist or not, control over the flow of time and, by the same means, over the timbre and the pitch of the music. This constitutes one of the principal axes of the work. The basic sound material derives from the acoustic instrument and instantly becomes a spatial property that the spectator-performer can refer back to later in the performance. As in the previous projects, motion-tracking technology transforms the space into an interactive sonic environment.

The composed system response type, according to Rowe's framework (Rowe 1993), is mainly transformative. The performer/participant continuously improvises, creates and records sound by interacting with an acoustic guitar. At the same time he or she walks through the space following a trajectory indicated on a map. The sonic material provided by the performer during this navigation of the space is altered timbrally and played back according to his or her position in space. Therefore, musical phrases and sonic textures derived from the guitar performance are spatially explored, layered, restructured, fused with the live element and then projected back into the space for further possible sonic treatment by the same simple process.
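
The following sketch illustrates, under stated assumptions, the transformative behaviour just described: previously recorded guitar phrases are tied to points in the room, and the performer's current position determines which phrases are recalled and how strongly they are treated. The stored points, phrase names and radius are hypothetical, and the real system works on audio rather than labels:

```python
# Assumed position-based recall: phrases recorded earlier are anchored to points
# in the room; proximity decides how much of each phrase is mixed back in.
import math

# (point, phrase label) pairs laid down while walking the mapped trajectory
RECORDED = [((0.2, 0.3), "phrase_A"), ((0.7, 0.4), "phrase_B"), ((0.5, 0.8), "phrase_C")]


def recall(position, max_radius=0.35):
    """Return (phrase, amount) for every stored phrase near the performer;
    closer points come back louder and with less timbral alteration."""
    out = []
    for point, phrase in RECORDED:
        d = math.dist(position, point)
        if d <= max_radius:
            out.append((phrase, 1.0 - d / max_radius))
    return out


print(recall((0.6, 0.5)))  # phrase_B strongly, phrase_C faintly
```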

The space in this interactive work has only one, non-physical constraint. The constraint is marked on a map forming the score for the piece (Figure 4). Because of this, the work becomes an open spatiotemporal event evolving from generic directions offered by the map/score. The space becomes the canvas of the piece, and the dynamics of the human body the brush that blends and leaves behind the memories of the past soundscape.

Figure 4 The score/map, where lines and densities of lines and points represent the path one can take within the physical space.

The idea of using a map to navigate within an audio landscape is not novel (Fells 2002; Karandinou, Achtipi and Giamarelos 2009). A map that also takes into account the architectural space and transforms it into a score/map is the Imaginational Map 2 (Mehta 2011: 84). According to the composer/designer of the map/score, ‘the drawing behaves like an architectural plan leading to the construction of a musical architecture … For the interpreter, however, the metaphor becomes localized into a map in which many degrees of freedom between improvisation and composition are explored in real time through a landscape of shifting sonic and navigational choices.’ This map helps musicians to navigate through a space as they move.

In the present project, the designed map does not restrict the user; on the contrary, the user is free to move in any direction, ignoring the map if he or she wishes to. However, the way information is embedded in the map encourages the participant to take sonically interesting paths. As mentioned before, the points on the map are directly related to the physical space the user occupies; one might say it is an architectural plan of the space. The points and some important lines are drawn on the floor in a fashion similar to the stage set design of Dogville by Lars von Trier (Trier 2003), with white chalk lines on the floor representing an architectural plan.

Clearly, if we consider as a performance the dramaturgy and the sonic outcome of a set of pre-written instructions on a score, then the work under consideration belongs to this category of performance. The score is the map. But what about the case where the score becomes a physical path, where the physical constraints of the space direct the performers to follow a certain trajectory? It is obvious that the musical trajectory would remain the same. What would change is the substance that imposes the musical structure: in the first case it is immaterial, taking the form of written commands; in the second case it is material, taking the form of unwritten constraints.

The responsiveness of the designed environment is precise, and the participants who contributed during the development stage expressed enthusiasm about the effectiveness of the interaction. They also voiced concerns regarding the control of the dynamics. The first author is currently exploring possibilities for further articulation of the generated sonic material beyond timbral and structural modifications.

5. Summary

This article suggests an aesthetic for the design of interactive artworks. In order to articulate, demonstrate and support the views and proposals of the paper more effectively, three projects designed by the authors were presented and analysed. The projects bring together elements of interactive performance, sound installation and architectural design. Members of the public and spectators have a significant role in these works, not only as participants but also as contributors, since they deliver sonic material that enters into a dialogue with the interactive systems prior to and during the experience of the artwork.

We believe that this type of engagement is important for the success and the effectiveness of interactive art and music composition. Further research by other researchers and artists is therefore encouraged. More specifically, it is important to give consideration to the nature, content and context of the material provided by the participants. As mentioned before, the presented projects constitute part of a larger ongoing work and further improvement is planned for the near future.

So far the participants’ reflection on the project has been positive. For example, the other researchers and spectators on the MotionComposer project quickly embraced the key idea of participant content contribution. Similarly, the snowboarders engaged much more with the interactive snowpark when they had input their own sonic content. Not only did they enjoy affecting the musical composition through their movement, but they also supported the idea of the real-time interaction of movement and sound in snowboarding and similar sports. Further discussions are taking place with HO5, and ideas for similar projects have been mentioned.

References

Bongers, B. 2000. Physical Interfaces in the Electronic Arts: Interaction Theory and Interfacing Techniques for Real-Time Performance. In M.M. Wanderley and M. Battier (eds) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.
Bullivant, L. 2006. Responsive Environments: Architecture, Art and Design. London: V&A.
Chadabe, J. 1997. Electric Sound: The Past and Promise of Electronic Music. Upper Saddle River, NJ: Prentice Hall.
Chadabe, J. 2000. Remarks on Computer Music Culture. Computer Music Journal 24(4): 9–11.
Dixon, S. 2007. Digital Performance. Cambridge, MA: The MIT Press.
Drummond, J. 2009. Understanding Interactive Systems. Organised Sound 14(2): 124–133.
Fells, N. 2002. On Space, Listening and Interaction: Words on the Streets are These and Still Life. Organised Sound 7(3): 287–294.
Fox, M., Kemp, M. 2009. Interactive Architecture. Princeton, NJ: Princeton Architectural Press.
Freyer, C., Noel, S., Rucki, E. (Troika). 2008. Digital by Design: Crafting Technology for Products and Environments. London: Thames & Hudson.
Fung, L., Debany, J. 2005. The Snow Show. London: Thames & Hudson.
Garnett, G. 2001. The Aesthetics of Interactive Computer Music. Computer Music Journal 25(1): 21–33.
Hahn, T., Bahn, C. 2002. Pikapika: The Collaborative Composition of an Interactive Sonic Character. Organised Sound 7(3): 229–238.
Karandinou, A., Achtipi, C., Giamarelos, S. 2009. Athens by Sound. Venice and Athens: Futura and the Greek Ministry of Culture.
Kontogeorgakopoulos, A., Kotsifa, O., Erichsen, M. 2011. From Snow to Sound to Space to Music. Proceedings of the 2011 Sound and Music Computing Conference, Padova.
Krueger, M. W. 2004. Toward Interactive Aesthetics. Ars Electronica Katalog.
Lefebvre, H. 1992. The Production of Space. Trans. D. Nicholson-Smith. Oxford: Blackwell Publishers.
Levin, G. 2006. Computer Vision for Artists and Designers: Pedagogic Tools and Techniques for Novice Programmers. Journal of Artificial Intelligence and Society 20(4): 462–482.
Mehta, R. 2011. Imaginational Map 2 (R1, R3, R5). Leonardo Music Journal 21: 84–85.
Morse, M. 2003. The Poetics of Interactivity. In J. Malloy (ed.) Women, Art and Technology. Cambridge, MA: The MIT Press.
Noble, J. 2009. Programming Interactivity. Sebastopol, CA: O'Reilly.
O'Sullivan, D., Igoe, T. 2004. Physical Computing: Sensing and Controlling the Physical World with Computers. Boston, MA: Thomson Course Technology.
Rebelo, P. 2003. Performing Space. Organised Sound 8(2): 181–186.
Reich, S. 2002. Writings on Music 1965–2000. Oxford: Oxford University Press.
Rowe, R. 1993. Interactive Music Systems: Machine Listening and Composing. Cambridge, MA: The MIT Press.
Schacher, J. 2010. Motion to Gesture to Sound: Mapping for Interactive Dance. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, 250–254.
Trier, L. von. 2003. Dogville. Icon Home Entertainment.
Wechsler, R., Weiss, F., Dowling, P. 2004. Eyecon: A Motion Sensing Tool for Creating Interactive Dance, Music and Video Projections. Proceedings of the Society for the Study of Artificial Intelligence and the Simulation of Behaviour. Leeds: SSAISB.
Winkler, T. 1997. Creating Interactive Dance with the Very Nervous System. Proceedings of the 1997 International Computer Music Conference, Thessaloniki.
Winkler, T. 1998. Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, MA: The MIT Press.