Generative Music Systems for Live Performance

Andrew R. Brown, Toby Gifford, and Rene Wooller
Queensland University of Technology, Brisbane, Australia.
{a.brown, t.gifford, r.wooller}@qut.edu.au

Music improvisation continues to be an intriguing area for computational creativity. In this paper we outline two software systems designed for live music performance: the LEMu (live electronic music) system and the JamBot (an improvisatory accompaniment agent). Both systems analyse human-created music, generate complementary new music, are designed for interactive use in live performance, and have been tested in numerous live settings. These systems have some degree of creative autonomy; however, we are especially interested in the creative potential of the systems interacting with human performers.

The LEMu software generates transitional material between scores provided in MIDI format, using an evolutionary approach to generate materials that provide an appropriate path between musical targets [1]. This musical morphing process is controlled during performance by an interactive nodal graph that allows the performer to select the morphing source and target as well as the transition speed and parameters. Implementations include the MorphTable [2], where users manipulate blocks on a large surface to control musical morphing transitions. This design suits social interaction and is particularly well suited to inexperienced users.

The JamBot [3] listens to an audio stream and plays along. It consists of rhythmic and harmonic analysis algorithms that build a dynamic model of the music being performed. This model holds multiple probable representations at one time in the Chimera Architecture [4], which can be interpreted in various ways by a generative music algorithm that adds accompaniment in real time.
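The published morphing algorithm is described in [1]; as a rough illustrative sketch only (not the LEMu implementation), an evolutionary morph between two pitch sequences can be framed as a population search whose fitness function rewards candidates whose resemblance to the source and target matches the current morph position. All names and parameters below are hypothetical.

```python
import random

def similarity(candidate, reference):
    """Fraction of positions where pitches match (toy fitness measure)."""
    return sum(a == b for a, b in zip(candidate, reference)) / len(reference)

def morph_step(population, source, target, alpha, n_keep=4, n_mutants=16):
    """One evolutionary step: favour candidates whose resemblance to the
    source vs. the target matches the morph position alpha (0 = source, 1 = target)."""
    def fitness(c):
        # An ideal candidate resembles the source by (1 - alpha) and the target by alpha.
        return (-abs(similarity(c, source) - (1 - alpha))
                - abs(similarity(c, target) - alpha))
    survivors = sorted(population, key=fitness, reverse=True)[:n_keep]
    mutants = []
    for _ in range(n_mutants):
        parent = list(random.choice(survivors))
        i = random.randrange(len(parent))
        parent[i] = random.choice([source[i], target[i]])  # mutate toward either score
        mutants.append(parent)
    return survivors + mutants

# Toy usage: morph between two 8-note MIDI pitch sequences.
source = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale
target = [57, 59, 60, 62, 64, 65, 67, 69]  # A natural minor scale
population = [list(source) for _ in range(20)]
for step in range(50):
    alpha = step / 49  # sweep the morph position from source to target
    population = morph_step(population, source, target, alpha)
```

In performance, each intermediate population member would be rendered as MIDI, so the audible material drifts from the source score toward the target at a rate set by how quickly alpha advances.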
These systems have been designed using a research method we have come to call Generation in Context, which relies on iterations of aesthetic reflection on the generated outcomes to inform the process of enquiry [5].

References

1. Brown, A.R., Wooller, R., and Miranda, E.R., Interactive Evolutionary Composition: Musical morphing as a compositional strategy, in A-Life for Music: Music and Computer Models of Living Systems, E.R. Miranda, Editor. 2010, A-R Editions: Middleton, WI, (in press).
2. Brown, A.R., Wooller, R., and Thomas, K., The Morph Table: A collaborative interface for musical interaction, in Trans: Boundaries/Permeability/Reification. Australasian Computer Music Conference, A. Riddel and A. Thorogood, Editors. 2007, ACMA: Canberra. p. 34–39.
3. Gifford, T., JamBot. 2008: Brisbane. http://www.dr.offig.com/
4. Gifford, T. and Brown, A.R., Do Androids Dream of Electric Chimera?, in Improvise: The Australasian Computer Music Conference 2009, A. Sorensen and P. McIlwain, Editors. 2009, ACMA: Brisbane.
5. Brown, A.R., Gifford, T., Narmour, E., and Davidson, R., Generation in Context: An Exploratory Method for Musical Enquiry, in The Second International Conference on Music Communication Science, C. Stevens, et al., Editors. 2009, HCSNet: Sydney. p. 7–10.