NIME07 Concert 1


Frederick Loewe Theater, NYU

Disparate Bodies – Pedro Rebelo, Alain Renaud, Tom Davis
Disparate Bodies is a network performance that explores multi-modal remote presence. The performance happens simultaneously in three sites: Belfast, New York, and Stanford, California. The stage performance in New York features a laptop musician and two Remote.bots, robotic entities that host the physical and musical gestures performed by the remote participants in the other locations. They consist of reflective elements that move according to the analysis of each audio stream and project glimpses of 3D-rendered imagery around the performance space. The performance is based on the notion of performance entities as reflected by telepresence, robotics, and sound systems. As such, each performer (local and remote) has a specific sound diffusion setup and a chosen 3D avatar consisting of abstract representations of movement and gesture. The performance is improvised with reference to strategies that explore the relationship between sound and movement. It uses high-quality audio streaming software developed at CCRMA and gesture, robotic, and 3D rendering technologies developed at SARC. Instrumentation: Saxophones (Franziska Schroeder), Mousetrap (Mark Applebaum), Piano/Computer (Pedro Rebelo), Remote.bot (Tom Davis), and Frequencyliator (Alain Renaud).
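The note describes reflectors that move according to an analysis of each incoming audio stream. As a rough illustration of that idea (not the actual Remote.bot software; the block size, angle range, and RMS mapping are all assumptions), an envelope follower might drive an actuator like this:

```python
import math

def rms(block):
    """Root-mean-square amplitude of one audio block (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in block) / len(block))

def block_to_angle(block, max_angle=90.0):
    """Map a block's RMS amplitude to a reflector deflection in degrees."""
    return max_angle * min(1.0, rms(block))

# A quiet remote stream barely moves the reflector; a loud one swings it wide.
quiet = [0.05 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(512)]
loud = [0.9 * math.sin(2 * math.pi * 440 * n / 44100) for n in range(512)]
print(block_to_angle(quiet), block_to_angle(loud))
```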
EyeMusic v1.0 – Anthony Hornof, Troy Rogers, Tim Halverson
EyeMusic is a project that explores how eye movements can be sonified, using sound to convey where a person is looking, and how this sonification can be used in real time to create music. An eye-tracking device (the LC Technologies Eyegaze Communication System, http://www.eyegaze.com/) reports where the performer is looking on the computer screen, as well as other parameters pertaining to the status of the eyes. The eye tracker reports these data in real time to a computer program (written in Max/MSP/Jitter), which generates and modifies sounds and images based on them. While the eye is, in ordinary human usage, an organ of perception, EyeMusic allows it to be a manipulator as well. EyeMusic creates an unusual feedback loop: the performer may be motivated to look at a physical location either to process it visually (the usual motivation for an eye movement) or to create a sound (a new motivation). These two motivations can work together to achieve perceptual-motor harmony and to create music along the way. They can also come into conflict, as when the gaze must move close to an object without looking directly at it to set up a specific sonic or visual effect. Through it all, EyeMusic explores how the eyes can be used to directly perform a musical composition.
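The gaze-to-sound pipeline described above can be sketched as a simple mapping from tracker data to synthesis parameters. This is an illustrative stand-in for the Max/MSP/Jitter patch, not the actual EyeMusic mapping; the screen resolution, pitch range, and fixation handling are assumptions:

```python
SCREEN_W, SCREEN_H = 1280, 1024  # assumed eye-tracker screen resolution

def gaze_to_params(x, y, fixating):
    """Map one gaze sample to (frequency_hz, amplitude).

    x, y     -- gaze coordinates in pixels, origin at top-left
    fixating -- True while the tracker reports a stable fixation
    """
    # Horizontal position chooses pitch across a four-octave range (C2-C6).
    midi = 36 + 48 * (x / SCREEN_W)
    freq = 440.0 * 2 ** ((midi - 69) / 12)
    # Vertical position sets loudness; fixations sound, saccades are silent.
    amp = (1 - y / SCREEN_H) if fixating else 0.0
    return freq, amp

print(gaze_to_params(640, 512, True))   # mid-screen fixation sounds
print(gaze_to_params(640, 512, False))  # saccade in flight: silent
```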
“Let’s Just See What Happens” for Long Tube and gestural interface – Brenda Hutchinson


Ménagerie Imaginaire – Zach Settel, Mike Wozniewski, Jeremy Cooperstock
http://www.electrocd.com/bio.e/settel_za.html
http://www.cim.mcgill.ca/~mikewoz/
http://www.cim.mcgill.ca/~jer/
Cyberdidj Australis – Garth Paine, Michael Atherton
For a recently designed telescopic didjeridu, Capybara and Wacom interface. The work explores the shifting fundamentals and overtones of the didjeridu and the possibilities of interactive synthesis. Traditional playing techniques are extended and morphed by and in response to electronic elaboration. The performers explore shifting dronal material, vocalisations, and additive rhythmic patterns to create dramatic shifts in timbre, density and pulse.
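As a hypothetical illustration of the tablet-driven interactive synthesis described above (the actual work runs on Capybara with a Wacom interface; the parameter names and ranges here are invented):

```python
def tablet_to_synth(x, y, pressure):
    """Map normalized Wacom pen data (all in [0, 1]) to synthesis controls.

    The controls here (grain density, filter cutoff, drone detune) are
    illustrative stand-ins for parameters one might expose in a patch.
    """
    return {
        "grain_density_hz": 5.0 + 95.0 * x,          # sparser to denser grains
        "filter_cutoff_hz": 200.0 * (2 ** (5 * y)),  # 200 Hz up to 6.4 kHz
        "drone_detune_cents": 50.0 * pressure,       # harder press, wider detune
    }

print(tablet_to_synth(0.5, 0.5, 0.8))
```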
“NYZ” by Zanana – Monique Buzzarté, Kristin Norderval
http://www.zanana.org/


KARMA/live – Kurt Hentschlager
http://www.hentschlager.info/
KARMA is a “living” environment, a procedurally changing audiovisual installation. KARMA follows a non-linear progression in which moments of commotion are followed by periods of meditative peace. The installation comes alive via suspended humanoid 3D figures, often seemingly unwell, trembling and oscillating; their movements emanate a drone-like soundscape. The 3D characters are presented as puppets on strings, instilling them with a familiar yet ambiguous sense of human life and resulting in an indefinite dance of the almost living dead. Karma is, incidentally, the name of the physics simulation unit within Unreal Tournament, a multiplayer computer game. In UT and similar 3D real-time engines, Karma describes the simulation of physical laws such as gravity and kinetic forces. In KARMA/cell, the motions and actions of the 3D characters are passed to additional sound software, which composes a dynamic soundtrack on the fly. Each character is a discrete musical instrument and becomes, through its “motions and emotions,” part of a symphonic, multilayered body of sound. Both the real-time synthesis of the characters’ motions and their sounds build, within the scripted frame defined by the artist, an endlessly changing variety of emotional expressions.
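The note describes characters whose simulated motions are turned into per-character sound. A minimal sketch of that idea, assuming a mapping from per-frame motion to frequency and amplitude (the class, ranges, and mock motion are illustrative, not Hentschlager's system):

```python
import math

class CharacterVoice:
    """One 3D figure treated as a discrete instrument driven by its motion."""

    def __init__(self, base_freq):
        self.base_freq = base_freq
        self.prev_pos = (0.0, 0.0, 0.0)

    def update(self, pos, dt):
        """Return (frequency_hz, amplitude) for this frame from the motion."""
        speed = math.dist(pos, self.prev_pos) / dt
        self.prev_pos = pos
        amp = min(1.0, speed / 10.0)                 # faster motion, louder drone
        freq = self.base_freq * (1 + 0.02 * speed)   # slight pitch rise with speed
        return freq, amp

voices = [CharacterVoice(55.0 * (i + 1)) for i in range(3)]  # three figures
for t in range(3):  # three mock animation frames at 30 fps
    for i, voice in enumerate(voices):
        pos = (math.sin(t * 0.5 + i), 0.0, 0.0)  # mock trembling motion
        print(i, voice.update(pos, dt=1 / 30))
```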
