Kawasaki Plant is a new strand of performance research exploring the use of AI technologies in performance. It emerged from an initial collaboration between myself and the sound engineer, modular synthesist & audio coder Cherif Hashizume, in which we designed & built a computer program that enabled me to live-edit pop songs with my body movements (a body, see link).
I will use this period to design & build new audio-visual programs, based on machine learning principles primarily developed for advanced artificial intelligence, that are responsive to my body, creating a soundscape & 360° projection (a live stream of video from inside my body) that can fill a space. As well as responding to my body, the audio & visuals will be light-responsive, so they can grow in intensity during a durational performance and thus reach their loudest point at nightfall.
This work focuses on how the live body interacts with technology and how I can use technology to become bigger than my physical body, using bodily variables of heat, sweat, speed & heartbeat to create & control audio-visuals. Multiple sensors will be employed to track and analyse the surroundings, so the program can react not only to the performer but to the environment itself: temperature, humidity, brightness and so on. The project explores advanced applications of artificial intelligence and their non-linear response to external inputs.
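To make the sensing idea concrete, here is a minimal hypothetical sketch of how body and environment readings might be blended into a single audio-visual intensity value. The function name, sensor ranges, and weightings are all my own assumptions for illustration, not the project's actual program; the only constraint taken from the description above is that brightness is inverted, so the output grows as daylight fades and peaks at nightfall.

```python
# Hypothetical sketch: mapping body/environment sensor readings to one
# intensity value. All names, ranges, and weights are illustrative
# assumptions, not the project's real code.

def clamp(x, lo=0.0, hi=1.0):
    """Keep a normalised reading inside [0, 1]."""
    return max(lo, min(hi, x))

def audio_intensity(heart_rate_bpm, skin_temp_c, brightness_lux):
    """Combine body and light readings into a single 0..1 intensity.

    Brightness is inverted so the result grows as daylight fades,
    reaching its maximum at nightfall.
    """
    pulse = clamp((heart_rate_bpm - 60) / 120)       # resting -> sprinting
    heat = clamp((skin_temp_c - 33) / 6)             # cool skin -> overheated
    darkness = clamp(1.0 - brightness_lux / 10_000)  # midday sun -> night
    # Weighted blend; darkness dominates so the piece crescendos at dusk.
    return clamp(0.25 * pulse + 0.15 * heat + 0.60 * darkness)

# e.g. a performer running at dusk:
level = audio_intensity(heart_rate_bpm=150, skin_temp_c=37, brightness_lux=200)
```

In practice each sensor stream would drive several audio and projection parameters rather than one scalar, but the same normalise-then-blend shape applies.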
Aside from these explorations into using tech through my body, I also have a specific performance concept I would like to develop in conjunction with this: for this new work I will be continuously running & falling over an extended duration.
I am interested in power structures and the relationships between machine & creator in relation to historic & contemporary systems of patriarchy & capitalism.
Kawasaki Plant was the first robot to kill a human.
You can see some of my research notes & work-in-progress via my blog.