Kawasaki Plant was a live performance exploring the use of AI technologies on stage. It emerged from an initial collaboration between myself and sound engineer, modular synthesist & audio coder Cherif Hashizume, in which we worked to design & build a computer program that enabled me to live-edit pop songs with my body movements (a body, see link).

 

The work comprised new audio-visual programs, based on machine learning principles primarily developed for advanced artificial intelligence, that were responsive to my body, creating a soundscape & 360° projection (a live-stream of video from inside my body) that filled the space.

 

This work focused on how the live body interacts with technology, and on how I can use technology to become bigger than my physical body, using bodily variables of heat, sweat, speed & heartbeat to create & control audio-visuals. With multiple sensors tracking and analysing the surroundings, the program could react not only to the performer but to the environment itself: temperature, humidity, brightness and so on. The project explores advanced applications of artificial intelligence and their non-linear responses to external inputs.

 

The work comprised myself & a group of performers continuously running & falling over for an extended duration, ranging from 2 to 5 hours.

 

I am interested in power structures, and in the relationship between machine & creator as it relates to historic & contemporary systems of patriarchy & capitalism.

 

The work takes its title from the Kawasaki plant where, in one of the first incidents of its kind, an industrial robot killed a human.

You can see some of my research notes & work-in-progress via my blog.