Machine Learning Assisted Art
Date Created
2016
Project Team Size
Solo Project
My first experiment with Wekinator (by Rebecca Fiebrink) and machine-learning-assisted art making. Wekinator is a helper application that receives inputs, in this case hand gestures captured by a Leap Motion controller, trains models on them, and converts the model outputs into Open Sound Control (OSC) messages. I then feed those OSC messages into the input parameters of Reaktor (a commercial software synthesizer) to control the quality of the sound.
When first learning to connect to Reaktor, I wasn't able to map inputs to individual parameters, so I ended up changing multiple Reaktor inputs all at the same time. Sometimes I felt like I had real authorship; at other times my hand merely acted as a general mood maker. After spending more time with Reaktor, I eventually figured out how to take Wekinator's OSC array and assign its indexes to individual Reaktor parameters. The fix was to assign values in the OSC section of the "connect" panel, rather than using the more generic (and more visible) OSC control tab. As an interface/experience designer, I find it amazingly challenging to support multiple creative workflows, and this project was a good reminder of that challenge.
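The index-to-parameter idea above can be sketched in a few lines: Wekinator emits its model outputs as one OSC array, and each array index gets fanned out to a single synthesizer parameter. This is an illustrative sketch only, not Wekinator's or Reaktor's actual API; the parameter addresses are made up, and a real patch would send each pair over UDP with an OSC library rather than printing it.

```python
# Hypothetical mapping from Wekinator output index to a
# per-parameter OSC address (addresses are invented for illustration).
PARAM_ADDRESSES = {
    0: "/reaktor/filter_cutoff",
    1: "/reaktor/resonance",
    2: "/reaktor/lfo_rate",
}

def fan_out(outputs):
    """Split Wekinator's single output array into one
    (address, value) pair per mapped Reaktor parameter."""
    return [
        (PARAM_ADDRESSES[i], v)
        for i, v in enumerate(outputs)
        if i in PARAM_ADDRESSES
    ]

# Example: a three-output Wekinator model controlling three knobs.
for addr, value in fan_out([0.25, 0.8, 0.5]):
    print(addr, value)
```

The design point is the same one the "connect" panel solves: without an explicit index-to-parameter mapping, every output value moves every input at once.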
This experiment was inspired by the work of Gene Kogan.