Today we worked with our own DIY devices rather than the ‘pre-fab’ sensors. We spent a good portion of the time workshopping the shirt with the actuators and then moved on to the other soft circuit garments.
We covered a variety of activities – live coding the buzzers, buzzers triggered by a handmade sensor, and using data from the previous day’s work with the Polar and OMSignal shirts to create a buzzing pattern. All of these used the vibe motors mounted on each sleeve.
The live coding of the actuators worked really well. I created simple keypresses in Processing to send OSC to the x-OSC microcontroller and turn the buzzers on and off. We tried various scores, such as switching between right and left sides or limbs, or stopping movement when sent a short buzz. The dancers responded directly to the coding, effectively becoming its output.
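For anyone curious about the mapping, the logic was roughly this. Here is a Python sketch for illustration only – not our actual Processing code – and the key bindings and OSC addresses like "/output/1" are made-up examples, since the real x-OSC address space depends on how the board is configured:

```python
# Illustrative only: how a keypress becomes an on/off message for a buzzer.
# 'l' and 'r' stand in for the left and right sleeve; the OSC addresses
# are placeholders, not real x-OSC channel names.

def key_to_osc(key, pressed):
    """Map a key to an (address, value) pair: 1 turns the buzzer on,
    0 turns it off. Returns None for keys we don't handle."""
    channels = {"l": "/output/1", "r": "/output/2"}
    address = channels.get(key)
    if address is None:
        return None
    return (address, 1 if pressed else 0)

# e.g. key_to_osc("l", True) -> ("/output/1", 1)
```

In the real sketch the resulting message goes out over the network to the board; here the point is just that each key toggles one named channel.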
We also explored the crocheted stretch sensor. This had some slight issues during the process, but also produced some clear moments of connection between the two dancers. The issues were mostly technical: the sensor did not work wirelessly, as Processing would not read the Bluetooth module, and at times the computer stopped reading the USB device (which I have had happen with various Arduino devices at different points, especially after powering off the computer). However, when it was all working, the breath rate of one dancer did in fact start and stop the actuators on the other dancer.
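The start/stop behaviour itself was simple threshold logic. Roughly, in another illustrative Python sketch (the numbers are made up – a real threshold depends on the sensor and how it is worn):

```python
def breath_to_buzzer(stretch_value, was_on, threshold=512, hysteresis=30):
    """Decide whether the partner's buzzer should be on, given one reading
    from the stretch sensor (0-1023, as from an analog read).

    The buzzer turns on when an inhale stretches the sensor past the
    threshold, and off once the reading falls back below a slightly lower
    value, so small wobbles around the threshold don't cause flicker.
    All numbers here are illustrative, not calibrated values."""
    if not was_on and stretch_value > threshold:
        return True
    if was_on and stretch_value < threshold - hysteresis:
        return False
    return was_on
```

Feeding each new sensor reading through this, along with the previous on/off state, gives one buzzer pulse per breath.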
Finally, we used data sets we had previously collected and created arrays in Processing to trigger the buzzers. First we used a set from one of yesterday’s improvisations with the Polar heart rate monitor. There were some issues with the rate at which the data was sent to the actuators – Processing went through the set very quickly, and was usually done with the first half of the set by the time the OSC connected. This made for some odd latency issues, though finding a longer set did help. We also used a data set of respiratory rates from the OMSignal shirt. This was more complicated to achieve, as there is no API or web data access for this device. Instead, Camille hand-recorded all the data from that session so it could be input into the array. What was interesting was that there was a clear difference in how the garment translated the heartbeat versus the breath rate, and the dancers were able to translate this into their movement.
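The fix for the pacing problem amounts to stepping through the array on a clock rather than as fast as the draw loop runs. In Python terms (again a sketch, with an injectable sleep so it can be tested; the half-second interval is an arbitrary choice, not a value from our sessions):

```python
import time

def play_pattern(data, send, interval=0.5, sleep=time.sleep):
    """Send each recorded value to the actuators at a fixed interval,
    instead of racing through the whole array in a few frames.

    `send` is whatever pushes one value out (e.g. an OSC send);
    `interval` is the number of seconds between steps."""
    for value in data:
        send(value)
        sleep(interval)
```

Pacing the playback this way also means a short data set lasts long enough for the dancers to respond to it.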
The Processing code is going up on GitHub and you can find it here: https://github.com/sicchio/hacking-the_body
Here are more images Camille took of Kate helping dancers with the wearable she made with vibe boards and the wireless x-OSC board that communicates with Kate’s computer:
Kate troubleshooting the sketch she made:
Even though OMSignal said they would try to get us some data to work with, it wasn’t ready in time for our residency. So Camille had to hand-‘collect’ the data from some of our sessions by finding the peaks and valleys of the heart rate and respiratory rate, so that we could use it in the array Kate wrote in Processing. Here’s a screen grab of it:
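For what it’s worth, the peak-and-valley hunting Camille did by eye is simple enough to sketch in code. A minimal Python version, assuming the session data arrives as a plain list of numeric samples (not how we actually processed it this time):

```python
def find_peaks_and_valleys(samples):
    """Return the indices of local maxima (peaks) and local minima
    (valleys) in a list of samples. A point counts as a peak if it is
    strictly higher than both neighbours, and as a valley if it is
    strictly lower. Endpoints are skipped."""
    peaks, valleys = [], []
    for i in range(1, len(samples) - 1):
        if samples[i] > samples[i - 1] and samples[i] > samples[i + 1]:
            peaks.append(i)
        elif samples[i] < samples[i - 1] and samples[i] < samples[i + 1]:
            valleys.append(i)
    return peaks, valleys
```

Real physiological data is noisier than this, so some smoothing would be needed first, but it shows the idea.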
Below are the dancers exploring movement with handmade wearable devices on the neck (with the Flexinol/Muscle wire and pressure sensor) and the arm (stroke sensor and vibration actuator):
All in all, the day was very revealing in terms of giving us ideas about which directions to take in future iterations of the project, papers and further funding. More will be revealed in future.
For now, the next step will be to create a custom sensing and actuation garment – perhaps working with OMSignal for sensing as well, if they finish their API this year and wish to work with us – and to develop some movement vocabulary with dancers, exploring a few specific sensors or types of physiological data. This will likely involve more x-OSC boards with vibration, since that response is very easy to work with in movement terms, and possibly more muscle wire for other haptic interactions.