Here’s new social media documentation from our time preparing at Access Space in February for the Hacking the Body 2.0 performances Flutter/Stutter and Feel Me, as well as new video by Queen Mary Digital Media MA students:
January and February 2016 were very busy months for the Hacking the Body 2.0 project and the extended team.
From February 6th, Kate and dancers Phoebe Brown and Tara Baker rehearsed the dance pieces Flutter Stutter and Feel Me in Sheffield’s Access Space and Victoria Works studios with the new versions of the technology that had been prepared for the performances. Becky Stewart, who worked on the construction, interaction and engineering of the devices, and Tara Baoth Mooney, who had designed, refashioned and hacked the costume garments together, both joined them on the 6th and 7th to try out the costumes and devices together, so that the dancers could rehearse with them for the week. After Becky and Tara left, Kate worked with the dancers and the tech for the shows in London on the 16th and Sheffield on the 18th. There were several moments when the tech was not connecting to the server, or the electronics were breaking and needed resoldering, so the process was quite stop-and-start for the dancers. However, they learned to ‘fake’ or embody what the tech was supposed to do, so that if something wasn’t working as desired during the performance, they could still carry on without it being noticeable to the audience.
Some images from these rehearsals are here:
Camille joined Kate and the dancers for the remaining rehearsals from the 12th to the 15th. Kate and Camille were invited to give a presentation at the February 2016 Dorkbot at Limehouse Town Hall on Feb 15th. The turnout for this Dorkbot was huge – about 130–150 people – and there was great interest and many questions; images below.
They then made their way to the Watermans Arts Centre on Tuesday the 16th to set up for the show that evening.
The performance notes & credits for the pieces were as follows:
Artistic Direction and Concept: Camille Baker and Kate Sicchio
Electronics Design: Becky Stewart
Costume Design: Tara Baoth Mooney
Choreography: Kate Sicchio
Sound: Rick Loynes
Performance: Tara Baker and Phoebe Brown
Flutter Stutter is an improvisational dance piece that uses soft circuit sensors to trigger sound and haptic actuators in the form of a small motor that tickles the performers. Dancers embody the flutter of the motor and respond with their own movement that reflects this feeling. The sensors and actuators are bespoke designs by Becky Stewart and Tara Baoth Mooney that interact, influence and interrupt the dance and hack the body.
Artistic Direction and Concept: Camille Baker and Kate Sicchio
Interaction Design: Camille Baker
App Design: Peter Todd
Costume Design: Tara Baoth Mooney
Choreography: Kate Sicchio
Music Composition: Tara Baoth Mooney
Music: Tara Baoth Mooney, Chris O’Loughlin and Tobi Luck
Sound Design: Camille Baker
Performance: Tara Baker and Phoebe Brown
Feel Me is a choreography for two dancers. The timing of the movement of one dancer is determined by the other dancer’s breath rate. Each dancer feels the other dancer’s breath in the form of haptic vibrations on their body. The breath is read through the modified OmSignal shirt, a commercial device that reads biological data from the body and transmits it to a smartphone through a custom app developed by Peter Todd. By hacking this device we have created a new choreography for these performers from their own breath.
Hacking the Body Collaborators
Graphic Design: David Palmer
PR: Natalia Vartapetova
Video Production: Dann Emmons with Aaran Green and Richard Bolam
And many thanks to…
- Arts Council England for funding this work.
- OmSignal for support in using their SDK.
- Access Space, especially Jake Harries, John X. Moseley and Susanne Palzer, for allowing us to rehearse and perform in Sheffield.
- Watermans Arts Centre and Irini Papadimitriou for hosting our London performance.
- Victoria Works, Jon Chapman and David Palmer for help with flooring in Sheffield.
- New Malden Studios for hosting our initial R&D.
- Lee Paul Heron for allowing us to take over his dinner table for years while we scheme and plan our work.
It wasn’t a huge turnout, since the venue is outside Central London and it was a Tuesday night, but the space was great and we had an enthusiastic response and good questions from the audience. Photos (& some mobile videos) from the London performance can be seen on twitter @hacking_body and here:
Then Kate, Camille, Phoebe and Tara Baker all travelled back to Sheffield on the Wednesday and had a rare night off, then set up for the next show on the Thursday at midday. The space at Access was smaller, but we managed to get a dance floor in, plus lights, and 36 tickets were sold – more than the anticipated amount. The technology was still a bit finicky all day, but the performance was much more exciting, as there was really good energy and interest from the audience, and it was a more intimate experience for them with the dancers right up close. There were excellent questions and very positive feedback about the project in general. See photos below.
The great craftswomanship and design work of Becky Stewart and Tara Baoth Mooney (see credits above) should also be acknowledged for making it all happen – images of their work below:
Tara’s designs and reused hacked garments:
And Becky’s amazing soft circuit electronics work: the tickle motor, the laser-cut conductive fabric ‘wires’, and sewn conductive thread with poppers, with various microcontrollers and vibe boards connecting through a custom “internet of things” wireless network.
Overall it was a great success in terms of collaboration, creativity, hard work and great teamwork! The next step now is to get all our documentation done and our Arts Council England report written, as well as working on our ambition to take it all to New York by September.
Hacking the Body has been funded a second time through UCA, from August to November, to make new garments for performance based on the April research residency. This time we have brought in two collaborators, whom Kate will discuss further in the next post: Becky Stewart from AntiAlias and Codasign as our electronics interaction designer, and Tara Baoth Mooney as our ethical fashion/costume designer.
This round of testing is about working out how to iterate a new set of garments/costumes for dancers: 1) based on touch and haptic interaction for two-way dancer sensing–actuation interaction. Here’s some of the tech constructed by Becky:
Some movement images from yesterday with dancers:
The other things we are doing are 2) working with the vibe boards (in wrist/ankle bands I made) that we worked with before, but now in interaction with a handmade stretch sensor Kate made to work with X-OSC, and 3) working with two OM shirts this time between dancers, with vibe actuation, but now with a new iPad interface to connect the OM shirts (OM kindly gave us their SDK) to X-OSC, and the X-OSC to the vibe bands, so the dancers can connect to each other – again for two-way physiological sensing and haptic actuation interaction.
More images from today to come as well as some design images from Tara and Becky from the planning stages running up to this.
Today we worked with our own DIY devices rather than the ‘pre-fab’ sensors. We spent a good portion of the time workshopping the shirt with the actuators and then moved on to the other soft circuit garments.
We covered a variety of activities – live coding the buzzers, triggering buzzers with a handmade sensor, and using data from the previous day’s work with the Polar and OmSignal shirt to create a buzzing pattern. Throughout, we used the vibe motors shown on each sleeve.
The live coding of the actuators worked really well. I created simple keypresses in Processing to send OSC to the xOSC microcontroller and turn the buzzers on and off. We tried various scores, such as switching between right and left sides or limbs, or stopping moving when sent a short buzz. The dancers really did respond as though they were an output of the coding.
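The keypress-to-OSC idea can be sketched roughly as below. The addresses, key bindings and values are illustrative assumptions, not the project’s actual mapping; in the real Processing sketch the string would become an oscP5 OscMessage sent to the x-OSC board.

```java
// Rough sketch of the live-coding logic: a keypress becomes an OSC message
// that switches a buzzer on or off. Addresses and keys are assumptions.
public class BuzzerKeys {
    // 'l'/'r' switch a side on, 'L'/'R' switch it off; other keys do nothing.
    public static String messageFor(char key) {
        switch (key) {
            case 'l': return "/buzzer/left 1";
            case 'r': return "/buzzer/right 1";
            case 'L': return "/buzzer/left 0";
            case 'R': return "/buzzer/right 0";
            default:  return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(messageFor('l')); // "/buzzer/left 1"
    }
}
```

In Processing, the same mapping would live in keyPressed(), with the resulting address sent over UDP to the board.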
We also explored using the crocheted stretch sensor. This had some slight issues during the process, but also some clear moments of connection between the two dancers. The issues were mostly technical: the sensor did not work wirelessly, as Processing would not read the bluetooth module, and at times the computer stopped reading the USB device (which I have had happen with various Arduino devices at different points, especially after powering off the computer). However, when it was all working, the breath rate of one dancer did in fact start and stop the actuators on the other dancer.
Finally, we used data sets we had previously collected and created arrays in Processing to trigger the buzzers. First we used a set from one of yesterday’s improvisations with the Polar heart rate. There were some issues with the rate at which the data was sent to the actuators – Processing went through the set very quickly, and usually was done with the first half of the set by the time the OSC connected. This made for some odd latency issues. However, finding a longer set did help with this issue. We also used a data set of respiratory rates from the OMSignal. This was more complicated to achieve as there is no API or web data with this device. Instead, Camille hand recorded all the data from that session in order to be input into the array. What was interesting was that there was a clear difference in how the garment translated the heartbeat versus the breath rate and the dancers were able to translate this into their movement.
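One way to tame that pacing problem is to index the array by elapsed time rather than by frame, so the data set plays back at a fixed rate no matter how fast draw() runs. A minimal sketch of the idea (the values and interval are assumptions, not our recorded data):

```java
// Sketch of playing back a recorded data set at a controlled rate, so the
// array is not exhausted before the OSC connection is up. In Processing,
// elapsedMs would come from millis().
public class DataPlayback {
    private final int[] data;      // recorded heart/breath values (assumed)
    private final int intervalMs;  // how long each value drives the buzzers

    public DataPlayback(int[] data, int intervalMs) {
        this.data = data;
        this.intervalMs = intervalMs;
    }

    // Return the value that should be driving the buzzers right now,
    // holding the last value once the set is exhausted.
    public int valueAt(int elapsedMs) {
        int i = Math.min(elapsedMs / intervalMs, data.length - 1);
        return data[i];
    }

    public static void main(String[] args) {
        DataPlayback p = new DataPlayback(new int[]{60, 70, 80}, 500);
        System.out.println(p.valueAt(600)); // 70
    }
}
```

Clamping to the last index also means a late OSC connection just picks up wherever the clock says the set should be, instead of missing the first half.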
Also, the Processing code is going up on github and you can find it here: https://github.com/sicchio/hacking-the_body
Here are more images Camille took of Kate helping dancers with the wearable she made with vibe boards and the wireless X-OSC board that communicates with Kate’s computer:
Kate troubleshooting the sketch she made:
Even though OM Signal said they would try to get us some data to work with, it wasn’t in time for our residency, so Camille had to hand-‘collect’ the data from some of our sessions by finding the peaks and valleys of the heart rate and respiratory rate, so that we could use this data in the array that Kate wrote in Processing. Here’s a screen grab of it:
Below are the dancers exploring movement with handmade wearable devices on the neck (with the Flexinol/Muscle wire and pressure sensor) and the arm (stroke sensor and vibration actuator):
All in all, the day was very revealing in terms of giving us ideas about which direction to take future iterations of the project, papers and further funding. More will be revealed in future.
For now, the next step will be to create a custom sensing and actuation garment – perhaps also working with OM Signal for sensing, if they get their API done this year and wish to work with us – and to develop a movement vocabulary with dancers that explores a few specific sensors or streams of physiological data. This will likely involve more X-OSC boards with vibration, as that response is very easy to work with in terms of movement, and possibly more muscle wire for other haptic interaction.
Today was our first day at Siobhan Davies Dance Studios with our two lovely dancers, Charlie and Emma. We worked with the pre-fab biosensors today and went through a series of dance improvisations and reflective writing exercises with the dancers to collect more information about how sensing informs how we think of our body and movement. We structured the improvisations around the conceptual framework we discussed earlier in the week – ‘unknown unknowns’, ‘known unknowns’, ‘known knowns’ and ‘unknown knowns’ – while also moving from sensing for self, sensing together, sensing with collaborators and sensing for corporations.
We started with just putting the sensors on the dancers and telling them to do an open score improv, where they could move however they felt.
We then told the dancers what we were sensing with the devices on their bodies and revealed each of the sensors one by one. Each new sensor was explored in a 5 minute solo dance improvisation.
We then asked them to explore a similar process, but as a duet, where they were asked to communicate what they were sensing (which sensor they were focusing on: heart rate, respiratory rate, calories/energy, strength, etc.) to the other dancer. We asked them to do this four times, corresponding to the four types of physiological data the sensors were collecting. This was expressed first as more of a contact improv interaction, and later became more of a call-and-response score, as we asked them each to communicate a different sensor each time, but without the other dancer knowing which. It was very interesting to watch how they moved further apart as they tried to use eye contact and watching each other to communicate the different data types/sensing, rather than what they are used to doing with contact improv.
We then revealed more about the process and overall project to the dancers – what we were aiming to make with the sensors, why we wanted to collect data. This definitely changed their behaviour from the first open improv and the dancers were much more reserved in their movement, focusing much more on what ‘data might be useful’. The movement became more exercise-like, such as trying to tire themselves to manipulate their heartrate or breathing.
We also “lied” to them about what we intended to do with the data (i.e. make a particular type of performance), to see if they would change the way they danced. This did indeed happen: they wanted to help shape their imagined idea of a visual or musical outcome that we might make for the performance.
We then discussed that these were commercial devices and apps that were collecting their biosignals. This was the prompt for the final improvisation.
Below are some of the images from the Polar app that Camille grabbed from the sessions, showing the various ways the company thinks people want to track their data. Sadly, they won’t share their API with us unless we are an insurance, fitness or medical company/institution and pay for it, so we couldn’t use any data from them.
Below is the interface for the OM Signal which only had a ‘screen’ for each of the data types it was collecting: heart rate, respiratory rate, calories/energy, and strength.
Below are some images from Emma trying to express strength through movement.
Today felt like a lot was accomplished. We finished our DIY garments and wearables and planned activities for the next two days where we will be workshopping at the Siobhan Davies studios with two dancers. We also spent time collecting data from our commercial biosensing products and started our professional documentation process with the lovely Dann Emmons.
The bluetooth module on the breath (stretch) sensor.
This morning, Kate spent more time trying to get the serial data sent via bluetooth from the breath sensor to be read by Processing. That never worked, but a USB version does. When plugged in, Processing takes the serial data from the crocheted sensor and then sends it via OSC to the xOSC and the buzzer garment. With a bit more tweaking of the limits of the serial numbers that trigger the buzzers, we will have one DIY garment on one dancer talking to another DIY garment on another dancer.
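The “tweaking of the limits” amounts to choosing thresholds on the serial values, ideally with a little hysteresis so readings hovering near a limit don’t make the buzzer chatter. A minimal sketch of that logic (the limit values are assumptions):

```java
// Sketch of threshold-with-hysteresis triggering for the stretch sensor:
// the buzzer switches on above one limit and off below a lower one, so
// noisy readings between the two don't toggle it rapidly.
public class StretchTrigger {
    private final int onLimit;   // reading above this switches the buzzer on
    private final int offLimit;  // reading below this switches it off again
    private boolean buzzing = false;

    public StretchTrigger(int onLimit, int offLimit) {
        this.onLimit = onLimit;
        this.offLimit = offLimit;
    }

    // Feed each new serial reading in; returns whether the buzzer should run.
    public boolean update(int reading) {
        if (!buzzing && reading > onLimit) buzzing = true;
        else if (buzzing && reading < offLimit) buzzing = false;
        return buzzing;
    }

    public static void main(String[] args) {
        StretchTrigger t = new StretchTrigger(600, 550);
        System.out.println(t.update(650)); // true
    }
}
```

In the Processing sketch, the boolean would decide whether to send the buzzer-on or buzzer-off OSC message to the xOSC.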
Here is the wireless buzzer garment and the plugged in stretch sensor.
We spent time this afternoon also collecting data from both the OMsignal shirt and the Polar belt. We created reports of respiratory rate, heart rate and calories burned. This data could possibly be used to create an array for the buzzers, with higher numbers vibrating the left arm and lower numbers vibrating the right (for example). We also plan on using these commercial sensors tomorrow when we work with the dancers and explore research questions in terms of data collection ethics and identity.
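That left/right mapping is a one-line rule once a threshold is chosen; a minimal sketch (the threshold and OSC addresses are illustrative assumptions):

```java
// Sketch of mapping recorded readings to sides of the body: higher values
// vibrate the left arm, lower values the right. Threshold is an assumption.
public class ArmMapper {
    public static final int THRESHOLD = 75; // e.g. beats or breaths per minute

    // Return the OSC address of the buzzer a reading should trigger.
    public static String armFor(int value) {
        return value >= THRESHOLD ? "/buzzer/left" : "/buzzer/right";
    }

    public static void main(String[] args) {
        System.out.println(armFor(90)); // "/buzzer/left"
    }
}
```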
One thing that didn’t happen in Day 4 was the connection of Camille’s little muscle wire actuator to a sensor.
So for Day 5 (though she wasn’t feeling well), Camille connected a pressure sensor to the neck piece and connected the buzzer to the stroke sensor.
and reconnected the neck band to the 3.7 V LiPo battery. The pressure sensor is very simple: conductive fabric separated by foam, connected to (+) and (−) reconfigured jumper wires, to make a more solid connection for the very sensitive and subtle muscle wire (like a delicate flower) 😉
The rest of the day we worked with the OM Signal and Polar to get data into Processing for the dancers tomorrow. Camille had a conversation with the makers of OM Signal, who said they would try to get us the raw data to work with, as Kate mentions above, so we can then find a way to work with it in real time with the dancers.
Pictures of that to come from our documenter Dann Emmons.
Day 4, we decided to focus on making our wearables. Camille’s is a neck piece: when the shoulder pressure pad is pressed, it causes the Flexinol in the neck ruffles to straighten …
This seemed to take all day to construct, as the conductive thread was not conductive enough to carry the current and the piece had to be reconstructed many times; the nice ribbon has yet to be added where the green felt is now holding it together.
Attachment to the battery has also been a bit of an issue, and it’s not yet connected to the pressure pad – tomorrow.
More importantly, the Flexinol is not moving/straightening noticeably enough for it to be obvious in practice – so we’ll see what comes of it with the dancers. With more time it would be good to have a Nitinol spring/coil, which is stronger and more noticeable, or maybe even a thicker piece of Flexinol.
It had to be turned upside down to fasten the piece and make it loose enough that the organza ruffles can move independently.
Kate worked on constructing her buzzer garment today. After spending a lot of time trying to get the sewing machine to work, to no avail, hand sewing became the main option for putting the piece together.
Because the xOSC board is not sewable in the same way as something like a Lilypad or Flora, the design we developed was based around a velcro pocket to house the microcontroller and battery. The vibe motors (buzzers) are not completely sewing-friendly either, so these were soldered to longer wires, which were then threaded through jersey tubes and sewn to a shirt.
The use of the wires really dictated the design of the garment, and Kate decided to make it a bit whimsical in colour, creating a wavy line pattern. It is very costume-like, which works in our performance context.
The buzzers are sewn onto the right and left upper arms. The idea is that a score referencing direction could be devised and explored in various ways through this garment. More on these explorations later this week.
Today we both worked on ways that actuators can be incorporated into our work. We each tried separate ideas today that we will continue to develop tomorrow and Wednesday, before we begin our work with our two dancers.
Kate focused on using buzzers and the xOSC board. Using Processing, she was able to create a way to live code the buzzers. In terms of performance, one idea is that the buzzing becomes a score for the dancer to interpret, while the choreographer programs the score live. This is one idea we will explore Thursday/Friday.
Kate also worked on using her previously made crochet breath (stretch) sensor, which is bluetooth-enabled. The goal was for this to talk to the xOSC board via Processing and control the buzzers, so that one dancer’s breath would be felt on the other dancer’s body. However, while the Arduino IDE was recognising the serial input from the crochet breath (stretch) sensor, Processing was not. This issue has yet to be resolved.
Camille worked on a few things today. The first half of the day was a frustrating attempt to get the heat pad to work – it had previously worked with the 9-volt battery, but it wasn’t working when tried again today, and a second one didn’t work either. New batteries were bought in case that was the problem, but still no joy, so that was shelved.
What did work – after some research and experimentation – was Flexinol, or Muscle Wire (technically different but the same idea), which either contracts or, in this case, straightens out when current is applied. So the next stage is thinking about what can easily be done with it – some sort of touch response … Kate will post a video of it…
So for tomorrow, the Flexinol will be combined with this nice organza fabric to make some sort of straightening/contracting neck piece that loosens. A useful resource found for this actuator: https://spaghettionastick.wordpress.com/2011/10/17/nitinol-flexinol-muscle-wire-shape-memory-alloy/
What is also working – and was an easy add-on to my already-made stroke sensor – is a sew-on buzzer that Kate brought from an NYC contact who set up Invent-abling, so when someone touches or strokes the sensor, the buzzer will vibrate.
Here’s Camille making a plan for day 4 (above), and below are some bits that will be put together so the black press sensor in the bottom left can cause the motor, top right, to pull up a dancer’s top – just a bit 🙂
Some of our projects for sensing and actuation for working with the dancers on Thursday – to complete tomorrow.
Today we decided to put aside our pre-made corporate wearables after we couldn’t get them to work in our Pilates class this morning (their short-lived batteries had already expired, so we couldn’t get anything from them today). We will come back to them once they are charged and we have a response from both companies. We do have a Skype meeting with the OM Signal people to give them feedback and ask how we can get raw data online, or reports of data that we can use for our dance experiments.
Instead, today we worked with the x-OSC wireless device to send sensor data between it and the computer, as recommended by our friend Becky Stewart. We spent much of the afternoon trying to get meaningful signals from any input sensor. While it connects right away and has its own web-based interface – as can be seen below, lighting up our LED straight away – we couldn’t get it to show input values from any sensor for some time.
Kate, a more confident coder, tried getting input signals via Isadora and Processing, while Camille researched other useful code and help files to help us progress (sadly, the x-OSC website doesn’t have much help beyond a very basic manual, missing many basic setup details).
Luckily, the online tutorials on oscP5 on Codasign’s website and the oscP5 library and tutorials at http://www.sojamo.de/libraries/oscP5 eventually helped us get the basic setup registering sensor values in Processing, and we then started to see input values from our handmade sensors below.
Then we spent some time trying to figure out how we might understand the values from the onboard accelerometer, gyroscope and magnetometer, which could also be interesting to work with when our dancers join us later in the week. However, we’ve had to send a message to the board’s developer to see if he can help us with those sensors, since none of the videos on his site really address them in a simple way.
Tomorrow the goal is to design some additional handmade sensors and actuators – buzz/vibrate, heat, and squeeze (using muscle wire/Flexinol) – to add into our garments for our dancers to explore, so tomorrow is more of a making day rather than set-up and troubleshooting like the last two days.
Today was the first day of Camille working with Kate in London. We are in residence at UCA Epsom for the next few days, followed by two days at Siobhan Davies Dance Studios.
Our goal is to make performative garments with biosensing devices (both off the shelf and DIY) for choreographic exploration.
We are working with two themes that overlap in our explorative process – data collection ethics and identity. We are interested in ‘known knowns, known unknowns and unknown unknowns’. This will become a basis for instructions when we work with our two dancers. It is also what we will explore with different levels of sharing data from biosensing devices in terms of:
-self/internal (what does this data tell us about ourselves?)
-shared with other performers (what does this data tell us about the other?)
-shared with other collaborators (what does this data tell us about the composition/situation)?
-shared with biosensor corporations (what data is being shared?)
-shared with other corporations (who else is getting our data?)
Today we started with two off-the-shelf biosensing products: the OMSignal shirt and the Polar H7 Heart Rate Sensor. Just to start using each product, we had to give the companies various pieces of information about ourselves, including body stats (weight, height), age and contact details.
Kate wearing the Polar H7 Heartrate monitor
Camille wearing the OMSignal Biosensing shirt – small(mens)
Both are designed to interface with phone apps, which means that to use them in performance we need to find ways around this. One idea is to use a web service called Fluxstream, which allows data from various fitness trackers to be presented on the web. This is taking some time to install and we haven’t quite completed it. We have contacted both companies about getting developer APIs and are waiting on approval. We are also exploring ways of collecting the data to use in ‘non real time’ within performance settings.
Another interesting thing about the OM Signal shirt is that it needs some moisture (we had to buy some Aloe Vera for the contacts to register a heart rate). This small men’s size was a bit tight on Kate’s ribcage; it worked a bit better with the Aloe Vera and fit Camille a bit better. So far just standard info – though we’ve not done any physical work yet; tomorrow we will wear the devices to Pilates.