Introducing Silence Suit

first sensors

Meditation stool with soft sensor and heart-rate sensor

For over a year I’ve been working on a meditation wearable. It measures biometric and environmental input. Its goal is to use the measurements to improve your meditation and to generate artistic visualisations from the data. The wearable is part of a bigger project, Hermitage 3.0, a high-tech living environment for 21st-century hermits (like me). Now that the wearable project is taking shape I’d like to tell a little about the process of creating it.

The sensors
I started with a simple but surprisingly accurate heart-rate sensor that works with the Arduino platform. It uses an ear clip and sends out inter-beat intervals and beats per minute at every beat. With some additional code in Processing I can calculate heart-rate variability. These are already two important measures that can tell a lot about my state while meditating. Then I added galvanic skin response to measure the sweatiness of my skin, a nice indicator of stress or excitement. I added an analogue temperature sensor that I put on my skin to measure its temperature; low skin temperature also indicates a state of relaxation. I also made a switch sensor that is attached to my meditation stool: sitting on it marks the start of a session, getting up marks the end.
All sensors were connected with wires to my computer, but the aim was, of course, to make it wireless so I’d be free to move. Even so, I could already see day-to-day changes in my measurements.
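To give an idea of the kind of calculation involved (a minimal Python sketch, not my actual Processing code): given the inter-beat intervals the sensor sends out, heart-rate variability can be estimated with the common RMSSD measure, and beats per minute follow from the average interval.

```python
import math

def rmssd(ibis_ms):
    """Root mean square of successive differences between inter-beat
    intervals (in milliseconds) -- a common time-domain HRV measure."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def bpm(ibis_ms):
    """Average beats per minute over a list of inter-beat intervals."""
    return 60000.0 / (sum(ibis_ms) / len(ibis_ms))
```

A higher RMSSD generally means more beat-to-beat variation, which tends to go together with a more relaxed state.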

A little help from my friends
As things were becoming more complex I posted a request for help in a Facebook group. A colleague, Michel, offered to help. We first looked at different ways to connect wirelessly. Bluetooth was a problem because it has a very short range. XBee also wasn’t ideal because you need a separate connector. We also made a version that could write to an SD card on the device, but that of course doesn’t offer live data, which was crucial for my plans. We finally settled on WiFi using the SparkFun ESP8266 Thing Dev. We were going to need a lot of analogue pins, which the Thing Dev doesn’t offer, so we used the MCP3008 chip to supply 8 analogue input channels.
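For illustration, this is roughly how talking to the MCP3008 over SPI works (a hedged sketch of the chip’s documented protocol, not our actual firmware): you clock out three bytes carrying a start bit, a single-ended mode bit and the 3-bit channel number, and the 10-bit conversion result comes back in the last ten bits of the exchange.

```python
def mcp3008_request(channel):
    """Build the 3-byte SPI request for a single-ended read of one of
    the MCP3008's 8 channels: start bit, mode bit, channel number."""
    assert 0 <= channel <= 7
    return [0x01, (0x08 | channel) << 4, 0x00]

def mcp3008_decode(response):
    """Extract the 10-bit conversion result (0..1023) from the three
    bytes the chip clocks back during the same transfer."""
    return ((response[1] & 0x03) << 8) | response[2]
```

On the actual board the ESP8266 performs this exchange over its SPI pins; the sketch only shows how the bytes are framed.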

Overview of all the sensors

More is more
We could then increase the number of sensors. We added an accelerometer for neck position and replaced the analogue skin-temperature sensor with a nice and accurate digital one. Around that time a wearable from another project was finished: a vest with resistive rubber bands that measures expansion of the chest and belly region. Using the incoming analogue values I can accurately calculate breath-rate and upper and lower respiration. Then it was time to add some environmental sensors, which give more context to, for example, the GSR and skin-temperature readings. We added room temperature and humidity, light intensity and RGB colour, and air flow.
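To give an idea of the breath-rate calculation (a simplified sketch, not the code I actually use): counting the peaks in the chest-expansion signal and dividing by the session length gives breaths per minute.

```python
import math

def breath_rate(samples, sample_rate_hz):
    """Estimate breaths per minute by counting local maxima that rise
    above the signal mean in the chest-expansion readings."""
    mean = sum(samples) / len(samples)
    peaks = 0
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        if cur > mean and cur > prev and cur >= nxt:
            peaks += 1
    duration_minutes = len(samples) / sample_rate_hz / 60.0
    return peaks / duration_minutes
```

Real breathing signals are noisier than this, so in practice some smoothing before the peak count helps.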

Vest with sensors

Environmental sensors

Seeing is believing
From the start I’ve made simple plots to get a quick insight into the session data. For now they don’t have an artistic purpose but are purely practical. At this point it is still essential to see if all sensors work well together. It’s also nice to get some general insight into how the body behaves during a meditation session.
Data is also stored in a structured text file. It contains minute-by-minute averages as well as means for the whole session.
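The minute-by-minute averaging could look something like this (an illustrative sketch; the real logging code differs):

```python
from collections import defaultdict

def minute_averages(readings):
    """Group (seconds_since_start, value) readings per minute and
    average them; also return the mean over the whole session."""
    per_minute = defaultdict(list)
    for t, value in readings:
        per_minute[int(t // 60)].append(value)
    minutes = {m: sum(v) / len(v) for m, v in sorted(per_minute.items())}
    session_mean = sum(v for _, v in readings) / len(readings)
    return minutes, session_mean
```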

Session data plot with legend

I’ve also made a Google form to track my subjective experience of each session. I rate my focus, relaxation and perceived silence on a 7-point Likert scale, and there is a text field for remarks about the session.

Results from Google form: very relaxed but not so focussed...

Suit
I used the vest from the other project to attach the sensors to. But last week costume designer Léanne van Deurzen made a first sample of the wearable. It was quite a puzzle for her and her interns to figure out the wiring and positioning of every sensor. I really like the look of this first design. It fits the target group, high-tech hermits, and it is also very comfortable to wear.

Upper and lower part of the suit

Back with extension where soft sensors to detect sitting will be placed

The future
The next step will be adding sensors for measuring hand position and pressure, and a sound-level sensor.
Then we will have to make the processing board a bit smaller so it can fit in the suit. We can then start integrating the wiring and replacing it with even more flexible wires.
When all the sensors are integrated I can really start looking at the data and searching for interesting ways to explore and understand it.
I’m also looking for ways to fund the making of 15 suits. That way I can start experiments with groups and find ways to optimise meditation by changing the environment.

@cocoon

Last week I joined the Seeker project, a co-creation project by Angelo Vermeulen exploring life in space(ships). It’s been really inspiring so far as living in space combines knowledge from the fields of architecture, sustainability, farming, water and power supply and Quantified Self. The latter being my addition, of course :)

Together with architecture master’s students from the TU/e I’m looking into the interior of the ship, which will be based on two caravans. As life in a spaceship is crowded and noisy, my aim is to make a quick-and-dirty prototype that will:

  • detect the noise level
  • detect the users’ heart-rate
  • promote relaxation by pulsating light and sounds from nature

Noise level, heart-rate and soundtrack will (hopefully) be sent to the base station so people have a live indication of the status of the ship and the people living in it.

This is the sketch:

Today I’ll have a talk with the technicians from MAD to see what is possible. I’m thinking of using the following sensors:

Heart-rate: http://floris.cc/shop/en/sensors/731-pulse-sensor-amped.html

Noise level: http://floris.cc/shop/en/seeedstudio-grove/239-grove-sound-sensor.html

Playing sound: http://floris.cc/shop/en/shields/155-adafruit-wave-shield.html

The cocoon itself will be the good old paper lamp:

science hack day

Last weekend I took part in the first Dutch Science Hack Day in Eindhoven. I had posted my idea on the forum and was hoping for a nice group of experts to work with. The idea was to create a mood enhancer. When you’re sad it could help you become happy again. When you’re happy you could help others who are sad to improve their mood, or support them. It would consist of a) mood detection, b) mood changing, c) mood sharing.

On the forum one participant, Siddhesh (a PhD student at the TU/e), had already expressed his interest. After I’d introduced my idea I was joined by Leonid and Huang-Ming, both industrial design students at the TU/e, and Ketan, also a PhD student at the TU/e. We were later joined by Iwan, an interior architect. So we had a nice mixed group from different countries.

I was pleasantly surprised at how swiftly we decided on the use case and technologies to be used. Everybody was very eager to start working in their own field of expertise. We decided to use two hardware sensors (heart-rate and skin conductance) to provide the level of arousal and one online software sensor, face.com, which uses portraits to classify moods. The heart-rate sensor was already finished because we could reuse it from another project by Leonid and Huang-Ming, but there was still a lot of work to be done.

For output we wanted to do something with light and sound, as they are the least obtrusive when you’re working. We wanted a physical object to display the mood and also enhance it, and to use Twitter to share moods. We had difficulty deciding whether the visualisation should just be personal feedback or should also display a friend’s status. As time was limited we decided on just feedback. The application moved from enhancement to awareness of moods, which was enough for just one weekend.

I took on the task of implementing the valence through the face.com API. It would all have to be done in 24 hours, so that was pretty challenging. Registering at face.com was easy. The API was pretty straightforward, and only later did I discover that it could not just detect smiling or not smiling but a whole set of moods: happy/sad/angry/surprised/neutral, each with a value and a confidence, based on the expression of the person in the photo. There was also a lot of other info to be gotten from the image using the faces.detect method, and the accuracy of the results was surprising, even under less favourable light conditions. The main hurdle was uploading an image for face.com and keeping it in sync with the rest of the application. In the end we used the local Dropbox folder to store the webcam captures and let Dropbox sync with the web version; the URL and file name are used in the face.com request.

The others worked on building the Galvanic Skin Response sensor, the lamp object and the integration of the heart-rate sensor and software for the new purpose.

We used Processing as the main language to read the values from the sensors, connect to the web and drive the output. The sensors write their current values to separate files, and one script reads all the sensor input to generate the visual output, change the colour and position of the lamp and change the sound.

The main application shows a changing, interactive landscape of lines and circles. The amount of arousal and the corresponding valence determine:

  • The position and colour of the circle. When you click on a circle the webcam image and heart-rate value are shown, allowing you to trace back how you felt during the day.
  • The position and colour of the light object
  • The sound being played
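As an illustration of the kind of mapping involved (made-up ranges and screen size, not our actual Processing code): valence can drive horizontal position and hue, while arousal drives vertical position.

```python
def mood_to_circle(arousal, valence, width=800, height=600):
    """Map arousal (0..1) and valence (-1..1) onto a screen position
    and a colour hue: valence drives x and hue (0 = red for negative,
    120 = green for positive), arousal drives y (high arousal on top)."""
    x = (valence + 1) / 2 * width
    y = (1 - arousal) * height
    hue = (valence + 1) / 2 * 120
    return x, y, hue
```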

Iwan made a nice presentation and we finished just in time. The presentation went well and the jury picked our design as the best in the ‘Overall happy living’ category! That was the icing on the cake of a great and inspiring weekend.

Science Hack Day Eindhoven 2012 winners compilation from M.A.D. ART on Vimeo.

Being one of the winners we also presented at the Internet of Things event at the High Tech campus in Eindhoven.

MADlab kindly supplied me with an artist residency to cover expenses.

Ups and downs

measuring electricity

Paul explains how to measure the power used for heating the fabric

I’m having a hard time with my wearable. There are different areas where I’m experiencing problems:
- The thermochromic ink deteriorates very fast under the influence of UV radiation, and I haven’t been able to find a solution for that. The amount of electricity it needs to heat up depends heavily on the air temperature. Also, the width of the strip of fabric (which has to be equal over the whole length) is crucial for a successful colour change. So there’s a lot of testing ahead before this will work well. I have switched from a reflective strip to reflective fabric; I wonder if the performance will improve.


Arduino talking to Nokia

- I’ve been talking to a lot of experts lately, but last Friday I got a pretty disturbing e-mail from the RIVM, a leading Dutch centre of expertise and research that advises and supports policy-makers and professionals in public health and environmental areas. The gases I’m measuring are a good indication of air quality, but the sensors I’m using aren’t sensitive enough. I was happy to find gas sensors in the first place, but now they appear to be worthless for my purpose.
- Programming the Bluetooth connection has made some progress. I can now connect with the Arduino board and send and receive bytes. Now I’m at the final stage of sending all of my sensor data in a single string to the Nokia. I also have to find out how to let Python check for incoming serial data continuously, some sort of event listener.
- I keep having trouble using the internal GPS. It seems to break down after I’ve used it once. Only a restart will make it work again.
- On the bright side, I’ve made two dummies for my vest. Saturday I worked with my tailor on the second dummy, using the actual fabric. This gave me a lot of insight into what I want. Discussing it with AnnaMariaCornelia, it became clear that I have to take a radical turn to make my vest look like true workwear. I’m really looking forward to designing my vest and making it look sturdy and cool to wear.
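To illustrate the single-string idea from the Bluetooth point above (hypothetical labels and format, not my actual code): once the Arduino sends all sensor values as one comma-separated string, the receiving Python side only needs to split it up.

```python
def parse_sensor_line(line):
    """Parse one line of sensor data sent as a single comma-separated
    string, e.g. 'CO:32,NO2:18,O3:5' (hypothetical labels and values),
    into a dict of integer readings per sensor name."""
    readings = {}
    for field in line.strip().split(','):
        name, _, value = field.partition(':')
        readings[name] = int(value)
    return readings
```

Reading the serial port itself would sit in a loop around this, polling for complete lines; I still need to work out how that is best done on the Nokia.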


The second dummy which at this stage looks too much like an apron

Productive

Nice vest with functionality

My aim for this workshop (19-9-09) was to put the three sensors together, but we also had a meeting with AnnaMariaCornelia concerning the design. The rest were kind enough to let me meet first. I showed the various designs I had made with the safety vest as a basis. The vest was present from the start of this project because of its connection with safety and the clean way it signals a simple message. So Anna suggested I stick to those basics instead of just decorating the vest. She’s absolutely right, of course: I should use the aesthetics and functionality of the vest to inform about air quality. Research has shown that more and broader reflective stripes indicate higher safety. So I can vary the width of the strips to indicate more or less pollution, resulting in more or less visibility. I’ll use strips of conductive fabric to heat the paint. That should give a nice, clean result.
My research also showed that there are some very functional coats with different pockets. I want to use them in my design to store and build in the boards, buttons, phone and power supply.
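As a rough back-of-the-envelope sketch of why strip width matters for the heating (illustrative numbers, not measured values for my fabric): the resistance of a conductive strip scales with its length divided by its width, so at a fixed voltage the dissipated power changes with the width.

```python
def strip_power(volts, length_m, width_m, ohms_per_square):
    """Electrical power dissipated in a strip of conductive fabric.
    Resistance scales with length / width ('squares' of fabric), so a
    narrower strip has a higher resistance and draws less power at a
    fixed voltage -- which is why an even width gives even heating."""
    resistance = ohms_per_square * length_m / width_m
    return volts ** 2 / resistance
```

Any narrowing along the strip raises the local resistance and concentrates the heating there, which matches what I see with uneven colour change.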

The building of the board with sensors went pretty smoothly. That’s no surprise with an expert like Paul standing by my side. So now I’ve got a great little board with all the sensors powered at their own voltages. We also built in a switch to turn all the sensors on and off at the same time to increase their lifetime. We did discover that the ozone sensor is influenced by humidity, so I ordered a humidity sensor yesterday so I can at least measure it. How to interpret the data is a completely different question, the next problem to tackle.

Three sensors integrated into one board

Art Pollution Kit

Michal Kindernay and Gívan Belá performing

Last night I went to Brussels to watch the performance with the Art Pollution Kit, a project by Michal Kindernay and Gívan Belá (Guy van Bellen). They are making a cheap, DIY kit for measuring different kinds of environmental values. The prototype they showed last night measured temperature, light, humidity and noise. They will be extending it with other sensors, like the gas sensors I’m using.
The prototype itself was very basic, but what I found very interesting was their thinking about pollution. They asked themselves: what is pollution in an image, in sound? They found that degradation of the colour spectrum is an equivalent of visual pollution. Michal has been busy visualising pollution in a very direct way: by changing the actual pixels of an image of the location where the pollution is. The image is being polluted by the data. Twenty-four hours of data can be displayed in the image, which changes over time. When there’s no pollution you get a perfectly clear picture. So the visual and environmental data are merged into one image, which is very powerful. In the performance the same data was used to generate sound; this way the kit can be used as an instrument. The plan is to distribute the kits to artists at different locations, gather all the data and work with the data from the networked kits.
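The pixel-polluting idea could be sketched like this (my own simplified interpretation, not Michal’s actual code): each pixel is pulled toward grey in proportion to the pollution level, so clean data leaves the picture untouched and heavy pollution washes it out.

```python
def pollute_pixel(rgb, pollution):
    """Degrade one pixel toward grey in proportion to a pollution level
    between 0 (clean, pixel untouched) and 1 (fully polluted, pixel
    collapsed to mid grey) -- one way to let data 'pollute' an image."""
    r, g, b = rgb
    grey = (r + g + b) / 3
    mix = lambda c: round(c + (grey - c) * pollution)
    return (mix(r), mix(g), mix(b))
```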

Michal also showed me some other visualisations of (noise) pollution he’s working on. It involves real-time erasing of unwanted objects (like cars) from camera recordings, so that you get a completely clean street view. The street only gets filmed when there are no cars, which shockingly meant only a couple of minutes of recording during hours of filming. We share the same irritations and fascinations and might work together in the future.

Testing 1, 2, 3…

Thursday I did a test with all my sensors for 24 hours. It went rather well, but there are two main points of attention.

Mathe, Richard and me working (no, we’re not terrorists)

1. The stealth cam. It has limited storage and battery capacity. When I set the resolution to low it should be able to hold 350 pictures. This is not the case: it holds 300 max. Another drawback is that it’s very hard to make out which resolution the cam is set to, so I accidentally set it to high. As a result I couldn’t complete the whole day, because I was with friends and had forgotten my cable, so I was stuck for the evening with a full camera. But I did get some nice shots (see right).

It takes just one AAA-size battery. I used up only 2.5 in 13.5 hours, so that was fine. I do have to carry a spare one with me wherever I go. I bought a quick, one-hour charger for four AAA batteries so I will never go without.

Tiny cam can be attached with Velcro to a card

I’ve found a very nifty solution for wearing the camera: I can pin or clip it to every sort of clothing. The images are still shaky, but hey, life is bumpy ;) To my relief, people didn’t look at me too suspiciously when I passed them, though I’m still a bit embarrassed.

It was quite a puzzle to see where I missed pictures. (In the future I will note exactly when the camera stopped and restarted.) When I empty the storage it takes at least a couple of minutes to store the pictures on the hard disk, so there are gaps. I filled the gaps with duplicate pictures, but I’m still files short…? I think the 60-second lapse probably isn’t exactly 60 seconds, so after a couple of hundred pictures you start noticing that. I’ll have to wait and see how that works out in the app.

The main challenge is dealing with the gaps in the data from all the devices.

2. My Suunto watch does a great job during the day, but when I sleep it loses connection with the skin; it stopped at 4 a.m., which was still almost all night. But it will have to do better when the project actually runs. I think I’ll buy some medical tape to fasten it for the night.

I don’t think it can log 24 hours of data, but it’s close. To be on the safe side I’ll store the data somewhere in the middle of the day.

Logging my activities was a very mindful thing to do. I thought it would irritate me, but it made me quiet and alert at the same time. Maybe using pencil and paper helped there too.

Meeting 04/04/08 @ Anja’s studio

Meeting at Anja’s studio, 04-04-08, from 11 to 15.30. Present: Anja, Barbara (till 14.30), Danielle.

Barbara has done research into the chakras but found the possibilities limited (one line) and the appearance too esoteric. So we’ll have to choose the areas we want to work with ourselves. We may use some of the underlying theory, and the colours might be useful too.

Barbara also did research on heat-sensitive paints:
+ They react quickly to warmth (fast feedback)
+ You can mix colours and are thus able to make all the colours you want
- No matter what colour, on reaction to heat it always turns white.
We could experiment with using different colours of shirts so we have three colours to work with: the paint colour, white and the colour of the shirt.
It might be interesting to look into compact heating and cooling elements (Peltier junctions, “restrictive coil” – we’re not sure what that means yet) to have more control over the saturation of the colours. Biofeedback could drive these elements.
Luminex is cool stuff; the drawback is that it’s only visible at night.

We want to research the possibilities of using a map or grid to make a kind of guide that can let people know where to touch or press when having certain emotions or ailments. This should be a playful design.

As far as output/display goes, we came up with the following list: LED, LED wire, Luminex (conductive fibre), heat-sensitive paint, e-paper, light-emitting wallpaper. We want to see if we can meet the producer of the iLiad, based in Eindhoven. (View ‘In de ban het ding’ link.)

We want to investigate three sorts of prototypes:
1) A shirt with heat-sensitive paints for the areas to interact with. Each of us is going to formulate some thoughts on this and make a design. At the next meeting we will produce these shirts and wear them once they’re finished to see what the reactions are.
2) A shirt with a map/guide and one sensor or a couple of switches which uses LEDs as output.
3) A shirt with a map/guide and one sensor or a couple of switches which uses Luminex as output. There is an Italian company (view link) that sells towels in different colours, which we’re going to buy and use in our prototype.

With regard to the sensors, we think we’ll get the most useful data when we combine a heart-rate sensor with a body-motion sensor. This way we can rule out false readings, because an increasing heart-rate combined with body motion would mean activity and not stress.
We want to look into energy-point sensors, which are used in traditional Chinese medicine.
We’re worried about the integration of the sensors with the wearable. We don’t want the technology to be separate from the garment, nor do we want it to be too much of a hassle to put on. A nice-looking ear clip or ring for measuring heart-rate, in line with the design, would be a solution. If we want to use a chest belt ;-) we could use a bra or top integrated into the garment. For men we would have to think of another solution…
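The heart-rate-plus-motion idea above can be sketched as a tiny rule (the thresholds are made-up defaults, purely for illustration): a raised heart-rate only counts as stress when there is little body motion to explain it.

```python
def classify(heart_rate_bpm, motion_level,
             hr_raised=90, motion_threshold=0.3):
    """Tell stress apart from plain physical activity: a raised
    heart-rate with little body motion reads as stress, while the
    same heart-rate with lots of motion reads as activity.
    Threshold values are hypothetical defaults for illustration."""
    if heart_rate_bpm < hr_raised:
        return "calm"
    return "active" if motion_level >= motion_threshold else "stressed"
```

In a real garment these thresholds would have to be calibrated per wearer.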

We replied to Frank Kooi’s e-mail asking some new questions.
Barbara is going to London for an internship with loop.ph, who do lots of interesting things with light; she’ll keep us updated.
Anja and Danielle might be taking part in the Mediamatic e-Fashion day with Leah Buechley, provided that we can work on our project and discuss it with her.

The next meeting is 12-05-08 in Breda at Danielle’s place.