Bring things together

Last week Danielle had her mid-term evaluation from WEAR Sustain. I found it interesting to hear about, because as part of my internship I am also interested in how the dependency on a grant feels. She had to show the progress of the project so that the financier can decide whether enough steps have been made to keep sponsoring it. Danielle was very aware of the fact that the project cannot proceed if the jury does not see sufficient progress.

So in the days before, the deadline influenced our workflow. We worked extra hard to bring things together so that Danielle could present the suit with all the sensors collecting data while she presented. Everyone invested a lot of extra time and energy to make the deadline. But as you can imagine, we did not reach our goals in one step.

Vera de Pont came over at the weekend to fit the suit and to adjust the last details. For example, she had to fit the breathing sensor, which has to sit very tightly to measure correctly. Danielle and Vera had to add all the cabling and every sensor to turn the suit into a Silence Suit. They worked the whole day until late in the evening to bring it all together.

Danielle wearing the suit including the cabling and the sensors

Silence Suit side view

Silence Suit back

The next day Simon de Bakker from Proto Space came to install the software on Danielle’s laptop, so that she could start the baseline measurements and present the whole suit in action to the jury of WEAR Sustain. But that proved more difficult than expected. The software could not be installed on the laptop. Simon had to create a Windows installation so it could run on her laptop. It cost a lot of time, but after two days it was possible to create a profile, connect to a suit, start a session and fill in the questionnaire. The remaining problem was that the incoming data were not right. Only the wind sensor and the breathing sensor worked accurately. The other sensors also sent data to the server, but their values were incorrect.

Selecting a user and suit on the MLEK data server

Data coming in on the MLEK data server

Simon is still working hard to solve the problem, so Danielle could not present all the working sensors to the jury last week. But we are optimistic that she may continue her project, because apart from that we do fulfil the milestones she formulated at the beginning. We will keep you up to date! :)

Trying to grasp the complexity of the project

For me the project is becoming more and more complex. So in this blog post I will try to give you an overview of the different aspects of the project around the Silence Suit, how these aspects relate to each other, and my difficulties with this complexity.

Developing the button further
To keep it concrete, I will first explain what we did today. I helped Danielle develop the button I introduced to you last time. We learned that it is not necessary to use this button to mark an exceptional experience while meditating. Instead we want to use it to mark a moment when something in the environment changes, like a change of light or a loud noise. We want to mark the moment in your timeline when something happens that influences your meditation session, so that you can later see the impact by analysing your data. So we are trying to integrate the button into the suit as comfortably as possible. First we thought about a glove, but now we have found that a ring would be better. The button has to be as small as possible. We are also thinking about making the button from conductive fabric, like we made the sitting sensor. That would be much more comfortable, as there would be no hard piece on the ring.
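
To make the idea concrete, here is a minimal sketch (in Python, purely for illustration; the suit logs through its own software) of how such ring-button presses could be stored as timestamps on the session timeline. The `SessionMarkers` name and the example timings are my own invention.

```python
import time

class SessionMarkers:
    """Collects timestamps of environmental disturbances during a session."""

    def __init__(self, session_start):
        self.session_start = session_start
        self.marks = []  # seconds since the session started

    def press(self, now=None):
        """Record the moment the ring button was pressed."""
        now = time.time() if now is None else now
        self.marks.append(round(now - self.session_start, 1))

# Example: two disturbances during a session that started at t = 0
session = SessionMarkers(session_start=0.0)
session.press(now=312.4)  # e.g. a door slamming
session.press(now=987.0)  # e.g. the light suddenly changing
print(session.marks)      # [312.4, 987.0]
```

Afterwards those marks can be laid over the sensor data to see how each disturbance affected the session.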

button ring – to mark negative influences of your surroundings

Baseline measurements
But why is it no longer necessary to mark an extraordinary positive experience? That has to do with the artificial intelligence in the software. I find it difficult to understand precisely how it works. But we learned from the data scientist that the software will learn by itself what a good meditation session is. To make it a learning system you need many baseline measurements. A baseline measurement means that you track your meditation session without any actuation. Before and after meditating you fill in the questionnaire developed by Danielle in consultation with different experts. She has formulated many relevant questions. After recording a minimum of 30 sessions in combination with this questionnaire, the system can start to figure out which aspects matter most for a good session.
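
As an illustration of the kind of analysis such a system could do once baseline sessions pile up, here is a small Python sketch that correlates one session feature with a questionnaire score. The numbers are made up and the real software is far more sophisticated; this only shows the principle of linking sensor data to self-reports.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Toy data: mean breath rate per session vs. self-rated focus (1-7)
breath_rate = [14.2, 12.8, 15.0, 11.9, 13.1]
focus_score = [3, 5, 2, 6, 4]
r = pearson(breath_rate, focus_score)
print(round(r, 2))  # -0.99: here slower breathing goes with higher focus
```

A strong correlation like this, over enough baseline sessions, is the kind of pattern the learning system could pick up on.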

questionnaire – by Google Forms

Between scientific research and design thinking
It is important that the data are correct so that you can use them in a scientific way. That is one of the aspects that make the project so complex. On the one hand it is scientific research. On the other hand the suit arises from a design mentality, which aims to make it as chic and as comfortable as possible at the same time; otherwise the user will not use it for his own scientific research. Furthermore, Danielle has her vision as an artist to bring all these disciplines together and create a completely new and unknown outcome. The Silence Suit is actually a small part of the bigger vision of Hermitage 3.0. But how does Danielle handle the complexity of her vision? I think one way, among others, is that she assumes different roles in the project. At one point she takes the role of the researcher and at another she really thinks as a designer. That makes it possible to manage the complexity. To deepen different aspects, she asks different kinds of experts for help.

Customer journey, flow charts and wireframes
That is also how she worked to develop the wireframes. We have to think about what the screens will look like, so that users will know how to use the database for their own interests. First, Danielle took on different kinds of user roles. She developed a customer journey for each user. From each customer journey an expert created a flow chart, which brings the customer journey to a more abstract level. The flow chart serves as an intermediate step between customer journey and wireframes. The wireframes will finally define the functionality of each screen, so that every user can use it for his own interests.

Flow chart – one user case, by Anne van den Heuvel

So as you can see, many things are in development. Much is going well and every team member is working hard to bring the Silence Suit to a higher level. Of course there are still many things to explore, but that keeps it interesting. I find it really nice to see that, after so many organisational problems in the beginning, we are really making great steps towards a meaningful research project. Next week we will visit the DesignLab Twente, where we will meet Vera de Pont to bring the electronics and the new design of the suit together. I am really excited about that meeting and I hope to give you another inspiring insight into our project next time.





Exploring a new button

I honestly have to say that the project seems to be going really well. I enjoy every day of my internship because every week there is something new to develop. Every time I am curious what comes next, which ideas will be altered and which will be completely new. As every week, I will give you a little impression of what has happened recently.

The design of the Silence Suit is in development. Vera de Pont is working hard to optimize the sketches and to start sewing as soon as possible. This week she came along to show some different fabrics. She also presented her newest sketches of the suit.

Bottom layer of Silence Suit by Vera de Pont

The idea of a contemporary monk is taking shape. The air circulation is also optimized by including it in the pattern of the design. To decorate the suit in a practical way, she plans to embroider graphic icons on the pockets of the different sensors. That way you know where all the sensors go, and it simplifies the maintenance of the suit after washing.

embroidery – graphic icons

We have to work on the artificial intelligence part of the system. At a certain point the system has to know what a good meditation session is in order to influence it in a positive way. The goal is to program a good meditation session. The programmer wants to know: what constitutes meditation quality? To answer that question, a lot of tests have to be done.

By means of a questionnaire, in combination with the data of the session, Danielle wants to research the quality of the meditation. Therefore she plans to include a new sensor in the suit, and we already did some tests with it this week. The plan is to lengthen one sleeve of the under vest into a glove. We will include two buttons in the glove that you can push while meditating without moving too much.

button – to mark an extraordinary positive or negative experience

One button will mark an extraordinary positive experience in the timeline of the session. You push the other one if there is a negative influence from your surroundings. For example, if the light instrument falls down or there is background noise, you can push the button. The system will mark that point in your timeline and afterwards you can see the effect of that occurrence on your meditation.

Both the form and the content of the Silence Suit are in development. As you can see, every week we are making steps to get a grip on the complexity of the project.





Introducing Silence Suit

first sensors

Meditation stool with soft sensor and heart-rate sensor

For over a year I’ve been working on a meditation wearable. It measures biometric and environmental input. Its goal is to use the measurements to improve your meditation and to use the data to generate artistic visualisations. The wearable is part of a bigger project, Hermitage 3.0, a high-tech living environment for 21st-century hermits (like me). Now that the wearable project is taking shape I’d like to tell a little about the process of creating it.

The sensors
I started with a simple but surprisingly accurate heart-rate sensor. It works with the Arduino platform. It uses an ear-clip and sends out inter-beat intervals and beats per minute at every beat. With some additional code in Processing I can calculate heart-rate variability. These are already two important measures that can tell a lot about my state while meditating. Then I added galvanic skin response to measure the sweatiness of my skin, a nice indicator of stress or excitement. I added an analogue temperature sensor that I put on my skin to measure its temperature. Low skin temperature also indicates a state of relaxation. I also made a switch sensor that is attached to my meditation stool. Sitting on it indicates the start of a session, getting up marks the end.
All sensors were connected with a wire to my computer, but the aim was, of course, to make it wireless so I’d be free to move. Even so, I could already see day-to-day changes in my measurements.
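
To give an idea of what can be computed from those inter-beat intervals, here is a small Python sketch of RMSSD, a common time-domain heart-rate-variability measure. My actual calculation runs in Processing; the interval values below are made up.

```python
def rmssd(ibis_ms):
    """Root mean square of successive differences between
    inter-beat intervals (in ms), a standard HRV measure."""
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# A toy stream of inter-beat intervals as the ear-clip might emit them
ibis = [812, 845, 790, 860, 833]
print(round(rmssd(ibis), 1))  # 49.4 ms
```

Higher RMSSD generally indicates more parasympathetic (relaxed) activity, which is why it is interesting during meditation.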

A little help from my friends
As things became more complex I posted a request for help in a Facebook group. A colleague, Michel, offered to help. We first looked at different ways to connect wirelessly. Bluetooth was a problem because it has a very short range. XBee also wasn’t ideal because you need a separate connector. We also made a version that could write to an SD card on the device, but that of course doesn’t offer live data, which was crucial for my plans. We finally settled on WiFi using the SparkFun ESP8266 Thing Dev. We were going to need a lot of analogue pins, which the Thing Dev doesn’t offer, so we used the MCP3008 chip to supply 8 analogue input channels.
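
For illustration: the MCP3008 returns each 10-bit sample spread over the last bytes of its 3-byte SPI reply. This little Python sketch only shows the bit layout of that reply; the real firmware of course runs on the ESP8266 in Arduino code, not in Python.

```python
def mcp3008_value(reply):
    """Decode the 10-bit reading from the 3-byte SPI reply of an MCP3008.
    The low 2 bits of byte 1 are the high bits; byte 2 holds the low 8 bits."""
    return ((reply[1] & 0x03) << 8) | reply[2]

# Example replies: mid-scale (~512) and full-scale (1023) readings
print(mcp3008_value([0x00, 0x02, 0x00]))  # 512
print(mcp3008_value([0x00, 0x03, 0xFF]))  # 1023
```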

Overview of all the sensors

More is more
We could then increase the number of sensors. We added an accelerometer for neck position and replaced the analogue skin-temperature sensor with a nice and accurate digital one. Around that time a wearable from another project was finished: a vest with resistive rubber bands that measures expansion of the chest and belly region. Using the incoming analogue values I can accurately calculate breath rate and upper and lower respiration. Then it was time to add some environmental sensors. They give more context to, for example, the GSR and skin-temperature readings. We added room temperature and humidity, light intensity and RGB colour, and air flow.
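
As a sketch of how breath rate can be derived from the stretch-band values, the Python below counts upward threshold crossings of the chest signal. The threshold, sample rate and toy signal are all invented for the example; the real calculation uses the actual analogue values from the vest.

```python
import math

def breaths_per_minute(samples, sample_rate_hz, threshold):
    """Count upward crossings of a threshold in the chest-expansion
    signal and convert the count to breaths per minute."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < threshold <= b)
    duration_min = len(samples) / sample_rate_hz / 60.0
    return crossings / duration_min

# 30 seconds of a toy signal at 2 Hz: a slow 0.2 Hz 'breath' around ADC value 512
sig = [512 + 40 * math.sin(2 * math.pi * 0.2 * (t / 2.0) + 0.3) for t in range(60)]
print(breaths_per_minute(sig, sample_rate_hz=2, threshold=512))  # 10.0
```

The toy wave corresponds to 12 breaths per minute; the estimate comes out slightly lower because breaths straddling the edges of the window are not counted.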

Vest with sensors

Environmental sensors

Seeing is believing
From the start I’ve made simple plots to get quick insight into the session data. For now they don’t have an artistic purpose but are purely practical. At this point it is still essential to see whether all sensors work well together. It’s also nice to get some general insight into how the body behaves during a meditation session.
Data are also stored in a structured text file. It contains minute-by-minute averages as well as means for the whole session.
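
A minimal sketch of that minute-by-minute averaging could look like this in Python (the real logging runs in Processing; the heart-rate numbers are made up):

```python
from statistics import mean

def minute_averages(samples, per_minute):
    """Average a flat list of sensor samples into per-minute means,
    plus one overall mean for the whole session."""
    minutes = [
        round(mean(samples[i:i + per_minute]), 2)
        for i in range(0, len(samples), per_minute)
    ]
    return minutes, round(mean(samples), 2)

# Three 'minutes' of toy heart-rate samples, four samples per minute
hr = [72, 74, 71, 73, 70, 69, 71, 70, 68, 67, 69, 68]
per_min, overall = minute_averages(hr, per_minute=4)
print(per_min, overall)  # [72.5, 70.0, 68.0] 70.17
```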

Session data plot with legend

I’ve also made a Google form to track my subjective experience of each session. I rate my focus, relaxation and perceived silence on a 7-point Likert scale, and there is a text field for a remark about the session.

Results from Google form: very relaxed but not so focussed...

I used the vest from the other project to attach the sensors to. But last week costume designer Léanne van Deurzen made a first sample of the wearable. It was quite a puzzle for her and her interns to figure out the wiring and positioning of every sensor. I really like the look of this first design. It fits the target group, high-tech hermits, and it is also very comfortable to wear.

Upper and lower part of the suit

Back with extension where soft sensors to detect sitting will be placed

The future
The next step will be adding sensors for hand position and pressure, and a sound-level sensor.
Then we will have to make the processing board a bit smaller so it can fit in the suit. We can then start integrating the wiring and replacing it with even more flexible wires.
When all the sensors are integrated I can really start looking at the data and searching for interesting ways to explore and understand it.
I’m also looking for ways to fund the making of 15 suits. That way I can start experiments with groups and find ways to optimise meditation by changing the environment.


Last week I joined the Seeker project, a co-creation project by Angelo Vermeulen exploring life in space(ships). It’s been really inspiring so far, as living in space combines knowledge from the fields of architecture, sustainability, farming, water and power supply, and Quantified Self. The latter being my addition, of course :)

Together with architecture master students from the TU/e I’m looking into the interior of the ship, which will be based on two caravans. As life in a spaceship is crowded and noisy, my aim is to make a quick-and-dirty prototype that will:

  • detect the noise level
  • detect the users’ heart-rate
  • promote relaxation by pulsating light and sounds from nature

Noise level, heart-rate and soundtrack will (hopefully) be sent to the base station so people have a live indication of the status of the ship and the people living in it.
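
One simple way the pulsating light could follow the heart-rate is to pulse slightly slower than the current beat, nudging the wearer toward relaxation. This Python sketch is only an assumption about the mapping; the `slowdown` factor is invented for illustration.

```python
def pulse_period_ms(bpm, slowdown=0.9):
    """Derive a light-pulse period from the current heart-rate,
    slightly slower than the heart so the body can settle toward it."""
    beat_ms = 60000.0 / bpm        # duration of one heartbeat in ms
    return round(beat_ms / slowdown)

print(pulse_period_ms(75))  # 75 bpm -> 800 ms beat -> 889 ms light pulse
```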

This is the sketch:

Today I’ll have a talk with the technicians from MAD to see what is possible. I’m thinking of using the following sensors:


Noise level:

Playing sound:

The cocoon itself will be the good old paper lamp:

science hack day

Last weekend I took part in the first Dutch Science Hack Day in Eindhoven. I had posted my idea on the forum and was hoping for a nice group of experts to work with. The idea was to create a mood enhancer. When you’re sad it could help you become happy again. When you’re happy you could help others who are sad to improve their mood, or support them. It will consist of a) mood detection, b) mood changing, c) mood sharing.

On the forum one participant, Siddhesh (a PhD student at the TU/e), had already expressed his interest. After I’d introduced my idea I was joined by Leonid and Huang-Ming, both industrial design students at the TU/e, and Ketan, also a PhD student at the TU/e. We were later joined by Iwan, an interior architect. So we had a nicely mixed group from different countries.

I was pleasantly surprised at how swiftly we decided on the use case and technologies to be used. Everybody was very eager to get to work in their own field of expertise. We decided to use two hardware sensors (heart-rate and skin conductance) to provide the level of arousal, and one online software sensor that uses portraits to classify moods. The heart-rate sensor was already finished because we could reuse it from another project by Leonid and Huang-Ming, but there was still a lot of work to be done.

For output we wanted to do something with light and sound, as they are the least obtrusive when you’re working. We wanted to work with a physical object to display the mood and also enhance it, and to use Twitter to share moods. We had difficulty deciding whether the visualisation should just be personal feedback or should also display a friend’s status. As time was limited we decided on just feedback. The application moved from enhancement to awareness of moods, which was enough for just one weekend.

I took on the task of implementing the valence detection through the API. It would all have to be done in 24 hours, so that was pretty challenging. Registering was easy. The API was pretty straightforward, and only later I discovered that it could not just detect smiling or not smiling but a whole set of moods: happy/sad/angry/surprised/neutral, each with a value and confidence, based on the expression of the person in the photo. There was also a lot of other info to be gotten from the image using the faces.detect method. The accuracy of the results was surprising, even under less favourable lighting circumstances. The main hurdle was uploading an image and keeping it in sync with the rest of the application. In the end we used the local Dropbox folder to store the web-cam captures and let Dropbox sync with the web version; the URL and file name are used in the request.

The others worked on building the Galvanic Skin Response sensor, the lamp object and the integration of the heart-rate sensor and software for the new purpose.

We used Processing as the main language to read the values from the sensors, connect to the web and drive the output. Each sensor writes its current values to a separate file, and one script reads all the sensor input to generate a visual output, change the colour and position of the lamp and change the sound.

The main application shows a changing, interactive landscape of lines and circles. The amount of arousal and the corresponding valence determine:

  • The position and colour of the circle. When you click on a circle the web cam image and heart-rate value is shown, allowing you to trace back how you felt during the day.
  • The position and colour of the light object
  • The sound being played

Iwan made a nice presentation and we were finished just in time. The presentation went well and the jury picked our design as the best in the Overall happy living category! That was just the icing on the cake of a great and inspiring weekend.

Science Hack Day Eindhoven 2012 winners compilation from M.A.D. ART on Vimeo.

Being one of the winners we also presented at the Internet of Things event at the High Tech campus in Eindhoven.

MADlab kindly supplied me with an artist residency to cover expenses.

Ups and downs

measuring electricity

Paul explains how to measure the power used for the heating of the fabric

I’m having a hard time with my wearable. There are different areas where I’m experiencing problems:
- The thermochromic ink deteriorates very fast under the influence of UV radiation, for which I haven’t been able to find a solution. The amount of electricity it needs to heat up depends very much on the air temperature. Also the width of the strip of fabric (which has to be equal along the whole length) is crucial for a successful colour change. So there’s a lot of testing ahead before this will work well. I have switched from reflective strip to reflective fabric; I wonder if the performance will improve.


Arduino talking to Nokia

- I’ve been talking to a lot of experts lately, but last Friday I got a pretty disturbing e-mail from the RIVM, a leading Dutch centre of expertise and research that advises and supports policy-makers and professionals in public health and environmental areas. The gases I’m measuring are a good indication of air quality, but the sensors I’m using aren’t sensitive enough. I was happy to find gas sensors in the first place, but now they appear to be worthless for my purpose.
- Programming the Bluetooth connection has made some progress. I can now connect with the Arduino board and send and receive bytes. Now I’m at the final stage of sending all of my sensor data in a single string to the Nokia. I also have to find out how to let Python check for incoming serial data continuously, some sort of event listener.
- I keep having trouble using the internal GPS. It seems to break down after I’ve used it once. Only a restart will make it work again.
- On the bright side, I’ve made two dummies for my vest. Saturday I worked together with my tailor on my second dummy, using the actual fabric. This gave me a lot of insight into what I want. Discussing it with AnnaMariaCornelia, it became clear that I have to take a radical turn to make my vest look like true workwear. I’m really looking forward to designing my vest and making it look sturdy and cool to wear.
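
The continuous checking for incoming serial data mentioned above could be structured as a small polling loop in Python. This is only a sketch: on the real setup you would pass a pyserial Serial object (opened with a timeout) instead of the in-memory stream used here for the demo.

```python
import io

def poll_lines(stream, handle):
    """Keep reading newline-terminated sensor strings from a serial-like
    stream and pass each complete line to a handler. Works with anything
    that offers readline(), returning an empty string when no data is left."""
    while True:
        line = stream.readline()
        if not line:          # nothing more to read (or stream closed)
            break
        handle(line.strip())

# Demo with an in-memory stream instead of a real Bluetooth serial port
received = []
poll_lines(io.StringIO("gsr:412\ntemp:31.2\n"), received.append)
print(received)  # ['gsr:412', 'temp:31.2']
```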


The second dummy which at this stage looks too much like an apron


Nice vest with functionality

Nice vest with functionality

My aim for this workshop (19-9-09) was to put the three sensors together, but we also had a meeting with AnnaMariaCornelia concerning the design. The rest were kind enough to let me meet first. I showed the various designs I had made with the safety vest as a basis. The vest was present from the start of this project because of its connection with safety and the clean way it signals a simple message. So Anna suggested I stick to those basics instead of just decorating the vest. She’s absolutely right, of course. I should just use the aesthetics and functionality of the vest to inform about air quality. Research has shown that the amount and broadness of the reflective stripes indicate higher safety. So I can vary the width of the strips to indicate more or less pollution, resulting in more or less visibility. I’ll use strips of conductive fabric to heat the paint. That should give a nice, clean result.
My research also showed that there are some very functional coats with different pockets. I want to use them in my design to store and build in the boards, buttons, phone and power supply.

The building of the board with the sensors went pretty smoothly. That’s no surprise with an expert like Paul standing by my side. So now I’ve got a great little board with all the sensors powered at their own voltages. We also built in a switch to turn all the sensors on and off at the same time to increase their lifetime. We did discover that the ozone sensor is influenced by humidity, so I ordered a humidity sensor yesterday so I can at least measure it. How to interpret the data is a completely different question, the next problem to tackle.

Three sensors integrated into one board

Art Pollution Kit

Michal Kindernay and Gívan Belá performing

Last night I went to Brussels to watch the performance with the Art Pollution Kit. This is a project by Michal Kindernay and Gívan Belá (Guy van Bellen). They are making a cheap, DIY kit for measuring different kinds of environmental values. The prototype they showed last night measured temperature, light, humidity and noise. They will be extending it with other sensors, like the gas sensors I’m using.
The prototype itself was very basic, but what I found very interesting was their thinking about pollution. They asked themselves: what is pollution in an image, in sound? They found that degradation of the colour spectrum is an equivalent of visual pollution. Michal has been busy visualising pollution in a very direct way: by changing the actual pixels of an image of the location where the pollution is. The image is being polluted by the data. Twenty-four hours of data can be displayed in the image, which changes over time. When there’s no pollution you get a perfectly clear picture. So the visual and environmental data are merged into one image, which is very powerful. In the performance the same data was used to generate sound; this way the kit can be used as an instrument. The plan is to distribute the kits to artists at different locations, gather all the data and work with the data from the networked kits.

Michal also showed me some other visualisations of (noise) pollution he’s working on. One involves real-time erasing of unwanted objects (like cars) from camera recordings. That way you get a completely clean street view. The streets only get filmed when there are no cars, which shockingly meant only a couple of minutes of recording during hours of filming. We share the same irritations and fascinations and might work together in the future.

Testing 1, 2, 3…

Thursday I did a test with all my sensors for 24 hours. It went rather well, but there are two main points of attention.

Mathe, Richard an me working (no we're not terrorists)

1. The stealth cam. It has limited storage and battery capacity. With the resolution set to low it should be able to hold 350 pictures. This is not the case; it holds 300 at most. Another drawback is that it’s very hard to make out which resolution the cam is set to, so I accidentally set it to high. As a result I couldn’t complete the whole day, because I was with friends and had forgotten my cable, so I was stuck for the evening with a full camera. But I did get some nice shots (view right).

It takes just one AAA battery. I used up only two and a half batteries in 13.5 hours, so that was fine. I do have to carry a spare one with me wherever I go. I bought a quick, one-hour charger for four AAA batteries so I will never go without.

Tiny cam can be attached with Velcro to a card

I’ve found a very nifty solution for wearing the camera. I can pin or clip it to every sort of clothing. The images are still shaky, but hey, life is bumpy ;) To my relief, people didn’t look at me too suspiciously when I passed them; I’m still a bit embarrassed though.

It was quite a puzzle to see where I missed pictures. (In the future I will note exactly when the camera stopped and restarted.) When I empty the storage it takes at least a couple of minutes to store the pictures on the hard disk, so there are gaps. I filled the gaps with duplicate pictures, but I’m still some files short…? I think the 60-second lapse probably isn’t exactly 60 seconds, so after a couple of hundred pictures you start noticing that. I’ll have to wait and see how that works out in the app.
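
To get a feel for how a slightly-off interval adds up, here is a tiny Python calculation of how many frames you end up short over a long run. The 61.5-second "actual" interval is just an illustrative guess.

```python
def missing_frames(duration_s, nominal_interval_s, actual_interval_s):
    """How many frames short you end up when the camera's real
    interval drifts from its nominal setting."""
    expected = int(duration_s // nominal_interval_s)
    actual = int(duration_s // actual_interval_s)
    return expected - actual

# 13.5 hours of shooting, nominally every 60 s, actually every 61.5 s
print(missing_frames(13.5 * 3600, 60, 61.5))  # 20 frames short
```

So a drift of just 1.5 seconds per frame already explains a gap of a couple of dozen pictures over a day.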

The main challenge is dealing with the gaps in the data from all the devices.

2. My Suunto watch does a great job during the day, but when I sleep it loses connection with the skin; it stopped at 4 a.m., which was still almost the whole night. It will have to do better when the project actually runs, though. I think I’ll buy some medical tape to fasten it for the night.

I don’t think it can log 24 hours of data, but it’s close. To be on the safe side I’ll store the data somewhere in the middle of the day.

Logging my activities was a very mindful thing to do. I thought it would irritate me, but it made me quiet and alert at the same time. Maybe using pencil and paper helped there too.