Soft Meditation, first prototype

For the past couple of weeks I have been working on the first version of my Soft Meditation piece. This is a performance in which I meditate while live sensor data from my body is transformed into an animated, artistic visualisation.

Soft Meditation Performance. Photograph by Kristin Neidlinger

Background

For the past year I have been developing, together with a team, the Meditation Lab Experimenter Kit: a tool-kit consisting of a suit with sensors and software that allows you to monitor and optimise your meditation practice through self-experimentation and interaction with the environment.
Soft Meditation is the first application made with this tool-kit. It uses the API to create generic imagery from live sensor data collected with the suit. My aim is to explore whether donating personal data can create a positive, meditative effect in others, even though they aren't meditating themselves.

Why soft?

The title of the performance refers to the environmental psychology term soft fascination, coined by Kaplan and Kaplan as part of their attention restoration theory. In my own words: the theory describes how looking at natural phenomena, like waves on the water, captures your attention without causing any cognitive strain. That way the mind can restore and refresh. Meditation is all about attention, and I am looking for an easy way to capture visitors' attention and take them to a place of calm.
Trying to do this with meditation is, contrary to popular belief, quite hard work. So soft also refers to the gentle and playful way in which, I hope, a meditative state of mind is achieved.

Inspiration For Soft Meditation

How soft?

But how do I capture attention in a way that is calming and uplifting? I've read some articles (see the references below) about the affective properties of motion graphics and compiled an inventory of effects. For my goal it would be best to use slow, linear motion from left to right. I could then play with speed and waviness to create more intensity and interest, driven directly by the sensor data.
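To make that concrete: a minimal Processing sketch of this kind of motion might look like the one below. This is an illustration, not the performance code; speed and waviness are the two parameters the sensor data would later drive.

```
// Minimal sketch: a slow wave drifting from left to right.
// 'speed' and 'waviness' are the two parameters the live
// sensor data should eventually modulate.
float speed = 0.05;   // phase advance per frame: slow, calm drift
float waviness = 40;  // wave amplitude in pixels
float phase = 0;

void setup() {
  size(800, 400);
  stroke(255);
  noFill();
}

void draw() {
  background(0);
  beginShape();
  for (int x = 0; x < width; x++) {
    // subtracting the phase makes the crest travel left to right
    vertex(x, height/2 + sin(x * 0.02 - phase) * waviness);
  }
  endShape();
  phase += speed;
}
```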

Prototype design

For years I've been thinking about expressing my inner meditation state through a water metaphor. The movement of water is endlessly fascinating and mysterious, and to my mind perfectly suited to my intentions. I looked for inspiration online, which helped narrow down the choice of software environment.
After exploring various platforms, languages and libraries I ended up with good old Processing. I found this sketch online, which offered a nice starting point to build on, and started modifying it.

Exploring the Box Waves Processing sketch


Considering I wanted a complex and lively wave animation, I chose pitch (the nodding movement of the head), breathing (upper and lower), finger pressure and heart-rate as input sensors.
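The mapping itself is still in flux, but conceptually it comes down to something like this sketch, where the normalised placeholder values and the ranges are my current guesses, standing in for what the suit's API delivers live:

```
// Hypothetical normalised sensor readings (0-1); in the performance
// these arrive live through the tool-kit's API.
float pitch = 0.6;          // nodding movement of the head
float breath = 0.4;         // upper/lower respiration
float fingerPressure = 0.7; // pressure between the fingers
float heartRate = 0.5;      // heart-rate

void setup() {
  float speed     = map(breath, 0, 1, 0.01, 0.08);        // breathing drives drift speed
  float amplitude = map(heartRate, 0, 1, 10, 80);         // heart-rate drives waviness
  float frequency = map(fingerPressure, 0, 1, 0.01, 0.04); // pressure adds choppiness
  float offset    = map(pitch, 0, 1, -60, 60);            // head pitch shifts the wave
  println("speed " + speed + ", amplitude " + amplitude
    + ", frequency " + frequency + ", offset " + offset);
}
```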
SoftMed Prototype

Interaction with the audience

I have been thinking about how to make the performance multi-directional: I wanted to somehow include the audience in what is happening on the screen. What the audience and I share are the sounds in the room. So I decided to use the marker button provided with the suit to change the animation speed depending on the loudness of the sounds. My idea was that, over time, the audience would notice the relation between the sounds and the speed.
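As a sketch of the idea, with a key press standing in for the suit's marker button: each press marks an audible event and bumps the animation speed, which then eases back to its calm baseline, so frequent presses during a noisy passage keep the wave visibly faster.

```
// One marker press = one audible event in the room.
float baseSpeed = 0.02; // calm baseline animation speed
float speed = baseSpeed;

void setup() {
  size(400, 120);
}

void draw() {
  // ease back towards the calm baseline between presses
  speed = lerp(speed, baseSpeed, 0.01);
  background(0);
  // simple indicator: bar length shows the current animation speed
  rect(0, 50, map(speed, 0, 0.2, 0, width), 20);
}

void keyPressed() {
  speed += 0.05; // stand-in for the suit's marker button
}
```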

The first performance

I was invited to give a short presentation at the Human-Technology Relations: Postphenomenology and Philosophy of Technology conference at the University of Twente. Instead of a talk I decided to test my prototype. The performance could only last 5 minutes, so I had programmed the sound of a bell at the beginning and the end. I was facing the wall while the audience looked at a big screen above my head.


I was a bit nervous about how it would be to meditate in front of some 30 strangers. But once I sat down it was just like it always is: noticing my body (pounding heart) and mind.

I was less pleased with the demo effect: one sensor was not working properly (I still don't know why). This created hard-edged shapes and motion from right to left, the exact opposite of the intended animation.

I tried pressing the marker button whenever I heard something. But as the performance progressed the room became more and more silent, which I suppose is a sign that it worked, but not something I had counted on.

Measurements

I am of course interested in the effects of the performance, so I supplied the audience with the Brief Mood Introspection Scale (BMIS). Four sub-scores can be computed from the BMIS: Pleasant-Unpleasant, Arousal-Calm, Positive-Tired and Negative-Relaxed Mood. I asked the audience to fill it in before (as a baseline) and after the performance. Ten questionnaires were returned, of which six were complete and correct. I am working on the results and will report on them in a later post.

Reactions

I was pleased to hear that people were fascinated by the wave and tried to work out what it signified. People found the performance interesting and aesthetically pleasing. We discussed what caused the effects: the context, the staging of me sitting there and people wanting to comply, the animation or the silence? A lot of things to explore further!
One participant came up to me later and explained how much impact the performance had on him. He found it very calming: “Everything just dropped from me,” he explained. It also made him think about silence in his life and about looking inward more. This is all I can hope to achieve. I continue my research with new energy and inspiration.

The next version of the performance will be on show during the biggest knowledge festival of south Netherlands (het grootste kennisfestival van zuidnederland) in Breda on September 13th.

References
- Feng, Chao, Lyn Bartram and Diane Gromala (2016). "Beyond Data: Abstract Motionscapes as Affective Visualization." Leonardo, 50. doi:10.1162/LEON_a_01229
- Lockyer, Matt and Lyn Bartram (2012). "Affective Motion Textures." Computers & Graphics
- Piff, Paul K., Pia Dietze, Matthew Feinberg, Daniel Stancato and Dacher Keltner (2015). "Awe, the Small Self, and Prosocial Behavior." Journal of Personality and Social Psychology, 108, 883-899. doi:10.1037/pspi0000018

How to test a meditation wearable?

I suppose the answer to that question is obvious but not so easy to realise: during a retreat. But still, that is what I did. Last week I spent 6 days meditating while at the same time putting my brand new wearable and software platform to the test.

It was snowing outside while I was doing my 6-day retreat

What is it all about?

For those of you who missed it: for the past 3 months I've been working on the Meditation Lab Experimenter Kit. The focus of these first months has been on designing and developing a new Silence Suit wearable, improving the electronics and creating a software platform (the Data Server) to log and explore the data.
The whole team has been working really hard to get the prototype ready for single-user testing. It was quite exciting to put together all the different parts, which had been developed by different team members in separate locations. I managed to get everything working only just in time for the start of my self-conducted retreat.

Data science

The main goal was to gather as much baseline data as possible. At a later stage I will try to influence my meditation by manipulating the light. But to really see the effects I need insight into what my ordinary meditation data looks like. So German, our AI and data science expert, advised me to record as many 20-minute sessions as possible. I managed to do 54!
Things I wanted to know:
- Do all the sensors produce reliable data?
- How stable is the software platform?
- How easy is it to use the wearable and the platform?
- Will I enjoy using both?

Do all the sensors produce reliable data?

Getting good heart-rate data was the biggest challenge

Because I had been working with most of the sensors in my first prototype, I had a pretty good idea of what the data should look like. Programmer Simon had swiftly put together a script that plots the data from all the sensors in graphs, so I could easily grasp the main trends. It immediately became clear that the heart-rate sensor wasn't doing what I'd hoped: a lot of beats were missed, and once only 2 data points were collected in 20 minutes (and no, I was not dead).
Oddly enough the rest of the data was fine. I tried recharging the batteries and changing the ear clip, but nothing worked, and whether or not I'd get good data seemed unpredictable. Until the final day.
While looking at the graphs after I’d finished a session I casually rubbed my earlobe and it felt cold. I looked at the data and saw that the signal deteriorated towards the end of the session. Eureka! The blood flow to my earlobe was the problem, not the electronics.
Cold is a major influence but I also want to experiment with the tightness of the clip. It might prevent the blood from circulating properly.
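Spotting those missed beats in the logs is luckily straightforward: a dropped beat shows up as an inter-beat interval close to twice its neighbours. A minimal sketch of such a check, assuming inter-beat intervals in milliseconds (the hard-coded array stands in for values from a session log):

```
// Flag suspect inter-beat intervals (IBIs) in a session log.
// A missed beat typically shows up as an IBI much longer than
// the running average of the plausible beats around it.
float[] ibis = { 850, 860, 845, 1710, 855, 840, 2560, 850 };

void setup() {
  float avg = ibis[0];
  for (int i = 1; i < ibis.length; i++) {
    if (ibis[i] > 1.8 * avg) {
      println("Sample " + i + ": IBI " + ibis[i] + " ms, probably missed beat(s)");
    } else {
      avg = lerp(avg, ibis[i], 0.2); // update running average with plausible beats
    }
  }
}
```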
So most sensors performed well, better even than I'd hoped. Unfortunately no data is coming from the cute little PCB that one of the students at Design Lab designed and soldered. The soft sensor for detecting sitting down (which also acts as the start button) is also still unstable.

Force sensor to measure pressure between fingers

How stable is the software platform?

The software runs on my old Dell laptop, on which Simon has installed the Lightweight X11 Desktop Environment (LXDE). So it runs on Linux, which was a new experience for me. But I like it: it is basic and simple and does what it should. To start the system I have to run the server for data storage and the adapter for communication with the hardware. I must say I am very impressed with the overall performance. There has been no data loss, and the plots are great for getting an impression of a session.

Data output from one meditation session

How easy is it to use the wearable and the platform?

I was pleasantly surprised by how comfortable the suit is, even after 10 sessions in one day. Putting it on with attention takes about 2 minutes and then you're all set. You hardly notice that you are packed with 10 different sensors.
The pre and post qualitative forms are easy to use. At the moment I still have to use URLs to access certain functionality, but everything works, and that was such a relief. Plotting the data, with around 5000 data points per sensor per 20-minute session, is hard work for my old Dell. But it gives me time to do a little walking meditation…

Maybe it is just me, but I don't mind filling in two forms for every session. I seriously consider every question and try to answer as honestly as I can.
Doing two or three sessions in a row is even easier. All I have to do is refresh the home page of the server and I can start another session.

Will I enjoy using both?

Well yes, using the system was a pleasant experience for me. I did learn that I should not look at the data before filling in the post-meditation questionnaire, because the data caused my mood to plummet. So it will be best to show the data summary only after that has been done.

Session summary. The number of data points will be replaced by mean values.

I am confident that the system will be useful and will yield a lot of insights. There is still a way to go until I can actually automate the light actuation intelligently. But the plots did show variations, and now German can work his magic. I can't wait to see what he will come up with.

Introducing Silence Suit

Meditation stool with soft sensor and heart-rate sensor

For over a year I've been working on a meditation wearable. It measures biometric and environmental input. Its goal is to use the measurements to improve your meditation and to use the data to generate artistic visualisations. The wearable is part of a bigger project, Hermitage 3.0, a high-tech living environment for 21st century hermits (like me). Now that the wearable project is taking shape I'd like to tell a little about the process of creating it.

The sensors
I started with a simple but surprisingly accurate heart-rate sensor. It works with the Arduino platform. It uses an ear-clip and sends out inter-beat intervals and beats per minute at every beat. With some additional code in Processing I can calculate heart-rate variability. These are already two important measures that can tell a lot about my state while meditating. Then I added galvanic skin response (GSR) to measure the sweatiness of my skin, a nice indicator of stress or excitement. I added an analogue temperature sensor that I put on my skin to measure its temperature; low skin temperature also indicates a state of relaxation. I also made a switch sensor that is attached to my meditation stool. Sitting on it marks the start of a session, getting up marks the end.
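That Processing code is simple. A minimal sketch, assuming RMSSD (the root mean square of successive differences, one common HRV measure; the post's actual measure may differ) over inter-beat intervals in milliseconds:

```
// Heart-rate variability from inter-beat intervals (IBIs, in ms).
// RMSSD: root mean square of successive differences between beats.
// The array is a stand-in for the live IBIs from the ear-clip sensor.
float[] ibis = { 850, 860, 845, 870, 855, 840, 865 };

float rmssd(float[] intervals) {
  float sum = 0;
  for (int i = 1; i < intervals.length; i++) {
    float d = intervals[i] - intervals[i - 1];
    sum += d * d;
  }
  return sqrt(sum / (intervals.length - 1));
}

void setup() {
  println("RMSSD: " + rmssd(ibis) + " ms");
}
```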
All the sensors were connected with wires to my computer, but the aim was, of course, to make the setup wireless so I'd be free to move. Even so, I could already see day-to-day changes in my measurements.

A little help from my friends
As things were becoming more complex I posted a request for help in a Facebook group. A colleague, Michel, offered to help. We first looked at different ways to connect wirelessly. Bluetooth was a problem because it has a very short range. XBee also wasn't ideal because you need a separate connector. We also made a version that could write to an SD card on the device, but that of course doesn't offer live data, which was crucial for my plans. We finally settled on WiFi using the SparkFun ESP8266 Thing Dev. We were going to need a lot of analogue pins, which the Thing Dev doesn't offer, so we used the MCP3008 chip to add 8 analogue inputs.

Overview of all the sensors

More is more
We could then increase the number of sensors. We've added an accelerometer for neck position and replaced the analogue skin temperature sensor with a nice and accurate digital one. Around that time a wearable from another project was finished: a vest with resistive rubber bands that measures expansion of the chest and belly region. Using the incoming analogue values I can accurately calculate breath-rate and upper and lower respiration. Then it was time to add some environmental sensors. They give more context to, for example, the GSR and skin temperature readings. We've added room temperature and humidity, light intensity and RGB colour, and air flow.
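For the curious: the breath-rate calculation amounts to counting how often the slowly oscillating band signal rises through its own mean within a known time window. A minimal sketch with synthetic data (the sample rate here is an assumption, not the suit's actual rate):

```
// Estimate breath-rate from a chest-expansion signal by counting
// rising crossings of the signal's mean over a one-minute window.
// The synthetic sine stands in for the resistive band values.
int sampleRate = 5;      // samples per second (assumption)
int windowSeconds = 60;
float[] samples = new float[sampleRate * windowSeconds];

void setup() {
  // fake a calm breathing pattern of 6 breaths per minute
  // (small phase offset so the signal starts below its mean)
  for (int i = 0; i < samples.length; i++) {
    samples[i] = 512 + 100 * sin(TWO_PI * 6 * i / (float)(sampleRate * 60) - 0.1);
  }
  // mean of the window
  float mean = 0;
  for (float s : samples) mean += s;
  mean /= samples.length;
  // count rising crossings of the mean: one per breath
  int breaths = 0;
  for (int i = 1; i < samples.length; i++) {
    if (samples[i - 1] < mean && samples[i] >= mean) breaths++;
  }
  println(breaths + " breaths per minute");
}
```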

Vest with sensors

Environmental sensors

Seeing is believing
From the start I’ve made simple plots to get a quick insight into the session data. For now they don’t have an artistic purpose but are purely practical. At this point it is still essential to see if all sensors work well together. It’s also nice to get some general insight into how the body behaves during a meditation session.
Data is also stored in a structured text file. It contains minute-by-minute averages as well as means for the whole session.
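The plots themselves are nothing fancy; something in the spirit of the sketch below, where a synthetic trace stands in for a logged sensor:

```
// A bare-bones session plot: one sensor trace scaled to the window.
// Synthetic data stands in for a logged session here.
float[] data = new float[1200];

void setup() {
  size(800, 200);
  for (int i = 0; i < data.length; i++) {
    data[i] = 70 + 5 * sin(i * 0.05) + random(-2, 2); // fake heart-rate trace
  }
  noLoop(); // draw the plot once
}

void draw() {
  background(255);
  stroke(0);
  float minV = min(data);
  float maxV = max(data);
  for (int i = 1; i < data.length; i++) {
    float x0 = map(i - 1, 0, data.length - 1, 0, width);
    float x1 = map(i, 0, data.length - 1, 0, width);
    float y0 = map(data[i - 1], minV, maxV, height - 10, 10);
    float y1 = map(data[i], minV, maxV, height - 10, 10);
    line(x0, y0, x1, y1);
  }
}
```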

Session data plot with legend

I've also made a Google form to track my subjective experience of each session. I rate my focus, relaxation and perceived silence on a 7-point Likert scale, and there is a text field for a remark about the session.

Results from Google form: very relaxed but not so focussed…

Suit
I used the vest from the other project to attach the sensors to. But last week costume designer Léanne van Deurzen made a first sample of the wearable. It was quite a puzzle for her and her interns to figure out the wiring and positioning of every sensor. I really like the look of this first design. It fits the target group of high-tech hermits, and it is also very comfortable to wear.

Upper and lower part of the suit

Back with extension where soft sensors to detect sitting will be placed

The future
The next step will be adding sensors for measuring hand position and pressure, and a sound-level sensor.
Then we will have to make the processing board a bit smaller so it can fit into the suit. We can then start integrating the wiring and replacing it with even more flexible wires.
When all the sensors are integrated I can really start looking at the data and searching for interesting ways to explore and understand it.
I'm also looking for ways to fund the making of 15 suits. That way I can start experiments with groups and find ways to optimise meditation by changing the environment.