Introducing Meditation Lab Experimenter Kit

For over a year I’ve been working on a wearable that will track physiological and environmental parameters during meditation. The idea is to improve the quality of your meditation by changing aspects of your environment, e.g. light, sound or temperature.

Silence Suit

In the spring of this year the opportunity arose to apply for an open call organised by the EU. The aim of the call is to generate knowledge about, and new applications addressing, the important issues facing wearable technology today: data ethics and sustainability. Teams consisting of artists/designers and technologists were invited to apply for the WEARsustain open call.

I’m happy to announce here that my project is one of the 23 winners. For the next 6 months I’ll work with a great team of experts to realize this project. Here’s what we’ll do.

DIY Science

We will create the Meditation Lab Experimenter Kit. This is a tool-set for studying, enhancing and sharing meditation experiences. The kit consists of a wearable and software. The main functionalities are:
1) Monitoring: A two-piece garment, Silence Suit, houses seven different biometric sensors and three environmental sensors.
2) Logging and analysing: A data server stores the data and allows the user to perform data analysis.
3) Influencing: The wearable is part of an Internet of Things ecology, allowing it to automatically optimise the environment for meditation.
4) Sharing: Live or logged data can be used to create custom output, in this case artistic visualizations that let others experience a meditation.

The development will be staged around experiments. I will conduct one-person meditation sessions in a controlled and customizable environment to explore the influence of light on meditation. Sensor data is combined with qualitative input about the session. The aim is to make 5 wearables so that I can test the results in group experiments.

DIY Sustainability

I want to make sustainability as easy as possible for the user. The hardware consists of off-the-shelf, low-cost and open-source sensors. This makes replacement easy. The battery and micro-controller container will be 3D printed, which allows for easy adjustment and replacement. All schematics and patterns will become open source. Users can keep working with the components and customize the suit.

Freeing Quantified Self

With regard to data ethics I believe that people have a right to own their data and that sharing should be opt-in only. That is why the software should function fully stand-alone, to protect the personal data. Basic statistical analyses let users explore their data and independently make sense of it. The kit democratizes doing scientific experiments and promotes data literacy.

Here’s a video I made together with Michel Gutlich about what we intend to do.

 

Don’t DI all Y

I realize that this is quite an ambitious plan for 6 months. That’s why I work with enthusiastic experts:
ProtoSpace will work on the dataserver.
Vera de Pont will design a new suit and sew the wearables in 3 different sizes.
Hans d’Achard will manage the system architecture and the technology of the software system.
Germán Bravo will provide expert knowledge and work on the machine learning.
Meike Kurella will be my intern for this period. She’ll be blogging about the process and helping out with all kinds of hands-on tasks (sewing, soldering and assisting with the experiments).

I’m very much looking forward to starting the project and learning how technology can support spirituality and health. Check this blog for the latest updates.



Maya cabin hackathon

Since this year my projects Meditation Lab and Silence Suit are part of the Hack the Body program initiated by the art-science lab Baltan. They want to combine different programs, so they suggested that Hack the Body should work together with people from the Age of Wonderland program.
That meant I could work with Branly again. I met him last year and that was a very impressive experience. Branly works with people using ancient Maya spirituality.
At the same time I could try out the Sensiks cabin. With this cabin you can create multi-sensory experiences. This is very similar to what I want to do in my Hermitage 3.0 project. (This will be a space where I can optimise meditation by changing the environment and influencing the senses.)
I brought my Silence Suit which already has a lot of working sensors. We could use the suit to log biometric and environmental data and see how they are influenced by the actuators in the Sensiks cabin.
The main aim of the hackathon was to explore if ancient Maya culture and rituals can be transferred to a high tech environment. The team members were David, Branly, Masha, later to be joined by Michel.

Day 1: exploring
The first afternoon Branly explained the Tuj/Temazcal. It is a small dome-like structure, heated by hot stones and steam, that is used in a purifying rebirth ritual. The experience resembles a sauna. The rebirth ritual is multi-sensory too: touch (temperature, rubbing with twigs and salt), smell (different herbs and resins), taste (hot drinks: herbal infusions, cacao, honey) and sound (the beating of a drum, like a heartbeat). Vision is mostly excluded; the Tuj is dark except for the red-hot glowing stones. We decided to take this as a starting point for building our experience.

Tuj/Temazcal Wikipedia image

The Tuj is located on a beach or in the woods. A quiet, relaxing space. The ritual isn’t limited to experience in the dome. Preparations start days before. The space around the dome is also part of the ritual. For example the structure has a low door so you have to get on all fours to enter. This immediately takes you back to your childhood.


Sensiks control panel photo by Masha Ru

The Sensiks cabin has lots of different actuators: smell, airflow, light, sound, temperature and VR. Everybody had a test ride. We all felt the cabin was rather clinical. We wanted to connect it to the environment. Make it part of a bigger ritual like the Maya rebirth ritual.

Day 2: concept development
Next day we were joined by other Hack the Body participants and hackers. One of them was Michel with whom I collaborate on the Silence Suit.
The whole group had a very interesting discussion about what an experience actually is and where it is experienced. Is it meaningful to recreate an experience that can never match the real thing? The most interesting would be to create something that can’t be experienced in the real world. We wanted to work on changing our state of mind through bodily experiences.


Another level of consciousness… Photo by Masha Ru

Day 3: design and experiments
The Maya team was joined by technology wizard Michel. We decided that we did not want to mimic the actual sensory experiences but try to induce a state of mind, another level of consciousness. We used these keywords as our guideline: womb, unknown, subconscious, abstract and random, rhythm. The next step was to translate these abstract concepts into an experience in the cabin. Actuators that we could use: smoke, heat, sound, red and blue lights.


Michel at work Photo by Masha Ru

In the womb the developing child experiences the heartbeat and breathing of the mother. In the rebirth ritual they use a drum to simulate that heartbeat. We wanted to use our own heartbeat and breathing, using live data from the Silence Suit. The Sensiks cabin would provide the feedback through sound and light and influence the user. We did little experiments to try out the effects of hearing your heartbeat and breathing, using smoke, scent, heating the cabin, using airflow, etc. It was promising.


Experimenting with sound Photo by Masha Ru

Day 4: building and presentation
We wrote a scenario for the ritual which started and ended outside of the cabin. Our aim was to slow the heart-rate by manipulating the feedback, just like the peaceful heartbeat of the mother quiets the unborn child. This is also a way to connect to the heartbeat of the cosmos.
From this came the idea to limit the experience to 260 heartbeats (the Maya sacred calendar counts 260 days). By slowing your heart-rate you can make the experience last longer. Four stages of 65 beats each would offer different experiences, aimed at first going inward and then returning to the outside again.


The ritual starts outside Photo by Masha Ru

The main challenge was to get the Sensiks and Silence Suit systems working together and to time the events to the user’s heart-rate. We didn’t even have time to test the final scenario.
One of the jury members agreed to be the guinea-pig. And even though we didn’t manage to manipulate the heart-rate feedback we could hear her heart-beat slowing down as she progressed through the experience. Later she described that she could turn inwards and let go of the world outside the cabin. This was exactly what we were aiming for.


Presenting “260 beats womb reset” Photo by Stellarc

Some conclusions
For me the “260 beats womb reset” experience was a proof of concept: you can actually change a state of mind through relatively simple means (light, sound, smell and airflow) using physiological data as input. An interesting insight is that it is important to make the experience bigger than the box: to create a larger ritual that is not isolated from the rest of the environment. The user must be lured and triggered to actually use the cabin; it must make sense in the context of life.

It was a great inspiration to work with Branly, David, Masha, Michel, Fred (the inventor of the Sensiks) and all the other participants. Michel did a great job of getting everything to work in time for the presentation and combining the systems. We’ve been able to create a spiritual experience using technology. It will be worthwhile exploring this further. I feel a step closer to realizing my Hermitage 3.0.

Edit >> In addition to this report there is an interview with me by Olga Mink from Baltan Laboratories all about the hackathon. Included there is a very nice video impression of the whole week.


Introducing Silence Suit


Meditation stool with soft sensor and heart-rate sensor

For over a year I’ve been working on a meditation wearable. It measures biometric and environmental input. Its goal is to use the measurements to improve your meditation and to generate artistic visualisations from the data. The wearable is part of a bigger project, Hermitage 3.0, a high-tech living environment for 21st century hermits (like me). Now that the wearable project is taking shape I’d like to tell a little about the process of creating it.

The sensors
I started with a simple but surprisingly accurate heart-rate sensor. It works with the Arduino platform. It uses an ear-clip and sends out inter-beat intervals and beats per minute at every beat. With some additional code in Processing I can calculate heart-rate variability (see the sketch below). These are already two important measures that can tell a lot about my state while meditating. Then I added galvanic skin response to measure the sweatiness of my skin, a nice indicator of stress or excitement. I added an analogue temperature sensor that I put on my skin to measure its temperature. Low skin temperature also indicates a state of relaxation. I also made a switch sensor that is attached to my meditation stool. Sitting on it indicates the start of a session, getting up marks the end.
All sensors were connected with a wire to my computer but the aim was, of course, to make it wireless so I’d be free to move. But I could already see day to day changes in my measurements.
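
A minimal Processing sketch of that HRV calculation could look like the one below. It computes RMSSD (the root mean square of successive differences between inter-beat intervals); the function name and demo values are illustrations, not the actual project code.

// Minimal sketch: RMSSD heart-rate variability from inter-beat intervals in ms.
// The names and demo values are illustrative, not the actual Silence Suit code.
float rmssd(FloatList ibis) {
  if (ibis.size() < 2) return 0;
  float sumSq = 0;
  for (int i = 1; i < ibis.size(); i++) {
    float diff = ibis.get(i) - ibis.get(i - 1);  // difference between successive beats
    sumSq += diff * diff;
  }
  return sqrt(sumSq / (ibis.size() - 1));        // root mean square of the differences
}

void setup() {
  FloatList ibis = new FloatList();
  float[] demo = {850, 870, 860, 880, 855};      // a few example IBIs in milliseconds
  for (float ibi : demo) ibis.append(ibi);
  println("RMSSD: " + rmssd(ibis) + " ms");
}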

A little help from my friends
As things were becoming more complex I posted a request for help in a Facebook group. A colleague, Michel, offered to help. We first looked at different ways to connect wirelessly. Bluetooth was a problem because it has a very short range. XBee also wasn’t ideal because you need a separate connector. We also made a version where we could write to an SD card on the device, but that of course doesn’t offer live data, which was crucial for my plans. We finally settled on WiFi using the Sparkfun Thing Dev ESP8266. We were going to need a lot of analogue pins, which the Thing Dev doesn’t offer, so we used the MCP3008 chip to supply 8 analogue inputs.


Overview of all the sensors

More is more
We could then increase the number of sensors. We’ve added an accelerometer for neck position and replaced the analogue skin temperature sensor with a nice, accurate digital one. Around that time a wearable from another project was finished: a vest with resistive rubber bands that measures expansion of the chest and belly region. Using the incoming analogue values I can accurately calculate breath-rate and upper and lower respiration. Then it was time to add some environmental sensors. They give more context to, for example, the GSR and skin temperature readings. We’ve added room temperature and humidity, light intensity and RGB colour, and air flow.
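
To give an idea of the breath-rate calculation, here is a rough Processing sketch that counts inhalations by detecting upward threshold crossings in the stretch-band signal. The threshold and sample values are made-up illustrations, not the suit’s real calibration.

// Rough sketch: count breaths by detecting upward threshold crossings
// in the chest-band signal. Threshold and demo samples are illustrative.
int countBreaths(float[] chest, float threshold) {
  int breaths = 0;
  boolean above = false;
  for (int i = 0; i < chest.length; i++) {
    if (!above && chest[i] > threshold) {  // rising edge = start of an inhalation
      breaths++;
      above = true;
    } else if (above && chest[i] < threshold) {
      above = false;                       // signal dropped, wait for the next breath
    }
  }
  return breaths;
}

void setup() {
  // synthetic demo signal covering two breaths
  float[] samples = {400, 450, 600, 650, 580, 450, 380, 420, 610, 640, 500, 400};
  println("breaths in window: " + countBreaths(samples, 512));
  // dividing by the window length in minutes would give breaths per minute
}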


Vest with sensors


Environmental sensors

Seeing is believing
From the start I’ve made simple plots to get a quick insight into the session data. For now they don’t have an artistic purpose but are purely practical. At this point it is still essential to see if all sensors work well together. It’s also nice to get some general insight into how the body behaves during a meditation session.
Data is also stored in a structured text file. It contains minute-by-minute averages as well as means for the whole session.
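
As a sketch of what such a log could look like, the snippet below writes tab-separated minute averages with Processing’s PrintWriter. The column names and numbers are made up for illustration; the real file format may differ.

// Illustrative sketch: writing minute-by-minute averages to a text file.
// Column names and values are examples, not the actual session format.
PrintWriter log;

void setup() {
  log = createWriter("session.txt");
  log.println("minute\tbpm\tgsr\tskinTemp");                // tab-separated header
  log.println(1 + "\t" + 62.4 + "\t" + 3.1 + "\t" + 31.8);  // averages for minute 1
  log.println(2 + "\t" + 60.9 + "\t" + 2.8 + "\t" + 31.6);  // averages for minute 2
  log.flush();
  log.close();
}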


Session data plot with legend

I’ve also made a Google form to track my subjective experience of each session. I rate my focus, relaxation and perceived silence on a 7-point Likert scale and there is a text field for remarks about the session.


Results from Google form: very relaxed but not so focussed…

Suit
I used the vest from the other project to attach the sensors to. But last week costume designer Léanne van Deurzen made a first sample of the wearable. It was quite a puzzle for her and her interns to figure out the wiring and positioning of every sensor. I really like the look of this first design. It fits the target group, high-tech hermits, and it is also very comfortable to wear.


Upper and lower part of the suit


Back with extension where soft sensors to detect sitting will be placed

The future
The next step will be adding sensors for measuring hand position and pressure and a sound-level sensor.
Then we will have to make the processing board a bit smaller so it can fit in the suit. We can then start integrating the wiring and replacing it with even more flexible wires.
When all the sensors are integrated I can really start looking at the data and look for interesting ways to explore and understand it.
I’m also looking for ways to fund the making of 15 suits. That way I can start experiments with groups and find ways to optimise meditation by changing the environment.

Pitch for The Big Date Hackathon

I was invited to pitch at a hackathon hosted by the GGD. The topic was: Data citizens: using quantified self to improve health? I got a lot of positive feedback on my pitch so I want to share it here.

I have a dream…
But then I wake up.
I’m lying in my bed, my Emfit QS sleep sensor has logged my sleep phases, heart rate and movements. Today’s sleep score is 86 points. But how did I sleep according to me? For one, I already feel quite stressed because of some issues at work.
I take my morning blood pressure reading and sure enough, my blood pressure has risen.
I hope some meditation will help. I put on my meditation monitoring gear and meditate for 30 minutes. Later I can see from my log that my heart rate came down. And I’m glad the whirlwind of thoughts has dropped.
Every morning I’m curious about my current weight. So I step on my Aria Wi-Fi scale, hmm. Yesterday I had a beer and some peanuts and it shows: weight has gone up by 0.4 kg and fat percentage by 0.1. But I can make a new start every day.
So let’s continue with a healthy breakfast: banana 83 gr, 74 kcal, orange 140 gr, 69 kcal, kiwi fruit, 75 gr, 46 kcal. After that a nice, warm oatmeal with extra fibre, apricots, flax seeds and soy milk: a total of 360 kcal.
Now I’m ready for work! My project timer logs the minutes I spend on different projects and the Workpace software makes sure I take my breaks on time.
After lunch (498 kcal) it is time for my walk in the afternoon sun. 4731 steps. Still more than 5000 to go.
In the evening, after a workout and a nice dinner, I check my energy balance: 1966 calories in and 1856 calories out. I try to burn a little bit more and take an evening stroll.

After some stretching exercises I head off to bed. And then I have a dream:
I’m travelling on a train. A nice, professional-looking lady takes the seat next to me. She says: “I’ve been watching you. I see you very often, almost every time I take the train. I’ve got a feeling I know you pretty well. I know you have a very conscious lifestyle: your diet is healthy, you take enough exercise and your BMI is perfect. I estimate your biological age to be around 12.5 years younger than your chronological age. But still, you sleep poorly from time to time and your fat percentage as well as your concentration during meditation fluctuate. Please let me tell you what you can do to further optimise your health.” She bends over and starts whispering in my ear. I can’t make out everything she says but a sense of insight, purpose and control fills me. I lean back in my chair and feel happy and relieved.

As we’re entering a tunnel she gets up and sits down opposite an elderly, overweight woman with a walking stick by her side. Slowly the young professional transforms into a kind granny as she takes out some knitting from her bag. She starts a conversation with the other woman, about arthritis if I’m not mistaken. Then I wake up.

I had a dream. In this dream all the fragmented pieces of data that I collect about my body and behaviour were translated into actionable information, explained to me in a language I can understand. I had insight into what my next steps should be and what path to follow to keep on track and to further improve my health. I received some true health wisdom.
Now I’m a media artist: I work with data, I program, I make visualisations and use statistics. But even for me it is not clear what actionable conclusions I can draw from my data. A visualisation doesn’t necessarily lead to insight, let alone advice on how to improve my lifestyle.

And look at the elderly lady. She got her information in a way that was appropriate for her. The oracle answered questions and gave advice fitting to this individual based on a deep understanding of all the data available.

But… it was a dream.
I challenge you to come up with solutions on how to combine data sets, generate knowledge from it and translate it into plans and advice people can really work with. Solutions that are transparent and respect the choices and privacy of the users.
I challenge you to make my dreams come true this weekend.


The big date hackathon, picture by MAD

 

working on numuseum

After a long time I’ve taken up the numuseum website. It’s been nagging me for ages that it’s so outdated and not working properly any more. I’m keeping it simple but will be implementing some new things.

I want to create a now part (“nu” means now in Dutch) and a museum part. Now always shows the most recent data. I’ll start off with a picture of the sky with time and location data. I will overlay that with personal data like mood and heart rate. The museum part will show the history of the now part in some interactive way.

I’ve found a cute, free font Jaapokki Regular that I’ll be using for the website.

The menu at the bottom gives access to the archive of net-art pieces, an about and contact page.

I’ve already started coding the sky part. I use a very neat FTP app (AndFTP) to send the sky pictures to the server. A PHP script sorts the pictures (most recent first) and grabs the date-time and location data from the EXIF headers.


sleepGalaxy: final design


Displaying different activities with the right duration and start time

There were still a couple of variables to visualise once the basic design was ready. I had to work on integrating my pre-sleep activity. In the end I used three activity types: sport, social and screen (computer and television). For the first two I’d logged duration by recording start and finish time. For screen time I just logged total duration because it was often scattered.
I was looking for a way to display all aspects (type, start, finish and duration) in a way that fitted with the nice, round shapes I’d been using so far. Then I realised the pre-sleep activities were recorded from 18:00h onwards, so the main circle could act as a dial. I could split up the space from 18:00 till 23:59 using the activity duration. I calculated the starting position of each activity as a degree on the dial and added the minutes the activity lasted. Using the arc shape with a substantial line thickness resulted in nice, bold strokes around my “night” circles. Each activity type has its own colour.
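
For readers who want to try something similar, here is a small Processing sketch of the dial idea: the start time (in decimal hours) sets the angle on the 18:00 to 24:00 dial and the duration sets the length of the arc. The colours and example activities are my own placeholders, not the actual sleepGalaxy values.

// Sketch of the activity dial: 18:00-24:00 mapped onto 360 degrees.
// Example activities and colours are illustrative.
void drawActivity(float cx, float cy, float diam, float startHour, float durationMin, color c) {
  float degPerHour = 360.0 / 6.0;                    // six hours span the full circle
  float startAngle = (startHour - 18) * degPerHour;  // position on the dial
  float endAngle = startAngle + (durationMin / 60.0) * degPerHour;
  noFill();
  stroke(c);
  strokeWeight(12);                                  // bold stroke around the night circle
  arc(cx, cy, diam, diam, radians(startAngle - 90), radians(endAngle - 90));  // -90 puts 18:00 at the top
}

void setup() {
  size(400, 400);
  background(255);
  drawActivity(width/2, height/2, 300, 19.5, 45, color(220, 120, 40));  // e.g. sport from 19:30, 45 min
  drawActivity(width/2, height/2, 300, 21.0, 90, color(60, 120, 220));  // e.g. screen time from 21:00, 90 min
}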


The final night design (rating still in green)

I was happy with the result, but the recovery line just looked plain ugly. I decided to use the same arc shape on the other side of the circle: the more recovery, the thicker the green stroke; the more negative the recovery, the thicker the red line.

Finally there was the subjective rating of the sleep. I think it is important to incorporate how the night felt to me. Emfit uses a star system from 1 to 5 stars. So I played around with stars, ellipses and other shapes but finally settled on simple golden dots. A five-star night has the fifth and biggest dot in the middle of the deep sleep circle, which seemed fitting.


UFO like rating design

When the individual nights were finished it was time for the overall poster design. I somehow had got it into my head that this would be easy, but it was quite hard to capture the look and feel I was aiming for. I wanted the poster to be simple so that the individual nights would stand out and make a nice “galaxy”. On the other hand I did want a legend and some explanation of what was on display.


Sketch of the poster design

My first idea was to go for a size of 70 x 100 cm, the nights would have a size of around 10 cm. This was too small for all the details to be visible. My final poster will be 91 x 150 cm. The nights are big enough and they all have enough space on the sheet while it is still possible to compare them. I found the nice, slim font Matchbook for the title, the legend and text. I’ll be sending the pdf to the printer next week.

Sleep statistics

Let me start with some characteristics of my sleep pattern. My mean actual sleep time is 7.19 hours, of which 20.4% is REM sleep, 60.1% light sleep and 15.7% deep sleep. According to the Emfit QS website my REM sleep is on the low end and my light sleep on the high end of what is needed for complete recovery. I suppose that’s why I often don’t feel really fit when I get out of bed. On average I spend 7.89 hours in bed.

I’ve been looking at the correlations between the sleep and context variables, using data from 35 nights. I’ve also included some other variables that I’ve measured during the same period. I’ll discuss some of the significant correlations I’ve found.

Correlations table

There are some surprises here. Eating in the evening doesn’t seem to be the healthiest thing to do. It lowers my HRV and prevents deep sleep. I’ve stopped eating after dinner.


Deep sleep in minutes. The graph makes very clear that having zero calories leads to the most minutes of deep sleep.

The effect of sleep on blood pressure was also an eye-opener. When I sleep better, my blood pressure goes down again.

My subjective sleep appreciation correlates positively and highly significantly with the sleep phases and with the time spent in bed as well as the time actually asleep. It has no correlation with deep sleep though. I’ve heard people say that deep sleep is the main determinant of their perceived sleep quality; for me it seems to be just the amount of sleep. To crank up my REM and light sleep I should allow myself to spend more hours in bed; there is a strong correlation.

All the other variables don’t seem to affect my sleep. This could be because they don’t occur very often or not every night. I’ve looked at overall stress and happiness; they don’t seem to be connected to any of the sleep parameters. Happiness is positively correlated with the minutes I work out. This is of course often demonstrated in research, but it was nice that it sneaked into this unrelated dataset.

Contrary to what I expected, the following variables have no significant bearing on my sleep phases: social activity, meditation and evening screen time. Meditation I usually do in the mornings, so I can imagine that the effect wears off. But screen time doesn’t affect my sleep, contrary to what is often claimed. Maybe that’s because I watch boring stuff ;-)

sleepGalaxy: design & calories


Design

I’ve been working on the overall design step by step, alternating between coding and looking. I want to incorporate my calorie intake after 6 PM. I’m not recording the times I ate and I suspect the calories influence my whole sleep, so the most logical position is to circle them all around the “sleep circles”. There is a lot of difference in daily intake after 6 PM, ranging from zero to 900 calories so far. I wanted to plot every calorie, so the dots would have to change size depending on the amount. I also wanted to spread the calories evenly around the entire circle. How to go about that? Fortunately I found this great tutorial. The code is deprecated and the feed doesn’t seem to work any more, but I managed to recycle the code concerning the plotting of the elements in a circle.


Plotting numbers instead of dots

The code uses translate and rotate, which (for me) are very hard concepts to grasp. So instead of using the dots in the design I used numbers to get insight into how the elements are placed on the screen.
By keeping the size of the calorie circle constant, you can already see relations between the sleep duration, the amount of calories eaten and recovery.
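
Below is a minimal Processing version of that approach: translate to the circle centre, then rotate a fixed step for every calorie and draw a dot at the same radius. The counts, radius and dot size are example values only.

// Minimal sketch: spread one dot per calorie evenly around a circle
// using translate and rotate. Radius, dot size and count are examples.
void drawCalories(float cx, float cy, float radius, int calories) {
  float step = TWO_PI / calories;    // angle between consecutive dots
  pushMatrix();
  translate(cx, cy);                 // move the origin to the circle centre
  noStroke();
  fill(200, 150, 0);
  for (int i = 0; i < calories; i++) {
    rotate(step);                    // turn one step per calorie
    ellipse(radius, 0, 4, 4);        // dot on the rotated x-axis
  }
  popMatrix();
}

void setup() {
  size(400, 400);
  background(255);
  drawCalories(width/2, height/2, 170, 360);  // e.g. 360 kcal eaten after 6 PM
}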


Evening with a lot of calories


Evening with fewer calories

In the design you can also see an eclipse. These are the stress and happiness values for the whole day; I record them by picking a number between 1 and 7 in the form at the end of the day. The mood is the bright circle. The stress circle covers it; how much brightness remains visible depends on the amount of happiness felt during the day. By vertically changing the position I can create a crescent, which can turn into a smile or a frown. The opacity of the black circle indicates the amount of stress. I’m coding this at the moment.
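
A possible starting point for that eclipse, assuming the 1-7 scores from the form, could look like the sketch below; the mapping ranges and colours are guesses, not the final design.

// Sketch of the mood/stress eclipse: a bright mood circle partly covered
// by a dark circle. The vertical offset is driven by happiness (1-7),
// the opacity of the dark circle by stress (1-7). Values are examples.
void drawMood(float cx, float cy, float diam, int happiness, int stress) {
  noStroke();
  fill(255, 220, 80);                            // bright mood circle
  ellipse(cx, cy, diam, diam);
  float offset = map(happiness, 1, 7, 0, diam);  // happier = more of the bright circle visible
  float alpha  = map(stress, 1, 7, 40, 255);     // more stress = more opaque cover
  fill(0, alpha);
  ellipse(cx, cy - offset, diam, diam);          // shifted dark circle creates the crescent
}

void setup() {
  size(300, 300);
  background(255);
  drawMood(width/2, height/2, 120, 5, 3);        // example: fairly happy, some stress
}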


sleepGalaxy: recovery

As I explained in my previous post I find the recovery measurement very useful. It seems a good representation of how rested I feel. It is calculated using RMSSD. The Emfit knowledge base explains it like this: “… For efficient recovery from training and stress, it is essential that parasympathetic nervous system is active, and our body gets sufficient rest and replenishment. With HRV RMSSD value one can monitor what his/her general baseline value is and see how heavy exercise, stress, etc. factors influence it, and see when the value gets back to baseline, indicating for example capability to take another bout of heavy exercise. RMSSD can be measured in different length time windows and in different positions, e.g. supine, sitting or standing. In our system, RMSSD is naturally measured at night in a 3-minute window during deep sleep, when both heart and respiration rates are even and slow, and number of movement artifacts is minimized…” Here is an example of how recovery is visualised in the Emfit dashboard:


Emfit dashboard

I looked for a way to integrate this measure in a way that fits with my “planet metaphor”. I’ve chosen a kind of pivot idea; it vaguely reminds me of the rings around planets.


Using the mouse pointer to enter different values of recovery

I thought it would be easy to just draw a line straight through the middle of the circles. I wanted it to tilt depending on the height of the score. It was harder than expected. I ended up using two mirroring lines and vectors. The starting point was the excellent book by Daniel Shiffman, The Nature of Code.


Integrating with circle visualisations.

Once I got the basics working, I went on to refine the way the line should look projected over the circles. Going up from the lower left corner indicates positive recovery, visualised by the green coloured line; the more opaque, the better the recovery. Negative recovery, of course, goes the other way around.
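
As a rough sketch of this pivot (with made-up score ranges and colours, not the actual sleepGalaxy values), the recovery could be mapped to a tilt and an opacity like this:

// Sketch of the recovery pivot: a line through the circle centre, tilted
// by the recovery score. Positive = green, rising from the lower left;
// negative = red, the other way. The -30..30 range and opacities are examples.
void drawRecovery(float cx, float cy, float len, float recovery) {
  float angle = map(recovery, -30, 30, QUARTER_PI, -QUARTER_PI);  // tilt follows the score
  float alpha = map(abs(recovery), 0, 30, 40, 255);               // stronger recovery = more opaque
  stroke(recovery >= 0 ? color(40, 160, 60, alpha) : color(200, 40, 40, alpha));
  strokeWeight(3);
  float dx = cos(angle) * len / 2;
  float dy = sin(angle) * len / 2;
  line(cx - dx, cy - dy, cx + dx, cy + dy);                       // mirrored around the centre
}

void setup() {
  size(400, 400);
  background(255);
  noFill();
  stroke(0, 60);
  ellipse(width/2, height/2, 250, 250);       // stand-in for a night circle
  drawRecovery(width/2, height/2, 300, 18);   // example: good recovery
}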


Slight recovery

There is a difference in the starting points from which the recovery is calculated. Sometimes my evening HRV is very high, which results in a meagre or even negative recovery. I might think of an elegant way to incorporate this in the visual. Maybe I have to work with an average value. For the moment I’m still trying to avoid numbers.


Almost maximum recovery


Negative recovery

sleepGalaxy: kick off

Finally I’ve started to work on a piece that’s been on my mind for almost two years, ever since I met the nice people from Emfit at the Quantified Self conference. They kindly gave me their sensor in return for an artwork I would make with it.


Emfit QS sleep sensor

You put the sensor in your bed, go to sleep and it wirelessly sends all kinds of physiological data to their servers: movement, heart rate and breath rate. From all this data together they calculate the different sleep stages. From the heart rate they’ve recently started calculating HRV and recovery. This latter value is, to me, the best indicator of my sleep quality and how energetic I feel.
Emfit offers a nice interface to explore the data and view trends.

In sleepGalaxy I want to explore the relationship between sleep quality and the following variables: exercise, social and work meetings, calorie and alcohol intake, screen time, and overall happiness and stress during the day. I’m under the impression that these have the most impact on my sleep, that is, on the sleep phases, the ability to stay asleep and recovery.


Google form

To track the variables I’ve created a Google form that I fill in every night before I go to sleep. I’ve set an alarm on my iPad so I don’t forget.


Excel sheet with some of the Emfit data


First circle visualisation

From all the Emfit data I’ll be using a subset. My first sketches focus on the sleep phases. I’ve spent a couple of hours programming the basic idea first: transforming the sleep phases into concentric circles, going from awake on the outside to light sleep, REM sleep and deep sleep in the centre.

The next step was to make sure the different phases are displayed correctly, representing the amount of time spent in each phase and the total time in bed. I’m programming in Processing and I’ve created a class called Night. After reading in the Emfit Excel data as a csv file I loop through the rows and create a Night object representing every night.
Displaying the circles went fine, but the proportions between the circles just didn’t look right. I realised I had a conflict working with minutes in a decimal context. I wrote a little function that converts the minutes into decimal values and then adds them to the whole hours:
// Convert a time string in "hours.minutes" notation (e.g. "7.30" = 7 h 30 min)
// into decimal hours (7.5) so durations can be compared and scaled.
float min2dig(String time){
  String[] tmp = split(time, '.');
  if (tmp.length < 2) return float(tmp[0]);  // no minutes part, just whole hours
  float t = float(tmp[0]) + (float(tmp[1]) / 60);
  return t;
}
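
To give an impression of how such a Night object might draw itself, a minimal version could look like the sketch below. The field names, grey values and demo durations are my own assumptions, not the actual sleepGalaxy class.

// Illustrative sketch of a Night object drawing its sleep phases as
// concentric circles: time in bed on the outside, deep sleep in the centre.
class Night {
  float inBed, light, rem, deep;  // durations in decimal hours

  Night(float inBed, float light, float rem, float deep) {
    this.inBed = inBed;
    this.light = light;
    this.rem = rem;
    this.deep = deep;
  }

  void display(float x, float y, float scale) {
    noStroke();
    fill(230); ellipse(x, y, inBed * scale, inBed * scale);  // total time in bed
    fill(170); ellipse(x, y, light * scale, light * scale);  // light sleep
    fill(110); ellipse(x, y, rem * scale, rem * scale);      // REM sleep
    fill(40);  ellipse(x, y, deep * scale, deep * scale);    // deep sleep (centre)
  }
}

void setup() {
  size(300, 300);
  background(255);
  Night n = new Night(min2dig("8.10"), min2dig("4.45"), min2dig("1.30"), min2dig("1.05"));
  n.display(width/2, height/2, 30);
}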

Now the basis of the visualisation is ready. The image below displays the sleep phases of the four nights in the Excel data above. I look forward to adding more data. To be continued…
The first four nights visualised