sleepGalaxy: kick off

Finally, I’ve started work on a piece that’s been on my mind for almost two years, ever since I met the nice people from Emfit at the Quantified Self conference. They kindly gave me their sensor in return for an artwork I would make with it.


Emfit QS sleep sensor

You put the sensor in your bed, go to sleep, and it wirelessly sends all kinds of physiological data to their servers: movement, heart rate, breathing rate. They combine all this data to calculate the different sleep stages. From the heart rate they’ve recently started calculating HRV and recovery. The latter is, to me, the best indicator of my sleep quality and of how energetic I feel.
Emfit offers a nice interface to explore the data and view trends.
emfitInterface

In sleepGalaxy I want to explore the relationship between sleep quality and the following variables: exercise, social and work meetings, calorie and alcohol intake, screen time, and overall happiness and stress during the day. My impression is that these have the most impact on my sleep, that is, on the sleep phases, the ability to stay asleep, and recovery.

Google form


To track the variables I’ve created a Google form that I fill in every night before I go to sleep. I’ve set an alarm on my iPad so I don’t forget.

Excel sheet with some of the Emfit data


firstNight

First circle visualisation

From all the Emfit data I’ll be using a subset. My first sketches focus on the sleep phases. I’ve spent a couple of hours programming the basic idea first: transforming the sleep phases into concentric circles, going from awake on the outside through light sleep and REM sleep to deep sleep in the centre.

The next step was to make sure the different phases are displayed correctly, representing the amount of time spent in each phase and the total time in bed. I’m programming in Processing and I’ve created a class called Night. After reading the Emfit Excel data in as a csv file, I loop through the rows and create a Night object for every night.
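The class itself lives in my Processing sketch, but the core idea fits in a few lines. Here is a minimal plain-Java sketch of it; the field and method names are my own shorthand, not the actual code:

```java
// Minimal sketch of the Night idea: one object per night, holding the time
// spent in each sleep phase (in hours) and mapping those durations to the
// radii of concentric circles, deep sleep innermost and awake outermost.
class Night {
    final float awake, light, rem, deep; // hours spent in each phase

    Night(float awake, float light, float rem, float deep) {
        this.awake = awake;
        this.light = light;
        this.rem = rem;
        this.deep = deep;
    }

    float totalHours() {
        return awake + light + rem + deep;
    }

    // Outer radius of each phase ring, ordered awake, light, rem, deep,
    // scaled so the whole night spans maxRadius pixels.
    float[] radii(float maxRadius) {
        float scale = maxRadius / totalHours();
        float deepR  = deep * scale;            // innermost circle
        float remR   = deepR + rem * scale;
        float lightR = remR + light * scale;
        float awakeR = lightR + awake * scale;  // outermost circle
        return new float[] { awakeR, lightR, remR, deepR };
    }
}
```

Drawing is then just a matter of painting the circles largest-first so the inner phases stay visible.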
Displaying the circles went fine, but the proportions between them just didn’t look right. I realised I had a conflict: the Emfit times are in hours and minutes (7.30 meaning 7 hours 30 minutes), but I was treating them as decimals. I wrote a little function that converts the minutes into a decimal fraction and adds it to the whole hours, so 7.30 becomes 7.5:
// Convert an hours.minutes string like "7.30" (7 h 30 min)
// to decimal hours (7.5).
float min2dig(String time) {
  String[] tmp = split(time, '.');
  float t = float(tmp[0]) + (float(tmp[1]) / 60);
  return t;
}

Now the basis of the visualisation is ready. The image below displays the sleep phases of the four nights in the Excel data above. I look forward to adding more data. To be continued…
firstNights

xbee hello world!

Today I had my first success communicating between two Xbees, mostly thanks to this simple but clear tutorial. Thijs and I installed the XCTU software a few months back, but I had forgotten quite a lot of his private class “introduction to Xbee”. On his instructions I ordered 7 Xbee antennas and one Xbee Explorer USB. My goal is to make an Xbee network without attaching an Arduino to every Xbee, which apparently is possible. But before getting to that point I had to make an Xbee “hello world” to grasp the concept and get the basics right.

In this picture you can see a light sensor attached to an Arduino and an Xbee antenna. The Arduino prints the measurements to the serial port. Through the TX and RX pins the Arduino is connected to the Xbee antenna, which sends the data to the other Xbee antenna acting as a receiver. The received data is printed in red in the XCTU terminal on the right.

On to the next step: running the Xbee on a battery and programming the Xbee pins to read and send the wind sensor data. To be continued…

server side

Last week I visited the nice people of Sense OS. They offer a very powerful platform for working with sensor data: CommonSense. I talked to them about hosting my breathing_time performance on their system. They are enthusiastic about the project and want to take on the challenges it brings.

Sense OS logo

The idea is that I make an account especially for breathing_time that will include the 5 wind sensors. The main challenges are synchronising the different data streams and the real-time aspect. As for synchronisation, all the participants will have to install Network Time Protocol (NTP) software. This ensures that all the computers agree on exactly the same time.

The data will be near real time. Every device will send an array with 5 data points every 500 milliseconds, and every second it’ll retrieve the 10 most recent data points for visualisation. We will use threads to make sure that hiccups in the connection or the network won’t disturb the flow of data. A local time stamp will be used to ensure every device always has the most recent data.
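I won’t guess at the CommonSense API here, but the client-side buffering idea can be sketched on its own: a small synchronised buffer that the sender thread appends timestamped points to, and that the visualiser polls for the most recent ones. All names below are illustrative:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of the client-side buffering idea (names are mine, not the
// CommonSense API): the sender thread appends locally timestamped data
// points, the buffer keeps only the most recent ones, and the visualiser
// takes a snapshot once per second.
class SampleBuffer {
    static class Sample {
        final long timestamp; // local time stamp in milliseconds
        final float value;
        Sample(long timestamp, float value) {
            this.timestamp = timestamp;
            this.value = value;
        }
    }

    private final Deque<Sample> samples = new ArrayDeque<>();
    private final int capacity; // e.g. 10, the points the visualiser wants

    SampleBuffer(int capacity) {
        this.capacity = capacity;
    }

    // Called from the sender thread: append a point, drop the oldest
    // on overflow so a network hiccup never grows the buffer unbounded.
    synchronized void add(long timestamp, float value) {
        samples.addLast(new Sample(timestamp, value));
        while (samples.size() > capacity) samples.removeFirst();
    }

    // Called from the visualiser thread: snapshot, oldest point first.
    synchronized Sample[] recent() {
        return samples.toArray(new Sample[0]);
    }
}
```

Because both methods are synchronised, the sender and the visualiser can run on separate threads without corrupting the buffer, which is exactly the hiccup-tolerance we’re after.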

This sounds like a sound strategy to ensure throughput and speed. Of course we’ll have to test how quickly the data is actually sent.

It’s been very nice talking to experts in this field. I feel the server side aspect of the performance is in safe hands with them.

wind sensor demo

Yesterday we managed to get the first wind sensor working. Together with Richard I connected the sensor to the RBBB board, calibrated it and read some wind/breath data from the serial port into Processing. It works surprisingly well:

I have already noticed that the distance from the sensor to the nostrils is critical; I’ll have to experiment. Richard had a nice idea: use tubes to direct the air from the nostrils straight to the sensor for the most accurate result. I will give it a try. It will make the wearable scarier, but that doesn’t have to be a problem :)

analogue input

Today I did a little test using analogue input for my LED. I built my third RBBB today and wanted to take the LED output a little further. For my breathing device it will have to respond to the analogue input of breathing, so I emulated that with a light sensor: covering the sensor more or less changed the brightness of the LED accordingly. For a start it was fine, but with breathing the LED will have to go from completely off to full brightness to give proper feedback on the respiration pattern.
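The feedback itself boils down to a linear remap of the sensor range onto the LED’s PWM range. Arduino has this built in as map(); the same arithmetic looks like this in plain Java, assuming the usual Arduino ranges of 0–1023 for the 10-bit ADC and 0–255 for PWM:

```java
// Remap a raw analogue reading (0..1023 on the Arduino's 10-bit ADC)
// to the LED's PWM range (0..255), the same arithmetic as Arduino's map().
class LedMap {
    static int sensorToLed(int raw) {
        int led = raw * 255 / 1023;
        // Clamp, in case the reading drifts outside the expected range.
        return Math.max(0, Math.min(255, led));
    }
}
```

To get the full off-to-bright sweep from breathing, the calibrated minimum and maximum of the breath sensor would replace 0 and 1023 here.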

breath sensor progress

During the TIK launch event I was fortunate to meet Christian Pointer. He’s an electronic engineer, programmer and hacker from Graz, Austria. He was very kind to help me improve the stretch sensor. With the few parts I had with me he managed to increase the output range from the sensor to the Arduino dramatically, making it much easier for me to work with the data and use it in different applications. The only things I have to do are replace the two pot meters with ones of a more suitable value and replace the simple opamp with a rail-to-rail one (a TLV274, for example).

Afbldng296
Scheme_merlin_sensor_kl

hardware integration

Buttons, temperature and humidity sensor


In the photograph you can see the 5 volt circuit. It houses two sensors and two buttons. At the workshop last Saturday I put together the button circuit: if you press the white one it means it smells good, the brown one indicates it smells bad. This is of course just a functional setup, purely for testing hardware and code. Later I’ll make soft buttons which will be integrated into the vest. We also made the humidity sensor (the kinked copper plate; we ran out of circuit board). It will take some time before I can calculate the actual humidity from the output and use it together with the ozone sensor. A period of extensive calibration is due anyway.
I also worked on the code, which I completed today. The Arduino now collects data from all the sensors (except the gas sensors, which somehow have stopped working at the moment) and outputs an ‘e’ when no button was pressed, a ‘+’ when the smells-good button was pressed and a ‘-’ when the smells-bad button was pressed. It works in a two-minute loop which is broken when a button is pressed; data is then immediately collected and sent.
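Stripped of the hardware, the reporting decision is a simple three-way choice. A plain-Java sketch of just that logic (the real code is an Arduino sketch, and the names here are mine):

```java
// Sketch of the Arduino's reporting decision: '+' for the smells-good
// button, '-' for the smells-bad button, and 'e' when the two-minute
// loop expires with no button pressed. Names are illustrative.
class SmellReport {
    static char reportChar(boolean goodPressed, boolean badPressed) {
        if (goodPressed) return '+';
        if (badPressed) return '-';
        return 'e'; // loop ran to completion without a press
    }
}
```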
As for the sending part, I’m still working on the Bluetooth connection between the Arduino and the phone.