Last weekend I took part in the first Dutch Science Hack Day in Eindhoven. I had posted my idea on the forum and was hoping for a nice group of experts to work with. The idea was to create a mood enhancer. When you're sad it could help you become happy again; when you're happy you could help others who are sad to improve their mood, or support them. It would consist of a) mood detection, b) mood changing, c) mood sharing.
On the forum one participant, Siddhesh (PhD student at the TU/e), had already expressed his interest. After I'd introduced my idea I was joined by Leonid and Huang-Ming, both industrial design students at the TU/e, and Ketan, also a PhD student at the TU/e. We were later joined by Iwan, an interior architect. So we had a nice mixed group from different countries.
I was pleasantly surprised at how swiftly we decided on the use case and technologies. Everybody was eager to start working in their field of expertise. We decided to use two hardware sensors (heart rate and skin conductance) to provide the level of arousal, and one online software sensor, face.com, which uses portraits to classify moods. The heart-rate sensor was already finished because we could reuse it from another project by Leonid and Huang-Ming, but there was still a lot of work to be done.
For output we wanted to do something with light and sound, as they are the least obtrusive when you're working. We wanted a physical object that would display the mood and also enhance it, and to use Twitter to share moods. We had difficulty deciding whether the visualisation should be purely personal feedback or should also display a friend's status. As time was limited we settled on just feedback. The application thus moved from enhancement to awareness of moods, which was enough for a single weekend.
I took on the task of implementing valence through the face.com API. It all had to be done in 24 hours, so that was pretty challenging. Registering at face.com was easy and the API was pretty straightforward. Only later did I discover that it could detect not just smiling versus not smiling but a whole set of moods: happy, sad, angry, surprised or neutral, each with a value and a confidence, based on the expression of the person in the photo. The faces.detect method also returned a lot of other information about the image, and the accuracy of the results was surprising, even under unfavourable lighting conditions. The main hurdle was uploading an image for face.com and keeping it in sync with the rest of the application. In the end we stored the webcam captures in the local Dropbox folder and let Dropbox sync them to its web version; the resulting URL and file name are used in the face.com request.
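For the curious, the flow might be sketched roughly as below. Since face.com shut down long ago, the endpoint, parameter names and response shape here are reconstructed assumptions, not the exact API; the point is building the request from the Dropbox URL and picking the mood with the highest confidence out of the response.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint; face.com is defunct, so this is illustrative only.
API_BASE = "http://api.face.com/faces/detect.json"

def build_detect_url(api_key, api_secret, image_url):
    """Build a faces.detect request URL; 'attributes=all' asks for mood info."""
    params = {
        "api_key": api_key,
        "api_secret": api_secret,
        "urls": image_url,          # the publicly synced Dropbox URL
        "attributes": "all",
    }
    return API_BASE + "?" + urlencode(params)

def strongest_mood(response_json):
    """Extract the detected mood value and confidence from a detect response."""
    tags = json.loads(response_json)["photos"][0]["tags"]
    if not tags:
        return None, 0
    mood = tags[0]["attributes"]["mood"]
    return mood["value"], mood["confidence"]

# Example response fragment (shape is an assumption):
sample = json.dumps({
    "photos": [{"tags": [{"attributes":
        {"mood": {"value": "happy", "confidence": 88}}}]}]
})
print(strongest_mood(sample))  # ('happy', 88)
```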
The others worked on building the Galvanic Skin Response sensor, the lamp object and the integration of the heart-rate sensor and software for the new purpose.
We used Processing as the main language to read the values from the sensors, connect to the web and drive the output. Each sensor writes its current value to a separate file, and one script reads all the sensor input to generate the visual output, change the colour and position of the lamp, and change the sound.
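The file-based exchange can be sketched as follows (in Python rather than Processing, with file names and the plain-number value format as my own assumptions): each sensor process overwrites its own file with the latest reading, and the main script polls all the files each frame.

```python
import os
import tempfile

# Hypothetical file names; each sensor owns exactly one file.
SENSOR_FILES = ["heartrate.txt", "gsr.txt", "mood.txt"]

def write_reading(directory, name, value):
    """A sensor process overwrites its file with the current value."""
    with open(os.path.join(directory, name), "w") as f:
        f.write(str(value))

def read_all(directory):
    """The main script reads whatever each sensor last wrote."""
    readings = {}
    for name in SENSOR_FILES:
        path = os.path.join(directory, name)
        if os.path.exists(path):
            with open(path) as f:
                readings[name] = float(f.read().strip())
    return readings

d = tempfile.mkdtemp()
write_reading(d, "heartrate.txt", 72)
write_reading(d, "gsr.txt", 0.43)
print(read_all(d))  # {'heartrate.txt': 72.0, 'gsr.txt': 0.43}
```

Overwriting one small file per sensor keeps the processes decoupled: a slow or crashed sensor simply leaves a stale value behind instead of blocking the main loop.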
The main application shows a changing, interactive landscape of lines and circles. The level of arousal and the corresponding valence determine:
- The position and colour of the circle. When you click on a circle the webcam image and heart-rate value are shown, allowing you to trace back how you felt during the day.
- The position and colour of the light object
- The sound being played
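A minimal sketch of this mapping, with ranges and colour scheme chosen purely for illustration (not the exact values we used): arousal drives the horizontal position, valence drives the vertical position and the hue.

```python
import colorsys

def circle_for(arousal, valence):
    """Map arousal (0..1) and valence (-1..1) to a screen position and an
    RGB colour. Canvas size and palette are illustrative assumptions."""
    x = arousal * 800                      # more aroused -> further right
    y = (1 - (valence + 1) / 2) * 600      # happier -> higher on screen
    hue = (valence + 1) / 2 * 0.33         # red (sad) -> green (happy)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (x, y), (int(r * 255), int(g * 255), int(b * 255))

pos, colour = circle_for(arousal=0.5, valence=1.0)
print(pos, colour)
```

The same pair of numbers can then drive the lamp's servo position and LED colour, so screen and object stay in step.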
Iwan made a nice presentation and we finished just in time. The presentation went well and the jury picked our design as the best in the Overall Happy Living category! That was the icing on the cake of a great and inspiring weekend.
As one of the winners we also presented at the Internet of Things event at the High Tech Campus in Eindhoven.
MADlab kindly supplied me with an artist residency to cover expenses.