I’m starting a new data visualisation project. It uses some eight years of data from the numuseum website. In 2005 I started with a micro diary (255 characters per day), using custom software to update it every two hours. In 2006 my energy level followed, and in 2008 inner peace and stress were added. All could be updated every two hours using the custom software. There are almost 900 pictures and around 60 haikus.
All this data will be integrated in an offline visualisation: reversed calendar. This will take the form of an enormous tear-off calendar, where every leaf represents a day. There will be 2865 leaves in the calendar.
So I first have to get my head around the data sets. Luckily statistics wizard Marco Altini is helping to sort things out. He uses the very powerful program R, which can give you quick insight into correlations in the data. It was a bit embarrassing to discover how sloppy my data is. Notations have changed over the years, and errors cause my program to halt.
At the moment I’m designing the leaves and doing some initial data accessing. I started out with the micro diary. I use Processing and Java to read in the data and, at a later stage, to create PDFs for every day, which can then be printed and made into a real tear-off calendar.
My idea is to make a big loop going through all the days from 27-4-2005 till 01-03-2013. I use the Java GregorianCalendar class for that. The diary is in CSV format, consisting of a date and a text string. I then compare every date with the date string in the loaded data. It took some time to get the formatting right so the dates can be compared as strings. I now have the first data ready to be incorporated into the pages. The printed output looks like this:
maandag 25-02-2013 // date of the page
Things are looking clear today.<br />Why does everything go more slow then you hoped? // diary text
p. 2861 // page of the calendar
I hope I will learn the art of not worrying.<br />A nice conversation, new perspectives.
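The loop described above can be sketched roughly like this (a minimal sketch; the CSV lookup is left as a comment, and I’m treating the end date as exclusive, which gives exactly the 2865 leaves mentioned earlier):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

SimpleDateFormat df = new SimpleDateFormat("dd-MM-yyyy");
Calendar day = new GregorianCalendar(2005, Calendar.APRIL, 27);
Calendar end = new GregorianCalendar(2013, Calendar.MARCH, 1);
int page = 0;
while (day.before(end)) {
  page++;
  String key = df.format(day.getTime()); // e.g. "27-04-2005"
  // look up `key` in the loaded CSV data and render the leaf here
  day.add(Calendar.DAY_OF_MONTH, 1);
}
// page ends at 2865 here, one per calendar leaf
```

Formatting the incremented calendar with the same dd-MM-yyyy pattern as the CSV is what makes the string comparison work.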
I want to share a little bit of code with you that I’m not using in the application but which might come in handy sometime. It lets you compare a date string to the incremented date (calStart):
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.*;

// make a Date from the string (parse() can throw a ParseException)
String myDate = "30-11-2005";
DateFormat sdf = new SimpleDateFormat("dd-MM-yyyy");
Date d = sdf.parse(myDate);
// wrap it in a Calendar so it can be compared to calStart
Calendar tmpCal = new GregorianCalendar();
tmpCal.setTime(d);
boolean sameDate = !tmpCal.before(calStart) && !tmpCal.after(calStart);
I’ve been discovering the Flex development environment (Flash Builder) and its scripting language over the last couple of weeks. I’ve been doing the very good online course, which really gives you insight into the program and the object-oriented approach that is used. It’s nice to test my knowledge of OOP and see how Flex compares to Java. In the course they use the MVC pattern to set up projects. The whole environment is set up so that it is easy to separate data from interface and functionality. Design can be styled in separate stylesheets. In design mode you can quickly create an application by dragging and dropping components.
After having worked with Flash for over 10 years this is such an improvement. Much credit of course goes to Eclipse, the base on which the Builder is built. I also love the Network Monitor. It lets you monitor all incoming and outgoing data in three views, two of which are a tree view (XML nodes) and raw text. Binding is also an important new concept for me. It is used to bind data to UI components.
Flex can only take XML as input. Flash is a lot more flexible in that respect. So I have to format all the database data as XML, which isn’t all that difficult once you get the hang of it. Below is my first Flex project, using the data from the Collecting Silence database to determine the relationship between stress and silence. I’ve used a standard chart component to make a first visualisation and to try out the concepts.
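The conversion to XML itself can be as simple as concatenating nodes. A minimal sketch in Java (the field names silence and stress are placeholders for illustration, not the real database columns):

```java
// hypothetical rows: date, silence minutes, stress level
String[][] rows = {
  {"25-02-2013", "120", "3"},
  {"26-02-2013", "95", "2"}
};
StringBuilder xml = new StringBuilder("<days>\n");
for (String[] r : rows) {
  xml.append("  <day date=\"").append(r[0]).append("\">")
     .append("<silence>").append(r[1]).append("</silence>")
     .append("<stress>").append(r[2]).append("</stress>")
     .append("</day>\n");
}
xml.append("</days>");
```

The resulting document can then be bound straight to a chart component on the Flex side.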
Finally I can pick up the research on the correlation between silence and stress. This was of course the main goal of the Collecting Silence project, but I never got round to really diving into it. So I picked up where I left off two years ago.
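For the correlation itself, a Pearson coefficient over the paired daily values is an obvious first measure. A sketch with made-up numbers (not the real data):

```java
// toy arrays standing in for daily silence minutes and stress levels
double[] silence = {120, 95, 60, 30, 150};
double[] stress  = {1, 2, 3, 4, 1};
int n = silence.length;
double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
for (int i = 0; i < n; i++) {
  sx += silence[i]; sy += stress[i];
  sxx += silence[i] * silence[i]; syy += stress[i] * stress[i];
  sxy += silence[i] * stress[i];
}
// Pearson correlation coefficient, between -1 and 1
double r = (n * sxy - sx * sy) /
           Math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy));
// an r near -1 would suggest more silence coincides with less stress
```

This is the same kind of quick check R gives you, just done by hand in Java.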
I’ve worked on a sketch in Processing:
Blue = silence data, green = stress. I want to create a sort of landscape where the gaps between the two reveal the relation. I want to integrate this graph in an application where people can explore the data from the perspective of the correlation:
Rolling over the data lines displays the values and moves the map. You can pick a date and explore the data attached to that date.
I’m going to build the app in Flash/AS3 (as the website of the project is for a large part in Flash) and I’m trying to do it the OOP way again which is still quite hard for me.
The second performance at the TIK festival was very different from the first. The sound was on and everybody was present, according to the logs. But the animation wasn’t as nice. I realised later that this was due to poor data throughput: an installation was running that took up a lot of bandwidth at times. Not all the breath flows were visible. But it was still worthwhile, I suppose, judging from this nice picture by Annemie Maes:
I realised after both performances that this is only the start. I managed in a relatively short time to tackle all major hurdles, but there’s a lot to be improved and added. I understood from the participants and the audience that they find it exciting to breathe and create something together. So my idea of bringing people together through breath seems to work. I’d like to explore this further, and I’m considering turning this into an open source project and developing a kit that people can work with so they can join the community of breathers ;-)
The logs show even more differentiation than during the first performance:
Last week I visited the nice people of Sense OS. They offer a very powerful platform for working with sensor data: CommonSense. I talked to them about hosting my breathing_time performance on their system. They are enthusiastic about the project and want to take on the challenges it brings.
The idea is that I make an account especially for breathing_time that will include the 5 wind sensors. The main challenges are: synchronisation of the different data streams and the real-time aspect. As for synchronisation, all the participants will have to install Network Time Protocol (NTP) software. This will ensure that all the computers use exactly the same time.
The data will be near real time. Every device will send an array with 5 data points every 500 milliseconds. And every second it’ll retrieve the 10 most recent data points for visualisation. We will use threads to make sure that hiccups in the connection or the network won’t disturb the flow of data. A local time stamp will be used to ensure every device always has the most recent data.
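The buffering side of this scheme can be sketched as a small queue that keeps only the newest locally timestamped samples per device (the structure and numbers below are my own illustration, not the CommonSense API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

int capacity = 10; // the 10 most recent points the visualisation retrieves
Deque<long[]> samples = new ArrayDeque<>(); // each entry: {localTimestamp, value}

// simulate 15 incoming samples; only the 10 newest should survive
for (long t = 1; t <= 15; t++) {
  samples.addLast(new long[] { t, t * 10 });
  while (samples.size() > capacity) {
    samples.removeFirst(); // drop the oldest sample
  }
}
// the buffer now holds timestamps 6..15, newest last
```

In the real setup the sender thread would write into this buffer every 500 milliseconds while the retrieval thread reads it every second, so access to it would have to be synchronised.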
This sounds like a solid strategy to ensure throughput and speed. We’ll have to test, of course, to see how quickly the data is sent.
It’s been very nice talking to experts in this field. I feel the server side aspect of the performance is in safe hands with them.
Yesterday we managed to get the first wind sensor working. Together with Richard I connected the sensor to the RBBB board, calibrated it and read some wind/breath data from the serial port into Processing. It works surprisingly well:
I have already noticed that the distance from the sensor to the nostrils is critical. I’ll have to experiment. Richard had a nice idea: I should use tubes to direct the air from the nostrils straight to the sensor for the most accurate result. I will give it a try. It will make the wearable scarier, but that doesn’t have to be a problem :)
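On the software side, reading the sensor boils down to taking a raw value from the serial port and subtracting a zero-flow baseline measured during calibration. A sketch of that step (the line format and baseline value are assumptions, not the actual sensor output):

```java
// one raw line from the serial port: a 10-bit analog value as text (assumed)
String serialLine = "523";
int baseline = 500; // zero-flow reading taken during calibration
int raw = Integer.parseInt(serialLine.trim());
int breath = Math.max(0, raw - baseline); // clamp so still air reads as 0
// breath is now 23: a gentle flow over the sensor
```

In Processing the same logic would sit in a serialEvent handler, firing once per line from the RBBB board.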
I’m now processing all the video data for the silent portraits I filmed during my mute month. Video is clearly not my thing. I’ve already spent hours finding the right settings for the films. It’s hard not to get lost in the maze of codecs, fps and aspect ratio settings.
I’ve made a first sketch of a new visualisation I’m exploring. It’s been in my head for years: sharing a complete (or as much as possible) experience and not just pictures or texts. I want to integrate time with maps and add subjective data as an extra layer.
I’m using a wearable time-lapse camera that takes a picture every 3 seconds (thereabouts). For now I’m using a booklet with bookmarks in different colours to annotate the pictures with sensual experiences.
This example is just indoors, and I don’t use GPS. Things will get more interesting when I go outside and incorporate maps and pictures from different eras, for example. The idea is to get a rich, layered map that can go back from prehistory to our current time and record my really tiny specks of experience in history.
I’ve built an animation in Flash using ActionScript to add a clock. Here’s a filmstrip of different screen examples.
My first attempt at visualising the stress and noise data from the collecting silence (http://www.collectingsilence.org) project. An aim of the project is to discover if there’s a direct relation between silence and relaxation/well-being. In this cut-out of a large graph there seems to be some relation. (The longer and more transparent the line, the higher the value.)
I’m finally finished with the acquisition, parsing and filtering of the data. The non-visual data, that is. I’ve spent the last couple of days working on it. It’s pretty dull work, but essential for displaying the data correctly. So now I’ve got 14 exquisite files which hold all the data (date and time, heart-rate, GPS coords and activities) for all 14 days. My Flash app will only have to loop through them. Bit by bit I’ll work on the photographs, and I hope the whole work will be finished and online by next Wednesday.
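Looping through the files then comes down to splitting each line into its fields. A sketch (in Java here for illustration, since the app itself is in Flash, and with a made-up line format, as mine isn’t shown here):

```java
// a made-up line from one of the day files: date/time; heart-rate; GPS; activity
String line = "03-06-2010 14:32;78;52.3702,4.8952;walking";
String[] fields = line.split(";");
String dateTime = fields[0];
int heartRate = Integer.parseInt(fields[1]);
String[] gps = fields[2].split(",");
double lat = Double.parseDouble(gps[0]);
double lon = Double.parseDouble(gps[1]);
String activity = fields[3];
// the app does the same per line, then hands the values to the display
```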
An interesting book on working with data (which I unfortunately don’t have enough time to work through) is Visualizing Data by Ben Fry, yes, one of the developers of Processing. He’s a real genius at working with data; take a look at some of his projects.