Maya cabin hackathon

Since this year, my projects Meditation Lab and Silence Suit have been part of the Hack the Body program, initiated by the art-science lab Baltan. Baltan wants to combine different programs, so they suggested that Hack the Body work together with people from the Age of Wonderland program.
That meant I could work with Branly again. I met him last year and that was a very impressive experience. Branly works with people using ancient Maya spirituality.
At the same time I could try out the Sensiks cabin. With this cabin you can create multi-sensory experiences. This is very similar to what I want to do in my Hermitage 3.0 project. (This will be a space where I can optimise meditation by changing the environment and influencing the senses.)
I brought my Silence Suit which already has a lot of working sensors. We could use the suit to log biometric and environmental data and see how they are influenced by the actuators in the Sensiks cabin.
The main aim of the hackathon was to explore if ancient Maya culture and rituals can be transferred to a high tech environment. The team members were David, Branly, Masha, later to be joined by Michel.

Day 1: exploring
The first afternoon Branly explained the Tuj/Temazcal. It is used in a purifying rebirth ritual. It is a small dome-like structure that is heated by hot stones and steam; the experience resembles a sauna. The rebirth ritual is multi-sensory too: touch (temperature, rubbing with twigs and salt), smell (different herbs and resins), taste (hot drinks: herbal infusions, cacao, honey) and sound (the beating of a drum, like a heartbeat). Vision is mostly excluded: the Tuj is dark except for the red-hot glowing stones. We decided to take this as a starting point for building our experience.

Tuj/Temazcal Wikipedia image

The Tuj is located on a beach or in the woods. A quiet, relaxing space. The ritual isn’t limited to experience in the dome. Preparations start days before. The space around the dome is also part of the ritual. For example the structure has a low door so you have to get on all fours to enter. This immediately takes you back to your childhood.

Sensiks control panel photo by Masha Ru


The Sensiks cabin has lots of different actuators: smell, airflow, light, sound, temperature and VR. Everybody had a test ride. We all felt the cabin was rather clinical. We wanted to connect it to the environment. Make it part of a bigger ritual like the Maya rebirth ritual.

Day 2: concept development
Next day we were joined by other Hack the Body participants and hackers. One of them was Michel with whom I collaborate on the Silence Suit.
The whole group had a very interesting discussion about what an experience actually is and where it is experienced. Is it meaningful to recreate an experience that can never match the real thing? The most interesting would be to create something that can’t be experienced in the real world. We wanted to work on changing our state of mind through bodily experiences.

Another level of consciousness… Photo by Masha Ru


Day 3: design and experiments
The Maya team was joined by technology wizard Michel. We decided that we did not want to mimic the actual sensory experiences but try to induce a state of mind, another level of consciousness. We used these keywords as our guideline: womb, unknown, subconscious, abstract and random, rhythm. The next step was to translate these abstract concepts into an experience in the cabin. Actuators that we could use: smoke, heat, sound, red and blue lights.

Michel at work Photo by Masha Ru


In the womb the developing child experiences the heartbeat and breathing of the mother. In the rebirth ritual a drum is used to simulate that heartbeat. We wanted to use our own heartbeat and breathing, using live data from the Silence Suit. The Sensiks cabin would provide the feedback through sound and light and influence the user. We did little experiments to try out the effects of hearing your heartbeat and breathing, using smoke, scent, heating the cabin, using airflow, etc. It was promising.

Experimenting with sound Photo by Masha Ru


Day 4: building and presentation
We wrote a scenario for the ritual which started and ended outside of the cabin. Our aim was to slow the heart rate by manipulating the feedback, just as the peaceful heartbeat of the mother quiets the unborn child. This is also a way to connect to the heartbeat of the cosmos.
From this came the idea to limit the experience to 260 heartbeats (there are 260 days in the Maya ritual calendar). By slowing your heart rate you can make the experience last longer. Four stages of 65 beats each would offer different experiences, aimed at first going inward and then returning to the outside again.
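The stage logic is simple enough to sketch. The counts (260 beats, four stages of 65) come from the scenario above; the function names and the idea of driving everything from a running beat counter are illustrative only, not the code we used:

```python
# Sketch: map a running heartbeat count (0-259) to one of four
# 65-beat stages. The 260/65 split is from the scenario; the rest
# is an illustration, not the hackathon code.
STAGE_LENGTH = 65   # 4 stages x 65 beats = 260 beats
NUM_STAGES = 4

def stage_for_beat(beat):
    """Return the 0-based stage index for a given beat number."""
    if not 0 <= beat < STAGE_LENGTH * NUM_STAGES:
        raise ValueError("experience is over after 260 beats")
    return beat // STAGE_LENGTH

def stage_duration_seconds(bpm):
    """Wall-clock duration of one 65-beat stage at a given heart rate.
    The slower the heart, the longer each stage lasts."""
    return STAGE_LENGTH / bpm * 60
```

This also shows why slowing your heart rate stretches the experience: one 65-beat stage lasts a full minute at 65 bpm, and longer below that.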

The ritual starts outside Photo by Masha Ru


The main challenge was to get the Sensiks and Silence Suit systems working together and to time the events to the user's heart rate. We didn't even have time to test the final scenario.
One of the jury members agreed to be the guinea pig. And even though we didn't manage to manipulate the heart-rate feedback, we could hear her heartbeat slowing down as she progressed through the experience. Later she described how she could turn inwards and let go of the world outside the cabin. This was exactly what we were aiming for.

Presenting "260 beats womb reset" Photo by Stellarc


Some conclusions
For me the “260 beats womb reset” experience was a proof of concept: you can actually change a state of mind through relatively simple means (light, sound, smell and airflow), using physiological data as input. An interesting insight is that it is important to make the experience bigger than the box: to create a larger ritual that is not isolated from the rest of the environment. The user must be lured and triggered to actually use the cabin; it must make sense in the context of life.

It was a great inspiration to work with Branly, David, Masha, Michel, Fred (the inventor of the Sensiks) and all the other participants. Michel did a great job of combining the systems and getting everything to work in time for the presentation. We've been able to create a spiritual experience using technology, and it will be worthwhile exploring this further. I feel a step closer to realising my Hermitage 3.0.

Edit >> In addition to this report, there is an interview with me by Olga Mink from Baltan Laboratories, all about the hackathon. It includes a very nice video impression of the whole week.




Pitch for The Big Date Hackathon

I was invited to pitch at a hackathon hosted by the GGD. The topic was: Data citizens: using quantified self to improve health? I got a lot of positive feedback on my pitch so I want to share it here.

I have a dream…
But then I wake up.
I’m lying in my bed; my Emfit QS sleep sensor has logged my sleep phases, heart rate and movements. Today’s sleep score is 86 points. But how did I sleep according to me? For one thing, I already feel quite stressed because of some issues at work.
I take my morning blood pressure reading, and sure enough, it has risen.
I hope some meditation will help. I put on my meditation monitoring gear and meditate for 30 minutes. Later I can see from my log that my heart rate came down. And I’m glad the whirlwind of thoughts has dropped.
Every morning I’m curious about my current weight, so I step on my Aria Wi-Fi scale. Hmm. Yesterday I had a beer and some peanuts and it shows: weight has gone up by 0.4 kg and fat percentage by 0.1. But I can make a new start every day.
So let’s continue with a healthy breakfast: banana 83 gr, 74 kcal, orange 140 gr, 69 kcal, kiwi fruit, 75 gr, 46 kcal. After that a nice, warm oatmeal with extra fibre, apricots, flax seeds and soy milk: a total of 360 kcal.
Now I’m ready for work! My project timer logs the minutes I spend on different projects and the Workpace software makes sure I take my breaks on time.
After lunch (498 kcal) it is time for my walk in the afternoon sun. 4731 steps. Still more than 5000 to go.
In the evening, after a workout and a nice dinner, I check my energy balance: 1966 calories in and 1856 calories out. I try to burn a little bit more and take an evening stroll.

After some stretching exercises I head off to bed. And then I have a dream:
I’m travelling on a train. A nice and professional-looking lady takes the seat next to me. She says: “I’ve been watching you. I see you very often, almost every time I take the train. I’ve got a feeling I know you pretty well. I know you have a very conscious lifestyle: your diet is healthy, you take enough exercise and your BMI is perfect. I estimate your biological age to be around 12.5 years younger than your chronological age. But still, you sleep poorly from time to time and your fat percentage as well as your concentration during meditation fluctuate. Please let me tell you what you can do to further optimise your health.” She bends over and starts whispering in my ear. I can’t make out everything she says but a sense of insight, purpose and control fills me. I lean back in my chair and I feel happy and relieved.

As we’re entering a tunnel she gets up and sits down opposite an elderly, overweight woman with a walking stick by her side. Slowly the young professional transforms into a kind granny as she takes out some knitting from her bag. She starts a conversation with the other woman, about arthritis if I’m not mistaken. Then I wake up.

I had a dream. In this dream all the fragmented pieces of data that I collect about my body and behaviour were translated into actionable information, explained to me in a language I can understand. I had insight into what my next steps should be and what path to follow to keep on track and to further improve my health. I received some true health wisdom.
Now, I’m a media artist: I work with data, program, make visualisations and use statistics. But even for me it is not clear what actionable conclusions I can draw from my data. A visualisation doesn’t necessarily lead to insight, let alone advice on how to improve my lifestyle.

And look at the elderly lady. She got her information in a way that was appropriate for her. The oracle answered questions and gave advice fitting to this individual based on a deep understanding of all the data available.

But… it was a dream.
I challenge you to come up with solutions for combining data sets, generating knowledge from them and translating it into plans and advice people can really work with. Solutions that are transparent and respect the choices and privacy of the users.
I challenge you to make my dreams come true this weekend.

The big date

The big date hackathon, picture by MAD


Quantified Self Europe conference 2015

As always, I was very much looking forward to the conference: the program looked promising, I hoped to meet QS pals, and I was giving an Ignite talk and testing my Virtual View installation with updated software (see below). This is an account of the most striking things I heard and saw.


The how-to sessions were new. I suppose they’re great for subjects that are limited in scope, like the one on meditation tracking by Gary Wolf. The idea that just tracking the time and duration of your meditation sessions can give you insight into how your life is going was refreshing. I’ve got an idea to automatically log my sitting periods, and this session has given it a new boost.

There were some sessions on HRV. I went to the one Marco Altini gave together with Paul LaFontaine. I got some useful information on the two modes of tracking: PPG (60 seconds in the morning) or situational tracking. Both have their specific uses. The Polar H7 belt is most reliable and comfortable for the latter as you can wear it for long periods. It was nice to see how Paul did many experiments combining datasets of activities (e.g. phone logs) with HRV data. The session was a nice intro, but I would have liked more hands-on information. I did talk with Marco later during the office hour: if I just want to measure global, all-day changes in heart rate, a device like the Fitbit Charge HR would also do. Marco was wearing one and was satisfied with it. He’s the expert, so it’s on my wish list…

I really liked that the show & tell talks were programmed on their own. It gave a lot less choice anxiety. The one on speed-reading by Kyrill Potapoc was a real revelation; I’ve already installed the Spritzlet browser extension. As a dyslexic, I welcome any improvement in my reading speed.
I also enjoyed the way Awais Hussain approached existing datasets to gain insight into causal chains and decision points, all in aid of getting the best start for the future. I think it is a poetic approach.

I skipped one breakout to stroll around the tables during the office hour. This made me very happy. Emmanuel Pont has developed the Smarter Timer app. It lets you track your activities at room level using differences in the strength of Wi-Fi networks. It is a learning app, so you can teach it your activities in certain places. A desktop app will also track your software use. Exactly what I need! And a big improvement on the piece I did way back in 2008, “Self portrait @ home“. (I scanned QR codes every time I entered a room.)
I also had a nice chat with Frank Rousseau from Cozy, an open source platform that allows you control over your own data. It offers similar functionality to the Google suite (mail, calendar, file sharing, etc.). I’m trying it out at the moment and hope that I’ll be using it on my own server one day.

Ellis Bartholomeus

Ellis Bartholomeus told a very refreshing story about her hand-drawn smileys. She treated the little drawings as data and discovered a lot about her moods. It was nice to watch the different stages of her process of getting to grips with which icons to use and how to interpret them.
Jakob Eg Larsen shed some interesting light on one of my favourite topics: food logging. I liked the simplicity of his approach to just photograph one meal a day, his dinner. It was funny how he struggled with the aesthetics of the food. It made me wonder: how much do the colours of your food tell you about their nutritional value?
One of the most amusing and at the same time most personal talks was by Ahnjili Zhuparris. She was looking for correlations between her menstruation cycles and other aspects of her life, like music and word choice. Not all clichés appear to be true. The topic of female cycles caused some complaining among a few of the male attendants; moderator Gary Wolf dealt with that in a compassionate but effective way. I was very impressed.

Jakob Eg Larsen

Reactions to the Virtual View installation

During the Office hour and at the end of the day people tried out the installation. I had 14 users in total. Of course I logged some general data ;-)
I logged the baseline heart rate and the lowest heart rate achieved during the experience after the baseline was set. The mean heart rate is calculated over each animated wave; a wave lasts 7.5 to 13.75 seconds depending on the frequency spectrum data. The mean baseline heart rate was 79.68 and the mean lowest heart rate was 68.01. The difference between these two means is significant. There was quite some variation between users: the maximum heart rate during baseline was 96.08 and the minimum 54.76, a big difference of 41.3. The lowest pulse during the experience varied between 80.07 max. and 50.45 min., a difference of 29.6.
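The arithmetic behind these figures is easy to reproduce. A minimal Python sketch, using only the two group means from the log (the per-user values aren't included here; note that an average of per-user reductions can differ slightly from the reduction computed between group means):

```python
# Sketch: percent heart-rate reduction from baseline to lowest value,
# using the two group means quoted above.
def reduction_percent(baseline, lowest):
    """Relative drop in heart rate, as a percentage of the baseline."""
    return (baseline - lowest) / baseline * 100

mean_baseline = 79.68   # mean baseline heart rate over the 14 users
mean_lowest = 68.01     # mean lowest heart rate during the experience
print(round(reduction_percent(mean_baseline, mean_lowest), 1))  # ~14.6
```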
For me it was good to see that even in relaxed circumstances using Virtual View results in a reduction of heart rate. Every user showed a reduction; the average reduction was 14%, with a maximum of 32%!

Still from the animation

I’m really happy to have received valuable feedback. These are some of the remarks that stood out. Overall, users really liked the installation and found it relaxing. A couple of people expected an end to the animation, but a view doesn’t have a beginning or an end. I should find a way to make it clearer that people can leave at any time.
Even though I’ve improved the feedback on heart rate some people would still like a little more information about it. For example their baseline measurement at the start of the animation.
Daycare for difficult children, and people under stress such as refugees, were suggested as use cases.
One of the users said it would be nice to have sheep on the hills. I really like that idea. They shouldn’t be too hard to draw and animate. Their moving speed could for example also give an indication of heart rate.
There were some requests for Virtual Reality devices but I still don’t think this is a suitable technology for patients in healthcare institutions, the main target group.

Apart from the content, there’s always the social aspect, which makes the QS conferences such great experiences. People feel uplifted by the open and tolerant atmosphere and the shared sense of learning. I can’t wait for the next conference to come to Europe.

Quantified Self Conference Europe 2014

For the third time I’ve visited the Quantified Self Europe conference in Amsterdam. I had been looking forward to it but was also a bit nervous because I was asked to take part in a panel discussion on Sunday morning. I felt very honoured, of course, to have been asked to talk about my Reversed calendar project, which I finished last year. The discussion topic was long-term tracking. Apparently it is not something a lot of people have done: we got a lot of questions and hardly any experiences from the audience. The talk went well and it was nice to hear the other speakers. Especially Alberto is quite a die-hard, logging the craziest things. He’s also an artist, and it is interesting to see the different approach artists take to collecting data about themselves. The starting point that you use your personal data as material to make stuff is so different from other approaches. The goal is not to improve but to become aware and study yourself through the collecting, more than through the actual interpretation of the data.

QSEU14 talk

Here’s what I did:
Grief and Mood Tracking (Breakout session)
Whitney Erin Boesel, Dana Greenfield
What happens when you’re tracking, but not looking to change how you feel? Join us to discuss the ways we can use different techniques to work through the process of loss and grief.
Dana gave a very moving and inspiring opening talk about how she is tracking the memory of her mother who passed away recently. She used simple tools like a Google form and pictures to log things that reminded her of her mother. I’d already decided before that I’d like to join this breakout as I’ve made a cd-rom about the death and remembrance of my mother and my grieving process back in 2001.

Someone suggested that it would be interesting to track how the grieving network around you changes as time moves on. For me, the reason to make this cd was partly a lack of such a network…
The question came up whether someone had experienced grieving both with and without tracking. I was in the unfortunate position of having experienced both. It was quite a discovery for me that making an art piece was much more helpful in the grieving process than just tracking my mood. The latter was just a confirmation of my sadness, while in the art-making process I could transform it into something beautiful that I could share.

Ignite Talks
Washing My Eyelids
Steve Dean
Steve will demonstrate how he used self-tracking tools to get under atopic dermatitis.
Tracking his eyelid inflammation was useful to him in talking to his doctor but didn’t yield any insights on its own. This was an interesting talk because of the frustrating process Steve was tracking and the way he kept going in spite of the lack of results.
Analyzing Changes in My Weight and Sleep
Kouris Kalligas
Kouris spent thirty hours combining his multiple data streams into one place, and learned what influenced his weight and sleep.
What was interesting for me here was the thoroughness with which Kouris had looked for correlations between the things he tracked. He also made a list of expected findings at the beginning of his quest and compared these with the outcomes of his analyses. One finding intrigued me: a higher fat percentage during the day led to better sleep. I think that might have to do with feeling more satiated and therefore eating earlier. I’m going to do a little experiment myself on the correlation between food and beverage intake, working late, and sleep quality (see below).
Fit 50s Sound 60s
Maria Benet
Maria has been tracking for almost 10 years, developing strategies for improving and maintaining her health as she ages.
I really enjoyed this talk by Maria. First of all because she’s not your usual QS suspect. I found it very refreshing to hear a story by someone who discovered self-tracking step by step because she wanted to lose weight and become fit again. Here was this somewhat older lady talking about all these apps and devices with a lot of knowledge from experience. I liked the Excel sheet in which she manually annotated and combined different measurements to gain more insight. A quote from Maria that I wrote down: small habits add up to a big impact in the long run.
A Testosterone and Diet Experiment
Maximilian Gotzler
Blood tests showed Max he had low levels of Vitamin D and Testosterone. Could diet changes help?
What I liked about this talk was the thoroughness with which Maximilian tried to tackle his deficiencies. He had all kinds of blood tests done that I didn’t know existed. Would I be able to afford them?

Photo Lifelogging as Context for QS Practice
Cathal Gurrin, Niclas Johansson, Rami Albatal
Learn how to use computer vision to extract metadata from lifelogging photos, enrich a photo timeline with other personal data, and draw insights from massive longitudinal photo collections.
I’ve been thinking a lot about easy food logging and behaviour tracking through pictures. It would make my life so much easier if these things could be automated, so I was really happy when I read this was the topic of one of the breakout sessions. It was a very interesting but sobering talk. No way am I going to write my own program or app to log my food or extract activity from a picture: it takes the experts a _very_ long time to write classification algorithms for every object, and it all has to be annotated by human hand.
But fortunately they are open to collaboration. I think automated food and calorie logging will be very big. So I offered to work on the annotating if it can eventually lead to my food being logged with the right amount of calories while I eat! They were also interested in my behaviour-tagged pictures from the north-southfeeling project. So if they’re helpful, I’m happy to share them.

Sensing Smell
Jenny Tillotson
Scent has the power to profoundly affect our psychology and physiology. Learn about the state of the art in smell tracking, interpretation, and use.
Smell is something I’ve been interested in for a long time. I’ve used it in the AQAb wearable of course. But for me personally smell is also very important. Jenny is a designer of wearables who is really deep into everything about smell. She’s working on a device that can reduce stress and improve sleep through scent. Being an academic she has the opportunity to work with lots of experts in the field, I envy that sometimes. As an artist you have to do so much on your own.
A lot of aspects of scent and smell still remain a mystery. Digitising scent is still far off. I asked her about enhancing meditation with scent. She said there has been interest in that lately, in the realm of mindfulness, and she will e-mail me some pointers on where to start. Great!

Neuroscience & EEG
Martin Sona
This was an unplanned breakout session with neuroscientist Martin Sona on the latest developments in devices and applications for the QS community.
Martin is a really nice and accessible guy. I knew he had a lot of knowledge on open source EEG but I had no clue he was a neuroscientist working as a researcher at the university of Maastricht.
I’ve been looking for an easy way to capture brain data. I was very enthusiastic about the TrueSense wearable bio-sensor kit that was at the QS conference last year, but I couldn’t really work with it: I couldn’t figure out how to get to the live data and it was very hard to interpret the streams. Martin has been collaborating with them and made a patch in BrainBay, an open source bio- and neurofeedback application that can be used with the TrueSense kit. Wow, looking forward to trying that out. Martin is looking for ways to place the sensor at different sides of the head. I will look into that for him; I want to integrate it in a wearable anyway.

On top of all these inspiring talks and exchanges I was lucky to make contact with a lot of people: some from companies, some just participants, some I’d met before, others new. There’s a lot of time to talk to people, and the insights you get from them, and hopefully give to others, are just so rewarding.

And I’ve made an ‘impulse’ purchase. It wasn’t really an impulse, as I slept on it, but for me it is quite something to buy something over 100 euros without weeks of deliberation. I’ve ordered an Emfit device. It’s a sleep tracker that can distinguish between different sleep phases and track heart and breathing rates while you sleep. It can even do heart-rate variability. They’re working on a downloadable CSV file of your data and an API. All data is sent wirelessly from a non-contact device under your bed sheets. I’ve wanted a sleep tracker for years. Can’t wait to try it!

Photo Ian Forrester

Finally, there was quite a distinct buzz about empathy and including others in your tracking. Kaiton Williams gave an interesting opening speech in which he mentioned tracking for empathy. I’ve always wanted to inspire and give to others with my tracking by transforming it into art, but I’m looking for ways to make it more concrete. Quite a few people came up to me to talk about the subject. I might even do something to improve animal welfare using the breathCatchers. It is good to see that others are also looking for ways to reach out and share more.

All in all a very, very inspiring and uplifting experience. I’m already looking forward to next year.

library atmosphere

The past couple of weeks I’ve been working on an assignment for the municipal library. The task was to let people present their views on the library of the future. To that end we created an area with seats, a bar, a touch table and lights. By touching a picture on the screen, visitors could select a different atmosphere, which changed the colour of the lights at the same time. The choices of the visitors were logged to a file. The installation was presented during the Cultuurnacht (culture night) in the city of Breda, the Netherlands.

My task was to make the interactive application and drive the lights. I’ve been wanting to experiment with interactive lighting so I can apply it in my Hermitage 3.0 project, so for me it was a great opportunity to learn about it. And learn I did.

My idea was to work with the Philips Hue. They have a great API and an active community. But due to budgetary restrictions I had to work with an alternative: Applamp, also known as Milight. The concept is the same: a wifi-connected bulb changes colour and brightness in response to codes sent to a local network port opened by a small wifi box. Applamp also has a phone app to control the lights and a very basic API.

I had wanted to start working on the application before Christmas, but that ideal scenario didn’t work out: the bulbs arrived mid-January… The first task was to connect to the lights using the app. It appeared that my Android phone was too old for the app to work, so I had to borrow my neighbour’s iPad. The bulbs can be programmed into groups, but you have to follow the communication steps exactly, otherwise it won’t work.

Applamp with iPad app

Once the bulbs were programmed, I thought it would be easy to set up a simple program switching a bulb on and off. I’d found a nice Python API and some other examples in different languages, none in Java or Processing though. I used Processing because I wanted a nice interface with pictures and a full-screen presentation, and to log the actions to a file.

I tried and tried, but the UDP socket connection wasn’t working. So the biggest thing I learned had to do with networking. I received a lot of help from Ludwik Trammer (Python API) and Stephan from the Processing forum. The latter finally managed to retrieve my local IP address and the port for the Milight wifi box, which was all I needed. (You actually don’t need the precise IP address: sending to .255, the broadcast address, is good enough.) The lighting technician Jan showed me a little app called Fing that makes it super easy to get insight into all the things connected to your local network.
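The broadcast trick can be illustrated outside Processing too. A minimal Python sketch, assuming the standard Milight port 8899 (the one used in the Processing snippet further down) and an example subnet prefix:

```python
# Sketch: send a raw Milight command via UDP broadcast, so only the
# local subnet needs to be known, not the exact IP of the wifi box.
import socket

def send_command(payload, subnet="192.168.1", port=8899):
    """Broadcast a 3-byte Milight command to subnet .255.
    The subnet prefix here is an example; use your own network's."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    try:
        sock.sendto(bytes(payload), (subnet + ".255", port))
    finally:
        sock.close()

# e.g. "group 1 all on", the same bytes as in the Processing snippet:
# send_command([0x45, 0x00, 0x55])
```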

In Processing I wrote the interaction, making sure that no buttons could be pressed while the program was driving the bulbs. There should be at least 100 ms between the different commands you send to the bulbs, which made the program a bit sluggish. But if the commands are sent too quickly they don’t reach the bulbs and the colour doesn’t change. I had to fiddle around with it to get it stable. The settings that worked in my home weren’t optimal for the library, and alas there was not enough time to experiment there. So it wasn’t perfect, but people got the idea.
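One way to keep the 100 ms spacing without making the interface sluggish is a small non-blocking command queue, polled every frame. This is a sketch of that idea in Python, not the code used in the installation:

```python
# Sketch: queue outgoing bulb commands and release at most one per
# 100 ms, so the UI never blocks while commands drain in order.
import time
from collections import deque

class CommandQueue:
    def __init__(self, min_interval=0.1):
        self.min_interval = min_interval   # 100 ms between sends
        self.queue = deque()
        self.last_sent = 0.0

    def push(self, command):
        """Enqueue a command; it will be sent when its turn comes."""
        self.queue.append(command)

    def poll(self, now=None):
        """Call every frame; returns the command to send now, or None
        if the queue is empty or the interval hasn't elapsed yet."""
        now = time.monotonic() if now is None else now
        if self.queue and now - self.last_sent >= self.min_interval:
            self.last_sent = now
            return self.queue.popleft()
        return None
```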

This is a snippet of the program in Processing:

// import UDP library
import hypermedia.net.*;

UDP udp;  // the UDP object
int port = 8899; // port of the Milight wifi box
String ip = "xx.xx.xx.255"; // local broadcast address

int[] colourArray = {110, -43, -95, 250, 145}; // hue bytes for the atmospheres
int currentAtmosphere = -1;
boolean startState = false;

void setup() {
  udp = new UDP(this, port);
  startState = true;
}

void mouseClicked() {
  currentAtmosphere = 1;
  RGBWSetColor(byte(colourArray[currentAtmosphere]), false);
}

void RGBWGroup1AllOn() {
  udp.send(new byte[] {0x45, 0x0, 0x55}, ip, port); // group 1 on
}

void RGBWSetColorToWhiteGroup1() {
  RGBWGroup1AllOn(); // group on
  udp.send(new byte[] {byte(197), 0, 85}, ip, port); // make white
  udp.send(new byte[] {78, 100, 85}, ip, port); // dim light
}

void RGBWSetColor(byte hue, boolean tryEnd) {
  udp.send(new byte[] {0x40, hue, 0x55}, ip, port); // send hue
  // branch reconstructed from the original snippet's indentation:
  if (tryEnd) {
    udp.send(new byte[] {78, 100, 85}, ip, port); // dim light
  } else {
    udp.send(new byte[] {78, 59, 85}, ip, port); // full brightness
  }
}
Another thing that’s puzzling is the hue value that has to be sent. As all the codes sent to the bulbs must fit in a byte, the hue must be a value between 0 and 255, while the hue scale of course runs from 0 to 360 degrees. I figured out how they are mapped, but only by trying all the values from 0 to 255.
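For illustration, here are two candidate mappings in Python: a naive linear scaling, and the offset, reversed mapping used by several community Milight libraries. Both are assumptions here; neither is confirmed to be the mapping found by trial and error above:

```python
# Sketch: two candidate mappings from a 0-360 degree hue to the single
# 0-255 byte the Milight protocol expects.

def hue_linear(degrees):
    """Naive linear scaling of 0-360 degrees onto 0-255."""
    return int(degrees / 360.0 * 255) & 0xFF

def hue_milight(degrees):
    """Offset, reversed mapping used by several community Milight
    libraries (an assumption, not confirmed against the bulbs here)."""
    return (256 + 176 - int(degrees / 360.0 * 256.0)) % 256
```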

I’m happy to say that the installation was a success. People thought it was fun to work with, and I gained some nice insights into people’s ideas for the library of the future. The final presentation could have been more subtle, but that’s something for next time.


Karuna clouds

Maha Karuna Ch’an, the Zen group led by Ton Lathouwers, celebrates its 25th anniversary this year. The initiative came from within the group to do something with the teishos (Zen talks) that Ton has given over all these years. In the 15 years that I have been coming to Maha Karuna I have heard many talks. Sometimes they are very moving, almost always inspiring. I was asked whether, as an artist, I had a different, more visual approach to doing something with the teishos. Because new media is my medium, it seemed interesting to work with the word clouds you can generate on the internet. The idea is that the more often a word occurs in a text, the larger it is displayed. It is a new way of visualising content that is widely used in blogs. I was curious how the teishos have developed over the years: is there a common thread to discover?

Step one was collecting as many texts from as many years as possible. This turned out not to be easy. The material was fragmentary; from the early years hardly anything had been digitised. I decided to work only with years for which there was enough text material: 2001, 2002, 2007, 2008, 2009 and 2011.

My first idea was to work with Wordle, a famous online tool that produces beautiful clouds. This tool turned out to offer little meaningful control over the appearance, and moreover it couldn’t handle such long texts. I supplemented this more visual tool with the functionality of Tagcrowd, which lets you exclude words and display word counts that you can select as text. I realised I would have to do some programming myself to get an optimal result.

I received text files from various people. I had to organise them, clean them up and work out what was really usable. Then I merged the texts of each year into one big text and uploaded it to Tagcrowd. Along the way I extended the list of excluded words and sent the collected texts to Tagcrowd several times.
Such an analysis quickly returns more than 500 different words. I chose to use the 150 most frequent words per year. For each word I then also knew immediately how often it had been mentioned in that year. I copied the results to a text file, which looked something like this: aarde (39) abe (117) allemaal (150) anderen (41), etc.
It turned out that Wordle had an advanced setting that let you enter a list of weighted words in the format word:count directly, instead of the long text. I wrote a program in Processing to convert the Tagcrowd results to the Wordle format:
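The original converter was written in Processing; a minimal Python sketch of the same idea, turning Tagcrowd’s “word (count)” output into Wordle’s word:count lines, might look like this (`tagcrowd_to_wordle` is an illustrative name, not the original program):

```python
import re

def tagcrowd_to_wordle(text):
    """Convert Tagcrowd output like 'aarde (39) abe (117)' into
    Wordle's weighted-list format: one 'word:count' entry per line."""
    pairs = re.findall(r"(\S+)\s*\((\d+)\)", text)
    return "\n".join(f"{word}:{count}" for word, count in pairs)

print(tagcrowd_to_wordle("aarde (39) abe (117) allemaal (150) anderen (41)"))
# aarde:39
# abe:117
# allemaal:150
# anderen:41
```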
I imported these weighted word lists into Excel and sorted them alphabetically or by frequency. The different presentations of the words make you look at the text in a different way: you see other connections, other words stand out.
Playing with the word lists and their counts like this gave rise to the ideas for various visualisations. I found the weighted words in a cloud with pretty colours and fonts not very meaningful. Fortunately, Wordle also offered the possibility of specifying colours yourself with a (hexadecimal) code; in the end I added these to each word by hand.

For all the visualisations, the lists of weighted words were the basis. I distinguished several main themes in them: together, mysticism, sutra, time, zen, language, and others. Each theme got its own colour.

Page 1: for each chosen year the 100 most frequent words are shown, sorted from most to least frequent. Layout and colour assignment were done by hand in Illustrator.

Page 2: each year has its own word cloud; the central cloud contains words that occur in all years. I wrote a program that goes through the six lists of the 150 most frequent words. Words that occur in all lists were grouped together, and the number of mentions of each of those words was added up across the years; these totals determine the size of the words. There were 66 common words in total, ranging from 903 down to 252 mentions. I added the colour codes to the list by hand:
With this I generated a word cloud in Wordle. I also wrote a program to filter out the remaining words per year and add up their counts. Here too, colour codes were added by hand and a separate cloud was generated for each year. (As you can see, Dostoevsky is mentioned far less often than Ton thinks.)
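The intersection-and-sum step described above can be sketched as follows. This is a Python illustration of what the original Processing program does, and the example word counts are made up:

```python
def common_word_totals(yearly_counts):
    """From a list of per-year {word: count} dicts, keep only words
    that appear in every year and sum their counts across years."""
    common = set.intersection(*(set(d) for d in yearly_counts))
    return {w: sum(d[w] for d in yearly_counts) for w in sorted(common)}

# made-up example: three years, each a {word: count} dict
years = [
    {"leegte": 40, "hart": 25, "sutra": 10},
    {"leegte": 30, "hart": 20, "taal": 5},
    {"leegte": 35, "hart": 15, "tijd": 8},
]
print(common_word_totals(years))  # {'hart': 60, 'leegte': 105}
```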

Page 3: I found it moving that certain words had been spoken so often. I wanted to do justice to those words and express what those quantities actually mean. I wrote a program that prints the most frequent common words as many times as they were spoken. The colours were applied by hand in Illustrator.

Page 4: this cloud shows the combined 150 most frequent words of all years. The program I wrote checks, for each word, how often it occurs per year. In total there turned out to be 300 different words. The colour codes were copied from the word lists with colour codes that had been completed by hand earlier. The result was a gigantic word cloud that I spread across my two screens in order to capture the smaller words as well. The smallest word, jan, occurred 15 times.
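The overall tally for this page, every word that appears in any year’s top-150 list with its counts summed, can be sketched like this (again a Python illustration with made-up numbers, not the original Processing code):

```python
from collections import Counter

def combined_totals(yearly_counts):
    """Merge per-year {word: count} dicts into one overall tally:
    every word that appears in any year, with counts summed."""
    total = Counter()
    for year in yearly_counts:
        total.update(year)  # Counter.update adds counts, not replaces
    return total

# made-up example: two years of word counts
years = [
    {"leegte": 40, "jan": 7},
    {"leegte": 30, "jan": 8, "taal": 5},
]
print(combined_totals(years).most_common())
# [('leegte', 70), ('jan', 15), ('taal', 5)]
```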

For me it was very interesting to discover that the words I know so well from Ton’s talks seemed more mysterious when seen together in these different ways. I have given them back to themselves.

I would like to warmly thank Maha Karuna Ch’an for the use of the texts. I also thank Karin van der Molen, Mieke Coenen, Maria Werkhoven and Jo Ampe for supplying the texts.