library atmosphere

The past couple of weeks I’ve been working on an assignment for the municipal library. The task was to let people present their views on the library of the future. To that end we created an area with seats, a bar, a touch table and lights. By touching a picture on the screen, visitors could select a different atmosphere, which simultaneously changed the colour of the lights. The choices of the visitors were logged to a file. This installation was presented during the Cultuurnacht (culture night) in the city of Breda, the Netherlands.

My task was to make the interactive application and drive the lights. I’d been wanting to experiment with interactive lighting so I can apply it in my Hermitage 3.0 project, so for me this was a great opportunity to learn about it. And learn I did.

My idea was to work with the Philips Hue. It has a great API and an active community. But due to budgetary restrictions I had to work with an alternative: Applamp, also known as Milight. The concept is the same: a wifi-connected bulb changes colour and brightness in response to commands sent to a local port opened by a small wifi box. Applamp also has a phone app to work with the lights and a very basic API.

I had wanted to start working on the application before Christmas, but this ideal scenario didn’t work out. The bulbs arrived mid January… The first task was to connect to the lights using the app. It appeared that my Android phone was too old for the app to work, so I had to borrow my neighbours’ iPad. The bulbs can be programmed into groups, but you have to follow the pairing steps exactly, otherwise it won’t work.

Applamp with iPad app

Once the bulbs were programmed I thought it would be easy to set up a simple program switching a bulb on and off. I’d found a nice Python API and some other examples in different languages, but none in Java or Processing. I used Processing because I wanted a nice interface with pictures, a full-screen presentation, and a way to log the actions to a file.

I tried and tried but the UDP socket connection wasn’t working. So the biggest thing I learned had to do with networking. I received a lot of help from Ludwik Trammer (Python API) and Stephan from the Processing forum. The latter finally managed to retrieve my local IP address and the port for the Milight wifi box, which was all I needed. (You actually don’t need the precise address: sending to .255, the broadcast address, is good enough.) The light technician Jan showed me a little app called Fing that makes it super easy to see everything connected to your local network.
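For anyone stuck on the same networking problem: the core of what finally worked can be sketched in plain Java (the address is a placeholder for your own subnet’s broadcast address; 8899 is the port my wifi box listened on, and 0x45 0x00 0x55 is the group 1 “on” command used later in this post):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class MilightSend {
    // Send one raw 3-byte command to the Milight wifi box over UDP.
    static void sendCommand(byte[] command, String ip, int port) throws Exception {
        try (DatagramSocket socket = new DatagramSocket()) {
            DatagramPacket packet = new DatagramPacket(
                command, command.length, InetAddress.getByName(ip), port);
            socket.send(packet);
        }
    }

    public static void main(String[] args) throws Exception {
        // Replace 127.0.0.1 with the .255 broadcast address of your own
        // network, e.g. "192.168.1.255".
        sendCommand(new byte[] {0x45, 0x00, 0x55}, "127.0.0.1", 8899);
    }
}
```

UDP is fire-and-forget: the send succeeds even if nothing is listening, which is exactly why a wrong IP or port fails silently.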

In Processing I wrote the interaction, making sure that no buttons could be pressed while the program was driving the bulbs. There should be at least 100 ms between the different commands you send to the bulbs, which made the program a bit sluggish. But if the commands are sent too quickly they don’t reach the bulbs and the colour doesn’t change. I had to fiddle around with it to get it stable. The settings that worked in my home weren’t optimal for the library, and alas there was not enough time to experiment with them there. So it wasn’t perfect, but people got the idea.

This is a snippet of the program in Processing:

// import UDP library
import hypermedia.net.*;

UDP udp;  // the UDP object
int port = 8899; // port of the Milight wifi box
String ip = "xx.xx.xx.255"; // local broadcast address

int[] colourArray = {110, -43, -95, 250, 145}; // hue bytes for the atmospheres
int currentAtmosphere = -1;
boolean startState;

void setup(){
  udp = new UDP(this, port);
  startState = true;
  RGBWSetColorToWhiteGroup1();
}

void mouseClicked(){
  currentAtmosphere = 1; // snippet only; the full program follows the picture clicked
  RGBWSetColor(byte(colourArray[currentAtmosphere]), false);
}

void RGBWGroup1AllOn(){
  udp.send(new byte[] {0x45, 0x0, 0x55}, ip, port);
}

void RGBWSetColorToWhiteGroup1(){
  RGBWGroup1AllOn(); // group on
  myDelay(100);
  udp.send(new byte[] {byte(197), 0, 85}, ip, port); // make white
  myDelay(100);
  udp.send(new byte[] {78, 100, 85}, ip, port); // dim light
}

void RGBWSetColor(byte hue, boolean tryEnd){
  RGBWGroup1AllOn();
  myDelay(100);
  udp.send(new byte[] {0x40, hue, 0x55}, ip, port); // send hue
  myDelay(100);
  if(tryEnd){
    udp.send(new byte[] {78, 100, 85}, ip, port); // dim light
  }
  else{
    udp.send(new byte[] {78, 59, 85}, ip, port); // full brightness
  }
}

// wait between commands; the bulbs miss commands sent
// less than about 100 ms apart
void myDelay(int ms){
  int start = millis();
  while(millis() - start < ms){ }
}

Another thing that’s puzzling is the hue value that has to be sent. As all the codes sent to the bulbs must fit in a single byte, the hue must be a value between 0 and 255, while the hue scale of course runs from 0 to 360 degrees. I figured out how they are mapped, but only by trying all the values from 0 to 255.
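As an illustration only: a simple linear rescale with byte wraparound looks like the sketch below. The offset is a made-up placeholder; the real bulbs need a device-specific shift (and possibly a reversed direction) that you find by trial and error, just as I did.

```java
public class HueMap {
    // Placeholder shift; the actual value depends on the bulbs and must
    // be found experimentally.
    static final int OFFSET = 0;

    // Rescale a 0-360 degree hue to the bulb's single command byte (0-255).
    static byte hueToByte(float degrees) {
        int scaled = Math.round(degrees / 360f * 255f);
        return (byte) ((scaled + OFFSET) & 0xFF); // wrap around past 255
    }

    public static void main(String[] args) {
        System.out.println(hueToByte(0));   // prints 0
        System.out.println(hueToByte(180)); // prints -128 (bytes are signed in Java)
    }
}
```

The signed-byte printout is also why the colourArray in the snippet above contains negative numbers: they are the same byte values, just shown as signed integers.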

I’m happy to say that the installation was a success. People thought it was fun to work with, and I got some nice insights into people’s ideas for the library of the future. The final presentation could have been more subtle, but that’s something for next time.


test session

I’ve been working like mad for the last couple of weeks to get the ‘drawingBreath’ software going. Main issues:

  • working with the sense-os API, more specifically formatting the strings to be sent to and retrieved from the server
  • getting the custom software to work on the various PCs
  • making the software work for five sensors instead of one

From the above you can tell that I’m just an artist struggling to program without proper education. But I have learned a lot again, especially about JSON in Java and about iteration. And I was happy I’d finished my two Java courses; at least now I had a good idea of what I was doing. The software can now do the following:

  • Login to the sense-os platform and get a session id
  • List all the ids of the 5 sensors
  • Read data from the serial port
  • Format (JSON) and send that data with a time stamp
  • Retrieve the data from all 5 sensors
  • Calibrate all 5 sensors
  • Make a drawing for every sensor
  • Make sounds for every sensor
  • Run the different tasks on separate timers

All that’s left is to fine-tune the drawing and the speed of the drawing, but for the most part it’s finished(!).
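The JSON formatting step can be sketched like this in plain Java (the field names and payload structure here are my own illustration, not the actual CommonSense format — check their API documentation for the exact shape):

```java
public class SensorJson {
    // Build a JSON payload of {date, value} points for one sensor.
    // "data", "date" and "value" are illustrative field names.
    static String formatReadings(long[] timestamps, float[] values) {
        StringBuilder sb = new StringBuilder("{\"data\":[");
        for (int i = 0; i < values.length; i++) {
            if (i > 0) sb.append(",");
            sb.append(String.format(java.util.Locale.US,
                "{\"date\":%d,\"value\":%.2f}", timestamps[i], values[i]));
        }
        return sb.append("]}").toString();
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis() / 1000; // unix time stamp
        System.out.println(formatReadings(
            new long[] {now, now + 1}, new float[] {0.42f, 0.57f}));
    }
}
```

Building the string by hand like this keeps the sketch dependency-free; in the real software a JSON library does the escaping for you.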

I’ve conducted some test sessions with a smaller group, but yesterday evening was the first time there were four of us. It went surprisingly well. No problems with the server, which had been a bit unreliable lately. And the visual and audio results were promising:

server side

Last week I visited the nice people of Sense OS. They offer a very powerful platform for working with sensor data: CommonSense. I talked to them about hosting my breathing_time performance on their system. They are enthusiastic about the project and ready to take on the challenges involved.

Sense OS logo

The idea is that I make an account especially for breathing_time that will include the 5 wind sensors. The main challenges are the synchronisation of the different data streams and the real-time aspect. As for synchronisation, all the participants will have to install Network Time Protocol (NTP) software, which ensures that all the computers use exactly the same time.

The data will be near real time. Every device will send an array with 5 data points every 500 milliseconds. And every second it’ll retrieve the 10 most recent data points for visualisation. We will use threads to make sure that hiccups in the connection or the network won’t disturb the flow of data. A local time stamp will be used to ensure every device always has the most recent data.
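The send/retrieve rhythm described above can be sketched with scheduled threads. In this sketch a thread-safe list stands in for the CommonSense server, and the data points are dummy values; the real software talks to the server over its API instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class DataFlow {
    // Stand-in for the server: a shared, thread-safe store of data arrays.
    static final List<float[]> store = new CopyOnWriteArrayList<>();

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService timers = Executors.newScheduledThreadPool(2);

        // Every 500 ms: send an array with 5 data points (dummy values here).
        timers.scheduleAtFixedRate(() -> {
            float[] points = new float[5];
            for (int i = 0; i < points.length; i++) points[i] = (float) Math.random();
            store.add(points);
        }, 0, 500, TimeUnit.MILLISECONDS);

        // Every second: retrieve the 10 most recent arrays for visualisation.
        timers.scheduleAtFixedRate(() -> {
            List<float[]> snapshot = new ArrayList<>(store); // copy, so the
            // sender thread can keep adding while we read
            List<float[]> recent = snapshot.subList(
                Math.max(0, snapshot.size() - 10), snapshot.size());
            System.out.println("retrieved " + recent.size() + " arrays");
        }, 1, 1, TimeUnit.SECONDS);

        Thread.sleep(2500); // let it run a few cycles, then stop
        timers.shutdownNow();
    }
}
```

Because sending and retrieving run on their own timers, a slow retrieve (a network hiccup) never delays the next send, which is the point of the threaded design.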

This sounds like a sound strategy to ensure throughput and speed. We’ll have to test, of course, to see how quickly the data is sent.

It’s been very nice talking to experts in this field. I feel the server side aspect of the performance is in safe hands with them.

Live

Last week a major hurdle was cleared for this project.
1) I held my presentation for a very interested audience of around 200 students from the Design Academy in Eindhoven, the Netherlands. I was very nervous, but once I got started things went OK. I only had 10 minutes so I focussed on the design process. The audience was design-orientated too, so I didn’t want to bother them with too many technical details.

Presentation at Design Academy

Opacity and outline change depending on data

2) I finished the website www.aqab.nl. I’m still amazed by the swiftness with which I realised this project. I’m very happy with the Flash app. Even though the data is fake (for now), it gives a good insight into the power of the app. I think I’m most proud of the icon and the way the outline colour changes to indicate good, bad or neutral smell. It’s just so easy to work with the Google Maps for Flash API. Great job, good documentation too! You can use all the built-in functionality in a very simple way through the classes. Also check out the check boxes with which you can select all the days of a month, not as simple as it looks to program…
3) I’ve edited the video and put it online as a Flash flv file; you can play it from the homepage using the simple player. There wasn’t very much useful material for the motorway scene, so I had to improvise a little. I think the rest of the video runs smoothly enough. It gives a good impression of how to use the wearable.

There are still some things to be done: the displays don’t light up more than four rings per gas at the same time, so I’ll have to change that for higher pollution levels. Maybe just light up the smallest and the largest. The nickel-cadmium batteries don’t supply enough power either, and I don’t want to use more batteries. I’ll have to work on the Bluetooth connection. And I’ll have to interpret the Arduino values and recalculate them to ppm for every gas.
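That last step — turning raw Arduino readings into ppm — will come down to a calibration curve per gas. A minimal sketch, with completely made-up calibration constants:

```java
public class GasCalibration {
    // Convert a raw 10-bit Arduino ADC reading (0-1023) to ppm using a
    // linear calibration per gas. The slope/offset values are placeholders;
    // real ones come from each sensor's datasheet or from calibrating
    // against known concentrations.
    static float toPpm(int adcReading, float slopePpmPerVolt, float offsetPpm) {
        float volts = adcReading * 5.0f / 1023.0f; // 5 V analog reference
        return Math.max(0, slopePpmPerVolt * volts + offsetPpm); // clamp at 0
    }

    public static void main(String[] args) {
        // Hypothetical CO sensor: 12.5 ppm per volt, -2.0 ppm offset.
        System.out.println(toPpm(512, 12.5f, -2.0f));
    }
}
```

Many gas sensors are not actually linear, so the real conversion may need a curve fit instead of a straight line; the structure stays the same, one calibration per gas.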

So keep an eye on this blog. All announcements will go through the website, where the five most recent entries of this blog are listed as well.