Healthcare – acrylic on screen

I next thought I’d like to see what it looked like if I put my acrylic etchings on top of a computer monitor and ‘map’ some visualizations underneath them. I decided to apply the same treatment I’d given the medical symbol with the Raspberry Pi at Starbucks to these images, and then put that underneath the acrylic:

I also tried a more abstract visualization:

Life Patterns

Curious about my daily patterns, I decided to explore the location data from my cell phone recorded over the course of a year. I set out to make a year-long installation where people could see where I was in real time, as I moved about my life.

I started out by creating a map with TileMill and then laser-etching it onto wood, producing a 16-square-foot map of NYC:

I used OpenPaths on my phone to record the location data. To access that data I wrote a Ruby script that goes online, fetches any new locations I’ve been to, and updates a CSV file containing all my locations:
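As a rough sketch of what an updater like that looks like (the endpoint URL, JSON shape, and field names below are my assumptions, not the real OpenPaths API, which used OAuth), the interesting part is only appending points newer than what the CSV already holds:

```ruby
require 'csv'
require 'json'
require 'net/http'

# Fetch recent points. The URL is a placeholder; the real OpenPaths
# API required OAuth-signed requests.
def fetch_new_points(url)
  JSON.parse(Net::HTTP.get(URI(url)))
end

# Append only points newer than anything already recorded, keyed on the
# Unix timestamp ('t') each point carries. Rows are hashes like
# { 'lat' => ..., 'lon' => ..., 't' => ... }.
def merge_points(existing, fetched)
  last_seen = existing.map { |row| row['t'].to_i }.max || 0
  fresh = fetched.select { |p| p['t'].to_i > last_seen }
  existing + fresh.sort_by { |p| p['t'].to_i }
end
```

The timestamp check makes the script safe to run repeatedly: re-fetching an overlapping window never duplicates rows in the CSV.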

This script is run automatically from within openFrameworks, which I used for the visuals. I wanted to create something similar to the flight-route maps that airlines use. To do this I decided that if the next point was above the preceding point, the path would curve upwards, and if it was below, downwards. I also thought the radius of the arc should be based on the distance between the current point and the destination point, so some fun trigonometry ensued:
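A simplified sketch of that geometry (in Ruby rather than the openFrameworks C++, with the arc fixed as a semicircle so the radius is just half the distance between the points; the names are mine):

```ruby
Point = Struct.new(:x, :y)

# Arc between two points, bulging up or down depending on which point is
# higher. Pinning the circle's centre to the midpoint makes the arc a
# semicircle, so the radius is half the straight-line distance.
def arc_between(a, b)
  dist   = Math.hypot(b.x - a.x, b.y - a.y)
  center = Point.new((a.x + b.x) / 2.0, (a.y + b.y) / 2.0)
  # Screen coordinates: y grows downward, so "above" means a smaller y.
  { center: center, radius: dist / 2.0, upward: b.y < a.y }
end
```

Letting the radius grow with the distance is what gives long hops the big lazy arcs and short hops the tight ones, like a flight map.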

I then used MadMapper to projection map on the wood:

Of course, I couldn’t have the grey map image projected as well, so the background is just black:

What was really interesting to me was how beautiful the predictability of my life ended up being, drawn out like this. But perhaps I should explore all those blank areas on the map more? Let me know where you think I should go…

Here’s some video:

openFrameworks over phone

I’m dialing into openFrameworks from a landline. My button presses on the phone are displayed on screen as confirmation.

When I call in, I’m routed to an Asterisk dial plan that waits for my button presses. When a button is pressed, the dial plan calls a short Ruby script that makes a POST request to a URL, sending the number of the button. A small Sinatra app receives this number and places it into a YAML file. The Sinatra app also receives GET requests from openFrameworks: the GET route returns the last number placed in the YAML file along with its ID. openFrameworks keeps track of the ID; if it differs from the last ID it saw, it knows it is a new number and displays it on screen.
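A minimal sketch of the shared store at the middle of that chain (the file name, YAML keys, and route names are my assumptions, not the actual app):

```ruby
require 'yaml'

# YAML-backed store sitting between the phone side and openFrameworks.
class ButtonStore
  def initialize(path)
    @path = path
    @data = File.exist?(path) ? YAML.load_file(path) : { 'id' => 0, 'number' => nil }
  end

  # Called when a digit arrives: store it under a fresh, incrementing ID.
  def push(number)
    @data['id'] += 1
    @data['number'] = number
    File.write(@path, @data.to_yaml)
    @data['id']
  end

  # Polled by openFrameworks, which compares the ID against the last one
  # it saw to decide whether this is a new press.
  def latest
    { 'id' => @data['id'], 'number' => @data['number'] }
  end
end

# The Sinatra routes would wrap it roughly like:
#   post('/press')  { store.push(params[:number]); 'ok' }
#   get('/latest')  { store.latest.to_yaml }
```

The incrementing ID is what lets openFrameworks poll as often as it likes: repeated GETs for the same press return the same ID, so the number is only drawn once.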


I wanted to create a video that was rather abstract, but still played within the confines of time. I wanted multiple frames of the source video to appear on screen at once, in a way that would make it hard to tell what the original movie was. Using openFrameworks I created two different settings. The first captures the screen, then draws that captured image back to the screen in random places on top of the original image. The second draws the video to an FBO, which again draws a subsection to random places. Since the FBO is not cleared, old frames stay displayed until their alpha values finally get completely overwritten by new frames. The FBO is then copied into an ofTexture and additively blended with a white rectangle. In both settings the mouse controls the alpha value. I improvised between the two settings and with the alpha values. The original video was shot with my iPhone on the Brooklyn Bridge at dusk. I made the music too. Code for the video: