Life Patterns

Curious about my daily patterns, I decided to explore the location data from my cell phone, recorded over the course of a year. I set out to make a year-long installation where people could see where I was in real time as I moved about my life.

I started by creating a map with TileMill and then laser-etching it onto wood, producing a 16-square-foot map of NYC:

I used OpenPaths on my phone to record the location data. To access that data I wrote a Ruby script that goes online, grabs any new locations I’ve been to, and updates a CSV file containing all my locations:
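
On the openFrameworks side, a minimal sketch for refreshing and parsing that CSV could look like this (the script name, the CSV filename, and the lat/lon column order are assumptions):

```cpp
// Kick off the update script, then parse the CSV into points.
// "update_locations.rb" and "locations.csv" are placeholder names.
#include "ofMain.h"

struct LocationPoint {
    float lat;
    float lon;
};

std::vector<LocationPoint> loadLocations(){
    ofSystem("ruby update_locations.rb");             // fetch any new OpenPaths points
    ofBuffer buffer = ofBufferFromFile("locations.csv");

    std::vector<LocationPoint> points;
    for(auto line : buffer.getLines()){
        auto cols = ofSplitString(line, ",");
        if(cols.size() < 2) continue;                 // skip headers / malformed rows
        points.push_back({ ofToFloat(cols[0]), ofToFloat(cols[1]) });
    }
    return points;
}
```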

This script is run automatically from within openFrameworks, which I used for the visuals. I wanted to create something similar to the flight-pattern maps airlines use. To do this I decided that if the next point was above the preceding point, the path would curve upwards, and if it was below, downwards. I also made the radius of each arc depend on the distance between the current point and the destination point, so some fun trigonometry ensued:
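
Here’s a rough sketch of that arc math, assuming screen-space points with y increasing downward; the radius factor and segment count are placeholder values, not the real ones:

```cpp
// Draw a circular arc between consecutive points. The radius grows with the
// distance between them, and the arc bulges up when the destination is above
// the starting point and down when it is below.
#include "ofMain.h"

void drawArc(const glm::vec2& a, const glm::vec2& b){
    float d = glm::distance(a, b);
    if(d < 1.0f) return;

    float r = d * 0.75f;                              // radius scales with distance (must stay >= d/2)

    glm::vec2 mid = (a + b) * 0.5f;
    glm::vec2 u   = (b - a) / d;                      // unit vector along the chord
    glm::vec2 v(-u.y, u.x);                           // perpendicular to the chord
    if((b.y < a.y) != (v.y < 0.0f)) v = -v;           // point v up if the destination is higher, else down

    float h     = sqrtf(r * r - d * d * 0.25f);       // distance from the circle's center to the chord
    float theta = atan2f(d * 0.5f, h);                // half the arc's angular span
    glm::vec2 center = mid - v * h;                   // center sits on the opposite side of the bulge

    ofPolyline arc;
    int segments = 40;
    for(int i = 0; i <= segments; i++){
        float phi = -theta + 2.0f * theta * (i / float(segments));
        glm::vec2 p = center + v * (r * cosf(phi)) + u * (r * sinf(phi));
        arc.addVertex(p.x, p.y);
    }
    arc.draw();
}
```

drawArc() would then be called for each consecutive pair of points parsed from the CSV.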

I then used MadMapper to projection-map the visuals onto the wood:

Of course, I couldn’t have the grey map image projected as well, so the background is just black:

What was really interesting to me was how beautiful the predictability of my life turned out to be when drawn out like this. But perhaps I should explore all those blank areas on the map more? Let me know where you think I should go…

Here’s some video:

Firewall

Firewall is an interactive media installation created with Mike Allison. A stretched sheet of spandex acts as a depth-sensitive membrane interface: people can push into it to create fire-like visuals and expressively play music.

The original concept stems from a performance piece I’m currently developing as Purring Tiger (with Kiori Kawai) titled Mizalu, which will premiere in June 2013. During one scene in the performance, dancers will press into the spandex with the audience on the opposite side. Mizalu is about death and the experience of reality, so the membrane represents a plane that you can experience but never get through. However hard you try to understand what lies between life and death, you can never fully know.

The piece was made using Processing, Max/MSP, Arduino, and a Kinect. The Kinect measures the average depth of the spandex relative to the frame it is mounted on. If the spandex is not being pressed into, nothing happens. When someone presses into it, the visuals react around the point of contact and the music is triggered. An algorithm created with Max speeds the music up and slows it down, and makes it louder and softer, based on the depth. This makes for a very expressive musical playing experience, even for people who have never played music before. A switch built into the frame toggles between two modes; the second is a little more aggressive than the first.
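
The installation itself was built in Processing with the music handled in Max, but the depth side boils down to something like this plain C++ sketch; the thresholds and ranges here are made up, and in the piece the tempo/volume mapping lives in the Max patch:

```cpp
// Average the Kinect depth pixels over the membrane, compare against the
// resting depth, and map how far it is pushed in to tempo and volume values.
#include <algorithm>
#include <cstdint>
#include <vector>

struct Expression {
    bool  active;   // is someone pressing into the spandex?
    float tempo;    // playback speed for the music
    float volume;   // loudness for the music
};

Expression analyzeFrame(const std::vector<uint16_t>& depthMM, float restingMM){
    // Average depth of the membrane in millimeters, ignoring invalid (zero) pixels.
    double sum = 0.0;
    size_t count = 0;
    for(uint16_t d : depthMM){
        if(d > 0){ sum += d; count++; }
    }
    float average = count ? float(sum / count) : restingMM;

    // How far the spandex is pushed in past its resting position
    // (assuming the Kinect looks at the back of the membrane).
    float press = restingMM - average;

    const float threshold = 30.0f;   // millimeters of slack before anything reacts
    const float maxPress  = 300.0f;  // the deepest push we expect

    Expression e;
    e.active = press > threshold;
    float amount = e.active ? std::min((press - threshold) / (maxPress - threshold), 1.0f) : 0.0f;
    e.tempo  = 0.5f + 1.5f * amount; // deeper press -> faster music (done in Max in the installation)
    e.volume = amount;               // deeper press -> louder music
    return e;
}
```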

User Testing:

Glockentar

The Glockentar combines a glockenspiel with a guitar.

Each time a string is plucked, a glockenspiel bell is struck with a solenoid and a beam of light is projected along the length of the string.

The light’s speed follows a logarithmic curve as it travels up and down the string: it starts fast, pauses for a moment at the end, then heads back down the string the same way.
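
A minimal sketch of that easing, returning the light’s position along the string (0 at the bottom, 1 at the top) t seconds after the pluck; the travel time, hold time, and curve steepness here are guesses:

```cpp
#include <cmath>

// Position of the light along the string at time t (seconds since the pluck):
// a fast, logarithmically slowing rise, a short pause at the top, then the
// same curve back down. All constants are assumptions.
float lightPosition(float t){
    const float travel = 0.6f;   // seconds to travel the string in one direction
    const float hold   = 0.15f;  // pause at the top
    const float k      = 9.0f;   // steepness of the logarithmic curve

    auto easeOut = [&](float u){ return std::log(1.0f + k * u) / std::log(1.0f + k); };

    if(t < travel)            return easeOut(t / travel);                           // up: fast, then slowing
    if(t < travel + hold)     return 1.0f;                                          // pause at the top
    if(t < 2 * travel + hold) return 1.0f - easeOut((t - travel - hold) / travel);  // back down
    return 0.0f;                                                                    // finished
}
```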

The lights are essentially rectangles made in openFrameworks, then sent to MadMapper via Syphon.

In MadMapper they are then mapped to the strings.
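
A rough openFrameworks sketch of that setup, using the ofxSyphon addon; the string count, rectangle size, and Syphon server name are assumptions, and lightPosition() is the easing sketched above:

```cpp
#include "ofMain.h"
#include "ofxSyphon.h"

float lightPosition(float t);            // the easing function sketched above

class ofApp : public ofBaseApp {
public:
    ofxSyphonServer syphon;
    std::vector<float> pluckTimes;       // time of the last pluck per string, -1 = idle
    const int numStrings = 4;            // assumed string count

    void setup(){
        syphon.setName("Glockentar Lights");
        pluckTimes.assign(numStrings, -1.0f);
    }

    void pluck(int string){              // called when a pluck is detected
        pluckTimes[string] = ofGetElapsedTimef();
    }

    void draw(){
        ofBackground(0);                 // black background so only the beams project
        float columnWidth = ofGetWidth() / float(numStrings);
        for(int i = 0; i < numStrings; i++){
            if(pluckTimes[i] < 0) continue;
            float t   = ofGetElapsedTimef() - pluckTimes[i];
            float pos = lightPosition(t);                  // 0..1 along the string
            float y   = ofGetHeight() * (1.0f - pos);      // bottom of the screen = bottom of the string
            ofDrawRectangle(i * columnWidth, y, columnWidth * 0.2f, 40.0f);
        }
        syphon.publishScreen();          // MadMapper picks this up and maps it onto the strings
    }
};
```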

An Arduino is used to turn the strings into switches: each string acts as a ground, and electricity is sent to the pick. When a string is plucked with the pick, the circuit closes and that string’s solenoid and projection are triggered.
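
A minimal Arduino sketch of that logic; the number of strings, the pin assignments, the active level, the solenoid pulse length, and the serial message back to the visuals are assumptions rather than the actual wiring:

```cpp
const int NUM_STRINGS = 4;
const int stringPins[NUM_STRINGS]   = {2, 3, 4, 5};    // one digital input per string
const int solenoidPins[NUM_STRINGS] = {8, 9, 10, 11};  // one solenoid driver per bell
const unsigned long PULSE_MS = 30;                     // how long each solenoid strike lasts

bool wasTouched[NUM_STRINGS];

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < NUM_STRINGS; i++) {
    pinMode(stringPins[i], INPUT);       // each string is wired as a switch input (pull-down assumed)
    pinMode(solenoidPins[i], OUTPUT);
    wasTouched[i] = false;
  }
}

void loop() {
  for (int i = 0; i < NUM_STRINGS; i++) {
    bool touched = digitalRead(stringPins[i]) == HIGH;
    if (touched && !wasTouched[i]) {       // rising edge = a new pluck
      digitalWrite(solenoidPins[i], HIGH); // strike the glockenspiel bell
      delay(PULSE_MS);
      digitalWrite(solenoidPins[i], LOW);
      Serial.println(i);                   // tell the visuals which string was plucked
    }
    wasTouched[i] = touched;
  }
}
```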

Here is the Arduino code:

Here is the OF code: