Firewall

Firewall is an interactive media installation created with Mike Allison. A stretched sheet of spandex acts as a depth-sensitive membrane interface: people can push into it to create fire-like visuals and play music expressively.

The original concept stems from a performance piece I’m currently developing as Purring Tiger (with Kiori Kawai) titled Mizalu, which will premiere in June 2013. During one scene in the performance, dancers will press into the spandex with the audience facing the opposite side. Mizalu is about death and the experience of reality, so this membrane represents a plane that you can experience but never get through: as hard as you try to understand what lies between life and death, you can never fully know.

The piece was made using Processing, Max/MSP, Arduino, and a Kinect. The Kinect, mounted on the frame, measures the average depth of the spandex. If no one is pressing into the spandex, nothing happens. When someone presses into it, the visuals react around the point of contact and the music is triggered. An algorithm built in Max speeds the music up or slows it down and makes it louder or softer based on the depth, which makes for a very expressive instrument, even for people who have never played music before. A switch built into the frame toggles between two modes; the second mode is a little more aggressive than the first.
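To give a rough sense of that depth logic, here is a minimal Processing sketch in the spirit of the piece, not the installation’s actual code. It assumes the SimpleOpenNI Kinect library, the resting distance and trigger threshold are placeholder numbers, and the visuals and the Max/MSP side are only hinted at in comments.

```java
// Minimal sketch of the depth-trigger idea (NOT the installation's real code).
// Assumes the SimpleOpenNI library; all constants are illustrative placeholders.
import SimpleOpenNI.*;

SimpleOpenNI kinect;
int restingDepth = 1200;  // rough distance (mm) from the Kinect to the sheet at rest
int triggerDepth = 80;    // how many mm of displacement count as a press

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
}

void draw() {
  kinect.update();
  int[] depth = kinect.depthMap();
  int w = kinect.depthWidth();
  int h = kinect.depthHeight();

  // Track the average depth of the sheet and the deepest press point.
  long sum = 0;
  int count = 0;
  int deepest = 0, pressX = 0, pressY = 0;
  for (int y = 0; y < h; y++) {
    for (int x = 0; x < w; x++) {
      int d = depth[y * w + x];
      if (d <= 0) continue;           // skip pixels with no reading
      sum += d;
      count++;
      int push = restingDepth - d;    // displacement toward the camera;
                                      // flip the sign if the Kinect faces the other side
      if (push > deepest) {
        deepest = push;
        pressX = x;
        pressY = y;
      }
    }
  }
  float avgDepth = count > 0 ? (float) sum / count : restingDepth;
  // In the installation, the depth value goes on to Max/MSP (e.g. over OSC)
  // so the music can speed up / slow down and get louder / softer with it.

  background(0);
  if (deepest > triggerDepth) {
    // Someone is pressing in: react around the press point.
    float intensity = constrain(map(deepest, triggerDepth, 400, 0, 1), 0, 1);
    noStroke();
    fill(255, 120 + 135 * intensity, 0);
    ellipse(pressX, pressY, 50 + 200 * intensity, 50 + 200 * intensity);
  }
  // Otherwise nothing happens, as described above.
}
```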

User Testing:

Physical Computing’s Greatest Hits (and misses) – response

I’ve seen a lot of the projects listed in Physical Computing’s Greatest Hits (and misses). One I’ve had recent experience with is the hand-as-cursor example. For GO Brooklyn Open Studios I created a hand-as-cursor projection in which each hand of each user was followed by 100 particles in its own color, drawing lines as the hand moved (pictures below). The installation was pretty effective: people intuitively use their hands when they see something they want to interact with, and here they could immediately see the effects of their actions, since the particles drew lines wherever they moved their hands. The results were pleasing to look at too, thanks to the physics driving the particles. Watching everyone have so much fun with it was really enjoyable for me.
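A simplified Processing sketch of the particle-following idea looks something like the following. Here the mouse stands in for one tracked hand, and the spring and damping constants are made up; the actual installation tracked real hands and ran one swarm per hand.

```java
// Simplified sketch of the particle-following idea: the mouse stands in for one
// tracked hand, and each particle steers toward it with simple physics,
// leaving a line behind as it moves.
int NUM = 100;
float[] px = new float[NUM], py = new float[NUM];  // particle positions
float[] vx = new float[NUM], vy = new float[NUM];  // particle velocities

void setup() {
  size(800, 600);
  background(0);
  stroke(80, 180, 255, 60);   // one color per hand in the installation
  for (int i = 0; i < NUM; i++) {
    px[i] = random(width);
    py[i] = random(height);
  }
}

void draw() {
  for (int i = 0; i < NUM; i++) {
    // Accelerate toward the "hand" and damp the velocity so the motion stays fluid.
    vx[i] = (vx[i] + (mouseX - px[i]) * 0.002) * 0.95;
    vy[i] = (vy[i] + (mouseY - py[i]) * 0.002) * 0.95;
    float nx = px[i] + vx[i];
    float ny = py[i] + vy[i];
    line(px[i], py[i], nx, ny);  // trails accumulate because the canvas is never cleared
    px[i] = nx;
    py[i] = ny;
  }
}
```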

The installation also had a sound component: based on the distance between a user’s hands, sounds were triggered in certain spaces of the room. This was a little less effective because there was no visual reference for the users. They could tell they were doing something that made the sounds, but couldn’t tell quite what the sounds were linked to.

I’ve also created body-as-cursor projects for live performance, where dancers’ movements are tracked from above. This is a little less obvious to the audience, since they are not directly interacting with it, but it can be effective for letting multiple people’s movements draw things on screen.

GO – Brooklyn Open Studios pictures:

[installation photos]

My Own Little Universe

3D drawing using the Kinect. Using openFrameworks, I tracked the closest point to the Kinect plus two other relatively close points. Each point has its own color and five particles that draw in 3D space with some physics. The code is here. Zooming in, through, and around the result is fun.
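The project itself is written in openFrameworks, but the heart of the tracking is just a closest-point search over the depth map. A rough sketch of that step, written as Processing-style Java for consistency with the sketches above rather than as the actual C++:

```java
// Rough sketch of the closest-point search (illustrative, not the linked code).
// Returns (x, y, depth) of the nearest valid pixel in a Kinect depth map.
PVector closestPoint(int[] depth, int w, int h) {
  int best = Integer.MAX_VALUE;
  int bestX = -1, bestY = -1;
  for (int y = 0; y < h; y++) {
    for (int x = 0; x < w; x++) {
      int d = depth[y * w + x];
      if (d > 0 && d < best) {   // d == 0 means "no reading", so skip it
        best = d;
        bestX = x;
        bestY = y;
      }
    }
  }
  return new PVector(bestX, bestY, best);
}
```

One way to pick the two additional points is to re-run the same search while ignoring pixels within some radius of the points already chosen; each of the three points then feeds its own small particle system, as described above.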

[screenshots of My Own Little Universe]

Somehow I can’t figure out how to upload a good screen-captured video of 3D stuff. The video below is ugly as sin, especially when zooming. If someone knows how to upload a decent screen capture of a 3D environment, please let me in on the trick.