Data Bending

Live Pixel-Sorted Experience

This one isn't design, but I think it's pretty sick, so I'm showing it anyway.

Back in my sophomore year, a professor got me really into glitch art. I'm a decent coder, so I wrote a program in Processing (a Java-based language for visual programming) that creates an aurora-like image from a live video feed. When I originally made it I projected it on a huge wall in a dark room, but unfortunately I was young and dumb and didn't take a video. I recently updated the project to run on my MacBook Pro through its webcam.

Because each pixel is individually displaced, compression gets really rough when the video is rendered, but running live it's super crisp. The framerate also took a hit from screen recording; normally it sits consistently at 29 fps.

Explanation

First I take the pixels from the raw input and sort them by luminosity, putting the brightest pixels in each column at the bottom and the darkest at the top. I then displace the pixels vertically based on a counter running through a sine function. Both the function itself and the speed of the counter depend on the average brightness of each frame: the darker the frame, the faster the wave oscillates (it's afraid of the dark).
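Here's a minimal Processing sketch of that pipeline. It's not the original code; the column sort, the brightness-to-speed mapping, and all the constants are placeholders just to show the idea.

```
import processing.video.*;

Capture cam;
float phase = 0;  // counter that gets fed into the sine function

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  cam.loadPixels();
  loadPixels();

  // average brightness of the frame drives the wave speed
  float avg = 0;
  for (int i = 0; i < cam.pixels.length; i++) avg += brightness(cam.pixels[i]);
  avg /= cam.pixels.length;               // 0..255
  phase += map(avg, 0, 255, 0.25, 0.02);  // darker frame -> faster oscillation

  for (int x = 0; x < width; x++) {
    // pull out one column and sort it by luminosity
    color[] column = new color[height];
    for (int y = 0; y < height; y++) column[y] = cam.pixels[y * width + x];
    sortByBrightness(column);  // darkest ends up at the top, brightest at the bottom

    // vertical displacement from the sine of the running counter
    int offset = int(sin(phase + x * 0.02) * map(avg, 0, 255, 60, 10));
    for (int y = 0; y < height; y++) {
      int srcY = constrain(y + offset, 0, height - 1);
      pixels[y * width + x] = column[srcY];
    }
  }
  updatePixels();
}

// insertion sort on brightness, kept dumb for clarity; the real thing wants something faster
void sortByBrightness(color[] c) {
  for (int i = 1; i < c.length; i++) {
    color key = c[i];
    float kb = brightness(key);
    int j = i - 1;
    while (j >= 0 && brightness(c[j]) > kb) {
      c[j + 1] = c[j];
      j--;
    }
    c[j + 1] = key;
  }
}
```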

In the original version I used a Microsoft Kinect to get depth data and changed the frequency based on the viewer's proximity, but unfortunately my MacBook can't do that.
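For reference, the proximity hook looked roughly like this. This is only a sketch assuming Daniel Shiffman's Open Kinect for Processing library, not the original code, and the direction of the mapping is just a placeholder.

```
import org.openkinect.processing.*;

Kinect kinect;
float phase = 0;

void setup() {
  size(640, 480);
  kinect = new Kinect(this);
  kinect.initDepth();
}

void draw() {
  // average the raw depth values (roughly 0-2047 on a Kinect v1)
  int[] depth = kinect.getRawDepth();
  float avg = 0;
  for (int i = 0; i < depth.length; i++) avg += depth[i];
  avg /= depth.length;

  // closer viewer (smaller average depth) -> faster oscillation
  phase += map(avg, 0, 2047, 0.3, 0.02);

  // ...same per-column sort + sine displacement as the webcam version...
}
```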

If you want to see the code, lmk! I'm not stingy about this kind of thing.