This is a script written for pyprocessing that I used to remix The Naked and Famous's music video "Young Blood". It's a simple algorithm with lovely output both as video and as stills.
The project brief was to create an original and compelling generative image or animation illustrating information overload - in other words, to produce a data visualisation.
My main inspiration came from Jim Campbell's work with low resolution. I saw his Exploded Views installation at the San Francisco Museum of Modern Art. What really stood out to me was that his installation was still interesting to look at from different angles, even if you didn't know what was actually happening. So my concept became viewing a movie from another angle - literally another way of looking at a movie.
I experimented with different movie clips but in the end music videos provided the most variety in colours, shots and pacing and thus were the most successful.
The algorithm I ended up with was relatively simple, less than 50 lines of code - showing that more complexity is not always better. A sphere is drawn for each pixel in the original frame: its x and y coordinates match the pixel's position, and only its z coordinate changes, based on the brightness of the original pixel's colour. If you're really interested in seeing the code and how it works, I have it on GitHub.
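The core mapping can be sketched in plain Python. This is not the actual pyprocessing sketch - the brightness weights, the `z_scale` parameter, and the function names here are illustrative assumptions - but it shows the idea: each pixel keeps its (x, y) position and gets a z displacement derived from its brightness.

```python
def brightness(r, g, b):
    # Perceptual luma approximation (Rec. 601 weights);
    # Processing's own brightness() may differ slightly.
    return 0.299 * r + 0.587 * g + 0.114 * b

def frame_to_spheres(frame, z_scale=100.0):
    """Map a frame (a 2-D grid of (r, g, b) tuples, values 0-255)
    to sphere centres: x and y stay on the pixel grid, and z is
    the pixel's brightness scaled into [0, z_scale]."""
    spheres = []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            z = brightness(r, g, b) / 255.0 * z_scale
            spheres.append((x, y, z))
    return spheres

# A tiny 1x2 "frame": one black pixel, one white pixel.
frame = [[(0, 0, 0), (255, 255, 255)]]
print(frame_to_spheres(frame))
# black pixel stays at z = 0, white pixel is pushed out to z = z_scale
```

In the real sketch, each tuple would become a `sphere()` call at that translated position, redrawn every frame as the video plays.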
Here is the comparison video I made, with the original playing next to the remixed version:
And here is a selection of stills taken from the video, without the nasty YouTube compression artifacts: