
Vevo Presents: The Maccabees (in the dark)

This live session with The Maccabees is presented by Vevo for the Magners ‘Made in the Dark’ campaign. The brainchild of directors Jamie Roberts and Will Hanke, the performance combines live action footage (shot with an Alexa on a technocrane) with an animated sequence by Jamie Child and James Ballard. The scene was also captured in 3D, which is where I came in. To achieve this we built a rig containing 10 Kinect cameras, each attached to a MacBook Pro: seven facing outward to the crowd and three facing inward to the band.

Three applications were built to achieve this, all using openFrameworks. The client application used ofxKinect to record the point clouds: the millimetre depth value for each pixel of the depth map was transcoded into 320×240 TIFF images and written to the hard drive at roughly 32 fps. A server application monitored and controlled the 10 clients using OSC. Among other tasks, it starts/stops the recording, synchronises the timecode, and displays each client's status, fps and a live preview of its depth map.
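In the project ofxOsc would have handled the wire format, but as a minimal sketch of what the server actually sends, here is a hand-rolled OSC message builder. The byte layout (null-terminated strings padded to 4 bytes, big-endian 32-bit arguments) follows the OSC 1.0 spec; the address `/record/start` and the single timecode argument are my assumptions, not the project's actual protocol.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Append an OSC string: the characters, a null terminator, then zero
// padding so the next field starts on a 4-byte boundary.
static void appendPadded(std::vector<uint8_t>& out, const std::string& s) {
    for (char c : s) out.push_back(static_cast<uint8_t>(c));
    out.push_back(0);
    while (out.size() % 4 != 0) out.push_back(0);
}

// OSC integer arguments are big-endian 32-bit.
static void appendInt32(std::vector<uint8_t>& out, int32_t v) {
    uint32_t u = static_cast<uint32_t>(v);
    out.push_back(static_cast<uint8_t>((u >> 24) & 0xFF));
    out.push_back(static_cast<uint8_t>((u >> 16) & 0xFF));
    out.push_back(static_cast<uint8_t>((u >> 8) & 0xFF));
    out.push_back(static_cast<uint8_t>(u & 0xFF));
}

// Build a hypothetical "/record/start" message carrying the shared
// timecode frame number the clients should stamp their TIFFs with.
std::vector<uint8_t> makeStartMessage(int32_t timecodeFrame) {
    std::vector<uint8_t> msg;
    appendPadded(msg, "/record/start");
    appendPadded(msg, ",i");   // type tag string: one int32 argument
    appendInt32(msg, timecodeFrame);
    return msg;
}
```

Sending one such packet over UDP to all ten clients is what keeps the recordings in step without any per-frame synchronisation traffic.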

Once the recording had taken place, a separate ‘mesh builder’ app created 3D files from this data. It imports the TIFFs and transforms them back into their original point cloud structure. A variety of calibration methods are used to rotate, position and warp the point clouds, rebuilding the scene as two meshes: one for the band and another for the crowd. A large sequence of 3D files (.obj) was exported and given to the post-production guys to create the animated sequence in Maya and After Effects. The app also re-formats the recorded TIFF and .obj files so that there are only 25 per second and they sit in an easily manageable directory structure.
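The step that turns a depth TIFF back into a point cloud is a standard pinhole back-projection. A sketch of it, using rough Kinect v1 intrinsics scaled to the 320×240 frames (the real mesh builder would use per-camera calibration, and these focal-length/principal-point values are assumptions):

```cpp
#include <cstdint>

struct Point3 { float x, y, z; };

// Back-project one pixel (u, v) of a 320x240 depth frame into camera
// space. fx/fy ~ 285 px and a centred principal point are approximate
// Kinect v1 intrinsics halved for the 320x240 resolution.
Point3 depthToPoint(int u, int v, uint16_t depthMm,
                    float fx = 285.0f, float fy = 285.0f,
                    float cx = 159.5f, float cy = 119.5f) {
    float z = depthMm / 1000.0f;      // millimetres -> metres
    return { (u - cx) * z / fx,       // pinhole model: x = (u - cx) * z / fx
             (v - cy) * z / fy,
             z };
}
```

Running this over every valid pixel of a frame yields the per-camera cloud that the calibration stage then rotates, positions and warps into the shared scene.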

The whole project was one long learning experience, full of hurdles. Networking, controlling and monitoring so many client machines was a challenge, as was handling and formatting such a huge number of files.

One of the greatest challenges was calibrating and combining the band point clouds. I thought it would be possible to implement a single collection of algorithms to format each one, then simply rotate and position them. This wasn’t the case: each camera recording was in fact slightly different, so I had to implement a sort of cube warping system to alter each point cloud individually until they all fit together.
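One plausible reading of that cube warping system is a trilinear lattice deformation: each camera's cloud gets a bounding cube whose eight corners can be nudged by hand, and every point inside is displaced by the trilinear blend of the corner offsets. This is my own sketch of the idea, not the project's actual code, and it assumes points normalised into the unit cube:

```cpp
#include <array>

struct Vec3 { float x, y, z; };

// Warp a point p in the unit cube [0,1]^3 by trilinearly blending eight
// per-corner offset vectors. Corner b sits at (b&1, (b>>1)&1, (b>>2)&1),
// so cornerOffset[0] drags the (0,0,0) corner, cornerOffset[7] drags
// (1,1,1), and so on.
Vec3 cubeWarp(const Vec3& p, const std::array<Vec3, 8>& cornerOffset) {
    Vec3 out = p;
    for (int b = 0; b < 8; ++b) {
        float wx = (b & 1) ? p.x : 1.0f - p.x;  // trilinear weight in x
        float wy = (b & 2) ? p.y : 1.0f - p.y;  // ... in y
        float wz = (b & 4) ? p.z : 1.0f - p.z;  // ... in z
        float w = wx * wy * wz;                 // weights sum to 1
        out.x += w * cornerOffset[b].x;
        out.y += w * cornerOffset[b].y;
        out.z += w * cornerOffset[b].z;
    }
    return out;
}
```

Because the eight weights sum to one, dragging all corners by the same vector translates the whole cloud rigidly, while dragging a single corner bends only the region near it, which is exactly the kind of per-camera fudge needed to make slightly mismatched recordings agree.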

I was aware that Kinect point clouds become less accurate at distances beyond 2–3 metres, so I implemented a few custom smoothing techniques. These were eventually dropped in favour of the raw, chaotic Kinect aesthetic, which we agreed to embrace from the early stages of the project. The idea from the start was to make an abstract representation of the band members.

The shoot took place at 3 Mills Studios. Jamie and Will had managed to bring together a huge team of talented people and some incredible technology on a relatively small budget. In addition to the Kinect rig, the technocrane and the Alexa camera, there was also a programmable LED rig built and controlled by Squib. It was a pleasure to be part of the team and see it all come together.

Here’s some behind the scenes footage.

