Cell at Audi City Beijing

I recently travelled to China to install Cell at a new media arts exhibition held at Audi City Beijing. Cell is an installation I made in collaboration with Keiichi Matsuda in 2011. It is a provocation, a comment on the commodification of identity and a vision of how we might present ourselves in the coming years (more here).

It was always our intention to change this installation over time, implementing new technologies and adapting to different contexts. In this instance we decided to give visitors the opportunity to contribute to the piece by submitting tags themselves. This was achieved via a web app that would present the user with one of twenty questions, such as “Where did you meet your first love?” or “What is something you couldn’t live without?”. The answer is submitted and added to the collection of tags. Whereas the original piece allowed users to adopt the roles of fictional characters, the result of this version was a crowdsourced cloud of words and phrases that formed a collective identity over the course of the week-long exhibition.
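For the curious, the logic behind the question and tag pool is tiny. Here’s an illustrative C++ sketch (not the production web app; the structure and naming are just for demonstration, and only two of the twenty questions are shown):

```cpp
// Illustrative sketch of the question/tag pool behind the web app.
#include <cstdlib>
#include <string>
#include <vector>

class TagPool {
public:
    TagPool() {
        questions.push_back("Where did you meet your first love?");
        questions.push_back("What is something you couldn't live without?");
        // ...the remaining eighteen questions
    }
    // Present a visitor with one of the twenty questions at random.
    const std::string& randomQuestion() const {
        return questions[rand() % questions.size()];
    }
    // A submitted answer simply joins the shared collection of tags that
    // builds up the collective identity over the week.
    void submitAnswer(const std::string& answer) {
        tags.push_back(answer);
    }
    const std::vector<std::string>& allTags() const { return tags; }

private:
    std::vector<std::string> questions;
    std::vector<std::string> tags;
};
```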


There were several challenges this time round. The display consisted of two “PowerWalls” which, combined, formed an 8×4 grid of plasma screens – an overall size of 11×3m. We went with a very powerful custom PC (made by Gareth Griffiths of Uberact) as we needed to significantly increase the tag count and split the image over the two walls (using a DualHead2Go). We also needed the extra power as there were five Kinects (all running from separate PCs). This allowed for up to 10 simultaneous users and meant more calculations than usual. Cell is an open source project and the code for the new iteration is available here. The piece requires openFrameworks v0.8.0 and Visual Studio Express 2012.
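The repository has the real code, but as a rough illustration of the plumbing involved: with the Kinects running on separate PCs, the tracked-user data has to reach the main app over the network. Here’s a minimal openFrameworks sketch, assuming OSC messages of the form /user <kinectId> <userId> <x> <y> (the port and message format are assumptions, not the actual Cell protocol):

```cpp
// Hypothetical sketch: aggregating tracked users sent by the five
// Kinect client PCs over OSC. Requires the ofxOsc addon.
#include <map>
#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        receiver.setup(12345); // assumed port
    }
    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(&m);
            // Assumed format: /user <kinectId> <userId> <x> <y>
            if (m.getAddress() == "/user") {
                int kinectId = m.getArgAsInt32(0);
                int userId   = m.getArgAsInt32(1);
                ofVec2f pos(m.getArgAsFloat(2), m.getArgAsFloat(3));
                users[kinectId * 2 + userId] = pos; // up to 10 users
            }
        }
    }
private:
    ofxOscReceiver receiver;
    std::map<int, ofVec2f> users; // keyed by Kinect/user pair
};
```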


I was pleasantly surprised to discover that my app Konstruct (made with Juliet Alliban) was also exhibited at the event. This section was part of the AppArtAwards exhibition and was organised by the Goethe-Institut China and ZKM.

Finally, huge thanks to Audi for holding the exhibition, to ADP Projects for helping to curate the event and acting as producers in Beijing, to Keith Watson for providing some space at Level39 for testing, to Juliet Alliban for helping with the setup and to Gareth Griffiths for building the PC.

Bipolar at Digital Shoreditch

Bipolar is an experiment in using the human form as a medium for sound visualisation. It is an audiovisual virtual mirror that warps the participant’s body as they wander through the space. A soundscape designed by Liam Paton is generated from the presence and motion of the participants. This data, together with sounds from the user and environment, is used to transform the body into a distorted portrait that fluctuates between states of chaos and order.


This piece has evolved from an experiment I made 18 months ago when exploring the possibilities of using the body as a canvas for visualising sound – have a look here for more information on the technology. Since then it has been exhibited at a number of events including Digital Shoreditch, the Wired Popup Store in Regent St, The New Sublime exhibition at Brighton Digital Festival and the BIMA Awards. There are plans to install it at several more spaces in the coming months.

Bipolar at Wired Popup Store

Bipolar at Digital Shoreditch

In the time since the original experiment, Bipolar has gone through several changes and optimisations. The biggest addition is the interactive sound aspect, which was designed by Liam Paton, composer and co-founder of Silent Studios. The idea was to build a dark, abstract soundscape to complement the visuals and react to motion, location and distance. He built the software using Max/MSP and I was able to communicate with it from my openFrameworks app via OSC.
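Here’s a rough sketch of the openFrameworks side of that link (the address pattern and parameters below are illustrative assumptions, not the exact format we used):

```cpp
// Sketch of sending presence/motion data to a Max/MSP patch over OSC.
// Requires the ofxOsc addon; host, port and address are assumptions.
#include "ofMain.h"
#include "ofxOsc.h"

class SoundLink {
public:
    void setup() {
        sender.setup("127.0.0.1", 8000); // Max/MSP patch listening locally
    }
    // Called each frame with values derived from the Kinect data.
    void send(bool present, float motion, float distance) {
        ofxOscMessage m;
        m.setAddress("/bipolar/user");
        m.addIntArg(present ? 1 : 0); // is anyone in the space?
        m.addFloatArg(motion);        // overall movement amount
        m.addFloatArg(distance);      // distance from the sensor
        sender.sendMessage(m);
    }
private:
    ofxOscSender sender;
};
```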

Visually, I wanted to retain the chaotic nature of the original but with a few refinements and optimisations. The main issue with the original version was that the extrusions appeared to be fairly random. Each spike is achieved by extruding a vertex in the direction of its normal, but the normals weren’t very smooth. This was down to the way in which the depth data from the Kinect is presented. To get round this I implemented a custom smoothing algorithm that took place on the GPU (the vertex normals were also calculated by building a normal map on the GPU), which allowed me to create a much more pleasing, highly optimised, organised chaos.

Kinect Processing
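The production smoothing runs on the GPU, but the core idea translates directly to a CPU sketch over the depth-grid mesh: average each normal with its grid neighbours, then displace the vertex along the result. Something like this (illustrative, not the production code):

```cpp
// Smooth each normal against its 4 grid neighbours, then extrude the
// vertex along it. 'amount' would be driven by the audio analysis.
// Assumes an ofMesh laid out as a gridW x gridH depth grid.
#include "ofMain.h"

void extrudeSpikes(ofMesh& mesh, int gridW, int gridH, float amount) {
    const std::vector<ofVec3f>& normals = mesh.getNormals();
    std::vector<ofVec3f> smoothed(normals.size());

    for (int y = 0; y < gridH; y++) {
        for (int x = 0; x < gridW; x++) {
            int i = y * gridW + x;
            ofVec3f sum = normals[i];
            if (x > 0)         sum += normals[i - 1];
            if (x < gridW - 1) sum += normals[i + 1];
            if (y > 0)         sum += normals[i - gridW];
            if (y < gridH - 1) sum += normals[i + gridW];
            smoothed[i] = sum.getNormalized();
        }
    }
    // Displace each vertex along its smoothed normal.
    for (size_t i = 0; i < smoothed.size(); i++) {
        mesh.setVertex(i, mesh.getVertex(i) + smoothed[i] * amount);
    }
}
```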

Another addition was some fake ambient occlusion. The original piece could seem a little flat in places, so this effect was added to create what look like shadows surrounding the spikes. I achieved this by darkening the colour of certain vertices surrounding the extruded vertex. The results should be visible in the image below.

Bipolar shading
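Here’s an illustrative sketch of that darkening pass (the neighbourhood size and tuning values are guesses rather than the production numbers):

```cpp
// Fake ambient occlusion: darken vertices whose neighbours stick out
// past them, so each extruded spike appears to cast a soft shadow on
// its surroundings. Assumes per-vertex colours on a gridW x gridH mesh.
#include "ofMain.h"

void fakeOcclusion(ofMesh& mesh, int gridW, int gridH,
                   const std::vector<float>& spikeHeight) {
    for (int y = 1; y < gridH - 1; y++) {
        for (int x = 1; x < gridW - 1; x++) {
            int i = y * gridW + x;
            // How far the 8 surrounding vertices stick out past this one.
            float occlusion = 0;
            for (int oy = -1; oy <= 1; oy++) {
                for (int ox = -1; ox <= 1; ox++) {
                    float diff = spikeHeight[i + oy * gridW + ox] - spikeHeight[i];
                    if (diff > 0) occlusion += diff;
                }
            }
            // Map accumulated occlusion to a darkening factor (guessed values).
            float shade = ofClamp(1.0f - occlusion * 0.1f, 0.2f, 1.0f);
            mesh.setColor(i, mesh.getColor(i) * shade);
        }
    }
}
```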

At the moment all of the mesh processing is tightly interwoven into the application. I intend to release an addon in the coming weeks that will include most of this functionality, along with some simple hole filling.

The Rite of Spring – A Sound Responsive Laser Performance

A sound-responsive laser installation set to The Rite of Spring, performed with the North Netherlands Symphony Orchestra.

Arcade were commissioned to make a visual accompaniment to Stravinsky’s masterpiece. The project was produced by the Groninger Forum for the Timeshift festival in Holland, to celebrate the 100th anniversary of the controversial first performance of The Rite of Spring. Our response was to construct a virtual architecture from laser beams, transforming the music into a dynamic forest of sound and light.

Fifty lasers were installed in the auditorium, each one connected to an individual instrument. Custom-built electronics allowed them to react to the musicians’ performances; the louder the musician played, the brighter the beam. At certain times mirrors would be moved or unveiled to direct the beams to different areas of the auditorium, creating new abstract forms in space to complement the different movements of the piece.

The resulting walls of light emanating from behind the orchestra and extending through the audience formed a direct spatial visualisation of the music.

Le Sacre du Printemps – North Netherlands Symphony Orchestra and Studio Arcade. Photos by Martin Lambeek and Asami Yoshida.

The performance used 50 custom devices consisting of 20mW laser modules connected to piezos via custom-housed PCBs (designed by Neil Mendoza). The piezos were attached to instruments with specially designed putty to pick up the audio vibrations. The resulting signal was processed by the PCBs (placed beneath the musicians’ seats) and used to control the lasers positioned on the balcony. The lasers were dimmed using PWM, allowing us to reflect the intensity of the musicians’ efforts in the brightness of the beams. Controls were built in to adjust the signal coming from the piezos, allowing us to calibrate each device to a range of different instruments.

These were the only components; the installation responded only to the musicians, so no computers or software were used to control or monitor the performance.
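Purely as an illustration of that signal chain (the boards were Neil’s custom design and, as noted, no software ran the performance; this microcontroller-style sketch just demonstrates the piezo-in, PWM-out behaviour with guessed pins and scaling):

```cpp
// Conceptual Arduino-style sketch of one device: read the piezo level,
// scale it by a per-instrument calibration gain, dim the laser via PWM.
const int PIEZO_PIN = A0; // piezo signal in (assumed pin)
const int LASER_PIN = 9;  // PWM-capable output driving the laser module
int gain = 4;             // per-instrument calibration control

void setup() {
    pinMode(LASER_PIN, OUTPUT);
}

void loop() {
    int level = analogRead(PIEZO_PIN);             // 0-1023 vibration level
    int duty = constrain(level * gain / 16, 0, 255);
    analogWrite(LASER_PIN, duty);                  // louder playing = brighter beam
}
```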

In terms of the environment, we combined a water-based hazer with two smoke machines, which were operated via a DMX controller. Moveable mirrors were used to control the direction of the lasers, giving us several spatial configurations. The mirrors, along with the laser mounts, were designed to fit exactly within the existing architecture of the auditorium without damaging it. We took a fairly analogue approach here, using 3D modelling and CNC to design a series of bespoke but simple MDF fixtures that were manually operated when required.

Custom housed PCB

Preparing the devices

Addressing the orchestra – photo by Gerda Vrugteman

Testing the lasers

In terms of roles, Keiichi Matsuda designed the performance and fixings while I produced the project and oversaw the technical aspects from our side. Neil Mendoza designed and sourced the PCBs and acted as technical consultant. Sander Trispel produced the piece on behalf of the Groninger Forum. Also, huge thanks to our stagehands Martin Lambeek, Tanko and Emiel, to Lee Daley for filming and to Juliet Alliban (and Lee) for doing some last-minute cable tidying.

Arcade – New design studio

I’m excited to announce that I have set up an interdisciplinary design studio called Arcade with Keiichi Matsuda and Will Coleman (the guys I worked on Cell with). We are mostly working on interactive installations, immersive environments, short films and mobile apps at the moment, always aiming to engage, delight and question, and to combine virtual and physical spaces. Check out our showreel below and our temporary web page – http://arcade.cx.

We’re looking for new projects, collaborations, freelancers, interns (programmers and designers) and opportunities. Please get in touch if you are interested in any of the above.

We’ve already published our first project. The Always on Telephone Club is an idea about the future of mobile communication wrapped up as a short creative coding learning experiment. It was commissioned by Nokia for their AlphaLabs initiative. Here’s the documentation.

Feel free to pop by the studio for a cup of tea if you’re around. We’re currently based in Shoreditch.

Vevo Presents: The Maccabees (in the dark)

This live session with The Maccabees is presented by Vevo for the Magners ‘Made in the Dark’ campaign. The brainchild of directors Jamie Roberts and Will Hanke, the performance combines live action footage (shot with an Alexa on a technocrane) with an animated sequence by Jamie Child and James Ballard. The scene was also captured in 3D, which is where I came in. To achieve this we built a rig containing 10 Kinect cameras, each attached to a MacBook Pro: seven facing outward to the crowd and three facing inward to the band.

Three applications were built to achieve this, all using openFrameworks. The client application used ofxKinect to record the point clouds. The millimetre data for each pixel of the depth map was transcoded into 320×240 TIFF images and exported to the hard drive at roughly 32 fps. A server application was used to monitor and control the 10 clients using OSC. Among other tasks, this starts/stops the recording, synchronises the timecode and displays the status, fps and a live preview of the depth map.
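Here’s a rough sketch of the client recorder’s inner loop (frame naming and timecode handling omitted; the details below are assumptions rather than the production code):

```cpp
// Sketch of the recording client: grab the raw depth map (millimetre
// values), downsample to 320x240 and write it out as a 16-bit TIFF.
// Requires the ofxKinect addon.
#include "ofMain.h"
#include "ofxKinect.h"

class Recorder {
public:
    Recorder() : frame(0) {}
    void setup() {
        kinect.init();
        kinect.open();
    }
    void update() {
        kinect.update();
        if (!kinect.isFrameNew()) return;
        ofShortPixels depth = kinect.getRawDepthPixelsRef(); // mm per pixel
        depth.resize(320, 240);
        char name[64];
        sprintf(name, "frames/depth_%06d.tif", frame++);
        ofSaveImage(depth, name); // TIFF keeps the full 16 bits
    }
private:
    ofxKinect kinect;
    int frame;
};
```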

Once the recording had taken place, a separate ‘mesh builder’ app created 3D files from this data. Using this software, the TIFFs are imported and transformed back into their original point cloud structure. A variety of calibration methods are used to rotate, position and warp the point clouds to rebuild the scene and transform it into two meshes, one for the band and another for the crowd. A large sequence of 3D files (.obj) was exported and given to the post-production guys to create the animated sequence in Maya and After Effects. This app also resamples the recorded TIFF and .obj files so that there are only 25 per second, arranged in an easily manageable directory structure.
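The reprojection step looks roughly like this. A hedged sketch, using the commonly quoted Kinect focal length rather than whatever calibration the real app used:

```cpp
// Load a depth TIFF and turn each millimetre sample back into a 3D
// point with the standard pinhole model. fx is halved from the usual
// ~570px Kinect focal length because the maps were saved at 320x240.
#include "ofMain.h"

ofMesh depthTiffToPointCloud(const std::string& path) {
    ofShortPixels depth;
    ofLoadImage(depth, path);

    const float fx = 285.0f;                   // assumed focal length in pixels
    const float cx = depth.getWidth()  * 0.5f; // principal point at the centre
    const float cy = depth.getHeight() * 0.5f;

    ofMesh cloud;
    cloud.setMode(OF_PRIMITIVE_POINTS);
    for (int y = 0; y < depth.getHeight(); y++) {
        for (int x = 0; x < depth.getWidth(); x++) {
            unsigned short z = depth.getPixels()[y * depth.getWidth() + x];
            if (z == 0) continue; // no depth reading at this pixel
            cloud.addVertex(ofVec3f((x - cx) * z / fx,
                                    (y - cy) * z / fx,
                                    z));
        }
    }
    return cloud;
}
```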

The whole project was one long learning experience consisting of many hurdles. Networking, controlling and monitoring so many client machines was a challenge, as was dealing with and formatting such a huge number of files.

One of the greatest challenges was calibrating and combining the band point clouds. I thought it would be possible to implement a single collection of algorithms to format each one, then just rotate and position them. This wasn’t the case: each camera recording was in fact slightly different. I had to implement a sort of cube-warping system to alter each point cloud individually so they would all fit together.
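The cube warp boils down to trilinear interpolation: treat each camera’s cloud as sitting inside a box whose eight corners can be nudged during calibration, then re-position every point by blending the displaced corners. An illustrative sketch (the corner ordering and calibration workflow are assumptions):

```cpp
// Trilinear cube warp: corners[] holds the eight (possibly displaced)
// corner positions; boxMin/boxMax bound the original point cloud.
#include "ofMain.h"

ofVec3f warpPoint(const ofVec3f& p, const ofVec3f corners[8],
                  const ofVec3f& boxMin, const ofVec3f& boxMax) {
    // Normalise p to 0..1 within the original box.
    ofVec3f t = (p - boxMin) / (boxMax - boxMin);

    // Blend along x, then y, then z. Corner order assumed:
    // 0:(0,0,0) 1:(1,0,0) 2:(0,1,0) 3:(1,1,0) 4:(0,0,1) 5:(1,0,1) 6:(0,1,1) 7:(1,1,1)
    ofVec3f c00 = corners[0].getInterpolated(corners[1], t.x);
    ofVec3f c10 = corners[2].getInterpolated(corners[3], t.x);
    ofVec3f c01 = corners[4].getInterpolated(corners[5], t.x);
    ofVec3f c11 = corners[6].getInterpolated(corners[7], t.x);
    ofVec3f c0  = c00.getInterpolated(c10, t.y);
    ofVec3f c1  = c01.getInterpolated(c11, t.y);
    return c0.getInterpolated(c1, t.z);
}
```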

I was aware that Kinect point clouds become less accurate at distances beyond 2-3 metres, so I implemented a few custom smoothing techniques. These were eventually dropped in favour of the raw, chaotic Kinect aesthetic, which we agreed to embrace from the early stages of the project. The idea from the start was to make an abstract representation of the band members.

The shoot took place at 3 Mills Studios. Jamie and Will had managed to bring together a huge team of talented people and some incredible technology with a relatively small budget. In addition to the Kinect rig, the technocrane and the Alexa camera, there was also a programmable LED rig built and controlled by Squib. It was a pleasure to be part of the team and see it all come together.

Here’s some behind the scenes footage.

Cell in Shanghai

I was recently invited to Shanghai for a week to set up Cell with Keiichi Matsuda. It was for an art/music/film/food event organised by Emotion China. They had a 5×3 LCD wall erected specifically to display the piece, which made a huge difference compared to the usual rear-projection configuration.

Here are a few photos:



More photos of the trip here.

TED@London talk

I was fortunate enough to be invited to speak at the TED@London salon event at the British Library back in April. This is a TED-curated event that is part of their global talent search. The best speakers will be invited to speak at TED2013 in California. It was a fun and thought-provoking evening and it was great to meet so many of the speakers, organisers and attendees.

I spoke about my mobile art apps. You can see the talk here – and now below. If you enjoy it, please vote and leave some lovely comments ;)

James Alliban at TED@London

Photo by Lee Daley

Traces


Traces is an interactive installation that was commissioned by The Public in West Bromwich for their Art of Motion exhibition and produced by Nexus Interactive Arts. The piece encourages the user to adopt the roles of both performer and composer. It is an immersive, abstract mirror that offers an alternative approach to using the body to communicate emotion. Kinect cameras and custom software are used to capture and process movement and form. This data is translated into abstract generative graphics that flow and fill the space, offering transient glimpses of the body. An immersive generative soundscape by David Kamp reacts to the user’s presence and motion. Traces relies on continuous movement in space in order to create shadows of activity. Once the performer stops moving, the piece disintegrates over time and returns to darkness.

The Public is an incredible five-storey gallery, community centre and creative hub in West Bromwich, dedicated to showcasing the work of interactive artists and local people. I’ve been in talks with the curator Graham Peet for the last year with a view to potentially contributing. A few months ago, he commissioned me to build a new piece for the “Art of Motion” exhibition. I thought this would be an ideal opportunity to work with Nexus Interactive Arts. They were interested in the piece and agreed to produce it. The producer, Beccy McCray, introduced me to Berlin-based sound designer and composer David Kamp, who did an excellent job with the generative soundscape.

Traces exhibition entrance

My aim was to build an installation that suited not only the theme of the show but the themes of play, discovery and creativity that already permeate the gallery spaces of The Public.


Traces was built with openFrameworks and Sadam Fujioka's ofxKinectNui addon. This allowed me to use Windows Kinects (kindly donated by Microsoft) and the new Microsoft Kinect SDK 1.0.
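On the rendering side, the way the piece disintegrates and returns to darkness is a classic trails setup: rather than clearing the screen each frame, everything is drawn into an FBO that fades slightly towards black every frame. A minimal openFrameworks sketch of the general approach (not the exact Traces code):

```cpp
// Trails via an FBO that is faded rather than cleared each frame.
// The fade alpha controls how quickly the piece returns to darkness.
#include "ofMain.h"

class Trails {
public:
    void setup(int w, int h) {
        fbo.allocate(w, h, GL_RGBA);
        fbo.begin();
        ofClear(0, 0, 0, 255);
        fbo.end();
    }
    void beginFrame() {
        fbo.begin();
        ofSetColor(0, 0, 0, 10); // low alpha = slow disintegration
        ofRect(0, 0, fbo.getWidth(), fbo.getHeight());
        ofSetColor(255);
        // ...draw this frame's generative graphics here...
    }
    void endFrame() { fbo.end(); }
    void draw() { fbo.draw(0, 0); }
private:
    ofFbo fbo;
};
```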

The show runs from 30th May until 9th September. I would highly recommend that anyone with an interest in interactive art take a trip to West Bromwich to visit The Public. In addition to the exhibition there are many other excellent pieces.

Here’s an excellent write-up and interview on the Creators Project.
