
Arcade – New design studio

I’m excited to announce that I have set up an interdisciplinary design studio called Arcade with Keiichi Matsuda and Will Coleman (the guys I worked on Cell with). We are mostly working on interactive installations, immersive environments, short films and mobile apps at the moment, always with a view to engaging, delighting, questioning, and combining virtual and physical spaces. Check out our showreel below and our temporary web page – http://arcade.cx.

We’re looking for new projects, collaborations, freelancers, interns (programmers and designers) and opportunities. Please get in touch if you are interested in any of the above.

We’ve already published our first project. The Always on Telephone Club is an idea about the future of mobile communication wrapped up as a short creative coding learning experiment. It was commissioned by Nokia for their AlphaLabs initiative. Here’s the documentation.

Feel free to pop by the studio for a cup of tea if you’re around. We’re currently based in Shoreditch.

Vevo Presents: The Maccabees (in the dark)

This live session with The Maccabees is presented by Vevo for the Magners ‘Made in the Dark’ campaign. The brainchild of directors Jamie Roberts and Will Hanke, the performance combines live action footage (shot with an Alexa on a technocrane) with an animated sequence by Jamie Child and James Ballard. The scene was also shot in 3D, which is where I came in. To achieve this we built a rig containing 10 Kinect cameras, each attached to a MacBook Pro: seven facing outward to the crowd and three facing inward to the band.

Three applications were built to achieve this, all using openFrameworks. The client application used ofxKinect to record the point clouds. The millimetre data for each pixel of the depth map was transcoded into 320×240 TIFF images and exported to the hard drive at roughly 32 fps. A server application was used to monitor and control the 10 clients using OSC. Among other tasks, this starts/stops the recording, synchronises the timecode and displays the status, fps and a live preview of the depth map.
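For anyone curious, here’s a rough sketch of how one of these recording clients might be structured in openFrameworks, pairing ofxKinect with ofxOsc. The OSC addresses, port and file naming are placeholder assumptions rather than what we actually used:

```cpp
// Minimal recording-client sketch: save raw Kinect depth frames as 16-bit
// TIFFs while listening for OSC transport commands from the server.
// Addresses, port and naming are illustrative placeholders.
#include "ofMain.h"
#include "ofxKinect.h"
#include "ofxOsc.h"

class RecorderApp : public ofBaseApp {
public:
    ofxKinect kinect;
    ofxOscReceiver osc;
    bool recording = false;
    int frameNum = 0;

    void setup() {
        kinect.init();
        kinect.open();
        osc.setup(9000); // placeholder control port
    }

    void update() {
        while (osc.hasWaitingMessages()) {
            ofxOscMessage m;
            osc.getNextMessage(m);
            if (m.getAddress() == "/record/start") { recording = true; frameNum = 0; }
            if (m.getAddress() == "/record/stop")  { recording = false; }
        }
        kinect.update();
        if (recording && kinect.isFrameNew()) {
            // The raw depth map holds millimetres per pixel; resize to
            // 320x240 and store it losslessly as a 16-bit TIFF.
            ofShortPixels depth = kinect.getRawDepthPixels();
            depth.resize(320, 240);
            ofSaveImage(depth, "frames/" + ofToString(frameNum++, 6, '0') + ".tif");
        }
    }
};
```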

Once the recording had taken place, a separate ‘mesh builder’ app created 3D files from this data. Using this software, the TIFFs are imported and transformed back into their original point cloud structure. A variety of calibration methods are used to rotate, position and warp the point clouds to rebuild the scene and transform it into two meshes, one for the band and another for the crowd. A large sequence of 3D files (.obj) was exported and given to the post production guys to create the animated sequence in Maya and After Effects. This app also formats the recorded TIFF and .obj files so that there are only 25 per second and they sit in an easily manageable directory structure.
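The heart of the reconstruction step can be sketched in a few lines: each recorded depth value is reprojected through approximate Kinect intrinsics. The focal length below is an assumed constant, not our calibrated value:

```cpp
// Rebuild a point cloud from one recorded 320x240 depth TIFF by
// reprojecting each millimetre value through approximate Kinect
// intrinsics. The focal length is an assumed constant.
#include "ofMain.h"

ofMesh depthToPointCloud(const ofShortPixels& depth) {
    const float fx = 285.0f; // ~570px at 640x480, halved for 320x240
    const float cx = depth.getWidth()  * 0.5f;
    const float cy = depth.getHeight() * 0.5f;

    ofMesh cloud;
    cloud.setMode(OF_PRIMITIVE_POINTS);
    for (int y = 0; y < (int)depth.getHeight(); y++) {
        for (int x = 0; x < (int)depth.getWidth(); x++) {
            float z = depth[y * depth.getWidth() + x]; // depth in mm
            if (z == 0) continue;                      // 0 = no reading
            cloud.addVertex(ofVec3f((x - cx) * z / fx,
                                    (y - cy) * z / fx,
                                    z));
        }
    }
    return cloud;
}
```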

The whole project was one long learning experience consisting of many hurdles. Networking, controlling and monitoring so many client machines was a challenge, as was dealing with and formatting such a huge number of files.

One of the greatest challenges was calibrating and combining the band point clouds. I thought it would be possible to implement a single collection of algorithms to format each one, then just rotate and position them. This wasn’t the case. Each camera recording was in fact slightly different, so I had to implement a sort of cube warping system to alter each point cloud individually so they would all fit together.
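In essence, each cloud gets its own set of eight movable cube corners, and every point is trilinearly interpolated between them. A simplified, hypothetical version of the idea:

```cpp
// Cube-warp sketch: every point is trilinearly interpolated between 8
// draggable cube corners, so nudging a corner bends that region of the
// cloud. Hypothetical helper, not the production code.
#include "ofMain.h"

ofVec3f cubeWarp(const ofVec3f& p, const ofVec3f corners[8],
                 const ofVec3f& boxMin, const ofVec3f& boxMax) {
    // Normalised position of the point inside its original bounding box.
    float u = ofMap(p.x, boxMin.x, boxMax.x, 0, 1, true);
    float v = ofMap(p.y, boxMin.y, boxMax.y, 0, 1, true);
    float w = ofMap(p.z, boxMin.z, boxMax.z, 0, 1, true);

    // Trilinear blend of the 8 warped corners, ordered as an xyz bit
    // pattern (0 = min corner, 7 = max corner).
    ofVec3f bottom = corners[0].getInterpolated(corners[1], u)
                     .getInterpolated(corners[2].getInterpolated(corners[3], u), v);
    ofVec3f top    = corners[4].getInterpolated(corners[5], u)
                     .getInterpolated(corners[6].getInterpolated(corners[7], u), v);
    return bottom.getInterpolated(top, w);
}
```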

I was aware that Kinect point clouds become less accurate at distances beyond 2-3 metres so I implemented a few custom smoothing techniques. These were eventually dropped in favour of the raw chaotic Kinect aesthetic, which we agreed to embrace from the early stages of the project. The idea from the start was to make an abstract representation of the band members.
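The simplest technique of this kind is a per-pixel exponential moving average over time, which damps the shimmer at longer ranges. A generic sketch, not the exact code I used:

```cpp
// Temporal depth smoothing sketch: exponential moving average per pixel.
#include "ofMain.h"

void smoothDepth(const ofShortPixels& raw, std::vector<float>& smoothed,
                 float alpha = 0.3f) {
    size_t n = raw.getWidth() * raw.getHeight();
    if (smoothed.size() != n) smoothed.assign(n, 0.0f);
    for (size_t i = 0; i < n; i++) {
        if (raw[i] == 0) continue; // keep the last estimate on dropout
        smoothed[i] = ofLerp(smoothed[i], (float)raw[i], alpha);
    }
}
```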

The shoot took place at 3 Mills Studios. Jamie and Will had managed to bring together a huge team of talented people and some incredible technology on a relatively small budget. In addition to the Kinect rig, the technocrane and the Alexa camera, there was also a programmable LED rig built and controlled by Squib. It was a pleasure to be part of the team and see it all come together.

Here’s some behind the scenes footage.

Cell in Shanghai

I was recently invited to Shanghai for a week to set up Cell with Keiichi Matsuda, for an art/music/film/food event organised by Emotion China. They had a 5×3 LCD wall erected specifically to display the piece, which made a huge difference compared to the usual rear projection configuration.

Here are a few photos:


More photos of the trip here.

TED@London talk

I was fortunate enough to be invited to speak at the TED@London salon event at the British Library back in April. This is a TED-curated event that is part of their global talent search; the best speakers will be invited to speak at TED2013 in California. It was a fun and thought-provoking evening and it was great to meet so many of the speakers, organisers and attendees.

I spoke about my mobile art apps. You can see the talk here – and now below. If you enjoy it please vote and leave some lovely comments 😉

James Alliban at TED@London

Photo by Lee Daley

Traces


Traces is an interactive installation that was commissioned by The Public in West Bromwich for their Art of Motion exhibition and produced by Nexus Interactive Arts. The piece encourages the user to adopt the roles of both performer and composer. It is an immersive, abstract mirror that offers an alternative approach to using the body to communicate emotion. Kinect cameras and custom software are used to capture and process movement and form. This data is translated into abstract generative graphics that flow and fill the space offering transient glimpses of the body. An immersive generative soundscape by David Kamp reacts to the user’s presence and motion. Traces relies on continuous movement in space in order to create shadows of activity. Once the performer stops moving, the piece disintegrates over time, and returns to darkness.
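The disintegration behaviour can be sketched with a standard openFrameworks trick: render the graphics into an FBO and fade it slightly towards black every frame, so the image persists only while new movement keeps feeding it. A generic illustration rather than the actual Traces renderer:

```cpp
// Fade-to-darkness sketch: a low-alpha black rectangle drawn into the FBO
// each frame slowly erases old marks unless fresh movement adds new ones.
#include "ofMain.h"

class TrailsSketch : public ofBaseApp {
public:
    ofFbo trail;

    void setup() {
        trail.allocate(ofGetWidth(), ofGetHeight(), GL_RGBA);
        trail.begin(); ofClear(0, 0, 0, 255); trail.end();
    }

    void update() {
        trail.begin();
        ofEnableAlphaBlending();
        ofSetColor(0, 0, 0, 8); // fade rate: lower alpha = longer trails
        ofDrawRectangle(0, 0, trail.getWidth(), trail.getHeight());
        // Wherever the tracked body is moving, fresh marks would be drawn
        // here, e.g. particles emitted from Kinect joint positions (omitted).
        trail.end();
    }

    void draw() {
        ofSetColor(255);
        trail.draw(0, 0);
    }
};
```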

The Public is an incredible five-storey gallery, community centre and creative hub in West Bromwich that is concerned with showcasing the work of interactive artists and local people. I’d been in talks with the curator Graham Peet for the last year with a view to potentially contributing. A few months ago, he commissioned me to build a new piece for the “Art of Motion” exhibition. I thought this would be an ideal opportunity to work with Nexus Interactive Arts. They were interested in the piece and agreed to produce it. The producer, Beccy McCray, introduced me to Berlin-based sound designer and composer David Kamp, who did an excellent job with the generative soundscape.

Traces exhibition entrance

My aim was to build an installation that suited not only the theme of the show but the themes of play, discovery and creativity that already permeate the gallery spaces of The Public.


Traces was built with openFrameworks and Sadam Fujioka‘s ofxKinectNui addon. This allowed me to use Windows Kinects (kindly donated by Microsoft) and the new Microsoft Kinect SDK 1.0.

The show runs from 30th May until 9th September. I would highly recommend that anyone with an interest in interactive art take a trip to West Bromwich to visit The Public. In addition to the exhibition there are many other excellent pieces.

Here’s an excellent write up and interview on the Creators Project.


Bipolar

Bipolar is the result of a short experimental journey into visualising sound using computer vision. The initial idea was to capture a mesh of my face in realtime and warp it using the sound buffer data coming in from the microphone as I speak. My first attempt used ofxFaceTracker, but I had trouble segmenting the mesh, so I moved to the Kinect camera. I had a rough idea of how the final result might look but it turned out quite differently.

As this intense spiky effect began to take shape I realised it would be perfect for the chaotic, dark sound of dubstep. Thankfully, I knew just the guy to help. I met the DJ and producer Sam Pool AKA SPL at the Fractal ’11 event in Colombia. He kindly offered to contribute some music to any future projects, so I checked out his offerings on SoundCloud and found the perfect track in Lootin ’92 by 12th Planet and SPL. This, of course, meant I would have to perform to the music. Apologies in advance for any offence caused by my “dancing” 🙂

This was built using openFrameworks and Theo Watson’s ofxKinect addon, which now offers excellent depth-to-RGB calibration. I’m building a mesh from this data and calculating all the face and vertex normals. Every second vertex is then extruded in the direction of its normal using values taken from the microphone.
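The extrusion step itself is only a few lines. Roughly, with illustrative names:

```cpp
// Extrusion sketch: push every second vertex of the Kinect face mesh out
// along its normal by an amount driven by the current microphone level.
#include "ofMain.h"

void extrudeMesh(const ofMesh& base, ofMesh& out, float micLevel,
                 float maxExtrusion = 80.0f) {
    out = base; // start from the un-deformed mesh each frame
    for (std::size_t i = 0; i < out.getNumVertices(); i += 2) {
        ofVec3f n = out.getNormal(i);
        out.setVertex(i, out.getVertex(i) + n * micLevel * maxExtrusion);
    }
}

// micLevel could be, say, the RMS of the latest audio buffer, computed in
// ofBaseApp::audioIn(float* input, int bufferSize, int nChannels).
```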

The project is still at the prototype stage and needs some refactoring and optimisation. Once it is looking a little better I will release the code.

Composite for Windows Phone 7

I’m proud to announce that Composite is now available for Windows Phone 7 devices.

My latest project, Cell (a collaboration with Keiichi Matsuda), was supported by Microsoft. Will Coleman, the UK Windows Phone product manager, was our main point of contact throughout. After I showed him my mobile apps, he expressed an interest in creating a Windows Phone version of Composite. A few weeks later this was confirmed.

Brighton-based agency Matchbox Mobile, who were also involved with Cell, were brought in to develop the app and work with me and Juliet to keep to the original concept and style. It was a pleasure to work with these guys again and interesting to see Composite adapted for the Metro design language. Big thanks to Juliet for redesigning the website (with lightning speed) and tweaking the branding. Also, thanks to Steven Trew for helping to organise and maintain the partnership.

In terms of the aesthetic output I’m very pleased. We have managed to stay very close to the original app. Here are a few images I’ve made so far:

Composite

So install it, let me know what you think, and as ever, I’d love to see what you make with it. Please send your work to submissions [at] composite-app.com to be featured in the Composite gallery.

Now to make the iPhone 4 version. Perfect excuse to get the 4S 🙂

Cell


Cell at the Alpha-Ville festival

Cell is an interactive installation commissioned for the Alpha-Ville festival, a collaboration between myself and Keiichi Matsuda. It plays with the notion of the commodification of identity by mirroring the visitors in the form of randomly assigned personalities mined from online profiles. It aims to get the visitors thinking about the way in which we use social media to fabricate our second selves, and how these constructed personae define and enmesh us. As users enter the space they are assigned a random identity. Over time, tags floating in the cloud begin to move towards and stick to the users until they are represented entirely as a tangled web of data seemingly bringing together our physical and digital selves.
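The tag behaviour can be reduced to a toy sketch: each tag eases toward the nearest tracked user and sticks once it gets close enough. The real piece works on full skeleton data; this is just the gist:

```cpp
// Tag-attraction sketch: a floating tag drifts toward the nearest user
// position and "sticks" within a threshold. Hypothetical reduction of the
// real system.
#include "ofMain.h"

struct Tag {
    ofVec2f pos;
    bool stuck = false;

    void update(const std::vector<ofVec2f>& users) {
        if (stuck || users.empty()) return;
        // Find the nearest tracked user point.
        ofVec2f nearest = users[0];
        for (const ofVec2f& u : users)
            if (pos.distance(u) < pos.distance(nearest)) nearest = u;
        // Ease toward it; attach when within a threshold.
        pos += (nearest - pos) * 0.02f;
        if (pos.distance(nearest) < 5.0f) stuck = true;
    }
};
```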

I first got in touch with the organisers of the festival, Estela Oliva and Carmen Salas, around May with a view to contributing. They asked if I knew of Keiichi Matsuda, and whether I would be interested in a collaboration. Coincidentally, we had met up a month before and had discussed the idea of joining forces, as our areas of research are very similar. We come from different fields: he architecture and film making, me new media art and interaction design. This turned out to be a perfect combination. We shared the concept and design; Keiichi focussed on the fabrication, planning the space and putting together the documentary, while I happily wrote the software. Even with these distributed roles we found we were often offering suggestions and help to each other throughout the course of the project.


The concept wall

Microsoft have supported the project from the early stages. Keiichi and I were both speaking at an event in June when we met Paul Foster, who was promoting the MS Kinect for Windows SDK. We discussed our project, which would be utilising the Kinect camera, and he was interested in helping out. He introduced us to William Coleman, and since then they have supplied all the equipment and funded the studio space (thanks to Tim Williams and Tom Hogan at Lumacuostics for putting us up and all the advice).

In addition to this, Microsoft also introduced us to Simon Hamilton Ritchie, who runs Brighton-based agency Matchbox Mobile. These guys contributed a great deal to the project, most importantly ofxMSKinect, an openFrameworks addon for the official Kinect SDK. One of the main advantages of using this over the hacked drivers is the automatic user recognition: we no longer need to pull that annoying calibration stance, which can be a big barrier in a piece such as Cell. In addition to depth/skeleton tracking, the potential for utilising the voice recognition capabilities is an exciting prospect for the interactive arts community. This will be integrated into ofxMSKinect in the coming months.


Skeletal data from 4 Kinect cameras

So, on to the setup. Halfway through the project we realised that we would only be able to track two skeletons using a single Kinect camera. While this is fine for gaming, for a large scale interactive experience it would not be enough, so instead of one camera we decided to go with four! We organised four Dell XPS 15 laptops, each connected to a Kinect camera. The skeletal data from each client is fed to an Alienware M17x laptop over a local area network (with help from Matchbox), giving us the potential to track the skeletal data of up to eight users in a space of around 5m x 4m. The software on the Alienware server then calculates and renders the scene, which is rear projected onto a large screen using a BenQ SP840 projector.
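The client-to-server plumbing is simple to sketch with ofxOsc: one message per joint per frame from each client. The addresses, port and server IP below are placeholders, not the actual Cell protocol:

```cpp
// Skeleton-streaming sketch: each client sends its tracked joints to the
// server over the LAN as OSC messages, one message per joint per frame.
#include "ofMain.h"
#include "ofxOsc.h"

void sendSkeleton(ofxOscSender& sender, int userId,
                  const std::vector<ofVec3f>& joints) {
    for (std::size_t j = 0; j < joints.size(); j++) {
        ofxOscMessage m;
        m.setAddress("/skeleton/" + ofToString(userId) + "/" + ofToString(j));
        m.addFloatArg(joints[j].x);
        m.addFloatArg(joints[j].y);
        m.addFloatArg(joints[j].z);
        sender.sendMessage(m);
    }
}

// In setup(), on each of the four clients:
//   ofxOscSender sender;
//   sender.setup("192.168.1.10", 8000); // placeholder server address/port
```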

The screen posed a bit of a challenge. We could either rent one for a ridiculous price or build our own and have complete freedom over the design. This was important to us so Keiichi put his woodwork skills to the test and made a 4.2m x 1.8m screen that can be reduced to 1.5m. Quite an achievement for a rear projection screen with no supporting beams! We used ROSCO grey screen material which was perfect for our requirements.

Setting up

Keiichi and Iannish preparing the screen at the Alpha-Ville festival

We were very pleased with the reaction to Cell. The feedback from the festival goers was really positive. It was important to us that the participants were both interested in the concept and taken by the experience. Many that we spoke to seemed to engage with the piece on both levels.

If you would like any more information please visit the Cell website. If you would like to contact us regarding this piece, please email – info [at] installcell.com

I’d like to thank the following for their help in realising this piece (in order of appearance):

Carmen Salas, Estela Oliva, Paul Foster, Will Coleman, Simon Hamilton Ritchie, Theo Watson, Kyle McDonald, Arturo Castro, Tim Williams, Tom Hogan, Claire Holdsworth, Vincent Oliver and Iannish Posooa.





