Archive for the 'openFrameworks' Category

Lipton T.O

This was a collaboration with DigitasLBi Labs Paris to help launch the T.O machine from Lipton. The end result was a digital bar set up at the BHV Marais in Paris. The idea was to build a bar that visually augmented the flavour of the tea as the user was drinking it. I was commissioned to create a series of realtime animated simulations inspired by the flavour, story and colour of 10 of the teas.

Here is the video documentation:

And here is a short video showcasing the graphics a little more:

The animation, effects, normal calculation, etc. took place entirely on the GPU. This allowed me to procedurally animate scenes containing over 1 million particles/geometry points, to use a variety of shading and post-processing effects (including realtime shadows), to draw to a canvas of 4920×2160 (4K + 1080p combined) and to maintain a solid 60fps. It all ran on an Alienware PC with an NVIDIA GeForce GTX 980 graphics card, and, as usual, the app was programmed using C++ and openFrameworks.
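
The project code isn’t public, but the core pattern – keeping a large point mesh in a VBO and displacing it procedurally in a vertex shader every frame, so the CPU never touches the particle data – can be sketched in openFrameworks. Everything below (the sine-field motion, the counts, the shader) is illustrative rather than the production code:

#include "ofMain.h"

// Vertex shader (GLSL 1.20, default GL2 renderer): displace each point
// through a moving sine field so no per-frame CPU work is needed.
static const string DISPLACE_VERT = R"(
#version 120
uniform float time;
void main() {
    vec4 pos = gl_Vertex;
    pos.xyz += 20.0 * vec3(sin(time + pos.y * 0.01),
                           sin(time + pos.z * 0.01),
                           sin(time + pos.x * 0.01));
    gl_Position = gl_ModelViewProjectionMatrix * pos;
    gl_FrontColor = gl_Color;
}
)";

class ofApp : public ofBaseApp {
public:
    ofVboMesh points;
    ofShader shader;
    ofEasyCam cam;

    void setup() {
        points.setMode(OF_PRIMITIVE_POINTS);
        for (int i = 0; i < 1000000; i++) {
            points.addVertex(ofVec3f(ofRandom(-500, 500),
                                     ofRandom(-500, 500),
                                     ofRandom(-500, 500)));
        }
        shader.setupShaderFromSource(GL_VERTEX_SHADER, DISPLACE_VERT);
        shader.linkProgram();
    }

    void draw() {
        cam.begin();
        shader.begin();
        shader.setUniform1f("time", ofGetElapsedTimef());
        points.draw();
        shader.end();
        cam.end();
    }
};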

Here are a few screenshots taken at various stages of the project:

[Screenshots: lipton_0 – lipton_7]

TOTEM

Here’s a cheeky little thing I made at a Kinect Hackathon that was hosted by Microsoft last year. The aim was to create a surreal and creative experience that resulted in a kaleidoscopic 3D collage. It was supposed to be a “vase generator” but it ended up being so much more…

I’ve looked into the possibility of adapting the app to create zoetrope-like physical sculptures using a colour 3D printer. The cost of commercial printing made this unrealistic though. I think this could be a brilliant storytelling platform, so if anyone can help bring this 21st century totem to life please get in touch.

Oh, and just to clarify, I’m not a huge narcissist and I haven’t gone completely bonkers 🙂

[Screenshots: TOTEM_1 – TOTEM_4]

NOVA

Nova is an experimental study of the visual arts technique slit-scan, with a particular focus on emphasising its spatial and temporal properties. The slit-scan image is created using video footage of bioluminescent deep-sea creatures, particularly comb jellyfish. These creatures acted as a source of inspiration for the form and animation. The resulting structure resembles a surreal organism locked into a perpetual state of growth, decay and transformation.

The structure is built and rendered in realtime and programmed in C++ using openFrameworks. For more info on the technical side have a look at this article on Creative Applications.
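
For anyone curious about the underlying technique rather than Nova’s far more elaborate implementation, the essence of slit-scan fits in a tiny openFrameworks sketch: each new video frame contributes one pixel column to an accumulating image, so the x axis becomes a time axis (the resolution and sampled column here are arbitrary choices):

#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofVideoGrabber grabber;
    ofImage scan;
    int column = 0;

    void setup() {
        grabber.setup(640, 480);
        scan.allocate(640, 480, OF_IMAGE_COLOR);
        scan.getPixels().setColor(ofColor::black);
        scan.update();
    }

    void update() {
        grabber.update();
        if (grabber.isFrameNew()) {
            ofPixels& src = grabber.getPixels();
            // copy the centre column of the camera frame into the next slot,
            // wrapping around so the image scrolls forever
            for (int y = 0; y < 480; y++) {
                scan.getPixels().setColor(column, y, src.getColor(320, y));
            }
            scan.update();
            column = (column + 1) % 640;
        }
    }

    void draw() {
        scan.draw(0, 0);
    }
};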

Music – No Idea by Luk Dab

Headphones are definitely recommended.

Here are a few screenshots taken at various stages of the project.

[Screenshots: nova_0_0 – nova_20]

Last Train

[Image: the Ron Arad fist with Steinmetz diamond ring]

Last Train is a project by Ron Arad that I have been involved in over the last couple of years. The piece was commissioned by the diamond company Steinmetz and inspired by a memory from Arad’s youth: as he attempted to catch the last train out of Naples, he saw a young man scratching an elaborate image onto a train window with a diamond ring. Arad missed his train but felt that he had been rewarded by seeing this beautiful work. The piece itself is an interactive drawing tool that allows participants to create glass etchings using an iPad. The lines sketched on the device are sent to a custom-built CNC machine where they are recreated by a mechanical fist adorned with a ring designed by Arad.

Many artists were invited to contribute a sketch to this project – Antony Gormley, Grayson Perry, David Shrigley, Tim Noble, Sue Webster and Richard Wilson to name just a few. Ai Weiwei also recorded a sketch remotely from China. The resulting glass etchings are exhibited next to the piece in illuminated glass frames.


My role in this project was to design and build the software that records the drawings and sends them to the separately contracted machine. There are two pieces of software, both built using openFrameworks. The iPad app records drawn lines and continuously streams them wirelessly to the PC software via OSC. The PC app receives this data, formats it and sends it in a timely fashion to the CNC machine via ofSerial (monitoring the machine’s internal buffer to avoid overloading it). Finally, the machine recreates the sketch by moving the cast of Arad’s fist and running the diamond across the glass.
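
As an illustration of the iPad side, here is a minimal sketch using ofxOsc – the address pattern, host and port are placeholders, not the project’s actual protocol:

#include "ofMain.h"
#include "ofxOsc.h"

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;
    ofPolyline currentLine;

    void setup() {
        // host and port are placeholders for the PC running the receiver app
        sender.setup("192.168.0.10", 9000);
    }

    void touchMoved(ofTouchEventArgs& touch) {
        currentLine.addVertex(touch.x, touch.y);
        // stream each new point as it is drawn rather than waiting for the
        // full line, so the receiving app can act on it immediately
        ofxOscMessage m;
        m.setAddress("/line/point"); // hypothetical address pattern
        m.addFloatArg(touch.x);
        m.addFloatArg(touch.y);
        sender.sendMessage(m, false);
    }
};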

I take a great deal of satisfaction in working on tools that allow people to express themselves in new forms, so this was a particularly rewarding project to be a part of. Seeing so many esteemed artists use the piece to produce new work was a highlight.

Cell at Audi City Beijing

I recently travelled to China to install Cell at a new media arts exhibition held at Audi City Beijing. Cell is an installation I made in collaboration with Keiichi Matsuda in 2011; it is a provocation, a comment on the commodification of identity and a vision of how we might present ourselves in the coming years (more here).

It was always our intention to change this installation over time, to implement new technologies and adapt to different contexts. In this instance we decided to give visitors the opportunity to contribute to the piece by submitting tags themselves. This was achieved via a web app that would present the user with one of twenty questions, such as “Where did you meet your first love?” or “What is something you couldn’t live without?”. The answer is submitted and added to the collection of tags. Whereas the original piece allowed users to adopt the roles of fictional characters, the result of this version was a crowd-sourced cloud of words and phrases that formed a collective identity over the course of the week-long exhibition.

[Images: Cell_1, Cell_2]

There were several challenges this time round. The display consisted of two “PowerWalls” that together comprised an 8×4 grid of plasma screens – an overall size of 11×3m. We went with a very powerful custom PC (made by Gareth Griffiths of Uberact) as we needed to significantly increase the tag count and split the image over the two walls (using a DualHead2Go). We also needed the extra power as there were five Kinects (all running from separate PCs). This allowed for up to 10 simultaneous users and meant more calculations than usual. Cell is an open source project and the code for the new iteration is available here. The piece requires openFrameworks v0.8.0 and Visual Studio Express 2012.

[Image: Cell_3]

I was pleasantly surprised to discover that my app Konstruct (made with Juliet Alliban) was also exhibited at the event. This section was part of the AppArtAwards exhibition and was organised by the Goethe-Institut China and ZKM.

Finally, huge thanks to Audi for holding the exhibition, to ADP Projects for helping to curate the event and acting as producers in Beijing, to Keith Watson for providing some space at Level39 for testing, to Juliet Alliban for helping with the setup and to Gareth Griffiths for building the PC.

Bipolar at Digital Shoreditch

Bipolar is an experiment in using the human form as a medium for sound visualisation. It is an audiovisual virtual mirror that warps the participant’s body as they wander through the space. A soundscape designed by Liam Paton is generated from the presence and motion of the participants. The data from this (in addition to sounds from the user and environment) is used to transform the body into a distorted portrait that fluctuates between states of chaos and order.

[Images: Bipolar]

This piece has evolved from an experiment I made 18 months ago exploring the possibilities of using the body as a canvas for visualising sound – have a look here for more information on the technology. Since then it has been exhibited at a number of events including Digital Shoreditch, the Wired Popup Store on Regent St, the New Sublime exhibition at Brighton Digital Festival and the BIMA Awards. There are plans to install it at several more spaces in the coming months.

[Image: Bipolar at the Wired Popup Store]

[Image: Bipolar at Digital Shoreditch]

In the time since the original experiment, Bipolar has gone through several changes and optimisations. The biggest addition is the interactive sound aspect, which was designed by Liam Paton, composer and co-founder of Silent Studios. The idea was to build a dark, abstract soundscape to complement the visuals and react to motion, location and distance. He built the software using Max/MSP and I was able to communicate with it from my openFrameworks app via OSC.
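
The OSC bridge itself is simple. A hypothetical sketch of the openFrameworks side, assuming ofxOsc and made-up address patterns, parameters and port (the real parameter set Liam’s patch listens for isn’t documented here):

#include "ofMain.h"
#include "ofxOsc.h"

// Sends per-frame motion parameters to the Max/MSP patch.
class SoundLink {
public:
    ofxOscSender sender;

    void setup() {
        // assumes Max runs on the same machine, listening on port 8000
        sender.setup("127.0.0.1", 8000);
    }

    void send(float motionAmount, const ofVec2f& centroid) {
        ofxOscMessage m;
        m.setAddress("/bipolar/motion"); // hypothetical address pattern
        m.addFloatArg(motionAmount);     // 0..1 overall activity
        m.addFloatArg(centroid.x);       // normalised body position
        m.addFloatArg(centroid.y);
        sender.sendMessage(m, false);
    }
};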

Visually, I wanted to retain the chaotic nature of the original but with a few refinements and optimisations. The main issue with the original version was that the extrusions appeared to be fairly random. Each spike is achieved by extruding a vertex in the direction of its normal, but the normals weren’t very smooth. This was down to the way in which the depth data from the Kinect is presented. To get round this I implemented a custom smoothing algorithm that runs on the GPU (the vertex normals were also calculated by generating a normal map on the GPU), which allowed me to create a much more pleasing, highly optimised organised chaos.

[Image: Kinect Processing]
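
As an illustration of the normal-map step (a sketch only – the real Bipolar shaders aren’t published, and the depth scale below is an assumed constant): central differences on neighbouring depth texels give a screen-space gradient, which can be packed into an RGB normal map rendered into an FBO.

#include "ofMain.h"

// Fragment shader: central differences on the depth texture approximate the
// surface gradient, packed into an RGB normal map. GLSL 1.20 / GL2 renderer.
static const string NORMAL_FRAG = R"(
#version 120
#extension GL_ARB_texture_rectangle : enable
uniform sampler2DRect depthTex;
void main() {
    vec2 p = gl_TexCoord[0].st;
    float dx = texture2DRect(depthTex, p + vec2(1.0, 0.0)).r
             - texture2DRect(depthTex, p - vec2(1.0, 0.0)).r;
    float dy = texture2DRect(depthTex, p + vec2(0.0, 1.0)).r
             - texture2DRect(depthTex, p - vec2(0.0, 1.0)).r;
    vec3 n = normalize(vec3(-dx, -dy, 0.05)); // 0.05 = assumed depth-to-slope scale
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);  // pack -1..1 into 0..1
}
)";

class NormalMapper {
public:
    ofShader shader;
    ofFbo normals;

    void setup(int w, int h) {
        shader.setupShaderFromSource(GL_FRAGMENT_SHADER, NORMAL_FRAG);
        shader.linkProgram();
        normals.allocate(w, h, GL_RGBA);
    }

    void update(ofTexture& depthTex) {
        normals.begin();
        shader.begin();
        shader.setUniformTexture("depthTex", depthTex, 0);
        depthTex.draw(0, 0); // generates the texcoords the shader samples with
        shader.end();
        normals.end();
    }
};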

Another addition was some fake ambient occlusion. The original piece could seem a little flat in places, so this effect was added to create what look like shadows surrounding the spikes. I achieved this by darkening the colour of certain vertices surrounding the extruded vertex. The results should be visible in the image below.

[Image: Bipolar shading]
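
As a rough sketch of that darkening pass (Bipolar does its mesh processing on the GPU; this CPU version with hypothetical falloff constants just shows the idea):

#include "ofMain.h"

// Darken the ring of vertices around an extruded vertex in proportion to the
// spike height. gridW is the mesh row length; the falloff constants are
// hypothetical tuning values.
void darkenAround(ofMesh& mesh, int index, int gridW, float spikeHeight) {
    int offsets[8] = { -1, 1, -gridW, gridW,
                       -gridW - 1, -gridW + 1, gridW - 1, gridW + 1 };
    float shade = ofClamp(1.0f - spikeHeight * 0.01f, 0.4f, 1.0f);
    for (int i = 0; i < 8; i++) {
        int n = index + offsets[i];
        if (n < 0 || n >= (int)mesh.getNumColors()) continue;
        mesh.setColor(n, mesh.getColor(n) * shade);
    }
}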

At the moment all of the mesh processing is tightly interwoven into the application. I intend to release an addon in the coming weeks that will include most of this functionality along with some simple hole filling.

Vevo Presents: The Maccabees (in the dark)

This live session with The Maccabees is presented by Vevo for the Magners ‘Made in the Dark’ campaign. The brainchild of directors Jamie Roberts and Will Hanke, the performance combines live-action footage (shot with an Alexa on a technocrane) with an animated sequence by Jamie Child and James Ballard. In addition, the scene was also shot in 3D, which is where I came in. To achieve this we built a rig containing 10 Kinect cameras, each attached to a MacBook Pro: seven facing outward to the crowd and three facing inward to the band.

Three applications were built to achieve this, all using openFrameworks. The client application used ofxKinect to record the point clouds. The millimetre data for each pixel of the depth map was transcoded into 320×240 TIFF images and exported to the hard drive at roughly 32 fps. A server application was used to monitor and control the 10 clients using OSC. Among other tasks, this starts/stops the recording, synchronises the timecode and displays the status, fps and a live preview of the depth map.
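
The recording step boils down to the fact that the Kinect’s millimetre depth values are 16-bit, so they fit losslessly into a 16-bit greyscale TIFF. A minimal sketch of the idea, assuming the current ofxKinect API (the filename pattern and the 320×240 resize are assumptions based on the description above):

#include "ofMain.h"
#include "ofxKinect.h"

class Recorder {
public:
    ofxKinect kinect;
    int frame = 0;

    void setup() {
        kinect.init();
        kinect.open();
    }

    void update() {
        kinect.update();
        if (kinect.isFrameNew()) {
            ofShortPixels depth = kinect.getRawDepthPixels(); // mm per pixel
            depth.resize(320, 240);
            // 16-bit TIFF preserves the raw millimetre values exactly
            ofSaveImage(depth, "depth_" + ofToString(frame++, 6, '0') + ".tif");
        }
    }
};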

Once the recording had taken place, a separate ‘mesh builder’ app created 3D files from this data. Using this software, the TIFFs are imported and transformed back into their original point cloud structure. A variety of calibration methods are used to rotate, position and warp the point clouds to rebuild the scene and transform it into two meshes, one for the band and another for the crowd. A large sequence of 3D files (.obj) was exported and given to the post-production team to create the animated sequence in Maya and After Effects. This app also formats the recorded TIFF and .obj files so that there are only 25 per second and they sit in an easily manageable directory structure.
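
ofMesh can save PLY out of the box but not OBJ, so exporting .obj sequences means writing the format by hand. A minimal point-only writer looks like this (the real exporter also handled faces, calibration and timing, none of which is reproduced here):

#include "ofMain.h"

// Write vertex positions as OBJ "v x y z" lines. Point clouds only; no faces.
void saveObj(const ofMesh& mesh, const string& path) {
    ofBuffer buf;
    for (size_t i = 0; i < mesh.getNumVertices(); i++) {
        auto v = mesh.getVertex(i);
        buf.append("v " + ofToString(v.x) + " "
                        + ofToString(v.y) + " "
                        + ofToString(v.z) + "\n");
    }
    ofBufferToFile(path, buf);
}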

The whole project was one long learning experience consisting of many hurdles. Networking, controlling and monitoring so many client machines was a challenge, as was dealing with and formatting such a huge number of files.

One of the greatest challenges was calibrating and combining the band point clouds. I thought it would be possible to implement a single collection of algorithms to format each one, then just rotate and position them. This wasn’t the case: each camera recording was in fact slightly different. I had to implement a sort of cube warping system to alter each point cloud individually so they would all fit together.
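
That cube-warping code isn’t public, but the underlying idea can be expressed as trilinear interpolation: treat each camera’s capture volume as a unit cube with eight hand-tuned corner offsets, and displace every point by the blend of those offsets. A sketch, with the corner values left to calibration:

#include "ofMain.h"

// Trilinear blend of eight per-corner offsets; p is a point normalised to the
// 0..1 capture volume of one camera. Corner values come from calibration.
ofVec3f warp(const ofVec3f& p, const ofVec3f corners[8]) {
    ofVec3f c00 = corners[0] * (1 - p.x) + corners[1] * p.x;
    ofVec3f c10 = corners[2] * (1 - p.x) + corners[3] * p.x;
    ofVec3f c01 = corners[4] * (1 - p.x) + corners[5] * p.x;
    ofVec3f c11 = corners[6] * (1 - p.x) + corners[7] * p.x;
    ofVec3f c0 = c00 * (1 - p.y) + c10 * p.y;
    ofVec3f c1 = c01 * (1 - p.y) + c11 * p.y;
    return p + c0 * (1 - p.z) + c1 * p.z; // displaced point
}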

I was aware that Kinect point clouds become less accurate at distances beyond 2-3 metres so I implemented a few custom smoothing techniques. These were eventually dropped in favour of the raw chaotic Kinect aesthetic, which we agreed to embrace from the early stages of the project. The idea from the start was to make an abstract representation of the band members.

The shoot took place at 3 Mills Studios. Jamie and Will had managed to bring together a huge team of talented people and some incredible technology with a relatively small budget. In addition to the Kinect rig, the technocrane and the Alexa camera there was also a programmable LED rig built and controlled by Squib. It was a pleasure to be part of the team and see it all come together.

Here’s some behind the scenes footage.

Traces

[Image: Traces]

Traces is an interactive installation that was commissioned by The Public in West Bromwich for their Art of Motion exhibition and produced by Nexus Interactive Arts. The piece encourages the user to adopt the roles of both performer and composer. It is an immersive, abstract mirror that offers an alternative approach to using the body to communicate emotion. Kinect cameras and custom software are used to capture and process movement and form. This data is translated into abstract generative graphics that flow and fill the space, offering transient glimpses of the body. An immersive generative soundscape by David Kamp reacts to the user’s presence and motion. Traces relies on continuous movement in space in order to create shadows of activity. Once the performer stops moving, the piece disintegrates over time and returns to darkness.

The Public is an incredible five-storey gallery, community centre and creative hub in West Bromwich that is concerned with showcasing the work of interactive artists and local people. I’d been in talks with the curator Graham Peet for the last year with a view to potentially contributing. A few months ago, he commissioned me to build a new piece for the “Art of Motion” exhibition. I thought this would be an ideal opportunity to work with Nexus Interactive Arts. They were interested in the piece and agreed to produce it. The producer, Beccy McCray, introduced me to Berlin-based sound designer and composer David Kamp, who did an excellent job with the generative soundscape.

[Image: Traces exhibition entrance]

My aim was to build an installation that suited not only the theme of the show but the themes of play, discovery and creativity that already permeate the gallery spaces of The Public.

[Image: Traces]

Traces was built with openFrameworks and Sadam Fujioka‘s ofxKinectNui addon. This allowed me to use Windows Kinects (kindly donated by Microsoft) and the new Microsoft Kinect SDK 1.0.

The show runs from 30th May until 9th September. I would highly recommend that anyone with an interest in interactive art take a trip to West Bromwich to visit The Public. In addition to the exhibition there are many other excellent pieces.

Here’s an excellent write-up and interview on the Creators Project.
