In the coming years, once Augmented Reality glasses are commonplace, I hope to see apps that allow creative, user-generated content to be distributed in virtual space. These tools would give people the opportunity to virtually build sculptures, paint on walls and leave trails using gesture and voice as they wander through cities. Others plugged into the same AR channel would see these digital artefacts seamlessly attached to the places where they were created. Apps such as TagDis have explored this area, but, as with all mobile AR, there is a sizeable barrier in the act of navigating to and using an app on a handheld device. Ubiquitous AR experienced via a head-mounted display could transform whole cities into virtual art galleries.
Konstruct explores this vision by investigating generative art in an Augmented Reality environment. It is a sound reactive AR experience for the iPhone that allows the user to create a virtual sculpture by speaking, whistling or blowing into the device’s microphone. A variety of 3D shapes, colour palettes and settings can be combined to build an endless collection of structures. Compositions can be saved to the device’s image gallery.
Konstruct is a free app available for the iPhone 3GS and 4 running iOS 4+. A version for the iPad 2 is planned for the coming months.
As for credits: I came up with the concept and developed the app, while Juliet Lall designed the user interface and all branding, including the Konstruct website.
Users are encouraged to email virtual sculptures to konstruct[at]augmatic.co.uk to be featured in the Flickr gallery.
So on to the technology. Konstruct is built on String, the new iPhone tracking library. I’ve been experimenting with the beta version of String for a few months now and it’s surprisingly powerful – I managed to fit over 100,000 polygons into a scene without any noticeable dip in frame rate. The tracking is also very accurate and stable. I’m told that this has improved considerably with the latest iteration of the beta SDK, so I’ll be updating the app very soon.
There are currently two approaches to developing String apps: Unity, and straight-up OpenGL ES. With the Unity option, developers can be up and running within minutes. Being a sucker for punishment, I decided to take the OpenGL ES and Objective-C route. This was my first proper experience using OpenGL. Coming from a Flash background, where libraries such as Away3D and Papervision3D wrap up most of the hard work for you, it was a bit of a learning curve. Ultimately, though, it was a great experience to learn about and implement buffer objects, matrices, normals, etc. I look forward to learning more about OpenGL and GLSL for future projects.