In the coming years, once Augmented Reality glasses are commonplace, I hope to see apps that allow creative user-generated content to be distributed in virtual space. These tools would give people the opportunity to virtually build sculptures, paint on walls and leave trails using gesture and voice as they wander through cities. Others plugged into the same AR channel would see these digital artefacts seamlessly attached to the places where they were created. Apps such as TagDis have explored this area, but, as with all mobile AR, there is a sizable barrier in the act of navigating to and using an app on a handheld device. Ubiquitous AR experienced via a head-mounted display could transform whole cities into virtual art galleries.
Konstruct explores this vision by investigating generative art in an Augmented Reality environment. It is a sound reactive AR experience for the iPhone that allows the user to create a virtual sculpture by speaking, whistling or blowing into the device’s microphone. A variety of 3D shapes, colour palettes and settings can be combined to build an endless collection of structures. Compositions can be saved to the device’s image gallery.
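The app doesn't publish its internals, but the core idea — turning microphone input into sculpture parameters — can be sketched roughly. The snippet below is a hypothetical illustration, not Konstruct's actual code: it reduces one buffer of audio samples to an RMS loudness and maps that loudness to the scale of the next shape added to the scene. The function names and the scale range are my own assumptions.

```cpp
#include <cmath>
#include <vector>

// Reduce one buffer of microphone samples (-1..1) to an RMS loudness (0..1).
float rmsLevel(const std::vector<float>& samples) {
    if (samples.empty()) return 0.0f;
    float sum = 0.0f;
    for (float s : samples) sum += s * s;
    return std::sqrt(sum / samples.size());
}

// Map an RMS level to a shape scale within [minScale, maxScale],
// clamping so silence and clipping both stay in range.
float shapeScale(float rms, float minScale = 0.2f, float maxScale = 2.0f) {
    float clamped = std::fmin(1.0f, std::fmax(0.0f, rms));
    return minScale + clamped * (maxScale - minScale);
}
```

Each new shape would then be spawned with `shapeScale(rmsLevel(buffer))`, so louder sounds produce larger geometry — whistling and blowing give sustained, even growth, while speech produces more irregular structures.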
Konstruct is a free app available for the iPhone 3GS and 4 running iOS 4+. A version for the iPad 2 is planned for the coming months.
As for credits: I came up with the concept and developed the app. Juliet Lall designed the user interface and all branding, including the Konstruct website.
Users are encouraged to email virtual sculptures to konstruct[at]augmatic.co.uk to be featured in the Flickr gallery.
So on to the technology. I used the new iPhone tracking library String. I’ve been experimenting with the beta version of String for a few months now and it’s surprisingly powerful – I managed to fit over 100,000 polys into a scene without any noticeable dip in frame rate. The tracking is also accurate and robust. I’m told that this has improved considerably with the latest iteration of the beta SDK, so I’ll be updating the app very soon.
There are currently two approaches to developing String apps: Unity, or straight OpenGL ES. With the Unity option, developers can be up and running within minutes. Being a sucker for punishment, I decided to take the OpenGL ES and Objective-C route. This was my first proper experience using OpenGL. Coming from a Flash background, where most of the hard work is wrapped up for you by libraries such as Away3D and Papervision3D, it was a bit of a learning curve. Ultimately, though, it was a great experience to learn about and implement buffer objects, matrices, normals and so on. I look forward to learning more about OpenGL and GLSL for future projects.
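To give a flavour of the matrix work mentioned above: OpenGL ES has no scene graph, so positioning each shape means building your own column-major model matrices and multiplying them yourself. This is a minimal, generic sketch of that bookkeeping (not code from the app):

```cpp
#include <array>
#include <cmath>

// Column-major 4x4 matrix, the layout OpenGL ES expects.
using Mat4 = std::array<float, 16>;

Mat4 identity() {
    Mat4 m{};
    m[0] = m[5] = m[10] = m[15] = 1.0f;
    return m;
}

// c = a * b, both column-major: c[col][row] = sum_k a[k][row] * b[col][k]
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 c{};
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row)
            for (int k = 0; k < 4; ++k)
                c[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
    return c;
}

Mat4 translation(float x, float y, float z) {
    Mat4 m = identity();
    m[12] = x; m[13] = y; m[14] = z;  // last column holds the offset
    return m;
}

Mat4 rotationZ(float radians) {
    Mat4 m = identity();
    float c = std::cos(radians), s = std::sin(radians);
    m[0] = c;  m[1] = s;
    m[4] = -s; m[5] = c;
    return m;
}
```

A shape's model matrix would be something like `multiply(translation(x, y, z), rotationZ(angle))`, uploaded to the shader as a uniform — exactly the kind of plumbing that Away3D and Papervision3D hide from you.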
I am impressed. I am very impressed! I like to think that I have some talent as an artist, and programmer, but I couldn’t begin to equal this. Many thanks, Norm
Will this app be coming to the Android OS??
@Norm Thanks very much
@Kimberly An Android version is possible but I won’t be building one. I’m not a professional mobile developer and have many other projects. Need to keep moving forward.
As a regular reader of your blog I was very impressed to see your app get a write-up in this week’s New Scientist magazine, to which I am also a subscriber!
Well done!
Alex
Thanks Alex. Yes I’ve been in contact with the New Scientist guys for the last month or so. Could you tell me the issue number?
Yeah, it’s issue number 2807
http://www.newscientist.com/article/mg21028075.200-one-per-cent.html
Hi James,
Your apps Fracture and Konstruct seem really wonderful and I’m going to purchase them now. :-)
I must say I was greatly inspired by Augmented Reality when I came to read your blog, and I want to learn more about AR and build something of my own. I searched the Internet but have no idea where to start. Please help me with some advice.
I am pretty new to iPhone programming, but I have some nice ideas, and your suggestions about AR will help me a lot in making them work.
Thanx 🙂
-S.Philip
Thanks S.Philip
String will be released very soon. Definitely worth looking into. You don’t actually need any iPhone programming experience as you can use Unity.
When will the Konstruct App be compatible with iOS7?
Hopefully in a month or so.
Hello James,
I got a link to this site from an answer to a question on the Metaio helpdesk. (http://helpdesk.metaio.com/questions/33634/audio-and-metaio)
Did you use openFrameworks to implement the microphone recognition? Or did you use a different API? Is the microphone-recognition code available as open source?
I would like to use the software Metaio Creator or SDK to set up the AR tracking and a 3D animation scene, and then do something similar to your app Konstruct. I would like to use the microphone input to trigger animations inside the 3D scene. The animation would be completely prepared with 3D software. It would run as a sequence, waiting for the microphone input to start the next phase/step, i.e. to jump to the next keyframe.
I didn’t use openFrameworks for Konstruct – pure Objective-C and C++. If I were to build this today I would use OF + ofxQCAR. Good luck with the app.
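For what the commenter describes — stepping a pre-baked animation forward each time the mic gets loud — the trigger logic itself is framework-independent. Here is a minimal hypothetical sketch (names and thresholds are my own, not from Konstruct or Metaio): a loudness threshold plus a cooldown so one sustained burst doesn't fire several steps at once.

```cpp
// Advance an animation one keyframe each time the microphone level
// crosses a threshold, with a refractory period between triggers.
struct StepTrigger {
    float threshold;      // mic level (0..1) required to fire
    int cooldownFrames;   // frames to ignore input after a trigger
    int cooldown = 0;     // frames of cooldown remaining
    int step = 0;         // current keyframe / animation phase

    // Call once per frame with the current mic level.
    // Returns true when the animation should jump to the next keyframe.
    bool update(float level) {
        if (cooldown > 0) { --cooldown; return false; }
        if (level >= threshold) {
            ++step;
            cooldown = cooldownFrames;
            return true;
        }
        return false;
    }
};
```

Whatever SDK provides the mic level (openFrameworks, Core Audio, or Metaio's audio hooks), you would feed it into `update()` each frame and, on `true`, tell your 3D scene to play to the next keyframe.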
Good morning!!
Sir, I want to ask: is it possible to make augmented reality with object recognition?
For example, I have a case like this: “AR implementation for introducing historical objects in the museum XYZ”.
Please help, sir!!
Please reply through my email!!
Ancahamsar@gmail.com