Archive for the 'Webcam' Category

Gridsaw


This was my first dip into the fantastic arts-based C++ toolkit openFrameworks. I actually made this a few months ago but haven’t had time to put the video footage together.

So here’s what’s happening:

This installation attempts to reconstruct the camera images from a collection of 3000 square sections taken from previous frames. The result is a chaotic animated grid that continually attempts to achieve order.

Each visible grid item compares its corresponding region of the camera frame with 10 randomly selected squares from the collection. The closest match is then compared with the currently visible square, and the closer of the two is displayed.

If you stay still for long enough, the camera image will appear to completely rebuild itself.
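For the curious, here’s a rough sketch of that matching step in Processing. The actual installation is written in openFrameworks/C++, so the grid size, pool handling and distance measure below are purely illustrative:

// Illustrative Processing sketch of the Gridsaw matching idea.
import processing.video.*;

Capture cam;
final int CELL = 20;            // size of each grid square
final int POOL_SIZE = 3000;     // collection of squares taken from previous frames
final int CANDIDATES = 10;      // random candidates tested per cell per frame

PImage[] pool = new PImage[POOL_SIZE];
int poolCount = 0;
PImage[] visible;               // square currently shown in each grid cell
int cols, rows;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  cols = width / CELL;
  rows = height / CELL;
  visible = new PImage[cols * rows];
}

void draw() {
  if (!cam.available()) return;
  cam.read();

  for (int gy = 0; gy < rows; gy++) {
    for (int gx = 0; gx < cols; gx++) {
      PImage target = cam.get(gx * CELL, gy * CELL, CELL, CELL);

      // Test 10 random squares from the pool and keep the closest match.
      PImage best = null;
      float bestDist = Float.MAX_VALUE;
      int available = min(poolCount, POOL_SIZE);
      for (int i = 0; i < CANDIDATES && available > 0; i++) {
        PImage candidate = pool[(int) random(available)];
        float d = distance(candidate, target);
        if (d < bestDist) { bestDist = d; best = candidate; }
      }

      // Only swap in the candidate if it is closer than what is already showing.
      int idx = gy * cols + gx;
      if (best != null && (visible[idx] == null || bestDist < distance(visible[idx], target))) {
        visible[idx] = best;
      }
      if (visible[idx] != null) image(visible[idx], gx * CELL, gy * CELL);

      // Feed this frame's square into the pool for future frames.
      pool[poolCount % POOL_SIZE] = target;
      poolCount++;
    }
  }
}

// Sum of squared RGB differences between two equally sized squares.
float distance(PImage a, PImage b) {
  a.loadPixels();
  b.loadPixels();
  float sum = 0;
  for (int i = 0; i < a.pixels.length; i++) {
    sum += sq(red(a.pixels[i]) - red(b.pixels[i]))
         + sq(green(a.pixels[i]) - green(b.pixels[i]))
         + sq(blue(a.pixels[i]) - blue(b.pixels[i]));
  }
  return sum;
}

The distance function is just a sum of squared RGB differences; any per-pixel metric would do the job.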

Phew! Confused? If so, feel free to have a look at the source code.


AR Business card


Inspired by this guy, I just got a fresh batch of business cards from moo.com. There’s not really enough space on a business card to explain yourself in any detail so I thought I’d extend it using augmented reality. I recorded a short video bio and created a 3D grid of coloured planes. These planes are updated with the colours from the video and extruded depending on the level of brightness.
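For anyone wondering how the extrusion works, here’s a rough Processing/P3D illustration of the same idea: sample the webcam on a coarse grid, colour one plane per sample, and push it out along the z-axis according to brightness. The real piece is built in AS3 with FLARToolkit and Papervision, so the grid size and depth values below are just placeholders:

// Illustrative Processing/P3D sketch of the brightness-extrusion idea.
import processing.video.*;

Capture video;
final int GRID = 32;          // number of planes across and down
final float MAX_DEPTH = 80;   // extrusion at full brightness

void setup() {
  size(640, 480, P3D);
  video = new Capture(this, 320, 240);
  video.start();
  noStroke();
}

void draw() {
  if (video.available()) video.read();
  background(0);

  float cellW = width / (float) GRID;
  float cellH = height / (float) GRID;

  for (int y = 0; y < GRID; y++) {
    for (int x = 0; x < GRID; x++) {
      // Sample the video colour that corresponds to this grid cell.
      color c = video.get(int(map(x, 0, GRID, 0, video.width - 1)),
                          int(map(y, 0, GRID, 0, video.height - 1)));
      float depth = map(brightness(c), 0, 255, 0, MAX_DEPTH);
      pushMatrix();
      translate(x * cellW + cellW / 2, y * cellH + cellH / 2, depth / 2);
      fill(c);
      box(cellW, cellH, max(depth, 1));  // brighter pixels push further out
      popMatrix();
    }
  }
}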

***EDIT***

This is now online for all to enjoy. First, print out this (or open it on your phone). Then go here to play with it.

***EDIT 2***

Mac users with an iSight camera who are experiencing a black screen might be able to get around this by following these steps:

Right-click on the app and select Settings, then click on the camera tab. From the drop-down menu, select USB Camera.

Failing that, it apparently won’t work if you have any other apps that use the iSight open at the same time, so close Photo Booth, QuickTime, etc.

Thanks to JereDog and Bish for these workarounds.

***EDIT 3***

I’m using the following AS3 libraries in this project:

FLARToolkit

FLARManager

Papervision (the new version of the app now uses the native FP10 3D capabilities instead)

TweenMax

***EDIT 4***

I’ve temporarily had to take the application down as I’ve had a huge amount of traffic which will undoubtedly cost a horrendous amount of money. Check back next week when things have calmed down a bit.

***EDIT 5***

OK the application is back up now. Big thanks to my agency Skive who are hosting it until the madness subsides.

***EDIT 6***

I’ve made quite a few changes to the application recently. The main change is that it no longer uses Papervision but instead runs on the new Flash Player 10 3D capabilities, which has sped things up slightly. The app is now built using the PureMVC framework, and there are many other optimisations and tweaks to make it behave faster and better. If you click the link above, it will now open the new version.

AR Particle Beam


So I thought it was about time I jumped on the augmented reality bandwagon. Rather than the obligatory 3D model, I decided to make something a little different. I’ve used 3D lighting techniques and physics here to create this beam of light surrounded by strange celestial light particles.

If you would like to interact with it, first download and print out the marker here then go here for a live demo.

Big thanks to a few people. First and foremost, Saqoosha, the clever chap who created the FLARToolkit library, a port of the C++ library ARToolkit. Mikko Haapoja for providing a fantastic introduction to using FLARToolkit. And Eric Socolofsky for building the FLARManager framework, which makes working with FLARToolkit a cinch.

Incidentally, while making this, John Lindquist started an augmented reality competition on the Papervision forum. I’ve decided to enter it (hence the Papervision logo on my marker – rules of the comp, not sucking up), so please feel free to vote for me on the forum. 🙂

For those of you who are interested, you can download the source code here. It’s an FDT project but it shouldn’t be too difficult to convert it over to your favourite development environment.

**** UPDATE ****
Kristin Rohleder has written a Flex version of this application and kindly given it to me to share with whoever wants it. It uses the latest version of FLARManager and TimelineMax for the animation, so the performance is probably far better.

You can download the project here.

Virtual Ribbons


This artwork allows the user to experience an augmented reality in which the flow of ribbons and particles can be controlled. It works by capturing a colour from an object (in this case, a green glove and the light from my mobile phone) and then using this object to drag the ribbons around the screen.

I won’t be releasing the full source code for this but I have released the source of the two key elements used in this sketch. These are the colour detection and the 2D Ribbons. In my opinion this is the best way to share code for larger projects. Smaller context specific code snippets are far easier to look through and experiment with than entire projects. And if people want to build the final artwork, stitching together the various pieces and adding their own code is a great learning experience.

Colour detection in Processing

This is part of a Processing project I’m currently working on. This sketch allows the user to select colours from webcam input. The input is then analysed and a rectangle is drawn around all pixels close to the chosen colour. Up to 2 colours can be used for this sketch. This is similar to the getColorBoundsRect() method in the BitmapData class in Flash.

Click here to see the video on Vimeo

You can download the source code below:

colour_detection.pde, CoordsCalc.pde

Be aware, though, that to use it you should be in a well-lit room with a good-quality webcam. You might also need to fine-tune your webcam settings. To select a colour, place your object in the red box in the bottom right-hand corner of the frame and press 1 or 2.

Keys:

p to view the rectangles
1 to select the first colour
2 to select the second colour
0 to reset colours
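If you just want the gist without downloading the sketch, this condensed Processing version shows the core step: measure the RGB distance of every pixel from a chosen colour and draw the bounding box of the matches. The threshold and key handling here are simplified stand-ins for what’s in colour_detection.pde (it tracks a single colour and re-samples from the centre of the frame):

// Condensed illustration of the colour-detection idea.
import processing.video.*;

Capture cam;
color target = color(0, 255, 0);   // colour to track (press any key to re-sample)
float threshold = 60;              // maximum RGB distance counted as a match

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  cam.loadPixels();

  // Bounding box of every pixel close enough to the target colour,
  // similar in spirit to BitmapData.getColorBoundsRect() in Flash.
  int minX = width, minY = height, maxX = -1, maxY = -1;
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      color c = cam.pixels[y * width + x];
      float d = dist(red(c), green(c), blue(c),
                     red(target), green(target), blue(target));
      if (d < threshold) {
        minX = min(minX, x); maxX = max(maxX, x);
        minY = min(minY, y); maxY = max(maxY, y);
      }
    }
  }
  if (maxX >= 0) {
    noFill();
    stroke(255, 0, 0);
    rect(minX, minY, maxX - minX, maxY - minY);
  }
}

void keyPressed() {
  // Re-sample the target colour from the centre of the frame.
  target = cam.get(width / 2, height / 2);
}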

Motion Trails v01

This is the first experiment in a webcam study I am doing using Processing. Movement results in circles of colour taken from the webcam footage. More movement results in fewer, larger circles and vice versa.

Click here to see the video on Vimeo

Click here to download the source code

In order to use this code you need to install WinVDIG 1.01, which can be obtained from Dan Shiffman’s site. I would recommend a good-quality webcam in a well-lit room (I’m using the Logitech Quickcam Pro 5000). A powerful machine is also required for the best results.

The FrameDifferencing code by Golan Levin was used as a basis for this sketch. This can be found in the Processing examples.
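As a rough illustration of the approach, this stripped-down Processing sketch uses frame differencing to measure overall movement and maps it to the number and size of circles. The thresholds and mappings are illustrative, not the values from the actual sketch:

// Stripped-down illustration of the Motion Trails idea.
import processing.video.*;

Capture cam;
int[] previous;                 // pixels from the last frame

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  previous = new int[width * height];
  background(0);                 // circles accumulate over time as trails
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  cam.loadPixels();

  // Total frame difference: a rough measure of how much movement there is.
  float totalMotion = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    color c = cam.pixels[i];
    color p = previous[i];
    totalMotion += abs(brightness(c) - brightness(p));
    previous[i] = c;
  }

  // More movement -> fewer, larger circles (and vice versa).
  float motion = constrain(totalMotion / (width * height), 0, 50);
  int count = int(map(motion, 0, 50, 200, 10));
  float size = map(motion, 0, 50, 4, 60);

  noStroke();
  for (int i = 0; i < count; i++) {
    int x = int(random(width));
    int y = int(random(height));
    fill(cam.pixels[y * width + x], 60);   // colour taken from the webcam footage
    ellipse(x, y, size, size);
  }
}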

Playing with particles using a webcam

Demo of installation

This installation is an AIR application that was built using Flash. It is intended to be placed in the reception area of my workplace, Skive. It was actually completed a few months ago but I just recently managed to put some video documentation together.

The piece uses a webcam to track any form of movement and uses it to displace particles. The letters of the Skive logo are broken up and forced to the edges of the participant’s silhouette. This creates a chaotic outline as the pieces attempt to rush back into their original places to reform the letters. The springiness of the particles brings a fluid motion which encourages play.
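The particle behaviour boils down to a spring pulling each piece back to its home position while movement pushes it away. The real piece is an AIR/Flash app, so this minimal Processing sketch is only an illustration of that spring step (the mouse stands in for webcam motion, and all values, including the springiness, are placeholders):

// Minimal illustration of the spring-back particle behaviour.
Particle[] particles = new Particle[200];

void setup() {
  size(640, 480);
  // Home positions along a line, standing in for the points of the logo letters.
  for (int i = 0; i < particles.length; i++) {
    float x = map(i, 0, particles.length - 1, 50, width - 50);
    particles[i] = new Particle(x, height / 2);
  }
}

void draw() {
  background(0);
  for (Particle p : particles) {
    // The mouse stands in for webcam motion: push nearby particles away from it.
    PVector push = PVector.sub(p.pos, new PVector(mouseX, mouseY));
    float d = push.mag();
    if (d > 0 && d < 80) {
      push.normalize();
      push.mult(2);
    } else {
      push.set(0, 0);
    }
    p.update(push);
    p.display();
  }
}

class Particle {
  PVector home, pos, vel;
  float springiness = 0.05;  // pull back towards the home position
  float damping = 0.9;       // stops the particle oscillating forever

  Particle(float x, float y) {
    home = new PVector(x, y);
    pos = new PVector(x, y);
    vel = new PVector();
  }

  void update(PVector push) {
    PVector toHome = PVector.sub(home, pos);
    toHome.mult(springiness);  // spring force back towards the letter shape
    vel.add(toHome);
    vel.add(push);             // displacement caused by movement
    vel.mult(damping);
    pos.add(vel);
  }

  void display() {
    noStroke();
    fill(255);
    ellipse(pos.x, pos.y, 4, 4);
  }
}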

Click here to see the video on Vimeo

I’ve worked with webcams several times before and found that differences in light and surroundings can cause the application to behave in different ways. This can be very frustrating as it means recompiling using different values. This time, I decided to do something about it and created a control panel which would allow me to alter and save values such as springiness, threshold and pixelation on the fly. This saved a lot of time, effort and stress.


Wiimote Powered Self Portrait Generator

This is my first experiment using the open source framework WiiFlash. It is an AIR application built using Flash in which the user can generate a Cubist self portrait by pointing a Wiimote at an infrared light. Webcam footage is masked and layered using a variety of shapes and lines, the properties of which can be altered using the Wiimote controls.

Click here to see the video on Vimeo
Click here to see the video on YouTube

The controls are as follows:

B adds shapes to the stage
A gives the pseudo random positioning effect
+ increases the scale of the shapes
– decreases the scale of the shapes
up/right increases the alpha of the lines
down/left decreases the alpha of the lines

The AIR app and source code will be available to download soon. You will, however, need a Wiimote, a Bluetooth dongle, the WiiFlash server installed, a webcam and an infrared light to use it.





