RSS Feed
June 17, 2008

If you are in New York in July, please come and check out a new interactive ecosystem by Emily Gobeille
and myself at the Riviera Gallery. Curated by the awesome Servicio Ejecutivo.



Exhibition info and artist bios are here (pdf)



June 3, 2008

Audio Space is currently being shown at the excellent <TAG> gallery in the Hague.
The show, which focuses on Augmented Reality, has a lot of really interesting work.
I highly recommend checking it out.





Exhibition magazine as pdf


June 3, 2008

In collaboration with Emily Gobeille, I designed a poster series for young children that combines elements of mathematics,
algorithm, story and design. Children spend a lot of time looking at the posters and paintings on their walls, so we thought it
would be nice to make something to feed their brains.




I built a series of tools to programmatically create graphics which Emily would use as building blocks for the worlds she designed.
Some of the origins are quite clear to see, like the vector field fish (above), but others, like the computer vision from the
elevation map of the moon, are much more subtle.

The project was commissioned for the excellent Servicio Ejecutivo gallery - see the project page here.
All the software was built with the latest version of openFrameworks.
See more images here: http://zanyparade.com/v8/projects.php?id=17.

A limited edition of posters will be printed and for sale soon.


June 3, 2008

IDN recently featured some of my work in the Pick of the month section of their excellent Creator's id issue.
They also included the Funky Forest installation and Science of Sleep show on their DVD.
Check it out, in stores now!




June 3, 2008

openFrameworks has come a long way in the last year and is almost ready for its first public release.
We put together a video showing some of the amazing projects that have been created with openFrameworks
while in private beta. Expect a lot more in the future.

Full credits for projects on the openFrameworks website.





June 3, 2008

Following MoMA, where else but the TATE Modern?
Good old English weather prevailed, but not before we managed to throw up some tributes to London's ultimate phallic symbol.
GRL strikes again and again.


More here: http://fffff.at/fuckflickr/TATE


March 13, 2008

Crazy, I know, but Laser Tag is part of the quite awesome Design and the Elastic Mind exhibition currently showing at MoMA.



Check it here:
http://www.moma.org/exhibitions/2008/elasticmind/#/126/

More pictures here:
http://fffff.at/fuckflickr/MoMA/



February 7, 2008



Coincidentally (???) I was asked for interviews by two Japanese online magazines within a couple of weeks of each other.
They are both spectacular publications and I really enjoyed trying to answer the questions they posed.

Check them out here:
Flash Film Interview and HITSPAPER!


December 5, 2007



UPDATED Feb 2008 - Laser Tag 2.002 Release








Alright, finally - it is here: Laser Tag 2.0, Mac and PC version.
Thanks for being so patient - a lot of time and work has gone into making Laser Tag 2.0 super nice, super dope and super easy to use. NOTE: To see a video of the first version of this system in use, check it out here

Credits:
Laser Tag 2.0 is a Graffiti Research Lab project and is written by Theodore Watson and Zachary Lieberman using openFrameworks. It may be used free of charge as long as it is not used for marketing, advertising or promotion and especially not for lame guerrilla marketing events!

Some new features:
- 4 main brush modes - each with their own qualities and different brush types.
- You can design your own brushes by making png files.
- If you want to code your own brushes there is a super simple system for that too.
- Built in music player for playing your party jams while Laser Tagging.
- Network connection for sending the laser tag data to Flash, Processing, openFrameworks, Max/MSP etc.
- Color! - editable color xml file - add up to fifty brush colors - works with all brushes.

The equipment:
Here is the essential equipment you will need for your Laser Tag system.
1x fast laptop (PC or Mac) that can connect to an external monitor. It helps if the laptop has a dedicated graphics card, so a MacBook Pro would be preferable to a MacBook for this reason (though MacBooks seem to run it just fine!).
1x video camera that you can connect to your laptop. Video cameras that have manual controls tend to be a lot better at tracking the laser than ones that automatically adjust the image depending on how bright it is.
1x projector. Anything over 2000 lumens should be good.
1x laser pointer between 5mW and 80mW in power.

For this setup we are using an Asus F9S laptop with a Logitech Messenger camera, a BenQ 2500 lumen projector and a 70mW green laser pointer. This is fine for a small setup, but for larger scales you will need a brighter projector and a better camera.



Setting up the projector.

PC - The Laser Tag software is set up to work with the projector acting as an extended desktop to your laptop's main display. If you have a PC laptop and an NVIDIA graphics card, you need to set your display to use horizontal span mode (or extended desktop) and set the total display dimensions to 2048 by 768. If you don't have horizontal span, look into Realtime Soft's UltraMon, which works with all graphics cards.

Mac users will need to set up the projector to be to the right of their Desktop, then launch the Laser Tag app, type Command-',' and check the box that says 'use extended desktop'. Once this is done, quit the application and the settings will be saved to your preferences. Set both your laptop's display and the projector to 1024 by 768 and you should be ready to go.






High Res Guide:


Setting up the camera


To effectively track the laser we need to disable the camera's Auto White Balance, Auto Exposure and Backlight Compensation. In the Laser Tag software hit the 'C' key to bring up the camera settings box. Go through the settings box and set as much as possible to manual.

Projector alignment:


Now we are ready to align the camera and the projector. Run the Laser Tag software and hit the 'F' key to enter fullscreen. Now hit the spacebar to show the edges of the projection area. With your mouse drag the corners of the white projection quad so it fits onto the surface you are projecting onto. Hit the 'S' key to save your settings.

Camera alignment:


In the top left corner you can see the video panel of the camera. Make sure that the camera is covering the whole projection area and is in focus. If you can see the four corners of the projection quad with the camera then just drag each corner of the yellow camera quad so they line up. If the camera image is too dark to see the corners of the projection quad then have a friend beam the laser at each corner and drag the corners of the camera quad that way. Hit the 'S' key to save your settings.

Tracking the laser:


In 'Tracking settings' start off with saturation at 0. See if you can distinguish the laser just by adjusting the brightness (value) threshold alone. Note: Reducing the brightness in the camera settings (see step 3) to the point where only the laser is visible can make this step much easier. If you are still seeing a lot of white noise in the tracking panel, try using the 'Sat Threshold' to track by colour as well. Hit the 'S' key to save your settings.

At this point adjusting your camera settings to make the image as dark as possible but the laser still bright will make tracking a lot easier. Notice how in the above image the laser is isolated from the projected image even though the camera can see the projected 'paint'.

Testing the setup


If everything goes well you should now be able to tag with the Laser Tag software. When you are tagging, the projected 'paint' should always appear to come from the location of the laser point; if it doesn't, try re-aligning the camera to the projection area. If the tracking seems 'jerky', try adjusting the tracking settings till the motion seems nice and smooth.

Clearing the image:
You can always hit the 'D' key to clear the projection manually but it is more convenient to set up an area as a button that clears the projection when hit with the laser. You can enable the clear zone in 'Clear zone settings'. The clear zone shows up as a red box in the video panel. Adjust the x, y, width and height properties to position it somewhere outside of the projection area.
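The test behind a clear zone like this is just a point-in-rectangle check. A minimal sketch in C++ (illustrative only, not the actual Laser Tag source):

```cpp
// Illustrative sketch of the clear-zone test described above
// (not the shipped Laser Tag code). The zone is a rectangle in
// camera coordinates, and the canvas gets cleared whenever the
// tracked laser point lands inside it.
struct ClearZone {
    float x, y, w, h; // position and size of the red box
};

bool hitsClearZone(float px, float py, const ClearZone& zone) {
    return px >= zone.x && px <= zone.x + zone.w &&
           py >= zone.y && py <= zone.y + zone.h;
}
```

Positioning the zone outside the projection area (as suggested above) keeps normal tagging from accidentally wiping the image.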

Drawing settings
'Brush mode' allows you to change the type of brush you use. There are four basic brushes: the pngBrush, the graffLetters brush, the vector brush and the gesture brush. The pngBrush uses user-created png files to draw with. Use 'Which brush image' to switch the image used and 'Brush color' to change the drawing colour. You can add your own png brushes to the app by saving them to the data/brushes/ folder, and you can edit the /data/settings/colors.xml file to add colors. The vector brush also has a bunch of different styles - check the fat style below.
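As a rough illustration, an entry in that colors file might look something like this (the tag names below are guesses for illustration - check the colors.xml that ships with the app for the real schema):

```xml
<colors>
    <color>
        <r>255</r>
        <g>0</g>
        <b>0</b>
    </color>
    <color>
        <r>0</r>
        <g>255</g>
        <b>255</b>
    </color>
</colors>
```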



Drips mode
For ultimate realism, enable drips in the 'Drips settings' to make drips come down from your tag as you are writing. You can also adjust how drippy the tag is, and the speed of the drips, with the other drip settings. Have fun!

Notes and tips:
For details about all the settings, check the guides, which explain them in more detail:
Brush Settings Guide (jpg)
Tracking Settings Guide (jpg)
Clear Zone Settings Guide (jpg)
Network Settings Guide (jpg)
Camera Settings Guide (jpg)

For framerate improvement: try tracking without advanced quad.
Stay tuned for the full video tutorial live from Lausanne!
 

Laser Tag 2.0 Software Download - Updated for Leopard and Vista
Laser Tag 2.002 Mac App, OS X 10.4 + 10.5 - Optimized!! (Feb 2008)
Laser Tag 2.002 PC App - XP and Vista - Update for Camera Settings Fix (Feb 2008)

Laser Tag 2.0 Source Code
XCode Project (needs Xcode 2.4 or greater) (Updated Feb 2008)
Visual Studio 2005 Project (Updated Feb 2008)






September 13, 2007

Besides the evil Laser Tag 2.0 system we were running at Ars Electronica, there was also some work on show from the students of my New Forms of Storytelling class. The assignment was to make a complete alphabet out of animated gifs and then send me the message they wanted to display to the people of Linz.


Here is the result:


Source code coming asap.


July 11, 2007

UPDATE: !!!!! LASER TAG 2.0 IS RELEASED !!!!!

Here is a little test for the next version of Laser Tag, where I have the tracking run
on a PC, sending the data over to a Mac to display the lines.





July 8, 2007




So one of the things I was invited to do at the Barcelona OFFF festival this May was to work with Zachary Lieberman on a performance called 'Liners'. The performance was about the story of a never-ending line, told through animation and video. We sent out a call for short videos of a line being drawn from one side of the video to the other. We then wrote software to track the lines and connect the videos together to create the live performance.

You can see video from the OFFF performance here:





See more videos and download the source code on the liners page:
http://openframeworks.cc/liners/

Liners was made with openFrameworks


June 5, 2007



From the guys who brought you RES FEST, The Creators Series is a new kind of arts event. The events are designed to examine the future of creative culture and its impact on the rest of the world. They're meant to be fun and interesting, consisting of a multimedia gallery with work from all the creators that's open to the public all weekend long, as well as ticketed presentations, discussions and music performances, plus the odd party thrown in here and there.

Dates: New York: June 8–10 - Los Angeles: June 14–17

More info here: http://www.tomorrowunlimited.com/events/thecreatorsseries/2007/

I will be showing Vinyl Workout and a new version of Daisies. Please come down and check it out; it should be a lot of fun, and the exhibition is completely free! Check the video below for early Daisies 2.0 hotness!





May 31, 2007



The next version of Audio Space is up and running at Montevideo (Amsterdam) as part of the (in)visible sounds exhibition.
My project, Audio Space, allows you to explore and interact with a three-dimensional sound space, making sounds with your voice
that float at the spot where they were made, even after you have left the room.

See a video from the setup below:






May 8, 2007



Zach Lieberman and I are working on a brand new performance for the opening night of OFFF.
The performance is called 'Liners' and tells the story of a line that starts but never ends. Using
submitted videos and programmed graphics, we will piece together, on the fly, a continuous
animation of a single line.

We would like your help! We need short 3-8 second videos of a line being drawn, in any way possible,
in any style, that will become part of the performance. We will credit all videos submitted at the end
of the performance. Think of it as an easy way to show how creative you can be with a few seconds of video.

SUBMIT HERE
Deadline is Wednesday 9th May - so be quick!!!



April 4, 2007



videoInput version 0.19 is out! videoInput is an open source video capture library for Windows
that lets you configure and capture multiple cameras and capture devices with ease!

From the release notes: "New stuff:
-Now with non-blocking callback functionality
-Ability to choose BGR or RGB for all you openCV heads
-Stop and restart devices
-New video formats supported
-Finds nearest matching size
-Much much cleaner - trust me you want to use this one!"


Anyways it is hot - so get it here:
http://muonics.net/school/spring05/videoInput/




April 2, 2007





Set up 'The Science of Sleep' exhibition for the third time now, this time in the rather
swanky location of Colette in Paris. We had a day to do the setup so it was quite frantic,
but it was only half the exhibit, so it turned out quite okay in the end.




April 2, 2007

Some drawings from some of the first software I made.
You create small particles that relate to each other using the gravitational inverse-square law.
The result is that they are attracted to each other in much the same way as planets,
and the lines you see show the paths they take due to the force.
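The idea can be sketched in a few lines of C++ (a toy version, not the original software): each particle accelerates toward every other with a force falling off as 1/r², and tracing the positions over time draws the paths.

```cpp
#include <cmath>
#include <vector>

// Toy sketch of the inverse-square-law particles described above.
struct Particle { double x, y, vx, vy; };

// Advance every particle by one time step dt, with gravity constant g.
void step(std::vector<Particle>& ps, double g, double dt) {
    for (size_t i = 0; i < ps.size(); ++i) {
        double ax = 0, ay = 0;
        for (size_t j = 0; j < ps.size(); ++j) {
            if (i == j) continue;
            double dx = ps[j].x - ps[i].x;
            double dy = ps[j].y - ps[i].y;
            double r2 = dx * dx + dy * dy + 1e-9; // avoid divide-by-zero
            double r  = std::sqrt(r2);
            // inverse-square attraction, directed toward particle j
            ax += g * dx / (r2 * r);
            ay += g * dy / (r2 * r);
        }
        ps[i].vx += ax * dt;
        ps[i].vy += ay * dt;
    }
    for (auto& p : ps) { p.x += p.vx * dt; p.y += p.vy * dt; }
}
```

With a handful of particles and a small time step, plotting each particle's (x, y) every frame produces the orbit-like paths in the drawings.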






April 2, 2007



An old project made with some software I wrote that develops
multi-channel photographs out of slowed-down incoming video.

Reminds me of old '60s posters.

More here


February 20, 2007



UPDATE: !!!!! LASER TAG 2.002 IS RELEASED - Feb 2008 !!!!!

The main event of the GRL Rotterdam tour - 'L.A.S.E.R TAG' - 60mW geek graffiti madness.
Watch the video here on the Graffiti Research Lab website.

In the spirit of GRL's and Eyebeam's open source beliefs, we are posting the code and executable
for the Laser Tag application online for you to download, dissect, reuse and hopefully improve. The code is
C++ and compiled in a super old school CodeWarrior IDE for Windows, but it is openGL based and written
using openFrameworks, which is a cross-platform library for writing creative code. So it should be very
straightforward to run in Visual Studio, Dev-C++ or even Xcode on a Mac.

In its simplest form the Laser Tag system is a camera and laptop setup, tracking a green laser point across
the face of a building and generating graphics based on the laser's position which then get projected back
onto the building with a high power projector.

There are a bunch of things you need to do to get Laser Tag up and running yourself, so here follow the required
equipment and setup instructions. These assume that you are using Windows, but they will largely apply to other OSs too.


EQUIPMENT

We used:

1 PC Laptop - ASUS A8JS - Core 2 Duo 1.83 GHz, 1GB RAM, NVIDIA GeForce Go 7300 256MB - VGA and DVI out.
1 Panasonic PTD5600U 5000 ANSI lumen 1024x768 DLP projector.
1 Watec 221S security/astronomy camera with manual iris zoom lens.
1 Bogen magic arm and super clamp.
1 Pinnacle PCTV USB capture card.
1 60mW Green Laser (super illegal in a lot of places and very dangerous)
and loads and loads of AAA batteries.





SETUP

Location:

Choose a building or wall that the laser shows up well on. If using a building, make sure the lights aren't on,
as this will make the tracking of the laser beam harder (also, it is not a good idea to point powerful lasers
at a building with people in it).

Pick a spot far enough back that the projector can overshoot the building by a couple of feet on all sides.




Alignment and calibration:

If the building is vertically oriented and the projector can handle it, put the projector on its side so you are
using the full image of the projection.

Fix the projector in place and then fix the camera underneath the lens of the projector so that it is looking at the
projection area. Allow the camera to see a little more than the projection area so that you can designate an area
as a clear button.

Make sure all the settings on the camera are set to manual - no auto exposure or gain control etc. Adjust the color
balance of the camera so that it looks as natural as possible - make sure any lights in the shot don't look green, as this
will confuse the tracking system.

Software setup:

Set your screen resolution to 1024 by 768.

Connect the projector to the PC and in your display control panel set the display mode to horizontal span.
This should span the Desktop across the laptop screen and the projector so that you have one large screen
of 2048 by 768 resolution. If you have access to openGL and 3D preferences in the control panel, set all settings
to performance.

Connect the camera to the usb capture card and the capture card to the PC. You can use Amcap
to make sure you can capture video okay.

Start the app and you should see something like the image below.
Follow the instructions in the diagram to align the camera to the projection surface.



Once the camera is aligned to the projector hide the white alignment lines on the projected image by pressing L.
Turn down the iris of the camera to the point where the surface is as dark as possible but the laser still shows up strong.

Settings:

The settings for the app are saved in an xml file so once you have made your adjustments hit the S key to save your changes.
The up and down arrows on your keyboard will run through the list of settings and the left and right arrows can adjust the values.

The first settings you will want to adjust are the four at the bottom, which define the color tracking of the laser.

Hue Point - should be the value of green that the camera sees the laser as.
Hue Thresh - is how wide a range from the hue point it should consider.
Sat Thresh - is the minimum amount of saturation.
Value Thresh - is the minimum brightness.

Anything that is outside the range of these settings will show up as black in the Thresh Video; anything that is within
(hopefully only the laser point) should show up white.
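In code, that per-pixel test looks roughly like this (an illustrative sketch, not the shipped source - it assumes the frame has already been converted to HSV):

```cpp
// Hypothetical sketch of the hue/sat/value test described above.
// A pixel passes (shows up white in the Thresh Video) only if its
// hue sits within hueThresh of huePoint on the hue circle and its
// saturation and value clear their minimum thresholds.
struct HSV { float h, s, v; }; // h in 0-360, s and v in 0-1

bool isLaser(const HSV& px,
             float huePoint, float hueThresh,
             float satThresh, float valThresh) {
    // wrap-around distance on the hue circle
    float d = px.h - huePoint;
    if (d < 0) d = -d;
    if (d > 180.0f) d = 360.0f - d;
    return d <= hueThresh && px.s >= satThresh && px.v >= valThresh;
}
```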

Here is an explanation of the rest of the settings:

Slide - slides the screen over so you can see what is being projected
Vertical mode - if the projector is being used on its side.
Use blobs - should stay on, blob tracking ended up being more reliable than motion difference.
Motion diff - don't touch
Use camera - toggle between using the live camera and an included test video.
Drips mode - when on, small drips of paint will descend from the letter forms.
Fade out - slowly fades the image over time - an alternative to the clear button.
Slant brush - toggles chisel brush and fatcap mode
Clear zone - Use an area of the image as a clear zone, if the laser is detected there it clears the image.

Brush width - Size of brush
Drips freq - How many drips you want - lower is more
Activity thresh - This determines the time before starting a new line, the background screen should flash red whenever a new line begins.
Min blob size - the smaller this value the more sensitive to noise it will be but also the smoother the line motion.
Jump dist - the max distance to draw a line between two captured points, any distance greater means it starts a new line.
Line resolution - how many points to interpolate a line with; a higher number means a smoother line but uses more CPU.
Clear X - adjust x position of clear area.
Clear Y - same for y
Clear thresh - lower means a more sensitive clear area, higher less sensitive - adjust so that only the laser point clears the image.
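As an example of how 'Jump dist' behaves, here is a sketch of the decision (hypothetical code, same idea as the app): when a newly tracked point lands further than the jump distance from the previous one, a new line is started instead of connecting them.

```cpp
#include <cmath>

// Hypothetical sketch of the 'Jump dist' rule above (not the real code).
struct Point { float x, y; };

// Returns true when the new tracked point should start a new line
// rather than extend the current one.
bool startsNewLine(const Point& last, const Point& next, float jumpDist) {
    float dx = next.x - last.x;
    float dy = next.y - last.y;
    return std::sqrt(dx * dx + dy * dy) > jumpDist;
}
```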


NOTES

Once everything is set up, be very careful not to bump either the projector or the camera, as even a move of 1mm will be
enough to mess up the alignment.

The settings are saved in the settings.xml file - if you want to back up your settings for a location, just zip the file and rename it.

If you are pushed for framerate - try overclocking your graphics with hacked drivers from laptopvideo2go,
you will notice quite a difference!

Press 'H' to hide all GUI except the projected image; this can also give you a bump in the framerate.


DOWNLOAD
UPDATE: !!!!! LASER TAG 2.002 IS RELEASED - Feb 2008 - Download Here !!!!!




February 15, 2007



One of the events for the GRL Rotterdam tour was an uncurated animated gif
show on the 80m tall by 40m wide KPN facade in Rotterdam. It is pretty
damn huge and can be seen from far around the city.

You can see the final video from the show on the Graffiti Research Lab website.

I had to make some software in order to take the animated gifs/quicktimes
and convert them into the 1024 by 41 pixel monochrome bitmaps that the
KPN software accepts.

You can download the software below, along with the code used to make it
(which is pretty messy at the moment :)




Some notes:

1. Black pixels mean the lights are on; white pixels mean they are off.
2. The grid size of the facade is 22 wide by 41 high.
3. You can convert either a quicktime file named input.mov or an animated gif named input.gif.
4. If the gif/mov is larger than 22 by 41 pixels, you can use the yellow box to select a subregion of the image.
5. The + and - keys adjust the threshold for what is considered black and what is white.
6. The facade has a framerate of 4 frames per second; hit the 'F' key in the app to simulate that speed.
7. 'S' starts and stops recording the frames to disk; the screen will flash red once the end of the sequence is reached.
8. The frames that are outputted are stored in a folder called frames.

The software is open source and there are both Mac and PC versions. It was
written in C++ using openFrameworks, which is a Processing-like API for writing creative code.

DOWNLOAD:
Mac standalone app
Windows standalone app
Windows source code and CodeWarrior project
Mac source code and xcode project


February 7, 2007

The Graffiti Research Lab's brand new mobile installation - 'Laser Tag' is launching tonight
at the opening of the Art Rotterdam festival.

Come down to the parking lot behind the KPN building between
6 and 10 to try your hand at 150 foot high digital graffiti.

Map link

We are the hokey-looking RV out in the middle of the parking lot with a bunch of lasers coming from it.

Laser Tag will be up every night till the 11th of Feb.






February 4, 2007

Currently working on interactive laser graffiti with Evan and
James for the Art Rotterdam fair.

Here is a little teaser picture from our first night of testing.






January 13, 2007


Just got back from setting up the Michel Gondry show in Milan. This one I think is a lot nicer as there is more space and I had some time to make little adjustments to the code as well.






January 13, 2007


I was asked to show Daisies at the Blend Store in Amsterdam for their Museum Night event in November. Here is a short video from the event.





January 13, 2007


These are some videos from animation software I worked on at Eyebeam that was meant to allow children to draw and animate in a fluid natural way.







September 9, 2006




So for the last month I have been working on four installations for Michel Gondry's Science of Sleep exhibition at Deitch Projects Soho. The opening was last Wednesday and it was insane. There were people queued up 3 ways around the block!

The installations are:

An interactive piano that plays back video clips of people playing the note you pressed on a different piano - kind of hard to explain - but cool.

A cardboard working camera that does realtime chromakeying.

A piece where you pull the eyes of the actor open by pulling on ropes.

An HD video on two massive LCD screens designed to look like windows - with the lights in the space reacting to what happens in the video.

The show is up till the end of September and is free!
More info here:

http://www.deitch.com/projects/sub.php?projId=195&orient=v


The bluescreen camera and rope controlled eyes piece, minutes before the show opening.



August 18, 2006

Just released version 5 of muonics.net - the idea was to get rid of a lot of the structure of the old site and instead let the work do the talking. Hence large images, quicktimes and little else.

You can still reach the old site in the archive link above.




July 5, 2006




A new version of my interactive sonic environment 'Audio Space' is on show at Eyebeam as part of their summer exhibition. The show is up till July 15th - so come check it out - it is my favorite version so far.

The piece now generates tones based on your voice so that the experience is much more musical. Moving around the space, you choose both spatially and temporally how you hear the piece.


Exhibition info:
http://www.eyebeam.org/engage/engage.php?page=exhibitions&id=99

Audio Space project page:
http://muonics.net/site_docs/work.php?id=15


June 25, 2006


Emily and I put together a prototype for a future kids' mobile-phone-based puppet show called How To. It features two puppets, Icky (a snake) and Orzo (a mad scientist). This is an outtake music video made by www.zanyparade.com about the puppets' secret love. You can see the real submission here.




June 19, 2006


Zach ( http://thesystemis.com ), Evan ( http://ni9e.com ) and I taught a workshop on openFrameworks at Eyebeam the other week.

OpenFrameworks is a free, cross-platform, openGL-based API of sorts for doing cool creative stuff in C++. The workshop was like two semesters of classes crammed into 3 days.

Day 1 was a general intro to c++ with examples.
Day 2 was animation techniques and examples.
Day 3 was computer vision with and without openCV.

The code for all the classes is up here:
http://www.ninjagarden.com/of/ofEyebeamWorkshop.zip

These were all Mac projects, as that was what people were using - though it would work fine on a PC too. Oh, you need Xcode 2.2 or above!!!



June 17, 2006




This is some free software I wrote to make video capture into a C++ application nice and easy. It can handle webcams, capture cards and the super high-res PixeLINK cameras.

Currently precompiled for CodeWarrior on a PC, but it can and has been compiled for Dev-C++ and Visual Studio too!

Get latest version here
http://www.muonics.net/school/spring05/videoInput



June 17, 2006


So after a little bit of frustration with having nowhere to post useful tools and ramblings, I decided to make this news page. I will post loads of useful code and friends' work too.

Theo