25.1.09

Success!

Last week I began testing the AR plug-in for Sketchup, and luckily the problem was in the way I was saving the JPG. Sketchup does not display JPGs with transparency, so after a few modifications I finally had a transparent image in the AR viewfinder. I staggered the split landscape and can now project the cliff into Augmented Reality through the webcam and onto my computer. So far the plug-in has a few limitations: as the pieces line up they need to be progressively enlarged to read as a faithful image, and its three-dimensionality is limited. As a feedback system this could be paired with something more sophisticated to have an environment altered by incoming data. Say, placed in a room, the projected environment could morph positively or negatively depending on sound level, the number of people, or their movement through the space: the more people in the room, the more the landscape would flourish; if it were a dead zone, the landscape would decay.
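
As a first pass at that idea, here is a minimal sketch in C (everything here is invented for illustration; it is not the Sketchup plug-in or any real sensing API) of a feedback loop that nudges a single landscape "vitality" value between flourish and decay:

    #include <stdio.h>

    /* Invented sketch: one scalar "vitality" value stands in for the whole
       landscape, and the readings are placeholders for whatever a real
       system (webcam, microphone, people counter) would supply. */
    typedef struct {
        int    people;     /* number of people in the room */
        double sound;      /* 0.0 (silence) .. 1.0 (loud)  */
        double movement;   /* 0.0 (still)   .. 1.0 (busy)  */
    } RoomReading;

    /* Crowded, noisy, busy rooms push the landscape to flourish;
       below the threshold it decays. */
    double growth_rate(RoomReading r) {
        double activity = 0.1 * r.people + r.sound + r.movement;
        return activity - 0.5;
    }

    int main(void) {
        double vitality = 1.0;
        RoomReading samples[] = {
            { 8, 0.7, 0.6 },   /* a lively room */
            { 0, 0.1, 0.0 },   /* a dead zone   */
        };
        for (int i = 0; i < 2; i++) {
            vitality += 0.1 * growth_rate(samples[i]);
            printf("tick %d: vitality %.2f\n", i, vitality);
        }
        return 0;
    }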

Setting the Table

After doing some further research into Augmented Reality I have been working with ARToolkit.
The ARToolKit is a collection of libraries, utility applications, documentation and sample code. The libraries provide the user with a means to capture images from video sources, process those images to optically track markers in them, composite computer-generated content with the real-world images, and display the result using OpenGL (Philip Lamb, 2004). ARToolKit is designed to build on Windows, Linux, SGI Irix, and Macintosh OS X platforms.
It seems that ARToolkit runs on a .NET programming architecture rather than plugging into another programme the way the AR plugin for Sketchup does; the plugin is a much more direct system. The parts of the toolkit are downloading now, and then it is a matter of connecting everything together.
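
For reference, the core of an ARToolKit application is only a handful of calls. The condensed sketch below is modelled on the simpleTest example that ships with the toolkit (the classic C API, written from memory; the real sample adds the camera/pattern setup, error handling and the OpenGL drawing):

    #include <stdlib.h>
    #include <AR/ar.h>
    #include <AR/video.h>

    /* Per-frame core of an ARToolKit application, modelled on simpleTest:
       grab a video frame, find the marker squares, recover the
       camera-to-marker transform, then draw 3D content with it. */
    static void mainLoop(void) {
        static int    patt_id;               /* loaded with arLoadPatt() at startup */
        double        patt_width     = 80.0; /* marker size in millimetres */
        double        patt_center[2] = { 0.0, 0.0 };
        double        patt_trans[3][4];
        ARUint8      *dataPtr;
        ARMarkerInfo *marker_info;
        int           marker_num, thresh = 100;

        if ((dataPtr = arVideoGetImage()) == NULL) return;  /* no new frame yet */

        /* Detect all marker squares in the frame. */
        if (arDetectMarker(dataPtr, thresh, &marker_info, &marker_num) < 0) exit(0);
        arVideoCapNext();

        /* Find our pattern among the detections (confidence check omitted). */
        for (int i = 0; i < marker_num; i++) {
            if (marker_info[i].id == patt_id) {
                /* Recover the transformation between camera and marker... */
                arGetTransMat(&marker_info[i], patt_center, patt_width, patt_trans);
                /* ...then convert it (argConvGlpara) and draw the landscape
                   on the marker with OpenGL. */
            }
        }
    }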
My plan is to separate the landscapes into components and then spatialize them using either ARToolkit or the plugin for Sketchup. The plugin route requires some finagling: first in 3ds Max, and then into Rhino.

.........................................

The Wii Remote arrived and I am in the process of constructing a sensor bar. Thankfully Maplin has the IR LEDs!

19.1.09

The Tools

A few of the tools I am using to develop the landscapes in 3D are ARToolkit, the AR plugin for Sketchup, and Johnny Lee's Wii hacks. It seems like every Wii remote in the city of London has been bought up, so I am waiting for one online.
The description below is taken from Lee's website:

Head Tracking for Desktop VR Displays using the Wii Remote

Using the infrared camera in the Wii remote and a head mounted sensor bar (two IR LEDs), you can accurately track the location of your head and render view dependent images on the screen. This effectively transforms your display into a portal to a virtual environment. The display properly reacts to head and body movement as if it were a real window creating a realistic illusion of depth and space.

The program only needs to know your display size and the size of your sensor bar. The software is a custom C# DirectX program and is primarily provided as sample code for developers without support or additional documentation. You may need the most recent version of DirectX installed for this to work.
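
Lee's actual software is a C# DirectX program, but the geometry behind the trick is compact. Here is a rough reconstruction in C (my own sketch, not Lee's code; the camera constants and LED spacing are assumptions): the apparent pixel separation of the two IR dots tells you how far away the head is, since the physical width of the sensor bar is known.

    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Assumptions, not measured values: the Wii remote's IR camera reports
       dot positions on a 1024-pixel-wide image covering roughly a
       45-degree horizontal field of view; the two LEDs on the homemade
       sensor bar sit 200 mm apart. */
    #define CAM_WIDTH_PX  1024.0
    #define CAM_FOV_RAD   (45.0 * M_PI / 180.0)
    #define BAR_WIDTH_MM  200.0

    /* Estimate head distance from the two IR dot x-positions (pixels).
       The nearer the head, the farther apart the dots appear. */
    double head_distance_mm(double x1, double x2) {
        double rad_per_px = CAM_FOV_RAD / CAM_WIDTH_PX;
        double half_angle = rad_per_px * fabs(x2 - x1) / 2.0;
        return (BAR_WIDTH_MM / 2.0) / tan(half_angle);
    }

    int main(void) {
        /* Dots 100 px apart: the head is roughly 2.6 m from the screen. */
        printf("%.0f mm\n", head_distance_mm(462.0, 562.0));
        return 0;
    }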


This type of technology would really allow me to experiment with change over time within the landscape. I am curious how it could develop away from a flat screen; perhaps one could occupy the same space as the three-dimensional projection.

So far I have been testing the spatialization of the cliffs with the Augmented Reality plugin for Google Sketchup. The setup is so simple that I could immediately have a basic volume projected [see below]. However, when I sliced the cliff into staggered layers, like a loaf of bread laid on a flat plane, the JPG images would not transfer with transparent backgrounds, preventing a compounded view of the cliff when projected in 3D. Since Sketchup is so basic, I am planning to work with ARToolkit to see if the results will be better and more sophisticated.

The Reaction

After the Barbican exhibit [mentioned below] I began thinking about how a simple information relay could be applied to my project. The siting within the nuclear-fallout idea [previous posts] was a tangent that pushed the ideas in a limited direction. As the three renders are landscapes in themselves, the issue of siting is already addressed. Furthermore, by the end of the project a smooth process needs to be apparent.
S suggested spatializing the landscapes in order to test out relationships within each [cliff, forest, swamp] and as a collective. The eventual change in each site is still hazy, but I am learning that not every step needs to be pre-determined. So, going back to Hemmer's exhibit, the idea of humans tuning the surrounding audio of a space could easily be translated into the tuning of any surrounding landscape. The necessary elements would be: factors [people in the room, temperature, movement, shadow]; a warping device [a system that would read the factors and allocate a prescribed rule of change]; and the change itself, which would trigger the landscape to distort.
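
As a placeholder for that warping device, a minimal sketch (all names and thresholds are my own assumptions, not a working system) could read the factors and allocate a rule of change like this:

    #include <stdio.h>

    /* Hypothetical warping device: reads the factors and allocates a
       prescribed rule of change. Thresholds are invented placeholders. */
    typedef struct {
        int    people;       /* occupancy of the room */
        double temperature;  /* degrees C */
        double movement;     /* 0.0 (still) .. 1.0 (busy) */
        double shadow;       /* fraction of the floor in shadow */
    } Factors;

    typedef enum { FLOURISH, DECAY, MUTATE, HOLD } Rule;

    Rule warp(Factors f) {
        if (f.people == 0 && f.movement < 0.1) return DECAY;    /* dead zone */
        if (f.shadow > 0.5)                    return MUTATE;   /* heavy occupation */
        if (f.people > 5 || f.movement > 0.5)  return FLOURISH; /* lively room */
        return HOLD;
    }

    int main(void) {
        Factors now = { 0, 18.0, 0.05, 0.0 };   /* an empty, still room */
        printf("rule = %d\n", warp(now));       /* prints 1, i.e. DECAY */
        return 0;
    }

The landscape would then apply whichever rule comes back, distorting accordingly.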


The previous post mentioned an interface called Reactable; a demo is below. The factors that make up the interface are incredibly simple, a good model to follow when producing my own system.

Frequency and Volume at the Barbican_January 2009

This Saturday was the close of Rafael Lozano-Hemmer's project Frequency and Volume. Not knowing exactly what to expect, it was pretty brilliant to have the installation explained through one's own exploration, meaning that if you truly wanted to understand the project you had to look for the clues indicating the mechanics of the installation. The premise was simple: the human body, moving through the long curve of the Barbican hall, became a giant tuning device for several radio stations. The stations ranged from news and popular music to maritime, aeronautical and even pirate broadcasts. Volume was controlled by one's distance from a projector on the ground. The projector cast your shadow, and as you moved a webcam recorded your position in space. That position was relayed back to the main radio control room, where equalizers, amplifiers and computer hard drives calculated the appropriate radio station and volume. The setup repeated every ten steps; if a shadow moved even slightly in either direction, the station would change.
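
As I understood the mechanics (my own reading of the piece, not Hemmer's actual system; the constants are invented), the mapping is almost trivial: position along the Curve selects a station in discrete zones, and distance from the projector sets the volume.

    #include <stdio.h>

    /* A guess at the Frequency and Volume mapping, not Hemmer's code. */
    #define ZONE_LENGTH_M 7.5   /* "every ten steps", assumed */
    #define NUM_STATIONS  16

    /* Position along the hall picks the station: cross a zone boundary
       and the station changes. */
    int station_for(double position_m) {
        return (int)(position_m / ZONE_LENGTH_M) % NUM_STATIONS;
    }

    /* Closer to the projector = bigger shadow = louder. */
    double volume_for(double dist_from_projector_m, double max_dist_m) {
        double v = 1.0 - dist_from_projector_m / max_dist_m;
        return v < 0.0 ? 0.0 : v;
    }

    int main(void) {
        printf("station %d at volume %.2f\n",
               station_for(23.0), volume_for(2.0, 8.0));
        return 0;
    }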
In Kevin Kelly's book Out of Control, hive mentality is explained through a similar experiment. In the early nineties a relay system was devised so that a camera could read a room full of blue and red placards. Each person held a placard, and the objective was to collectively land a virtual plane. After several tries the group began moving instinctively instead of individually. As with Hemmer's project, one could easily imagine a room full of people tuning the DJ's set at a party.
As we were exploring the exhibition at the Barbican, the docent gave us a mini tutorial and mentioned a similar system, called Reactable, that Björk used on her recent tour.

 
The above is video taken from the Curve space at the Barbican; judging from previous exhibitions there, the Curve couldn't be more appropriate, since the space one occupies to tune the radios has to be linear. The people at the exhibition seemed to be missing the point, more interested in the shadow projections than in the radio stations. *Notice how the volume soars when the giant baby shadow is cast.

13.1.09

The Skeleton of the Skeleton: The site destroyed

The idea of the site as the result of nuclear fallout came from the last three drawings [the final layer] and this mapplet from CarlosLabs, a project called Ground Zero.

"Have you ever wondered what would happen if a nuclear bomb goes off in your city? With Google's Maps framework and a bit of Javascript, you can see the outcome.
And it does not look good."

Skeletal Locations



++++++++++++++


Shopping for Ingredients

Within the process of any project it is imperative to ground concepts in research; looking at theory, history or contemporary concepts gives a new dimension to one's work. During the Christmas break it was lovely to simply gather books together and collect my thoughts and those of others.
Of the phrases typed into the library's search engine, the most frequent were: landscape, viral landscape, bio-morphism, biomechanics, evolutionary architecture, evolving, war, conflict, hybrid...


This waterfall of reading yielded some standout literature.


Frazer, John. An Evolutionary Architecture. London: Architectural Association, 1994.
John and Julia Frazer's experiments at the AA were published in An Evolutionary Architecture, and AD magazine ran some further articles on the Groningen Experiment, a mid-nineties test of the capability of computers to assist with "lazy" [Frazer's term, not mine!] architecture. Taking cues from natural selection and the workings of DNA, they applied several rules to their designs; in the case of the Groningen Experiment, this logic was applied to the urban planning of the city of Groningen.
Basically, they proposed allowing the fabric of the city to grow organically through what they termed a data-structure, conceived as a set of points in space through which information is passed locally from neighbor to neighbor.
The urban fabric would evolve via this pattern: [the chart below was interpreted in parallel with the readings].
This method suggests an interesting possibility for the concept of a viral landscape: a way to determine how the landscape would change, how the virus would disseminate, and so on. Logic such as the Frazers' data-structure and urban design pattern could lead to a mapping within the cliff, forest and swamp. Beyond mapping, the ideas they ascribed to the growing urbanism can be similarly transferred to how the organism would operate. The Keeper is the evolver and the Instigator is the enabler, both governed by methods of control; in the case of the Groningen Experiment these were a genetic algorithm [which handles replication] and a neural network [providing checks and balances within the environment]. Hence an understanding of the technology and machinery behind the Keeper/Instigator is necessary.
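
To make the data-structure idea concrete, here is a minimal sketch (my own illustration, not the Frazers' code) of a row of points passing information locally from neighbor to neighbor, in the spirit of a cellular automaton:

    #include <stdio.h>

    #define N 16   /* points in the data-structure */

    int main(void) {
        double state[N] = { 0 }, next[N];
        state[N / 2] = 1.0;   /* seed a single point with information */

        /* Each step, a point keeps half its value and takes a quarter
           from each immediate neighbor, so the seeded information
           spreads locally through the structure. */
        for (int step = 0; step < 4; step++) {
            for (int i = 0; i < N; i++) {
                double left  = (i > 0)     ? state[i - 1] : 0.0;
                double right = (i < N - 1) ? state[i + 1] : 0.0;
                next[i] = 0.5 * state[i] + 0.25 * (left + right);
            }
            for (int i = 0; i < N; i++) state[i] = next[i];
        }
        for (int i = 0; i < N; i++) printf("%.3f ", state[i]);
        printf("\n");
        return 0;
    }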

10.1.09

The Party has Begun

After the term review, and piggybacking on the three final renders, came the realization that the project had too many unanswered questions. How exactly does the organism work? Where does the Architecture lie: within the organism, or within the affected landscape? And so on. To posit any response, a scientific methodology had to be adopted. I imagine the project unfolding in a five-step process:
I. Initial concepts (ideas that I have been grappling with from the very beginning of the project).
II. Defining the desired outcomes of the project within a site.
III. Adding variables and responses.
IV. Defining the method by which the outcomes are achieved.
V. Testing through drawing / fabrication (models).

4.1.09

The Tea Party in the Swamp


The effects of the Instigator and Keeper upon one environment. Here, the Swamp: in the last stages color is heightened and multiples are exaggerated to mutate the landscape.

The Tea Party in the Cliff




The effects of the Instigator and Keeper upon one environment. Here, the Cliff: not limited to the nominal environment of rock outcroppings, but also including ice floes and structures within, like stalactites.

The Tea Party in the Forest




The effects of the Instigator and Keeper upon one environment. Here, the Forest.

The Pre-Party Invitation

 
For those liberated from computers after reading this message: simply point and shoot at the code above with any mobile device and this blog will always be in your pocket.
To get the reader software, go here to download it to your computer or mobile phone.
*I can just imagine insidiously integrating a shotcode like the one above into a landscape: growing one from grass like a crop circle, or from stones, clusters of trees, etc. The information could be projected into space and snapped up from airplanes or viewed from the moon.