Wednesday, 11 March 2009
The final project: A proposal
As well as this, there is the idea of connecting GPS narrative elements with the GreenScreen and the Arch-OS, which could be quite dynamic and interesting, or the option of investigating social ecology through virtual environments. All of these directions would be interesting to develop, particularly the newer ideas, as they would build on what was done previously rather than repeat it. These ideas will, however, need to be developed further to decide whether they would be appropriate.
Ultimately for my project I would like to incorporate some use of the Arch-OS or the GreenScreen, or both, as I found these the most engaging parts of the module. However I would like to use them in a more abstract way rather than simply taking the Arch-OS data and visualising it on the GreenScreen, as this has already been approached a number of times before.
Monday, 9 March 2009
The Workshops: A Synthesis
In the three workshops a wide range of new concepts has been introduced. The aim of this synthesis is to combine these different ideas and consider the new possibilities that come from doing so. In particular, combining elements of the three workshops may provide inspiration for the final Agile project. In this synthesis I will draw separate conclusions from each workshop and then outline the different ideas that have stemmed from these experiences.
The Picnic
The Picnic was the first workshop and introduced a number of the leading aspects of the module. These included abstraction and reconstruction, and it gave us the ability to look at information from a different perspective. The Picnic was a steep learning curve in terms of these more open ways of thinking about and viewing the environment, but this in some ways made it more engaging. It also benefited the later workshops, as concepts such as abstraction ran throughout them, and being introduced to them at an early stage made it easier to carry out future tasks. In terms of the ecology aspect of this workshop, both social and human ecology were covered, which gave a broader view of the underlying concept of the module. The most engaging aspect was discovering ways in which data can be abstracted; although I knew of this concept, at first I found it difficult to put into practice, and working through it helped to develop my creative skills.
The Field
The Field was the second workshop, covering hertzian space and locative media with a focus on human ecology. The main focus of this workshop was the GPS track and ways to develop it from a simple GPS drawing into a more dynamic outcome. I found the hertzian space element particularly interesting, as it allowed me to investigate a question I had previously considered while developing it into a spatial understanding of my surroundings that was relevant to the task at hand. Unlike the GPS section, this part saw little development beyond the initial investigation, so there is a possibility of continuing it in the future.
The Territory
The Territory was the third and final workshop and revolved around the Arch-OS system. Of the three workshops I felt this one was the most informative in understanding and developing ideas for its ecological theme of deep ecology. The most interesting element was the Arch-OS section, as there are numerous possibilities for the data it provides, and the concept of intelligent architecture is a new and exciting area to explore. Studying the Arch-OS system introduced a new perspective on the building itself, one that people unaware of Arch-OS would not see. This sense of discovery is also one of the reasons why the GreenScreen is interesting to work with, as it relies on people having to look closer and unravel the information to understand what it really represents. The philosophical element of deep ecology also allows this section to be more flexible in terms of concepts and ideas.
New Ideas
By combining the information learnt during these workshops a number of new ideas can emerge. These ideas can then widen our investigation, but more importantly can be used to develop a concept for the final Agile project.
Firstly there are ideas such as those already voiced in this synthesis. In particular I feel the investigation of hertzian space not involving GPS was left undeveloped, and there are a number of ways in which it could be continued. One would be to investigate hertzian space over time. The maps made at the beginning of The Field using wi-fi and Bluetooth were static images, but the position, strength and other characteristics of these signals change over time. Therefore, much like the GPS narrative, the hertzian space could be studied and then reconstructed over time. This would move the hertzian space element away from a space-based approach and towards a time-based one, linking elements of the Picnic with the Field. A rough sketch of what this kind of repeated sampling could look like is given below.
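The following is purely my own sketch of the idea rather than anything produced in the workshops: a small script that records the surrounding networks at regular intervals and logs how their signal strengths change over time. The scan_wifi() helper is a hypothetical placeholder that would need to be replaced with whatever wireless scanning tool the platform actually provides.

```python
# A minimal sketch of time-based hertzian mapping (my own illustration, not part
# of the original workshop). It assumes a platform-specific scan_wifi() helper
# that returns (network_name, signal_strength_dBm) pairs -- hypothetical here.
import csv
import time
from datetime import datetime

def scan_wifi():
    """Placeholder: replace with a real scan using the platform's wireless tools."""
    raise NotImplementedError("platform-specific wifi scan goes here")

def log_hertzian_space(outfile="hertzian_log.csv", interval_s=60, samples=60):
    """Record how nearby signals change over time, one row per network per sample."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "network", "signal_dBm"])
        for _ in range(samples):
            now = datetime.now().isoformat()
            for name, strength in scan_wifi():
                writer.writerow([now, name, strength])
            time.sleep(interval_s)
```

The resulting log could then be reconstructed into a time-based map in the same way the GPS narrative was built up frame by frame.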
Another idea that has emerged from carrying out these workshops is a way of combining elements of the GPS section with the GreenScreen from the Territory. Previously the GPS work had either been presented as a static image or as a simple video narrative showing the development of the path. These have only been viewable through a computer and this blog, whereas with the GreenScreen the tracks could be displayed to the public. In addition, if some of the Arch-OS data could be used to help generate the GPS element of the concept, the final image would be more relevant as a whole.
The ecologies covered in this module also suggest new ideas. In particular it occurred to me that social and human ecology are quite difficult to separate, as whenever people are interacting socially they are also interacting with their physical environment. This made me question whether it is possible to observe social ecology without the interference of human ecology. For this to really be possible, the social interaction would have to occur in a non-physical environment. This made me consider virtual social environments such as MSN and Facebook, as these allow for a social environment without any involvement of the physical environment. This could be an interesting area to investigate as it approaches a completely different type of ecology in terms of social interaction.
Conclusion
Overall the Agile workshops have been very effective in introducing and developing new concepts relating to architectures of the near future. Although certain elements were difficult to understand to begin with, by working through the projects and looking at examples of similar work a better understanding began to develop. This module has been an interesting developmental process that I have enjoyed and has introduced a wide range of different techniques and strategies. The final stage of this module is to take these ideas and create a final project with them that will be interesting and effective.
The Territory: A Synopsis

In order to investigate the concept of deep ecology, the Territory is based on the Arch-OS system. This is a computerised system set up in the Portland Square building of Plymouth University which monitors different characteristics inside and outside the building and collects data about them; this data can then be manipulated to demonstrate the changes in the building. In some ways this can be considered the life of the building, and it therefore ties in with the concept of deep ecology, as the building is being viewed not as a resource but as a separate living object which interacts with the environment in its own way.
There is an overview and three main areas covered by the Territory, which are listed below. To find out about each of them, click on the link, which will redirect you to the relevant post:
Once we had investigated each of these elements we were asked to carry out a small project in which we could use some or all of the elements to create an interesting streaming experience. For my project I decided to work with both the GreenScreen and the Arch-OS data to create a visualisation for the movement of people through Portland Square. To find out more about this project and to see the final result visit the posts below:
Overall the streaming project was quite successful as it created an intriguing and dynamic image. In particular it worked well in the context of the project, as it was designed for the building and personified it in a way that matched the concepts set out by deep ecology. Its main downfall was that it did not take its data directly from Arch-OS, due to the vision system not working. This made the animation less relevant and meant it did not give a sense of the real-time life of the building. It would not be impossible to implement this properly, though, and so the product that was produced at least acted as a successful demonstration of what could be created.
Sunday, 8 March 2009
The Field: space-based narrative
This space-based narrative is quite a successful representation of the GPS drawing. The main element that leads to its success is the relationship between the line drawn on the ground and the actual GPS drawing placed on top. Without the track overlay, the viewer of the video is unaware of their distance or location along the track; with the drawing laid on top it is possible to know precisely which part of the path the video is currently displaying. The book element of the video also works well, as it gives the impression of the images developing linearly through time along with the narrative, similar to the pages of a book. The downside of this narrative is that it is not the most original way of displaying the information, and there may have been more abstract and interesting ways to construct it.
The narrative is overall an element that, in this case, encompasses the rest of the GPS elements previously obtained throughout The Field. This makes it quite an interesting construction as it not only shows the different elements but highlights the relationships between them.
The Field: space-based processes

Overall the space-based processes section of The Field helps to further inform the space-based navigation section and to provide a better representation of the events that occurred during the navigation task.
The Field: space-based navigation
The second practical session of The Field is labelled space-based navigation, which relates more closely to the concept of locative media. In particular, this part of the project dealt with GPS tracking through the use of GPS devices. By tracing your movement with a GPS device you are able to draw images onto the landscape, almost like a form of virtual graffiti. The task was to create an image using a GPS track within the space around the university, in the manner of the GPS drawings created by others around the globe. After looking at a number of examples, such as that shown right, I decided that I liked the idea of being able to write text onto the landscape and so began to design my own drawing. The image I was to create had to be possible within the paths around the university, so I used a map of the university to define it. I also felt that my image should relate to the task at hand, as this would make the drawing more relevant. The drawing I created spells out the word GPS, with a pencil following it, and is as follows:
The only trouble with the actual track I made was that the GPS device recorded it as a series of points rather than a continuous line, so I had to overlay a line on the image to make it more readable (a rough sketch of this points-to-line step is included at the end of this post). Apart from this, the track was quite successful and included a detailed navigation of the space available. For more information about creating this track view the following posts:
This space-based navigation developed the previous elements of the Field by introducing the concept of using and manipulating hertzian space in order to create images that only exist in a virtual environment.
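The following is a small sketch of the points-to-line step mentioned above. It is my own illustration rather than how the overlay was actually made (that was done by hand), and it assumes the track was exported as a GPX file and that matplotlib is available; the file name is just a placeholder.

```python
# A small sketch of joining recorded GPS points into a readable line (my own
# illustration; the original overlay was drawn by hand). Assumes a GPX 1.1 file.
import xml.etree.ElementTree as ET
import matplotlib.pyplot as plt

GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def load_track(path):
    """Return the track points of a GPX 1.1 file as (lon, lat) pairs."""
    root = ET.parse(path).getroot()
    return [(float(pt.get("lon")), float(pt.get("lat")))
            for pt in root.findall(".//gpx:trkpt", GPX_NS)]

def draw_track(points):
    """Plot the raw points and join them into a continuous line."""
    lons = [p[0] for p in points]
    lats = [p[1] for p in points]
    plt.scatter(lons, lats, s=5, label="GPS points")         # what the device recorded
    plt.plot(lons, lats, linewidth=1, label="joined track")  # the readable line
    plt.legend()
    plt.axis("equal")
    plt.show()

if __name__ == "__main__":
    draw_track(load_track("gps_drawing.gpx"))  # placeholder file name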
The Field: space-based mapping
For my space-based map I first chose to investigate wifi using my laptop in my accommodation at Alexandra Works. I thought this would be an interesting hertzian space to investigate as I have a wireless network in my room, but I was intrigued as to how many other students in the building had decided to do the same thing and what other networks were accessible. After investigating the building a number of times with my laptop I discovered a surprising number of wireless networks and then made a map to portray their location and range (shown left). This demonstrates an interesting use of the hertzian space around the building, especially in areas where the signals overlap, as the hertzian space is shared by more than one signal at these locations. To find out more about the production of this map view the post Hertzian Space.
However this space-based map was not all that accurate, as the actual ranges of the networks were likely to be far more irregular due to interference from the building. I therefore decided to do a second map using Bluetooth in the Roland Levinsky building of the university. The map I produced is shown right. This was an interesting area to investigate, as the movement and range of the Bluetooth devices could be monitored through the recognition of Bluetooth names on different floors. It also demonstrates where the hertzian space is most dense with Bluetooth signals and where it is not. To find out more about this second map visit Roland Levinsky Mapping.
Overall the space-based maps I produced were both relatively successful. Investigating the areas described introduced a new way of viewing the space that technology reaches, as well as a different way of viewing the environment as a whole. It demonstrated an interaction with the environment, relating to human ecology, which may previously have been ignored, and increased awareness of the hertzian space in use around us.
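As an aside, the ranges drawn on these maps were judged by eye. One rough way they could be estimated from measured signal strength is the standard log-distance path-loss model, sketched below; the constants here are illustrative assumptions only, and interference inside a building makes any such estimate very approximate.

```python
# A rough sketch (my own, not part of the original maps) of estimating how far a
# wifi or Bluetooth signal reaches from its received strength, using the simple
# log-distance path-loss model: RSSI(d) = RSSI(1m) - 10 * n * log10(d).

def estimated_range_m(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=3.0):
    """Distance in metres at which the signal would drop to rssi_dbm.

    rssi_at_1m and path_loss_exponent are indicative defaults; real buildings
    (walls, floors, people) change them considerably.
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# Example: a network still readable at -70 dBm would reach very roughly
# estimated_range_m(-70) == 10 metres with these assumed constants.
```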
The Field: the beginning

Hertzian space is the concept of the space taken up by hertz-based frequency transmissions. These transmissions are invisible to the naked eye and so are not usually considered as filling any space at all. Examples include wifi, Bluetooth, radio, TV and mobile signals. All of these transmissions are broadcast through the air, but due to the lack of any visual physicality they are not considered to be using space. These transmissions do, however, hold a physical form; we are simply unable to see the electromagnetic frequencies at which they exist. Left is an example of a visualisation of hertzian space based on mobile phone calls, using balloons. By studying hertzian space a new view of the environment can be discovered in which the atmosphere is full of different hertzian materials.
The Picnic: the picnic mat



This model was designed so that the peaks of the object were where the highest frequency of the notations occurred, while the troughs of the model were where there was little notation. On the non-printed side however these characteristics are reversed. For more information on this model see the following post:
The main criticism to be made of this model, however, is that its characteristics are quite difficult to identify through images such as those above. In its physical form the different folds are easier to read, so to make the key folds easier to recognise from images I have made the following illustration:

This also demonstrates how much the observations rely on the presence of people in the image, as each of the key peaks is located at a point where a person was located in the original photograph. This highlights an interesting relationship between the environment and the presence of civilisation, as many of the observable elements of the environment, such as sound and movement, are most often sourced from humans.
This model almost acts as a complete reconstruction of the original picnic, but is missing a vital element. The original picnic operated in 4D, in that it occurred over a period of time. The 3D object above does not incorporate time and so cannot be described as a complete reconstruction. This final element was not, however, required as part of the project. The final 3D object provides an abstract and intriguing perspective of the picnic, and acts as a viable conclusion to the project.
The Picnic: time-based model


These models demonstrate the abstraction of the original interaction into a physical reconstruction of the event from a different perspective. This is similar to the aim of The Picnic in that the original picnic was to be deconstructed then slowly reconstructed to produce a map of the picnic that showed it from a different perspective. The time-based model therefore acted as a part of this process, similar to the moulds and models made in the above projects.
The interaction that I had chosen to focus on was that of the hands within the picnic, and so for the model I made a mould of the back of my hand. I particularly made the knuckles more prominent by moulding a clenched hand rather than a flat hand, as this made the mould easier to recognise and a more interesting shape to look at. The following are images of the model that I made:
Once I had moulded the model of my hand and photographed it the images were placed over the hands of those in the time-based photograph as a final notation modelling the interactions that were made. The final notation therefore was as follows:

For more information on the production and development of the model and notations see the following posts:
By incorporating the 3D element into the notations the interactions that were studied become instantly more prominent. The model also adds a sense of depth to the notation due to the placement of the model matching that of the time-based photograph. This is an interesting observation as not only does the inclusion of the model add a 3D object to the image, but it also translates the whole notation from a 2D perspective to a more 3D one. This demonstrates that the inclusion of the model was more effective than first intended, and therefore more successful.
Overall the time-based model allowed us to investigate further the ways in which we can interpret our interactions with the environment and develop them into a more abstract reconstruction.
Saturday, 7 March 2009
The Picnic: time-based drawing

Edward Tufte’s work revolves around the visualisation of data and information in a way that is more effective and efficient at portraying the information to the audience. This means using unconventional methods of communicating information, i.e. not basic graphs and tallies, but stimulating and representative images that portray the same information using as little notation as possible. This demonstrates an interesting and unique way of looking at and dealing with information. His works often include environmental elements, although the basis is the modelling of data. Looking at works such as Tufte's gave us inspiration for how to notate our images in a way that displays the information within them. The following are the final notation images that I produced during the project:
These notations also show the model section of the Picnic discussed later in The Picnic: time-based model. To find out more about the construction of these notations select the links to the posts below:
The time-based drawing that I produced was actually quite successful. To begin with I found it difficult to find interesting and unique ways to notate the environment, mainly because of the challenge of abstracting elements of the environment that are not usually paid any attention, such as light and sound. However, the more time I spent experimenting and looking for ideas, the easier I found it to make the necessary abstractions, resulting in the final notations shown above.
Of the notations I produced I particularly like the grid used to display light and dark areas, and also the dots used to portray hard and soft materials. These are the better notations as they are more strongly abstracted from what they relate to. This makes them more interesting, as their purpose is less obvious. They also add interesting shape and colour to the drawing, as they are more widespread characteristics that do not rely on the people in the photograph.
Overall the time-based drawing was a revealing and developmental task that was vitally important in learning to think more abstractly about the task at hand. Not only did it help in terms of future projects but also in terms of the ecological factor of the Picnic. By abstracting the photograph of the Picnic it highlighted elements of the environment which we interact with that we may not have previously noticed, therefore enhancing the human ecology aspect of the Picnic.
The Picnic: time-based photograph
The final collage that I created resulted in the following time-based photograph:
I think this collage is quite successful in portraying the social interactions that occurred on the Picnic. In particular a variety of hand gestures are portrayed, demonstrating the focus of the image. The image however has not lost any readability due to this and so the situation can be easily assessed by the viewer. The composition of the image allows for the various movements and gestures to be displayed without crowding the image. As well as social interactions it also demonstrates interaction with both natural and man-made objects within the environment, which directly links with the theme of both social and human ecology covered in this module.
The time-based photograph was an interesting and engaging way to begin this project and the module overall. By using a practical and social event to launch the project, it allowed us to get to grips with the mindset needed while also acting as a useful introduction to the people and places around us.
Friday, 6 March 2009
The Picnic: the beginning
The Report: An Introduction
In particular there are three sections of this module that will be covered. These are The Picnic: social ecology, which is time-based; The Field: human ecology, which is space-based; and The Territory: deep ecology, which is based on Arch-OS concepts. The projects for each section were worked on throughout that section, allowing us to progress through it and use the information gained to develop and improve our work. By looking at these areas and concepts in detail, this report will allow for a development of ideas for the final project, and ultimately a final project proposal based on this.
Thursday, 5 March 2009
The Building's Dream
After using the vision system video to draw out the movement of the people through the atria, I removed the video from the animation, leaving just the movement squares, and placed them on a black background to match what is needed for the GreenScreen. However, when drawing my animation I did not match the GreenScreen resolution, so the animation is currently the wrong shape to be streamed properly through the screen. This is not particularly important, though, as this animation is only meant to give some idea of what could be produced. The result is shown below:
Although this is not currently in real-time, if it were developed further it could be programmed to run directly off the vision system and so could be made more relevant. This animation is quite a literal representation of the original data, but it does look quite interesting once the direct relationship with the movement in the video has been removed. This changes the viewer's perspective, as they do not have the video to relate to and so are able to focus on the less obvious aspects of the video displayed in the animation. The interaction between sets of squares is particularly interesting, as separate clusters merge together at times, whereas the people they represent were unlikely to touch at all. This is similar to the concept of personal space: by using squares which extend beyond the space taken up by the people, the personal space that they have is in some way visualised. The completion of this animation completes the streaming section of the module. The next step is to document and synthesise all of the experiences and information gained since this module began, which will then lead on to developing a final project based on what we have learnt.
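The resolution problem mentioned above could be avoided by rendering the square positions directly at the target size. The following is a minimal sketch of that idea, my own illustration in Python using Pillow rather than the Flash animation actually produced; the resolution used is a made-up placeholder rather than the real GreenScreen's.

```python
# A minimal sketch (not the actual Flash animation) of rendering tracked square
# positions as frames on a black background at a fixed target resolution.
# SCREEN_W and SCREEN_H are placeholders for the real GreenScreen resolution.
import os
from PIL import Image, ImageDraw

SCREEN_W, SCREEN_H = 320, 240   # placeholder resolution, not the real screen's
SQUARE = 20                     # side of each movement square, in pixels

def render_frame(positions, frame_index, out_dir="frames"):
    """Draw one frame: a red square at each position, scaled to the screen.

    positions are (x, y) fractions of the frame (0.0 to 1.0), so the same
    tracking data works at any output resolution.
    """
    os.makedirs(out_dir, exist_ok=True)
    img = Image.new("RGB", (SCREEN_W, SCREEN_H), "black")
    draw = ImageDraw.Draw(img)
    for fx, fy in positions:
        x, y = int(fx * SCREEN_W), int(fy * SCREEN_H)
        draw.rectangle([x, y, x + SQUARE, y + SQUARE], fill="red")
    img.save(f"{out_dir}/frame_{frame_index:04d}.png")

# Usage: call render_frame once per animation frame with that frame's square
# positions, then assemble the saved PNGs into a video or stream.
```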
Seeing and Dreaming
I considered a number of different ways of manipulating this into an animation to show the movement of people through the building, but concluded that the best option was to draw upon the video and use the motion recognition system, abstracting it from the video so that its purpose would be harder to recognise. This would make the animation more intriguing and interesting to watch while staying simple and dynamic. I therefore began to build my animation in Flash by copying the red squares from the video and their movement through the atria. This meant that I could then remove the video itself and be left with the squares alone. Removing the relationship between the footage and the squares makes it less obvious what the data is demonstrating and therefore makes it more abstract. To see the final animation see the post above.
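As a side note, the tracing here was done by hand in Flash. A rough sketch of how the same idea could be automated, purely illustrative and not how the animation was actually made (it assumes OpenCV 4.x is available), would be to difference consecutive video frames and draw a box around each moving region on a black canvas:

```python
# Illustrative only (the real animation was traced by hand in Flash): finding
# moving regions by differencing consecutive frames and drawing filled boxes
# over them on a black background.
import cv2
import numpy as np

def motion_boxes(video_path, min_area=500):
    """Yield, for each frame, a black canvas with boxes over the moving regions."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        canvas = np.zeros_like(frame)           # black background, no video shown
        for c in contours:
            if cv2.contourArea(c) >= min_area:  # ignore small flickers
                x, y, w, h = cv2.boundingRect(c)
                cv2.rectangle(canvas, (x, y), (x + w, y + h), (0, 0, 255), -1)
        yield canvas
        prev_gray = gray
    cap.release()
```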
Wednesday, 4 March 2009
My Streaming Project
For my streaming project I was particularly interested in designing an animation for the GreenScreen based on the Arch-OS data, similar to other visualisations produced previously. I was drawn towards this concept as it allows for a wide range of different interpretations and also approaches an abstract understanding of the world around us. The idea of being able to personify a building by studying and displaying the changes within it is unusual and still relatively rare, in that not many locations yet have technology such as Arch-OS to carry it out. This makes these particular areas more engaging and intriguing, which encouraged me to focus my project on them.
In order to decide more specifically on what my project should be about I decided to make a mind map with which to investigate the possibilities:

By looking for ideas using this mind map as well as elsewhere I was able to come up with my project idea. It occurred to me during my search for an idea that the vision data system is separated into a grid which is similar to that of the GreenScreen.

Arch-OS

The Arch-OS system has already been used for many different projects. These include a number of visualisations of the data that is available (see the data screensaver, left), as well as uses of sound around the atria. The Noogy project (mentioned below) was also integrated with the Arch-OS system so that Noogy portrayed elements of the environment. For example, the wind data was used to animate Noogy's hair so that it appeared to be blowing in the wind. This convergence of the GreenScreen and the Arch-OS system is particularly interesting, as it gave the building a public personality that provided a realistic and dynamic portrayal of the environment.
Monday, 2 March 2009
The GreenScreen

An example of the work that has previously been displayed on the Plymouth GreenScreen is the award-winning Noogy.org project.
The aim of the Noogy project was to show ecological and social data around the university campus through a character called Noogy. By using data from the Portland Square building in which the GreenScreen is placed, along with data from the public via text messages, Noogy was able to portray information about the area through his personality. By placing this on the GreenScreen the information could be received by the public, making them more aware of their environment.
Overall the GreenScreen acts as a public interface by which information can be delivered. It is a unique and interesting take on the concept of Urban Screens which can provide relevant and intriguing information about its surroundings.
Video streaming
The first is the QuickTime Streaming Server (QTSS), which includes QuickTime Broadcaster. This is the Apple-based streaming server which allows Apple users to stream live or pre-recorded videos that can be easily received by others who have access to the internet. This system allows videos to be received by a number of different devices, including mobiles and set-top boxes, and can broadcast at high quality with efficient compression, e.g. H.264.
The second example is very similar to this in that it is based upon QTSS. The Darwin Streaming Server is an open source version of QTSS which can also be run on multiple platforms rather than only on Apple computers. The basis of the product is the same, in that it enables users to broadcast both live and pre-recorded video. Because it is open source, developers are able to take the software and modify it to fit their needs.
As well as these examples it is also possible to stream using a mobile phone. An example of a product that enables this is the Qik.com website. Once registered with the site, this system allows you to stream video directly from your mobile phone camera to the site, where people can then watch the broadcast. This again allows both for live streaming and for streaming of videos that have previously been uploaded or broadcast via the mobile. It is, however, easier to run into problems with this system than with systems such as QTSS. Firstly, it is not necessarily easy to get the right software onto your mobile phone, as not all phone types are listed in the initial set-up; this means some users have to spend more time, and possibly money, to be able to use the facility. Also, by relying on mobile phone cameras for the video, the actual quality that is broadcast may not always be as good as that of normal cameras. Finally, using a wireless connection such as that of a mobile phone makes the link less direct, and therefore slower and less reliable.
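As a purely illustrative sketch of what pushing a pre-recorded file to an RTSP streaming server of this kind might look like: ffmpeg, the server address and the options below are my own assumptions rather than anything used in the module.

```python
# Illustrative only: the server URL is a made-up placeholder, and ffmpeg is not
# something used in the posts above. This shows one way to push a pre-recorded,
# H.264-compressed video to an RTSP streaming server by calling ffmpeg.
import subprocess

def stream_file(video_path, rtsp_url="rtsp://example.com:554/live.sdp"):
    """Re-encode video_path to H.264 and send it to the RTSP server in real time."""
    subprocess.run([
        "ffmpeg",
        "-re",              # read the input at its native frame rate (live-like pacing)
        "-i", video_path,
        "-c:v", "libx264",  # the H.264 compression mentioned above
        "-f", "rtsp",       # send using the RTSP protocol
        rtsp_url,
    ], check=True)
```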
The concept of live video streaming is quite interesting as there are many possible ways of using it. All sorts of material can be broadcast, leading to a broad variety of possible outcomes. Possibilities include videoing your everyday life, a daily video blog, or a live video tutorial. However, a lot of video streaming has already been incorporated into everyday computing with the help of sites such as YouTube, which broadcast information in this way.