Thursday 2 May 2013

Neogeography and Volunteered Geographic Information: Theories and Concepts - By Sarah Labusga




A typical geographer: map, compass and notebook. How much longer will he exist in a world full of computers, cell phones and the internet?
By now, ordinary people take on tasks that used to be done by agencies and civil services specialized in geodata. Why? How can it be that untrained people work with geodata? When, where and why do they do it? Neogeography and Volunteered Geographic Information are terms that frame this topic: they describe the concepts behind this new phenomenon and sharpen questions that are still unclear.

First of all, let us look at the client-server model and how it is changing. Until now the model said that a client sends a request to a server; the server processes it and, if possible, sends an answer back to the client. What changes now is the role of the client. It is no longer the passive part but becomes active by working with the contents of the internet itself and shaping them. A central term is “Web 2.0”. Web 2.0, also called the Social Web or Social Media, is understood as the age of the internet in which the consumer (user) becomes a so-called prosumer. The user does not only download content but also contributes content of their own. They can further edit, correct, comment on and rate many things that are already online. The new user is not only a consumer but also a producer in this unbelievable network. Wikipedia and Wikimapia, Flickr, Twitter and Facebook, as well as OpenStreetMap or Google Maps, are just a few examples.
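
To make this role change a bit more concrete, here is a very small Python sketch of the same client first only consuming content and then contributing some of its own. The web service, endpoint and field names are completely made up for illustration, not a real API.

```python
# Sketch of the consumer -> prosumer shift (hypothetical endpoint and fields).
import requests

BASE = "https://example.org/api"  # placeholder, not a real service

# Classic client role: request content, receive an answer, done.
places = requests.get(f"{BASE}/places/heidelberg").json()

# Web 2.0 role: the same client uploads, comments on and corrects content
# that other users will in turn see, edit and rate.
requests.post(
    f"{BASE}/places/heidelberg/comments",
    json={"author": "a_user", "text": "The playground marked here was removed last year."},
)
```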

In addition to the Social Web there is another concept that is very meaningful and interesting when it comes to integrating users into designing the internet and its content. Especially now, in the age of smartphones, a huge part of the population has access to the internet (almost) always and everywhere. Against this background the idea of VGI (Volunteered Geographic Information) gains importance; it is used, for example, in crisis management. Behind the idea of VGI stand many technologies that make fast and simple handling of spatial data possible. First of all there is geocoding, which turns place names into a spatial reference. Without geocoding we would not be able to localize precisely, even if we have the place names. To help reconstruct an accident situation, for example, we need coordinates in order to make clear statements about the position. A simple tool for the first steps is GPS (the Global Positioning System). Geotags, graphics and broadband communication also become highly significant for getting localized in a simple, fast and very accurate way.
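
As an illustration of geocoding, the short Python sketch below turns a free-text place name into coordinates using OpenStreetMap's public Nominatim service; the exact response fields should be checked against the current API documentation before relying on them.

```python
# Geocoding sketch: place name -> (latitude, longitude) via OSM's Nominatim service.
import requests

def geocode(place_name):
    response = requests.get(
        "https://nominatim.openstreetmap.org/search",
        params={"q": place_name, "format": "json", "limit": 1},
        headers={"User-Agent": "vgi-blog-example"},  # Nominatim asks clients to identify themselves
        timeout=10,
    )
    results = response.json()
    if not results:
        return None  # the name could not be resolved
    return float(results[0]["lat"]), float(results[0]["lon"])

print(geocode("Heidelberg, Germany"))  # roughly (49.41, 8.69)
```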

All these aspects fall under the topic of Neogeography. The term describes the return of geography to the area of information systems, mainly by becoming a strong part of Web 2.0. Neogeographers are equipped with new tools like Google Map Creator or MapTube, services that are called mash-ups. Mash-up is the name for websites that can integrate data from all kinds of sources and combine them into a single service for the user. Of course there are many more concepts coming up with the Social Web, like the Spatial Data Infrastructure, which is nowadays seen as a patchwork system. The capture, editing and provision of geodata is no longer only the job of agencies but now the duty of many participants: companies, community facilities and private persons. The agencies should take on the new task of setting a standard for this distributed work, in order to help combine all the results.
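
A mash-up can be pictured as nothing more than pulling point data from several independent sources into one combined layer that a map widget could then draw. The toy sketch below does exactly that; both data sources are invented stand-ins for real services (an agency database, a geotagged photo feed).

```python
# Toy mash-up: merge geodata from different (here: invented) sources into one layer.
def load_official_poi():
    # stand-in for an agency's spatial data infrastructure
    return [{"name": "Town hall", "lat": 49.4094, "lon": 8.6940, "source": "agency"}]

def load_user_photos():
    # stand-in for a geotagged photo feed contributed by users
    return [{"name": "Street festival", "lat": 49.4122, "lon": 8.7100, "source": "user"}]

def mash_up(*layers):
    combined = []
    for layer in layers:
        combined.extend(layer)
    return combined

for point in mash_up(load_official_poi(), load_user_photos()):
    print(f'{point["name"]} ({point["lat"]}, {point["lon"]}) from {point["source"]}')
```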

Yet another topic, which I find very impressive, is the idea of “humans as sensors”. The model distinguishes three different kinds of sensors: those that are static and passive (a thermometer, for example), those that are carried by animals, humans and vehicles, and a third, new one, which is the human being himself. A human can use his five senses and his intelligence to experience and interpret what is going on around him. If he notes down what he sees, feels, hears and so on, he can help to collect spatial information. That sounds like a great thing: we are coming back to trusting people and their ability to judge. Are we leaving the image of an unreliable and failing human behind us?
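
One way to picture such a “human sensor” reading is as a small, geotagged record of who observed what, where and when. The field names in this sketch are my own choice, not a fixed standard.

```python
# A "human sensor" observation as a simple geotagged record (field names invented).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class HumanObservation:
    observer: str          # who reported it
    lat: float             # where it was observed
    lon: float
    observed_at: datetime  # when it was observed
    description: str       # free-text account of what was seen, heard, felt ...

report = HumanObservation(
    observer="volunteer_42",
    lat=49.4077,
    lon=8.6908,
    observed_at=datetime(2013, 5, 2, 14, 30),
    description="Street flooded after heavy rain, roughly ankle-deep.",
)
print(report)
```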

It seems to be a great opportunity that everybody can join in. Everyone? I do not own a smartphone...


References:

Schindler, M. & Liller, T. (2012): PR im Social Web. Das Handbuch für Kommunikationsprofis; Medienwandel und Web 2.0 verstehen, von Praktikern und Experten lernen, nachhaltige Strategien entwickeln. Köln: O'Reilly.

Goodchild, M. (2007): Citizens as sensors: the world of volunteered geography. In: GeoJournal Vol. 69, No. 4, pp. 211–221. Springer Science+Business Media B.V.

Hudson-Smith, A.; Crooks, A.; Gibin, M.; Milton, R. & Batty, M. (2009): NeoGeography and Web 2.0: concepts, tools and applications. In: Journal of Location Based Services Vol. 3, No. 2, pp. 118–145.

9 comments:

  1. This comment has been removed by the author.

  2. It would be interesting to know to what extent VGI is already used for commercial purposes and how much private companies are interested in solving the issues mentioned, such as a spatial data infrastructure for VGI.

  3. Comment by Louisa Schneider:


    The question arises how to evaluate such subjective data, and how to avoid "black sheep". One challenge is to develop information systems that generalize, process and classify the subjective information in the correct way. It has become scientifically recognized that the definition of space is always the outcome of perception. However, how is it possible to provide correct data if everyone can put his own perception, and thus his own experiences, feelings etc., into it? Or, in doing so, do we even get closer to 'reality', since everyone can monitor and correct each other? Moreover, is it possible that the quality is improved by this kind of self-regulation?

    Replies
    1. In the case of VGI and OSM, a "data model" is constantly developed along with the growth of the OSM map. Thus, the contributor of geo-information has to comply with the rules of the data model, i.e. the "geo-tags" that are provided with it (a small sketch after this reply illustrates the idea).
      In terms of everyone's own perception, isn't it very interesting to see what actually is mapped and what isn't? If we change our perception of what is "correct" and what is "wrong", we can pull much more information about societies out of maps. Sometimes we may have to forget what we learned about the classic rules of cartography to find stories about cultures and societies in maps? :-)

      However, when it comes to Crisis Management and Crisis Mapping, evaluating and verifying geotagged information is a crucial point! Tools like Flickr, YouTube, Twitter, Facebook et al. enable us to geotag social media content on the fly, and this information is widely used by humanitarian responders to get better situational awareness in crisis situations. But how do we know which social media information is right and which is wrong? One might be interested in reading the excellent blog post by Patrick Meier on how to verify (geotagged) social media information in disaster situations here:
      http://irevolution.net/2011/11/29/information-forensics-five-case-studies/
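
      As a rough sketch of what "complying with the data model" can mean: a contributed point is only accepted if its key=value tags follow the community's conventions. The tag list and validation rules below are an illustrative excerpt, not the actual OSM rule set.

      ```python
      # Illustrative check of an OSM-style tagged node against agreed tagging conventions.
      KNOWN_AMENITY_VALUES = {"drinking_water", "hospital", "school", "shelter"}  # tiny excerpt

      def validate_node(lat, lon, tags):
          if not (-90 <= lat <= 90 and -180 <= lon <= 180):
              raise ValueError("coordinates outside the valid range")
          if tags.get("amenity") not in KNOWN_AMENITY_VALUES:
              raise ValueError(f"unknown amenity tag: {tags.get('amenity')!r}")
          return {"lat": lat, "lon": lon, "tags": tags}

      node = validate_node(49.4093, 8.6937, {"amenity": "drinking_water", "name": "Marktplatz fountain"})
      print(node)
      ```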

  4. Due to the different semantic awareness of each "sensor" it is hard to interpret the observations and thus create general information. To counter this, predefined classes could be offered for personal observations, which on the other hand results in data fuzziness. Since the "pure" personal data is very valuable, maybe the users should not be restricted to tightly structured forms; a decision the offering platforms have to make.
    For the "black sheep" problem: mechanisms need to be established that automatically detect errors and malicious contributions, so that it can be ensured that the information is reliable. This would require the development of automated accuracy-assurance systems and/or intelligent algorithms.
    Also, some kind of trust level for human observers, based on the quality of their previous observations, could be introduced (a tiny sketch of this idea follows below this comment).
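
    A tiny sketch of such a trust level, with scoring and thresholds chosen arbitrarily for illustration only:

    ```python
    # Trust-level sketch: score a contributor by confirmed vs. rejected reports
    # and flag reports from low-trust contributors for manual review.
    def trust_level(confirmed, rejected):
        total = confirmed + rejected
        if total == 0:
            return 0.5  # unknown contributors start neutral
        return confirmed / total

    def needs_review(confirmed, rejected, threshold=0.7):
        return trust_level(confirmed, rejected) < threshold

    print(needs_review(12, 1))  # experienced, mostly confirmed -> False
    print(needs_review(1, 3))   # mostly rejected so far -> True
    ```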

    Replies
    1. If we look at Wikipedia, Wikimapia and OSM: aren't the contributors themselves the instances who correct wrong data entries? One could test that by adding wrong content to OSM and seeing how long it takes until the error is fixed by another mapper.

      As for "trust levels" in ther twitter-sphere we could observe some cases in past disaster situations where some people became highly trusted sources information based on the tweets they sent and based on their twitter profile and the people who followed these particular persons...

  5. What is the importance of Social Media, the concept of "citizens as sensors" and "Neogeography" for Crisis Management and Humanitarian Aid?

  6. Werner Clödy, 9 May 2013 at 16:51

    I would say that reducing unreliable information by using "trust levels" or by having the contributors themselves correct wrong data is a good approach. However, more has to be done, because information must be accurate, particularly in crisis management, where lives are at risk and every mistake could have fatal consequences.

    Replies
    1. Totally agree, these approaches have proven to work. The question remains how to detect wrong information in a crisis situation where the people in charge cannot wait for others to correct the data...
