by Verena Flörchinger
Social media is deeply embedded in our daily lives. Twitter, Facebook, and other communication technologies are constantly changing our communication habits. We no longer share information only in face-to-face contact, but also via a multitude of new forms of social media. Devices like smartphones incorporate diverse sensors, such as cameras, microphones, and GPS, which enable a new level of information distribution.
This new technology, combined with social media, allows users to share information with a huge and growing community. Twitter, for example, makes it possible to share instant messages with the public. Via Instagram and YouTube, photos and videos, respectively, become accessible to everyone.
Beyond the information we knowingly share, there is also metadata about location, time, and so forth. Within these new forms of communication and hidden data lies a big opportunity for crisis mapping and disaster management, provided the V&TC (volunteer and technical communities) are capable of extracting information nuggets from the big data available. These new types of information usage and new technologies can be useful for situational awareness, the distribution of alerts, and further functions of disaster management.
Despite these possibilities, there is also a wide range of challenges and problems. It is difficult to detect events while data volumes rise every day. The first hurdle is to extract useful information from a large amount of data. Hence, there is a need for new methods for the detection and characterization of events, disasters, and crises. In academia as well as in volunteer communities, increased effort in this direction is already visible. With the help of the growing community of volunteers, it is possible to process data regarding an event.
An example would be the extraction of relevant Twitter messages that provide data worth using for disaster management. The volume and velocity of the data make it difficult to react promptly, because the analysis of big data is highly time-consuming and resource-intensive. To tackle the problem of real-time data processing, which is crucial during the first hours of an event, it is necessary to develop programmatic filters for such processes. When an event seems relevant for disaster management, the next step is to categorize all the information. Within this phase, data quality and verification have to be addressed before the analysis starts. If an automatic method is developed, further questions arise: Is it possible to use one method for all events of the same type? How is the information redistributed to the public? The latter question is especially relevant when looking at countries and regions where access to these kinds of media is restricted or not widespread due to financial hurdles.
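To make the idea of such programmatic filters concrete, here is a minimal sketch of a keyword-based first-pass filter over a stream of short messages. The keyword list, message format, and function names are illustrative assumptions, not part of any specific platform's API; real systems would combine such cheap filters with more sophisticated classification.

```python
# Minimal sketch of a keyword-based filter for a message stream.
# The keyword set and message format are illustrative assumptions,
# not tied to any specific social media API.

DISASTER_KEYWORDS = {"earthquake", "flood", "tornado", "evacuation", "wildfire"}

def is_potentially_relevant(message: str) -> bool:
    """Flag a message if it contains at least one disaster-related keyword."""
    words = {w.strip(".,!?#").lower() for w in message.split()}
    return not words.isdisjoint(DISASTER_KEYWORDS)

def filter_stream(messages):
    """Yield only messages that pass the cheap keyword check."""
    for msg in messages:
        if is_potentially_relevant(msg):
            yield msg

stream = [
    "Having coffee downtown",
    "Earthquake felt near the city center, buildings shaking",
    "Flood warning issued for the river district #flood",
]
print(list(filter_stream(stream)))
```

A filter like this is fast enough for real-time use, but it only reduces the volume; questions of data quality and verification, as discussed above, remain.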
Besides the challenges of detection, coordination, and categorization, there is the recurring and important factor of data privacy: Which mechanisms have to be considered if we want to respect individuals’ privacy while using social media data?
In the context of disaster management, the high potential of social media is applied almost exclusively to disaster response. Can social media be used during the phases of resilience, mitigation, and preparedness as well? Could the next task be to build the corresponding tools?
Selected Literature:
ADAM, N. R., SHAFIQ, B., & STAFFIN, R. (2012). Spatial Computing and Social Media in the Context of Disaster Management. IEEE Intelligent Systems, 27(6), 90-96.
IMRAN, M., ELBASSUONI, S., CASTILLO, C., DIAZ, F., & MEIER, P. (2013). Extracting Information Nuggets from Disaster-Related Messages in Social Media. 10th International ISCRAM Conference, Baden-Baden, Germany, 1-10.
MEIER, P. (2013). Analysis of Multimedia Shared in Millions of Tweets After Tornado (Updated). iRevolution. From innovation to Revolution. Online: http://irevolution.net/ (09.06.2013).