=Paper=
{{Paper
|id=Vol-1952/Disaster
|storemode=property
|title=Open Data And Disaster Management
|pdfUrl=https://ceur-ws.org/Vol-1952/Disaster.pdf
|volume=Vol-1952
}}
==Open Data And Disaster Management==
Ditsuhi Iskandaryan, Universitat Jaume I, Spain, ditsuhiiskandaryan@yahoo.com

Abstract

This paper describes the role of open data in disaster management. The frequency and intensity of disasters force us to focus on management and planning. Day by day more data is becoming open, which adds transparency and helps decision makers react faster in extreme situations. The method was based on research and a comparison of countries, tools, services and apps. The results show that, with new technologies, continuous improvements are being made to cover existing gaps, such as handling big data and analysing it faster.

1 Introduction

Disasters such as earthquakes, floods and hurricanes are events that occur suddenly and cause human, economic and environmental damage. Population growth, the spread of disease and climate change affect the frequency and intensity of disasters. The 2010 Haiti earthquake took 220,000-336,000 lives [7], and Hurricane Katrina took 1,245-1,836 lives [8]. To decrease these numbers and support sustainable development, it is important to concentrate on disaster management. Organizations and agencies working on disaster management need to collaborate with partners, find more data and find ways to mitigate losses and risk. Open data is one of the valuable sources that play a crucial role in disaster management. Open data is defined as data that can be freely used, re-used and shared by anyone. To be open, data must be open both legally and technically. The former refers to an open data licence that allows anyone to freely access, reuse and redistribute the data [9,10]; there are many such licences, including Creative Commons licences, Open Data Commons and the Open Government Licence [3]. Technically open means that the data must be machine-readable and available in bulk [9,10].

2 Use cases

During the 2010 Haiti earthquake, the lack of data caused many problems; it was difficult to find people who needed help. Within two days Google, GeoEye and DigitalGlobe jointly obtained high-resolution imagery, which was widely used. Data were still scarce, however, and volunteers realized they could help remotely by mapping online. Within a few weeks nearly 10,000 edits had been made in OpenStreetMap for the Port-au-Prince region. Another widely used service was Ushahidi. It was created in 2008 in Kenya to report election violence, but its use spread, and during the Haiti earthquake many volunteers submitted SMS and MMS reports that were geotagged on an interactive map. The ability to report by SMS was one of Ushahidi's main advantages, because only 11% of the population had internet access while more than 30% had mobile phones [1,6].

Another case comes from Colombia. At the beginning of April 2017, a massive flood and landslide almost destroyed the town of Mocoa (Putumayo) in southern Colombia. More than 200 people died and 500 families were affected. There were issues with the available open data, and it was difficult to generate data in a short time; some authorities do not have the capacity to generate data at the speed a disaster demands. The OpenStreetMap and humanitarian mapping communities worked together to map the affected zone, using social networks and data available on official portals.
After several days they finished mapping the area, and the output was used even by the official authorities to save lives and distribute resources [11,12,13].

3 Challenges and barriers

Although the role of open data is crucial, many challenges still need to be solved. They relate to access, usage, dissemination and data collection. For decision making it is important to have up-to-date information about emergency situations, to collect data regularly and to filter these huge volumes of data. Regarding the latter, it is worth mentioning the Karnataka State Natural Disaster Monitoring Centre (KSNDMC) in India. India is considered a country vulnerable to disasters; this year alone more than ten disasters occurred, including earthquakes and landslides. The main function of the KSNDMC is to collect data, analyse them and visualize the output on a map. There were problems with handling large data volumes, performing real-time analysis and conveying the results to end users, but these were later overcome using ESRI technologies such as the ArcSDE spatial database, ArcGIS Desktop, ArcGIS Server and web GIS technology [14].

It is also necessary to mention the accuracy, completeness and reliability of volunteered geographic information. Volunteers using their mobile devices can provide near-real-time information about events (including electronic reports, pictures, videos and locations); they are increasingly active, and the amount of information they provide is growing rapidly. However, they have different backgrounds and may report information about a geographic problem without deep knowledge of it. It is therefore very important for decision makers to check accuracy, and one way is to compare these data with a reference dataset (for example, one provided by the government) and then analyse the differences [2]; a minimal sketch of such a comparison is given at the end of this section. During the Haiti earthquake there was duplication of effort, and combining data created with different software was a problem. Researchers from the University of Virginia worked on a project whose aim is to find ways of increasing the trustworthiness of crowdsourced data. On their website, the first strategy is to classify group membership (e.g. the police are more trusted than an unknown group). The second strategy concerns ranking of posts: each post carries a five-star rating that users can assign after reading it. They also suggested commenting on reports and rating viewers. These suggestions are mainly technical and depend largely on viewer activity [5]. A team from the University of Münster conducted a survey to understand which types of crowdsourcing are most usable, asking experts from different organizations involved in disaster management. The results showed that most disaster managers used Twitter, but mostly for broadcasting rather than for collecting data. Half of the participants were aware of Ushahidi, but only 21% used it. The experts gave several reasons, such as uncertainty, trust and semantic problems, and said they needed improvements in training volunteers and in filtering and rating information [4].

As long as disasters occur, disaster management will remain a relevant discipline to study and improve. One significant achievement is the creation of free software such as InaSAFE [15] and WebSAFE [17], which use data from scientists, local governments and communities to estimate the consequences of future disasters.
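As a rough illustration of the comparison described above (checking volunteered reports against a reference dataset and analysing the differences), the following Python sketch matches volunteered point locations to an official dataset and summarises completeness and positional error. It is a minimal sketch under stated assumptions: the coordinates, function names and the 50 m matching threshold are illustrative and are not taken from [2].

<pre>
# Minimal sketch: comparing volunteered geographic information (VGI) points
# against a reference (e.g. government) dataset. Both datasets are assumed
# to be lists of (lat, lon) tuples for the same feature type (e.g. shelters).
# Names and the 50 m threshold are illustrative assumptions.
from math import radians, sin, cos, asin, sqrt

def haversine_m(p1, p2):
    """Great-circle distance between two (lat, lon) points in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (*p1, *p2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def compare_vgi(vgi_points, reference_points, max_dist_m=50.0):
    """Match each reference feature to the nearest VGI point and summarise
    completeness (share of reference features found) and positional error."""
    errors = []
    for ref in reference_points:
        nearest = min((haversine_m(ref, v) for v in vgi_points), default=float("inf"))
        if nearest <= max_dist_m:
            errors.append(nearest)
    completeness = len(errors) / len(reference_points) if reference_points else 0.0
    mean_error = sum(errors) / len(errors) if errors else None
    return {"completeness": completeness, "mean_error_m": mean_error}

# Example: two volunteered shelter locations checked against three official ones.
vgi = [(18.5392, -72.3350), (18.5401, -72.3367)]
official = [(18.5391, -72.3351), (18.5403, -72.3369), (18.5500, -72.3400)]
print(compare_vgi(vgi, official))
</pre>

In this toy example two of the three official features have a volunteered counterpart within 50 m, so completeness is about 0.67; in practice the differences would then be analysed further, as suggested in [2].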
4 Conclusion

In conclusion, open data is a valuable tool for disaster management: it can help make quick and effective decisions. When data are not fully available, volunteered geographic information combined with existing open data can save lives and resources and mitigate losses. We also have to take into account that early-warning systems can protect huge numbers of lives. During Cyclone Phailin in 2013, when a new early-warning system was in place, forty-five people died, whereas a storm of the same size in 1999 killed 10,000 people [16]. This shows the benefit of early-warning systems.

References

[1] J. Heinzelman and C. Waters, "Crowdsourcing Crisis Information in Disaster-Affected Haiti," United States Institute of Peace, Special Report, 2010.
[2] S. Jackson, W. Mullen, P. Agouris, A. Crooks, A. Croitoru, and A. Stefanidis, "Assessing Completeness and Spatial Error of Features in Volunteered Geographic Information," ISPRS Int. J. Geo-Inf., vol. 2, no. 2, pp. 507–530, 2013.
[3] N. Korn and C. Oppenheim, "Licensing Open Data: A Practical Guide," pp. 1–8, 2011.
[4] J. Ortmann, M. Limbu, D. Wang, and T. Kauppinen, "Crowdsourcing Linked Open Data for Disaster Management," in Proc. Terra Cognita Workshop, 10th Int. Semantic Web Conf., pp. 11–22, 2011.
[5] A. C. Weaver, J. P. Boyle, and L. I. Besaleva, "Applications and Trust Issues when Crowdsourcing a Crisis," in Proc. 21st Int. Conf. on Computer Communications and Networks (ICCCN 2012), 2012.
[6] M. Zook, M. Graham, T. Shelton, and S. Gorman, "Volunteered Geographic Information and Crowdsourcing Disaster Relief: A Case Study of the Haitian Earthquake," World Med. Health Policy, vol. 2, no. 2, pp. 6–32, 2010.
[7] https://en.wikipedia.org/wiki/2010_Haiti_earthquake
[8] https://en.wikipedia.org/wiki/Hurricane_Katrina
[9] http://opendatahandbook.org/guide/en/what-is-open-data/
[10] http://opendatahandbook.org/glossary/en/terms/open-data/
[11] http://blog.openstreetmap.co/2017/04/02/llamado_mocoa/
[12] https://www.humanitarianresponse.info/node/77/search?search=mocoa
[13] http://pierzen.dev.openstreetmap.org/hot/leaflet/OSM-Compare-before-after.html#14/1.1455/-76.6512
[14] http://geoithub.com/esri-indias-arcgis-disaster-management-karnataka/
[15] http://inasafe.org/
[16] https://www.bloomberg.com/view/articles/2014-02-18/seven-steps-to-surviving-a-disaster
[17] http://www.rappler.com/science-nature/environment/62263-indonesia-philippines-disaster-risk-reduction-gis