I just wrote a guest post on the Betahaus blog about the four extraordinary months I spent working in Berlin, which culminated in the deployment of a humanitarian response application for Caritas in Japan. The application was used during the response to the tragic earthquake and tsunami in Japan. You can read the post here: Betahaus English Blog. The application will be going public soon, so watch this space.
Photo by Anna Hodgkinson
An email to the HOT list from Nicholas Chavent earlier this month (March 2011) announced the opportunity to participate in a Mapping Party for the Humanitarian OpenStreetMap Team – yep, a “HOT” Mapping Party. Many OpenStreetMap enthusiasts will be familiar with the notion of a mapping party: local social events with a focus on having fun while contributing to OpenStreetMap, the openly licensed map of the world. But what’s a HOT mapping party? HOT mapping parties are for OSM enthusiasts who are also interested in humanitarian mapping. We all know maps form a critical part of disaster response. But how can HOT and OSM be better prepared for the eventuality of assisting during a disaster? Disaster mapping, in contrast to general OSM mapping, has some specific requirements.
Disaster response must incorporate good geographic data collection practices:
During disasters, mapping serves to enable quick and accurate decision making. But there can be no maps without data. So good disaster response workflows incorporate good geo data collection practices. The HOT Kit is designed to do just this. Over the last year the HOT team has been working in Haiti and has continued to refine and add to the HOT Kit. The HOT Kit merges traditional practices of disaster response coordination with the best that the open source geospatial tool-set has to offer.
In the event of a disaster, mapping sits close to the planning, support and coordination units. Map-makers working in these efforts must be ready to answer any number of questions thrown at them by the coordinators of the disaster response. Being organised and knowing where everything is is therefore critical – and an important goal of the HOT Kit. The kit is organised as five top-level folders under which sits almost everything a HOT team member supporting an operation would need.
Steps during a response will include data collection, cleaning and merging, and product creation. Product creation may involve analysis and summarising. Supporting the operation means having all the data and analysis at your fingertips. It also requires a systematic procedure for collecting geographic data and including it in the map-making process.
The HOT Kit systematizes the steps of map-making during disasters:
So the five root directories of the HOT Kit are:
00_DATA/         <--- geo data ready for use
01_SURVEYS/      <--- data collection
02_PROJECTS/     <--- desktop GIS projects (QGIS being the default)
03_PRODUCTS/     <--- products like PDFs, shapefiles, Garmin files, KML
04_SUPPORT_DOCS/ <--- a kitchen sink of documentation
The OSM community must get to know and help refine the HOT Kit: As the HOT Kit gets used it will continue to change and improve. The HOT Kit must be able to handle changing requirements based on the kind of project or disaster it's being used in. It also needs to elegantly support users of varying skills. Lastly, it needs eyeballs. To help folks get acquainted with the HOT Kit quickly, I have just published a ruby gem: https://github.com/sabman/HOT-Kit-Generator
HOT Kit Generator is only a day-old gem, written during the drive from Brittany to Paris. It's aimed at quickly creating a fresh HOT Kit for a new deployment. It currently only stubs out the folders needed during a HOT deployment, based on a configuration file that defines the directory structure of a HOT Kit. My vision for this generator is to allow pulling distributed web and data resources with a single command to create an up-to-date HOT Kit in seconds, with different configurations depending on the disaster type, geographic region, or the project's goals or length.
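The idea can be sketched in a few lines of plain ruby – note the config keys and folder names below are illustrative placeholders, not the gem's actual file format:

```ruby
require 'yaml'
require 'fileutils'

# A stand-in for the generator's configuration file: it simply
# declares the directory tree a fresh kit should contain.
config = YAML.load(<<~YML)
  kit:
    - 00_DATA
    - 01_SURVEYS
    - 02_PROJECTS
    - 03_PRODUCTS
    - 04_SUPPORT_DOCS
YML

# Stub out the folder skeleton for a new deployment.
root = 'HOT_KIT'
config['kit'].each do |dir|
  FileUtils.mkdir_p(File.join(root, dir))
end
```

Because the structure lives in a config file rather than in code, a different disaster type or project could ship its own tree without touching the generator.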
Give it a try and feel free to fork to get started on some of the TODO items. I would also appreciate any feedback on the idea itself.
On-ground surveys and OSM tags:
Once a HOT team is in the field and collecting data, they should be making the most of the opportunity. This means collecting details that would only be available to someone in close proximity to the resource being mapped. This is where the HOT tagging scheme and forms come in. Over the last few months a number of HOT team members have been developing forms for the OSM Humanitarian Data Model. These, along with the Humanitarian Data Model presets in JOSM, can make the task of data entry after a survey go a lot smoother. Nicholas Chavent's work on HDM is particularly impressive.
Role of remote mappers and tagging:
In an emergency the focus is on saving lives. Depending on the kind of disaster the tagging scheme will vary, and the regular tags in OSM aren't always compatible with the needs of first responders. As the HOT team matures, so will its ability to engage with the regular OSM mappers who are willing to support disaster mappers, often using satellite imagery. As an armchair mapper, instead of mapping for fun you are mapping to create raw data which will form the input to products that can give actionable information to first responders.
So where does Haiti-like remote mapping fit into a disaster response? Mapping a disaster from home can be a very useful activity – useful to the first responders, provided some common sense is applied to the process. For one thing, mappers who want to assist in the work of HOT should look at the current mapping priorities here: Current Humanitarian OSM Team Projects. It's also worth noting that the mapping you do from home can give first responders some very useful base data. In the case of Haiti it was the location of IDP camps, destroyed buildings, blocked roads and landslides. These are all things that first responders can take and use to plan their missions. It should however be noted that this is possible only if the imagery is of high enough resolution. Haiti was, and still is, an exception: sub-metre post-disaster imagery was made public, and as a result the first responders were able to save many lives. We have yet to see such high-resolution, clear imagery for other disasters. So when mapping from home it's better to err on the side of caution and not tag an item if you are unsure. Having said that, there is no doubt that some data is better than no data, so getting as much of the base infrastructure mapped as possible – transport, hydrology, admin boundaries and populated areas – is a good thing.
Long live the HOT Mapping party:
Finally, one man without whom the HOT mapping party in Beg-Meil would never have happened is Roderic Bera. He was behind the smooth running of the weekend, having organised everything from food to accommodation to transport – it was all very well thought out. Thank you Rod. Then of course Nicholas and Frederic worked late into the night on Friday to get everything to do with the mapping ready. Fred's mock reports of a humanitarian crisis in Brittany and Beg-Meil read like a true forced migration.
Thanks to all who came and made the HOT mapping party possible, in order of appearance:
- Nicholas Chavent
- Roderic Bera
- Frederic Bonifas
- Manuel Robert
- Maxime Leguillier
- Anna Hodgkinson
- Joseph Reeves
- Rodolphe Quiedeville
A special thanks to Manuel for letting me hang at his place in Paris on Sunday night. Good luck to him and Maxime in Haiti. Until next time.
No doubt about it – Berlin has a fantastic tech scene and this weekend it hosted the 2nd Global Random Hacks of Kindness. I had a slow start to the morning so missed out on the introduction to the problem statements. But when I arrived I began looking around for projects to work on.
Having looked over the problem statements I had thought I'd be working on Risk in a Box – a project that we had done some consulting work for in the past. However, upon arrival I had a look at the problems proposed by the local organizers. One that stood out due to its technical challenges was Caritas Germany's field mapping system.
A team was contemplating taking on this problem and they gladly allowed me to join them, despite the fact I spoke no German. More on that later.
Brainstorming the idea further led to the expected friction between team members. But everyone was very civilised – except maybe when I proposed the database – I heard some German words fly around. The words either expressed extreme compliments or the opposite :-) NoSQL will do that.
1. What are we building?
Our first argument was over what we were building. Was it a web application, a database, a native mobile app, or a mobile web app with HTML5 local storage? The argument was resolved with an agreement to build a core API with a database backend, with loosely coupled clients on top that could work in offline mode if needed. For the purpose of the hackathon we would build a mobile client.
2. What mobile platform?
As alluded to above, eyebrows wrinkled and tempers flared (just kidding) over iPhone vs. Android vs. HTML5 (local storage). Given that we had an iPhone developer on the team (Engin Kurutepe), we decided the HTML5 and Android clients could easily be built later, but for the proof of concept we would build an iPhone client. This client would update the database via the API in real time using the REST web service and have local storage for offline operations. Another client would be built for the office or operations room of an emergency coordination centre. This would show the field updates in real time using websockets.
3. What database?
Then came the question of what database to use. It had to support some degree of geospatial data and indexing. We could have gone with PostgreSQL+PostGIS or SQLite+SpatiaLite or even MySQL, but I intervened. As if we hadn't already exhausted all available buzzwords, I suggested MongoDB. Yes, that's right, NoSQL. After some mumbling we agreed there were advantages to a document store – especially when the data model was somewhat nebulous.
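To illustrate why a schemaless document store appealed to us, here is a sketch in plain ruby, with hashes standing in for MongoDB documents (the field names and events are invented for this example):

```ruby
# Each "document" is just a hash; events can carry different
# fields without any schema migration.
events = []

events << { 'tags' => ['water', 'urgent'],
            'description' => 'Well contaminated',
            'location' => { 'lat' => 52.52, 'lon' => 13.40 } }

# A later event adds a photo attachment; nothing else changes.
events << { 'tags' => ['shelter'],
            'description' => 'Tent camp established',
            'location' => { 'lat' => 52.50, 'lon' => 13.35 },
            'photo' => 'camp.jpg' }

# Querying still works across the heterogeneous records.
with_photos = events.select { |e| e.key?('photo') }
```

When the data model is still nebulous, this flexibility is exactly what you want – the schema can firm up later without blocking the hackathon.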
4. The data model
What were we storing in our database? … hummm … after some fluffing around over what we interpreted from the problem statement, we decided to call Gernot Ritter, the Disaster Relief Coordinator at Caritas. He was really excited to hear about our idea to build the real-time application. He recommended we make the data model as flexible as possible. So for the sake of simplicity we went with storing Events, which would have the following attributes:
Event:
- tags
- description
- location
- photo attachment
Some more discussion on the finer details of the data model began, but we felt this was enough to start writing code. And so we wrote code. The initial commit was a Rails app, and by the 3rd commit we had decided Rails was overkill and we would build a lightweight API in Sinatra which anyone could build on. Thus I began work on the Sinatra app with Max, JoJo and Florian, while Klas started to investigate the web client and Engin sat in a corner writing Objective-C for an iPhone client.
The coding stints were very productive thanks to the great setup by the organisers and sponsors of the event. The food was great. One of the rooms had an Xbox 360 + Kinect set up so we could take a break and get some physical exercise.
Some photos from the weekend: http://www.flickr.com/photos/rhokberlin
I gave a quick talk:
Sadly I didn't get a chance to meet all the judges but hope to catch up with them while I am in Berlin. Here is a shot of the judges. We were very pleased to have our project judged the winner – though it took me a second to register that we had won, since the announcement was in German.
The winning team :-)
With our most recent project we came another step closer to the realisation that government geospatial projects are moving out of the '90s. We were asked by a client to build an online modelling tool for evaluating the risk resulting from a natural hazard such as an earthquake. Understanding risk requires identifying the hazard (e.g. a tsunami) and the exposure (e.g. the population of a coastal city).
Data on hazards and exposure forms the input to risk and impact models. Such data is normally under the custodianship of national governments, with the responsibility of custodianship often distributed across departments and agencies. Bringing the data together into a single modelling tool is not just a technically challenging task but also requires cooperation and sharing of data across agencies. And anyone who's worked in government knows this is often harder than solving the engineering challenges. We were very fortunate to have the involved custodians onboard with an eagerness to learn and deploy new technologies.
Back in the '90s such hazard models would be run on a desktop, with access to the data via a local disc copied from all the DVDs posted by the data custodians using snail mail.
This was a fine solution back in the '90s, but in the 21st century, quite frankly, it starts to look embarrassing. Our proposed solution was a distributed architecture which pulled hazard and exposure data using OGC web services and ran the model on a modelling server that could be scaled. The modelling server pushed the results of the run to another OGC-compliant server. This is a simplified drawing of the architecture:
Once the model run was complete the client was notified via websockets. Here is the final application in action:
Cool eh! With all these moving parts we needed to ensure that the various services were being monitored. Enter Monit. If you haven't heard of Monit, check it out here: System Monitoring With Monit. It keeps tabs on your server daemons. If you have any questions feel free to post in the comments or get in touch via email.
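To give a flavour of what a Monit check looks like, here is a minimal stanza for watching a web server process – the paths, port and service name are illustrative, not from our actual deployment:

```
check process nginx with pidfile /var/run/nginx.pid
  start program = "/etc/init.d/nginx start"
  stop program  = "/etc/init.d/nginx stop"
  if failed port 80 protocol http then restart
```

Monit polls the pidfile and the HTTP port, and restarts the daemon if either check fails – exactly the kind of babysitting a multi-service architecture needs.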
… and I am in Berlin, if you are in town come say hi.
Mapping and Planning Support (MAPS) is a volunteer group based out of Canberra that provides Geographic Information Systems (GIS) support to emergency services during a “major” disaster. It was started in 2005 by Frank Blanchfield and Ian Batley who foresaw the need for a stand-by volunteer mapping team that could spring into action during a major natural disaster. MAPS was strongly supported by Adam Atkinson (RFS) and Steve Forbes (SES/RFS). By the time I joined MAPS in 2008 the team had already had several dozen deployments for a range of emergency activities: everything from fires, floods and flus. MAPS volunteers come from a multi-talented pool of professionals and they are the Australian equivalent of MapAction. There is a great presentation by Ian Batley on the volunteer technical community model of MAPS.
You can’t really talk about MAPS without making reference to the Black Saturday Bushfires. The Black Saturday firestorm of 13th February 2009 was Australia’s worst natural disaster. The fires were intense and ferocious, and led to over 170 fatalities and damage to over 3000 properties. Some towns were nearly wiped off the earth. Just two days after Black Saturday the first MAPS team was deployed on the ground, working with Victoria Police and other state and federal agencies to assist. That operation continued for 9 weeks. During the deployment MAPS assisted Victoria Police in the coordination of search and rescue efforts. Accomplishments included developing map products for carrying out the search of over 3000 property parcels in a period of 2 weeks; developing workflows, data quality control routines and databases for capturing, storing, and analysing data collected by field search and rescue teams; and developing mapping products to aid the reporting of fatalities. Data captured included spatially referenced photographs, high resolution aerial imagery and real-time data from GPS-enabled handhelds and cameras. Software used included ArcPad, ArcGIS, ArcSDE, SQL Server and MS Access. I also wrote custom scripts (in Ruby) to process over 9000 georeferenced photographs.
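The core trick behind those photo scripts can be sketched in a few lines of ruby – here grouping georeferenced photos into coarse lat/lon grid cells so each search team's parcel gets its own batch. The filenames and coordinates below are invented for the sketch, not data from the deployment:

```ruby
# Photos with coordinates already extracted from their EXIF headers
photos = [
  { file: 'IMG_0001.jpg', lat: -37.541, lon: 145.121 },
  { file: 'IMG_0002.jpg', lat: -37.542, lon: 145.122 },
  { file: 'IMG_0003.jpg', lat: -37.651, lon: 145.333 }
]

# Bucket photos into 0.01-degree grid cells (roughly 1 km) so each
# cell's photos can be handed to the team searching that area.
grid = photos.group_by do |p|
  [(p[:lat] / 0.01).floor, (p[:lon] / 0.01).floor]
end
```

The first two photos fall into the same cell, the third into its own – with 9000 photos, this kind of spatial bucketing is what made the batch hand-offs manageable.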
After the MAPS deployment to Victoria we started a working group to investigate the potential for using Open Source tools along with the proprietary applications that are currently used.
Then in January 2010 another natural disaster caught the world’s attention. I wanted to help, but this time I was several thousand miles away from the disaster. I have never been to Haiti, yet I was able to contribute towards the mapping of a country that had had its national mapping agency completely destroyed and, sadly, most of its staff killed. The work was done sitting in my home office late at night, digitising aerial photographs, interspersed with some wiki gardening. This time I was not working under any formal organisation but rather with a loose community of open source and mapping enthusiasts. And due to my background of working with MAPS I was acutely aware of the value of mapping to first responders. Thanks to the OSM community this work was being used by responders in Haiti. After Haiti I gave a few presentations on how this data was collected, used, and will play a key role in the future of Haiti. That talk is on Slideshare.
After Haiti I realised that although MAPS can handle disasters in Australia and the Asia-Pacific region, if they had formal links with the OpenStreetMap, CrisisCommons and CrisisMappers communities they could learn a lot from each other, and the collaboration would help MAPS become much more versatile. As I was thinking about formalising this relationship, the time for the MAPS Autumn Training came along. This provided an ideal opportunity to introduce OSM to MAPS. As it turned out, other MAPS coordinators were also thinking the same thing. So last weekend we had a training day at the ACT State Emergency Services HQ and we included OpenStreetMap – a crowd-sourced, open and free database of spatial data. AFAIK this was the first time in the history of MAPS, and of the Australian volunteer emergency community in general, that a crowd-sourced database was used. This is the first step by MAPS to engage with the OSM community, and over the coming weeks the working group will be meeting regularly to start pulling together the tools and expertise needed to make OSM a regular source of data and information. My presentation slides and tutorial videos are below:
Here is a video showing how to print walking papers and collect data.
Here is the video showing the use of walking papers to add data to OSM.
Since Haiti I have been doing a lot of reading on disaster response and recovery. Today I needed to load a GeoRSS feed from USGS. They publish the location of earthquakes as a GeoRSS feed. They also publish a product called ShakeMaps. This is a fantastic product since it provides a map of the actual “ground shaking”, not just the simplistic magnitude and epicentre. Creating a ShakeMap is a function of local geology, epicentre, magnitude and many other variables. You can read about it here. ShakeMaps are thus much more useful from a disaster response and recovery point of view. For example, if you have the necessary data on geology and population exposure you can run a model that simulates an earthquake over a region to predict the likely effect on the population.
Getting back to the GeoRSS – the ShakeMap feed allows us to integrate this into any web mapping application, creating interesting mashups, e.g. overlaying it with socio-economic, population exposure or vulnerability data to show the simulation results I mentioned above.
Here is the code for the routes in Sinatra:
require 'rubygems'
require 'sinatra'
require 'open-uri'

get '/' do
  haml :shakemaps
end

get '/proxy' do
  open params["url"]
end
When the client hits our root URL (/) we render the OpenLayers map with the USGS ShakeMap. We may also want to let people add other feeds – this is where the /proxy route comes in. It looks for the url parameter at the end of the URI and simply opens it; this is the GeoRSS XML feed. Finally there is a form allowing us to point to other GeoRSS feeds.
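On the server side you could also parse the feed yourself rather than just proxying it. A rough sketch using REXML from ruby's standard library – the feed XML is inlined here rather than fetched, and the entry is invented:

```ruby
require 'rexml/document'

# A minimal stand-in for a USGS GeoRSS/Atom feed
georss = <<~XML
  <feed xmlns="http://www.w3.org/2005/Atom"
        xmlns:georss="http://www.georss.org/georss">
    <entry>
      <title>M 6.1 - offshore</title>
      <georss:point>38.30 142.37</georss:point>
    </entry>
  </feed>
XML

# Pull each georss:point out as a [lat, lon] pair
doc = REXML::Document.new(georss)
points = []
doc.elements.each('//georss:point') do |el|
  lat, lon = el.text.split.map(&:to_f)
  points << [lat, lon]
end
```

With the coordinates in hand, you could push them into your own map layer or cross-reference them with exposure data.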
The last few years have been difficult for Pakistani citizens. However, the ordinary citizen is so removed from the geopolitical forces driving the conflict in Pakistan that it is hard to imagine how they might contribute to resolving it. In these circumstances I was delighted to see the progress that the OpenStreetMap community in Multan has been making towards mapping their city. Here is a video from Aleks created from the GPS tracks that Kashif has been editing for the OSM community in Multan. Keep it up, folks!
OSM in Pakistan hasn’t been very active in the past. However, the future is looking bright. This year during State of the Map in Amsterdam you will run into a university professor – but don’t let the absent-minded-professor look fool you. This is a man on a mission – as a professor at the largest university in Multan (Bahauddin Zakariya University) he has managed to generate a groundswell of students and faculty to start the systematic mapping of Multan. He has just received a small (as in tiny) grant from his university to help kick-start the mapping of Multan.
There is a lot of work ahead, and being in Amsterdam (thanks to the generous scholarship from the Open Society Institute) he will be keeping his ears open to learn as much as possible from seasoned OSMers – especially those from countries at a similar stage of mapping as Pakistan.
So get to know this face, and if you see it, introduce yourself and share ideas. He will be giving his lightning talk along with the other scholarship holders – add it to your calendar here.
You can check out the planned mapping parties in Multan, Pakistan. Also feel free to get in touch with Asif via email: mianasifrasul at gmail.com
Note that this post is for folks who are new to ruby. I have to admit a lot of geospatial developers have not been using ruby due to the poor support for geo libraries – so it's understandable. In this post I will show how to start parsing the FOSS4G 2009 schedule. Let's start by gathering our tools. I am going to assume you have ruby installed. If you don't have ruby installed, don't fret: there are now one-click installers – yes, even for Windows: http://www.ruby-lang.org/en/downloads/.
Reading YAML: The first thing we need to do is read the ‘official’ FOSS4G schedule YAML file that's hosted at http://2009.foss4g.org/schedule.yml – as of writing it contains data for workshops and tutorials. We will connect to the URL using the OpenURI module, which is part of ruby's standard library. So let's try something simple, like reading the data into a Hash:
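Something along these lines does the job – in the live version the YAML comes over HTTP via open-uri, but a small inline sample (with invented sessions) stands in here so the snippet runs offline:

```ruby
require 'yaml'

# Live version (commented out so the example works offline):
#   require 'open-uri'
#   yaml_text = open('http://2009.foss4g.org/schedule.yml').read

# Inline stand-in for the schedule file:
yaml_text = <<~YML
  workshops:
    - title: Introduction to PostGIS
      room: W1
  tutorials:
    - title: GeoServer styling
      room: T2
YML

data = YAML.load(yaml_text)
```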
The data from the schedule is now loaded into the hash named data. If you are not familiar with ruby hashes, check out the class docs: http://www.ruby-doc.org/core/classes/Hash.html. Also, if you listen to the surreal voice in these videos you will fall in love with hashes forever – don't say I didn't warn you.
OK, moving on: let's poke around this data structure containing the FOSS4G schedule. Let's say I want to iterate over each tutorial and workshop and create an iCal entry that we can share with our friends – so they know which sessions they may be interested in attending. Useful, eh? OK, maybe not that much, but it's a start. The code for iteration would look something like this:
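Based on that description, the iteration would look roughly like this – the data hash below is a stand-in for the parsed schedule from the previous step, with invented session entries:

```ruby
# Stand-in for the parsed schedule hash
data = {
  'workshops' => [{ 'title' => 'Introduction to PostGIS', 'room' => 'W1' }],
  'tutorials' => [{ 'title' => 'GeoServer styling', 'room' => 'T2' }]
}

# Walk every session type and every session within it, building
# one human-readable line per session.
sessions = []
data.each do |session_type, entries|
  entries.each do |entry|
    sessions << "#{session_type}: #{entry['title']} (room #{entry['room']})"
  end
end

puts sessions
```

Each of those lines is the raw material for an iCal entry – title, room, and (once the schedule gains times) start and end stamps.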
Next I’ll show you how to create an iCal entry and perhaps email it to your friends.