Tag Archives: Robert Munro

The Munro Report: Crowdsourcing and Humanitarian Relief

This week in class, we were fortunate to have Robert Munro as a guest lecturer (learn about him here). Munro is a “computational linguist working in communication technologies,” which makes sense in the context of the case study he wrote about Mission 4636, a humanitarian aid effort that utilized crowdsourcing and text messages in post-2010-earthquake Haiti.

Munro gave us several examples of and lessons learned from humanitarian aid efforts utilizing crowdsourcing to process information: Pakreport (after the 2010 floods in Pakistan), the Libya Crisis Map (to follow events occurring in the country as part of the Arab Spring), and most recently, the Sandy Mapmill (to assess damage following Hurricane Sandy).

From the presentation, it seems that crowdsourcing for Pakreport was relatively efficient because of local people’s involvement, but that crowdsourcing for the Libya Crisis Map was a failure. The failure arose because many locals did not want to be associated with the project out of fear for their safety. Sandy Mapmill might be considered to fall in between the other two: volunteer-based damage assessment was greatly useful, but it was determined that paying professionals would have been more accurate and cost-efficient than crowdsourcing. In the case of Mission 4636, the same basic lesson applies – drawing on the knowledge of locals (and/or members of the diaspora) was vital to the success of the project because they are more familiar with both the language and the locales than outsiders. It was clear that non-Haitians processed far fewer pieces of information than Haitians or members of the Haitian diaspora.

One point Munro hammered home was that data should always be handled privately to protect the identities of people in disaster zones, and that public mapping is generally discouraged for the same reason.

Another interesting tool mentioned was Natural Language Processing (NLP), automated language processing that makes it possible to handle large volumes of text. Of course, there are flaws, as with any automated system – translations can lose context, and some languages are not yet supported – but NLP seems to be a valuable tool when there is simply too much information to process by hand, and it may expedite humanitarian relief efforts.
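Just to make the idea more concrete, here is a minimal sketch (in Python) of what automated message triage could look like: a simple bag-of-words classifier that routes already-translated messages into aid categories. The categories and training messages are entirely made up for illustration – this is not the actual Mission 4636 pipeline, which relied heavily on human volunteers.

```python
# Illustrative sketch of NLP-based message triage (NOT the actual Mission 4636
# system): a bag-of-words classifier routes incoming SMS into aid categories.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hand-made training set; a real deployment would need thousands of
# locally translated, human-labeled messages.
train_messages = [
    "we need clean drinking water at the camp",
    "my brother has a broken leg and needs a doctor",
    "no food for three days in our neighborhood",
    "people trapped under the collapsed school building",
]
train_labels = ["water", "medical", "food", "search_and_rescue"]

classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(train_messages, train_labels)

# Route a new (already translated) message to a category for responders.
incoming = ["my sister has a broken arm and needs a doctor"]
print(classifier.predict(incoming))  # expected: ['medical']
```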

The presentation was very interesting to me since I had never even considered that crowdsourcing could be used in humanitarian aid efforts. I’m glad that Mission 4636 showed that “insiders” have more helpful information and can be of more use to relief efforts than “outsiders.” This isn’t to say that the “outsiders” made no contribution; they definitely helped, but the majority of the information was processed rapidly and mostly accurately by those who knew the disaster zone.


Lessons Learned: Don’t Believe the Hype

ICT4D looks to the future to try to address problems as ancient as rebuilding after a disaster and needs as basic as a child’s need for food during a famine. The staying power of these problems makes the world seem incredibly bleak and can foster a strong sense of alienation that only grows as the divide between the haves and the have-nots expands. The successes and failures of our field offer a multitude of lessons about what we are capable of and what we still fall victim to when setting out to change whatever small part of the world we can.

  • 1.) Very smart people can do very stupid things, so proceed with caution and humility, even if you’re sure whatever you are dealing with is wonderful. I think for many of us, the bleakness of the development world fosters an intense desire to find the silver bullet and turn the whole world on its head. It’s easy to get excited about a new project, or innovator, or idea of your own and get tunnel vision. Good projects have failed because the leaders thought they had such a universal or perfect idea that they didn’t need pilots, and only saw the flaws in their work once it had been deployed en masse (or, alternatively, they failed to get funding because they didn’t pilot). Bad projects have gotten huge through sheer force of personality and media manipulation, only to crash and burn, succeeding not only in not helping anyone at all, but often causing harm. Research, feedback, and repeated monitoring and evaluation take time and money, but skipping them is not a shortcut worth taking, especially if the justification is “Of course my _________ will work, it’s perfect. I know best.”
  • 2.) Leading off of the first point, projects and initiatives need to get target community participation whenever and wherever they can. Every project seems to benefit from increased target community participation, input, and eventual management, and many suffer for lack of it. If a project can get stakeholder participation, I feel it should embrace it, and if it can’t, the leaders need to take a step back and make sure that lack of participation isn’t the community signaling that the intervention is unnecessary or unwanted.

As far as tools, I feel my first foray into mapping, facilitated by this class and the Red Cross, will help immensely in my future career, which mostly concerns determining the most efficient and equitable means of distributing public goods in states with limited funds, and how to prioritize development. Being able to access and interpret more empirical data through mapping is vital, especially in areas that lack official and up-to-date maps. More philosophically, the nuances of development that we’ve been shown in this class have, I believe, made me a more thoughtful person. Before Rob Munro, I never would have considered the issues surrounding public mapping, even though looking back it seems so intuitive that there are very real concerns with publishing such sensitive information. There are so many causes and effects and realities of life on the ground that we miss out on when just looking at raw data and looking for quick interventions, and the cautionary tales presented in our class have really helped me proceed more cautiously and less dogmatically than I would have before.

I think in general, the best frameworks for tackling a problem involve enabling existing community initiatives and focusing on bottom-up projects. While there are of course many things (especially in physical infrastructure) that must be financed and implemented on a governmental or higher level, I feel all projects would benefit from community involvement and that community voice should be the main driver of project creation and implementation. The mantra of “if we drop it in, they will come” has largely failed, and the most successfully utilized initiatives have been those that already had clear demand. Grassroots projects have the benefit of already containing a number of dedicated individuals who have demonstrated an intense desire for the intervention and committed time or money to see it through. By focusing on what target communities are telling us they need, rather than us telling them what they need and what we’re going to give them, I think we can decrease the paternalistic flavor of many ICT4D programs and increase both the sustainability and benefit of interventions.


Automated Texting Services for Low-Resource Languages

Following our class period with Robert Munro, I found myself browsing through his Twitter and found an article describing his PhD topic in an August 9th Tweet. Within the article, he elaborates on some of the concepts discussed in class; as he explains, many of the 5,000+ languages of the world are being written for the first time ever with the proliferation of mobile telephony, but the technology to process these languages cannot keep up. Compounding the problem, these phone users are of varied literacy levels, making for spelling inconsistencies among users. However, he concludes that automated information systems can pull out the words that are least likely to vary in spelling (e.g., people, places, organizations) and examine subword variation by identifying affixes within words as well as accounting for phonological or orthographic variation (e.g., recognize vs. recognise). The article goes on to provide more technical prescriptions for automated text response services, and he even links, in a separate Tweet, to another article describing Powerset, a natural language search system that ultimately failed but utilized a few valuable processes.

Ultimately, Dr. Munro implies that the capacity for automated text services in “low-resource languages” is well within reach, particularly because the messages are generally just one to two sentences. Because spelling variations are predictable, they can be modeled, and hopefully reliably answered by automated systems. However, these systems will not be widely used until they become more reliable and efficient than human responders, who, as he explained in class, can be extremely effective.
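As a rough illustration of the “spelling variations can be modeled” idea, here is a small sketch that maps inconsistent spellings to the closest entry in a tiny lexicon using simple string similarity. The lexicon, similarity threshold, and example words are my own assumptions for illustration; Munro’s actual approach works at the level of affixes and phonological variation rather than whole-word matching.

```python
# Illustrative sketch: collapse orthographic variants onto a small lexicon
# using string similarity (standard library only). The lexicon and threshold
# are assumptions for illustration, not Munro's method.
from difflib import SequenceMatcher

LEXICON = ["recognize", "organisation", "hospital", "medicine"]

def normalize(word, threshold=0.8):
    """Map a possibly misspelled word to the closest lexicon entry,
    or return it unchanged if nothing is similar enough."""
    best, best_score = word, 0.0
    for entry in LEXICON:
        score = SequenceMatcher(None, word.lower(), entry).ratio()
        if score > best_score:
            best, best_score = entry, score
    return best if best_score >= threshold else word

# Orthographic variants collapse to the same canonical form.
print(normalize("recognise"))  # -> recognize
print(normalize("hopital"))    # -> hospital
print(normalize("medicin"))    # -> medicine
```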


Guest Speaker: Robert Munro

Our guest speaker this week, joining us remotely via computer, was Robert Munro. He gave us a very enlightening talk on crowdsourcing and the opportunities it creates for solving many issues around the world. Robert Munro is a computational linguist, which is someone who models natural languages from a computational perspective. This gives him a wide array of skills, which he uses in his many projects. Munro got his PhD from Stanford University and was in the top 5% of his class of engineering/science candidates. Currently, he is the CEO of Idibon, a language technology company, and does work for Energy for Opportunity in Sierra Leone. Munro also has an impressive background of many interesting and diverse projects. In 2011, he worked at Global Viral Forecasting, which aimed to track diseases worldwide. Munro also coordinated Mission 4636, which translated and categorized emergency text messages for disaster relief in Haiti. Munro has done crowdsourcing work worldwide and has used his unique skill set to help better the world through the use of language technology.

More information about Robert Munro can be obtained at his website.


ICT4D Professional Profile: Robert Munro

Robert Munro is a computational linguist in the area of communication technologies who works largely on less-resourced languages. As a graduate fellow at Stanford University, he focuses much of his research on topics such as crowdsourcing and machine learning. Mr. Munro originally came into the field through his previous research experience. After graduating from the University of Sydney in 2004 with majors in Linguistics, Computer Science, Information Systems, and English and Film Studies, he went on to work on the Hans Rausing Endangered Languages Project at the University of London. Mr. Munro worked as the project’s software developer – designing digital archives, working on multimedia development, and conducting research in computational linguistics.

After his original experience with HRELP, Mr. Munro went on to work on many ICT4D projects worldwide. For example, Mr. Munro was involved with the Mission 4636 service during the January 12th, 2010 earthquake in Haiti. Through this service, Haitians were able to text their medical needs and receive aid. Mr. Munro helped to coordinate the translation and categorization of the text messages that were received. With the help of CrowdFlower, their crowdsourcing platform, Mr. Munro and his colleagues were able to translate messages within ten minutes. Overall, the initiative was successful, processing more than 80,000 messages – “the first time that crowdsourcing had been used for real-time humanitarian relief and the largest deployment of humanitarian crowdsourcing to date.”

Along with crowdsourcing efforts, one of Mr. Munro’s major areas of interest is machine learning. In 2011, Mr. Munro worked as Chief Technology Officer at Global Viral Forecasting, an initiative dedicated to predicting and preventing the emergence of new disease outbreaks. In particular, he worked with a system called EpidemicIQ. With the help of thirty labs worldwide, the team is currently able to gather information about epidemics, load it into the system, and filter out what is relevant. The machine-learning approach draws on various types of information to predict an epidemic arising in an area. For example, Google Flu Trends determined that flu outbreaks could be predicted by simply tracking searches for the symptoms people usually look up.

Beyond these experiences, Mr. Munro has worked in Sierra Leone as Chief Information Officer for Energy for Opportunity (EFO), an organization devoted to finding a safe and environmentally friendly way of providing electricity to communities throughout West Africa. He currently “heads the IT services at EFO and does everything from developing software systems to training and acceptance testing” (EFO). When he is not attending conferences or performing research, Mr. Munro enjoys blogging at Jungle Light Speed and traveling around the world.
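To give a very rough sense of the Google Flu Trends idea mentioned above, the following toy sketch fits a straight line from weekly symptom-search volume to reported flu cases and uses a new week’s search volume to anticipate a rise in cases. All of the numbers are invented, and this is far simpler than anything EpidemicIQ actually does.

```python
# Toy sketch of search-trend-based outbreak prediction (invented numbers,
# not EpidemicIQ): fit cases ~ searches, then predict from new search volume.
import numpy as np

# Hypothetical historical data: weekly searches for flu symptoms vs.
# confirmed cases later reported by clinics.
search_volume = np.array([120, 150, 180, 260, 400, 650])
reported_cases = np.array([10, 14, 18, 30, 52, 90])

# Least-squares fit: cases ~= slope * searches + intercept.
slope, intercept = np.polyfit(search_volume, reported_cases, 1)

# A spike in this week's searches suggests a rise in cases before the
# official case counts arrive.
this_week_searches = 900
predicted_cases = slope * this_week_searches + intercept
print(f"predicted cases: {predicted_cases:.0f}")
```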

Sources:  Robert Munro’s Website, EFO