Qualitative Methods and Ripple Effects

by

Bianca Shah

“So I am an anthropologist who is not so familiar with so-called disaster mitigation. After joining the disaster platform, I understand better how to deal with disasters. Before, when there was a disaster, I ran away, [but] it turns out, you have to calm down when there is a disaster; right, calm down, don’t panic, we prepare, we look at the conditions first, then walk slowly.”


This is what a school teacher in Bali told me about her emotional shift from helplessness to capability in the face of disasters.


She’d discovered the platform (PetaBencana.id) on Instagram and, because her school sits on a cliff in an earthquake-prone region, had wanted to learn ways to reduce the impact of earthquakes and floods. Her interest in the platform grew when she learnt that it let one report disasters via social media, without needing to visit government offices.

A screenshot of PetaBencana.id showing a resident’s report during flooding. Flood selfies, a common trope in Indonesia, provide an invaluable flood height estimate, offering vital scale and context for validating conditions on the ground when formal sensors fall short.

Since 2021, she has educated herself, her peers, her students, and an entire community about disaster response and preparedness. Now her students report blocked roads and find alternate ways to reach school. They’re enthusiastic about the platform (active users get merchandise), and they’ve taught their parents, siblings, and communities how to report on the platform. Effectively, she’s turned her students into ambassadors.

A standard evaluation would have noted: “X students trained, Y reports submitted from Z region.” It would have missed the teacher’s shift from fear to capability, the students becoming agents of change, and the culture of active participation during disasters that now endures. It would have missed the changes that weren’t in the theory of change but matter most – how people feel and experience disaster events.


This raises a question: What do we need to do to see these intangible shifts? And why are qualitative methods essential for capturing these shifts?

When I was approached in October 2024 to conduct an evaluation of two disaster reporting platforms, PetaBencana.id (in Indonesia) and MapaKalamidad.ph (in the Philippines), I made the methodological choice to abandon the standard evaluation questions. After hearing this story, I wanted to ask questions that would help me uncover such shifts.

The Platforms and the Problem


The disaster reporting platforms allow citizens to report disasters (floods, earthquakes, fires, landslides, haze, and wind) through various social media channels and platform websites. The reports are presented in real-time on a public map.

The objectives are clear:

  1. Provide a platform for sharing and accessing actionable, accurate, real-time information about disasters

  2. Improve disaster response, and

  3. Reduce disaster risk. 

Bambang Suryaputra, Director of the Emergency Operations Control Center at the National Emergency Management Agency monitors PetaBencana.id to respond to resident reports in real-time

The monitoring data told me that the platforms were well adopted: people used them for navigation during disaster events, and government agencies and relief organizations relied on them. But I struggled to imagine how the platforms actually fit into people’s daily lives and work. If I were a resident, would I use them beyond navigation? If I were a government official, how would they shape my decisions?

I needed to understand how and why people actually used the platform before I could assess whether disaster response had truly improved or risk had been reduced.

The Methodological Choice


The evaluation statement of work shared with me had all the standard questions: 

  • To what extent did the objectives and implemented activities meet the stakeholders' and collaborators' needs and priorities? 

  • What factors or actors influenced the achievement of project objectives? 


These questions felt like traps. How do you ask someone "to what extent did the program activities meet your needs" without it being a leading question? How do you measure "improvements" when you don't yet understand what people most value?

So I decided to start with the most basic, open-ended questions: ‘How did you first get to know about the platform?’ and ‘How did you engage with the platform then?’


I shaped the questions to be conversational, giving respondents space to start wherever they wanted and to emphasize whatever they found significant. Disasters can be a difficult and sensitive topic to talk about, which made the choice to keep things conversational that much easier.

For instance, instead of asking “What critical successes (achievements) were attained through implementing the 3-year BHA-funded project?” I asked, “How would you define the impact of the platform on your community? Has this changed over the years? Could you share a story as an example?”


This choice to give people the space to share their experiences and feelings in the way they wanted made all the difference in what I was able to capture through the process. In hindsight, it showed how asking different questions let me see differently. But I didn’t feel confident about it right up until I wrote the final report. The stories I had captured were touching and meaningful, but I worried about the deviation from the evaluation objectives.

What emerged when we asked differently


In these interviews, people didn’t just talk about faster disaster response or improved information sharing, but also about agency and responsibility – in training others, ensuring personal and community safety, and coordinating emergency response. They talked about communities, networks of resilience, and collective action that extended far beyond the digital space into real-world action and mutual support.

Marian’s story

Marian is the President of a Youth Council and a town councilor in a local government unit in the Philippines. In 2020, as part of a workshop hosted by the MapaKalamidad team, she invited ~330 youth members to participate in a challenge to make a mock report on the disaster reporting platform, testing knowledge and spreading awareness about the platform. 


This sparked something. She started thinking about formally involving youth in disaster reporting and response.

The problem with disaster response in her area was clear: her community’s emergency responders were aged 40–50, some of them senior citizens, unable to conduct rescue operations efficiently because of health limitations. Youth were active in disaster response but untrained, which sometimes led to casualties.


She worked to pass an ordinance in her local government unit mandating:

  1. Youth be trained in emergency response and be involved in planning and hazard management,

  2. Disaster reporting be done on the MapaKalamidad platform alongside official reports,

  3. Local government train citizens in platform use, and

  4. Funds be allocated for youth training, allowances, and insurance.

Youth volunteers lead a peer-to-peer workshop on using social media for real-time disaster reporting

More significantly, she saw an opportunity to redirect existing intervention programs (typically clean-up drives) for out-of-school children and youth struggling with delinquency by involving them in disaster response activities.

When I asked her about the incentives for youth to engage in such activities, especially with regards to the at-risk youth, she shared:

“They saw how important their role is. They saw how important they are to the community. They don’t see the need (for clean-up drives), or they don’t see the impact... Now [with them being emergency responders] they saw how important their role is.”


The National Youth Commission recognized Marian's ordinance as a model of good local practice and recommended that all local government units adopt similar policies.

A survey would have told me that Marian used the platform for “improving disaster response”. It would have missed how a single workshop led to an ordinance, how that ordinance formally included youth in disaster response, how it gave at-risk youth a sense of purpose, and how that local policy became a national model. It would have missed the entire chain of transformation.

Similar patterns in other interviews

Marian’s story wasn’t an isolated shift. 

Individual Agency

A Red Cross employee shared how the platform had shifted the conversation: “Disasters are now everyone’s topic of discussion because of PetaBencana. Because everyone can actually participate and contribute in disaster mitigation.”

Members of the Red Cross Youth lead a peer-to-peer training session on reporting disasters through PetaBencana. Peer-to-peer learning strengthens community capacity to respond quickly and collaboratively during disasters.

They noted how the platform gave youth a sense of purpose.

“For children or young people who may not know (their tasks and responsibilities during disasters), they have a feeling of participation and are aware that they have a role in the safety of others.”

Community Networks

First responders and community volunteers described relying on networks of organizations that had formed around the platforms for obtaining aid and other essentials. Volunteer networks, youth groups, and community-based organizations that were built around the shared practice of disaster reporting had extended beyond digital spaces into real-world action and mutual support.

The Comparison


If you’re still wondering what the methodological difference looked like, here are a couple of examples comparing the interview questions I asked with the survey questions I might have used:

Interview question: How do you engage with the platform?

Survey question: What do you use the platform for? (Navigation / Safety / Relief & Aid / Monitoring for Disaster Response and Rescue / Others)

Interview question: How would you define the impact of the platform on your community? Has this changed over the years? Could you share a story as an example?

Survey question: What has been the impact of the platforms in your community? (Improved access to life-saving information in disasters / Feeling confident about facing disaster events / Quicker disaster response / Fewer casualties and injuries / Others)


The survey would have told me that people use the platform for safety and that they feel more confident. True statements. But it wouldn't have told me: 

  • Why the teacher shifted from panic to calm 

  • How at-risk youth came to see themselves as essential community members 

  • How people began viewing disaster response as a shared community responsibility 

  • How one training led to an ordinance that became national policy

  • How fear transformed into agency and shaped a culture of preparedness and resilience

It couldn't capture the journey of change. The ripple effect. The moment when something shifted from an external event to internal transformation.

Truthfully, it felt uncomfortable to assess a widely adopted platform primarily through individual narratives rather than big numbers. The approach was unfamiliar territory, and I was unsure whether deep, personal stories could build a compelling case for impact at scale.

Why does this matter?


When Marian told me “they saw how important their role is”, I realized I was witnessing something that couldn't possibly be captured through quantitative tools alone. 

This wasn't just about information sharing or response times; it was about young people discovering their worth, communities finding their agency, and institutions reimagining what's possible. These transformations were never in the theory of change; one could not have anticipated these outcomes. They emerged because we created space for them to be seen.


Such evaluations matter because they allow us to circle back to the question we start programs with – “How will this program change someone’s life?” – and then go beyond it.


Such assessments allow us to measure things that standard evaluations miss, in this case:

  1. the intangible shifts (from panic to calm, from stigmatized to valued, from passive to active), and

  2. the shifts that we didn’t anticipate (students reaching school during rainy seasons, youth training their families, government agencies reimagining what counts as legitimate data).


Seeing these shifts has made me think about impact differently.

If we measure such meaningful shifts, can this push the needle towards more meaningful interventions? Can we create an ecosystem that values and nurtures such shifts rather than only what we can count and measure?


In a world that prioritizes driving big changes in short periods, assessments like these remind us to engage deeply and focus on what really matters. They push us to pay attention to shifts that require time to evolve and mature.

These stories have also allowed me to understand disasters more holistically – not just as catastrophic events but as multi-layered experiences that shape daily lives, educational outcomes, social relationships, and personal agency. Recognizing this complexity is essential to designing better interventions and then allowing time for these interventions to create fundamental shifts in people’s everyday lives.

Because what we choose to measure shapes what we value, and what we value shapes what we design. If we systematically miss transformations like these, we risk undervaluing meaningful interventions and failing to nurture what’s working.


For complex social interventions in dynamic contexts, we often don’t know what will matter most until we create space for emergence. The youth who discovered their importance, the teacher who moved from panic to calm, the communities that cultivated shared responsibility – these weren’t just nice anecdotes. They were the mechanisms through which the intervention achieved its technical goals and the reasons it was sustained over time. They were how change actually happened.


I think this is what qualitative methods offer: not just answers to our questions, but questions we didn't know to ask, and discoveries that reshape what we think matters.

Bianca Shah

Researcher | Urban Development

Bianca is a qualitative researcher with a background in urban planning and development studies, working at the intersection of institutions and lived experiences. She examines urban governance, health systems, and disaster response through field-based research, and has experience in project management and M&E across India. She has a keen interest in long-term research, evaluation, or advisory work – particularly interdisciplinary projects that use participatory and qualitative methods, take a bottom-up approach, prioritize contextual sensitivity, and aim for meaningful systems change.