
Climate & Energy

A Roadmap for Managing Disasters: How Climate-Vulnerable Countries Can Use Tech


Paper | 2nd December 2021

Our Time to Zero In series makes the case for an inclusive transition to net zero that focuses on people, fairness, technology, markets and communities.


Chapter 1

Executive Summary

Climate crises threaten to displace 1.2 billion people by 2050, with the cost of adapting to these new threats estimated at between $280 billion and $500 billion per year. Vulnerable people and regions, including sub-Saharan Africa, will be disproportionately impacted. Yet climate-vulnerable countries have received minimal funding for adaptation to date. Early-warning and early-action systems have an essential role to play in enabling effective disaster-preparedness and response efforts. As we outlined in the opening paper of our series on climate disasters and tech, tech-enabled solutions could potentially help to prevent $66 billion in loss and damage annually.

This second paper specifically identifies technology-enabled solutions, the challenges governments face in adopting and scaling these technologies, their relative impacts and trade-offs, and the actions needed to ensure technologies can be integrated into existing national systems.

A range of technologies is critical to developing robust, state-of-the-art disaster-risk-management systems. This paper provides three key tools to help governments determine which technologies are relevant, depending on national and local contexts:

  1. Use-Case Library: this library provides policymakers with a resource to stay informed of the technologies that are relevant for specific preparedness and response activities.

  2. Complexity Versus Impact Model (Figure 2): this is a decision-support tool that presents the value of a technology relative to others and highlights those that offer the most effective path towards desired outcomes. The tool can help policymakers identify the highest-impact technologies – or the least complex – depending on their priorities.

  3. Decision Trees (Figure 3): these tie the use-case library with the complexity-impact model, walking policymakers through the process of adopting a technology for specific use cases.

Finally, in addition to understanding which technologies can be used and when, governments also face challenges associated with ineffectual data-management practices. Policymakers can draw on expertise about responsible data practices developed by organisations such as the Centre for Humanitarian Data, while setting up data-sharing agreements that lay out consistent expectations and streamline the process using minimal best-practice standards. Governments also face a disconnect between national systems and externally developed technologies, which leads to fragmentation and a lack of interoperability between systems, creating duplicative efforts and friction between the links in the early-warning and early-action chain of events.

To enable better integration across systems, we recommend that governments:

  1. Leverage open-source, open-data and open-API tools to ensure a pathway for future integration and collaboration. Open-source software prevents costly software lock-ins and makes it easier to build in a modular manner, pool resources and outsource specific functions of a tech stack to those with greater expertise and experience. Tech-enabled initiatives must demand and seek out open-data sources while also making their own data available for use, privacy issues notwithstanding. Open data are especially important for building predictive models in the risk-knowledge and analysis phases. Leveraging open data sets up a virtuous cycle: giving greater access to data spurs more tech-enabled use cases that then build further evidence of impact, and subsequent interest in more sustainable, systematic data collection. Tech-enabled initiatives must leverage and provide tiered access to their platforms or outputs via standardised open APIs. APIs are “code that acts as an intermediary between two different pieces of software and enables them to communicate with each other.” This promotes an ecosystem of innovation in disaster preparedness and response by allowing other innovators to build on previous work.

  2. Ensure tech-enabled initiatives align closely with NDMAs. Tech-enabled initiatives should align with the standard procedures of existing National Disaster Management Agencies (NDMAs). According to a 2019 document, published to coincide with the third ITU Global Forum on Emergency Telecommunications (GET-19), “sustainability dictates that governments themselves are best placed to know how best to utilise disruptive technologies under different contextual environments. This is becoming particularly important for coordination purposes, given the variety of technology tools in the hands of different groups.” To ensure eventual government ownership, it is essential that providers of tech-enabled solutions align their approaches with existing government processes and approaches and follow governance structures already in place.

  3. Mandate inclusion of members of vulnerable communities in tech-enabled solutions and be problem led: Complete and thorough integration cannot happen without the participation of those communities most likely to be impacted. Climate-related shocks are often highly local, and the needs and capacity of impacted communities are essential for effective early-warning, early-action systems. Proposed tech-enabled solutions should be problem led and “get the job done” for the community. Tools that do not meet the community’s needs or capabilities will be abandoned in favour of homegrown solutions, exacerbating fragmentation and forgoing external expertise.


Chapter 2

Introduction

It can be politically challenging for any policymaker or government to prioritise funding for a disaster that has yet to happen. However, as Covid-19 has demonstrated, it is essential for governments to track potential crises, to have the right tools and frameworks to manage these risks, and to have the necessary infrastructure and capacity to respond effectively when disaster does strike.

Covid-19 has put tens of millions of people at risk of falling into extreme poverty; 272 million people are – or are at risk of becoming – acutely food insecure, and more than 5.1 million people have lost their lives since the start of the pandemic. The impacts of climate change are likely to far outweigh those of the current pandemic.

Climate-linked disasters are growing in scale and frequency, disproportionately impacting populations in climate-vulnerable regions and low-income countries. Disasters triggered by natural hazards now occur nearly five times more often than 40 years ago. In the past decade, 1.7 billion people around the world have been affected by climate-related events, which have cost $132 billion annually in economic losses.

By 2030, climate change threatens to push over 130 million people into poverty, and could mean that 200 million people a year – twice as many as today – need emergency aid. By 2050, climate-related crises could displace 1.2 billion people, most of them concentrated across the Sahel, Southern Africa, the Middle East and Central Asia. Ignoring the impacts of these disasters threatens to undermine and derail long-term development goals and mitigation efforts, including net-zero targets.

Most climate-linked disasters can be predicted. Over the past decade, a wealth of technologies and tools have been developed to enable governments, donors and communities to be better prepared and more empowered to respond early. Early warning and early action save lives and livelihoods while reducing the damage costs associated with climate shocks, potentially saving more than $66 billion each year. However, governments in low- and middle-income countries (LMICs) face challenges in using these technologies for the following reasons: 1) knowledge of how different technologies can be best leveraged is limited; 2) the ability to act is constrained by access to technology and interpretation of data; and 3) disaster-financing models are outdated.

This paper focuses on the barriers governments face in identifying and using technologies to support effective and cost-efficient disaster-risk management. It then provides recommendations for how governments can use the right technologies to transform their disaster-preparedness and response efforts. In the coming weeks, two additional papers in this series will set out how governments can access and incorporate the right technology into a comprehensive, government-wide, tech-enabled disaster-management system, and how they can leverage international and national finance flows to build and maintain such systems. Collectively, these papers underscore the importance of tech-enabled disaster-risk management but, ultimately, these systems must be complemented with a focus on curbing emissions and achieving net zero, while increasing investment in long-term adaptation and climate-focused innovation to counter the irreversible effects of climate change.


Chapter 3

What Is the Tech?

From enabling governments and communities to predict climate disasters to deploying targeted and timely responses, technology can transform disaster-risk management to save lives and livelihoods.

Early warning and early action upend the traditional reactive approach of waiting for a disaster to unfold before funding is released and aid delivered. They enable vulnerable communities to be forewarned, ensuring a more effective response to disasters. This shift is made possible by technologies that facilitate more accurate and earlier warnings, better targeting, faster communication, risk-reducing pre-emptive actions and real-time situational awareness of an unfolding disaster. However, many governments face challenges in adopting and scaling these technologies.

Most critically, there is a mismatch between the availability of these technologies and knowledge about them. The International Telecommunication Union recently recommended that governments “increase understanding about which technologies are relevant for different country circumstances and types of disasters.” Without this knowledge, governments cannot demand country-specific, problem-oriented solutions. They need a deeper understanding of the technology options available to them, their relative benefits and trade-offs, and the actions needed to ensure that additional technologies can be integrated into existing national systems.

Available Technologies for Early Warning and Early Action

A wide range of technologies is critical to developing robust, state-of-the-art, transformational disaster-risk-management systems, and to preventing climate shocks from becoming long-term economic and social disasters.

Early warning and early action require high-quality, accurate data to be collected and analysed for risk-informed impact forecasts and targeted response action plans. Three types of data are necessary for risk-informed disaster-risk management (DRM): hazard, exposure and vulnerability data. "Hazards" are natural events such as landslides, and data to track and monitor these hazards include weather-related indicators such as rainfall measurements. "Exposure" is an inventory of communities in the path of a hazard, and their economic and physical resources such as farms, businesses and residences. "Vulnerability" measures susceptibility to climate shocks and is an amalgamation of individuals’ capacities to withstand a climate disaster as well as the capacity of their local community to respond. Poverty indicators are important for assessing vulnerability, as is the number of hospitals and emergency shelters.

Figure 1 outlines the critical steps that support robust tech-centred disaster preparedness and response efforts: 1) data collection is critical to tracking and monitoring hazards, and informing early warning and early action through in-depth forecast information and risk analysis; 2) data analysis is essential for determining where and when a climate disaster might strike, who will be affected and the potential impacts; and 3) action ensures timely and accurate warnings, and appropriate assistance to affected populations.

Figure 1

Landscape of disaster preparedness and response technologies

Source: TBI

Data-collection technologies enhance forecasting and hazard monitoring by continuously collecting detailed climate and weather information over large areas. Satellite imagery is the foundational technology supporting data collection and is one of the most mature technologies for disaster-preparedness and response efforts, with much of it available for free. Due to spatial or temporal resolution limitations of satellite technologies, aeroplanes, drones and ground-based technologies, such as automated weather stations and the internet of things (IoT), can fill the gaps. Although they cover much smaller areas than satellites and can be hindered by bad weather, the higher-resolution imagery of aeroplanes and drones complements satellite imagery. Automated weather stations and IoT devices can also be used to automatically and continuously collect and transmit data from hard-to-reach areas. Inconsistent power and unreliable internet connectivity pose a challenge with most of these technologies, but this can be mitigated with solar panels and a hybrid automated/manual approach of periodically downloading data.
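To illustrate the hybrid automated/manual pattern described above, the minimal sketch below shows an automated weather station buffering readings locally and flushing them when connectivity returns. The station identifier, sensor values and upload routine are hypothetical placeholders, not any specific vendor's interface.

```python
import json
import random
import time
from collections import deque

# Hypothetical station identifier and upload routine; a real deployment would
# read actual sensor hardware and push to a national data platform.
STATION_ID = "AWS-0042"
buffer = deque(maxlen=10_000)  # bounded local store for offline periods


def read_sensors() -> dict:
    """Simulate one observation; a real station would query its instruments."""
    return {
        "station": STATION_ID,
        "timestamp": time.time(),
        "rainfall_mm": round(random.uniform(0, 12), 2),
        "temperature_c": round(random.uniform(18, 35), 1),
    }


def connectivity_available() -> bool:
    """Placeholder for a real connectivity check (e.g. pinging a gateway)."""
    return random.random() > 0.5


def upload(batch: list) -> None:
    """Stand-in for an HTTP, SMS or satellite uplink to the national data store."""
    print(f"uploaded {len(batch)} readings, latest: {json.dumps(batch[-1])}")


for _ in range(5):               # one reading per iteration
    buffer.append(read_sensors())
    if connectivity_available() and buffer:
        upload(list(buffer))     # flush the offline backlog in one batch
        buffer.clear()
    time.sleep(0.1)              # a real station would sleep for minutes or hours
```

The bounded buffer mirrors the manual fallback in the text: if connectivity never returns, readings remain stored locally for periodic download.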

Satellite imagery is also the basis for the collection of exposure-related data, with drone and crowdsourced data filling the gaps. Crucially, exposure data must be labelled and technology plays an important role here too. Crowdsourcing initiatives can aid in labelling images, but the process is arduous and poses quality-control challenges. Machine-learning models can automate the process, speeding up the identification of damaged or at-risk infrastructure. Vulnerability data are essential for informed DRM, but they are among the most complicated to collect. Census and social-protection registry data are two key sources. However, neither is complete nor updated frequently enough to capture those people affected by sudden climate shocks. Satellite imagery combined with mobile-phone call-detail records can fill the gaps in identifying poor and vulnerable people, and Unstructured Supplementary Service Data (USSD)-based registration surveys can collect the information of potential beneficiaries, making aid distribution more inclusive, targeted and faster.

Data-analysis tools synthesise information about the likelihood of a climate shock occurring, where it will occur, how soon and how local communities will be impacted. The tools have grown in sophistication from traditional statistical modelling to artificial intelligence, machine-learning and deep-learning algorithms. These new tools can identify patterns missed by humans tracking fewer indicators, thereby enhancing forecasts, and improving early warning and early action. This stems, in part, from the ability to incorporate alternative data sets such as mobile call-detail records, which have been especially useful in expanding national social-protection registries to be more inclusive in the aftermath of a climate shock. These data-analysis techniques also provide earlier, more accurate predictions so responders can act sooner. Accurate, detailed and comprehensive data sets and a high level of expertise, particularly during model development, are required to leverage advanced data-analysis tools.
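As an illustration of how a machine-learning model can turn hazard, exposure and vulnerability indicators into an impact forecast, the sketch below trains a random-forest classifier on synthetic data. The features, labels and values are invented for demonstration and stand in for real rainfall, river-level, population and poverty data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic hazard/exposure/vulnerability features, purely illustrative.
rng = np.random.default_rng(0)
n = 2_000
X = np.column_stack([
    rng.gamma(2.0, 20.0, n),     # rainfall_mm (hazard)
    rng.normal(3.0, 1.0, n),     # river_level_m (hazard)
    rng.integers(50, 5_000, n),  # people_exposed (exposure)
    rng.uniform(0.0, 1.0, n),    # poverty_index (vulnerability)
])
# Hypothetical label: 1 = significant community impact expected.
risk = 0.01 * X[:, 0] + 0.8 * X[:, 1] + 0.0004 * X[:, 2] + 2.0 * X[:, 3]
y = (risk + rng.normal(0.0, 0.5, n) > risk.mean()).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# The probability of impact for a new forecast window feeds the warning decision.
new_window = [[120.0, 4.5, 1500, 0.7]]
print(f"probability of impact: {model.predict_proba(new_window)[0, 1]:.2f}")
```

In practice the inputs would be the alternative data sets mentioned above, and the output probability would feed the impact-based forecasts and triggers discussed later.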

Data analysis informs effective action so that local authorities and responders can deliver aid effectively to the right people. Authorities can leverage satellite imagery in addition to call-detail records, which adds greater detail on the locations and movements of individuals. Communication is fundamental to effective early-warning systems; radio and television, long used to disseminate warnings, require receivers to be switched on and cannot target messages to specific populations. The growth in mobile phones has opened up new methods of communicating directly with impacted populations, including through SMS- or USSD-based apps and mobile-phone-based broadcasting. In addition, the proliferation of smartphones has enabled richer methods of communication at lower costs. Temporary equipment, such as a Cell on Wheels, is invaluable for humanitarian organisations’ communication with each other and with impacted populations after communications infrastructure has been destroyed. Once a disaster hits, a combination of aerial and crowdsourced street-level imagery can be transformational in identifying, communicating and enabling search-and-rescue and evacuation strategies. Blockchain technologies can support coordination of aid delivery and the monitoring of the supply chain. Mobile-money apps deliver one of the most effective forms of relief – cash transfers – enabled by digital ID.
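Unlike broadcast radio or television, mobile channels can direct a warning only to the people in harm's way. The sketch below illustrates that targeting with a toy registry and a stand-in send_sms function; the numbers, districts and gateway call are hypothetical, not a real operator API.

```python
# Toy registry and stand-in gateway call; not a real mobile-network-operator API.
REGISTRY = [
    {"msisdn": "+000700000001", "district": "Riverside"},
    {"msisdn": "+000700000002", "district": "Riverside"},
    {"msisdn": "+000700000003", "district": "Hilltop"},
]


def send_sms(msisdn: str, text: str) -> None:
    """Stand-in for an SMS-gateway or cell-broadcast API call."""
    print(f"to {msisdn}: {text}")


def broadcast_warning(district: str, message: str) -> int:
    """Unlike radio or TV, reach only subscribers registered in the affected area."""
    targets = [r for r in REGISTRY if r["district"] == district]
    for r in targets:
        send_sms(r["msisdn"], message)
    return len(targets)


sent = broadcast_warning("Riverside", "Flood warning: move to higher ground by 18:00.")
print(f"{sent} warnings sent")
```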


Chapter 4

An Integrated Tech-Centred Approach

Ideally, the tech-enabled disaster-preparedness and response process is streamlined, with the latest, relevant data from various sources compiled into a database that can be shared with multiple stakeholders. The data are analysed to create impact-based forecasts that are communicated to decision-makers and responders to semi-automatically or automatically trigger a cascade of early actions, with in-built mechanisms for monitoring and feedback.

This streamlined pipeline leverages technology to create a systematic, non-duplicative yet exhaustive process that builds towards one primary goal: minimising loss of lives and livelihoods as a result of climate-related shocks. The timeliness of data is key and depends on the nature of the climate-related event: some flood-related indicators must be registered hourly, while others can be measured less frequently. Diversity of data sources mitigates bias in predictive models and promotes accuracy overall while highly localised data layered on satellite images result in more effective predictions.

Creating shareable databases, while still preserving privacy and ownership, for other experts to conduct sector-specific analyses is more efficient than each stakeholder creating a replica of the data. To minimise loss of lives and livelihoods, and release response funds to mitigate impacts, authorities must focus on impact-based forecasting, which moves beyond traditional hazard forecasting to predicting the impact of climate shocks on communities. Insights from impact-based forecasts facilitate the automated release of funds based on pre-agreed thresholds. Automation speeds up response times and efficiency, but must include redundancies (duplicate processes or systems to ensure continuous service), manual backups and overrides to account for unreliable internet or power. The funds facilitate pre-disaster early actions and post-disaster response actions, reducing the financial burden while increasing impact. Monitoring for unexpected or unwanted results, along with feedback mechanisms for corrections and exceptions, must be built into the process.
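A minimal sketch of such a pre-agreed, threshold-based trigger with a manual override is shown below; the indicator names, thresholds and release amounts are illustrative assumptions, not recommended values.

```python
from dataclasses import dataclass


@dataclass
class Trigger:
    indicator: str     # forecast indicator the trigger watches
    threshold: float   # pre-agreed activation level
    release_usd: int   # funds released when the threshold is crossed


# Hypothetical pre-agreed triggers; real thresholds would be negotiated in advance.
PREAGREED_TRIGGERS = [
    Trigger("flood_probability", 0.70, 2_000_000),  # anticipatory action
    Trigger("flood_probability", 0.90, 5_000_000),  # full early response
]


def evaluate(forecast: dict, manual_hold: bool = False) -> int:
    """Return the funds to release; a manual hold overrides the automation."""
    if manual_hold:
        print("manual override active: no automatic release")
        return 0
    released = 0
    for t in PREAGREED_TRIGGERS:
        if forecast.get(t.indicator, 0.0) >= t.threshold:
            released = max(released, t.release_usd)
    return released


# Example impact-based forecasts arriving from the analysis step.
print(evaluate({"flood_probability": 0.82}))                    # 2000000
print(evaluate({"flood_probability": 0.95}, manual_hold=True))  # 0 (held back)
```

The manual_hold flag corresponds to the manual backups and overrides noted above; monitoring and feedback would wrap around this logic in a full system.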

Early Warning and Early Action Driven by Tech

Data

  • Hazard, exposure and vulnerability data are collected from global, national, and local commercial and open-source models leveraging aerial and non-aerial technologies (such as the internet of things).

  • Data are stored responsibly in shareable databases that integrate government data sets with those of NGO and private-sector partners.

  • Data interpretation and analysis are performed by national meteorological departments that monitor, forecast and model the potential impacts of climate-linked shocks on affected populations. Combining traditional and big-data models provides an optimal balance of efficiency, flexibility and accuracy, and leads to more granular insights.

Decision

  • Cross-government procedures, and clear and distinct triggers, are prenegotiated to streamline timely decision-making.

  • Insights from data analysis are combined with prenegotiated triggers to determine the level, location and mechanism for financing and aid-delivery-response efforts, enabling effective early action.

Action

  • Prenegotiated triggers automate appropriate aid-delivery-response efforts, such as cash transfers ex ante.

  • Response efforts are highly targeted and tailored to affected populations based on impact-based forecasting that accounts for the ability of these populations to be resilient in the aftermath of a climate-related shock.

  • Communication technologies enable more effective and rapid coordination between response organisations, and between response organisations and affected populations.


Chapter 5

Addressing the Mismatch Between Opportunity and Knowledge

The proliferation of technological innovations, and their potential to radically improve disaster-risk management (DRM), can overwhelm even the most attentive policymaker. The following three tools help navigate this space in a systematic and objective manner.

Use-Case Library

A library of use cases, or “specific situation(s) in which a product or service could potentially be used”, breaks down disaster-preparedness and response into specific actions and identifies the technologies that are needed. Policymakers can use this library to stay informed of the technologies that are relevant for specific preparedness and response activities. Policymakers can also utilise the tool to map out, identify and engage the government departments accountable for successful completion of the use case(s) by asking:

  1. Is there a clear line of responsibility for the delivery of the outputs of the use case within government structures, and who holds that responsibility?

  2. Are the owners of the use case(s), once identified, willing to adopt technology?

  3. Do the owners have the capacity to change and still manage existing procedures?

Complexity Versus Impact Model

The complexity-impact model is a decision-support tool that presents the value of a technology relative to others and highlights those that offer the most effective path towards desired outcomes. Each data point on the chart depicts a technology by DRM phase and the estimated degree of complexity in adopting that tech compared to the potential impact it can have. This tool also provides policymakers with insights into layering technologies: if they need to collect hazard data, the chart plots each of the technologies associated with that use case so that policymakers can start with the highest-impact technology or the least complex, depending on their priorities. Finally, the tool offers insight into a pathway from the low-hanging, “low-complexity, high-impact” solutions to higher-complexity technologies, as experience and expertise are developed.

Figure 2

Complexity versus impact model

Source: TBI

When determining the complexity and impact of technologies, each technology was first assigned a complexity score that considered the following elements:

  • Onerous or outdated regulations hinder wide-scale adoption, especially for nascent technologies. Drone regulations, for example, are restrictive in many countries.

  • A greater digital divide means a greater hurdle for technologies that require a constant, consistent connection to a mobile network, the internet or a power source. This element includes the capacity of end users; if the end user is a government employee, it may be feasible to train them on a geographic information system (GIS)-based tool.

  • The greater the knowledge and skills required to build, own and operate a particular technology, the greater the hurdle to adopting it. Deploying a drone for situational awareness is impossible if the expertise to pilot one is not immediately available.

  • Technologies that rely on input or output data must abide by data-management best practices, which dictate a minimally acceptable set of standards. From a technical perspective, appropriate software, hardware and expertise are required for the data to be collected, stored and transmitted. Data sourced from third parties require partnerships and agreements, further complicating implementation. From an ethical perspective, providers must ensure strict data privacy and protection.

  • The direct cost of owning, operating and maintaining software and hardware varies with technologies and delivery models. Open-source software can mitigate some of these costs.

  • Maturity refers to how long a technology has been applied in a similar context. Greater maturity means greater familiarity for users, rationalised costs, wider expertise and fewer defects. Use of drone imagery in low- and middle-income countries is newer than satellite imagery and so, based solely on this indicator, would be less desirable.

  • Operational complexity considers the steps required to adopt a technology successfully. The more complex it is, the less likely it will be adopted successfully. For example, deploying a manned aircraft requires a pilot, authorisation and runway space, at the minimum. Drones or satellites offer attractive alternatives, based on this single factor.

  • Time available considers the DRM phase: there is generally more time in the risk-knowledge and analysis phases, and less in the preparedness and response phases. For example, there is less time to task a satellite during the response phase when time is of the essence, making drone imagery more expedient.

Next, an impact score was assigned based on each technology’s facilitation of early warning and early action, taking the following factors into account:

  • Technologies that speed up a step have a positive impact. Mobile money, for example, transfers funds quicker than the alternatives.

  • Technologies that cover greater geographic areas, people or time have a positive impact. USSD-based surveys cover more people than door-to-door surveys or possibly even smartphone-based surveys.

  • Technologies that improve accuracy will have a positive impact. Machine-learning predictive models that incorporate many more variables are more useful than traditional models that track fewer variables.
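To make the scoring concrete, the sketch below averages hypothetical 1-5 ratings across the complexity and impact factors listed above and ranks a handful of technologies. The numbers are invented for demonstration and do not reflect the actual ratings behind Figure 2.

```python
# Hypothetical 1-5 ratings (5 = most complex / highest impact), averaged into
# the two axes of the complexity-impact chart. Invented for demonstration only.
COMPLEXITY_FACTORS = ["regulation", "digital_divide", "skills", "data_management",
                      "cost", "maturity", "operational_complexity", "time_available"]
IMPACT_FACTORS = ["speed", "coverage", "accuracy"]

technologies = {
    "satellite_imagery": {"regulation": 1, "digital_divide": 2, "skills": 3,
                          "data_management": 3, "cost": 2, "maturity": 1,
                          "operational_complexity": 2, "time_available": 2,
                          "speed": 3, "coverage": 5, "accuracy": 3},
    "drone_imagery":     {"regulation": 4, "digital_divide": 3, "skills": 4,
                          "data_management": 3, "cost": 3, "maturity": 3,
                          "operational_complexity": 4, "time_available": 2,
                          "speed": 4, "coverage": 2, "accuracy": 5},
    "ussd_surveys":      {"regulation": 2, "digital_divide": 2, "skills": 2,
                          "data_management": 3, "cost": 1, "maturity": 2,
                          "operational_complexity": 2, "time_available": 3,
                          "speed": 4, "coverage": 4, "accuracy": 3},
}


def axis_score(scores: dict, factors: list) -> float:
    """Average the ratings for one axis of the chart."""
    return sum(scores[f] for f in factors) / len(factors)


for name, scores in technologies.items():
    complexity = axis_score(scores, COMPLEXITY_FACTORS)
    impact = axis_score(scores, IMPACT_FACTORS)
    print(f"{name:18s} complexity={complexity:.1f} impact={impact:.1f} "
          f"impact/complexity={impact / complexity:.2f}")
```

Each technology would then be plotted on the two axes, allowing policymakers to pick the highest-impact or least complex option depending on their priorities.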

The complexity-impact model quantifies the relative outcomes of adopting one technology over another, but it does not facilitate the deeper discussion about how to proceed from a starting line to tech-enabled use cases. The following decision trees facilitate this discussion and should be used in conjunction with the complexity-impact model.

Decision Trees

The three decision trees tie the use-case library to the complexity-impact model. Leveraging a use-case approach, the decision trees walk policymakers through the process of adopting a technology and the key aspects of that use case. They depict three such use cases to exemplify how best to determine the appropriate technologies to invest in.

Figure 3

Three decision trees to help policymakers determine which technologies they should be using 


These tools should be considered a starting point by policymakers. There is considerable country-specific nuance that cannot be accounted for in a generalised model. Policymakers should closely examine the complexity-impact methodology to see which of the factors present the greatest hurdle for their country. For example, in some countries, the digital divide may not present as much of a hurdle as in others. Moreover, policymakers should be prepared to adopt more than one technology per use case, depending on their analysis of complexity-impact and decision-tree answers.

However, some general conclusions can be drawn. Governments must prioritise technologies that facilitate insights, with policymakers identifying which risk-informed insights can be drawn from data already available and which additional data should be collected.

Since outputs from earlier phases are fundamental to nearly everything else in early warning and early action, more skills and time should be invested earlier in the DRM process. Experience from developing tech-enabled use cases can then be leveraged later in phases where crisis-related time pressures add another layer of complexity.


Chapter 6

Recommendations for Adopting and Scaling Technologies

Beyond knowing which technologies to use and when, policymakers still face two key challenges to adopting and scaling them: a disconnect between national systems and externally developed technologies, and ineffectual data-management practices.

Identifying the Barriers

Disconnect with national disaster-risk management efforts. Early warning and early action are a tightly linked chain of steps involving gathering intelligence, analysing it and then directing actions. Each step requires the previous one to provide clear and timely outputs. But many tech-led approaches to disaster-preparedness and response run in parallel to existing government structures and systems, and in isolation from each other. This fragmentation leads to a lack of interoperability, which creates duplicate efforts and friction between the links in the early-warning chain, resulting in costly inefficiencies for end users. A recent review of disaster-risk-management policies by Indonesia’s Universitas Gadjah Mada cited as a key challenge: “...weak coordination, cooperation and linkages among the sectors related to disaster risk reduction [and] absence of consensus regarding terminology, and limited coordination between stakeholders.” In addition, standalone initiatives hinder scale-up, independent evaluation and government accountability. The dynamics behind this barrier are investigated further in our next paper.

Ineffectual data-management practices. Technology can facilitate faster and more effective collection, storage and transmission of data. However, clarity is required on who is responsible for collecting and holding the data while ensuring they remain up-to-date and in a machine-readable format. For example, an app developed to direct health care in the aftermath of a disaster fell short because of the absence of machine-readable, health-care-related indicators and no clear lines of responsibility for filling the gap. After collection, how and whether data can be shared in a standardised format, especially when owned by private-sector stakeholders, such as mobile-network operators, must be determined. Researchers have cited this difficulty: “Getting access to the geographic information (mobile phone tower location) essential for spatio-temporal analysis and mapping of call detail records data, which is not necessarily included as standard practice by mobile operator databases…[and released by operators] in different formats.”

Mobile network operators (MNOs) may have rational, proprietary reasons for not sharing data, but collecting data from local government authorities can be just as complicated. In one paper, researchers said that “obtaining … data owned by local agents [is] … a time-consuming and expensive process, especially if models are developed off-site.” In addition, an often overlooked issue in data management is the ethical treatment of data subjects. Tech-enabled initiatives, with their potential to collect vast amounts of personally identifiable data, are responsible for adhering to the laws and policies that protect against data misuse. This is especially important in humanitarian activities because data subjects include marginalised communities at, perhaps, one of the most vulnerable points of their lives. A review of predictive analytics within humanitarian action discovered that “not covered explicitly in the majority of initiatives’ documentation was the crucial issue of data safeguarding.” Without complete knowledge of available data sources, streamlined procedures for sharing and mechanisms to prevent misuse of data, tech-enabled disaster-preparedness and response efforts run the risk of delivering incomplete services, triggering unintended consequences for beneficiaries, and/or lowering trust in authorities.

Creating Integration Across Systems

Governments must take three approaches to build a more connected and integrated disaster-risk management approach:

Leverage open source, open data and open API. Governments must encourage open-source, open data and open API-based tools to ensure future integration and collaboration. The Convergence Initiative offers a useful example for developing an interoperable, integrated ecosystem in the social-protection sector. Open-source software prevents costly software lock-ins and makes it easier to build in a modular manner, pool resources and outsource specific functions of a tech stack to those with greater expertise and experience. Subsequently, open-source tools make scaling more cost effective. However, the cost of developing and maintaining open-source software can be high or account for a significant proportion of budgets, and it is equally important for governments to identify “best-of-breed” solutions with a strong, responsible team, such as the oft-cited OpenDataKit, which is a tool to collect, manage and use data.

Additionally, tech-enabled initiatives must demand and seek out open-data sources while also making their own data available for use, privacy issues notwithstanding. Open data are especially important in building predictive models in the risk-knowledge and analysis phases. Leveraging open data sets up a virtuous cycle: giving greater access to data spurs more tech-enabled use cases that then build further evidence of impact, and subsequent interest in more sustainable, systematic data collection.

Tech-enabled initiatives must leverage and provide tiered access to their platforms or outputs via standardised open APIs. APIs are “code that acts as an intermediary between two different pieces of software and enables them to communicate with each other.” This promotes an ecosystem of innovation in disaster preparedness and response by allowing other innovators to build on previous work. Research projects from Indonesia and India exemplify good use of APIs to build sophisticated tools where tweets are tracked to deepen understanding about crisis-related behaviour and to crowdsource taxonomy for time-critical crisis information.
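As a purely illustrative example of what consuming such a standardised open API might look like, the sketch below queries a placeholder endpoint with a public-tier token. The URL, parameters and response fields are assumptions, not a real NDMA or ITU service.

```python
import requests

BASE_URL = "https://api.example-ndma.org/v1"  # placeholder URL, not a real service
API_KEY = "public-tier-token"                 # tiered access: public tier

try:
    resp = requests.get(
        f"{BASE_URL}/hazards/flood",
        params={"region": "river-basin-07", "window_hours": 72},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    for alert in resp.json().get("alerts", []):
        print(alert["district"], alert["probability"], alert["issued_at"])
except requests.RequestException as exc:
    print(f"placeholder endpoint not reachable: {exc}")
```

Tiered tokens allow the same endpoint to serve open public data alongside more sensitive outputs reserved for government and humanitarian partners.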

Align closely with NDMAs. Tech-enabled initiatives should align with the standard procedures of existing National Disaster Management Agencies (NDMAs). According to a 2019 document, published to coincide with the third ITU Global Forum on Emergency Telecommunications (GET-19), “sustainability dictates that governments themselves are best placed to know how best to utilise disruptive technologies under different contextual environments. This is becoming particularly important for coordination purposes, given the variety of technology tools in the hands of different groups.” To ensure eventual government ownership, it is essential that providers of tech-enabled solutions align their approaches with existing government processes and approaches and follow governance structures already in place.

Be community driven and problem led: Complete and thorough integration cannot happen without the participation of those communities most likely to be impacted. Climate-related shocks are often highly local, and the needs and capacity of impacted communities are essential for effective early-warning, early-action systems. Proposed tech-enabled solutions should be problem led and “get the job done” for the community. Tools that do not meet the community’s needs or capabilities will be abandoned in favour of homegrown solutions, exacerbating fragmentation and forgoing external expertise. Suitable hybrid approaches of tech and human intervention should not be overlooked. For example, unforeseeable complications prevented Togo’s prize-winning prototype of an algorithm-derived flood-prediction model from sending warnings to villages downstream of a dam. While the prototype is being further developed as a priority for the Togo government, regular flood warnings continue through ongoing collaboration between humanitarian teams and dam operators, who interpret and share data updates in real time via WhatsApp. This exemplifies the best possible and achievable outcome until the more sophisticated model can be re-implemented: a tech-enabled solution that meets end-users’ needs and capabilities.

Strengthening Data Management for Tech-Led Early Warning and Action

Data are so vital to the effective running of tech-enabled early-warning and early-action systems that their collection and management must be a government-led priority. According to the UN: “In order to scale and systematise the use of mobile data, the lead position needs to shift to policymakers and decision-makers.” This holds true for all data relevant to early warning and early action. Policymakers should draw on the expertise in responsible data practices developed by organisations such as the Centre for Humanitarian Data, which issued a Data Responsibility Guidance document with complementary guidance notes. These provide support for best data-management practices for disaster-preparedness and response efforts.

In particular, governments need to:

Map existing data. Using a problem-led framework, governments should identify the data needed to implement early-warning and early-action efforts, and then systematically identify where the data are located (within government, humanitarian organisations or other third parties), ensure they are in a machine-readable format, and define the processes through which they can be obtained in an efficient and scalable way. Data should be catalogued with appropriate metadata and standardised formatting, starting with government-owned data. A “wish list” of missing and non-digitised data should be created to eventually be included in funding requests. The potential benefits of such an exercise are summarised here: “While there is not an absolute lack of data in LMICs, decentralised and uncoordinated efforts result in duplication of mapping activities, as indicated by multiple custodians having different versions of the same dataset.”
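A data-mapping exercise of this kind can be captured in a lightweight catalogue. The sketch below shows one possible record structure and a “wish list” of datasets that are not yet machine readable; the field names and example entries are assumptions rather than an established metadata standard.

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class DatasetRecord:
    name: str
    custodian: str          # e.g. ministry, NGO or mobile-network operator
    category: str           # hazard | exposure | vulnerability
    machine_readable: bool
    update_frequency: str
    access_process: str


# Example entries are invented; a real catalogue would start with government-owned data.
catalogue = [
    DatasetRecord("river_gauge_levels", "Meteorological Department",
                  "hazard", True, "hourly", "internal API"),
    DatasetRecord("social_protection_registry", "Ministry of Social Affairs",
                  "vulnerability", False, "annual", "formal request"),
]

# "Wish list" of gaps: datasets that are not yet machine readable.
wish_list = [d.name for d in catalogue if not d.machine_readable]
print(json.dumps([asdict(d) for d in catalogue], indent=2))
print("digitisation wish list:", wish_list)
```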

Share data. Government entities and disaster-relief organisations wishing to access data held by third parties should not have to replicate their efforts with each owner of the data for each crisis. These data should be categorised as necessary for humanitarian purposes and included in a systematic review by governments. Governments must engage with the data owners to build fair and sustainable humanitarian-driven agreements. This is for the mutual benefit of the owners of the data (who are under increasing pressure to share data but may not be able to meet the crisis-related deadlines) as much as for governments and the broader disaster-risk management ecosystem. By engaging with private-sector stakeholders in a unified way, governments set up consistent expectations and a streamlined process for data-sharing. Data-sharing agreements should include minimal standards and governments should take creative approaches as to how data are shared. For example, rather than MNOs transferring their data to external parties, it may be preferable and just as effective for the MNO to run the algorithm safely from behind its own firewall. While not always feasible, this demonstrates the sort of compromises that are possible in an atmosphere of goodwill and cooperation.
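The behind-the-firewall pattern mentioned above can be illustrated with a toy example: the operator runs the analysis over its own call-detail records and releases only small-cell-suppressed aggregates. The records, field names and suppression rule below are synthetic assumptions, not a description of any operator's system.

```python
from collections import Counter

# Synthetic call-detail records held inside the operator's systems only.
RAW_CDR = [
    {"subscriber": "a1", "tower": "T-14", "hour": 9},
    {"subscriber": "b2", "tower": "T-14", "hour": 9},
    {"subscriber": "c3", "tower": "T-22", "hour": 10},
]


def run_behind_firewall(records: list, min_count: int = 2) -> dict:
    """Aggregate per-tower counts and suppress small cells before release."""
    counts = Counter(r["tower"] for r in records)
    return {tower: n for tower, n in counts.items() if n >= min_count}


# Only this aggregate crosses the firewall to the DRM analysts.
print(run_behind_firewall(RAW_CDR))  # {'T-14': 2}
```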

Conclusion

Technology can transform disaster management, saving lives and livelihoods, and reducing costs from damage and loss through early warning and early action. For governments to use disruptive tech-driven approaches to transform disaster-management systems, they need to know how technologies support and strengthen early-warning and early-action efforts, which different technological solutions should be used under which circumstances, and how to implement strong coordination and data-management processes and policies.


Chapter 7

Acknowledgements

This paper benefited from the insights of the following experts in disaster-risk management and technological innovation:

  • Omar Abou-Samra at the American Red Cross

  • Eric Anderson at NASA

  • Valentina Barca, independent expert on social protection

  • Victoria Gonsior at ODI and advisor to the Mayor of Freetown

  • Innocent Maholi at the Humanitarian OpenStreetMap Team

  • Patricia Nying'uro at the Kenya Meteorological Department

  • Josee Poirier at the Centre for Humanitarian Data

  • Andrew Schroeder at Direct Relief

  • Josh Woodard at USAID

