Tue, 04 May 2021

The power of data to address humanitarian crises

At NTT DATA we’ve recently started working with the Foreign, Commonwealth and Development Office (FCDO), measuring the impact of their programmes. We’re also long-term partners of the UNHCR, where impact measurement is equally vital. Those impacts are felt across the world, through long-term projects to improve health and education, but also in humanitarian crisis situations.

A ‘crisis’ might suggest short-term, agile approaches that focus on rapid analysis of raw data. Sadly, however, some types of crisis can last years, so we must also address measures with a longer payback.

Just as I advocate data-driven solutions in other walks of life, being data-driven in a humanitarian crisis is equally vital. To take just one example, the Global Action Plan to End Statelessness calls for better quantitative information because international agencies are otherwise hampered in their emergency response and stateless persons can be passed over for services. Indeed, individuals with the smallest data footprint are often those in greatest need.

A second example is the competition for aid: using data to make the case is also crucial, both up-front in describing the need and afterwards in demonstrating accountability. Understanding the right KPIs to measure is part of the work we’re doing with the FCDO; well-chosen KPIs give focus to programmes, but they can be subject to Goodhart’s law (see Hannah Fry’s excellent article in The New Yorker, ‘What Data Can’t Do’). Helping the UK government with its initial response to the pandemic showed me that, even at a national level, we could be better prepared from a data perspective. This article sets out a few steps we can take to be better prepared to harness the power of data in a humanitarian crisis.

Centralised control and local agility

Humanitarian agencies are no different from many other organisations with disparate operating functions: they regularly face the challenge of balancing local autonomy and innovation with central control of data resources.

One principle we’ve seen work well (take a look at Ben Overton’s blog post on data mesh architecture) is ‘just enough central data governance’. Data definition, quality, ownership and mastering cut across the organisation, so they must be handled consistently and considered centrally. Other areas can be devolved for agility on the ground, with working groups established to determine points of mutual interest or to identify new cross-cutting concerns.

When it comes to data visualisation tooling, there are cost and consistency advantages to standardisation. However, if a new tool can add value locally and help address a specific problem, it should be adopted in a way that does not compromise the underlying data infrastructure. This is where the citizen developer or citizen data scientist can come into their own, rapidly developing applications using platforms like Microsoft Power Platform.

To react to a volatile situation on the ground, users need tools that let them interact with raw data and quickly make sense of what’s happening. Great can be the enemy of good if data pipelines have to be put in place before data can be analysed.

Augmented analytics, which uses AI to ingest data and infer relationships and trends, is a valuable tool in a time-critical humanitarian crisis. However, as Gartner analyst and fellow Rita Sallam has pointed out, greater data literacy is needed to get the most value out of these tools. This is another area where NTT DATA can help: our Data Literacy programme has enabled clients to make leaps in culture change around data. Greater data literacy is needed because the discovery process is not the same as a ‘dashboard’ tailored to the business problem. Users also need to realise that automated analytics cannot always reflect the subtleties of the real world, so it can be relied on for ideas but not hard facts. Moreover, having designed solutions like this for use in the field, I’m aware that some pre-processing is beneficial, even if only to make sure the data reaches a standard where the tool is most useful and that bad records are handled properly.
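To make that last point concrete, here is a minimal sketch (in Python, using pandas) of the kind of light pre-processing meant above. It is illustrative only: the file name, column names and validation rules are hypothetical, not part of any specific NTT DATA or UNHCR tooling.

# Illustrative pre-processing of hypothetical field reports before they are
# handed to an augmented-analytics tool. Column names are assumptions.
import pandas as pd

REQUIRED = ["site_id", "report_date", "people_assisted"]

def preprocess(path):
    """Return (clean, rejected) records from a raw CSV of field reports."""
    raw = pd.read_csv(path)

    # Coerce key fields; anything unparseable becomes missing.
    raw["report_date"] = pd.to_datetime(raw["report_date"], errors="coerce")
    raw["people_assisted"] = pd.to_numeric(raw["people_assisted"], errors="coerce")

    # A record is 'bad' if any required field is missing or the count is negative.
    bad = raw[REQUIRED].isna().any(axis=1) | (raw["people_assisted"] < 0)

    clean = raw[~bad].copy()
    clean["site_id"] = clean["site_id"].astype(str).str.strip().str.upper()
    rejected = raw[bad].copy()  # kept for follow-up, not silently dropped
    return clean, rejected

clean, rejected = preprocess("field_reports.csv")  # hypothetical file
print(f"{len(clean)} usable records, {len(rejected)} set aside for review")

Setting rejected records aside, rather than silently dropping them, matters in this context: as noted earlier, the people with the smallest data footprint can be those in greatest need.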

A second, incredibly valuable capability is to have standard reference datasets available, along with the means to clean and match new data against them. Again, a solution developed by a person with specific knowledge of the problem space will outperform generic AI. The time to consider these people and technology issues is before they are needed.
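As a simple illustration of matching against a reference dataset, the sketch below checks incoming place names against a small, hypothetical reference list using fuzzy string matching from the Python standard library. The names, threshold and matching approach are assumptions for the example, not a description of any particular agency’s data.

# Illustrative matching of incoming place names against a reference list.
from difflib import get_close_matches

REFERENCE_PLACES = ["Kakuma", "Dadaab", "Cox's Bazar", "Zaatari", "Bidi Bidi"]

def match_place(raw_name, cutoff=0.8):
    """Return the closest reference place name, or None if nothing is close enough."""
    candidates = get_close_matches(raw_name.strip().title(), REFERENCE_PLACES,
                                   n=1, cutoff=cutoff)
    return candidates[0] if candidates else None

print(match_place("  zaatari "))    # -> 'Zaatari'
print(match_place("Kakumaa"))       # -> 'Kakuma'
print(match_place("Unknown camp"))  # -> None

In practice, a human reviewer with field knowledge would tune both the reference list and the cutoff; that is exactly the point above about a person with specific knowledge of the problem space.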

New data sources

Where on-the-ground data collection is compromised by infrastructure damage, Earth Observation (EO) data can really come into its own. EO data is derived from satellites: much of it consists of images, but options such as radar are also available. The amount of EO data is burgeoning, and some of it is refreshed hourly. This allows for ‘near real-time’ situational awareness, augmented by AI that can recognise known entities such as people and vehicles. We are working with the start-up x-Sentience to push EO further, for example to look for what is changing on the ground or for correlations between EO data and more traditional datasets.
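As a toy illustration of the ‘what is changing on the ground’ idea (and emphatically not a description of x-Sentience’s methods), the sketch below assumes two co-registered satellite image bands of the same size, held as NumPy arrays with values already scaled to the 0–1 range, and reports how much of the scene has changed beyond a threshold.

# Toy change detection between two co-registered satellite image bands.
# Real EO pipelines handle registration, calibration and cloud masking first.
import numpy as np

def changed_fraction(before, after, threshold=0.2):
    """Fraction of pixels whose value changed by more than the threshold."""
    change_mask = np.abs(after.astype(float) - before.astype(float)) > threshold
    return float(change_mask.mean())

# Synthetic 100x100 'images': the second gains a bright block (new shelters, say).
rng = np.random.default_rng(0)
before = rng.uniform(0.0, 0.5, size=(100, 100))
after = before.copy()
after[40:60, 40:60] += 0.4

print(f"~{changed_fraction(before, after):.1%} of pixels changed")  # roughly 4%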

Personal data, even in a crisis

There are areas where shortcuts should not be taken, for example with personal data and biometrics. Biometrics can be incredibly useful where people are undocumented, and there are solutions for capturing biometric data in harsh environments. However, that data must be properly handled from the point of capture to the point of destruction. Humanitarian organisations are in a position to lead on this where local data protection legislation is not well developed.

Aid money in the right hands

Sadly, one person’s crisis is another person’s opportunity; the pandemic has taught us this. We don’t withhold aid simply because the unscrupulous might find a way to profit from it. Nevertheless, it is still important for those giving to have confidence that donations end up in the right hands. That applies at an individual and a government level, especially with competing demands for philanthropy. In collaboration with our partner Quantexa, we can offer counter-fraud solutions that look for anomalies in data and for individuals trying to hide their identity. Alternatively, our Blockchain Centre of Excellence has had recent success with a donor platform that, while preserving anonymity, gives strong guarantees that recipients have benefited.
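As a deliberately simplified illustration of what ‘looking for anomalies in data’ can mean (a toy example, not the Quantexa product or a delivered solution), the sketch below flags hypothetical disbursement amounts that sit far outside the typical range using a robust z-score.

# Simplified anomaly flagging on hypothetical aid disbursement amounts.
# Real counter-fraud work (entity resolution, network analysis) is far richer.
import numpy as np

def flag_anomalies(amounts, z_cutoff=3.5):
    """Return a boolean mask of amounts whose robust z-score exceeds the cutoff."""
    median = np.median(amounts)
    mad = np.median(np.abs(amounts - median)) or 1.0  # guard against zero spread
    robust_z = 0.6745 * (amounts - median) / mad
    return np.abs(robust_z) > z_cutoff

amounts = np.array([120, 110, 130, 125, 118, 5000, 122])  # one suspicious payment
print(amounts[flag_anomalies(amounts)])  # -> [5000]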

I often use the hashtag #datachangeslives, and it’s particularly appropriate here. We look forward to our continued client partnerships that enable data to be part of addressing our greatest humanitarian challenges.

