There’s a tendency in any field of business (and life!) to focus on the newest, shiniest objects. Just like the obsession over the release of the latest iPad, the ICT4D community holds a special place in its heart for the newest technology offerings. Machine learning, blockchain, and the internet of things (IoT) come to mind today, whereas in years past it was crowdsourcing, social media, or smartphones.
This focus on new technology is very useful to international development, ensuring proper integration of technologies as they are released, as opposed to waiting years before effective adoption. That being said, I do worry that a focus on the latest and greatest can be detrimental to effective use of what may be considered legacy methods.
Getting Basic Data Right
One example comes from the field of data science. Machine learning methods have gained a lot of attention in recent years, and for good reason. Our team has used machine learning for some very discrete but powerful purposes, and created efficiencies in our organization that would not previously have been possible. However, effective machine learning requires a large amount of training data—something not all international development organizations are likely to have. Fortunately, there are many non-machine learning analytical functions that are equally urgent, if not more so, for organizations to tackle.
Descriptive statistics about core business operations such as program expenditure can be critical to a program’s ability to make agile decisions. Programs also rely on knowledge of country demographics, economics, agricultural production levels, or political sentiment to develop theories of change. At DAI, our team uses a decision-support method we call site selection. But most high-quality data is available only at the country level, rather than broken out by sub-region. And this type of data management is a lot of work, requiring both an in-depth understanding of the thematic area of focus and of the ways data and technology can be leveraged to develop new insights.
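To make the point concrete, here is a minimal sketch of the kind of descriptive analysis described above—summarizing program expenditure by sub-region. The data values, region names, and column names are invented for illustration only:

```python
import pandas as pd

# Hypothetical program expenditure records (illustrative figures only)
records = pd.DataFrame({
    "region": ["North", "North", "South", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q2"],
    "spend_usd": [12000, 15500, 9800, 11200, 10400],
})

# Simple descriptive statistics by sub-region: total, average, and spread
summary = records.groupby("region")["spend_usd"].agg(["sum", "mean", "std"])
print(summary)
```

Even a basic summary table like this—no machine learning required—can surface the expenditure patterns a program manager needs for agile decisions.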
Fortunately, there are some very promising initiatives that are bringing the fundamentals of data analysis into focus.
Building Your Skills
First, it is easier than ever to learn data analysis skills. Online learning puts both content and subject matter expertise within easy reach. If you want to learn statistics, head over to DataCamp. Want a great overview of data collection technologies? TechChange has a four-week course for that. Want to learn web development? Khan Academy has free introductory courses that will set you up for more in-depth offerings from sites such as Codecademy or Team Treehouse.
Universities and bootcamps are also catching on to this need and focusing on mid-career professionals. Code Partners, a new coding academy in Montgomery County, Maryland, was launched this past year by DAI and partners; it offers a 20-week fast-track educational environment for aspiring developers. While some may argue that there’s no replacement for a computer science or statistics degree, the barrier to entry for data analysis and technology knowledge is lower than ever.
Big Orgs Take Data Seriously
Second, major development organizations are taking data analysis seriously. Whether through embedded analytics teams or by funding the development of public good datasets, both private and public organizations increasingly invest in or hire staff dedicated to quality data analysis. At DAI we have a growing cadre of programmers and data analysts, whether within our Center for Digital Acceleration or within our monitoring, evaluation, and learning practice, Managing for Development Results.
Government organizations are taking public good data analysis seriously, too. The State Department’s MapGive program launched in 2014, and it continues to coordinate efforts around OpenStreetMap and to offer free satellite imagery to agencies responding to humanitarian emergencies through its Imagery to the Crowd (IttC) initiative. These initiatives stand in parallel to the U.S. Agency for International Development (USAID)-funded YouthMappers project, which organizes a global community of learners, researchers, educators, and scholars to create and use geographic data that addresses locally defined development challenges worldwide.
Professional Data Use Policies
In addition to developing programs, USAID is taking data to the policy level. This summer USAID released ADS 579, which provides guidance for complying with the Agency’s open data policy. It states that “all quantitative data collected by USAID or one of the Agency’s contractors… must be uploaded and stored in a central database.” The policy and its addenda detail the way data should be structured, offering clear data management guidance and a vision for how the data can be used to support policy and program decision-making. While some may see this as another bureaucratic box to check, I see it as a commitment to using data for decision-making.
The Rise of Business Intelligence Tools
However, the rise of business intelligence tools brings both risks and rewards, as they open the door for people who are not programmers or database managers to get involved in descriptive data analysis. One word of caution: business intelligence tools can almost make data analysis too easy, allowing users to explore data without much structure and to infer relationships where none exist.
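The caution above is easy to demonstrate. In the simulated sketch below, two series are generated completely independently, yet because both drift over time they can still show a sizable correlation—exactly the kind of pattern a dashboard user might mistake for a real relationship:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two completely independent random walks -- there is no real
# relationship between them by construction.
a = rng.normal(size=200).cumsum()
b = rng.normal(size=200).cumsum()

# Trending series like these often produce a Pearson correlation far
# from zero, which is statistical noise, not a genuine link.
r = np.corrcoef(a, b)[0, 1]
print(f"correlation between unrelated series: {r:.2f}")
```

This is why unstructured exploration in a BI tool benefits from the same basic statistical literacy the rest of this piece argues for.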
The ICT4D community will continue to explore the latest technology, working to bring innovations into our practice in ethical and (hopefully) efficient ways. But as we do so, I’m determined that we keep putting energy into the fundamental data and technology capabilities that strengthen projects and policies geared toward development and diplomacy efforts around the globe.