This month I’ve been reflecting on all that has happened since this time last year, back when we were still blissfully unaware of the trials, trauma, and transformation 2020 would bring. Instead of focusing on the challenges we faced over the last several months, I’d like to highlight some relevant concepts for colleagues who are gaining momentum in understanding and incorporating digital technologies into their international development work. Below are some important topics (upon which my tech-savvy colleagues and others have written extensively, so check out the links!) that can be applied across our development projects and fieldwork. Consider this the sophomore edition of the “Digital Development for Newbies” series, but if you’re brand new to this topic, I encourage you to start with Part 1: The Basics.
Increased Digitalization = Increased Vulnerability
Stakeholders across the international development and humanitarian sectors continue to show increased interest in digitization (converting something to a digital format) and digitalization (converting processes or offline systems to the use of digital technologies). We’ve seen this demand from clients, government counterparts, and users alike. While such digital transformation can offer efficiency and transparency, it also brings increased cybersecurity risks. You may be tired of being told to strengthen your password or to add a second layer of authentication, but these tips are more important now than ever before.
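For the curious, that "second layer of authentication" usually means a time-based one-time password (TOTP), the six-digit codes produced by authenticator apps. The sketch below shows the idea as specified in RFC 6238, using only the Python standard library; the function name `totp` and its parameters are my own illustrative choices, not any particular app's API.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, period=30):
    """Derive a time-based one-time password (RFC 6238) from a shared secret.

    The server and the user's authenticator app hold the same secret; both
    compute the same short-lived code, so a stolen password alone is not enough.
    """
    key = base64.b32decode(secret_b32)
    # The "moving factor" is the number of 30-second periods since the epoch.
    now = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(now // period))
    digest = hmac.new(key, counter, "sha1").digest()
    # Dynamic truncation: pick 4 bytes based on the last nibble of the digest.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code changes every 30 seconds, an attacker who phishes a password still needs the secret (or the user's device) to log in, which is why this cheap measure blocks so many account-takeover attempts.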
Photo courtesy of Addie Ryan.
Application: While technological advancements continue at a rapid pace, regulations governing privacy and security lag far behind. No longer confined to expert coders and hackers, cybersecurity has become a shared responsibility. Each of us plays a key role in protecting not only our own personal data but also the systems and data of our colleagues and project beneficiaries. Cybersecurity should not be treated as an emergency intervention added to a program once a need is identified. Rather, it should be “baked in” to the project design from the start, before vulnerabilities are exposed, and kept top of mind throughout program implementation as well as closedown.
Takeaway: As international development professionals, we must consider privacy and cybersecurity when writing proposals, when training staff during project startup, and throughout implementation (for example, when the Government of Kosovo asks for support to implement an electronic procurement system, or when Central American farmers seek a mobile app that shares information on climate patterns). As you introduce digital tools in your programming, include a plan for educating users on safe practices and the risks that come with those tools.
Let’s Acknowledge and Confront the Digital Divide
Now more than ever, there is a sense that everyone is online. However, there remain significant gaps in digital access and digital literacy across populations based on myriad factors such as age, gender, income, urban-versus-rural location, and race. Communities and groups that are already marginalized risk becoming further disadvantaged in the digital age.
Application: So what role can development and tech professionals play to narrow rather than widen the digital divide? First, in our digital projects and programming we can acknowledge this divide and design interventions to bridge the gap, supporting excluded populations in gaining connectivity and digital literacy skills. Second, for any project—whether focused on health, agriculture, or supporting entrepreneurs—we can be intentional in designing our activities, outreach, and data collection to be inclusive of folks on both sides of this divide.
Takeaway: When designing activities, ask your team: What audience might you miss by pushing out behavior change campaigns only via social media? What data might you omit if conducting a small business market survey only via email? What population will be left unvaccinated if appointments can only be made via a website? How else could you reach those who are not online?
Disinformation is an Increasing Threat to Global Governance and Public Health
Digital technologies are rapidly expanding the reach and impact of misinformation (false information shared unknowingly) and disinformation (false information spread with malicious intent). Combating false information online has become a growing area of concern in the United States as well as in countries abroad; it is particularly common in information related to politics and COVID-19 but also prevalent in other areas such as climate change and racial justice. (Tip: My colleague just shared a great list of further reading on mis/disinfo.)
Application: How does this affect our work in international development? We are now seeing donors fund standalone programs that focus on this topic, such as the U.S. Agency for International Development’s efforts to mitigate Kremlin manipulation of information in Georgia and the U.K. Foreign, Commonwealth & Development Office’s work to tackle the spread of false information related to COVID-19 in Southeast Asia and Africa. This is an increasingly important element to consider in any governance or public health program.
Takeaway: Implementers of more traditional development activities—such as supporting free and transparent elections; conflict mitigation and stabilization; water, sanitation, and hygiene campaigns; and vaccination rollouts—should consider activities to increase community media literacy and combat the spread of mis- and disinformation, especially via the most rapid (i.e., online) channels.
Technology Can be Racist
Institutional racism has gained increased and long overdue attention in 2020. It should be no surprise that digital tools designed by humans can likewise reflect the known and unknown biases of the individuals who make up the design team, serve as test users, and provide training data. My colleague recently wrote about an infuriating example of such bias: pulse oximeters that give misleading and potentially fatal diagnostic readings for Black patients. Beyond individual human biases, artificial intelligence (AI) and machine learning (ML) algorithms can also show racial bias, particularly when the models are trained on homogeneous data. This problem is especially profound in facial recognition software.
Application: Such biases underscore the fact that digital tools do not necessarily work the same way across all geographic contexts and populations. Many countries where we work have burgeoning tech scenes, and innovative tools and skilled developers are often available in the local market who could support project needs with less risk of racial bias or irrelevance.
Takeaway: Before introducing a flashy app developed in Silicon Valley to community health workers in Ethiopia, research whether there are comparable solutions available locally. As you are shopping around or evaluating offers, ask the developers some probing questions: Was the app designed to address this particular problem (or was it designed for the sake of design and now in search of a market)? Do you have insights that indicate the anticipated users would be receptive to this type of technology? Has this tool ever been tested with the target user population? In the case of AI/ML technology, is the training data representative of the target community you hope to serve?
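That last question—whether training data is representative of the community you hope to serve—can be checked with very simple arithmetic: compare each group's share of the training data against its share of the target population. The sketch below is a minimal illustration; the function name `representation_gap` and the skin-tone groups and shares are hypothetical examples I made up, not real project data.

```python
from collections import Counter

def representation_gap(train_labels, population_shares):
    """For each group, return (share in training data) - (share in target
    population). Strongly negative values flag under-represented groups."""
    counts = Counter(train_labels)
    total = sum(counts.values())
    return {
        group: counts.get(group, 0) / total - expected
        for group, expected in population_shares.items()
    }

# Hypothetical example: training images labeled by skin-tone group,
# versus assumed shares of those groups in the target population.
train = ["light"] * 80 + ["medium"] * 15 + ["dark"] * 5
target = {"light": 0.4, "medium": 0.3, "dark": 0.3}
gaps = representation_gap(train, target)
```

Here the "dark" group makes up 5% of the training set but 30% of the population, a 25-point shortfall—exactly the kind of imbalance that produces facial recognition systems that fail for darker-skinned users. Asking a vendor for this one table is a cheap due-diligence step.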
Thanks for joining me on my own international development-to-digital development journey. Inspired to learn more? For further reflection on 2020 and links to dive deeper into various ICT4D hot topic areas, check out [email protected]’s 2020 Year in Review.