“We, The Data” Holds a Human Rights Lens to the Digital Age
This post is part of our Digital Democracy series.
Aug 22, 2024
In “We, The Data,” author Wendy H. Wong examines how human rights apply in the digital age, making the case for extending their protection beyond our physical bodies. She explores the far-reaching implications of datafication for our rights and the actions we, as data creators and subjects, need to take to protect them. A human rights lens, she notes, gets us to the core of why regulating datafication matters and gives us globally accepted values through which to talk about it.
Exploring data rights, facial recognition technology, human rights after death, Big Tech, and data literacy, Wong makes the case for collective engagement as stakeholders to hold data collectors and processors to account.
While we are all voracious readers here in the DAI Center for Digital Acceleration, we are not literary experts. In this blog, part of the Digital Democracy Series, I offer a personal review of the book and consider how it aligns with our work in international development.
Data are “sticky”: We can’t decouple the data from the person. Wong uses this term throughout the book, explaining that data are “sticky” for four reasons:
- Data are co-created and inextricably linked to us as people.
- Data are mostly about mundane activities that don’t appear to be of interest to anyone. Our perception of what counts as “personal” data is outdated, because…
- Data are linked: our data are combined with other datasets to reveal patterns, so even the most mundane data say much about us.
- Data are easily replicated and transported, and last forever.
We are all data creators and thus are stakeholders, not just subjects of datafication. We facilitate the collection of data about us by using the websites, apps, and devices that collect these data. Yet for the vast majority of us, opting out of their use isn’t feasible: Our data are a part of who we are. Wong argues that current policies do not reflect how inextricably our data are linked to us, and that we are left out of conversations around technology policy, regulation, and ethics. These conversations matter, and we should be stakeholders in our own datafication.
We should be proactive in protecting our data, not just focus on what to do once data are already created. Wong argues that we should consider in advance how technologies can affect human rights, both positively and negatively; if we aren’t proactive, we can’t safeguard against transgressions. In other words, data collection is not a foregone conclusion to be regulated only after the fact. As co-creators of our data, we should have a voice, and we can only advocate effectively with a full understanding of the implications of sharing our data.
We should engage in protecting our data rights as a collective. Wong’s book is a call to action rather than a meditation on the personal impacts of datafication. Because big data gives the processing of personal data such significant collective impacts, she argues, we should stand against data abuses together.
Photo: Claudio Schwarz/Unsplash.
The Limits of Data Literacy
How do the main themes in the book apply to our work in development?
The author considers data literacy foundational to our rights: It makes the trade-offs of sharing our data clear and helps us advocate for better protection of those rights. Beyond risks at the individual level, Wong argues that data literacy is vital for communities to collectively stand against data abuses. However, Wong rightly recognizes that data literacy is not attainable for everyone: “Literacy is not a burden that individuals can take on without the practical opportunities and resources to become literate.” We know that digital literacy more broadly remains a key barrier to the adoption of digital tools for many, most notably women and girls and other marginalized populations. Low basic literacy and numeracy, restrictive gender and social norms, time and financial constraints, and a lack of content in local languages are just a few of the barriers to digital literacy that people experience throughout the world. So, while Wong argues that data literacy should be a human right (achieved through education in libraries and advocacy by civil society), this is optimistic for global majority countries, where basic digital literacy has yet to be achieved. As technology evolves, so too does the definition of what it means to be digitally literate. As the development community continues its efforts to increase digital literacy in global majority countries, we should ensure that data literacy is a core component of those efforts.
The Need for Collective Action
As noted above, Wong makes the case that to protect our human rights, we should collectively advocate for the ethical collection, use, and processing of our data. I wholeheartedly agree, yet collective action to protect our data rights will be a particular challenge in many global majority countries. Wong focuses on the role of civil society organizations in facilitating collective action for our data rights, yet in many of the countries in which DAI works, civil society capacity and financing are low. Data literacy, which Wong says forms the foundational knowledge for collective action, faces the challenges I noted above, and many civil society groups may lack knowledge of and expertise in digital rights and datafication. Supporting civil society to advocate for digital rights would need to begin with an understanding of the environment within which they operate.
Creating a Rights-Savvy Tech Workforce
The requirement for good data literacy should not fall on us alone: Big Tech and the technology workforce should be more literate on data rights. Wong argues that there should be norms around product development, ensuring that new products are created with their potential social, political, and cultural consequences in mind from inception. She also notes that there are not enough technologists with ethical and humanist training, suggesting that technology students should be trained in ethics, for instance by embedding ethics units in computer science courses. As she puts it: “Data experts need to have more literacy in the context into which their products go.” Does this point sound familiar? I had an “Aha!” moment reading this after writing my last blog. She explains it far more eloquently than I do, but the perspective that the technology workforce needs a deeper understanding of human rights is a global one.