Essay I1

Critical issues—such as privacy—are often abstract to the user. Use clear language to describe them.

Implications

LEAD AUTHORS: Emrys Schoemaker and Bryan Pon

Photo of people sitting on a bench at a railway station

Privacy is universal but concepts differ

In carrying out this research on the everyday experience of negotiating identity transactions, we often struggled with the words we used to explore abstract concepts, such as privacy, with participants. You can see this in our essays on framing the very concept of identity, where, for example, Ganga, the puppeteer in Delhi, described her identity first and foremost as an artist. Similarly, translating concepts such as empowerment highlighted how ID systems can introduce new kinds of vulnerability, such as one respondent’s experience of bribing intermediaries in his attempt to register as scheduled caste to access employment opportunities. Many of the concepts involved in identity systems are hard to research because they are abstract. For example, how would you define “privacy”? Like so many of the terms in the debate around identity systems, it is a tough concept to describe: an abstraction with many different aspects. Translating abstract concepts into real-world language is a challenge, as we discussed in our essays on the material nature of digital artifacts, on questions of individual sovereignty, and on the nuanced ways in which gender shapes everyday identity practices. Conversations about abstract concepts and abstracted digital systems inevitably involve engaging with people who may not share the same conceptual vocabulary for describing their experiences. These and many similar challenges are not just methodological quirks of researching complex experiences; they are a core challenge in the design and deployment of inclusive, ethical digital systems, in the field of ID specifically but also beyond. This essay outlines the challenges of translating abstract concepts into language that people understand, and the significance of this for the development of ethical, principled ID systems.

The design of new identity systems, technologies, and artifacts has been strengthened by the emergence of common standards and principles, such as the Principles on Identification for Sustainable Development. A strength of these guiding documents is their emphasis on understanding user needs and concerns, and their principles on the use of digital identity technologies to help realize people’s “right to participate fully in their society and economy.”1 Yet these overarching frameworks lack the detail needed to operationalize them into concrete policy and tangible design. Without an understanding of users’ everyday experience of these abstract principles, policy and design run the risk of resting on faulty assumptions. A striking example is the question of privacy, about which we have heard it said that “the poor don’t care about privacy.” We believe that this kind of belief is, in part, the product of failing to ask users important questions in ways that translate abstract principles into everyday experience.

Getting this right is crucial, particularly around the issue of privacy. The Principles on Identification for Sustainable Development make two recommendations explicitly about privacy: #6, “Protecting user privacy and control through system design,” and #8, “Safeguarding data privacy, security, and user rights through a comprehensive legal and regulatory framework.” The Omidyar Network argues that privacy is central to building trust in new identity technologies, terming them “trust architectures.” We argue that translating abstract principles such as privacy into terms that users understand more concretely can mitigate the risk of “disempowering individuals.”2

Reframing privacy: harms and benefits, not PII

To operationalize the abstract concept of privacy, we drew on established social science research into privacy and developed questions that enabled people to share their experiences and perceptions of privacy in relation to the use of identity technologies. We found that privacy is often described as an abstract principle or in individualistic terms, such as concerns around the protection of personally identifiable information (PII), leading to questions that simply ask whether people care about privacy, or that frame personal information in financial terms. Even where the concept of privacy is understood, framing it around the financial dimensions of PII leads to answers that support the conclusion that the poor attach little value to privacy. When we asked key stakeholders for their views, a number highlighted the possibility that “Western” cultural norms around privacy might be dominating debates about privacy and the design of new identity systems.

This framing is, however, only one way of conceptualizing privacy, and it doesn’t translate well to every culture and context. Privacy scholars argue that there is no universal concept of privacy, that it “is too complicated a concept to be boiled down to a single essence.”3 Others argue that the cultural translation of privacy outside of the West further complicates traditional conceptions, especially around evolving technologies.4 Privacy involves experiences of validation, judgment, and legitimacy, concerning not just eligibility for services but also acceptance by community and peers. In order to operationalize the abstract concept of privacy in a way that elicited people’s actual experiences and perspectives, we drew on existing research on privacy in India and reframed privacy from an abstract principle, or a narrow focus on PII, to the harms people experience and perceive in relation to privacy failures.5 In addition to exploring the ways people managed and shared information, we asked people to identify the kinds of information they were concerned about protecting and their opinions on the harms that might result if their private information were revealed.

We found that many respondents were not concerned about identity theft and were comfortable with the idea of their personal information being collected by others, even when they were aware of the risk of it being stolen. For example, when we asked people whether they were concerned about losing their artifacts or having their personal information shared, they framed their response in economic terms and, being poor, felt they had nothing to lose. As Mansoor, a male street trader of woolens at a Bangalore market, said: “If a poor man’s ID like my ID is lost, it doesn’t matter…I have nothing to lose. I have no money.” Based on this abstract framing of privacy, it is understandable why many policymakers conclude that poor people don’t care about privacy and develop identity systems that place a low priority on it. Yet we believe that, to the contrary, everyone cares about privacy; understanding in what ways requires asking questions in the words and language that people use. It requires seeing the world through their eyes.

We also found that many respondents shared their personal identity credentials and were comfortable with the idea of their identity being in the hands of people they knew and trusted. Remember Rahul, Ganga’s son, who described how people share ration cards amongst each other: “We give our ration card to one of our neighbors and say ‘ok today you get your ration with this. Take it for a month.’” Where relationships are built on trust established over time, patterns of sharing personal information are common. Similarly, the use of shared personal identity artifacts also involves trusted relationships. Doddaraghu, the ration shop owner from Garudahalli in rural Karnataka, described why they accept people who bring family members’ cards to claim rations: “We keep seeing the people, who come and take ration from our shop. We interact with them every day.” We found that in many cases the sharing of personal information forms part of traditional practices and is built on trusted relationships.

We found that respondents placed real value on their privacy when it was described in terms of the harms that might arise from privacy breaches in which information exposure contravened cultural norms. For example, Ganga, a craftswoman from Rajasthan who sells handiwork at a Delhi crafts bazaar, and her son Rahul described their discomfort at having to share a portrait photograph when registering for an Aadhaar card. For Ganga, Rahul, and many other respondents, cultural norms demanded that the female face remain veiled, a demand in tension with Aadhaar and other ID systems’ requirement that a recognizable profile photo be supplied. By reframing privacy from an abstract concept to people’s everyday experience, we were able to identify how seemingly innocuous aspects of design, such as the provision of a photograph, contravened cultural norms.

Respondents identified health and financial information as categories of information they feared would cause harm if they could not control how it was revealed. When we then asked whether they would mind their neighbors accessing their medical records or bank accounts, most respondents said yes, and that it would cause some form of harm. For example, remember how Ayesha, the community health worker in North East India, speculated that women might stop sharing medical information with their doctor if they believed that medical records were linked to databases held by other people in different contexts. We found that people negotiate privacy as a constant shift between the need to reveal certain aspects of oneself and the need to be responsible toward one’s duties vis-à-vis the state, whether as a citizen, a micro-entrepreneur, or otherwise. People negotiate a dynamic interplay between a need for privacy and a duty to be visible to the state.

Operationalizing abstractions for principled policy and design

Reframing the abstract concept of privacy in terms of people’s experiences helped operationalize the concept and reveal otherwise hidden insights. For example, privacy is a highly gendered concept, with female experiences of privacy shaping behaviors and attitudes towards identity technologies; Rahul, the son of puppet maker Ganga, described how the presence of photographs on identity credentials violated cultural norms around female visibility. We also found that privacy can only be understood in the context of the relationships within which it operates, including but not limited to the state. The users we spoke to actively and selectively negotiated how and when they were visible to the state. For example, Biswaroop, a sari-seller from urban Bangalore, described how he chose not to report a theft in order to avoid registering a complaint, which would have revealed that his shop was illegal. In contrast, Devi, a street-side peanut seller, sought to register for multiple credentials in order to access the different forms of benefits linked to each.

In India, privacy policy is a contested area. On the one hand, deliberations on the Right to Privacy Bill have been at a standstill since the last meeting on it in 2015, apparently following concerns raised by the intelligence agencies.6 On the other, Additional Solicitor General Narasimha stated, in relation to the privacy of WhatsApp data, that personal data is a reflection of an individual’s personality and integral to dignity and life.7 More recently, the central Government of India stated to the Supreme Court that privacy was indeed a fundamental right, but a “wholly qualified”8 one. We suggest that taking a user-centered approach that reframes abstract concepts such as privacy in terms that people themselves use can help the development of policy that reflects people’s everyday experiences and needs.

First and foremost, our research shows that there are real differences in attitudes and behaviors toward privacy based on specific identity categories such as class, gender, and religion. Individual identity and the context of identity artifact use together shape how identity systems impact people’s lives. Yet the specificities of local contexts, such as the nuances of gendered privacy norms or the negotiation of the state’s demand for visibility, are difficult insights to obtain. Indeed, the ways in which individuals negotiate digital contexts are subtle, nuanced practices that are challenging for policy development and system design. Policymakers need to meet people where they are, with user research that doesn’t rely on abstractions, and that research has to cast a wide net to capture all of these perspectives.

People have different ideas about what information is sensitive and what isn’t, as we showed with our examples of the cultural specificity around images of women and concerns about possible harms resulting from privacy breaches. The best identity systems would follow principles of minimal disclosure, limiting what data is shared during identification and authentication, and would ideally use zero-knowledge proofs to conduct authentication transactions so that the requesting entity receives only a “yes/no” answer and no PII. Such measures would make significant contributions towards strengthening individual agency and realizing the promise of empowerment enshrined in the Principles on Identification for Sustainable Development.
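The data-minimization idea above can be sketched in code. The following is a minimal illustration of the “yes/no, not PII” design principle only; it is not a zero-knowledge proof, which would rely on cryptographic protocols rather than a trusted provider. All names here (`IdentityProvider`, `IdentityRecord`, `verify`, and so on) are hypothetical, chosen for this sketch. The relying party submits a predicate, such as an age-eligibility check, and receives a single boolean; the underlying attributes never leave the provider.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class IdentityRecord:
    # PII held only by the identity provider; never released to relying parties.
    name: str
    birth_year: int
    address: str

class IdentityProvider:
    """Toy provider that answers predicate queries with yes/no only."""

    def __init__(self) -> None:
        self._records: dict[str, IdentityRecord] = {}

    def enroll(self, user_id: str, record: IdentityRecord) -> None:
        self._records[user_id] = record

    def verify(self, user_id: str,
               predicate: Callable[[IdentityRecord], bool]) -> bool:
        # The relying party learns only whether the predicate holds,
        # not the attribute values it was evaluated against.
        record = self._records.get(user_id)
        return record is not None and predicate(record)

provider = IdentityProvider()
provider.enroll("u1", IdentityRecord("Asha", 1990, "Bangalore"))

# The relying party learns eligibility, not the birth year itself.
is_adult = provider.verify("u1", lambda r: r.birth_year <= 2007)
```

The design choice worth noting is that `verify` returns a bare boolean: even a well-meaning relying party cannot log or leak attributes it never receives, which is the minimal-disclosure property the Principles’ privacy-by-design recommendation points toward.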

It’s not just the design of technologies that matters, of course, but also the social and political context. Importantly, this means looking beyond the individual user to the system as a whole. In the context of privacy, for example, it’s important to design systems that strengthen the ability of intermediaries to play enabling roles while mitigating their power to constrain individual benefits. Finally, as the debate around privacy legislation in the Indian courts shows, identity ecosystems will always function better if privacy laws and regulations are clear and well enforced.


  1. World Bank, “Principles on Identification for Sustainable Development: Toward the Digital Age” (Washington, DC: World Bank, 2017).

  2. Mike Kubzansky, “Digital Identity: No Empowerment without Privacy,” Omidyar Network, July 2014.

  3. Daniel J. Solove, “A Taxonomy of Privacy,” U. Pa. L. Rev. 154 (2005): 477.

  4. Syed Ishtiaque Ahmed et al., “Privacy, Security, and Surveillance in the Global South: A Study of Biometric Mobile SIM Registration in Bangladesh,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI ’17 (New York, NY, USA: ACM, 2017), 906–918, doi:10.1145/3025453.3025961.

  5. A. Sethia and N. Kher, “The Indian Identity Platform (‘Aadhaar’): The Implications for Citizen-Government Relationships in a Developing Country Context” (The Internet, Policy & Politics Conference, University of Oxford, 2016).

  6. Yatish Yadav, “Privacy Bill Held up due to Intel Agency Reservations—The New Indian Express,” The New Indian Express, March 7, 2017.

  7. Bar & Bench, “WhatsApp User Policy: Personal Data Integral to Life and Dignity, Centre to Supreme Court,” Bar & Bench, July 21, 2017.

  8. Krishnadas Rajagopal, “Privacy Is a Fundamental but Wholly Qualified Right: Centre,” The Hindu, July 26, 2017, sec. National.