The Digital Purdah: Navigating Feminist AI and Data Sovereignty in Pakistan

By Tanveer Ahmed

The word “purdah” carries generations of meaning for Pakistani women. It speaks of boundaries, of what can be shown and what must remain hidden, of who has permission to see and who does not. For centuries, these boundaries were physical: walls, veils, the careful architecture of separation.

Today, a different kind of purdah is taking shape. It is built not from fabric but from algorithms, not from tradition but from technology. And it may prove far more difficult to challenge.

When the Algorithm Knows Too Much

In late 2025, a social media trend swept through South Asia. Women uploaded their photographs to AI applications that transformed them into elegant digital portraits: soft lighting, traditional attire, eyes rendered with almost unsettling clarity. It felt playful, empowering even.

Then a Pakistani woman received her transformed image and noticed something disturbing. The AI had rendered a small mole on her arm, a detail that existed in real life but was not visible in the photograph she had submitted. The system had somehow inferred its presence.

This is not a technical glitch. It is a glimpse into a future where algorithms learn more about us than we consciously reveal.

In countries with robust data protection frameworks, such incidents trigger investigations. In Pakistan, where comprehensive data protection legislation has languished in draft form for years, they expose a profound vulnerability. Every photograph uploaded, every trend embraced adds to the reservoir of female imagery feeding global AI systems. Once images enter this ecosystem, they can be repurposed, analyzed, and used to train models that may later be deployed in ways no user anticipated or consented to.

The Architecture Women Were Never Asked About

Pakistan’s National Artificial Intelligence Policy, unveiled in 2025, was presented as a roadmap toward a technologically advanced future. Structured around six ambitious pillars, it promised progress, innovation, and inclusion.

But when civil society stakeholders gathered in Islamabad to scrutinize the framework, the assessment was sobering. Participants at a multi-stakeholder dialogue organized by IRADA pointed to critical gaps. The policy’s commitments to privacy, security, and inclusion remained dangerously vague, with no clear mechanisms for implementation.

Humaira Mufti of the National Commission on the Status of Women raised a concern that resonates across every conversation about women and technology in Pakistan. Despite rhetoric about inclusivity, the policy lacks concrete provisions for building women’s capacity or protecting them from the gendered dimensions of online disinformation. Without targeted training and explicit safeguards, women risk being further marginalized in the very digital future the policy claims to build.

The critique goes deeper than implementation. It questions whether women were ever meaningfully consulted in shaping the frameworks that will govern their digital lives.

The Deepfake Reality Already Here

The Digital Rights Foundation’s helpline received 3,171 complaints in 2024 alone. Women filed more than half of them. Since its inception in 2016, the organization has handled over 20,000 cases of technology-facilitated gender-based violence.

These figures capture only the women brave enough to report. Research by the same organization found that 70 percent of women fear the misuse of their images online. The gap between fear and reported harm is immense, filled with shame, social stigma, and the realistic assessment that justice systems offer little recourse.

Earlier this year, a young Pakistani content creator discovered that her Instagram photographs had been manipulated into explicit images. The fabricated visuals spread rapidly across messaging platforms. Instead of receiving support, she faced public shaming. In a society where women’s reputations are treated as family property, such violations carry consequences far beyond embarrassment: social exclusion, professional destruction, even physical danger.

Digital rights advocate Sadaf Khan articulates the fundamental problem: Pakistan’s existing legal frameworks were not designed for AI-generated harms. Most platforms operate from jurisdictions outside Pakistan’s reach, making accountability nearly impossible.

The Data That Was Never Safe

Between 2019 and 2023, personal data of more than 2.7 million citizens was stolen from NADRA’s regional offices. Millions of Pakistanis’ records sit in government databases without robust safeguards, clear access controls, or enforceable rights to challenge misuse.

The Digital Rights Foundation’s recent research on emerging technologies drew on focus groups with 79 participants, survey responses from 60 individuals, and interviews with experts. The findings reveal widespread anxiety: 65 percent of participants fear loss of privacy, 63 percent worry about disinformation, and half express concern about AI-enabled monitoring.

Women shared real experiences of being targeted by AI-generated deepfakes. Communities described how AI-fuelled disinformation exacerbated sectarian violence in Kurram. The overwhelming sentiment was not outrage but resignation. As one participant put it, “AI is already here; we just have to deal with it.”

Digital Colonialism’s New Face

Most AI systems used in Pakistan are developed elsewhere: in Silicon Valley, in Beijing, in labs that have never consulted the populations their products will affect. These systems are deployed in the Global South with minimal localization, carrying assumptions about privacy, social norms, and gender that do not translate.

Rights groups describe this dynamic as digital colonialism. Developing countries bear disproportionate social costs while also supplying cheap labor for data labelling and content moderation. In documented cases, workers are paid as little as $1.32 per hour to train the very AI systems that may later be used against their communities.

UNESCO’s Hamza Khan raises an additional concern: Pakistan’s Right to Information laws were written for an analog age. Without stronger transparency mechanisms, citizens will have no way to challenge misuse of their data or contest AI-driven decisions that affect their lives.

What Feminist Governance Requires

In January 2026, over 40 stakeholders gathered at a Digital Rights Foundation roundtable to identify priorities for people-centered technology governance. The consensus was clear: mandatory human rights impact assessments for AI deployments, passage of long-delayed data protection legislation, transparent content moderation, participatory oversight bodies, and protections for workers and journalists.

Dr. Saadia Ishtiaq Nauman of Fatima Jinnah Women University notes an often-overlooked resource. Pakistan’s universities are well connected to a diaspora of technologists engaged in groundbreaking work globally. Ayesha Arif, an AI engineer, adds that while Pakistan has the talent to build tools, success will depend on whether solutions are rooted in local contexts and audience needs.

The Path Forward

The Saree Portrait trend will fade from memory. Its warning should not. In societies where images can determine a woman’s fate, AI’s capacity to see, infer, and expose carries dangers that cannot be ignored.

Pakistan cannot afford to treat AI as an inevitable force to be passively received, nor can it simply adopt regulations designed for other societies. As Nighat Dad of the Digital Rights Foundation argues, governance must be built from the margins, centering the experiences of journalists facing censorship, women targeted by deepfakes, and workers displaced by automation.

The digital purdah being woven today is more insidious than its physical predecessor. It does not simply hide women. It exposes them, exploits them, and holds them hostage to systems they cannot see and cannot challenge.

Lifting it will require more than policy documents. It will require recognizing that data sovereignty is not ultimately about borders or servers. It is about bodies. And women’s bodies have been controlled for long enough.
