A voice cloned in seconds. A face swapped with a click. A video that never happened but looks real enough to fool millions. What once belonged to science fiction has become an everyday digital threat. Across the world, deepfakes – AI-generated audio and video fabrications – are no longer curiosities but weapons, fueling fraud, harassment, and political manipulation.
The numbers tell the story. A 2023 report by security firm Pindrop found that deepfake fraud attempts had surged by 3,000% in a single year, with businesses losing an average of $343,000 per incident. Keepnet Labs estimated in 2024 that a deepfake attack now occurs every five minutes globally, contributing to losses exceeding $4.6 billion. And behind those headlines is an even darker reality: watchdogs confirm that the majority of deepfake content is pornographic, and disproportionately targets women and girls.
From Elections to Everyday Life
Pakistan has not been spared. The 2024–25 election cycle saw deepfakes enter mainstream politics for the first time. Fact-checkers scrambled to verify the authenticity of viral clips, while rights groups warned that disinformation powered by synthetic media was undermining electoral integrity.
“Deepfakes are not just a tech issue; they are a democracy issue,” said a Karachi-based media analyst. “When people stop trusting what they hear or see, the very idea of shared truth collapses.”
Beyond politics, the damage is creeping into everyday lives. Women activists and journalists report having their images manipulated into explicit material. Community monitors in Punjab and Sindh have flagged synthetic videos designed to stoke sectarian tensions. Child-protection agencies warn that students are increasingly encountering manipulated images or voice clones used for pranks and bullying.
For parents, the risks feel overwhelming. “My 13-year-old son showed me an app that could swap his face into movie clips,” said a mother in Quetta. “It looks harmless, but what if the same tool is used to humiliate someone in his class? How would I even know?”
Why Pakistan is Especially Vulnerable
With one of the world’s youngest populations and a mobile-first internet culture, Pakistan faces a particular set of risks. More than two-thirds of its citizens are under 30, and smartphone penetration has outpaced media literacy education. For many first-time users, the ability to tell fact from fiction online is already a challenge – deepfakes add another layer of complexity.
The “liar’s dividend” makes matters worse. Once people realize content can be faked, bad actors can dismiss even real evidence as fabricated. In courtrooms, in political debates, and even in family disputes, the consequences are profound.
The Case for Media and Information Literacy
Global institutions like UNESCO have been urging governments to invest in media and information literacy (MIL) as the most sustainable defence. Instead of relying solely on technological detection tools – which are often one step behind – MIL teaches people to recognize manipulation, verify content, and build ethical habits around sharing information.
In practice, this means equipping students, parents, and teachers with skills: how to reverse-search an image, how to spot irregularities in audio, how to cross-check multiple sources before believing a viral clip. It also means conversations about consent, dignity, and digital ethics – especially critical in a context where women and children are often targets.
A Local Initiative Steps In – Mediatiz Foundation
In Pakistan, one of the few organizations attempting to meet this challenge is the Mediatiz Foundation. Established in 2024, the non-profit has introduced what it describes as the country’s first specialized K–12 curriculum on media and information literacy – Media Mind. The program uses a mix of classroom drills, online resources, and take-home materials for families. Modules cover not only misinformation but also digital privacy, online safety, and the ethics of sharing content.
Media Mind is running successfully in Balochistan’s public schools, where more than 3,000 schools have integrated lesson plans on misinformation, digital safety, and verification practices. Teachers there say the difference is noticeable. “Students are asking questions they never asked before – ‘Who made this video? Why is it going viral? Can we check if it’s real?’” said a secondary-school teacher in Pishin. “That curiosity is the first line of defence.”
Following its initial success, the curriculum is now being scaled to public schools in Khyber Pakhtunkhwa. It has also been introduced into the scheme of studies of select national-level private school systems.
Mediatiz has also developed AI Mind, a companion curriculum that introduces students to artificial intelligence and robotics, explaining how generative models work and why deepfakes can be so convincing. Paired with its Social & Emotional Learning (SEL) curriculum, the approach ensures that students are not just technically skilled but emotionally equipped to pause, verify, and seek help if targeted.
The Gaps That Remain
Despite these efforts, Pakistan’s broader response remains uneven. Laws addressing synthetic media are still under debate, platform safeguards are inconsistently enforced, and awareness among the general population remains low. Experts warn that without systemic adoption of media literacy, piecemeal interventions will struggle to keep up with the scale of the problem.
“Technology alone will not save us,” said a Lahore-based digital rights advocate. “We need a generation that understands both the potential and the risks of AI. That means starting in schools, now.”
A Problem That Won’t Wait
Deepfakes are not on the horizon – they are here. They are shaping elections, invading privacy, and eroding trust in institutions. For Pakistan, the challenge is amplified by demographics and digital habits, but so too is the opportunity: by embedding critical thinking and verification skills early, the country can build resilience where it matters most.
The work of organizations like the Mediatiz Foundation shows what is possible when local solutions meet global challenges. Yet the scale of the problem demands more: broader adoption by provincial governments, deeper collaboration with schools, and sustained investment in equipping young Pakistanis with the skills to navigate a world where seeing is no longer believing.
Until then, the cloned voices and fabricated faces will keep knocking at Pakistan’s digital doorstep. The question is whether society will be ready to answer.