PIL in Punjab & Haryana High Court seeks strict law to curb deepfake content
Terming deepfake as threat to democratic processes, petitioner demands strict punishment for the offence
A Public Interest Litigation (PIL) has been filed in the Punjab and Haryana High Court seeking a direction to the government to frame comprehensive standalone legislation to curb the creation and dissemination of malicious deepfake content, prescribing enhanced punishment of up to seven years’ imprisonment with a heavy fine.
The PIL has been filed by advocate Ravinder Singh Dhull, a resident of Sector 20, Panchkula.
He has made the Ministry of Electronics and Information Technology, the Ministry of Information and Broadcasting, the Ministry of Home Affairs, the States of Haryana and Punjab, and the UT of Chandigarh parties as respondents.
Dhull, in the petition, said deepfakes are hyper-realistic digital forgeries created using artificial intelligence, particularly deep learning algorithms, which can manipulate audio, video, and images to depict real individuals saying or doing things they never said or did.
He termed deepfakes a threat to democratic processes and national security, saying manipulated videos and audio of political leaders making inflammatory statements can influence electoral outcomes, incite violence, and undermine public faith in democratic institutions.
Deepfakes can be weaponised by hostile state and non-state actors to create fake communications from military or government officials, potentially triggering security incidents.
India has witnessed an alarming 550% surge in deepfake incidents between 2019 and 2024, with projected financial losses estimated at Rs 70,000 crore by 2025.
The technology has been weaponised to produce non-consensual intimate imagery targeting women, spread fabricated political content during elections, commit financial fraud through voice cloning, and destroy the lives of ordinary citizens, as evidenced by recent incidents reported in Punjab.
The current legal framework in India, comprising the Information Technology Act, 2000, the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and the Bharatiya Nyaya Sanhita, 2023, is wholly inadequate to address the unique challenges posed by this technology. The MeitY Advisory dated March 15, 2024, being merely recommendatory, lacks statutory force.
The proposed amendments to the IT Rules, 2021, while addressing labelling and metadata obligations, fail to prescribe enhanced criminal penalties, provide civil remedies, mandate the blocking of deepfake creation platforms, establish rapid takedown mechanisms, or create dedicated institutional frameworks.
A comparative analysis with the European Union, the United States, China, South Korea, the United Kingdom, and France demonstrates that India lags significantly behind in regulating this technology.
He further seeks directions to the States of Haryana and Punjab and the UT of Chandigarh to take concrete measures for the rehabilitation and protection of victims of deepfake crimes, including the establishment of dedicated Cyber Crime Cells, Victim Compensation Schemes, free counselling and psychological rehabilitation services, dedicated helplines, free legal aid, cyber awareness programmes in educational institutions, and sensitisation of police officers and judicial officers.
He said there is a need to frame legislation on the pattern of the EU Artificial Intelligence Act, 2024, the United States Take It Down Act, 2025, the Defiance Act, and South Korea’s recent legislative amendments.
He also demanded the establishment of civil remedies, including statutory damages of not less than Rs 10,00,000 for victims.
He emphasised the urgent need for a comprehensive legislative and regulatory framework to address synthetic media and deepfakes.
This framework should clearly define and classify such content; mandate disclosure and watermarking requirements for AI-generated material; provide for the blocking or barring of apps, websites, and platforms that enable the creation of deepfake content without adequate safeguards; establish technical standards for detection and authentication; and impose enhanced criminal penalties proportionate to the severity of the harm caused.