Context: The Ministry of Electronics and Information Technology (MeitY) has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2025.
The amendment aims to prevent the misuse of Artificial Intelligence (AI)—particularly deepfakes, misinformation, and election-related manipulation—by mandating greater transparency and accountability in online content moderation.
Objectives of the Amendment
- Prevent the spread of synthetic or manipulated media.
- Ensure user awareness about AI-generated or altered content.
- Strengthen oversight and accountability in online content blocking.
- Maintain a balance between innovation and digital safety.
Key Provisions of IT Amendment Rules 2025
1. Authority Restriction
Only senior officials can issue takedown notices:
- Joint Secretary (or above) in Ministries/Departments.
- Deputy Inspector General (DIG) or above in police departments.
This prevents misuse of takedown powers and ensures greater accountability in content regulation.
2. Reasoned Orders
Each takedown order must include:
- The statute or rule violated.
- The legal justification.
- The specific URL or content identifier to be removed.
This makes the process transparent and verifiable.
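The Rules do not prescribe a machine-readable format for such orders, but the three required elements above can be illustrated with a minimal, purely hypothetical sketch (the field names `statute`, `legal_justification`, and `content_identifier` are invented for this example, not taken from the Rules):

```python
# Hypothetical completeness check for a takedown order.
# The three required fields mirror the elements listed above.
REQUIRED_FIELDS = ("statute", "legal_justification", "content_identifier")

def is_complete_order(order: dict) -> bool:
    """Return True only if every required element is present and non-empty."""
    return all(order.get(field) for field in REQUIRED_FIELDS)

order = {
    "statute": "Rule 3(1)(d), IT Rules",
    "legal_justification": "Synthetic media impersonating a public figure",
    "content_identifier": "https://example.com/post/123",
}
assert is_complete_order(order)            # all three elements present
assert not is_complete_order({"statute": "Rule 3(1)(d)"})  # incomplete order
```

A real compliance system would of course involve legal review rather than a field check; the sketch only shows how the transparency requirement makes an order verifiable.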
3. Monthly Review
All takedown actions under Rule 3(1)(d) must be reviewed monthly by a senior officer not below the rank of Secretary, ensuring procedural compliance and preventing arbitrary censorship.
Regulating Synthetic & AI-Generated Content
Definition
“Synthetic information” refers to any content artificially created, generated, or algorithmically modified using a computer resource in a manner that makes it appear authentic or true.
Labelling Requirement
- Platforms must label all AI-generated or modified content to alert users about its artificial origin.
- This aims to build digital literacy and public trust in online spaces.
User Declaration & Verification
- Users must declare whether their uploaded content is AI-generated or altered.
- Significant Social Media Intermediaries (SSMIs)—those with more than 5 million registered users in India—must deploy tools to verify user declarations and detect synthetic content.
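The Rules do not specify how declarations and automated detection interact, but one plausible reading is that content gets labelled when either the user declares it synthetic or a detection tool flags it. The sketch below is a hypothetical illustration only; the names `Upload`, `needs_label`, and `detector_score`, and the 0.8 threshold, are assumptions, not part of the Rules:

```python
from dataclasses import dataclass

@dataclass
class Upload:
    content_id: str
    user_declared_synthetic: bool  # the user's mandatory declaration
    detector_score: float          # hypothetical classifier score in [0, 1]

def needs_label(upload: Upload, threshold: float = 0.8) -> bool:
    """Label content if the user declared it synthetic, or if an
    automated detector flags it despite a negative declaration."""
    return upload.user_declared_synthetic or upload.detector_score >= threshold

# A declared upload is labelled regardless of detector output.
assert needs_label(Upload("a1", True, 0.1)) is True
# An undeclared upload is labelled only when the detector is confident.
assert needs_label(Upload("a2", False, 0.9)) is True
assert needs_label(Upload("a3", False, 0.2)) is False
```

The design choice here (label on *either* signal) errs on the side of disclosure, which matches the amendment's transparency objective.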
Safe Harbour Protection
Platforms retain “safe harbour” immunity under Section 79 of the IT Act, 2000, if they act in good faith to identify and remove synthetic or manipulated content.
This provision incentivises proactive compliance while protecting intermediaries that act in good faith.
Significance
The IT Amendment Rules 2025 mark a critical step in responsible digital governance by:
- Curbing AI misuse and disinformation,
- Promoting accountable online regulation, and
- Safeguarding citizens’ rights to authentic information.
These amendments align with India’s broader goal of building a secure, transparent, and ethical AI ecosystem under the Digital India framework.
