The Indian government has introduced a strict new rule for social media platforms.
From now on, companies must remove unlawful or harmful online content within three hours of receiving a government notice.
Earlier, platforms had up to 36 hours to act. The new timeline is much shorter — and among the fastest in the world.
The rule comes under amended Information Technology regulations and is part of a broader push to tighten digital oversight in India.
What the Three-Hour Rule Means
Under the updated rules, any platform that hosts user-generated content, including major global social media companies, must act quickly when it receives an official takedown order.
Here’s what they must do:
Remove or block access to the flagged content within three hours
Act on all types of unlawful material, including threats to public order, hate speech, defamation, and illegal activities
Upgrade their internal systems and teams to meet the faster deadline
If companies fail to comply, they risk losing certain legal protections under Indian law.
This means platforms must now process and review government notices almost immediately.
Why the Government Tightened the Rules
The government says the move is necessary to stop harmful content from spreading rapidly online.
In today’s fast-moving digital world, even a few hours can allow misinformation or illegal content to go viral.
By shortening the takedown window, authorities aim to:
Prevent harmful posts from spreading widely
Respond quickly to threats affecting public order or national security
Keep pace with the speed of online conversations
The rule is also part of a broader effort to regulate artificial intelligence and synthetic content.
Concerns Raised by Industry and Rights Groups
Not everyone supports the new timeline.
Legal experts and digital rights groups argue that a three-hour window leaves very little time for proper review.
They warn that:
Platforms may remove content too quickly without careful assessment
Automated systems may be used heavily, increasing the risk of mistakes
Legitimate content could be taken down due to fear of penalties
Some analysts believe overly strict deadlines could affect freedom of expression, especially in cases where content is complex or legally disputed.
AI Content and Stronger Compliance Rules
The three-hour rule is part of wider changes in India’s digital policy.
The new amendments also aim to:
Regulate AI-generated content
Require clear labeling of synthetic or AI-created posts
Push platforms to use stronger tools to detect illegal material
These changes will apply to both global and Indian platforms operating in the country.
With India being one of the world's largest social media markets, the new rules could significantly reshape how digital companies handle content moderation.
The coming months will show how platforms adapt to one of the strictest content takedown timelines globally.