Publishing or Transmitting Obscene Material in Electronic Form (Section 67)

Section 67 of the IT Act criminalises the publication of obscene material online. Learn about the legal definitions, harsh penalties, and how platforms can manage content risks in India.

May 21, 2012

Section 67 of the Information Technology Act, 2000 sets the legal boundary for what can be published or transmitted in digital form in India. It specifically targets "obscene" material — defined as anything lascivious or appealing to prurient interests that tends to deprave or corrupt those likely to see it. If you are managing a platform or sharing content online, understanding where the law draws the line is essential to avoid severe criminal penalties.

What the Law Defines as Obscenity

The legal test for obscenity under Section 67 focuses on the effect of the material on the viewer. It is not just about the content itself, but whether it has the potential to deprave or corrupt the audience likely to see it. This applies to any electronic record, including images, videos, text messages, and social media posts. In practice, courts look at the context of the publication and prevailing social standards to determine whether a specific piece of content crosses into illegal territory.

Penalties for First and Subsequent Convictions

The IT Act treats digital obscenity as a serious offence. For a first conviction, the law prescribes imprisonment for a term of up to three years and a fine that may extend to five lakh rupees. If the offence is repeated, the punishment increases significantly: a second or subsequent conviction can lead to imprisonment for up to five years and a fine of up to ten lakh rupees. These penalties, enhanced by the 2008 amendment, are designed to deter the spread of offensive content across Indian networks.

How Businesses and Platforms Must Respond

For businesses that host user-generated content, the risk of Section 67 violations is a constant reality. Relying on automated filters alone is rarely enough to ensure legal safety. You must have clear moderation policies and a responsive takedown mechanism to address illegal content as soon as it is identified. A failure to act can lead to your platform being treated as an abettor of the offence. For guidance on structuring your compliance, see our IPR and cyber law support page.

Evidence and Investigation in Obscenity Cases

Investigations into Section 67 offences rely heavily on digital traces. Authorities look for upload logs, IP addresses, and device metadata to link the content to a specific individual. If you are facing a situation involving the unauthorized distribution of content, preserving the original links and timestamps is the first step toward a legal resolution. You can learn more about how we handle these cases through our cyber crime investigation services.

Legal Support for Content Moderation

Navigating the complexities of content law requires a proactive approach to moderation and legal oversight. If you need to audit your platform's content policies or require help with an ongoing investigation, contact our legal experts to discuss a strategy that protects your organization and keeps it compliant.
