What are the regulations for online content moderation in India?

Online content moderation in India is governed primarily by the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. These rules were introduced by the Ministry of Electronics and Information Technology (MeitY) in February 2021 to regulate digital content and social media platforms operating in India. Please note that the regulatory landscape may have evolved since then, and we recommend checking for the most recent updates.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 lay down due-diligence obligations for intermediaries, which include social media platforms, messaging services, video-sharing platforms, and other entities that host or transmit user-generated content, and impose additional obligations on “significant social media intermediaries” (platforms with 5 million or more registered users in India). The rules aim to curb the spread of unlawful and objectionable content, promote accountability and transparency, and safeguard the rights and interests of users.

Here are some key aspects of the regulations for online content moderation in India:

  1. Grievance Officer: Every intermediary must appoint a Grievance Officer who serves as the point of contact for users to report complaints about content; complaints must be acknowledged within 24 hours and resolved within 15 days. Significant social media intermediaries must additionally appoint a Resident Grievance Officer based in India.
  2. Chief Compliance Officer: Significant social media intermediaries must appoint a Chief Compliance Officer, resident in India, who is responsible for ensuring compliance with the IT Act and these rules.
  3. Nodal Contact Person: Significant social media intermediaries must designate a Nodal Contact Person, resident in India, for 24×7 coordination with law enforcement agencies.
  4. Removal of Unlawful Content: Intermediaries must remove or disable access to unlawful content within 36 hours of receiving a court order or a notification from an appropriate government agency. Unlawful content includes material affecting the sovereignty and integrity of India, defense, security of the State, friendly relations with foreign states, or public order, or material that incites violence or the commission of an offence.
  5. Content Removal Process: Intermediaries must publish their rules, regulations, and privacy policy, including the categories of content users are prohibited from hosting or sharing. When a significant social media intermediary removes content on its own initiative, it must notify the user, explain the reason for the removal, and give the user a reasonable opportunity to dispute the action.
  6. Voluntary User Verification: Significant social media intermediaries must give users a mechanism to voluntarily verify their accounts, with verified accounts receiving a visible mark of verification.
  7. Traceability of Messages: Significant social media intermediaries that primarily provide messaging services must be able to identify the first originator of a message within India when directed to do so by a court order or an order under the IT Act. The rules do not exempt end-to-end encrypted services from this requirement, which has prompted legal challenges from messaging providers on privacy grounds.
  8. Compliance Reports: Significant social media intermediaries must publish monthly compliance reports detailing complaints received, the action taken on them, and the amount of content removed through proactive monitoring (see the sketch after this list for one way such a report could be represented).
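
To make the reporting obligation in item 8 more concrete, below is a minimal Python sketch of how a platform might track grievances and roll them up into a monthly summary. Every class, field, and category name here is a hypothetical illustration of the kind of record-keeping the rules call for, not a format prescribed by the rules themselves.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class Action(Enum):
    """Outcome recorded for a complaint (hypothetical categories)."""
    CONTENT_REMOVED = "content_removed"
    ACCESS_DISABLED = "access_disabled"
    NO_ACTION = "no_action"
    PENDING = "pending"


@dataclass
class Complaint:
    """One grievance received via the platform's Grievance Officer channel."""
    complaint_id: str
    received_on: date
    category: str               # e.g. "impersonation", "harassment"
    action: Action
    resolved_on: Optional[date] = None


def monthly_compliance_summary(complaints: list[Complaint],
                               proactive_removals: int) -> dict:
    """Aggregate complaints into the kind of figures a monthly report lists:
    complaints received, action taken, and content removed proactively."""
    action_counts = Counter(c.action.value for c in complaints)
    return {
        "complaints_received": len(complaints),
        "action_taken": dict(action_counts),
        "proactive_removals": proactive_removals,
    }


if __name__ == "__main__":
    complaints = [
        Complaint("C-001", date(2021, 6, 2), "impersonation",
                  Action.CONTENT_REMOVED, date(2021, 6, 4)),
        Complaint("C-002", date(2021, 6, 9), "harassment",
                  Action.NO_ACTION, date(2021, 6, 15)),
    ]
    print(monthly_compliance_summary(complaints, proactive_removals=120))
```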

It’s important to note that these regulations have been subject to criticism and debate concerning their potential impact on free speech, privacy, and the burden placed on intermediaries to monitor and moderate content. However, the rules were introduced with the intention of promoting responsible behavior by platforms and protecting users from harmful and unlawful content.

As regulations can change over time, it’s advisable to consult the latest official sources or legal experts for the most up-to-date information on online content moderation regulations in India.