
Cast Adrift without Safe Harbor: The Risks of Ignoring IT Act Protections (PART 2)

Analyzing the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, by PRS India[1]

Key Features of the Rules

· Social media intermediaries with registered users in India above a notified threshold have been classified as significant social media intermediaries (SSMIs). SSMIs are required to observe certain additional due diligence, such as appointing certain personnel for compliance, enabling identification of the first originator of information on their platforms under certain conditions, and deploying technology-based measures on a best-effort basis to identify certain types of content.

· The Rules prescribe a framework for the regulation of content by online publishers of news and current affairs content, and curated audio-visual content.

· All intermediaries are required to provide a grievance redressal mechanism for resolving complaints from users or victims. A three-tier grievance redressal mechanism with varying levels of self-regulation has been prescribed for publishers.

Key Issues and Analysis

· The Rules may be going beyond the powers delegated under the Act in certain cases, such as where they provide for the regulation of significant social media intermediaries and online publishers, and require certain intermediaries to identify the first originator of the information.

· Grounds for restricting online content are overbroad and may affect freedom of speech.

· There are no procedural safeguards for requests by law enforcement agencies for information in the possession of intermediaries.

· Requiring messaging services to enable the identification of the first originator of information on their platforms may adversely affect the privacy of individuals.

Intermediaries are entities that store or transmit data on behalf of other persons, and include telecom and internet service providers, online marketplaces, search engines, and social media sites.[2] The Information Technology Act, 2000 (IT Act) was amended in 2008 to provide an exemption to intermediaries from liability for any third-party information.[3] Following this, the IT (Intermediary Guidelines) Rules, 2011 were framed under the IT Act to specify the due diligence requirements for intermediaries to claim such exemption.[4] The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 were notified on February 25, 2021, to replace the 2011 Rules.[5] Key additions under the 2021 Rules include additional due diligence requirements for certain social media intermediaries, and a framework for regulating the content of online publishers of news and current affairs, and curated audio-visual content. The Ministry of Electronics and Information Technology noted that the changes were necessitated by widespread concerns around: (i) the prevalence of child pornography and content depicting sexual violence, (ii) the spread of fake news, (iii) misuse of social media, (iv) content regulation in the case of online publishers, including OTT platforms and news portals, (v) lack of transparency and accountability from digital platforms, and (vi) the rights of users of digital media platforms.[6],[7],[8],[9] The validity of the 2021 Rules has been challenged in various High Courts.[10],[11]


· Due diligence by intermediaries: Under the IT Act, an intermediary is not liable for the third-party information that it holds or transmits. However, to claim such exemption, it must adhere to the due diligence requirements under the IT Act and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (which replace the earlier 2011 Rules). Under the 2011 Rules, the requirements included: (i) specifying, in service agreements, the categories of content that users are not allowed to upload or share, (ii) taking down content within 36 hours of receiving a court or government order, (iii) assisting law enforcement agencies, (iv) retaining blocked content and associated records for 90 days, and (v) providing a grievance redressal mechanism for users and affected persons, and designating a grievance officer. The 2021 Rules retain these requirements, while: (i) modifying the categories of content that users are not allowed to upload or share, and (ii) prescribing stricter timelines for the above requirements.

· Significant social media intermediaries: The 2021 Rules define social media intermediaries as intermediaries which primarily or solely enable online interaction between two or more users. Intermediaries with registered users above a notified threshold will be classified as significant social media intermediaries (SSMIs). The additional due diligence measures to be observed by SSMIs include:

Personnel: An SSMI must appoint: (i) a chief compliance officer for ensuring compliance with the Rules and the Act, (ii) a nodal person for coordination with law enforcement agencies, and (iii) a grievance officer, all of whom should reside in India.

Identifying the first originator of information: An SSMI, which primarily provides messaging services, must enable the identification of the first originator of information within India on its platform. This may be required by an order of a Court or the competent authority under the IT Act. Such orders will be issued on specified grounds including prevention, detection, and investigation of certain offences such as those relating to national security, public order, and sexual violence. Such orders will not be issued if the originator could be identified by less intrusive means.

Technology-based measures: SSMIs will endeavour to deploy technology-based measures to identify: (i) content depicting child sexual abuse and rape, or (ii) information identical to information previously blocked upon a court or government order. Such measures: (i) must be proportionate to the interests of free speech and privacy of users, and (ii) must have human oversight and be reviewed periodically.

User-centric requirements: SSMIs must provide users with: (i) a voluntary identity verification mechanism, (ii) a mechanism to check the status of grievances, (iii) an explanation if no action is taken on a complaint, and (iv) a notice where the SSMI blocks the user’s content on its own accord, with a dispute resolution mechanism.

· Digital Media Publishers: The 2021 Rules prescribe certain requirements for online publishers of: (i) news and current affairs content which include online papers, news portals, aggregators and agencies; and (ii) curated audio-visual content, which is defined as a curated catalogue of audio-visual content (excluding news and current affairs) which is owned by, licensed by, or contracted to be transmitted by publishers and available on demand. The Rules institute a three-tier structure for regulating these publishers: (i) self-regulation by publishers, (ii) self-regulation by associations of publishers, and (iii) oversight by the central government.

· Code of Ethics: For publishers of news and current affairs, the following existing codes will apply: (i) norms of journalistic conduct formulated by the Press Council of India, and (ii) the programme code under the Cable Television Networks Regulation Act, 1995. For online publishers of curated content, the Rules prescribe a code of ethics. This code requires publishers to: (i) classify content into specified age-appropriate categories, restrict children’s access to age-inappropriate content, and implement an age verification mechanism, (ii) exercise due discretion in featuring content that affects the sovereignty and integrity of India or national security, or is likely to disturb public order, (iii) consider India’s multiple races and religions before featuring their beliefs and practices, and (iv) make content more accessible to disabled persons.

· Grievance redressal: Any person aggrieved by the content of a publisher may file a complaint with the publisher, who must address it within 15 days. If the person is not satisfied with the resolution, or the complaint is not addressed within the specified time, the person may escalate the complaint to the association of publishers, who must also address the complaint within 15 days. The complaint will be considered by an inter-departmental committee constituted by the Ministry of Information and Broadcasting if: (i) escalated by the complainant or the association under certain conditions, or (ii) referred by the Ministry itself.

· Oversight by Ministry: The Ministry of Information and Broadcasting will: (i) publish a charter for self-regulating bodies, including Codes of Practices, (ii) issue appropriate advisories and orders to publishers, and (iii) have powers to block content on an emergency basis (subject to review by the inter-departmental committee). Any directions for blocking content will be reviewed by a committee headed by the Cabinet Secretary.


Regulation of online intermediaries

Intermediaries include a vast array of entities that facilitate the flow of data on the internet. These include telecom service providers, internet service providers, search engines, online marketplaces, payment sites, cyber cafes, messaging services, and social media sites. While many intermediaries are mere conduits or storage providers, unaware of the content being transmitted or stored on their platforms, other intermediaries may be aware of the user-generated content on their platforms. This raises the question of the extent to which intermediaries should be held liable for user-generated content on their platforms.

In some jurisdictions, such as the European Union and India, intermediaries are regulated through the safe harbour model. Under this model, intermediaries are granted immunity from liability for illegal user-generated content provided they comply with certain requirements.[12],[13],[14] Intermediaries remain immune from liability unless they are aware of the illegality and fail to act adequately to stop it.13 They are subject to ‘duties of care’ and ‘notice and take down’ obligations to remove illegal content.13

In recent years, some online platforms have gained a central role in enabling access to information and facilitating its exchange and sharing at scale.[15] Many online platforms have expanded their role from mere hosts of information to entities governing how content is displayed and shared online, undertaking significant actions in the areas of moderation, curation, and recommendation. There are growing concerns around the misuse of these platforms for the proliferation of illegal or harmful content such as child sex abuse material, content provoking terrorism, misinformation, hate speech, and voter manipulation.5,6,7,8,14 This has raised questions about the role and responsibility of platforms in preventing the diffusion of such content, and in its detection and subsequent removal.

Some platforms have been self-regulating the publication of such content. However, this has raised concerns about arbitrary actions taken by these platforms, which could affect freedom of speech and expression. These developments pose an important challenge for the regulatory framework for intermediaries: finding the correct balance between enhancing the role of platforms and governments in detection, moderation, and curation, and protecting individuals’ rights. The 2021 Rules may address some of these issues. Implications of certain provisions under the Rules are discussed in the following sections.

The Rules may be going beyond the powers delegated under the Act

The central government has framed the 2021 Rules as per the following rule-making powers under the Act: (i) carrying out provisions of the Act, (ii) specifying the safeguards or procedures for blocking information for access by the public, and (iii) specifying due diligence to be observed by intermediaries for exemption from liability for third-party information. The 2021 Rules define new types of entities, state their obligations, and prescribe a new regulatory framework for some of these entities. This may be going beyond the powers delegated to the Executive under the Act. Such instances are discussed below. In various judgements, the Supreme Court has held that Rules cannot alter the scope, or provisions, or principles of the enabling Act.[16],[17],[18]

Distinct obligations for new classes of intermediaries: The Act defines an intermediary and states its obligations. These include: (i) taking down content upon a court or government order, (ii) retaining certain information, (iii) providing information and assistance to law enforcement agencies in certain conditions, and (iv) observing due diligence to be exempt from intermediary liability. The Rules define two new classes of intermediaries: (i) social media intermediaries and (ii) significant social media intermediaries (SSMIs). The Rules also specify the additional due diligence to be observed by SSMIs. These include: (i) appointing certain personnel, (ii) identifying the first originator of information (where SSMIs primarily provide messaging services), and (iii) deploying technology-based measures to pro-actively identify certain types of information on a best-effort basis. The Rules also empower the central government to: (i) determine the threshold for classification as SSMIs, and (ii) require any other intermediary to comply with the additional due diligence requirements for SSMIs. Defining new types of intermediaries, and empowering the government to specify thresholds under these definitions and cast obligations on select entities, may be going beyond the powers delegated to the government under the Act. Provisions such as the definition of new entities and their obligations may have to be specified in the parent Act.

Identification of the first originator of information: The Rules require SSMIs, which provide a service primarily or solely in the nature of messaging, to enable the identification of the first originator of information within India on its platform. This rule has no related provision under the parent Act. The Rules also prescribe certain details such as: (i) information on the first originator can be required only by a government or court order, (ii) the grounds on which such orders can be passed, and (iii) not issuing such an order if less intrusive means to obtain the information are available. It may be questioned whether this amounts to instituting legislative policy, and hence, is required to be provided in the parent Act.

Regulation of online publishers: The Rules prescribe a regulatory framework for online publishers of news and current affairs and curated audio-visual content (such as films, series, and podcasts). Regulation of such publishers may be beyond the scope of the IT Act.

Certain grounds for restricting content may affect freedom of speech

The Constitution allows for certain reasonable restrictions with respect to freedom of speech and expression on grounds such as national security, public order, decency, and morality.[19] The IT Act prohibits uploading or sharing content which is obscene, sexually explicit, relates to child sex abuse, or violates a person’s privacy.[20] The 2021 Rules specify certain additional restrictions on the types of information users of intermediary platforms can create, upload, or share. These include: (i) “harmful to child”, (ii) “insulting on the basis of gender”, and (iii) “knowingly and intentionally communicates any information which is patently false or misleading in nature but may reasonably be perceived as a fact”. Some of these restrictions are subjective and overbroad, and may adversely affect the freedom of speech and expression of users of intermediary platforms.

The Supreme Court (2015) has held that a restriction on speech, in order to be reasonable, must be narrowly tailored so as to restrict only what is absolutely necessary.[21] It also held that a speech can be limited on the grounds under the Constitution when it reaches the level of incitement. Other forms of speech even if offensive or unpopular remain protected under the Constitution.

The Rules require the intermediaries to make these restrictions part of their service agreement with users. This implies that users must exercise prior restraint, and intermediaries may interpret and decide upon the lawfulness of content on these grounds. Such overbroad grounds under the Rules may not give a person clarity on what is restricted and may create a ‘chilling effect’ on their freedom of speech and expression. This may also lead to over-compliance from intermediaries as their exemption from liability is contingent upon observing due diligence.

While examining the 2011 Rules on intermediary guidelines, the Lok Sabha Committee on Subordinate Legislation (2013) had observed that to remove any ambiguity, the definitions of the grounds used in the Rules should be incorporated in the Rules, if the definitions exist in other laws.[22] If not defined in other laws, such grounds should be defined and incorporated in the Rules to ensure that no new category of crimes or offences is created through delegated legislation.21 The 2021 Rules do not provide definitions or references for the terms listed above and hence, may cause ambiguity regarding the interpretation of these terms.

Procedure for information requests from government agencies lacks safeguards

The Rules require intermediaries to provide information under their control or possession upon request by a government agency. A government agency lawfully authorised for investigative, protective, or cybersecurity activities may place such a request, for the verification of identity, for the prevention, detection, investigation, or prosecution of offences under any law, or for cybersecurity incidents. However, the Rules do not state any procedural safeguards or requirements for such requests.

An earlier set of Rules, notified in 2009, specifies the procedure and safeguards subject to which interception, monitoring, or decryption of information held by intermediaries may be undertaken.[23] These state that such orders must be issued by the union or state home secretary (with exceptions in unavoidable circumstances and remote regions), and be subject to review by a committee (headed by the Cabinet Secretary or the state’s chief secretary). Further, the authority issuing such orders should first consider alternate means of acquiring the information.22

Further, the 2021 Rules do not restrict the extent or type of information that may be sought. For example, the information sought may be personal data of individuals, such as details about their interactions with others. Such powers, without adequate safeguards such as those in the 2009 Rules, may adversely affect the privacy of individuals.

Enabling traceability may adversely affect the privacy of individuals

The Rules require significant social media intermediaries, which provide services primarily or solely in the nature of messaging, to enable the identification of the first originator of information within India (commonly referred to as traceability). The Rules state that: (i) such identification should be required by a court order or an order passed by a competent authority under the 2009 Rules (union or state home secretary), (ii) order for identification will be passed for specified purposes including prevention, detection, and investigation of offences related to sovereignty and security of the state, public order, and sexual violence (rape, sexually explicit material or child sex abuse material), and (iii) no such order will be passed if less intrusive means are effective for the required identification.

Enabling such identification may lower the degree of privacy of communication for all users. Identifying the first originator of information on a messaging platform will require the service provider to permanently store certain additional information: (i) who exchanged a message, and (ii) the exact message or certain details which uniquely describe a message, so that the information in question may be matched against it. This will be required for every message exchanged over the service provider’s platform to enable tracing the first originator of any message. Note that permanently storing such details about a message is not a technological necessity for providing messaging services over the internet. The Rules also do not specify how far back in time the messaging service will be required to check when determining the first originator. Overall, this requirement will lead to the retention of more personal data by messaging services, which goes against the principle of data minimisation. Data minimisation means limiting data collection to what is necessary to fulfil a specific purpose of data processing, and has been recognised as an important principle for the protection of personal privacy.[24],[25]

The Supreme Court (2017) has held that any infringement of the right to privacy should be proportionate to the need for such interference.[26] Traceability is required to prevent, detect, and investigate specified offences. For enabling traceability for a few messages that may be required for investigative purposes, the degree of privacy of communication of all users of online messaging services will need to be permanently lowered. Hence, the question is whether this action could be considered proportionate to the objective.

Note that a case related to the issue of traceability is currently pending before the Supreme Court.[27]

Framework for regulation of content of online publishers

Content on conventional media, including print, TV, film, and radio, is regulated under specific laws as well as licence agreements (in the case of TV and radio).[28],[29],[30],[31] These regulations seek to ensure that community standards are reflected in content easily accessible by the public. They also seek to restrict access to certain content based on its age-appropriateness and whether it may be deemed unlawful.[32] Economic costs and certain licence requirements for some of these operations mean that their numbers are few. In the past few years, the internet has become a more mainstream medium for the publication of news as well as entertainment content. The regulatory framework for content on digital media may not be similar to that for conventional media, as there are certain challenges in terms of: (i) defining who is a publisher, since individuals and businesses publishing online may not be regulated in the same manner, (ii) the volume of content to regulate, and (iii) enforcement (the cross-border nature of the internet means that publishers need not have a physical presence in India). The 2021 Rules under the IT Act prescribe a framework for regulation of content by online publishers of news and current affairs and curated audio-visual content (such as films, series, and podcasts). Certain issues with these Rules are discussed below.

Regulation of online publishers under the 2021 Rules may be beyond the scope of the parent Act

The framework provides norms and an oversight mechanism for the regulation of content of online publishers. The press note by the central government on the 2021 Rules stated that online publishers are digital platforms governed by the IT Act.6 The IT Act is aimed at providing legal recognition for transactions carried out by means of electronic data interchange and other means of electronic communication, and at facilitating the electronic filing of documents.[33] The Act prohibits cybercrime, including publishing specified content such as sexually explicit content, child sex abuse material, and content violating others’ privacy.

Laws such as the Press Council Act, 1978 and the Press and Registration of Books Act, 1867, the Cable Television Networks (Regulation) Act, 1995, and the Cinematograph Act, 1952 are specific laws regulating, respectively, publishers of news in print, television broadcast of news and audio-visual content, and films (i.e., similar content through other media).27,28,29,30 Regulation of content of these classes of publishers deals with questions of freedom of the press and freedom of artistic expression. It may be questioned whether the regulation of online publishers is envisaged under the IT Act and hence, whether the 2021 Rules exceed the scope of the Act in this regard.

Oversight mechanism for digital news media lacks the independence accorded to print news

The oversight mechanism for content regulation in the case of news in print is under the Press Council of India (PCI), which is an independent statutory body. One of the main objectives of the PCI is to uphold the freedom of the press. The Council consists of a chairman and 28 other members, including working journalists, persons from the management of newspapers, members of Parliament, and domain experts. The Chairman is selected by the Speaker of the Lok Sabha, the Chairman of the Rajya Sabha, and a member elected by the PCI. Key functions of the PCI include: (i) adjudicating upon complaints of violation of standards, and (ii) issuing directions upon violation of the code of conduct, including admonishing, warning, and censuring. For similar functions in the case of digital news media, the oversight mechanism will be under the Ministry of Information and Broadcasting. Thus, unlike that for print publications, the oversight mechanism for digital news is not through an independent statutory body.

Note that the content of TV news is regulated under the Cable Television Networks (Regulation) Act, 1995 (CTN Act). The CTN Act empowers the central government to prescribe programme code and advertising code to be followed by the publishers. The central government may prohibit the transmission of a programme in the public interest on certain specified grounds if it violates these codes. A three-tier self-regulation mechanism for TV broadcasters, similar to that for online publishers, has been prescribed under the CTN Act in June 2021.[34]

The procedure for emergency blocking of content of online publishers lacks certain safeguards

As per the Rules, the Secretary of the Ministry of Information and Broadcasting may pass an order for blocking the content of an online publisher in case of emergency. Such orders may be passed on certain specified grounds including national security and public order, without giving the publisher an opportunity of hearing. Such an order will be examined by the inter-departmental committee for its recommendation on the confirmation or revocation of the order. The Rules do not give the publisher an opportunity for hearing during this entire process. This is in contrast with the process for examination of violation of the code of ethics. Under this process, the concerned publisher will be allowed to appear and submit their reply and clarifications before the committee.

Definition of social media intermediary may be too broad

The Rules define a social media intermediary as an intermediary which primarily or solely enables interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services. This definition may include any intermediary that enables interaction among its users. This could include email service providers, e-commerce platforms, video conferencing platforms, and internet telephony service providers.

FAQs issued by MeitY[35]

In order to ensure an Open, Safe & Trusted Internet and accountability of intermediaries including the social media intermediaries to users, Ministry of Electronics and Information Technology (MeitY) has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter referred to as “IT Rules, 2021”) on 25th February, 2021.

These Rules prescribe the due diligence to be followed by all intermediaries as well as the additional due diligence to be followed by significant social media intermediaries.

The Rules also provide guidelines to be followed by publishers of news & current affairs and also online curated content providers.

These Rules supersede the earlier notified Information Technology (Intermediaries Guidelines) Rules, 2011. The Rules are available at

The Rules have two segments:

(i) Intermediary Guidelines (Part-II of the Rules, except rule 5*) administered by MeitY.

(ii) Digital Media Ethics Code (Part-III of the Rules) administered by the Ministry of Information & Broadcasting (MIB) in line with the distribution of subjects under the Government of India (Allocation of Business) Rules, 1961.

* Rule 5 in Part-II is related to due diligence to be observed by an intermediary in relation to news and current affairs content made available on their platform by such publishers and shall be administered by MIB, Govt. of India.

The following FAQs have been prepared to bring clarity as well as to explain the nuances of the due diligence to be followed by intermediaries.

Section I: Basic Information

1. Why were the erstwhile Information Technology (Intermediaries Guidelines) Rules of 2011 revised with the new IT Rules, 2021?

Ans: In order to ensure an Open, Safe & Trusted Internet and accountability of intermediaries including the social media intermediaries to users and in tune with the changing requirements, the Rules have been revised.

Some of the reasons which led to the introduction of the IT Rules, 2021 are as follows:

· Two significant Supreme Court orders on 11th December, 2018 in the Prajwala matter [In Re: Prajwala letter dated 18.2.2015 Videos of Sexual Violence and Recommendations– SMW (Crl.) No(s).3/2015] and on 24th September, 2019 in the Facebook transfer petition [Facebook Inc. v Union of India & Ors. Transfer Petition (Civil) Nos 1943-1946 of 2019] besides other court judgements;

· The Right to Privacy being confirmed as a fundamental right by the Supreme Court in Justice KS Puttaswamy (Retd.) & Anr. vs. UOI & Ors. [(2017) 10 SCC 1].

· Commitment given in the Parliament on 26th July, 2018 on prevention of misuse of social media platforms in view of concerns raised by members;

· Recommendations made by the Rajya Sabha Ad-hoc Committee on Pornography in February 2020;

· Growing concerns of safety and security of users particularly women and children on the internet, wherein the victims had no forum for redressal of their grievances;

· Significant expansion of online intermediary ecosystem;

· Growth of online social media platforms and their influencing capabilities;

· International developments in social media regulation;

· Compelling need to have a framework to deal with fake/hate messages which have become viral and have resulted in riots, mob lynching or other heinous crimes including those concerning dignity of women and sexual abuse of children;

· Alignment with the requirements of the Law Enforcement Agencies (LEAs) and other Appropriate Governments or their agencies;

· Need to contemporize the intermediary liability framework.

2. When did the Rules come into effect?

Ans: The Rules have come into effect from the date of their publication in the Gazette (i.e., 25th February, 2021). The threshold criterion [Ref. rule 2(1)(v)] for significant social media intermediaries (SSMIs) was published on 26th February, 2021. Additional due diligence requirements for SSMIs have come into effect from 26th May, 2021.

3. What process was followed for amendments in the erstwhile Information Technology (Intermediaries Guidelines) Rules, 2011?


· MeitY proposed amending the erstwhile Information Technology (Intermediaries Guidelines) Rules, 2011 and invited public comments on the draft new Rules on 24.12.2018.

· Based on public comments as well as suggestions received during various stakeholders’ meetings, and also taking into account the change in the Allocation of Business Rules (AoBR) of MIB in November, 2020, the Rules were finalised and integrated as one common set of Rules, namely the “Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021”, replacing the earlier Information Technology (Intermediaries Guidelines) Rules, 2011.

4. What are the major changes in the Rules over the erstwhile Information Technology (Intermediaries Guidelines) Rules, 2011?

Ans: Some of the significant changes in the IT Rules, 2021 as applicable to the intermediaries are:

(a) Intermediary due diligence now includes:

o Due diligence to be followed by all intermediaries (rule 3).

o Additional due diligence requirements for SSMI (rule 4).

o Additional due diligence rules, as applicable, for other intermediaries as and when specifically notified by the Central Government (rule 6).

(b) Increased User Safety: Provision for direct requests by affected individuals for content takedown in specific cases, i.e., content involving a breach of bodily privacy, impersonation, or morphed imagery of the concerned individual, in order to address the immediate need to prevent harm and emotional distress, particularly in instances of revenge porn and similar cases [Ref. rule 3(2)(b)].

(c) Alignment with the Supreme Court’s Order in Prajwala case (for SSMIs):

o The Supreme Court in a suo motu writ petition (Prajwala case), in its order dated 11/12/2018, had observed that the Government of India may frame necessary guidelines to eliminate child pornography, rape and gang-rape imageries, videos and sites in content-hosting platforms and other applications.

o The new IT Rules, 2021 provide that an SSMI shall endeavour to deploy technology-based measures for identification of such content available on its platform, in accordance with the safeguards provided in these Rules [Ref. rule 4(4)].

(d) Revision in terms and conditions offered to users by the intermediaries:

o The terms and conditions have been made clearer, simpler and revised to reflect emerging issues.

(e) Clear timelines have been provided for:

o Grievance redressal: 24 hours for acknowledgement; 15 days for disposal [rule 3(2)].

o Information takedown from the platform upon actual knowledge based on a court order or a notice from an appropriate government authorised by law: 36 hours [rule 3(1)(d)].

o Providing information on a lawful request: 72 hours [rule 3(1)(j)].

o Removal of revenge porn (sexual extortion/non-consensual porn publication/sexual act or conduct involving impersonation, etc.) and other similar content: 24 hours [rule 3(2)(b)].

5. How do these Rules enhance the safety of women and children?

Ans: The new IT Rules, 2021 have a clear objective of enhancing online safety of users, particularly women & children. Various provisions of these Rules focus on enhanced safety of women and children. These include:

· Specific inclusion of certain requirements to be explicitly conveyed in the terms and conditions [rule 3(1)(b)].

· Reporting by the aggrieved individual in respect of revenge porn and similar content breaching physical privacy and taking action within 24 hours for content removal [rule 3(2)(b)].

· Enhanced grievance redressal mechanism by intermediaries [rule 3(2)(a)].

· Additional provision for SSMI to appoint a Resident Grievance Officer, a Chief Compliance Officer and a nodal contact person, all to be residents in India; and a physical contact address of the significant social media intermediary to be in India [rule 4(1) and 4(5)].

· The Rules also have provisions that intermediaries shall cooperate with Law Enforcement Agencies (LEA) to identify the first originator of information related to rape, sexually explicit material or child sexual abuse material (CSAM) for investigation and prosecution [rule 4(2)].

· The significant social media intermediaries shall endeavour to deploy technology-based measures to identify any imagery of child sexual abuse, rape, etc. whether real or simulated in accordance with the safeguards in the Rules [rule 4(4)].

6. Do these Rules affect the right to privacy of individuals?

Ans: Privacy is a fundamental right in India. The IT Rules, 2021 are consistent with this fundamental right. The Rules, therefore, have a clear focus on protecting online privacy of individuals. Various provisions of these Rules, as stated in Part-II, focus on the protection of privacy. These include:

· Intermediaries are required to convey to users that they should not share information that is invasive of another person’s privacy, including bodily privacy [rule 3(1)(b)].

· Intermediaries are required to periodically inform users that in case of non-compliance with their privacy policies, the intermediary has the right to terminate access or block such information [rule 3(1)(c)].

· In case an individual comes across any content on a platform that depicts such person in full/partial nudity, in a sexual act, or through morphed images, such person may make a complaint to the concerned intermediary, which is then obliged to take all reasonable and practical measures to remove such content within 24 hours of the complaint [rule 3(2)(b)].

Even with regard to identification of the first originator of messages, there are a number of safeguards in place in rule 4(2) to ensure that the privacy of users is not violated. Some of these safeguards are:

· SSMIs in the private messaging space are only required to enable identification of the first originator upon receiving authorised directions. They do not enjoy the authority to identify such users or information on their own in the absence of appropriate orders.

· Such appropriate orders can only be passed by a competent court or a competent authority under the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009.

· Such orders can be passed only in relation to certain specified grounds contained in the Rules, such as sovereignty, national security, public order, rape or child abuse, etc.

· Such orders shall not be passed when other alternative measures are available.

7. Do these Rules affect the right to free speech and expression?

Ans: No. Article 19 of the Constitution guarantees the right to freedom of speech and expression, and article 19(2) provides for reasonable restrictions on it. The new IT Rules, 2021 have been framed consistent with these rights. The Rules place no additional obligations on users and do not contain any penalties applicable to users. Further, a robust grievance redressal mechanism has been put in place so that users whose content or access is unreasonably removed may highlight such error to the intermediary for remedial action.

8. How will users be benefitted from these new Rules?

Ans: The IT Rules, 2021 are meant to benefit a general user, who is using any intermediary platform. These Rules specially provide for:

· increased safety of users and also ensure accountability of intermediaries to the users;

· establishment of a robust grievance redressal mechanism by Intermediaries (including appointing a Resident Grievance officer for SSMI). This will ensure that Intermediaries are responsive to the concerns and grievances of users;

· a physical contact address in India for communication (for SSMI), to make sure that users can meaningfully reach out to SSMIs to voice their concerns, as many SSMIs are international organizations;

· periodic reminders by intermediaries to users that content which may be illegal or harmful to other users (such as harassing, insulting, fake, misrepresentative, privacy-invasive, paedophilic, pornographic, defamatory or obscene content) should not be posted;

· expeditious removal of content violative of physical privacy (generally considered as revenge porn material); yearly reminders about privacy policy and other terms and conditions offered by the intermediary;

· prior notice before content/account deletion/suspension in certain cases;

· voluntary verification;

· public accountability and transparency of SSMI.

These features are likely to benefit all concerned. These measures are intended to empower users in the online space, and protect their rights and dignity. The Rules, by providing these reasonable mechanisms and remedies, strive to ensure that social media platforms remain a safe space for all users, free from cyber threats, harassment and unlawful content.

9. Whether platforms and Apps, which act as news aggregators, qualify as intermediaries or whether they will also be covered as Publishers under the IT Rules, 2021?

Ans: Some entities may be functioning both like an intermediary as well as a “news aggregator” or “publisher of news and current affairs content” as defined in rule 2(o) and rule 2(t). Further clarification with respect to rule 5, and Part III of the Rules relating to news and current affairs content may be sought from the Ministry of Information & Broadcasting (MIB).

Section II: Basic Terminology and Scope of the Rules

10. Which entities are covered under the scope of Part II of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 administered by MeitY?

Ans: Any intermediary as defined under section 2(1)(w) of the Information Technology (IT) Act, 2000.

11. Which entities can qualify as ‘intermediary’ under the IT Rules, 2021?

Ans: Section 2(1)(w) of the Information Technology Act, 2000 (21 of 2000) defines an intermediary as:

“intermediary”, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes.

With the evolution of technologies and the consequent proliferation of digital businesses and services, many other kinds of platforms, including websites and mobile Apps, may qualify as intermediaries with respect to the third-party content made available, shared, hosted, stored or transmitted on them.

12. Which intermediaries will qualify/ not qualify as a ‘social media intermediary’ under these Rules?

Ans: The IT Rules, 2021 define ‘social media intermediary’ as an intermediary which primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services.

To qualify as a social media intermediary, enabling of online interactions should be the primary or sole purpose of the intermediary. Therefore, typically, an entity which has some other primary purpose, but only incidentally enables online interactions, may not be considered as a social media intermediary.

Indicative features that may clarify the scope of the phrase “enables online interaction”, inter alia, are as follows:

(a) Facilitates socialization/social networking, including the ability of a user to increase their reach and following, within the platform via specific features like “follow”/“subscribe” etc.;

(b) Offers opportunity to interact with unknown persons or users;

(c) Enables virality of content by facilitating sharing. Virality, in this context, means the tendency of any content to be circulated rapidly and widely from one internet user to another.

Typically, any intermediary whose primary purpose is enabling commercial or business-oriented transactions, providing access to the internet or search-engine services, or offering e-mail or online storage services, etc., will not qualify as a social media intermediary.

13. Which social media intermediaries will qualify as ‘significant social media intermediaries’?

Ans: As per the notification made by the Central Government, a social media intermediary having fifty lakh (five million) registered users in India shall be considered a significant social media intermediary. The threshold was specified in a Gazette Notification dated 26th February, 2021.

Registered users for the purpose of computing the threshold for a SSMI are those users who have registered/created an account with SSMI.

Section III: Due Diligence by an Intermediary

14. Rule 3(1)(d) requires an intermediary to remove or disable access to certain information about which the intermediary is notified/ requested by the Appropriate Government or its agency within 36 hours. What kind of details pertaining to the said notice/ request will be provided by the Appropriate Government authority?

Ans: This rule was already present in the IT (Intermediaries Guidelines) Rules, 2011, and there is therefore a clear and existing practice for orders of law enforcement or Appropriate Government authorities to communicate this information to the intermediary. Typically, this communication should contain:

(a) the platform specific identified URL(s);

(b) the law that is being administered by the Appropriate Government/ authorised agency and the specific clause of the law which is being violated;

(c) justification and evidence; and

(d) any other information (e.g., time stamp in case of audio/ video, etc.) as may be relevant.

15. Rule 3(1)(h) requires an intermediary to retain information collected from a user for registration on its computer resource for 180 days after any cancellation or withdrawal of registration. Is the intermediary required to store only the data that was collected at the time of registration, or is it required to save all data pertaining to usage of the computer resource after registration (like IP address, user logs, etc.)?

Ans: The IT Rules, 2021 require the intermediary to store or retain the data that was collected from the user at the time of registration (mainly the location, time and date stamp of the user, to establish when and where the account was created) if the user has withdrawn from the platform or in case of cancellation of the account by the intermediary. The information collected after registration and before withdrawal will vary from platform to platform. How much information a platform should otherwise store is addressed through the IT (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 and notifications under Section 67C of the IT Act, 2000.[36]
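The 180-day retention window under rule 3(1)(h) can be sketched as a simple date calculation. This is purely illustrative; the function names and data model are assumptions, as the rule prescribes no particular implementation.

```python
from datetime import date, timedelta

# Retention period for registration data after cancellation or
# withdrawal of registration, per rule 3(1)(h).
RETENTION_PERIOD = timedelta(days=180)

def purge_due_date(cancellation_date: date) -> date:
    """Earliest date on which retained registration data may be purged."""
    return cancellation_date + RETENTION_PERIOD

def may_purge(cancellation_date: date, today: date) -> bool:
    """True once the full 180-day retention window has elapsed."""
    return today >= purge_due_date(cancellation_date)

# An account cancelled on 1 March 2021 may be purged from 28 August 2021.
print(purge_due_date(date(2021, 3, 1)))  # 2021-08-28
```

A real platform would also need to reconcile this window with any longer retention ordered under Section 67C notifications, which this sketch ignores.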

16. Do the intermediaries need to publish the name of the Grievance Officer?

Ans: Yes, the intermediaries need to prominently publish the name of the Grievance Officer and his/her contact details as well as mechanism adopted for grievance redressal on their platforms [rule 3(2)(a)]. The Grievance Officer will be responsible for handling grievances and facilitating an effective grievance redressal system. It is expected that the platform would provide grievance registration process in easy-to-understand terms for the benefit of the users.

17. What are the timeframes for action by an intermediary?



Ans: The timeframes for action by an intermediary are:

· Grievance acknowledgement: 24 hours [rule 3(2)(a)]

· Response to a grievance: 15 days [rule 3(2)(a)]

· Removal/disabling of content which exposes the private area of an individual, shows such individual in full or partial nudity, shows or depicts such individual in any sexual act or conduct, or is in the nature of impersonation in an electronic form, including artificially morphed images of such individual: within 24 hours [rule 3(2)(b)]

· Content removal on receipt of a court order or a notice from the Appropriate Government or its agency: 36 hours [rule 3(1)(d)]

· Providing information under its control or possession, or assistance, to a Government agency lawfully authorised for investigative, protective or cyber security activities: within 72 hours of receipt of the order [rule 3(1)(j)]

· Preservation of information and associated records relating to removal/disabling of access to such information: 180 days or as may be required [rule 3(1)(g)]

· Retaining a user’s registration information after cancellation or withdrawal of registration: 180 days [rule 3(1)(h)]

Section IV: Additional Due Diligence by Significant Social Media Intermediaries (SSMI)

18. Rule 4(1) mandates significant social media intermediaries to appoint a Chief Compliance Officer responsible for ensuring compliance with the IT Act 2000, a nodal contact person for 24x7 coordination with law enforcement agencies and a Resident Grievance Officer. Can one person be appointed to fulfil the roles of the nodal contact person as well as the Resident Grievance Officer? If not, can one person be appointed to fulfil the diverse roles of the Chief Compliance Officer as well as the Resident Grievance Officer?

Ans: As mentioned in the rule 4(1) of the IT Rules, 2021, the Chief Compliance Officer and the nodal contact person cannot be the same person, whereas the roles of the nodal contact person and the Resident Grievance Officer may be performed by the same person. However, keeping in view the functional requirements of the nodal contact person and the Resident Grievance Officer, it is desirable that SSMI appoints separate persons for the two roles. The Government, through this rule, expects the intermediary to provide separate contact details for grievances submitted by users and the requests/orders made by the Government or authorized Government agencies, since the nature of requests might vary in view of different compliance timelines.

19. In case a parent company owns multiple products/ services that cross the threshold for significant social media intermediaries, can the parent company appoint common officers across all such products/ services or do they have to be appointed individually for each product/ service?

Ans: A parent SSMI can appoint common officers across its products/ services. However, the contact details to approach these officers are required to be clearly mentioned on each of those product/ service platforms separately.

20. Rule 4(1)(d) requires a significant social media intermediary to publish periodic compliance reports every month mentioning the details of complaints received and action taken thereon. Does the intermediary have to submit a physical copy of this report to MeitY? Is the intermediary required to publish the report on its website? Is there a particular format for publishing this report including details on the type of information that is essential or the level of granularity of the information published?

Ans: The intermediary does not have to submit a physical copy of the compliance report to MeitY. Rather, the intermediary is required to publish the monthly compliance report on its platform. The report should contain details of the preceding month.

No specific format of monthly compliance report has been prescribed in the Rules. However, the essential requirements that are already mentioned in the Rules should be included in the monthly compliance report. It is left to the discretion of the SSMI to decide other details of the report.

Insofar as actions taken by an SSMI on the user complaints received by it are concerned, ideally, the report should contain summary details of the complaints received, e.g., the subject under which the complaint is received (e.g., copyright) and action taken under these different heads. This information could be disclosed in the aggregated form, without disclosing granular details of all cases.

With regard to the voluntary actions taken by an SSMI, mentioning the number of communication links removed by the SSMI would serve the purpose of this rule.

The intermediary, while publishing such a report, also needs to ensure that the published details do not impinge upon the privacy and safety of its users.
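The aggregated reporting described above could be sketched as follows. The complaint categories and record shape here are hypothetical, since the Rules prescribe no format; the point is only that counts per category and per action can be published without disclosing granular case details.

```python
from collections import Counter

# Hypothetical complaint log for one month: (category, action_taken).
complaints = [
    ("copyright", "content removed"),
    ("impersonation", "content removed"),
    ("copyright", "no action"),
    ("harassment", "account warned"),
]

def monthly_summary(records):
    """Aggregate complaints received and actions taken, in the
    aggregated form the FAQ suggests, with no per-case details."""
    by_category = Counter(cat for cat, _ in records)
    by_action = Counter(action for _, action in records)
    return {"received": dict(by_category), "actions": dict(by_action)}

report = monthly_summary(complaints)
print(report["received"])  # {'copyright': 2, 'impersonation': 1, 'harassment': 1}
```

An SSMI would additionally report the number of links removed through its own voluntary proactive measures, which this sketch omits.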

21. Rule 4(6) requires the intermediary, to the extent reasonable, to provide the complainant with reasons for any action taken or not taken. Is there a criterion on what would qualify as ‘reasonable’? In case of frivolous complaints, would it not be reasonable to desist from providing reasons for inaction?

Ans: The objective of this rule is to allow aggrieved users (including those whose content have been taken down or whose accounts have been disabled) to understand how their complaint is being dealt with by the Resident Grievance Officer of the intermediary, and promote a two-way communication between the aggrieved users and the intermediary.

It is expected that the intermediary provides a reasonable explanation to the aggrieved user. In case of a frivolous complaint, the nature of the complaint can be cited as the reason for any action not taken. The rules provide flexibility to the intermediaries to decide the best way to give an explanation and due process to the user, keeping in mind the safety of the reporting party. The idea is to promote accountability while giving flexibility. It is expected that the intermediary provides details of its grievance redressal mechanism for the benefit of the aggrieved users.

22. Rule 4(8) requires the significant social media intermediary to notify the user whose information is taken down or made inaccessible by the intermediary on its own accord and also allow adequate opportunity to dispute the action. Should the user be notified in all such scenarios?

Ans: The user may be notified only in a scenario where the content is removed or disabled by an SSMI “on its own accord” for violation of the terms and conditions of the service. The term “on its own accord” covers scenarios where the SSMI:

(i) uses automated tools/filters or some national or international agency/ specialised organisation has identified child sexual abuse materials (CSAM) and related materials;

(ii) concludes that the content falls under the prohibited category as defined under any law for the time being in force;

(iii) is of the opinion that the content is blatantly illegal and notifying might harm the complainant in any way; or

(iv) removes the content as advised by its Resident Grievance Officer in accordance with its grievance redressal mechanism.

SSMIs need to notify the user in cases falling under categories (ii) and (iv) above.

23. The requirement to issue notification to users in case their content is removed may compromise the ability of an intermediary to counter activity by bots. If bots are notified about action taken, they may tweak their attack strategy. In such a case, does rule 4(8) allow for an intermediary to not send a notification to suspected bots and/or to implement a lag in notification to help the intermediary handle bot activity?

Ans: There might be situations, e.g., bot activity, malware, terrorism-related content, spam, etc., where the intermediary may not find it prudent to inform the user before taking down their content. In such a scenario, it is expected that intermediaries will take appropriate steps while handling such non-human users to effectively counter bot activity.

24. Would detection of first originator of the message in the messaging platforms compromise end-to-end encryption?

Ans: The intent of this rule is not to break or weaken encryption in any way, but merely to obtain the registration details of the first Indian originator of a message. The electronic replica of the message (text, photo, video, etc.) will be shared by the requesting agency along with a lawful order. A typical principle of detection is based on the hash value of the unencrypted message: identical messages yield a common hash (message digest) irrespective of the encryption used by the messaging platform. How this hash is generated or stored is for the concerned SSMI to decide, and SSMIs are free to come up with alternative technological solutions to implement this rule. The rationale of this requirement is that if an intermediary conveys to its users, as part of its terms of use, that a particular type of content must not be uploaded or shared, it should have the capability to determine whether that has occurred; otherwise the platform loses the ability to enforce its own terms of usage. While encryption ensures the safety and security of data, and the privacy norms self-imposed by the intermediary may be needed, it is also imperative that platforms not be used to share any unlawful content as specified under the IT Rules, 2021 and other applicable laws.
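The hash-matching principle described above can be sketched as follows. This is an illustrative sketch only, not any platform's actual implementation: the function names and the in-memory index are assumptions, and a real deployment would have to address media normalisation, index security, and the legal safeguards of rule 4(2).

```python
import hashlib

# Illustrative index: digest of plaintext message -> registration
# identifier of the first user seen sending that content.
first_originator_index: dict[str, str] = {}

def message_digest(plaintext: bytes) -> str:
    """Identical plaintexts produce identical digests, regardless of
    the transport encryption later applied to each copy in transit."""
    return hashlib.sha256(plaintext).hexdigest()

def record_outgoing(plaintext: bytes, sender_id: str) -> None:
    """Record the sender only if this content has not been seen before,
    so forwarders never displace the first originator."""
    first_originator_index.setdefault(message_digest(plaintext), sender_id)

def lookup_first_originator(plaintext: bytes):
    """Given the electronic replica supplied with a lawful order,
    return the registered first originator, if known."""
    return first_originator_index.get(message_digest(plaintext))

# The same content forwarded by many users maps to one digest.
record_outgoing(b"viral message", "user-001")
record_outgoing(b"viral message", "user-042")   # a forwarder, not recorded
print(lookup_first_originator(b"viral message"))  # user-001
```

Note the sketch matches exact byte-for-byte replicas only; even trivial edits to a message change the digest, which is one reason the FAQ leaves SSMIs free to adopt alternative technological solutions.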

25. What additional information can MeitY call for from significant social media intermediaries (SSMI) under sub-rule (9) of rule 4 of the IT Rules, 2021?

Ans: Under sub-rule (9) of rule 4, MeitY can only call for information that is in the power and possession of SSMI pertaining to their grievance redressal mechanism, which may include compliance reports in relation to the complaints received and action taken thereon and such additional information that MeitY is empowered to seek under the IT Act for effective implementation of Part II of the IT Rules, 2021. This would typically exclude any commercially sensitive, trade secret or otherwise confidential information held by the intermediaries.

26. Where an SSMI publishes ads, owned/ licensed content and identifies such content as advertised, promoted etc., or pays third parties to upload certain types of content, will it still qualify as an intermediary and be eligible to claim immunity under Section 79?

Ans: Rule 4(3) is intended to provide transparency to users so that they are aware whether the content being accessed by them is based on commercial considerations or otherwise. Hence, the rule requires appropriate labelling of such content so that the users can make an informed choice at the time of accessing it. It does not change the basic character of the intermediary and their ability to avail of the exemptions (safe harbour provisions), which shall be determined based on the provisions under Section 79 of the IT Act as judicially interpreted from time to time.

Section V: Non-Compliance to Intermediary Rules

27. What would be the impact of non-compliance by an intermediary to these Rules?

Ans: The intermediary shall lose its exemptions from liability as provided under section 79 of the IT Act and rule 7 of these Rules may become applicable with respect to the extant law violated.

28. Are there any penalties that users may face under these Rules?

Ans: No. However, users do need to ensure that the content they share on intermediary platforms is not violative of the IT Act (e.g., under sections 67, 67A, 67B, etc.) or other existing laws such as the Indian Penal Code, the Copyright Act, etc. as they may be liable to be prosecuted/ penalized under all such laws.


Siddharth Dalmia

The StartUp Sherpa


[1] PRS Legislative Research (“PRS”).
[2] Section 2(1)(w), The Information Technology Act, 2000.
[3] Section 79, The Information Technology Act, 2000.
[4] The Information Technology (Intermediaries Guidelines) Rules, 2011, accessible at Appendix [.].
[5] Supra Note 1.
[6] Official Debates, Rajya Sabha, July 26, 2018, as per PRS India.
[7] “Government notifies Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021”, Press Information Bureau, Ministry of Electronics and Information Technology, February 25, 2021, accessible at Appendix [.].
[8] Suo Moto Writ Petition No. 3 of 2015, Supreme Court of India, December 11, 2018, accessible at Appendix [.].
[9] “Report of the Ad-hoc Committee of the Rajya Sabha to Study the Alarming Issue of Child Pornography on Social Media and its Effect on Children and Society at Large”, February 3, 2020, accessible at Appendix [.].
[10] W.P. (Civil) No. 6272 of 2021, Kerala High Court, accessible at Appendix [.].
[11] W.P. (Civil) No. 3125 of 2021, Delhi High Court, accessible at Appendix [.].
[12] Articles 13-15, Directive 2000/31/EC of the European Parliament and of the Council, accessible at Appendix [.].
[13] Section 79, The Information Technology Act, 2000.
[14] “Reform of the EU Liability Regime for Online Intermediaries”, European Parliamentary Research Service, May 2020, accessible at Appendix [.].
[15] “Liability of online platforms”, European Parliamentary Research Service, February 2021, accessible at Appendix [.].
[16] Agricultural Market Committee vs Shalimar Chemical Works Ltd, 1997 Supp(1) SCR 164, May 7, 1997, accessible at Appendix [.].
[17] State of Karnataka v Ganesh Kamath, 1983 SCR (2) 665, March 31, 1983, accessible at Appendix [.].
[18] Kerala State Electricity Board vs Indian Aluminium Company, 1976 SCR (1) 552, September 1, 1975, accessible at Appendix [.].
[19] Article 19, The Constitution of India, accessible at Appendix [.].
[20] Sections 67, 67A, and 67B, The Information Technology Act, 2000.
[21] Shreya Singhal vs Union of India, Writ Petition (Criminal) No. 167 of 2012, Supreme Court of India, March 24, 2015, accessible at Appendix [.].
[22] 31st Report of the Committee on Subordinate Legislation of Lok Sabha on Rules under the IT Act, 2000, March 2013, accessible at Appendix [.].
[23] The Information Technology (Procedure and Safeguards for Interception, Monitoring, and Decryption of Information) Rules, 2009 under the Information Technology Act, 2000, accessible at Appendix [.].
[24] White Paper of the Committee of Experts on a Data Protection Framework for India under the Chairmanship of Justice B.N. Srikrishna, accessible at Appendix [.].
[25] Article 5, General Data Protection Regulation of the European Union, accessible at Appendix [.].
[26] Justice K.S. Puttaswamy (Retd.) vs Union of India, W.P. (Civil) No. 494 of 2012, Supreme Court of India, August 24, 2017, accessible at Appendix [.].
[27] Facebook Inc vs Antony Clement Rubin, Diary No. 32478/2019, admitted on January 30, 2020, Supreme Court of India, accessible at Appendix [.].
[28] The Press Council Act, 1978, accessible at Appendix [.].
[29] The Press and Registration of Books Act, 1867, accessible at Appendix [.].
[30] The Cable Television Networks (Regulation) Act, 1995, accessible at Appendix [.].
[31] The Cinematograph Act, 1952, accessible at Appendix [.].
[32] “The Challenge of Managing Digital Content”, International Telecommunication Union, August 23, 2017, accessible at Appendix [.].
[33] Introduction to the Information Technology Act, 2000.
[34] The Cable Television Networks (Amendment) Rules, 2021, issued under the Cable Television Networks (Regulation) Act, 1995, accessible at Appendix [.].
[35] FAQs accessible at Appendix [.].
[36] Section 67C of the Information Technology (IT) Act, 2000 refers to preservation and retention of information by intermediaries. It states: (1) an intermediary shall preserve and retain such information as may be specified, for such duration and in such manner and format as the Central Government may prescribe; (2) any intermediary who intentionally or knowingly contravenes the provisions of sub-section (1) shall be punished with imprisonment for a term which may extend to three years and shall also be liable to fine.
