
Cast Adrift Without Safe Harbour: The Risks of Ignoring IT Act Protections (Part 1)

Safe harbour provisions protect the enterprises and individuals that provide infrastructure and act as intermediaries from liability for acts committed by third parties who use that infrastructure for their own ends in the online world. For example, cloud service providers ('CSPs') are not held liable for illegal data stored on their servers, internet service providers ('ISPs') are not held liable for unlawful acts committed by their subscribers, e-commerce marketplaces are not held liable for spurious goods sold by sellers on their platforms, and social media platforms are not held liable for defamatory content posted on their platforms.

These safe harbour safeguards emanate from the Information Technology Act, 2000 (the 'IT Act')[1] and the rules made under it. The safe harbour regime has evolved considerably since its enactment, through legislative amendments, administrative notifications and judicial interpretation.

When the IT Act was first enacted, its safe harbour was very limited. The definition of 'intermediary' was narrow, restricted to entities that, on behalf of another person, receive, store or transmit any electronic message or provide any service with respect to that message.[2] Intermediaries falling within this narrow scope were protected only against offences under the IT Act itself.[3]

The limits of this protection came to light in 2004, when a CD containing an obscene clip was listed for sale by a seller on an online platform called bazee.com.[4] This resulted in both the seller (Mr. Ravi Raj) and the CEO of the platform (Mr. Avnish Bajaj) being arrested, and both were charged with the same offence. The lacuna in the law was evident: a platform/intermediary could be implicated for material it did not generate but merely provided a platform to publish or circulate. This threatened the future of the e-commerce ecosystem. The IT Act was therefore amended by the Information Technology (Amendment) Act, 2008 (the 'ITA Act'), which widened the scope of 'intermediaries' and of the safe harbour protection.

The Digital India programme has now become a movement empowering common Indians with the power of technology. The extensive spread of mobile phones, the Internet and the like has also enabled many social media platforms to expand their footprints in India. Common people use these platforms in a very significant way. Some portals that publish analyses of social media platforms, whose figures have not been disputed, have reported the following user bases for major social media platforms in India:

  • WhatsApp users: 53 Crore

  • YouTube users: 44.8 Crore

  • Facebook users: 41 Crore

  • Instagram users: 21 Crore

  • Twitter users: 1.75 Crore

These social platforms have enabled common Indians to show their creativity, ask questions, be informed and freely share their views, including criticism of the Government and its functionaries. The Government acknowledges and respects the right of every Indian to criticize and disagree as an essential element of democracy. India is the world's largest open Internet society and the Government welcomes social media companies to operate in India, do business and earn profits. However, they will have to be accountable to the Constitution and laws of India.

The proliferation of social media empowers citizens on the one hand, but on the other gives rise to serious concerns and consequences that have grown manifold in recent years. These concerns have been raised from time to time in various forums, including in Parliament and its committees, in judicial orders and in civil society deliberations in different parts of the country. Such concerns are also being raised all over the world, making this an international issue.

Of late, some very disturbing developments have been observed on social media platforms. The persistent spread of fake news has compelled many media platforms to create fact-check mechanisms. Rampant abuse of social media to share morphed images of women and content related to revenge porn has often threatened the dignity of women. Misuse of social media to settle corporate rivalries in a blatantly unethical manner has become a major concern for businesses. Instances of abusive language, defamatory and obscene content, and blatant disrespect of religious sentiments on these platforms are growing.

Over the years, increasing misuse of social media by criminals and anti-national elements has brought new challenges for law enforcement agencies. These include inducement for the recruitment of terrorists, circulation of obscene content, spread of disharmony, financial frauds, incitement to violence and threats to public order.

It was found that there is currently no robust complaint mechanism through which ordinary users of social media and OTT platforms can register complaints and have them redressed within a defined timeline. Lack of transparency and the absence of a robust grievance redressal mechanism have left users entirely dependent on the whims and fancies of the platforms. It is often seen that a user who has spent time, energy and money developing a social media profile is left without remedy if that profile is restricted or removed by the platform without any opportunity to be heard.

Evolution of Social Media and Other Intermediaries:

  • If we trace the evolution of social media intermediaries, they are no longer limited to playing the role of pure intermediaries; they often act as publishers. These Rules are a fine blend of a liberal touch with a gentle self-regulatory framework. They build on the existing laws and statutes of the country, which apply to content whether online or offline. In respect of news and current affairs, publishers are expected to follow the norms of journalistic conduct of the Press Council of India and the Programme Code under the Cable Television Networks Act, which already apply to print and TV. Hence, only a level playing field has been proposed.

Rationale and Justification for New Guidelines:

These Rules substantially empower ordinary users of digital platforms to seek redressal of their grievances and demand accountability when their rights are infringed. In this direction, the following developments are noteworthy:

  • In the suo motu writ petition (Prajjawala case), the Supreme Court, vide order dated 11/12/2018, observed that the Government of India may frame necessary guidelines to eliminate child pornography, rape and gang-rape imagery, videos and sites from content hosting platforms and other applications.

  • The Supreme Court, vide order dated 24/09/2019, directed the Ministry of Electronics and Information Technology to apprise it of the timeline for completing the process of notifying the new rules.

  • There was a Calling Attention Motion on the misuse of social media and the spread of fake news in the Rajya Sabha, and the Minister had conveyed to the House on 26/07/2018 the Government's resolve to strengthen the legal framework and make social media platforms accountable under the law. He did so after repeated demands from Members of Parliament for corrective measures.

  • The Ad-hoc Committee of the Rajya Sabha laid its report on 03/02/2020 after studying the alarming issue of pornography on social media and its effect on children and society as a whole, and recommended enabling identification of the first originator of such content.

Consultations:

  • The Ministry of Electronics and Information Technology (MEITY) prepared draft Rules and invited public comments on 24/12/2018. MEITY received 171 comments from individuals, civil society, industry associations and organizations, along with 80 counter-comments. These comments were analyzed in detail, an inter-ministerial meeting was held, and the Rules were accordingly finalized[5] as the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.[6]


India is the world's largest open Internet society. Digital India has enabled the empowerment of the common man, and the extensive spread of mobile phones and the Internet has enabled many platforms to expand their footprints in India.

These platforms bring a bevy of benefits as well as risks, and they give rise to new concerns, which have been raised from time to time in various forums, including in the Parliament of India and its committees, in judicial orders and in civil society deliberations across India. Prime among these is the abuse of social media to share morphed images of women and content related to revenge porn, which threatens the dignity of women; preventing the dissemination of such content is therefore imperative.

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter 'the Rules') have been framed by the Central Government under the powers conferred by Sections 69A(2), 79(2)(c) and 87 of the Information Technology Act, in close coordination between the Ministry of Electronics and Information Technology and the Ministry of Information and Broadcasting. Their formulation comes amid growing criticism directed at the government, even as the government recognizes the right to criticize and disagree as an essential element of democracy. The Rules aim to provide a robust complaint mechanism for users of social media and OTT platforms to address their grievances, a mechanism that did not previously exist.

The proposed framework has been described as progressive, liberal and contemporaneous, laying special emphasis on protecting women against the proliferation of sexual offences on social media. It stresses the need for social media intermediaries and online content providers, whether for entertainment or informative purposes, to comply strictly with the Constitution and domestic laws of India. It further seeks to instill a sense of accountability for misuse and abuse by social media users, and is the first framework of its kind to bring social media use under the regulatory umbrella of the Information Technology Act.

These Rules come in the wake of the government's recent crackdown on OTT platforms, with various quarters actively, indeed vehemently, lobbying for stronger and more stringent regulation. Contrary to that view, however, the PIB states that the Rules have been formulated keeping in mind the importance of free speech and journalistic and creative freedoms. Political connotations aside, the enactment of these Rules puts India at par with international regimes on digital media regulation, providing more comprehensive and holistic protection to its users.

Obligation of Due Diligence on Intermediaries

General Guidelines for All Intermediaries

These general guidelines extend their scope over all intermediaries, including social media intermediaries as well as significant social media intermediaries. Rule 2(1)(z) excludes from the scope of social media intermediaries those intermediaries that facilitate commercial or business transactions, provide access to networks, search engines and certain other specified types.

Classification of Intermediaries - The 2011 Rules regulated "intermediaries" without any classification or distinction in terms of their user base or the content hosted on their platforms; the 2021 Rules, however, classify the regulated entities into the following types (a classification sketch follows the list):

  1. Social media intermediary with less than 50 lakh registered Indian users;

  2. Significant social media intermediary ("SSMI") with more than 50 lakh registered Indian users;

  3. Publisher of news and current affairs content including news aggregators;

  4. Publisher of online curated content which covers all online streaming platforms including Over-the-Top ('OTT') platforms.
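The operative demarcation between the first two categories is the 50 lakh (5 million) registered-user threshold notified by the Government. As a minimal illustrative sketch (the function name and its handling of the exact boundary are assumptions, not drawn from the Rules), the classification reduces to a simple threshold check:

```python
# Illustrative only: the 50 lakh figure is the notified threshold; the
# function and its boundary handling are assumptions for this sketch.

SSMI_THRESHOLD = 5_000_000  # 50 lakh registered users in India

def classify_social_media_intermediary(registered_indian_users: int) -> str:
    """Classify a social media intermediary by its registered Indian user base."""
    if registered_indian_users >= SSMI_THRESHOLD:
        # Crossing the threshold attracts the additional obligations for SSMIs
        return "significant social media intermediary (SSMI)"
    return "social media intermediary"

print(classify_social_media_intermediary(6_000_000))  # -> SSMI
print(classify_social_media_intermediary(1_000_000))  # -> ordinary intermediary
```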

Due Diligence: Rule 4 enlists certain due diligence obligations of an intermediary, including the duty to publish its rules and regulations, privacy policy and user agreement on its website and/or application so that users may access them. The material so published must crystallize the user's responsibility not to "host, display, upload, modify, publish, transmit, store, update or share"[7] any information which:

  1. Belongs to another person

  2. Is defamatory, obscene, pornographic, pedophilic, invasive of another's privacy, libelous, or inconsistent with the laws of the land

  3. Is dangerous for minors

  4. Results in the infringement of any intellectual property right

  5. Is deceiving or misleading regarding the origin of the message

  6. Impersonates another person

  7. Threatens the integrity, defence, security or sovereignty of the country, friendly relations with foreign states or public order, or results in the incitement of any cognizable offence

  8. Contains any software virus or any program designed to corrupt or interrupt the functionality of any computer resource

  9. Is patently false and untrue, and is written or published in any form with the intent to mislead or harass a person.

Rule 4 also operates under the umbrella of the safe harbour provisions articulated in Section 79 of the Information Technology Act, 2000: if intermediaries observe due diligence, they are entitled to safe harbour protection from liability in relation to any third-party information, data or communication link made available or hosted by them, insofar as they also meet the content-neutrality conditions under the Act. The due diligence to be observed by intermediaries includes the following (a sketch of the resulting compliance clocks follows the list):

  1. informing users about rules and regulations, privacy policy, and terms and conditions for usage of its services;

  2. blocking access to unlawful information within 36 hours upon an order from the Court, or the government;

  3. retaining information collected for the registration of a user for 180 days after cancellation or withdrawal of registration; intermediaries are also required to report cybersecurity incidents and share related information with the Indian Computer Emergency Response Team; and

  4. removing sexual imagery within 24 hours of receipt of a complaint, for which no court or government order is required. Intermediaries must also provide any information under their control or possession to a government agency, within 72 hours of receipt of an order to that effect, for the investigation, detection or prevention of cybersecurity incidents or offences under any law.
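Read together, these obligations run on fixed clocks from the triggering event. The following sketch (hypothetical names; only the durations come from the obligations stated above) shows how those deadlines might be computed:

```python
# Illustrative sketch of the compliance clocks above; the enum and helper
# are hypothetical, only the durations reflect the Rules as described.
from datetime import datetime, timedelta
from enum import Enum

class Trigger(Enum):
    COURT_OR_GOVT_ORDER = timedelta(hours=36)       # block unlawful information
    SEXUAL_IMAGERY_COMPLAINT = timedelta(hours=24)  # takedown, no order needed
    AGENCY_INFO_REQUEST = timedelta(hours=72)       # furnish information
    REGISTRATION_WITHDRAWN = timedelta(days=180)    # retain registration data

def deadline(received_at: datetime, trigger: Trigger) -> datetime:
    """Latest moment by which the intermediary must have acted."""
    return received_at + trigger.value

order_time = datetime(2021, 6, 1, 10, 0)
print(deadline(order_time, Trigger.COURT_OR_GOVT_ORDER))  # 2021-06-02 22:00
```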

Transparency - An SSMI is subject to a greater standard of transparency and accountability towards its users. It must publish six-monthly transparency reports outlining how it deals with requests for content removal, how it deploys automated tools to filter offensive content, and so on. Other requirements under this transparency principle include giving notice to users whose content has been disabled and allowing them to contest such removal.

Chief Compliance Officer - An SSMI must further comply with additional obligations, including the appointment of a chief compliance officer, who will be liable for the intermediary's failure to observe due diligence, and a nodal contact person (available 24x7) to ensure compliance with court orders and coordinate with law enforcement agencies. It is also required to establish a physical contact address in India.

The Delhi High Court has laid down guidelines, in accordance with the IT Rules and the provisions discussed above, to be followed by courts dealing with cases involving the removal of objectionable content from the internet, so as to ensure the earliest possible removal of such material and to limit access to and redistribution of it.

In X v. Union of India and Others,[8] the Court dealt with a matter in which photographs that a woman had posted on her private social media accounts on Facebook and Instagram were taken without her knowledge or consent and unlawfully posted on a pornographic website by miscreants. Despite court orders, the content could not be removed entirely from the world wide web, and "errant parties merrily continued" to re-post and redirect it to other sites.

Justice Anup Jairam Bhambhani began his judgment with a poignant remark: "The internet never sleeps; and the internet never forgets!" The Court relied on judgments from several jurisdictions to paint a coherent picture of the state of governance in regulating offensive content. It relied on X. vs. Twitter Inc.,[9] where the Supreme Court of New South Wales stated: "Where a third party such as Twitter comes into possession of confidential information and is put on notice of the character of the information and the circumstances in which it was unlawfully obtained, it becomes subject to an equitable obligation of confidence. It is liable to be restrained from publishing the information." "...there is a public interest in making the proposed orders; in demonstrating that wrongful conduct will be remedied as effectively as can be achieved; and in ensuring that the plaintiff's rights are respected to the extent that it is possible to do so. The plaintiff should not be left without a remedy."

Similarly, in Google Spain SL, Google Inc. vs. Agencia Española de Protección de Datos (AEPD), Mario Costeja González,[10] it was held that "the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person's name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful."

And finally, the High Court, referred to Eva Glawischnig-Piesczek vs. Facebook Ireland Limited[11] in which it was categorically stated that "in order to ensure that the host provider at issue prevents any further impairment of the interests involved, it is legitimate for the court having jurisdiction to be able to require that host provider to block access to the information stored, the content of which is identical to the content previously declared to be illegal, or to remove that information, irrespective of who requested the storage of that information."

The Court supplemented these judgments with references to Indian decisions, including Shreya Singhal vs. Union of India,[12] ABC vs. DEF & Ors.[13] and YouTube LLC & Anr. vs. Geeta Shroff.[14] It relied on the Delhi High Court's own holding in Swami Ramdev & Anr. vs. Facebook, Inc. & Ors.[15]: "The removal and disablement has to be complete in respect of the cause over which this Court has jurisdiction. It cannot be limited or partial in nature, so as to render the order of this Court completely toothless."

With the help of the aforementioned judicial decisions, the Delhi High Court proceeded to lay down the following guidelines for the removal of offensive content:

  1. The court may direct the website or online platform on which the offending content is hosted to remove such content forthwith, and in any event within 24 hours of receipt of the court order, since this timeframe is mandated by Rule 3(2)(b) of the 2021 Rules read with Rule 10 of the 2009 Rules for other similar kinds of offensive content;

  2. A direction should also be issued to the website or online platform on which the offending content is hosted to preserve all information and associated records relating to the offending content, so that evidence in relation to the offending content is not vitiated;

  3. A direction should also be issued by the court to such search engine(s) as the court may deem appropriate, to make the offending content non-searchable by 'de-indexing' and 'de-referencing' it;

  4. The directions issued must also mandate the concerned intermediaries, whether websites/online platforms/search engine(s), to endeavour to employ pro-active monitoring by using automated tools, to identify and remove or disable access to any content which is 'exactly identical' to the offending content;

  5. Directions should also be issued to the concerned law enforcement agency/ies, such as the jurisdictional police, to obtain from the concerned website or online platform all information and associated records, including all unique identifiers relating to the offending content such as the URL (Uniform Resource Locator), account ID, handle name, Internet Protocol address and hash value of the actual offending content along-with the metadata, subscriber information, access logs and such other information;

  6. The court must direct the aggrieved party to furnish to the law enforcement agency all available information that the aggrieved party possesses relating to the offending content;

  7. The aggrieved party should also be permitted, on the strength of the court order passed regarding specific offending content, to notify the law enforcement agency to remove the offending content from any other website, online platform or search engine;

  8. The court may also direct the aggrieved party to make a complaint on the National Cyber-Crime Reporting Portal; and

  9. Most importantly, the court must refer to the provisions of section 79(3)(a) and (b) read with section 85 of the IT Act and Rule 7 of the 2021 Rules, whereby an intermediary would forfeit the exemption from liability enjoyed by it under the law if it were to fail to observe its obligations for removal/access disablement of offending content despite a court order to that effect.


Notifications Provided to the User: Apart from merely publishing these obligations, the intermediary must notify users that non-compliance may result in the termination of their access or usage rights.[16] These rules and regulations, privacy policies and user agreements may also be amended periodically, and users must be notified of such changes in due time.[17]

Enforcement Action to be Undertaken: Intermediaries are obligated to halt the hosting, storage or publication of any information prohibited by law in the interest of national sovereignty, integrity, security, etc., as prescribed under Rule 4(1)(d), upon knowledge of the same through an order of a court of competent jurisdiction or a Government notification. The intermediary has a strict time limit of thirty-six (36) hours to remove or restrict access to such information. Following removal, the evidence collected must be preserved for one hundred and eighty (180) days for investigative purposes.[18] Rule 4 further prescribes the process by which intermediaries must fully cooperate with Government and law enforcement agencies. To address complaints raised by users or victims, intermediaries must appoint a Grievance Officer, whose details must be made public and who must acknowledge and resolve such complaints within a period of one month.[19]

Additional Compliance Measures for Significant Social Media Intermediaries

Due Diligence: A notable feature of the Rules is the distinction they draw between social media intermediaries and significant social media intermediaries. The demarcation is based on user size; once defined through a Government notification, it acts as the threshold between the two.[20] The rationale is clarified by Rule 5, which imposes additional compliance measures on significant social media intermediaries owing to the large volume of users and content they process. Beyond the user-size criterion, the Government can extend the provisions of Rule 5 to any other intermediary by notification.[21] The following due diligence is to be observed by such intermediaries within three months of the publication of these Rules:[22]

  1. Appointment of a Chief Compliance Officer, assuming the responsibility to ensure compliance and oversight of the functions of significant intermediaries

  2. Appointment of a nodal person of contact, who would act as a link between the intermediary and law enforcement agencies

  3. Appointment of a Resident Grievance Officer, whose responsibilities parallel those of the Officer appointed under Rule 4(1)(n)

  4. Publishing a compliance report every six months, containing the details and contents of complaints handled and of information removed or disabled by the intermediary in the course of its monitoring activities

To facilitate the processing of complaints regarding violations under this Rule, the significant intermediary must develop an appropriate mechanism under Rule 5(6), and must notify the complainant of the extent of action taken.

First Originator: Rule 5(2) places an additional responsibility on significant social media intermediaries providing messaging services to assist law enforcement agencies in identifying and tracking the first originator of any contentious or problematic information. This can be required only through an order of a competent court or of the Competent Authority under Section 69 of the Act, and only to curb offences threatening the integrity or security of the State, or those relating to rape, child sexual abuse or other grievous offences. It may not be resorted to where less intrusive means are available, and must be employed only as a measure of last resort.
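In effect, the provision stacks cumulative preconditions before a tracing order can issue. A minimal sketch of that gating logic (the function and its inputs are hypothetical; the conditions are the ones described above):

```python
# Hypothetical sketch: every precondition described above must hold before
# a first-originator tracing order can be acted upon.
def tracing_order_permissible(has_court_or_s69_order: bool,
                              offence_is_grave: bool,
                              less_intrusive_means_available: bool) -> bool:
    """True only when all preconditions for first-originator tracing are met."""
    return (has_court_or_s69_order
            and offence_is_grave
            and not less_intrusive_means_available)  # last resort only

# A grave offence, ordered by a competent authority, with no milder option:
print(tracing_order_permissible(True, True, False))  # True
# Same order, but a less intrusive means exists, so tracing is impermissible:
print(tracing_order_permissible(True, True, True))   # False
```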

Special Measures for Sexual Offences: Rule 5(4) provides significant intermediaries with further means to curb the commission or instigation of the offences of rape or child sexual abuse. Such intermediaries must deploy technology-based measures to promptly identify any material that may depict or simulate such offences. This must be done without bias or discrimination, and with the highest regard for privacy and free speech.

Voluntary Verification of Users: Under Rule 5(7), users of significant social media intermediaries must be provided a facility to verify themselves voluntarily. Verification can take place on the basis of their number or account, and provides the user with a visible mark of verification. This mode of regulating users is intended to curb the misuse of these services, while providing a greater level of surveillance over their activities.

Notification to Originators on Removal of Information: Where a significant intermediary has removed or restricted access to any information, it must ensure that the originator is made aware of the action and the grounds for it, after being given a reasonable opportunity to be heard. Rule 5(8) further provides that this process be overseen by the Resident Grievance Officer.

Procedure and Safeguards for Digital/Online Media

Digital media, as defined under Rule 2(1)(k), means any digitized content that can be transmitted over the internet or other computer networks, and includes such content when stored or transmitted by intermediaries as well as by publishers of news or online curated content. It covers:[23]

  • news and current affairs publishers,

  • intermediaries enabling the transmission of news and current affairs,

  • online curated content publishers, and

  • intermediaries enabling the transmission of online curated content,

which operate in India and conduct their business activities by making content available in India, targeting Indian users.[24] The rules applicable to such entities, however, come into force only after the lapse of a three-month period from the publication of these Rules.[25]

Grievance Mechanism

An Online Grievance Portal, to be established by the Ministry within three months of the commencement of the Rules, acts as the central repository for accepting and disposing of grievances with respect to the Code of Ethics, as per Rule 9(1). In pursuance of this, the Rules provide a three-tiered grievance mechanism, consisting of:

  1. Level I: Self-regulation by the applicable entity

  2. Level II: Self-regulation by the self-regulating bodies of the applicable entities

  3. Level III: Oversight mechanism by the Central Government

Level I: Under Rule 9(4), the applicable entity is informed of the grievance and encouraged to address it itself, while keeping the complainant and the Grievance Portal in the loop. To this end, the applicable entity is required to appoint a Grievance Redressal Officer, who is governed by the Code of Ethics.[26] The applicable entity must classify the online curated content it transmits and grant it an appropriate certificate, as per the Schedule.[27] Certification may be based on the content, its impact, target audience, etc., and must be displayed in a conspicuous place so that users are notified of it before accessing the content.[28]

Level II: If the grievance is not addressed at the first level within 15 days, the matter escalates, on appeal by the complainant, to a Self-regulating Body of which the entity is a member. Such bodies are to be independently constituted by such entities or their associations and headed by a retired judge of the Supreme Court or a High Court.[29] This body provides guidance on the Code of Ethics and decides grievances passed on from the first level. To enforce its decisions, a self-regulating body can issue warnings or censure, require an apology, reclassify the ratings of online curated content, require appropriate modifications to the content descriptor, or refer the matter to the Oversight Mechanism under Rule 12.[30]

Level III: If the Self-regulating Bodies fail to offer any solace to the complainant, the last resort is the Oversight Mechanism of the Central Government under Rule 12. This measure is coordinated by the Ministry, which constitutes an Inter-Departmental Committee to address grievances under Rule 13. The Committee consists of representatives from the Ministry of Information and Broadcasting, the Ministry of Women and Child Development, the Ministry of Law and Justice, and other relevant Ministries.[31] Its purpose is to obtain a holistic, all-encompassing view of violations under the Rules, which may arise through grievances at Levels I and II, on a suo motu basis, or on reference by the Ministry.[32] Powers similar to those granted under Rule 11(5) apply, including the right to initiate the procedure under Rule 14, which allows the Committee to take action to ascertain the creator of violative content and to block that content.
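Procedurally, the three tiers behave like a simple escalation ladder keyed to the 15-day Level I window. A hypothetical sketch (the function, its inputs and the category labels are illustrative; only the 15-day window and the tier structure come from the Rules as summarized above):

```python
# Hypothetical sketch of the escalation ladder described above; only the
# 15-day Level I window reflects the Rules as summarized here.
from datetime import date, timedelta

LEVEL_I_WINDOW = timedelta(days=15)

def current_tier(filed_on: date, today: date,
                 resolved_by_entity: bool,
                 body_failed_to_resolve: bool) -> str:
    """Which tier of the grievance mechanism applies to a complaint."""
    if resolved_by_entity:
        return "Closed at Level I (self-regulation by the entity)"
    if today - filed_on <= LEVEL_I_WINDOW:
        return "Pending at Level I (self-regulation by the entity)"
    if not body_failed_to_resolve:
        return "Level II (self-regulating body of the entities)"
    return "Level III (oversight mechanism of the Central Government)"

print(current_tier(date(2021, 7, 1), date(2021, 7, 20), False, False))
# -> Level II: the 15-day Level I window lapsed without resolution
```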

Code of Ethics

The underlying thread binding the Rules together is the Code of Ethics set out in the Appendix.[33] It spans news and current affairs, online curated content and advertisements.

Online Curated Content

Taking a comprehensive, in-depth approach to regulating online curated content, the Code makes allowance for India's multi-racial and multi-religious character, requiring due caution and respect in the depiction of any group's activities, beliefs or practices. It classifies content on the basis of its target audience, assigning:

  • 'U' rating for content suitable for children and people of all ages

  • 'U/A 7+' for content suitable for persons aged 7 and above, which a person below the age of 7 years may view only with parental guidance

  • 'U/A 13+' which requires parental guidance for viewers below the age of 13 years

  • 'U/A 16+' for persons below 16 years requiring parental guidance, and

  • 'A' for content solely reserved for viewing by adults

Further classifications may be made on the basis of themes and messages, violence, sex, nudity, drug and substance abuse, etc. These classification ratings must be displayed in a conspicuous and unambiguous manner and place, so that the user is aware and informed. Access-control mechanisms, such as parental locks, ought to be provided for content classified as U/A 13+ or higher, and, in the same spirit, a reliable age-verification mechanism must be established for content rated 'A'. A sketch of the implied gating logic follows.
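The sketch below is purely illustrative (the function and the exact control descriptions are hypothetical); the rating ladder and the U/A 13+ and 'A' requirements are as stated above:

```python
# Hypothetical sketch of the gating logic implied by the classification
# scheme; rating labels follow the Schedule as described above.
RATING_LADDER = ["U", "U/A 7+", "U/A 13+", "U/A 16+", "A"]

def required_controls(rating: str) -> list[str]:
    """Access-control measures a publisher should pair with a rating."""
    controls = ["display the classification rating conspicuously"]
    if RATING_LADDER.index(rating) >= RATING_LADDER.index("U/A 13+"):
        controls.append("offer a parental lock")  # U/A 13+ and above
    if rating == "A":
        controls.append("verify the viewer's age reliably")
    return controls

for rating in RATING_LADDER:
    print(rating, "->", required_controls(rating))
```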

Criticism

The introduction of the concept of tracing the first originator under Rule 5(2) has been perceived as contentious and worrisome. It requires significant social media intermediaries providing messaging services to allow the enforcement machinery to identify the originator of any information, with the aim of curbing the spread of fake news and illegal activity over messaging applications. Cyber experts fear, however, that this will eventually override end-to-end encryption and allow the formation of a surveillance state. The result could be a major breach of the privacy that most messaging applications wear on their sleeve as a badge of honour. The authority to trace the originator can also be invoked to prevent or investigate an offence relating to the sovereignty, integrity and security of the State. What the Rules fail to address is the considerable scope for misuse of such a wide and discretionary power.

In addition, members of the media fraternity fear that the Rules' implementation will erode the freedom of speech. Under the grievance redressal mechanism, the executive has been authorized, through the Oversight Mechanism, to rule on the suitability of content published by the media, an unprecedented move that may be perceived as ultra vires the Constitution. An inter-ministerial committee of bureaucrats has been granted the authority to adjudicate on matters relating to free speech and journalistic freedom, which may in turn prove detrimental to both.

[1] The IT Act can be accessed through Appendix [.] or https://bit.ly/3G2xhTC

[2] IT Act, as originally enacted, Section 2(w).

[3] The previous text of Section 79 of the IT Act read: "Section 79: Network service providers not to be liable in certain cases. - For the removal of doubts, it is hereby declared that no person providing any service as a network service provider shall be liable under this Act, rules or regulations made thereunder for any third party information or data made available by him if he proves that the offence or contravention was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence or contravention."

[4] Avnish Bajaj v. State, para 6, MANU/DE/1357/2004 (Delhi High Court), accessible at Appendix [.]

[5] Appendix [.] or https://www.meity.gov.in/writereaddata/files/Addendum1_Public_comments_on_draft_intermediary_guidelines.pdf

[6] Ibid.

[7] Rule 4(1)(b), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[8] W.P.(CRL) 1082/2020 & Crl. M.A. Nos. 9485/2020, 10986-87/2020.

[9] (2017) NSWSC 1300.

[10] Case C-131/12; ECLI:EU:C:2014:317.

[11] Case C-18/18; ECLI:EU:C:2019:821.

[12] (2015) 5 SCC 1.

[13] CS(OS) No. 160/2017.

[14] 2018 SCC OnLine Del 9439.

[15] 2019 SCC OnLine Del 10701.

[16] Rule 4(1)(c), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[17] Rule 4(1)(f), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[18] Rule 4(1)(g), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[19] Rule 4(1)(n), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[20] Rule 2(1)(y), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[21] Rule 6(1), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[22] Rule 5(1), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[23] Rule 7(1), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[24] Rule 7(2), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[25] Rule 7(4), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[26] Rule 10(2) & (3), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[27] Rule 10(4), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[28] Rule 10(6), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[29] Rule 11(1) & (2), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[30] Rule 11(5), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[31] Rule 13(1), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[32] Rule 13(3), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[33] Appendix (Code of Ethics), Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
