
Safe harbour provisions: IT Act, platform immunity; Part 3.1

UNDERSTANDING THE JOURNEY OF PLATFORM IMMUNITY BEFORE THE INFORMATION TECHNOLOGY (INTERMEDIARIES GUIDELINES) RULES OF 2011 REVISED WITH THE NEW IT RULES, 2021 by SFLC[1]*


When Internet platforms were growing their business in the United States, they were considered bastions of free speech and were given ‘safe-harbour’ against third party content to promote innovation, on the condition that they would self-regulate their platforms for illegal content.[2] Over time, these companies acquired millions of users around the world and began centralizing power by subsuming smaller businesses within themselves.[3] For example, Facebook had acquired both WhatsApp and Instagram by 2014 to consolidate its business into a social media and private communications behemoth.[4]


As these platforms grew, it became increasingly difficult for them to self-regulate the large volume of content flowing through their pipelines. The misuse of data available on platforms, the growing menace of disinformation and misinformation online, increasing calls for the imposition of greater liability on intermediaries for third party copyright infringement, demands for access and assistance from law enforcement agencies, and the rampant harassment and abuse of women and other vulnerable groups have highlighted the failures of these tech companies in regulating their channels. Not only did companies fail to police their platforms, they developed business models that directly conflicted with any such objective. Their business of advertisement sales came to be based on a continuous flow of behaviour data acquired by monitoring users of their platforms. The profitability of this business depends on maximizing the amount of time users spend on a platform. So, the goal became algorithmic recommendation of arresting content that sticks eyeballs to the platform and causes behaviour that can be used to profile readers for advertisers. Thus the real objective of the system is almost directly in opposition to the perceived social good the platforms are supposed to further.


By monitoring users’ reading and behaviour, the platform companies ceased to be the neutral conduits for “user-generated content” that justified their safe-harbour immunity in the first place. By collecting and analyzing all their users’ behaviour, and aggregating what they captured themselves with all the other personally-related information they could buy, the platforms ceased to perform the task of democratizing expression: that became a by-product of their real effort, which was, in the phrase originally adopted by the US national security agencies, Total Information Awareness.



The platforms that were granted safe harbour protections were expected to police their platforms but have failed miserably to do so. Victims of online abuse and harassment have no leverage to insist that platforms respond to their complaints. Companies have failed to establish mechanisms to address complaints swiftly and continue to play the game of “lex loci server”, claiming they only have “sales offices” in India. On the other hand, governments more often than not have used this failure, ambiguity and secrecy to enact overly broad legislation that facilitates censorship by proxy and stifles innovation.


Countries around the world have called for greater regulation of their activities. In 2017, Germany enacted a law for the takedown of illegal content. As of the date of publication of this report in 2019, an anti-encryption law has emerged in Australia, the proposed EU copyright directive requires proactive content filtering, and the draft EU terrorist content regulation requires takedowns within an hour of content being flagged.



Internet platforms have systematically failed to protect user rights in certain particularly egregious cases. In India, per certain estimates, 33 people were killed in 69 incidents of mob violence between January 2017 and July 2018, their “lynchings” being linked to messages or “fake news” spread on WhatsApp, the Facebook-owned messaging platform.[5]


In 2018, Facebook was used to spread anti-Rohingya propaganda inciting murders, rapes and the largest forced human migration in recent history.[6] Most of the 18 million Internet users in Myanmar consider Facebook to be the Internet. It was reported that members of the Myanmar military were the prime operatives behind the systematic campaign, exploiting the wide reach of Facebook.[7] The social media platform was accused of doing little to prevent the harmful content from proliferating on its platform. Even though Facebook eventually deactivated the accounts of the military personnel, millions of sham accounts went undetected.[8]


In the United States, the role of platforms like Facebook and Twitter in the 2016 presidential election has given way to society-wide skepticism about tech companies and invited a kind of backlash that was unimaginable a few years ago.[9] Following the use of Facebook advertisements by Russian provocateurs, Senators Mark Warner (D-VA) and Amy Klobuchar (D-MN) introduced the Honest Ads Act, which would require platforms to make “reasonable efforts” to bar foreign nationals from purchasing certain categories of political advertisements during campaigns.[10]


During the media blitzkrieg following the Cambridge Analytica scandal and before his US Congressional hearing, Mr. Zuckerberg said in an interview to CNN, “I actually am not sure we shouldn’t be regulated. I think, in general, technology is an increasingly important trend in the world. I think the question is more what is the right regulation rather than ‘yes or no’ should we be regulated?”[11]


Intermediary liability – the focus of this report – illustrates how lawmakers were forced by the Internet to conceptualize and implement new approaches to an old legal construct, i.e. vicarious liability. Intermediaries like blogging platforms, discussion boards and social media sites that offer platforms for users to publish self-generated content, search engines that index and provide access to user-generated content, online shopping sites that allow users to trade in products/services and so on raised the question: who is to be held liable in the event that some products, services, or content hosted by these intermediaries are found to be unlawful?


The answer to this question has been different in different jurisdictions.


While some jurisdictions like Thailand and China hold intermediaries strictly liable for user-generated content, others like the European Union and the United States grant them conditional immunity from liability, where compliance with certain conditions specified under relevant laws immunizes intermediaries from the consequences of unlawful user-generated content. India’s own Information Technology Act, 2000 was amended in 2008 to introduce such a safe-harbour regime, and the Information Technology (Intermediaries Guidelines) Rules, 2011 specified certain due-diligence criteria that intermediaries were to observe in order to qualify for immunity. The initial version of this regime was plagued by several problems, including ambiguity in prohibited content and forced adjudication by intermediaries, but many of these problems were resolved by a historic judgment of the Supreme Court of India in 2015 in the matter of Shreya Singhal v. Union of India. Subsequently, on December 24, 2018, the Ministry of Electronics and Information Technology issued Draft Rules proposing to amend the 2011 Rules to include prescriptive obligations on intermediaries such as enabling traceability of the originator of information, deploying automated tools for proactive monitoring of content, and incorporation under the Companies Act. The reason for this, as provided by MeitY, was “Misuse of Social Media and spreading Fake News”.


In India, the regulation of intermediaries is spread out across various laws and sub-legislations. Apart from the IT Act, India’s copyright law institutes a notice-and-takedown regime for intermediaries. Sector-specific regulation, such as data localisation requirements under the rules of the Reserve Bank of India for fintech players and licensing requirements for telecom and Internet service providers, also applies. In addition, the courts have interpreted the law with substantial variance, making the intermediary liability landscape of India complicated enough to cause confusion to tech companies.


Due to the lapses in judgment of intermediary platforms in the situations highlighted above, sovereign states around the world are demanding more accountability from them for user-generated content on their portals. While imposing regulations on Internet companies, nation states must be mindful that such rules are not overly broad, hampering basic digital rights such as privacy and free speech in the online world.



WHAT IS INTERMEDIARY LIABILITY?

Defining an intermediary

An intermediary in the context of the Internet can be understood as an entity that acts as a facilitator of the flow of data across the vast and complex synapses of the Internet. While the actual functions of intermediaries are dynamic and often not clear-cut, they can broadly be seen as falling into one of two categories, i.e. conduits for data traveling between nodes of the Internet, or hosts for such data.[12] An Internet intermediary could therefore refer to Telecom Service Providers (TSPs) that supply network infrastructure like optic fiber cables and spectrum bandwidth over which Internet data is transmitted, Internet Service Providers (ISPs) that utilize this infrastructure to offer Internet connectivity to the public, web-hosting platforms that provide servers on which Internet data is stored, search engines that sort through and index petabytes of data for easy retrieval, and the myriad online services that provide ways for end-users to leverage the power of the Internet for the efficient conduct of activities like commerce, governance, education, entertainment, and social networking, to name a few. In other words, intermediaries play very crucial roles in the functioning of the Internet. Owing to the complex and diverse nature of functions performed by intermediaries, significant variations can be seen in global and national efforts at formally defining the term. The Organisation for Economic Co-operation and Development (OECD) in April 2010 proposed that “Internet intermediaries” be defined as follows:[13]


“Internet intermediaries bring together or facilitate transactions between third parties on the Internet. They give access to, host, transmit and index content, products and services originated by third parties on the Internet or provide Internet-based services to third parties.”


The OECD also identified the following as falling within the scope of this definition, though it was also careful to leave room for future expansion: ISPs, data processing and web-hosting providers, search engines, e-commerce platforms, Internet payment systems, and participative networking platforms. This definition was also cited by the United Nations Educational, Scientific and Cultural Organization (UNESCO) in a 2014 report on Internet freedoms.[14]


Some national jurisdictions, on the other hand, have chosen not to define the term “intermediary” as such in relevant laws. Instead, broader alternate terms like “information society services”[15] and “interactive computer services”[16] are employed, and intermediary regulations are incorporated into law without referencing the term “intermediary”.


The above being said, this report examines intermediary liability primarily in the context of Indian law. As such, the best place to look to understand the term “intermediary” for the purposes of this report is the IT Act, specifically Section 2(1)(w), which defines the term in some detail.


Section 2(1)(w) reads:


“Intermediary, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, Internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes.”[17]

According to Section 2(1)(w) of the IT Act therefore, an intermediary is any person who receives, stores or transmits an electronic record on behalf of another person or provides any service with respect to that record.[18] The Section then clarifies that the term includes telecom service providers, network service providers, Internet service providers, web hosting service providers, search engines, online payment sites, online auction sites, online marketplaces and cyber cafes.[19] This list is non-exhaustive and Section 2(1)(w) also covers entities such as social media websites, blogging platforms, message boards, consumer review websites and so on. In other words, virtually any website that features user-generated content and a large number of Internet service providers fall within the definition of an intermediary under Section 2(1)(w) of the IT Act.


User-generated content and liability

“Intermediary liability”, to put it simply, refers to the extent of liability that an intermediary stands to incur due to the non-permissibility under law of content they deal in. Since intermediaries neither create nor modify content, the predominant consensus has been that it would be inequitable to hold them strictly accountable for unlawful user-generated content. Users of intermediary services are the true content creators, and as such it has generally been felt that they should be the ones made to answer for the illegality of content hosted or transmitted on intermediary platforms, unless intermediaries have meaningful degrees of editorial control. However, some jurisdictions such as China and Thailand have opted to see things differently and maintained that it is the responsibility of platform providers, i.e. intermediaries, to ensure that the content they host or transmit remains within the confines of legal permissibility.


Based on these divergent viewpoints, three broad models of intermediary liability have emerged globally, as pointed out by Article 19 in their 2013 report titled “Internet Intermediaries: Dilemma of Liability”.[20] These are:


a) The strict liability model: Intermediaries are held unconditionally liable for user-generated content. Intermediaries are effectively required to monitor content in order to comply with the law; if they fail to do so, they face a variety of sanctions, including the withdrawal of their business license and/or criminal penalties. Examples include Thailand and China.


b) The safe-harbour model: Intermediaries are given conditional immunity from liability arising out of user-generated content, i.e. if they comply with certain requirements laid out under law. This model can be further divided into:


i) The vertical model: Liability is determined according to the type of content at issue. No distinctions are made as to the type of service provided by intermediaries, e.g. hosting vs. transmitting.


ii) The horizontal model: Liability is determined according to the kind of function performed by the intermediary. Intermediaries acting only as transmitters of content may thus be exempted unconditionally from liability, whereas those acting as hosts may be held to more stringent standards. The latter may forfeit immunity if they do not expeditiously remove unlawful content on being notified.


The safe-harbour model is also characterized by the existence of “notice-and-takedown” processes, which are legally prescribed procedures that clearly outline how content takedown requests must be received and processed by intermediaries. Intermediaries may further be encouraged to institute some form of technology-based or self-regulatory content filters so as to prevent the publication of unlawful content. The EU e-commerce Directive, US Digital Millennium Copyright Act and the Indian IT Act are legislations that employ this model of intermediary regulation.


c) The broad immunity model: Intermediaries are given broad, at times conditional, immunity from liability arising out of user-generated content. Notably, intermediaries are also expressly excluded from any obligation to monitor for unlawful content. This model treats intermediaries as messengers who merely transmit content on behalf of users, rather than as publishers of content. Section 230 of the US Communications Decency Act is an example of this model.


Regardless of the model, almost all regulatory regimes overseeing Internet intermediaries obligate intermediaries to remove unlawful content from their platforms upon being asked to do so in accordance with applicable legal procedures. This, coupled with the fact that the availability of immunity from liability is contingent in some regulatory regimes on expeditious compliance with takedown requests, means that regulators and intermediaries alike must be mindful of the impact of their actions on freedom of expression, which is a fundamental human right recognized under almost all major national and international jurisdictions. Regulators that impose ambiguous content limitations or ask intermediaries to remove content based on their own judgement while running the risk of forfeiting safe-harbour protection for non-removal, as well as intermediaries that over-comply with takedown requests, will adversely impact freedom of expression. Google’s transparency reports show that there has been a sharp increase in the number of content takedown requests received from governments in recent times. While Google received 1,031 such requests in the second half of 2009, this number climbed to 15,961 in the second half of 2016, representing a fifteen-fold increase.[21] The latest report reveals that 25,534 requests were received in the first half of 2018 alone.[22] According to this report, national security is the most cited reason for takedown requests, with 11,430 and 17,999 requests in the years 2016 and 2017 respectively.[23] This is followed by defamation, with an increase from 3,440 to 4,257 requests from 2016 to 2017.[24] Takedown requests on the basis of ‘Privacy and Security’ have also increased, from 2,404 to 2,497 requests over the same period.[25]

The Intermediary Liability Regime in India

3.1 Enlarging the Scope of Safe-Harbour Protection

The Indian Government enacted the IT Act[26] to provide legal recognition to e-commerce, to facilitate electronic filing of documents with government agencies, and to amend other existing laws like the Indian Penal Code, 1860 and the Indian Evidence Act, 1872. This followed the UN General Assembly’s adoption of the Model Law on Electronic Commerce issued by the United Nations Commission on International Trade Law,[27] to which India was a signatory. According to the Statement of Objects and Reasons of the IT Act, “There is a need for bringing in suitable amendments in the existing laws in our country to facilitate e-commerce. It is, therefore, proposed to provide for legal recognition of electronic records and digital signatures.”

At the time the IT Act was enacted, the definition of the term ‘intermediary’ was as follows:

Section 2(1)(w):

“intermediary” with respect to any particular electronic message means any person who on behalf of another person receives, stores or transmits that message or provides any service with respect to that message.

Section 79 is currently the provision that guarantees safe-harbour protection to intermediaries for third party content. Section 79 of the original Act only protected network service providers[28] from liability arising from third party content, if they proved absence of knowledge, or that they had exercised all due diligence to prevent the commission of an offence/contravention.[29]

Subsequently, an amendment to the IT Act in 2008[30] (“the IT Amendment Act”) made substantial changes to Section 79 (the safe-harbour provision) and the definition of intermediaries. One of the triggers for amending the IT Act in 2008, specifically for widening the protection given to intermediaries, was the MMS scandal affecting Baazee.com (at that time, a wholly owned subsidiary of eBay Inc., USA). In this case, an MMS clip containing sexually explicit content was listed for sale on Baazee.com (an e-commerce website). For the sale of such content on its website, Avnish Bajaj, the then Managing Director of Baazee.com, was arrested and criminally charged under provisions of the Indian Penal Code, 1860 (“the IPC”) and the IT Act dealing with acts of obscenity. In a petition challenging the criminal charges against him, the Delhi High Court in Avnish Bajaj v. State[31] held that a prima facie case for obscenity may be made against Baazee.com. It could not be made against Avnish Bajaj under provisions of the IPC, but he could be charged with publishing obscene content in electronic form as per Section 67 of the IT Act[32] (it is important to note that Baazee.com was not arraigned in the case as an accused). The court in its judgment stated that owners or operators of websites that offer space for listings might have to employ content filters to prove that they did not knowingly permit the use of their website for pornographic material.[33] On an appeal by Avnish Bajaj against the charge under Section 67 of the IT Act, the Supreme Court of India in 2012[34] quashed the proceedings against him on the ground that prosecution of the Managing Director could not go ahead without arraigning the company as an accused party.
Drawing parallels between the Negotiable Instruments Act, 1881 and the IT Act in terms of offence by companies and the consequent liability of its officers, the court held that vicarious liability will only arise when the company is arraigned as an accused party.[35]


The IT Amendment Act enlarged the definition of the word ‘intermediary’[36] to cover service providers like telecom service providers, Internet service providers, search engines, online marketplaces and even cyber cafes. It also widened the safe-harbour protection given under Section 79[37] from only network service providers to all intermediaries, and protected intermediaries from all unlawful acts rather than only the offences and contraventions covered under the IT Act itself. This new provision adopted a function-based approach, wherein if the intermediary (a) only provided access to a communication system for information made available by third parties, which is transmitted or temporarily stored/hosted; and (b) did not initiate the transmission, select the receiver or select/modify the information, then it could claim protection under this provision for content made available by third parties (user-generated content).

The amended provision made this safe-harbour protection available to intermediaries subject to certain conditions:

I. Observance of due diligence and certain guidelines issued by the Central Government;


II. Not conspiring, abetting, aiding or inducing the commission of the unlawful act; and


III. Upon receiving ‘actual knowledge’ or being notified by the government, taking down unlawful content.


In the Report of the Expert Committee, set up by the Ministry of Information and Technology in 2005 to recommend changes to the IT Act, the rationale for amending the safe-harbour provision, i.e. Section 79, was explained as bringing it in line with the EU’s Directive on e-commerce (2000/31/EC).[38]


3.2 ‘Due Diligence’ Guidelines for Attaining Safe-Harbour


After the amendment to the IT Act in 2008, which incorporated the ‘due-diligence’ requirement for intermediaries claiming safe-harbour, the Government of India on 11th April, 2011 issued the Information Technology (Intermediaries Guidelines) Rules, 2011[39] (“the Intermediaries Guidelines”). The Intermediaries Guidelines, inter alia, brought in the following conditions, which all intermediaries had to adhere to for their safe-harbour protection:[40]


a) Publishing rules/regulations; privacy policies; user agreements;


b) Terms and conditions specifying prohibited content: content that is grossly harmful, harms minors, infringes intellectual property rights, or contains viruses (among other things);[41]


c) A strict notice and takedown process;


d) Assistance to government agencies for law enforcement;


e) A duty to report cyber security incidents to the government; and


f) Appointment and notification of a grievance officer.


According to the thirty-first report of the Parliamentary Committee on Subordinate Legislation,[42] which studied the Intermediaries Guidelines among other delegated legislation notified by the Indian Government under the IT Act, there were a number of ‘infirmities’ with the Intermediaries Guidelines, which the report identified as:

a) Ambiguous and Vague Terms: the committee recommended that to remove such ambiguity, terms borrowed from other laws should be incorporated within the guidelines, and undefined terms should be defined and inserted into the text.


b) Removal of Content by Intermediaries: the committee recommended that there be greater clarity on the notice and takedown process and that there be safeguards to protect against any abuse during such process.


c) Reconstitution of the CRAC (the Cyber Regulations Advisory Committee): the committee recommended that the CRAC be reconstituted, finding that the CRAC had met only twice since the enactment of the IT Act in the year 2000. According to the committee, MeitY would benefit from the advice of the CRAC, and it should incorporate members who represent the interests of those principally affected and who have special knowledge of the subject matter.


Unfortunately, none of the recommendations made by the Committee on Subordinate Legislation were incorporated by the government either at the time of such consultation or subsequently.



3.3 Narrowing the scope of ‘actual knowledge’

In a batch of writ petitions filed before the Supreme Court of India starting from 2012, a number of provisions of the IT Act were challenged – Section 66A (punishment for sending offensive messages), Section 69A (power to block websites) and Section 79 (the safe-harbour provision) – for severely affecting the fundamental right to free speech and expression guaranteed under Article 19(1)(a) of the Constitution of India. This case, Shreya Singhal v. Union of India,[43] popularly known as the Shreya Singhal judgment, struck down Section 66A of the IT Act as unconstitutional for having a chilling effect on free speech. (Section 66A[44] provided for punishment for sending offensive messages through communication services. It created criminal liability for sending information which was grossly offensive, inconvenient, insulting, dangerous, etc.)


This was a landmark judgment in the Supreme Court’s jurisprudence, as for the first time the court recognized Indian citizens’ free speech rights over the Internet and struck down a draconian provision of the IT Act. As India’s Constitution provides for ‘reasonable restrictions’ on free speech in certain circumstances [as per Article 19(2) of the Constitution],[45] the court in Shreya Singhal tried to read the elements of Article 19(2) into Section 66A but found it could not do so.


On the issue of intermediary liability, the Supreme Court read down Section 79 and held that the ‘actual knowledge’ requirement for an intermediary to take down content has to be read to mean either an intimation in the form of a court order or a notification by the government, and such requests must be restricted to the limitations listed in Article 19(2) of the Constitution. The court similarly read down the ‘actual knowledge’ requirement from the Intermediaries Guidelines, which operationalised the notice and takedown mechanism under law:


“119. (c) Section 79 is valid subject to Section 79(3)(b) being read down to mean that an intermediary upon receiving actual knowledge from a court order or on being notified by the appropriate government or its agency that unlawful acts relatable to Article 19(2) are going to be committed then fails to expeditiously remove or disable access to such material. Similarly, the Information Technology “Intermediary Guidelines” Rules, 2011 are valid subject to Rule 3 sub-rule (4) being read down in the same manner as indicated in the judgment.”


This marked a significant change in the intermediary liability regime in India, as previously any person could request intermediaries to take down content if they felt it was unlawful. The law had also placed intermediaries in the precarious position of adjudging the legality of content on their platforms, which directly conflicted with their status as mere functionaries. In fact, the Supreme Court in Shreya Singhal acknowledged that intermediaries like Google and Facebook would otherwise have to act upon millions of takedown requests, making them the adjudicators of which requests were legitimate according to law.[46]



The following inferences can be drawn to broadly sum-up India’s Intermediary Liability law:


a) Intermediaries need to fulfill the conditions under Section 79 of the IT Act as discussed above (conditional safe-harbour);


b) Intermediaries are required to comply with all requirements listed under the Intermediaries Guidelines (due diligence rules); and


c) Intermediaries, apart from enforcing their own terms and conditions and privacy policies, are liable to take down content from their platforms only when notified by a court or an authorised government agency,[47] and that too only for matters listed under Article 19(2) of the Constitution (the actual knowledge requirement).


3.4 Proposed Amendment to Intermediaries Guidelines


On 24th December, 2018, MeitY released the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018 (“the Draft Rules”) to amend the existing Intermediaries Guidelines. These Draft Rules sought to introduce requirements on intermediaries such as tracing of the originator of information for assistance to law enforcement, deployment of automated tools for proactive filtering of unlawful content, takedown of illegal content within 24 hours, and mandatory incorporation in India of companies having 5 million+ users (among other things).[48]

In a press note issued by MeitY[49] alongside the Draft Rules, it was mentioned that social network platforms are required to follow due diligence as provided in Section 79 of the IT Act and the Rules notified thereunder; subject to the import of Article 19(2) of the Constitution, they have to ensure that their platforms are not used to commit and provoke terrorism, extremism, violence and crime. The press note also states that instances of misuse of social media platforms by criminals and anti-national elements have brought new challenges to law enforcement agencies, such as inducement for recruitment of terrorists, circulation of obscene content, spread of disharmony, incitement of violence, threats to public order, fake news, etc. The press note points to fake news/rumours circulated on WhatsApp and other social media platforms in connection with various mob-lynching incidents reported across India in the last year. As MeitY has not issued any other official statement explaining its intent in revising the Intermediaries Guidelines under the IT Act, the Draft Rules need to be read in conjunction with the press note for a critical examination of the proposed changes therein.



MeitY invited comments on the Draft Rules and received responses from around 150 stakeholders, a number of them expressing concerns about the proposed guidelines’ capacity to severely affect the free speech and privacy rights of citizens online.[50]


Key Issues with the Draft Rules


A. The Traceability Requirement: Rule 3(5) of the Draft Rules requires intermediaries to enable the tracing of the originator of information on their platforms as may be required by authorised government agencies. The most concerning aspect of this requirement is how it will affect intermediaries like WhatsApp and Signal, which provide personal communication services that are end-to-end encrypted,[51] i.e. where even the service provider does not have access to the content of the messages/information flowing through its platform. Introducing a traceability requirement for end-to-end encrypted services will lead to the breaking of such encryption and thus compromise the privacy of individuals who use such services for their private communication. In August 2017, a nine-judge bench of the Supreme Court in KS Puttaswamy v. UOI (“the Privacy Judgment”)[52] held the right to privacy[53] to be a fundamental right guaranteed under the Constitution of India.[54]
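To see why traceability and end-to-end encryption are in tension, consider a toy sketch. The XOR one-time pad below is only a stand-in for the real ciphers such messengers use (e.g. the Signal protocol), but it illustrates the structural point: the key exists only on the two endpoints, so a relay server asked to identify an 'originator' by inspecting content has nothing readable to work with.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR one-time pad: a toy stand-in for the ciphers used by
    # end-to-end encrypted messengers. XOR is its own inverse.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
# The key is generated and shared only between the two endpoints;
# the relay server never holds it.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(key, message)  # what the sender transmits
server_view = ciphertext               # all the server can store or inspect

assert server_view != message                    # server cannot read the content
assert xor_cipher(key, server_view) == message   # the recipient, holding the key, can
```

Any traceability mandate therefore requires either weakening the cipher or attaching identifying metadata outside the encrypted envelope, which is precisely the privacy concern raised above.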


B. Proactive Filtering of Content: Rule 3(9) of the Draft Rules requires intermediaries to deploy automated tools for proactive filtering of unlawful content on their platforms. Online intermediaries are considered channels of distribution that play a merely neutral, technical and non-adjudicatory role. This Rule requires intermediaries to scrutinize user-generated content and determine its legality, a task which must be undertaken by the judiciary, considering that there are no clear standards of what is ‘unlawful’. This provision for proactive content filtering is against the judgment in Shreya Singhal (as discussed above), wherein the Supreme Court of India held that intermediaries are neutral platforms that do not need to exercise their own judgment to decide what constitutes legitimate content.


Automated moderation systems in use today rely on keyword tagging, which is then followed by human review. Even the most advanced automated systems cannot, at the moment, replace human moderators in terms of accuracy and efficiency. This is mainly because artificial intelligence is currently not mature enough to understand the nuances of human communication, such as sarcasm and irony.[55] It should also be noted that global communication is influenced by cultural differences and overtones which an effective system of content moderation has to adapt to. Given the nascent stage of AI at the moment, it may be short-sighted to rely on this technology.

As societies evolve and change, so does the definition of “grossly harmful / offensive content”.

This implies that algorithms have to constantly understand nuanced social and cultural context that varies across regions. Research on AI has not yet produced any significant sets of data for this kind of understanding. The immediate result of using automated tools will be an increase in content takedowns and account suspensions which in turn will lead to over-censorship as has been seen around the world. Legitimate users (content creators) including journalists, human rights activists and dissidents will have their speech censored on a regular basis.


YouTube’s “Content ID” system for detecting content that infringes copyright is notorious for over-censoring innocent material. Using AI without human intervention to detect hate speech, misinformation, disinformation, trolling, etc., which is an even more nuanced task than identifying copyrighted material, will be catastrophic for freedom of speech and expression on the Internet.


The key limitations of natural language processing tools are:[56]


1. Natural language processing (“NLP”) tools perform best when they are trained and applied in specific domains, and cannot necessarily be applied with the same reliability across different contexts;


2. Decisions based on automated social media content analysis risk further marginalizing and disproportionately censoring groups that already face discrimination. NLP tools can amplify social bias reflected in language and are likely to have lower accuracy for minority groups who are under-represented in training data;


3. Accurate text classification requires clear, consistent definitions of the type of speech to be identified. Policy debates around content moderation and social media mining tend to lack such precise definitions;


4. The accuracy and intercoder reliability challenges documented in NLP studies warn against widespread application of the tools for consequential decision-making; and


5. Text filters remain easy to evade and fall far short of humans’ ability to parse meaning from text.
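Point 5 above is easy to demonstrate. A minimal keyword filter of the kind automated tagging relies on (the blocklist term here is purely hypothetical) is both trivially evaded and blind to context, flagging counter-speech while missing obfuscated abuse:

```python
import re

BLOCKLIST = {"badword"}  # hypothetical prohibited term, for illustration only

def naive_filter(text: str) -> bool:
    """Flag a post if any lowercase alphabetic token matches the blocklist."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return any(token in BLOCKLIST for token in tokens)

assert naive_filter("post containing badword")            # direct match is caught
assert not naive_filter("post containing b a d w o r d")  # trivially evaded by spacing
assert not naive_filter("post containing badw0rd")        # or by leetspeak
# context-blindness cuts the other way: counter-speech is flagged too
assert naive_filter("activists condemning badword get silenced")
```

Real moderation pipelines are more sophisticated, but the underlying failure modes (evasion and context-blindness) are the same ones documented in the NLP studies cited above.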


C. Local Office, Incorporation and Appointment of Nodal Officer: Rule 3(7) of the Draft Rules requires all intermediaries with more than 5 million users in India to be incorporated, have a permanent registered office in India with a physical address, and appoint a nodal officer and a senior functionary for 24-hour coordination with law enforcement agencies. At present there is a lack of clarity about what this number of users refers to, i.e. whether it refers to daily, monthly or yearly users, or the total number of registered users. To understand the implication of this requirement, reference to the user base of popular messaging apps is pertinent. WhatsApp, India’s most popular chatting app, has around 200 million users in India. The relatively newer chatting applications Hike[57] and ShareChat[58] have 100 million and 25 million users respectively. The 5 million users specified in the Draft Rules represent around 1% of the Internet user base in India, which might bring a substantial number of intermediaries under a new set of compliance requirements. This may cause many start-ups to bear the brunt of high costs stemming from incorporation under the Indian companies law, the Companies Act, 2013.
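The 1% figure is back-of-the-envelope arithmetic: it assumes an Indian Internet user base of roughly 500 million, a figure consistent with the text's claim (the exact number is our assumption, not stated in the rules themselves).

```python
# Rough check of the ~1% claim in the text.
internet_users_india = 500_000_000  # assumed user base at the time
rule_threshold = 5_000_000          # Rule 3(7) user threshold

share = rule_threshold / internet_users_india
assert share == 0.01  # 5 million users is 1% of a 500-million user base
```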


D. Ambiguous Terms: The Draft Rules contain mandates regarding a broad category of content that is classified as ‘unlawful’. Such a broad category of content, described using terms such as “grossly harmful”, “harassing” and “blasphemous”, could result in a chilling effect, with intermediaries being forced to remove even lawful content.[59]


Intermediary Liability in Reality


Shreya Singhal brought welcome respite to Internet intermediaries in India, as they were no longer required to act upon sundry requests for content takedowns and could rely on court orders or notifications of authorised government agencies. The judgment also upheld the constitutionally guaranteed free speech rights of citizens on the Internet and clarified that restrictions on speech need to be within the contours of Article 19(2) of the Constitution. The court held that -


“86. That the content of the right under Article 19(1)(a) (free speech right) remains the same whatever the means of communication including Internet communication is clearly established …” Problems remain, though: constitutional limits on free speech, such as the security of the state, public order, decency/morality, defamation and incitement to an offence, are not defined. Courts have established various tests for each of these limits, but they are to be applied based on the facts and circumstances of each case. The ambiguity surrounding the meaning of these words and phrases might make it difficult for intermediaries to act upon orders received from competent authorities based on these limits.


Phrases used in the Intermediaries Guidelines, which online platforms are required to incorporate in their terms and conditions, remain vague and undefined. According to these, content that is grossly harmful, hateful or blasphemous must not find a place on intermediary platforms. Following Shreya Singhal, such a mandate must come from courts or the government, but platforms might take down similar content relying on their community guidelines or terms and conditions, which may lead to private censorship.


Then there is the reality of online platforms being utilised by bad actors to disseminate disinformation, terrorist content, child pornography, etc., pushing governments around the world to hold intermediaries more accountable for third-party content on their platforms. In India, public lynchings attributed to rumour-mongering on intermediary platforms have resulted in the government wanting to bring in changes such as automated content filtering and traceability, which will have negative effects on rights like free speech and privacy. Countries across the world are pressuring intermediaries to be more responsible for the content flowing through their platforms. Though intermediary liability needs to be revisited in the current global context, any changes to law and regulation must ensure that they do not abrogate basic human rights.


Content takedown requests are sometimes also received by intermediaries in the form of orders of law enforcement agencies under Section 91 of the Code of Criminal Procedure, 1973 (“CrPC”).[60] Section 91 empowers courts and authorised police officers to summon the production of ‘any document or other thing’ which may be required for conducting an investigation.[61] The IT Act gives enough powers to central and state governments for intercepting, monitoring, decrypting and taking down content from online platforms.[62] No part of Section 91 of the CrPC gives powers to law enforcement agencies to have content taken off online platforms; it only provides for summoning of documents to aid investigation. Despite the specific applicability of the IT Act in matters of online content,[63] law enforcement agencies fall back on general laws such as the CrPC to issue orders for content takedowns. The courts in India have held intermediaries more accountable for IP-protected content flowing through their channels, which is discussed in the next section.

Intermediary Liability and IP Disputes in India


The intermediary liability law in India is primarily governed by Section 79 of the IT Act, as discussed above. As per that provision, online intermediaries enjoy a safe harbour for third-party content on their platforms, so long as they adhere to certain due diligence rules set out under the Intermediaries Guidelines. Provisions under the Copyright Act, 1957 provide some protection to certain intermediaries as well.[64] Section 79 of the IT Act, in conjunction with the ruling of the Supreme Court of India in Shreya Singhal, which broadened the protection given to intermediaries and allowed them to take down content only on instructions by courts or authorised government agencies, is the authoritative law of the land on intermediary liability. It is, however, important to point out that in terms of intellectual property rights (“IP rights”), courts in India have placed a higher responsibility on intermediaries to take down content that infringes IP rights.



Liability under the IT Act


Beyond Section 79 of the IT Act, Section 81 is a non-obstante clause, providing for an overriding effect of the IT Act over all other laws in case of conflict. However, this clause carves out an exception for copyright and patent holders.[65]


The Intermediaries Guidelines also require intermediaries to notify their users not to upload content that “infringes any patent, trademark, copyright or other proprietary rights”[66] and to not host/publish such content on their platforms.


Limited and Conditional Protection under the Copyright Act, 1957 (“the Copyright Act”)


Section 52(b) and (c) of the Copyright Act provides protection to intermediaries for transient or incidental storage of copyrighted works, if:


1. It is purely in the technical process of electronic transmission or communication of such content;


2. It is for the purpose of providing links or access/ integration to content, when not expressly barred by the copyright owner and when the intermediary does not have reasonable grounds for believing that such storage is of an infringing copy (actual knowledge requirement).


Section 52(c) also provides for a notice and takedown mechanism, wherein copyright owners can request intermediaries to remove protected content from their platforms for a minimum period of 21 days (or for a longer period in case of a court order mandating such requirement). As per this provision, intermediaries, on being satisfied, are required to remove content within 36 hours of being intimated.[67]
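The two statutory clocks in Section 52(c) can be sketched as a small helper. The function names are our own; only the 36-hour and 21-day periods come from the provision, and the treatment of a court order as an indefinite extension is a simplification of its actual terms:

```python
from datetime import datetime, timedelta
from typing import Optional

REMOVAL_WINDOW = timedelta(hours=36)  # act within 36 hours of intimation
MIN_TAKEDOWN = timedelta(days=21)     # content stays down at least 21 days

def removal_deadline(notice_received: datetime) -> datetime:
    """Latest time by which the intermediary must disable access."""
    return notice_received + REMOVAL_WINDOW

def earliest_restore(removed_at: datetime, court_order: bool = False) -> Optional[datetime]:
    """When content may be restored; a court order extends the takedown."""
    return None if court_order else removed_at + MIN_TAKEDOWN

notice = datetime(2019, 1, 1, 9, 0)
deadline = removal_deadline(notice)
assert deadline == datetime(2019, 1, 2, 21, 0)                     # 36 hours later
assert earliest_restore(deadline) == datetime(2019, 1, 23, 21, 0)  # 21 days on
assert earliest_restore(deadline, court_order=True) is None        # order governs
```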


Thus, reading the IT Act and the Copyright Act in conjunction, in cases of content protected by copyright, intermediaries must adhere to a higher standard of care in ensuring that their platforms are not used to make infringing content available to the general public. It is also worthwhile to note that the Copyright Act does not define ‘transient or incidental storage’, and without such clarity, ambiguity remains as to which intermediaries are protected or unprotected under this clause.



The IP Effect - Distinguishing Actual Knowledge from Shreya Singhal


As discussed at the start of this section, according to Section 79, safe-harbour protection is available to intermediaries in India if they, upon receiving ‘actual knowledge’, remove unlawful content from their platforms. ‘Actual knowledge’ was interpreted by the Supreme Court in Shreya Singhal to mean intimation by an appropriate government agency or an order of a court. Subsequently, however, in matters concerning the infringement of IP rights, courts have distinguished the ‘actual knowledge’ requirement as enunciated in Shreya Singhal and replaced it with a ‘specific knowledge’ requirement, i.e. if intermediaries are given specific knowledge of infringing works by IP owners, they must take such works down to keep their safe-harbour protection under the IT Act.


In its landmark judgment in Myspace v. Super Cassettes Industries,[68] the Delhi High Court, while distinguishing copyright matters from those contained under Article 19(2) of the Constitution of India,[69] stated that -


“50. … In the case of copyright laws it is sufficient that MySpace receives specific knowledge of the infringing works in the format provided for in its website from the content owner without the necessity of a court order.”


Reiterating the actual knowledge requirement in cases of content protected by copyright, the court stated that -

“57. … If copyright owners, such as SCIL inform MySpace specifically about infringing works and despite such notice it does not takedown the content, then alone is safe harbor denied. However, it is for SCIL to show that despite giving specific information the appellant did not comply with its notice.”


Apart from distinguishing the actual knowledge requirement in cases of copyright, the Myspace judgment is also important because it clarified that intermediaries are not required under law to take down unspecified material, including all future infringing content, as this would lead to private censorship and have a chilling effect on free speech.


The court held that -


“62. … The remedy here is not to target intermediaries but to ensure that infringing material is removed in an orderly and reasonable manner. A further balancing act is required which is that of freedom of speech and privatized censorship. If an intermediary is tasked with the responsibility of identifying infringing content from non-infringing one, it could have a chilling effect on free speech; an unspecified or incomplete list may do that. … Such kind of unwarranted private censorship would go beyond the ethos of established free speech regimes.”


In another matter before the Delhi High Court,[70] this time for the infringement of a design under the Designs Act, 2000, the rights owner wanted the intermediary (eBay) not only to remove existing infringing products but also to screen similar listings in future and remove infringing products without the intimation of the owner. The court rejected the claim, holding that intermediaries cannot be expected to exercise such vigilance over their platforms and are liable only to remove infringing content which is specifically pointed out. The court held that -



“35. … Moreover the question, whether an IP right has been infringed or not is more often than not a technical question with which the courts steeped in law also struggle and nothing in the IT Act and the IT Rules requires an intermediary, after having been once notified of the IP Rights, not allow anyone else to host on its portal infringing goods/matter. The intermediaries are not possessed of the prowess in this respect. As aforesaid, it is a different matter, when attention of the intermediary is invited to infringing product and complaint made with respect thereto. Merely because intermediary has been obliged under the IT Rules to remove the infringing content on receipt of complaint cannot be read as vesting in the intermediary suo motu powers to detect and refuse hosting of infringing contents.”

More recently, the same court in Christian Louboutin v. Nakul Bajaj,[71] a matter relating to trademark infringement, held that an e-commerce company was not an intermediary as per Section 79 of the IT Act[72] and that for e-commerce portals to claim exemption under the safe-harbour provision, they need to ensure a passive and not an active participation in the selling process. The court held that -


“78. … When an e-commerce company claims exemption under Section 79 of the IT Act, it ought to ensure that it does not have an active participation in the selling process. The presence of any elements which shows active participation could deprive intermediaries of the exemption.”


With respect to IP rights, taking into consideration the law and the above-mentioned judicial pronouncements, the following inferences can be made:


1. Despite the ruling of the Supreme Court of India in Shreya Singhal, courts have distinguished the ‘actual knowledge’ requirement for matters of free speech[73] from claims of IP infringement. In cases of IP, courts have operationalised a mechanism wherein rights owners can have infringing content taken off by intermediaries simply by intimating them of the infringement (the notice and takedown mechanism); and


2. Such requests need to be specific and not broad; rights owners may not require intermediaries to be vigilant about all future violations, as this would entail constant monitoring/screening, which is outside the role played by intermediaries (the specific knowledge requirement).


None of the cases discussed above eventually led to revocation of intermediary safe harbour so as to place primary or contributory liability for infringement on the Internet platforms. In Christian Louboutin, though the Delhi High Court held that, due to the active role played by the e-commerce portal in selling activities, it did not fall within the definition of an intermediary, the court did not hold the portal liable for trademark infringement.


Due to the lack of clarity on intermediary liability for IP-protected content uploaded to platforms by third parties, intermediaries will end up over-complying with takedown requests to ring-fence their safe-harbour protection. This may have a negative effect on content which falls under ‘fair use/fair dealing’ categories of law,[74] severely impacting the free speech rights of citizens. This, coupled with the fact that tech giants like Facebook, Google and Twitter already use automated filters which often take down legal content, could prove problematic for the digital rights of Indian people.


Though courts have recognized that intermediaries cannot and should not play the role of judges in determining what is illegal or legal content,[75] by empowering rights owners to send notices for specific content removal, courts have also made it difficult for intermediaries to defend instances of fair use/ fair dealing.


In a recent draft policy document issued by the Department for Promotion of Industry and Internal Trade,[76] the government has raised issues around the liability of e-commerce platforms for counterfeit and pirated products. The draft policy recommends that, if trade mark owners so require, e-commerce platforms shall not list their products without prior consent. On the copyright front, the draft policy recommends that, “Intermediaries shall put in place measures to prevent online dissemination of pirated content.” The draft policy reiterates the ‘specific knowledge’ requirement and the ‘notice and takedown’ mechanism established by courts (as discussed above) -


“Upon being notified by the owner of copyright protected content/ work that a website or e-commerce platform is making available, selling or distributing the copyrighted content/ work without the prior permission/ authorization of the owner, such website or platform should expeditiously remove or disable access to the alleged content.”


The draft e-commerce policy uses both terms, e-commerce platforms and intermediaries, creating further confusion. Given the way e-commerce platforms function in India, any demands for ensuring non-listing of products may lead to pre-screening, which will dilute the safe-harbour protection granted to such platforms under law. Pre-screening and active monitoring of content have also been held not to be required by law and may have a chilling effect on free speech (as observed in the Myspace judgment). In terms of copyright violation, though the draft policy is in line with current jurisprudence, it does create disproportionate pressure on intermediaries to take down content which may not be illegal, and also makes intermediaries the judges of what is legal/illegal.



The growing trend of making intermediaries more liable for the content on their platforms is apparent from the draft policy’s demand that such services show a higher level of ‘social responsibility’. The draft policy states that intermediaries need to ensure the ‘authenticity’ and ‘genuineness’ of content flowing through their pipelines - “… With a growing importance of these entities, their social responsibilities also increases. Due to the fact that traders, merchants, individual users, organizations, associations are all dependent on them, the authenticity of content posted on their websites cannot be compromised. In this regard, it is important to emphasize on responsibility and liability of these platforms and social media to ensure genuineness of any information posted on their websites.”


From a policy standpoint this is problematic for various reasons: firstly, this recommendation uses very broad and vague phrases like social responsibility, authenticity and genuineness; secondly, it makes intermediaries the judges of what is legitimate and what is not, which will have the unintended consequence of private censorship (which has also been held to be illegal by various courts); thirdly, it is very difficult to ascertain the authenticity and genuineness of content, whether protected by IP rights or not, as it may depend on various factors which a machine, or even a human reviewer, may find hard to determine. As reiterated at various points in this report, any suggestions/recommendations for increasing the accountability of intermediaries must not abrogate the free speech and privacy rights of netizens.



Indian courts on intermediary liability

Having gone over the applicable laws with regard to intermediary liability in India, this section of the report will examine some of the notable cases around intermediary liability in India. Only cases from various High Courts (at the state level) and the Supreme Court of India have been considered for this section, and the list is non-exhaustive. The cases discussed herein are relevant to provide an overview of the jurisprudence which has evolved in India on issues surrounding intermediary liability.


Avnish Bajaj v. State[77] (2008)


As discussed previously, this case was an inflection point for the debate on intermediary liability in India.


This case holds importance in the intermediary liability landscape in India as, for the first time, the managing director of a company (in this case eBay) was charged under criminal provisions of both the penal law of India and the IT Act, for content circulated by a third party on an e-commerce platform. In this matter, Avnish Bajaj escaped liability on technical grounds, as the company Baazee.com was not arraigned as an accused in either matter, before the High Court or subsequently the Supreme Court of India. Another important aspect of this case (the Delhi High Court judgment) was that the court recognized the use of content filters for blocking pornographic content and stated that companies bear the risk of acquiring knowledge if such content escapes the filters.[78]


Google v. Visakha Industries (2009)[79]

In 2009, Visakha Industries, a construction company involved in the manufacturing of asbestos cement sheets, filed a criminal defamation case against Ban Asbestos Network India (BANI), its coordinator and Google India. It alleged that the coordinator of BANI had written blog posts on a website owned by BANI that contained scathing criticism of the company and thereby harmed its reputation in the market. Google India was also arraigned as a party in the litigation because the blog posts were hosted on Google’s blog publishing service.


Google India moved the High Court of Andhra Pradesh for dismissal of the criminal charges against it on the grounds that it enjoyed safe-harbour protection under Section 79 of the IT Act. It was contended that Google is not the publisher or endorser of the information, but only provides a platform for the dissemination of information, and therefore cannot be held liable. The High Court refused to accept Google’s contention and dismissed the petition on the grounds that Google failed to take appropriate action to remove the defamatory material, in spite of receiving a takedown notice from the company.


Aggrieved by the judgment of the High court, Google filed an appeal in the Supreme Court in 2011, where the matter is currently pending.


Shreya Singhal v. Union of India[80](2015)


As discussed previously, the Shreya Singhal judgment was a watershed moment for the debate on intermediary liability in India.


Myspace Inc. vs. Super Cassettes Industries Ltd.[81] (2017)

This case is important from a copyright perspective as the division bench of the Delhi High Court in this matter reversed a single judge decision holding Myspace liable for copyright infringement. The division bench held that if intermediaries are tasked with the responsibility of identifying illegal content, it could have a chilling effect on free speech.

In this matter, the court also distinguished the ‘actual knowledge’ requirement from Shreya Singhal to mean ‘specific knowledge’ in matters of copyright infringement, i.e. if intermediaries are pointed to specific infringing material by rights holders, they must remove such content without the necessity of a court order.


Kent RO Ltd & Anr. Vs. Amit Kotak & Ors[82] (2017)

In January 2017, a single judge bench of the Delhi High Court refused to compel intermediaries to screen, on an ex-ante basis, content that infringes intellectual property rights.[83]


The petitioner Kent RO Systems, a company that manufactures water purifiers, filed for a permanent injunction against one Amit Kotak (respondent) for infringing its intellectual property rights by copying its designs, and against eBay India Pvt Ltd. for aiding the infringement by allowing the respondent to sell the products on its website.


eBay India Private Limited sought the protection of Section 79 of the IT Act, under which it is saved from liability arising out of third-party information, data or communication links hosted by it, as long as its function is confined to providing access to a communication system.

The single judge bench of Justice Rajiv Sahai Endlaw held that compelling an intermediary to screen content would be “an unreasonable interference with the rights of the intermediary to carry on its business.”[84]


The court also asserted that requiring an intermediary to screen any kind of content would change the role of an intermediary from a facilitator to an adjudicator. Under Section 79 and the IT Rules, 2011, an intermediary is only obliged to remove content on receipt of a court order or Government notification.


In Kent RO, the court reiterated the specific knowledge requirement as expounded in Myspace, stating that when the attention of the intermediary is brought to infringing products, then they are liable to remove such listings from their websites.


The Registrar (Judicial), Madurai bench of Madras High Court v. The Secretary to Government, Union Ministry of Communications, Government of India, New Delhi and Ors.[85](2018)


This case arose from the unfortunate circumstance of the death of a 19-year-old student, allegedly after playing the online game “The Blue Whale Challenge”. This game required players to undertake 50 extreme tasks which eventually led to them committing suicide. The Madras High Court took suo motu cognizance of the matter as there was public interest at play.


The court had asked the government to request online services like Google, Facebook, Microsoft, Yahoo and Instagram to remove ‘links’ to the Blue Whale game from their portals. To this, Google replied by stating that its Indian subsidiary could not remove content as the app store was run by the parent company, which was governed by US laws. Google clarified that its team in the US was aware of the game and would continue to take action against providers who violate its app store policies.


The court, while highlighting Google’s response and noting how difficult it is for law enforcement to get access to crucial information, reprimanded online services stating that they cannot abdicate their duties and responsibilities under law -


“The service providers cannot abdicate their responsibilities. They cannot also plead that they have no control over the content. A mere look at the net neutrality debate that is presently going on would show that the service providers are in a position to have control over the content that passes through their information highway. If the service providers can attempt to control the content for commercial considerations, they can certainly be called upon to exercise their power of control in public interest also. Rather they must be mandated to do so.”

The court thus directed the Central Government to take appropriate steps to bring “Over The Top” services into a legal framework obliging them to comply with the laws of India and to provide the required information to law enforcement agencies - “Methods must be devised to ensure that those OTTs which could not be brought within such framework are not accessible in India.” The court also requested the government to amend laws and regulations so that Indian laws are applicable to these foreign services and law enforcement can get access to relevant information at crucial points.

This case highlights an important pain point in the current intermediary liability debate, not just in India but around the world, i.e. access to information by law enforcement. The government, while introducing changes like the Draft Rules, often points to this problem, highlighting the fact that foreign entities take refuge behind source-country laws when asked to provide assistance to Indian law enforcement agencies. It remains to be seen how tech companies and governments solve the problem of access to information by law enforcement, but any new changes will have to be in consonance with free speech and privacy rights.[86]


Christian Louboutin SAS v. Nakul Bajaj and Ors[87] (2018)

In November 2018, the Delhi High Court laid down certain guiding principles in respect of the liability of e-commerce platforms for trademark infringement.


The plaintiff, Christian Louboutin, a manufacturer of high-end luxury shoes, owned registered trademarks in India and sold its products only through authorized dealerships. The defendant operated Darveys.com, an e-commerce platform that markets itself as a “luxury brands marketplace.” The plaintiff alleged that the defendant sold counterfeit products bearing the plaintiff’s name on its website. Apart from offering for sale and selling the plaintiff’s products on its website, the defendant was also alleged to have used the names “Christian” and “Louboutin” as meta tags to attract traffic to its website, resulting in infringement of the trademark rights of the plaintiff and violation of the personality rights of Mr. Christian Louboutin, the founder of the brand.



The defendant argued that the goods sold were genuine, and that there was no infringement on its part because it was a mere intermediary, and entitled to protection under Section 79 of the IT Act.


The High Court examined in detail what constitutes an ‘intermediary’ under Section 2(w) of the IT Act, and whether online marketplaces as intermediaries qualify for safe harbour protection under Section 79.


In determining the role of an online marketplace and the ambit of ‘service’ as used in the definition of ‘intermediaries’ under the IT Act, the court laid down twenty-six tasks that an intermediary may undertake, ranging from identification of the seller and advertising products on the platform to transporting the product to the purchaser and using trademarks through meta tags, among other things.


The judgment also stated that it has to be seen whether the platform is taking adequate measures to ensure that no unlawful acts are committed by the sellers. Such measures include the manner in which the terms of the agreements entered into between the sellers and the platform are enforced and the consequences of violating those terms, among others.


The Court noted that the elements summarised above would be key to determining whether an online marketplace or an e-commerce website is ‘conspiring, abetting, aiding or inducing’ and is thereby contributing to the sale of counterfeit products on its platform. “When an e-commerce website is involved in or conducts its business in such a manner, which would see the presence of a large number of elements enumerated above, it could be said to cross the line from being an intermediary to an active participant”, the judgment stated.


After considering all the above-mentioned factors, the Court concluded that Darveys.com cannot be termed an intermediary entitled to protection under Section 79 of the IT Act.


This case is particularly important because it was the first time the Court decided the issue of trademark infringement by online e-commerce platforms, which have maintained that they are immune from liability by virtue of Section 79 of the IT Act. It is also pertinent to note that, despite ruling that Darveys.com was not an intermediary, the court did not hold it liable for trademark infringement.


Expanding Intermediary Obligations

Although Section 79 of the IT Act provides safe-harbour protection to intermediaries from liability arising out of third-party content, with the intermediaries’ primary obligation being to take down unlawful content on receipt of a court order or Government directive, a number of petitions have been filed before various Indian courts seeking to expand the scope of intermediaries’ obligations with respect to user-generated content. These petitions, filed before various High Courts and the Supreme Court, have attempted, and partially succeeded, to broaden the scope of obligations in two major directions: proactive monitoring of content, and the right to be forgotten.



Proactive Monitoring of Content

Despite the Supreme Court’s judgment in Shreya Singhal, in which the Court clarified that intermediaries are not responsible for judging the legitimacy of content on their platforms, the last two years have seen litigation seeking to make intermediaries act as content monitors. A few notable cases are:



Sabu Mathew George v. Union of India[88]

In 2008, Sabu Mathew George, a gender activist and doctor, filed a writ petition in the Supreme Court of India to ban advertisements related to pre-natal sex determination from search engines like Google, Bing and Yahoo. The petitioner contended that the display of these results violated Section 22 of the Pre-Conception and Pre-Natal Diagnostic Techniques (Prohibition of Sex Selection) Act, 1994 (“PCPNDT Act”). In their reply, the respondents argued that they are “conduits” and not content providers, and hence protected under Section 79 of the IT Act. It was also argued that there are innumerable activities banned by law whose information is nevertheless available online and offline, and that disentitling anyone from receiving information or gaining knowledge on a subject violates Article 19(1)(a) of the Constitution, which includes the right to know and the right to receive or access information.

Over the course of proceedings, the court issued interim orders directing Google, Microsoft and Yahoo to ‘auto-block’ pre-natal sex determination ads from appearing in search results. The court also drew up a list of forty key words that were to be auto-blocked if anyone attempted to look them up. Search engines were directed to form expert in-house committees to evaluate and delete content violative of Section 22 of the PCPNDT Act “based on its own understanding.”
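The kind of query-time keyword auto-blocking directed here can be sketched as follows. The blocked phrases below are hypothetical placeholders, not the court's actual list of forty key words, and real search engines use far more sophisticated matching than this:

```python
# Illustrative sketch of query-time keyword auto-blocking.
# The phrases below are hypothetical placeholders, not the
# actual key words drawn up by the court.
BLOCKED_PHRASES = {
    "prenatal sex determination",
    "gender selection test",
}

def should_block(query: str) -> bool:
    """Return True if the search query contains any blocked phrase."""
    # Normalize case and collapse runs of whitespace before matching.
    normalized = " ".join(query.lower().split())
    return any(phrase in normalized for phrase in BLOCKED_PHRASES)
```

A production system would also have to handle misspellings, transliterations and synonyms, which simple substring matching of this kind cannot catch.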


The Supreme Court also directed the Central Government to constitute a nodal agency for receiving complaints from anyone who came across anything that has the nature of an advertisement or has any impact in identifying a boy or a girl in any method, manner or mode by any search engine. The nodal agency was then required to convey actionable complaints to the concerned intermediaries, who were obliged to delete the content in question within 36 hours and intimate the nodal agency.


This petition was disposed of in December 2017, with the apex court issuing additional directions to the newly formed nodal agency and expert committee to hold a meeting with the assistance of the petitioner’s legal team, “so that there can be a holistic understanding and approach to the problem”. Google, Yahoo and Microsoft were also directed to work with the committee to identify and implement a “constructive and collective approach to arrive at a solution”.


Significance: In this matter, the Supreme Court of India stated that intermediaries are obliged to keep unlawful content from appearing on their networks. Even after its ruling in Shreya Singhal, wherein the court made it clear that intermediaries must not be asked to exercise their personal judgment in determining the legality of content for takedown purposes, the court continues to ask intermediaries to proactively filter their platforms for illegal content. Such decisions contribute to the confusion over the level of due diligence that intermediaries must observe to protect their safe harbour.



Kamlesh Vaswani v. Union of India[89]

This public interest litigation was filed by an Indore-based lawyer before the Supreme Court of India challenging Sections 66, 67, 69, 71, 72, 75, 79, 80 and 85 of the IT Act as unconstitutional, arguing that they were inefficient in tackling the rampant availability of pornographic material in India. These provisions were said to be ineffective because the IT Act was primarily meant to govern e-commerce and e-governance and was therefore not suited to tackle cyber crimes, including the distribution of pornographic content online.


The petitioner prayed, among other things, that the above-mentioned provisions be declared unconstitutional, that a national policy and action plan to tackle pornography be drafted, and that the watching of pornographic videos be declared a non-bailable, cognizable offense. During arguments in court, the petitioner also prayed that intermediaries be asked to proactively filter out pornographic content from public access. Though the court appeared somewhat sympathetic to the petitioner’s grievances, the presiding judges expressed concerns about the technical feasibility and privacy implications of proactive filtration of content. The Cyber Regulations Advisory Committee, which was directed by the Court to explore ways to block pornographic content online, tasked the Internet and Mobile Association of India with identifying a list of websites to be blocked. Interestingly, 857 pornography websites were blocked by the Indian Government in August 2015, but all were unblocked within a few days. This matter is currently pending before the Court.



Significance: This matter once again seeks to impose proactive content monitoring obligations on online intermediaries, this time by blocking access to pornographic content. It is pertinent to note that the presiding judges recognized the technical challenges involved in filtering the Internet of all pornography and also touched upon the fact that what an individual does in the privacy of his/her home is not for the state to dictate. However, the Court has also expressed that it is necessary to keep more harmful forms of pornography, like child pornography, at bay and that intermediaries may be under an obligation to proactively block access to such content.


In Re: Prajwala[90]


Sunitha Krishnan, founder of the Hyderabad-based NGO Prajwala, wrote a letter to the Supreme Court of India highlighting the issue of videos of sexual violence circulating on WhatsApp and other social media platforms. She submitted a list of the websites that were airing the videos and requested, among other things, that the Ministry of Home Affairs be directed to look into the matter with the help of intermediaries like Google, YouTube and Facebook. The Supreme Court’s social justice bench took suo motu cognizance of the letter and ordered a Central Bureau of Investigation (CBI) inquiry into the videos. The Department of Telecommunications (DoT) and the Ministry of Home Affairs were also directed to put the concerned web portals under the scanner.[91] Furthermore, a Committee was constituted under the Chairmanship of Dr. Ajay Kumar, the then Additional Secretary of the Ministry of Electronics and IT, to assist and advise the court on the feasibility of preventing sexual abuse/violence videos from appearing online.


Over the course of the proceedings, the Committee held extensive deliberations involving a number of representatives from various intermediary platforms, lawyers, academics and civil society members. Based on these deliberations, the Committee submitted a two-part report containing recommendations for preventing the upload and circulation of sexually abusive/violent videos online. All parties, including Google, Facebook, Microsoft, Yahoo!, WhatsApp and the Government, were directed by the Court to implement all consensus recommendations at the earliest. The matter is still pending before the Court awaiting final disposal.



Significance: This matter raises important questions with regard to the role of intermediaries in controlling the propagation of videos depicting sexual abuse and violence. It also ties into the challenge of formulating policies to tackle the circulation of non-consensual sexually explicit videos, such as revenge porn, on the Internet. Interestingly, many of the accepted recommendations of the Ajay Kumar Committee involved blocking search queries containing certain key words and preventing the upload of sexually abusive/violent videos at the source using hashing and other technologies. While the recommendations are currently being treated as voluntary initiatives to be undertaken collaboratively by stakeholders, it could be problematic if they come to be treated as legal mandates with mandatory compliance. It is pertinent to note that the SC imposed costs of Rs. 100,000 each on Google, Facebook, Microsoft, Yahoo! and WhatsApp for failing to file replies describing steps taken by them to give effect to the Committee’s recommendations.
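The hash-based upload blocking contemplated by such recommendations can be illustrated with a minimal sketch. This uses an exact cryptographic hash for simplicity; deployed systems typically rely on perceptual hashing (PhotoDNA-style techniques) so that re-encoded or slightly altered copies still match, and the blocklist entry below is a placeholder:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known abusive files.
# (This entry is the digest of b"test", used only for illustration.)
KNOWN_BAD_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of the file contents."""
    return hashlib.sha256(data).hexdigest()

def is_blocked(upload: bytes) -> bool:
    """Reject an upload whose digest matches a known-bad entry."""
    return digest(upload) in KNOWN_BAD_DIGESTS
```

Exact hashing only stops byte-identical re-uploads; that limitation is precisely why perceptual hashes are preferred in practice.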



Right to Be Forgotten


The Right to be Forgotten is a civil right recognized in many jurisdictions that allows individuals to demand erasure of their personal information from the Internet, giving them control over their personal information online. The roots of this right arise from the right to privacy and the right to reputation. The concept was developed in the EU and Argentina and has been in practice since 2006. Google’s Transparency Report on search removals under European privacy law shows a steady increase in “requests to delist” and “URLs requested to be delisted” from May 2014.[92] By January 2019, the numbers had increased to 777,706 requests to delist and 3,006,188 URLs requested to be delisted.[93]

For better understanding, this right may be divided into a ‘right of erasure’ and a ‘right to delist’. Under the right of erasure, the data is deleted at the source and therefore removed from the Internet entirely, whereas under the right to delist, search results are no longer linked to the name or identity of the data subject while the data itself remains on the web. Which of these rights should be preferred is debatable. The earliest example of the right in use is a criminal’s right not to be linked with their crime for the entirety of their life, so that they may be rehabilitated into society.


The legitimacy of this right is fiercely contested on the grounds that it negatively affects freedom of speech and expression and the right to access information. On the other hand, advocates argue that digital technology allows the storage of vast amounts of data on the Internet, which preserves an unnatural memory, and that individuals should have a right over their personal information. In other words, the right to be forgotten is an essential safeguard of the right to informational self-determination and the right to privacy.



The right to be forgotten has evolved differently in different jurisdictions. Germany and France had recognized the right long before the Google Spain ruling brought the RTBF challenge to prominence.[94] The US does not have specific legislation with respect to privacy. While there seems to be a greater inclination toward freedom of speech and expression, there have been many cases that upheld the right to be forgotten and ordered that data be delisted, including cases involving the protection of minors, deletion of criminal records and individual bankruptcy.[95] Argentina currently grapples with the balance between freedom of speech and expression and the right to privacy, with many individuals having filed cases for the delisting of links containing their personal data. Though the case of Virginia da Cunha, in which the complaint concerned the linking of the complainant’s name with pornographic websites, ended in defeat, it brought Argentina into the spotlight in the debate on the right to be forgotten.[96]


In India, the right to be forgotten has not been formally recognized yet (India’s Draft Personal Data Protection Bill, 2018 provides for a right to be forgotten, which is a right to restrict or prevent the disclosure of information, not a right of erasure)[97] but has been evolving through decisions rendered by various courts. The doctrine came up for consideration for the first time in April 2016 before the High Court of Delhi.


There have been instances when the courts have asked for particular judgments or personal information to be removed from online repositories or search engine results.


A few noteworthy cases that highlight the evolution of this concept in India are mentioned below:

Laksh Vir Singh Yadav v. Union of India & Ors.[98] (2016)


In this case, the petitioner had made a request to Indian Kanoon and Google to expunge his name from their search results as it was affecting his employment opportunities.


He contended that the criminal case between his wife and mother kept appearing in the results every time his name was searched on the Internet, which gave the impression that he was involved in the matter.


The matter is ongoing in the High Court of Delhi and the next hearing is scheduled for July 18, 2019.


X v. Union of India[99] (2016)

The petitioner had moved the High Court of Kerala against Google, seeking removal of hateful content posted on the Internet by his former wife. He alleged that a Google search would end in certain web links defaming him and his children, causing them immense humiliation.[100] The petitioner cited the Google Spain decision, wherein the Court of Justice of the EU had ruled that Google must create a system by which it can be asked to remove personal data at the request of an individual. Though this matter has been disposed of as per the Kerala High Court’s website, the final order of the court is not available. For the petitioner’s name, the case status mentions: ‘X - Name and Address of the Petitioners Deleted’.[101]


Sri Vasunathan v. The Registrar (2017)[102]

The petitioner had filed a writ petition in the High Court of Karnataka, seeking removal of his daughter’s name from an earlier order passed by the court with respect to a criminal case involving her and the defendant. According to the petitioner, a name search on search engines like Google and Yahoo revealed that she was embroiled in the dispute, thus harming her marital relationship and reputation in society.


The court granted the request and made the following observation: “This would be in line with the trend in western countries of the ‘right to be forgotten’ in sensitive cases involving women in general and highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned.”


The Court also directed its registry to ensure that the petitioner’s daughter’s name is not reflected on the Internet with regard to the criminal matter, thus upholding her right to be forgotten.


Dharmraj Bhanushankar Dave v. State of Gujarat[103] (2015)

The petitioner in this case was acquitted in a previous criminal matter in the Gujarat High Court, and the judgment was supposed to be non-reportable. However, indiankanoon.org published the judgment on its web portal, and it was available via a simple Google search. Aggrieved by the same, the petitioner approached the High Court of Gujarat seeking deletion of the judgment from the website as it was affecting his personal and professional life.

Rejecting the petitioner’s plea and dismissing the petition, the court held that the High Court is a court of record, and that Rule 151 of the Gujarat High Court Rules, 1993 provides that copies of documents in any civil or criminal proceeding, and copies of judgments of the High Court, can be given even to third parties with the order of the Assistant Registrar. According to the High Court, the petitioner had not been able to point to any provision of law by which the respondent could be restrained under Article 226 of the Constitution. The court refused to accept the petitioner’s argument that publication of the judgment violated his right to life and personal liberty under Article 21. The Court further stated that publishing on the website would not amount to the judgment being reported, as the word “reportable” refers only to reporting in a law reporter.

Thus, even though there is currently no statutory right to be forgotten in India, courts have requested search engines to delist content from their platforms. It is important to point out that successful RTBF requests in India have only led to the delisting of posts from search engines and not the deletion of content from the source.

Intermediary perspectives


As part of the research, a number of intermediaries were approached to solicit their views on India’s intermediary liability framework and the expanding content obligations brought about by the recently proposed Draft Rules and judicial pronouncements. Said intermediaries included leading social media platforms, search engines, and consumer review websites, as well as startups and small businesses offering specialized services online.

Every intermediary spoken with held the view that the current legislative framework is adequate in principle. The IT Act explicitly provides safe harbour protection under Section 79, exempting intermediaries from liability for user-generated content so long as they exercise no editorial control. The Intermediaries Guidelines Rules then lay down a set of due diligence conditions that must be met to qualify for immunity under Section 79. While the language of the Intermediaries Guidelines created some confusion initially, forcing intermediaries to exercise their personal judgment when responding to takedown requests and to over-comply to err on the side of caution, the Supreme Court’s decision in Shreya Singhal was said to be of tremendous help in clarifying the state of the law. By virtue of the judgment, intermediaries are no longer required to take down content upon receiving requests from third parties, which led to a very significant drop in the number of such requests received.

On the Draft Rules, the intermediaries were of the view that the rules should not be applied uniformly to all categories of service providers; a function-based approach should be adopted, with regulation tied to the different functions that intermediaries perform on the Internet. It was felt that Rule 3(2) of the Draft Rules should be less vague and more specific, and should not contain broad, all-encompassing terms such as “grossly harmful”, “harassing” or “blasphemous”. Such terms would require intermediaries to act as adjudicators when taking down content and lead to private censorship.

It was opined that the traceability requirement under Rule 3(5) of the Draft Rules is not framed clearly and that the term “enable tracing” lacks clarity. Moreover, enforcing this requirement would undermine the trust that customers place in any sort of digital transaction.

Intermediaries pointed out that mandatory incorporation under the Companies Act, 2013, along with appointing a nodal officer, will increase the cost and compliance burden for smaller intermediaries. It was also mentioned that the 24-hour requirement to remove content will be difficult to comply with due to technological challenges, and that such a requirement leaves no scope for review of takedown orders. Automated tools for takedowns further aggravate these problems and take away the review process; this also affects fair use and fair dealing activities under copyright law. A review mechanism should be in place that gives intermediaries scope to check the veracity of takedown orders. Intermediaries recommended that a graded approach to takedown could be implemented: more sensitive content, like terrorist content or child pornography, could be treated more expeditiously than other requests.

On the usage of automated tools to proactively monitor content, it was felt that asking intermediaries to proactively filter their networks for impermissible content under threat of legal consequences overlooks certain ground realities. This was also said to be contrary to the prevailing jurisprudence of various courts in the country.

Firstly, an intermediary’s function is limited to providing a platform for its users to publish content or avail services. The intermediary by definition does not play a role in deciding what content is published or what services are offered or availed on its platform. Safe harbour protection is provided to intermediaries on this very premise: that it would be unjust to hold platform providers answerable under law for content or services that they have no connection with.

Secondly, it is in the intermediary’s own business interest to keep its platform free from unlawful or otherwise harmful content, as users will naturally tend to avoid inhospitable platforms. However, considering the sheer volume of activity that takes place on intermediary platforms daily, exhaustive filtering of impermissible content is impossible even after dedicating vast resources to the task. A large intermediary said that it has set up dedicated facilities and devoted vast numbers of personnel all over the world to review content that potentially violates its terms of use. It has also begun to implement automated review and takedown processes with the help of algorithms and artificial intelligence in limited contexts. Despite such measures, comprehensive filtration of impermissible content remains elusive due to the millions of data points generated daily. Assigning legal penalties for failure to proactively remove impermissible content would be crippling for its business, leading to a situation where it would be forced to adopt overbroad measures that would inevitably affect legitimate content as well as free speech rights. Significant variance among national laws with regard to the permissibility of certain types of content was also identified as a critical bottleneck for intermediaries with a global presence. It was felt that the emphasis in content regulation should be on self-regulatory or co-regulatory models, where intermediaries are allowed the bandwidth to develop and use their own internal processes to identify and remove impermissible content while operating under the ambit of safe-harbour protection from legal liability for user-generated content.

Though intermediaries agreed that in principle Shreya Singhal provided a much needed clarification of the law, they also said that significant problems remained with its implementation, especially in the lower judiciary. Several intermediaries were of the view that judges of lower courts are woefully unaware of intermediary liability laws. For instance, many judges remain ignorant of the Supreme Court’s verdict in Shreya Singhal exempting intermediaries from acting on takedown requests sent by third parties, prompting them to issue verdicts unfavourable to intermediaries. Though such verdicts are usually set aside on appeal, they nevertheless tie up intermediaries in needless litigation for weeks, months or years, incurring great costs. Awareness building among judges on intermediary liability laws as well as broader technology laws was therefore highlighted as a priority.


On the issue of the ‘Right to be Forgotten’, the intermediaries reported that they had received RTBF requests, which mostly concerned reputational harm.

In conversation, some intermediaries stated that they sometimes receive orders from law enforcement agencies to take down content from their platforms under Section 91 of the CrPC. This provision empowers law enforcement agencies to request ‘any document or other thing’ that would assist them in conducting an investigation. Though there are relevant provisions under the IT Act for production and removal of content (Sections 69 and 69A), law enforcement agencies continue to use provisions under other laws, which may lack the procedural safeguards built into the IT Act and the rules thereunder.

The table below shows the different stances that various intermediaries and industry associations have taken on changes introduced in the Draft Rules. Due to their far-reaching effects on the intermediary landscape in India, this analysis is restricted to: i) traceability of the originator of information; ii) the local incorporation requirement; and iii) deployment of automated tools for content filtering.[104]

Organizations that support a particular provision of the Draft Rules are marked with ‘a’; those that oppose it are marked with ‘r’; a ‘–’ indicates that no position was taken.

Several intermediaries, both domestic and foreign, were approached, but not all of them provided their response.


Analysis of submissions sent to the Ministry of IT by various intermediaries/ industry associations

Columns: Traceability [Rule 3(5)]; Incorporation under the Companies Act, 2013 [Rule 3(7)]; Deployment of automated tools for proactive content monitoring [Rule 3(9)]. (a = supports; r = opposes; – = no comment.)

01. Wipro: Traceability –; Incorporation a; Automated tools a [provided that there are certain measures to reinstate genuine content]
02. Freedom Publishers Union: Traceability r; Incorporation –; Automated tools –
03. Asia Internet Coalition: Traceability r; Incorporation r; Automated tools r
04. ITU-APT Foundation of India: Traceability r [for lack of clarity on the purpose of seeking certain data and the obligations of intermediaries for the same]; Incorporation r; Automated tools r
05. The Indian Music Industry: Traceability –; Incorporation –; Automated tools a
06. The Information Technology Industry Council: Traceability r; Incorporation r; Automated tools r
07. Computer and Communications Industry Association (US): Traceability r; Incorporation r; Automated tools r
08. Broadband India Forum: Traceability r; Incorporation r; Automated tools r
09. CCAOI: Traceability r; Incorporation r; Automated tools r
10. ISPAI: Traceability a [provided that the terms “lawful order” and “Government Agency” are defined, and it is clarified that the Rule applies only to platform-based services]; Incorporation a; Automated tools a [provided that the Rule is only applicable to platform-based services]
11. Asia Cloud Computing Association: Traceability r; Incorporation r; Automated tools r
12. IAMAI: Traceability r; Incorporation r; Automated tools r
13. CII: Traceability r; Incorporation r; Automated tools r
14. BSA: Traceability –; Incorporation –; Automated tools r
15. NASSCOM: Traceability r; Incorporation r; Automated tools r
16. Mozilla: Traceability r; Incorporation r; Automated tools r
17. IndiaTech: Traceability a; Incorporation a; Automated tools a
18. COAI: Traceability r; Incorporation a; Automated tools r
19. Xiaomi: Traceability a [provided that the power to enable tracing is derived from Sections 69, 69A or 69B of the IT Act; moreover, the intermediary shall enable tracing on a best-efforts basis and, where it cannot, shall provide reasons in writing]; Incorporation a [only specific intermediaries, as notified by the Government, should be required to incorporate under the Companies Act, 2013]; Automated tools r
20. Amcham India: Traceability r; Incorporation –; Automated tools r
21. Jio: Traceability a; Incorporation a; Automated tools a
22. Star India: Traceability –; Incorporation –; Automated tools a
23. ShareChat: Traceability r; Incorporation a; Automated tools r
24. Bombay Chamber of Commerce and Industry: Traceability a [provided that requests under this Rule are made only if required for investigation, detection, prosecution or prevention of an offence; that the language is modified to refer to information within the intermediary’s possession; and that ‘government agency’ is defined and limited]; Incorporation a [only applicable to specifically notified intermediaries]; Automated tools r
25. IBM: Traceability a [provided there are enough safeguards]; Incorporation –; Automated tools r
26. FICCI: Traceability r; Incorporation r; Automated tools –
27. ASSOCHAM: Traceability r; Incorporation r; Automated tools r
28. Large multinational software company (did not want to be quoted): Traceability r; Incorporation r; Automated tools r
29. Large multinational technology company (did not want to be quoted): Traceability r; Incorporation r; Automated tools r
30. Mouthshut.com: Traceability r; Incorporation r; Automated tools r


On an analysis of the table, it can be stated that only a handful of organizations (6 out of 31) agreed to some form of a traceability requirement. Barring Xiaomi and IBM (both of which asked for more safeguards), all other organizations that agreed to the traceability requirement were Indian in origin.


A quarter of all organizations (8 out of 31) were agreeable to the incorporation requirement, all of which were Indian organizations except for Xiaomi.

On automated content filtering, again, only 6 out of 31 organizations accepted the requirement; all of these were Indian companies/associations.


The Internet Service Providers Association of India, IndiaTech, Reliance Jio and the Bombay Chamber of Commerce and Industry agreed to all three proposed changes (traceability, incorporation and automated content filtering).

Star India and the Indian Music Industry, which deal specifically with copyrighted content, supported the requirement for automated content takedowns. The reason given in their comments for espousing it is to protect the intellectual property rights of artists and content creators and to curb online piracy.

From this data it can be gathered that Indian businesses lean towards stricter government regulation of online intermediaries. Such regulation would grant Indian businesses greater control over what flows through their pipelines, while at the same time weakening free speech and privacy rights online.


[1] Sflc.in, Intermediary Liability 2.0, Shifting Paradigm, accessible at Appendix [.] * Please note that any mention of the Intermediary Guidelines means the Intermediary Guidelines, 2011. Draft Rules means the Draft Intermediary Rules, 2018. This shall be limited to this section only, unless otherwise specified. This section is for historical understanding of the evolution and shall not be construed as the latest position. This section discusses developments in the safe harbour space up to 2019. All case law and legal developments around platform immunity have been analyzed. [2] Section 230 of the Communications Decency Act, ELECTRONIC FRONTIER FOUNDATION, https://www.eff.org/issues/cda230, accessible at Appendix [.] [3] Google has been steadily consolidating its Internet business by making strategic acquisitions - Matt Reynolds, If you can’t build it, buy it: Google’s biggest acquisitions mapped, WIRED, https://www.wired.co.uk/article/google-acquisitions-data-visualisation-infoporn-waze-youtube-android, also accessible at Appendix [.] [4] Joe Nocera, Why WhatsApp Is No Threat to Facebook’s Dominance, BLOOMBERG OPINION, https://www.bloomberg.com/opinion/articles/2018-05-04/whatsapp-and-instagram-are-no-threat-to-facebook-s-dominance, also accessible at Appendix [.] [5] IndiaSpend, Child-lifting rumours caused 69 mob attacks, 33 deaths in last 18 months, BUSINESS STANDARD, https://www.business-standard.com/article/current-affairs/69-mob-attacks-on-child-lifting-rumours-since-jan-17-only-one-before-that-118070900081_1.html, also accessible at Appendix [.] [6] Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military, NEW YORK TIMES, https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html, or accessible at Appendix [.] [7] Ibid [8] Ibid [9] Russia ‘meddled in all big social media’ around US election, BRITISH BROADCASTING CORPORATION-BBC, https://www.bbc.com/news/technology-46590890, also accessible at Appendix [.]
[10] Heather Timmons and Hanna Kozlowska, Facebook’s quiet battle to kill the first transparency law for online political ads, QUARTZ, https://qz.com/1235363/mark-zuckerberg-and-facebooks-battle-to-kill-the-honest-ads-act/, accessible at Appendix [.] [11] Rob McLean and Danielle Wiener-Bronner, Mark Zuckerberg in his own words: The CNN interview, CNN MONEY, https://money.cnn.com/2018/03/21/technology/mark-zuckerberg-cnn-interview-transcript/index.html, also accessible at Appendix [.]. [12] APC, Frequently asked questions on Internet Intermediary Liability, ASSOCIATION FOR PROGRESSIVE COMMUNICATIONS, https://www.apc.org/en/pubs/apc%E2%80%99s-frequently-asked-questions-internet-intermed, also accessible at Appendix [.] [13] OECD, Definitions, 9, THE ECONOMIC AND SOCIAL ROLE OF INTERMEDIARIES 2010, https://www.oecd.org/internet/ieconomy/44949023.pdf, also accessible at Appendix [.] [14] R. MacKinnon, E. Hickok, A. Bar, H. Lim, Fostering Freedom Online – The Role of Internet Intermediaries, UNESCO Series on Internet Freedoms, 2014, http://unesdoc.unesco.org/images/0023/002311/231162e.pdf, also accessible at Appendix [.] [15] Directive (EU) 2015/1535 of the European Parliament and of the Council, laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services, available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1551937833098&uri=CELEX:32015L1535, also accessible at Appendix [.] [16] 47 USC S.230 (f)(2), The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
[17] Information Technology Act 2000 Section 2(t), An “electronic record” is “data, record or data generated, image or sound stored, received or sent in an electronic form or micro film or computer generated micro fiche”. Section 2(o) The term “data” is defined as “a representation of information, knowledge, facts, concepts or instructions which are being prepared or have been prepared in a formalized manner, and is intended to be processed, is being processed or has been processed in a computer system or computer network, and may be in any form (including computer printouts, magnetic or optical storage media, punched cards, punched tapes) or stored internally in the memory of the computer”. [18] Information Technology Act 2000 Section 2(w) [19] Ibid. [20] Article 19, Internet Intermediaries: basic facts, 7 INTERNET INTERMEDIARIES: DILEMMA OF LIABILITY 2013, https://www.article19.org/data/files/Intermediaries_ENGLISH.pdf, also accessible at Appendix [.] [21] Google, Government Requests to Remove Content, Google Transparency Report, GOOGLE (Feb. 26, 2019, 2:50 PM), https://transparencyreport.google.com/government-removals/overview?hl=en, also accessible at Appendix [.] [22] Ibid [23] Ibid [24] Ibid [25] Ibid [26] The IT Act came into force in India on 17 October, 2000. [27] General Assembly of the UN, resolution A/RES/51/162 dated January 30, 1997, also accessible at Appendix [.]. [28] According to the previous Section 79 of the IT Act, network service providers meant - ‘intermediaries’ as defined under the Act. [29] Sec.
79 - Network service providers not to be liable in certain cases: For the removal of doubts, it is hereby declared that no person providing any service as a network service provider shall be liable under this Act, rules or regulations made thereunder for any third party information or data made available by him if he proves that the offence or contravention was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence or contravention. Explanation.— For the purposes of this section, — (a) “network service provider” means an intermediary; (b) “third party information” means any information dealt with by a network service provider in his capacity as an intermediary [30] The Information Technology (Amendment) Act, 2008 came into force on 27 October, 2009 - https://meity.gov.in/writereaddata/files/act301009_0.pdf and the amendment act can be accessed here: https://meity.gov.in/writereaddata/files/it_amendment_act2008%20%281%29_0.pdf, also accessible at Appendix [.]. [31] Avnish Bajaj v. State, 150 (2008) DLT 769, also accessible at Appendix [.] [32] Section 67 of the then IT Act: Publishing of information which is obscene in electronic form - Whoever publishes or transmits or causes to be published in the electronic form, any material which is lascivious or appeals to the prurient interest or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it, shall be punished on first conviction with imprisonment of either description for a term which may extend to five years and with fine which may extend to one lakh rupees and in the event of a second or subsequent conviction with imprisonment of either description for a term which may extend to ten years and also with fine which may extend to two lakh rupees. [33] Avnish Bajaj v. State, 150 (2008) DLT 769 [34] Aneeta Hada v.
Godfather Travels and Tours Pvt. Ltd, AIR 2012 SC 2795, also accessible at Appendix [.] [35] Avnish Bajaj v. State, 150 (2008) DLT 769 [36] Section 2(1)(w) of the IT Act. [37] Section 79 of the IT Act. [38] Accessible at Appendix [.] or https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32000L0031 [39] The Intermediaries Guidelines Rules, http://dispur.nic.in/itact/it-intermediaries-guidelines-rules-2011.pdf or accessible at Appendix [.] [40] To refer to the entire text of the Intermediaries Guidelines, kindly refer to https://www.wipo.int/edocs/lexdocs/laws/en/in/in099en.pdf or accessible at Appendix [.] [41] For a full list of prohibited content, refer to Rule 3(2) of the Intermediary Guidelines available at https://www.wipo.int/edocs/lexdocs/laws/en/in/in099en.pdf or accessible at Appendix [.] [42] The Report of the Committee, https://sflc.in/report-committee-subordinate-legislation-intermediaries-rules-tabled (SFLC.in had deposed before the committee highlighting its concerns with various provisions of the Intermediaries Guidelines). [43] Shreya Singhal v. Union of India, (2015) 5 SCC 1, also accessible at Appendix [.] [44] For the entire text of the erstwhile Section 66A, kindly refer to Appendix [.] [45] Article 19(2) of the Indian Constitution places reasonable restrictions on free speech in the interests of - sovereignty and integrity of India, security of the state, friendly relations with foreign states, public order, decency or morality, contempt of court, defamation, or incitement to an offence. [46] Para. 117 of the Shreya Singhal judgment [47] As held by the Supreme Court of India in Shreya Singhal [48] To refer to the entire text of the Draft Rules, see https://meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf or also accessible at Appendix [.] [49] The press note issued by MeitY, http://pib.nic.in/newsite/PrintRelease.aspx?relid=186770, also accessible at Appendix [.]
[50] https://sflc.in/our-comments-meity-draft-intermediaries-guidelines-amendment-rules-2018 and here - https://sflc.in/our-counter-comments-meity-draft-intermediaries-guidelines-amendment-rules-2018 [51] Explanation of the end-to-end encryption used by WhatsApp on its service, WHATSAPP, https://faq.whatsapp.com/en/android/28030015/ [52] WP (Civil) No. 494 of 2012, accessible at Appendix [.] [53] The Supreme Court read in informational and communicational privacy as facets of the larger right to privacy in K.S Puttaswamy v. Union of India [54] The Supreme Court in K. S. Puttaswamy v. UoI held that - “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III (fundamental rights) of the Constitution.” [55] Sydney Li, Jamie Williams, Despite What Zuckerberg’s Testimony May Imply, AI Cannot Save Us, https://www.eff.org/deeplinks/2018/04/despite-what-zuckerbergs-testimony-may-imply-ai-cannot-save-us/, also accessible at Appendix [.] [56] Natasha Duarte, Emma Llansó (Center for Democracy & Technology) and Anna Loup (University of Southern California), Mixed Messages? The Limits of Automated Social Media Content Analysis, presented at the 2018 Conference on Fairness, Accountability, and Transparency, https://cdt.org/files/2017/12/FAT-conference-draft-2018.pdf, or also accessible at Appendix [.] [57] Jon Russell, Hike unbundles its messaging app to reach India’s next wave of smartphone users, TECHCRUNCH, https://techcrunch.com/2018/01/16/hike-unbundles-its-messaging-app/, also accessible at Appendix [.] [58] Aria Thaker, Indian politicians are now flocking to an unlikely “no English” social network, QUARTZ, https://qz.com/india/1414241/sorry-facebook-indias-bjp-and-congress-flock-to-sharechat/, also accessible at Appendix [.]
[59] Such chilling effect has already been witnessed as a result of Section 66A [60] S.91 of CrPC - the Omnipotent provision? by SFLC.in, can be accessed here - https://sflc.in/s91-crpc-omnipotent-provision [61] Certain intermediaries stated that Section 91 of the CrPC is being used for taking down content [62] Section 69 and 69A of the IT Act, available in the Appendix [.] [63] As discussed previously, Section 81 of the IT Act precludes the applicability of other laws in terms of conflicting provisions. [64] Section 52(b) and (c) of the Copyright Act, 1957 accessible in Appendix [.], or https://copyright.gov.in/documents/copyrightrules1957.pdf [65] Section 81 of the IT Act: Act to have overriding effect. – The provisions of this Act shall have effect notwithstanding anything inconsistent therewith contained in any other law for the time being in force. [Provided that nothing contained in this Act shall restrict any person from exercising any right conferred under the Copyright Act, 1957 (14 of 1957) or the Patents Act, 1970 (39 of 1970).] [66] Rule 3(2)(d) and Rule 3(3) of the Intermediaries Guidelines [67] Section 52(c) of the Copyright Act and Rule 75 of the Copyright Rules, 2013. The Act can be found in the annexure [.] and the rules can be accessed at Appendix [.] or https://copyright.gov.in/Documents/Copyright_Rules_2013_and_Forms.pdf [68] Myspace v. Super Cassettes Industries Ltd., 236 (2017) DLT 478, also accessible at Appendix [.] [69] In Shreya Singhal, the Supreme Court restricted takedown requests to matters contained under Article 19(2) of the Constitution of India [70] Kent RO Systems Ltd. v. Amit Kotak, [240 (2017) DLT3], also accessible at Appendix [.] [71] Christian Louboutin SAS v. Nakul Bajaj, [253 (2018) DLT 728], also accessible at Appendix [.] [72] Though the definition of intermediary as per the IT Act specifically includes - online auctions sites and online marketplaces. Kindly refer to Section 2(1)(w) of the IT Act.
[73] As guaranteed by the Indian Constitution under Article 19(1)(a) [74] Divij Joshi, SaReGaMa Pardon Me, You Have the Wrong Address: On the Perils and Pitfalls of Notice and Takedown, SPICY IP, https://spicyip.com/2019/02/saregama-pardon-me-you-have-the-wrong-address-on-the-perils-and-pitfalls-of-notice-and-takedown.html, also accessible at Appendix [.] [75] Kent RO Systems Ltd. v. Amit Kotak, [240 (2017) DLT3] [76] Draft National e-Commerce Policy, India’s Data for India’s Development, Department of Industrial Policy and Promotion (Feb 25, 2019, 4:15PM), https://dipp.gov.in/whats-new/draft-national-e-commerce-policy-stakeholder-comments, also accessible at Appendix [.] [77] Avnish Bajaj v. State, 150 (2008) DLT 769 [78] Post this judgment, the intermediary law of India was amended. [79] Google v. Visakha Industries, [Criminal Petition No. 7207 of 2009] [80] Shreya Singhal v. Union of India, (2015) 5 SCC 1 [81] Myspace Inc. vs. Super Cassettes Industries Ltd. [236 (2017) DLT 478], also accessible at Appendix [.] [82] 2017 (69) PTC 551 (Del), also accessible at Appendix [.] [83] R. Bajaj, In a Welcome Development, Delhi High Court Refuses to Compel Intermediaries to Screen Content Violative of Intellectual Property Laws on an Ex-ante Basis, SpicyIP, https://spicyip.com/2017/03/in-a-welcome-development-delhi-high-court-refuses-to-compel-intermediaries-to-screen-content-violative-of-intellectual-property-laws-on-an-ex-ante-basis.html or also accessible at Appendix [.] [84] 2017 (69) PTC 551 (Del), also accessible at Appendix [.] [85] 2018 (1) CTC 506, also accessible at Appendix [.] [86] The IT Act gives the government powers to request for information, intercept, decrypt and also takedown content as per Section 69 and 69A [87] Christian Louboutin SAS v. Nakul Bajaj & Ors, Civil Suit No. 344/2018, also accessible at Appendix [.] [88] AIR 2018 SC 578, can also be accessed at Appendix [.] [89] [W.P.(C) No. 177/2013], also accessible at Appendix [.]
[90] SMW (Crl) No. 3/2015, also accessible at Appendix [.] [91] B. Sinha, SC orders CBI probe into rape videos circulated on WhatsApp, HINDUSTAN TIMES (June 25, 2018, 5:23PM), http://www.hindustantimes.com/india/sc-orders-cbi-probe-into-rape-videos-circulated-on-whatsapp/story-6OUlIUVqd0n-VqKHrXPxyeK.html, also accessible at Appendix [.] [92] Search Removals under Privacy Law, Google Transparency Report, GOOGLE (Feb 26, 2019, 12:00PM), https://transparencyreport.google.com/eu-privacy/overview?hl=en, also accessible at Appendix [.] [93] Ibid [94] Ioania Stupariu, Defining the Right to be Forgotten: A Comparative Analysis between the EU and US, CENTRAL EUROPEAN UNIVERSITY, 2015, accessible at Appendix [.] [95] Ibid [96] Ibid [97] Kindly refer to Section 27 of the Draft Personal Data Protection Bill, 2018, which can be accessed here - https://meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018_0.pdf, or at Appendix [.] [98] W.P. (C) 1021/2016, also accessible at Appendix [.] [99] W.P (C) No. 8477/2016, also accessible at Appendix [.] [100] Marital discord has Google in dock, THE DECCAN CHRONICLE (June 6, 2018, 5:05PM), http://www.deccanchronicle.com/nation/current-affairs/040316/marital-discord-has-google-in-dock.html, also accessible at Appendix [.] [101] For the case status, kindly refer to https://services.ecourts.gov.in/ecourtindiaHC/cases/case_no.php?state_cd=4&dist_cd=1&court_code=1&stateNm=Kerala#, also accessible at Appendix [.] [102] 2017 SCC Kar 424, also accessible at Appendix [.] [103] 2015 SCC Guj 2019, also accessible at Appendix [.] [104] This information has been compiled based on the public submissions of the entities mentioned herein, to the government consultation held on the Draft Rules.

