Defining an intermediary
An intermediary in the context of the Internet can be understood as an entity that acts as a facilitator of the flow of data across the vast and complex synapses of the Internet. While the actual functions of intermediaries are dynamic and often not clear-cut, they can broadly be seen as falling into one of two categories: conduits for data traveling between nodes of the Internet, and hosts for such data. An Internet intermediary could therefore refer to Telecom Service Providers (TSPs) that supply network infrastructure like optic fiber cables and spectrum bandwidth over which Internet data is transmitted, Internet Service Providers (ISPs) that utilize this infrastructure to offer Internet connectivity to the public, web-hosting platforms that provide servers on which Internet data is stored, search engines that sort through and index petabytes of data for easy retrieval, and the myriad online services that provide ways for end-users to leverage the power of the Internet for the efficient conduct of activities like commerce, governance, education, entertainment, and social networking, to name a few. In other words, intermediaries play very crucial roles in the functioning of the Internet. Owing to the complex and diverse nature of functions performed by intermediaries, significant variations can be seen in global and national efforts at formally defining the term. The Organization for Economic Co-operation and Development (OECD) in April 2010 proposed that “Internet intermediaries” be defined as follows:
“Internet intermediaries bring together or facilitate transactions between third parties on the Internet. They give access to, host, transmit and index content, products and services originated by third parties on the Internet or provide Internet-based services to third parties.”
The OECD also identified the following as falling within the scope of this definition, though it was also careful to leave room for future expansion: ISPs, data processing and web-hosting providers, search engines, e-commerce platforms, Internet payment systems, and participative networking platforms. This definition was also cited by the United Nations Educational, Scientific and Cultural Organization (UNESCO) in a 2014 report on Internet freedoms.
Some national jurisdictions, on the other hand, have chosen not to define the term “intermediary” as such in the relevant laws. Instead, broader alternative terms like “information society services” and “interactive computer services” are employed, and intermediary regulations are incorporated into law without referencing the term “intermediary”.
The above being said, this report examines intermediary liability primarily in the context of Indian law. As such, the best place to look to understand the term “intermediary” for the purposes of this report is the IT Act, specifically Section 2(1)(w), which defines the term in some detail.
Section 2(1)(w) reads:
“Intermediary, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, Internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes.”
According to Section 2(1)(w) of the IT Act therefore, an intermediary is any person who receives, stores or transmits an electronic record on behalf of another person or provides any service with respect to that record. The Section then clarifies that the term includes telecom service providers, network service providers, Internet service providers, web hosting service providers, search engines, online payment sites, online auction sites, online marketplaces and cyber cafes. This list is non-exhaustive and Section 2(1)(w) also covers entities such as social media websites, blogging platforms, message boards, consumer review websites and so on. In other words, virtually any website that features user-generated content and a large number of Internet service providers fall within the definition of an intermediary under Section 2(1)(w) of the IT Act.
User-generated content and liability
“Intermediary liability”, to put it simply, refers to the extent of liability that an intermediary stands to incur when content it deals in is impermissible under law. Since intermediaries neither create nor modify content, the predominant consensus has been that it would be inequitable to hold them strictly accountable for unlawful user-generated content. Users of intermediary services are the true content creators, and as such, it has generally been felt that they should be the ones made to answer for the illegality of content hosted or transmitted on intermediary platforms, unless intermediaries exercise meaningful degrees of editorial control. However, some jurisdictions, such as China and Thailand, have opted to see things differently and maintained that it is the responsibility of platform providers, i.e. intermediaries, to ensure that the content they host or transmit remains within the confines of legal permissibility.
Based on these divergent viewpoints, three broad models of intermediary liability have emerged globally, as pointed out by Article 19 in their 2013 report titled “Internet Intermediaries: Dilemma of Liability”. These are:
a) The strict liability model: Intermediaries are held unconditionally liable for user-generated content. Intermediaries are effectively required to monitor content in order to comply with the law; if they fail to do so, they face a variety of sanctions, including the withdrawal of their business license and/or criminal penalties. Examples include Thailand and China.
b) The safe-harbour model: Intermediaries are given conditional immunity from liability arising out of user-generated content, i.e. immunity applies only if they comply with certain requirements laid out under law. This model can be further divided into:
i) The vertical model: Liability is determined according to the type of content at issue. No distinctions are made as to the type of service provided by intermediaries, e.g. hosting vs. transmitting.
ii) The horizontal model: Liability is determined according to the kind of function performed by the intermediary. Intermediaries acting only as a transmitter of content may thus be exempted unconditionally from liability, whereas those acting as hosts may be held to more stringent standards. The latter may forfeit immunity if they do not expeditiously remove unlawful content on being notified.
The safe-harbour model is also characterized by the existence of “notice-and-takedown” processes, which are legally prescribed procedures that clearly outline how content takedown requests must be received and processed by intermediaries. Intermediaries may further be encouraged to institute some form of technology-based or self-regulatory content filters so as to prevent the publication of unlawful content. The EU e-commerce Directive, US Digital Millennium Copyright Act and the Indian IT Act are legislations that employ this model of intermediary regulation.
c) The broad immunity model: Intermediaries are given broad, at times conditional, immunity from liability arising out of user-generated content. Notably, intermediaries are also expressly excluded from any obligation to monitor for unlawful content. This model treats intermediaries as messengers who merely transmit content on behalf of users, rather than publishers of content. Section 230 of the Communications Decency Act is an example of this model.
Regardless of the model, almost all regulatory regimes overseeing Internet intermediaries obligate them to remove unlawful content from their platforms upon being asked to do so in accordance with applicable legal procedures. This, coupled with the fact that in some regulatory regimes the availability of immunity from liability is contingent on expeditious compliance with takedown requests, means that regulators and intermediaries alike must be mindful of the impact of their actions on freedom of expression, a fundamental human right recognized under almost all major national and international jurisdictions. Regulators that impose ambiguous content limitations, or that ask intermediaries to remove content based on their own judgement while running the risk of forfeiting safe-harbour protection for non-removal, adversely impact freedom of expression, as do intermediaries that over-comply with takedown requests. Google’s transparency reports show a sharp increase in the number of content takedown requests received from governments in recent times. While Google received 1,031 such requests in the second half of 2009, this number climbed to 15,961 in the second half of 2016, roughly a fifteen-fold increase. The latest report reveals that 25,534 requests were received in the first half of 2018 alone. According to this report, national security is the most cited reason for takedown requests, with 11,430 and 17,999 requests in the years 2016 and 2017 respectively. This is followed by defamation, which rose from 3,440 to 4,257 requests between 2016 and 2017. Takedown requests on the basis of ‘Privacy and Security’ have also increased, from 2,404 to 2,497 requests over the same period.
The Intermediary Liability Regime in India
Enlarging the Scope of Safe-Harbour
The Indian Government enacted the IT Act to provide legal recognition to e-commerce, to facilitate electronic filing of documents with government agencies, and to amend other existing laws like the Indian Penal Code, 1860 and the Indian Evidence Act, 1872. This followed the UN General Assembly’s adoption of the Model Law on Electronic Commerce issued by the United Nations Commission on International Trade Law, to which India was a signatory. According to the Statement of Objects and Reasons of the IT Act, “There is a need for bringing in suitable amendments in the existing laws in our country to facilitate e-commerce. It is, therefore, proposed to provide for legal recognition of electronic records and digital signatures.”
At the time the IT Act was enacted, the definition of the term ‘intermediary’ was as follows:
“intermediary” with respect to any particular electronic message means any person who on behalf of another person receives, stores or transmits that message or provides any service with respect to that message.
Section 79 is currently the provision that guarantees safe-harbour protection to intermediaries for third-party content. Section 79 of the original Act protected only network service providers from liability arising from third-party content, and only if they proved absence of knowledge, or that they had exercised due diligence to prevent the commission of the offence or contravention.
Subsequently, an amendment to the IT Act in 2008 (“the IT Amendment Act”) made substantial changes to Section 79 (the safe-harbour provision) and to the definition of intermediaries. One of the triggers for amending the IT Act in 2008, specifically for widening the protection given to intermediaries, was the MMS scandal affecting Baazee.com (at that time, a wholly owned subsidiary of eBay Inc., USA). In this case, an MMS clip containing sexually explicit content was listed for sale on Baazee.com (an e-commerce website). For the sale of such content on the website, Avnish Bajaj, the then Managing Director of Baazee.com, was arrested and criminally charged under provisions of the Indian Penal Code, 1860 (“the IPC”) and the IT Act dealing with acts of obscenity. In a petition challenging the criminal charges against him, the Delhi High Court in Avnish Bajaj v. State held that a prima facie case for obscenity could be made against Baazee.com but not against Avnish Bajaj under the IPC, though he could be charged with publishing obscene content in electronic form under Section 67 of the IT Act (it is important to note that Baazee.com was not arraigned in the case as an accused). The court in its judgment stated that owners or operators of websites that offer space for listings might have to employ content filters to prove that they did not knowingly permit the use of their website for pornographic material. On an appeal by Avnish Bajaj against the charge under Section 67 of the IT Act, the Supreme Court of India, in the year 2012, quashed the proceedings against him on the ground that prosecution of the Managing Director could not go ahead without arraigning the company as an accused party.
Drawing parallels between the Negotiable Instruments Act, 1881 and the IT Act in terms of offence by companies and the consequent liability of its officers, the court held that vicarious liability will only arise when the company is arraigned as an accused party.
The IT Amendment Act enlarged the definition of the word ‘intermediary’ to cover service providers like telecom service providers, Internet service providers, search engines, online marketplaces and even cyber cafes. It also widened the safe-harbour protection under Section 79 from only network service providers to all intermediaries, and protected intermediaries from all unlawful acts rather than only the offences and contraventions covered under the IT Act itself. This new provision adopted a function-based approach: if the intermediary (a) only provided access to a communication system for information made available by third parties, which is transmitted or temporarily stored/hosted; and (b) did not initiate the transmission, select the receiver, or select/modify the information, then it could claim protection under this provision for content made available by third parties (user-generated content).
The amended provision made this safe-harbour protection available to intermediaries based on certain conditions:
I. Observance of due diligence and certain guidelines issued by the Central Government;
II. Not conspiring, abetting, aiding or inducing the commission of the unlawful act; and
III. Upon receiving ‘actual knowledge’ or being notified by the government, taking down unlawful content.
In the Report of the Expert Committee, set up by the Ministry of Information and Technology in 2005 to recommend changes to the IT Act, the rationale for amending the safe-harbour provision, i.e. Section 79, was explained as the need to bring it in line with the EU’s Directive on e-commerce (2000/31/EC).
3.2 ‘Due Diligence’ Guidelines for Attaining Safe-Harbour
After the amendment to the IT Act in 2008, which incorporated the ‘due-diligence’ requirement for intermediaries claiming safe-harbour, the Government of India on 11th April, 2011, issued the Information Technology (Intermediaries Guidelines) Rules, 2011 (“the Intermediaries Guidelines”). The Intermediaries Guidelines, inter alia, brought in the following conditions, which all intermediaries had to adhere to for their safe-harbour protection:
a) Publishing rules/regulations; privacy policies; user agreements;
b) Terms and conditions specifying prohibited content - grossly harmful, harmful to minors, infringing intellectual property rights, containing viruses (among other things);
c) A strict notice and takedown process;
d) Assistance to government agencies for law enforcement;
e) A duty to report cyber security incidents to the government; and
f) Appointment and notification of a grievance officer.
According to the thirty-first report of the Parliamentary Committee on Subordinate Legislation, which studied the Intermediaries Guidelines among other delegated legislation notified by the Indian Government under the IT Act, there were a number of ‘infirmities’ with the Intermediaries Guidelines, which the report identified as:
a) Ambiguous and Vague Terms: the committee recommended that, to remove such ambiguity, terms borrowed from other laws should be incorporated within the guidelines, and undefined terms should be defined and inserted into the text.
b) Removal of Content by Intermediaries: the committee noted the need for clarity on the notice-and-takedown process and recommended safeguards to protect against abuse during the process.
c) Reconstitution of the CRAC (the Cyber Regulations Advisory Committee): the committee recommended that the CRAC be reconstituted. It found that the CRAC had met only twice since the enactment of the IT Act in the year 2000. According to the committee, MeitY would benefit from the advice of the CRAC, and it should incorporate members who represent the interests of those principally affected and who have special knowledge of the subject matter.
Unfortunately, none of the recommendations made by the Committee on Subordinate Legislation were incorporated by the government either at the time of such consultation or subsequently.
3.3 Narrowing the scope of ‘actual knowledge’
In a batch of writ petitions filed before the Supreme Court of India starting from 2012, a number of provisions of the IT Act were challenged - Section 66A (punishment for sending offensive messages), Section 69A (power to block websites) and Section 79 (the safe-harbour provision) - for severely affecting the fundamental right of free speech and expression guaranteed under Article 19(1)(a) of the Constitution of India. This case, Shreya Singhal v. Union of India (popularly known as the Shreya Singhal judgment), struck down Section 66A of the IT Act as unconstitutional for having a chilling effect on free speech. (Section 66A provided for punishment for sending offensive messages through communication services. It created criminal liability for sending information which was grossly offensive, inconvenient, insulting, dangerous, etc.)
This was a landmark judgment in the Supreme Court’s jurisprudence, as for the first time the court recognized the Indian citizen’s free speech rights over the Internet and struck down a draconian provision of the IT Act. As India’s Constitution provides for ‘reasonable restrictions’ on free speech in certain circumstances [as per Article 19(2) of the Constitution], the court in Shreya Singhal tried to read the elements of Article 19(2) into Section 66A, but failed to do so.
On the issue of intermediary liability, the Supreme Court read down Section 79 and held that the ‘actual knowledge’ requirement for an intermediary to take down content has to be read to mean either an intimation in the form of a court order or a notification by the government, and that such requests must be restricted to the limitations listed under Article 19(2) of the Constitution. The court similarly read down the ‘actual knowledge’ requirement from the Intermediaries Guidelines, which operationalised the notice-and-takedown mechanism under law:
“119. (c) Section 79 is valid subject to Section 79(3)(b) being read down to mean that an intermediary upon receiving actual knowledge from a court order or on being notified by the appropriate government or its agency that unlawful acts relatable to Article 19(2) are going to be committed then fails to expeditiously remove or disable access to such material. Similarly, the Information Technology “Intermediary Guidelines” Rules, 2011 are valid subject to Rule 3 sub-rule (4) being read down in the same manner as indicated in the judgment.”
This marked a significant change in the intermediary liability regime in India, as previously any person could request intermediaries to take down content they felt was unlawful. The law had also placed intermediaries in the precarious position of having to adjudge the legality of content on their platforms, which directly conflicted with their status as mere functionaries. In fact, the Supreme Court in Shreya Singhal acknowledged that intermediaries like Google and Facebook would otherwise have to act upon millions of takedown requests, making them the adjudicators of which requests were legitimate according to law.
The following inferences can be drawn to broadly sum up India’s intermediary liability law:
a) Intermediaries need to fulfill the conditions under Section 79 of the IT Act as discussed above (conditional safe-harbour);
b) Intermediaries are required to comply with all requirements listed under the Intermediaries Guidelines (due diligence rules); and
c) Intermediaries, other than enforcing their own terms and conditions and privacy policies, are liable to take down content from their platforms only when notified by a court or an authorised government agency, and that too for matters listed under Article 19(2) of the Constitution (the actual knowledge requirement).
3.4 Proposed Amendment to Intermediaries Guidelines
On 24th December, 2018, MeitY released the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018 (“the Draft Rules”) to amend the existing Intermediaries Guidelines. These Draft Rules sought to introduce requirements on intermediaries such as tracing the originator of information to assist law enforcement, deployment of automated tools for proactive filtering of unlawful content, takedown of illegal content within 24 hours, and mandatory incorporation for companies having more than 5 million users in India (among other things).
In a press note issued by MeitY alongside the Draft Rules, it was mentioned that social network platforms are required to follow due diligence as provided in Section 79 of the IT Act and the Rules notified thereunder, and that, subject to the import of Article 19(2) of the Constitution, they have to ensure that their platforms are not used to commit and provoke terrorism, extremism, violence and crime. The press note also states that instances of misuse of social media platforms by criminals and anti-national elements have brought new challenges to law enforcement agencies, such as inducement for recruitment of terrorists, circulation of obscene content, spread of disharmony, incitement of violence, threats to public order, fake news, etc. The press note points to fake news/rumours circulated on WhatsApp and other social media platforms in connection with various mob-lynching incidents reported across India in the last year. As MeitY has not issued any other official statement on its intent in revising the Intermediaries Guidelines under the IT Act, the Draft Rules need to be read in conjunction with the press note for a critical examination of the proposed changes therein.
MeitY invited comments on the Draft Rules and received responses from around 150 stakeholders, a number of them expressing concern that the proposed guidelines could severely affect the free speech and privacy rights of citizens online.
Key Issues with the Draft Rules
A. The Traceability Requirement: Rule 3(5) of the Draft Rules requires intermediaries to enable the tracing of the originator of information on their platforms as may be required by authorised government agencies. The most concerning aspect of this requirement is how it will affect intermediaries like WhatsApp and Signal, which provide personal communication services that are end-to-end encrypted, i.e. where even the service provider does not have access to the content of the messages/information flowing through its platform. Introducing a traceability requirement for end-to-end encrypted services would lead to the breaking of such encryption, thus compromising the privacy of individuals who rely on such services for their private communication. In August 2017, a nine-judge bench of the Supreme Court in KS Puttaswamy v. UOI (“the Privacy Judgment”) held the right to privacy to be a fundamental right guaranteed under the Constitution of India.
B. Proactive Filtering of Content: Rule 3(9) of the Draft Rules requires intermediaries to deploy automated tools for the proactive filtering of unlawful content on their platforms. Online intermediaries are considered channels of distribution that play a merely neutral, technical and non-adjudicatory role. This Rule requires intermediaries to scrutinize user-generated content and determine its legality - a task which must be undertaken by the judiciary, considering that there are no clear standards of what is ‘unlawful’. This provision for proactive content filtering goes against the judgment in Shreya Singhal (discussed above), wherein the Supreme Court of India held that intermediaries are neutral platforms that need not exercise their own judgment to decide what constitutes legitimate content.
Automated moderation systems in use today rely on keyword tagging, which is then followed by human review. Even the most advanced automated systems cannot, at the moment, replace human moderators in terms of accuracy and efficiency. This is mainly because artificial intelligence is currently not mature enough to understand the nuances of human communication, such as sarcasm and irony. It should also be noted that global communication is influenced by cultural differences and overtones, which an effective system of content moderation has to adapt to. Given the nascent stage at which AI stands, it may be short-sighted to rely on this technology.
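The keyword-tagging-plus-human-review pipeline described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual system; the blocklist terms and the sample posts are invented for the example.

```python
# Minimal sketch of a keyword-tagging moderation pipeline:
# posts matching a blocklist are escalated to a human-review queue
# rather than removed automatically. Blocklist and posts are
# hypothetical, for illustration only.

import re

BLOCKLIST = {"attack", "bomb"}  # hypothetical flagged keywords


def keyword_tag(post: str) -> list[str]:
    """Return the blocklisted keywords found in a post."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    return sorted(words & BLOCKLIST)


def triage(posts: list[str]) -> tuple[list[str], list[tuple[str, list[str]]]]:
    """Split posts into an auto-approved list and a human-review queue."""
    approved, review_queue = [], []
    for post in posts:
        hits = keyword_tag(post)
        if hits:
            review_queue.append((post, hits))  # escalate to a moderator
        else:
            approved.append(post)
    return approved, review_queue


approved, queue = triage(["lovely weather today",
                          "plans to attack the server room"])
```

Note that the tagging step matches only surface keywords: a sarcastic or ironic use of a flagged word is escalated just the same, and a genuinely harmful post phrased without blocklisted terms passes untouched, which is precisely why the human-review stage remains indispensable.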
As societies evolve and change, so does the definition of “grossly harmful / offensive content”.
This implies that algorithms have to constantly understand nuanced social and cultural context that varies across regions. Research on AI has not yet produced any significant sets of data for this kind of understanding. The immediate result of using automated tools will be an increase in content takedowns and account suspensions which in turn will lead to over-censorship as has been seen around the world. Legitimate users (content creators) including journalists, human rights activists and dissidents will have their speech censored on a regular basis.
YouTube’s “Content ID” system for detecting content that infringes copyright has become notorious for over-censoring innocent material. The use of AI without human intervention for detecting hate speech, misinformation, disinformation, trolling, etc. - tasks even more nuanced than identifying copyrighted material - would be catastrophic for freedom of speech and expression on the Internet.
The key limitations of natural language processing tools are:
1. Natural language processing (“NLP”) tools perform best when they are trained and applied in specific domains, and cannot necessarily be applied with the same reliability across different contexts;
2. Decisions based on automated social media content analysis risk further marginalizing and disproportionately censoring groups that already face discrimination. NLP tools can amplify social bias reflected in language and are likely to have lower accuracy for minority groups who are under-represented in training data;
3. Accurate text classification requires clear, consistent definitions of the type of speech to be identified. Policy debates around content moderation and social media mining tend to lack such precise definitions;
4. The accuracy and intercoder reliability challenges documented in NLP studies warn against widespread application of the tools for consequential decision-making; and
5. Text filters remain easy to evade and fall far short of humans’ ability to parse meaning from text.
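The last point, that text filters are easy to evade, can be shown with a toy example. The blocklist and character substitutions below are hypothetical, purely to demonstrate how simple "leetspeak" swaps defeat naive string matching while the text stays readable to humans.

```python
# Toy demonstration that naive keyword filters are trivially evaded.
# The blocklist is a hypothetical example, not any real filter.

NAIVE_BLOCKLIST = {"scam"}


def naive_filter_blocks(text: str) -> bool:
    """Return True if the text contains a blocklisted substring."""
    return any(word in text.lower() for word in NAIVE_BLOCKLIST)


original = "this is a scam"
# Simple character substitutions ("leetspeak") defeat the filter.
evasive = original.replace("a", "@").replace("s", "$")  # "thi$ i$ @ $c@m"

blocked_original = naive_filter_blocks(original)  # blocked
blocked_evasive = naive_filter_blocks(evasive)    # slips through
```

A human reader parses the evasive string effortlessly, while the filter does not, which is the gap between string matching and "parsing meaning from text" that point 5 above describes.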
C. Local Office, Incorporation and Appointment of Nodal Officer: Rule 3(7) of the Draft Rules requires all intermediaries with more than 5 million users in India to be incorporated, have a permanent registered office in India with a physical address, and appoint a nodal officer and a senior functionary for 24-hour coordination with law enforcement agencies. At present there is a lack of clarity about what this number of users refers to, i.e. whether it means daily, monthly or yearly users, or the total number of registered users. To understand the implication of this requirement, reference to the user base of popular messaging apps is pertinent. WhatsApp, India’s most popular chatting app, has around 200 million users in India. Relatively newer chatting applications Hike and ShareChat have 100 million users and 25 million users respectively. The threshold of 5 million users specified in the Draft Rules represents around 1% of the Internet user base in India, which might bring a substantial number of intermediaries under a new set of compliance requirements. This may cause many start-ups to bear the brunt of high costs stemming from incorporation under Indian companies law - the Companies Act, 2013.
D. Ambiguous Terms: The Draft Rules contain mandates regarding a broad category of content that is classified as ‘unlawful’. Such a broad category of content, described using terms such as “grossly harmful”, “harassing” and “blasphemous”, could result in a chilling effect, with intermediaries being forced to remove even lawful content.
Intermediary Liability in Reality
Shreya Singhal brought welcome respite to Internet intermediaries in India, as they were no longer required to act upon sundry requests for content takedowns and could rely on court orders or notifications from authorised government agencies. The judgment also upheld the constitutionally guaranteed free speech rights of citizens on the Internet and clarified that restrictions on speech must fall within the contours of Article 19(2) of the Constitution. The court held that:
“86. That the content of the right under Article 19(1)(a) (free speech right) remains the same whatever the means of communication including Internet communication is clearly established …”

Problems remain, though. Constitutional limits on free speech - the security of the state, public order, decency/morality, defamation or incitement to an offence - are not defined; there are various tests established by courts for each of these limits, but they are to be determined based on the facts and circumstances of each case. The ambiguity surrounding the meaning of these words and phrases might make it difficult for intermediaries to act upon orders received from competent authorities based on these limits.
Phrases used in the Intermediaries Guidelines, which online platforms are required to incorporate in their terms and conditions, remain vague and undefined. According to these, content that is grossly harmful, hateful or blasphemous must not find a place on intermediary platforms. Following Shreya Singhal, such a mandate must come from courts or the government, but platforms might take down similar content relying on their community guidelines or terms and conditions, which may lead to private censorship.
Then there is the reality of online platforms being utilised by bad actors to disseminate disinformation, terrorist content, child pornography, etc., pushing governments around the world to hold intermediaries more accountable for third-party content on their platforms. In India, public lynchings attributed to rumour-mongering on intermediary platforms have prompted the government to propose changes such as automated content filtering and traceability, which will have negative effects on rights like free speech and privacy. Countries across the world are pressuring intermediaries to be more responsible for the content flowing through their platforms. Though intermediary liability needs to be revisited in the current global context, any changes to law and regulation must ensure that they do not abrogate basic human rights.
Content takedown requests are sometimes also received by intermediaries in the form of orders of law enforcement agencies under Section 91 of the Code of Criminal Procedure, 1973 (“CrPC”). Section 91 empowers courts and authorised police officers to summon the production of ‘any document or other thing’ which may be required for conducting an investigation. The IT Act gives ample powers to central and state governments for intercepting, monitoring, decrypting and taking down online content. No part of Section 91 of the CrPC gives law enforcement agencies the power to have content taken off online platforms; it only provides for the summoning of documents to aid investigation. Despite the specific applicability of the IT Act in matters of online content, law enforcement agencies fall back on general laws such as the CrPC to issue orders for content takedowns. The courts in India have held intermediaries more accountable for IP-protected content flowing through their channels, which is discussed in the next section.
Intermediary Liability and IP Disputes in India
Intermediary liability law in India is primarily governed by Section 79 of the IT Act, as discussed above. Under that provision, online intermediaries enjoy a safe harbour for third-party content on their platforms, so long as they comply with certain due diligence rules set out under the Intermediaries Guidelines. Provisions of the Copyright Act, 1957 also extend some protection to certain intermediaries. Section 79 of the IT Act, read with the ruling of the Supreme Court of India in Shreya Singhal, which broadened the protection given to intermediaries by requiring them to take down content only on the instructions of courts or authorised government agencies, is the authoritative law of the land on intermediary liability. It is important to point out, however, that in the context of intellectual property rights ("IP rights"), courts in India have placed a higher responsibility on intermediaries to take down content that infringes IP rights.
By Siddharth Dalmia
The Startup Sherpa
Notes:
APC, Frequently Asked Questions on Internet Intermediary Liability, ASSOCIATION FOR PROGRESSIVE COMMUNICATIONS, https://www.apc.org/en/pubs/apc%E2%80%99s-frequently-asked-questions-Inter-net-intermed, also accessible at Appendix [.]
OECD, Definitions, 9, THE ECONOMIC AND SOCIAL ROLE OF INTERMEDIARIES 2010, https://www.oecd.org/Internet/ieconomy/44949023.pdf, also accessible at Appendix [.]
R. MacKinnon, E. Hickok, A. Bar, H. Lim, Fostering Freedom Online – The Role of Internet Intermediaries, UNESCO Series on Internet Freedoms, 2014, http://unesdoc.unesco.org/images/0023/002311/231162e.pdf, also accessible at Appendix [.]
Directive (EU) 2015/1535 of the European Parliament and of the Council, laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services, available at https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1551937833098&uri=CELEX:32015L1535, also accessible at Appendix [.]
47 USC § 230(f)(2): The term "interactive computer service" means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
Information Technology Act 2000, Section 2(t): An "electronic record" is "data, record or data generated, image or sound stored, received or sent in an electronic form or micro film or computer generated micro fiche".
Information Technology Act 2000, Section 2(o): The term "data" is defined as "a representation of information, knowledge, facts, concepts or instructions which are being prepared or have been prepared in a formalized manner, and is intended to be processed, is being processed or has been processed in a computer system or computer network, and may be in any form (including computer printouts, magnetic or optical storage media, punched cards, punched tapes) or stored internally in the memory of the computer".
Information Technology Act 2000, Section 2(w)
Ibid.
Article 19, Internet Intermediaries: Basic Facts, 7, INTERNET INTERMEDIARIES: DILEMMA OF LIABILITY 2013, https://www.article19.org/data/files/Intermediaries_ENGLISH.pdf, also accessible at Appendix [.]
Google, Government Requests to Remove Content, Google Transparency Report, GOOGLE (Feb. 26, 2019, 2:50 PM), https://transparencyreport.google.com/government-removals/overview?hl=en, also accessible at Appendix [.]
Ibid.
Ibid.
Ibid.
Ibid.
The IT Act came into force in India on 17 October, 2000.
General Assembly of the UN, Resolution A/RES/51/162 dated January 30, 1997, also accessible at Appendix [.]
According to the previous Section 79 of the IT Act, network service providers meant 'intermediaries' as defined under the Act.
Sec. 79 – Network service providers not to be liable in certain cases: For the removal of doubts, it is hereby declared that no person providing any service as a network service provider shall be liable under this Act, rules or regulations made thereunder for any third party information or data made available by him if he proves that the offence or contravention was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence or contravention. Explanation.— For the purposes of this section, — (a) "network service provider" means an intermediary; (b) "third party information" means any information dealt with by a network service provider in his capacity as an intermediary.
The Information Technology (Amendment) Act, 2008 came into force on 27 October, 2009 - https://meity.gov.in/writereaddata/files/act301009_0.pdf; the amendment act can be accessed here: https://meity.gov.in/writereaddata/files/it_amendment_act2008%20%281%29_0.pdf, also accessible at Appendix [.]
Avnish Bajaj v. State, 150 (2008) DLT 769, also accessible at Appendix [.]
Section 67 of the then IT Act: Publishing of information which is obscene in electronic form - Whoever publishes or transmits or causes to be published in the electronic form, any material which is lascivious or appeals to the prurient interest or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it, shall be punished on first conviction with imprisonment of either description for a term which may extend to five years and with fine which may extend to one lakh rupees and in the event of a second or subsequent conviction with imprisonment of either description for a term which may extend to ten years and also with fine which may extend to two lakh rupees.
Avnish Bajaj v. State, 150 (2008) DLT 769
Aneeta Hada v. Godfather Travels and Tours Pvt. Ltd., AIR 2012 SC 2795, also accessible at Appendix [.]
Avnish Bajaj v. State, 150 (2008) DLT 769
Section 2(1)(w) of the IT Act.
Section 79 of the IT Act.
Accessible at Appendix [.] or https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32000L0031
The Intermediaries Guidelines Rules, http://dispur.nic.in/itact/it-intermediaries-guidelines-rules-2011.pdf or accessible at Appendix [.]
To refer to the entire text of the Intermediaries Guidelines, kindly refer to https://www.wipo.int/edocs/lexdocs/laws/en/in/in099en.pdf or accessible at Appendix [.]
For a full list of prohibited content, refer to Rule 3(2) of the Intermediaries Guidelines available at https://www.wipo.int/edocs/lexdocs/laws/en/in/in099en.pdf or accessible at Appendix [.]
The Report of the Committee, https://sflc.in/report-committee-subordinate-legislation-intermediaries-rules-tabled (SFLC.in had deposed before the committee, highlighting its concerns with various provisions of the Intermediaries Guidelines).
Shreya Singhal v. Union of India, (2015) 5 SCC 1, also accessible at Appendix [.]
For the entire text of the erstwhile Section 66A, kindly refer to Appendix [.]
Article 19(2) of the Indian Constitution places reasonable restrictions on free speech in the interests of sovereignty and integrity of India, security of the State, friendly relations with foreign States, public order, decency or morality, contempt of court, defamation, or incitement to an offence.
Para. 117 of the Shreya Singhal judgment.
As held by the Supreme Court of India in Shreya Singhal.
To refer to the entire text of the Draft Rules, see https://meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf, also accessible at Appendix [.]
The press note issued by MeitY, http://pib.nic.in/newsite/PrintRelease.aspx?relid=186770, also accessible at Appendix [.]
https://sflc.in/our-comments-meity-draft-intermediaries-guidelines-amendment-rules-2018 and https://sflc.in/our-counter-comments-meity-draft-intermediaries-guidelines-amendment-rules-2018
Explanation of the end-to-end encryption used by WhatsApp on its service, WHATSAPP, https://faq.whatsapp.com/en/android/28030015/
WP (Civil) No. 494 of 2012, accessible at Appendix [.]
The Supreme Court read in informational and communicational privacy as facets of the larger right to privacy in K.S. Puttaswamy v. Union of India.
The Supreme Court in K.S. Puttaswamy v. Union of India held that "the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III (fundamental rights) of the Constitution."
Sydney Li, Jamie Williams, Despite What Zuckerberg's Testimony May Imply, AI Cannot Save Us, https://www.eff.org/deeplinks/2018/04/despite-what-zuckerbergs-testimony-may-imply-ai-cannot-save-us/, also accessible at Appendix [.]
Natasha Duarte, Emma Llansó (Center for Democracy & Technology), Anna Loup (University of Southern California), Mixed Messages? The Limits of Automated Social Media Content Analysis, presented at the 2018 Conference on Fairness, Accountability, and Transparency, https://cdt.org/files/2017/12/FAT-conference-draft-2018.pdf, also accessible at Appendix [.]
Jon Russell, Hike unbundles its messaging app to reach India's next wave of smartphone users, TECHCRUNCH, https://techcrunch.com/2018/01/16/hike-unbundles-its-messaging-app/, also accessible at Appendix [.]
Aria Thaker, Indian politicians are now flocking to an unlikely "no English" social network, QUARTZ, https://qz.com/india/1414241/sorry-facebook-indias-bjp-and-congress-flock-to-sharechat/, also accessible at Appendix [.]
Such a chilling effect has already been witnessed as a result of Section 66A.
S.91 of CrPC - the Omnipotent Provision?, SFLC.in, can be accessed here: https://sflc.in/s91-crpc-omnipotent-provision
Certain intermediaries stated that Section 91 of the CrPC is being used for taking down content.
Sections 69 and 69A of the IT Act, available in the Appendix [.]
As discussed previously, Section 81 of the IT Act precludes the applicability of other laws in terms of conflicting provisions.
Section 52(b) and (c) of the Copyright Act, 1957, accessible in Appendix [.], or https://copyright.gov.in/documents/copyrightrules1957.pdf