Cast Adrift without Safe Harbor: The Risks of Ignoring IT Act Protections (PART 9)
INTERMEDIARY LIABILITY IN OTHER JURISDICTIONS
Different jurisdictions establish different enactments and procedures to restrict content that is considered unlawful, and different regimes follow different legal frameworks to grant conditional immunity or safe harbour to intermediaries. The notice and notice model obliges intermediaries to forward any complaint of alleged copyright infringement they receive from the copyright owner to the user or subscriber in question. This procedure is followed in Canada and is enshrined in the Copyright Modernization Act, whose notice provisions came into effect in January 2015. Under this model, receiving a notice does not necessarily mean that the subscriber has infringed copyright, and the notice does not require the subscriber to contact the copyright owner or the intermediary.[1] The objective of the notice-and-notice regime is therefore to discourage online infringement by Internet subscribers and to raise awareness in instances where subscribers’ accounts are being used for such purposes by others.[2] It enables the copyright owner and the subscriber to resolve the dispute between themselves without the involvement of the intermediary.
The second model is the notice and takedown model, followed by countries like South Korea[3] and the United States of America.[4] Under this system, an intermediary responds to government notifications, court orders or notices issued by private parties by promptly removing or disabling access to the allegedly illegal content. This self-regulatory framework, in which ISPs determine whether or not a website contains illegal or harmful content, raises questions of accountability, transparency and the overall appropriateness of delegating content regulation to private actors, who are made to act as judges.[5] This has been described as the “privatization of censorship.”[6]
The third model is called the Graduated Response model or the “three strikes system.” Under this system, rights holders may ask intermediaries to send warnings to subscribers identified as engaging in illegal file sharing or copyright infringement. The intermediary may be required to send more than one notice, with repeat infringers risking bandwidth reduction and sometimes even complete suspension of the account. France, New Zealand, Taiwan, South Korea and the United Kingdom have enacted legislation that requires intermediaries to exercise a certain degree of policing to protect rights holders’ interests. Some countries like the United States and Ireland permit private arrangements between rights holders and intermediaries to accomplish the same end.
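The mechanics of a graduated-response scheme are essentially a per-subscriber strike counter. The following is a minimal sketch in Python; the thresholds, sanctions and function names are illustrative assumptions, not drawn from any particular statute.

```python
# A minimal sketch of a graduated-response ("three strikes") counter at an
# intermediary. Thresholds and sanctions are assumed for illustration.
from collections import defaultdict

WARNING_LIMIT = 2    # assumed: the first two notices only trigger warnings
THROTTLE_STRIKE = 3  # assumed: the third strike reduces bandwidth

_strikes = defaultdict(int)

def register_notice(subscriber_id):
    """Record a rights-holder notice and return the intermediary's action."""
    _strikes[subscriber_id] += 1
    count = _strikes[subscriber_id]
    if count <= WARNING_LIMIT:
        return "forward warning notice #%d to subscriber" % count
    if count == THROTTLE_STRIKE:
        return "reduce subscriber bandwidth"
    return "suspend subscriber account"  # repeat infringer

# Four successive notices against the same subscriber escalate the response.
for _ in range(4):
    print(register_notice("subscriber-42"))
```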
United States of America
The law relating to intermediary liability in the United States of America is mostly governed by Section 512(c) of the Digital Millennium Copyright Act (“DMCA”) and Section 230 of the Communications Decency Act (“CDA”). Section 512 of the DMCA was enacted by the US Congress to limit the liability of intermediaries and to check online copyright infringement, including limitations on liability for compliant service providers to help foster the growth of Internet-based services.[7] The intermediary must comply with the notice-and-takedown procedure under Section 512 to qualify for protection.
The CDA was originally enacted to restrict indecent content online, but those restrictive provisions were later struck down as unconstitutional. Section 230 is considered one of the most valuable tools for protecting intermediaries from liability for third party generated content. It reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The section encompasses claims of defamation, invasion of privacy, tortious interference, civil liability for criminal law violations, and general negligence claims based on third party content.[8]
The legislation also contains a policy statement at Section 230(b) that favours safe harbour for actions taken to “encourage the development of technologies that maximize user control over what information is received by individuals” and “to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material.”
In 2018, the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Sex Trafficking Online Act (FOSTA), jointly known as FOSTA-SESTA, were passed, expanding criminal liability for classifieds websites like Backpage.com, which was alleged to host ads from sex traffickers in its adult services section. Backpage.com had claimed that it was an intermediary and not responsible for content uploaded by users. Although the new law is well-intentioned, it dilutes the protection provided by Section 230 of the Communications Decency Act, which has been considered the most valuable piece of legislation protecting freedom of speech and expression online, by implicating intermediaries for user generated content.
Case Studies
Dart v. Craigslist, Inc.[9]
Craigslist is the largest online classified advertisement service in the United States. Postings on the site include advertisements for jobs, housing, sale of various items and other services. The listings also included a section for “erotic services”, even though Craigslist’s terms and conditions categorically forbid the advertisement of illegal activities.
The “erotic services” section caught the attention of state and local law enforcement, as some users were using it to advertise illegal services. In March 2008, the Attorney General of Connecticut, on behalf of the Attorneys General of forty other states, sent a notice to Craigslist to remove the ads that publicized prostitution and other illicit activities prohibited under state law. In November 2008, Craigslist reached an agreement with the Attorneys General to implement steps to hinder illegal listings in the erotic services section, though not to remove the section entirely. Subsequently, Craigslist announced a ninety percent drop in its erotic services listings.
Four months later, Craigslist was sued by Thomas Dart, the Sheriff of Cook County, Illinois, who claimed that the site created a “public nuisance” under Illinois state law because its “conduct in creating erotic services, developing twenty-one categories, and providing a word search function causes a significant interference with the public’s health, safety, peace, and welfare.”[10] Craigslist ultimately won the case on the grounds of Section 230(c)(1) of the CDA: the court held that Craigslist was an interactive computer service (intermediary) and hence immune from liability for wrongs committed by third parties. Craigslist nevertheless removed the phrase “erotic services” and replaced it with “adult services.” The case is considered a victory for online speech.
Later, due to mounting pressure, Craigslist completely removed the “adult services” section from its website and the link to the section was replaced with a black label reading “censored.”
Viacom International, Inc. v. YouTube, Inc.[11]
In March 2007, Viacom filed a lawsuit against Google and YouTube alleging copyright infringement by YouTube’s users, seeking USD 1 billion in damages for the infringement of more than a hundred thousand videos owned by Viacom. Thereafter, several class action lawsuits were also filed against YouTube by sports leagues, music publishers and other copyright owners.
These lawsuits tested the strength of the DMCA safe harbour as applied to online service providers that host text, audio and video on behalf of users.[12] In June 2010, the United States District Court for the Southern District of New York held that YouTube, being an intermediary, was protected by the DMCA safe harbour. The judge observed that compelling online platforms to constantly police videos uploaded by third parties “would contravene the structure and operation of the DMCA.”[13] Viacom appealed to the Second Circuit Court of Appeals in August 2011, which reversed the earlier decision. In April 2013, the district court again ruled in favour of YouTube, holding that YouTube could not possibly have known about the copyright infringements and was protected under the DMCA. Viacom began the process of a second appeal, but before the date of the hearing the parties negotiated a settlement in March 2014.[14]
Matthew Herrick v. Grindr LLC[15]
Plaintiff Herrick alleged that his ex-boyfriend set up several fake profiles on Grindr (a dating app for the LGBTQ community) that impersonated him. Over a thousand users responded to the impersonating profiles, and Herrick’s ex-boyfriend, pretending to be Herrick, would direct the men to Herrick’s workplace and home.
The impersonating profiles were reported to Grindr (the app’s operator), but Herrick claimed that Grindr did not respond other than to send an automated message. Herrick sued Grindr, accusing the company of negligence, intentional infliction of emotional distress, false advertising, and deceptive business practices, arguing that Grindr was liable to him for allowing him to be impersonated and “turned into an unwitting beacon for stalkers and harassers”[16] because of the defective design of the app and its failure to police such conduct.
The Court rejected Herrick’s contention that Grindr is not an “interactive computer service” as defined in the CDA. With respect to the products liability, negligent design and failure-to-warn claims, the court found that they were all predicated upon content provided by another user of the app. Any assistance, including the algorithmic filtering, aggregation and display functions that Grindr provided, was “neutral assistance” available to good and bad actors on the app alike.
The court also highlighted that choosing to remove content or to let it stay on an app is an editorial choice, and finding Grindr liable based on its choice to let the impersonating profiles remain would be finding Grindr liable as if it were the publisher of that content.
An appeal against the court’s ruling has been filed with the Second Circuit Court of Appeals.
European Union
E-commerce Directive
Articles 12 to 15 of Directive 2000/31/EC of 8 June 2000 on electronic commerce require the member states of the EU to establish defences, under both civil and criminal law, for the benefit of certain types of online intermediaries.[17] Directive 2001/29/EC on Copyright in the Information Society (as to copyright) and Directive 2004/48/EC on the Enforcement of Intellectual Property Rights (as to other intellectual property rights) require EU member states to give rights holders the ability to seek an injunction against online intermediaries whose services are used by a third party to infringe an intellectual property right.
Articles 12 to 15 of Directive 2000/31/EC constitute the primary framework governing intermediary liability, incorporating a notice-and-takedown system for intermediaries to abide by. Articles 12 to 14 categorise intermediaries into “mere conduits”, “caching” services and “hosting” services, while Article 15 provides that intermediaries have no general obligation to actively monitor the information they transmit or store for illegal activity.
The General Data Protection Regulation (“GDPR”), which came into effect on 25 May 2018,[18] is aligned with Directive 2000/31/EC. Article 2(4) of the GDPR reads:
“This Regulation shall be without prejudice to the application of Directive 2000/31/EC, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive.”
Recital 21 of the GDPR[19] reads as follows:
“This Regulation is without prejudice to the application of Directive 2000/31/EC of the European Parliament and of the Council, in particular of the liability rules of intermediary service providers in Articles 12 to 15 of that Directive. That Directive seeks to contribute to the proper functioning of the internal market by ensuring the free movement of information society services between Member States.”
Directive on Copyright in the Digital Single Market COM/2016/0593 final - 2016/0280 (COD) (EU Copyright Directive)
In September 2016, after several years of public consultation, the EU Commission proposed a new directive to update the existing copyright framework.[20] Since then, several negotiations and amendments have been incorporated into the proposal, and a final text[21] was agreed upon by the EU Parliament and Council on 13 February 2019.[22]
Two provisions in the proposed EU copyright directive, in particular, raise red flags: Article 11 and Article 13.
Article 11 grants publishers the right to request payment from online platforms that share their stories. This provision has been called the “link tax”, as it gives publishers the right to demand paid licences when online platforms and aggregators such as Google News share their stories.[23] The Article excludes “uses of individual words or very short extracts of a press publication” from its purview.[24]
The more problematic provision of the proposed directive, however, is Article 13, which makes “online content sharing service providers”[25] liable for copyright infringement in content uploaded by their users. The proposed directive withdraws, for copyright-protected user generated content, the “safe harbour” protection such providers enjoy under the EU e-commerce directive. To escape liability, these services must enter into licence agreements and make best efforts to obtain authorisation for hosting copyright protected content; make best efforts to ensure the unavailability of protected content (which will likely result in the use of upload filters)[26]; and implement a notice and takedown mechanism that also prevents future uploads.
This effectively means that intermediaries will have to proactively monitor and pre-screen all the content that users upload. This degree of monitoring is not possible manually and can only be handled by automated filters, which are far from perfect and can be easily manipulated. For example, YouTube’s “Content ID” system is notorious for over-removing innocent material.[27] Article 13 would turn intermediaries into the content police and would hamper the free flow of information on the Internet.[28] There is also the problem of dedicated infringers finding ways around content filters, and the possibility of automated tools making errors, especially in cases of fair use such as criticism, reviews and parodies.[29]
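To see why automated pre-screening misfires in both directions, consider this toy sketch of fingerprint-based filtering (this is not YouTube’s actual Content ID, whose internals are proprietary; the reference database and its entry are hypothetical). Exact matching lets trivially altered copies through, which pushes real filters toward fuzzy matching, and fuzzy matching is precisely what sweeps in criticism, reviews and parody, since a filter sees matching content, not its legal context.

```python
# A toy fingerprint-based upload filter over a hypothetical reference
# database of rights-holder fingerprints.
import hashlib

REFERENCE_FINGERPRINTS = {
    hashlib.sha256(b"frames of a protected film").hexdigest(): "Example Studio",
}

def screen_upload(upload):
    """Return 'block' or 'allow' for an uploaded byte string."""
    fingerprint = hashlib.sha256(upload).hexdigest()
    owner = REFERENCE_FINGERPRINTS.get(fingerprint)
    if owner is not None:
        # Blocks even a lawful fair-use excerpt: the filter cannot see context.
        return "block: matches work registered by %s" % owner
    return "allow"

print(screen_upload(b"frames of a protected film"))   # blocked, fair use or not
print(screen_upload(b"frames of a protected film!"))  # one byte changed: allowed
```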
The proposed directive is scheduled for voting before the European Parliament either in late March or mid-April of 2019.
Terrorist Content Regulation[30]
On 12 September 2018, the European Commission released the draft ‘Regulation on Preventing the Dissemination of Terrorist Content Online’, which requires tech companies and online intermediaries to remove “terrorist content” within one hour of it being flagged to the platform by law enforcement authorities or Europol.[31] The proposal needs to be backed by the member states and the EU Parliament before it can be passed into law.
Websites that fail to take immediate action will be liable to pay fines, and systematic failure to comply will invite penalties of up to four percent of the company’s global turnover in the last financial year (similar to fines under the GDPR).[32] The proposal requires proactive measures, including automated detection, to effectively and swiftly detect, identify and expeditiously remove or disable terrorist content and to stop it from reappearing once it has been removed. It also recommends a human review step before content is removed, so as to avoid the unintended or erroneous removal of content that is not illegal.[33]
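As a rough illustration of the compliance mechanics (all names and values here are hypothetical sketches of the draft’s one-hour deadline and recommended human review step, not text from the regulation):

```python
# A minimal sketch of the removal workflow contemplated by the draft
# regulation: a one-hour window running from the authority's flag, with a
# human review step before removal to avoid erroneous takedowns.
from datetime import datetime, timedelta, timezone

REMOVAL_WINDOW = timedelta(hours=1)

def handle_flag(content_id, flagged_at, reviewer_confirms, now):
    """Decide the platform's action on a law-enforcement flag."""
    deadline = flagged_at + REMOVAL_WINDOW
    if reviewer_confirms:
        action = "remove %s" % content_id
    else:
        action = "keep %s: reviewer found it lawful" % content_id
    if now > deadline:
        action += " [past the one-hour deadline: fine exposure]"
    return action

flagged = datetime(2019, 3, 9, 12, 0, tzinfo=timezone.utc)
print(handle_flag("video-7", flagged, True, flagged + timedelta(minutes=45)))
```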
These EU draft legislations, namely the proposed copyright directive and the terrorist content regulation, point towards a shifting trend in European countries, with governments wanting to hold online intermediaries more accountable and responsible for illegal user generated content on their platforms.
In both cases, copyright and terrorist content alike, the EU has suggested through these legislations the use of automated tools for content filtering, which may lead to over-compliance by intermediaries (to ring-fence themselves against liability), private censorship and a resultant dilution of free speech rights on the Internet.
Case Studies
Delfi v. Estonia (2015)[34]
The judgment in this case brings to light fascinating issues of both human rights and the law governing intermediary liability in the EU, making it one of the most important judgments of recent times with respect to intermediary liability.
Delfi is one of the biggest online news portals in Estonia. Readers may comment on its news stories, and Delfi operates a system to regulate unlawful content within a notice and takedown framework. In January 2006, Delfi published a news article about how a ferry company, SLK, had destroyed the ice roads that connected Estonia’s mainland to its islands. The article attracted one hundred and eighty five user generated comments, of which about twenty were viewed as offensive and threatening towards the company’s sole shareholder, L. L demanded that the comments be removed and claimed damages; Delfi removed the comments but refused to pay. The matter passed through various lower courts before it reached the Supreme Court in June 2009, which held that Delfi had a legal obligation to prevent unlawful and illegal content from being posted on its website, since it was the publisher of the comments along with their original authors, and that it was therefore not protected by EU Directive 2000/31/EC. Further, the court stated that defamatory speech is not covered by the right to freedom of expression.
Aggrieved by the Supreme Court’s judgment, Delfi moved the European Court of Human Rights (ECHR). The question before the ECHR was whether the decision to hold Delfi liable was an unreasonable and disproportionate restraint on Delfi’s freedom of expression under Article 10[35] of the Convention for the Protection of Human Rights and Fundamental Freedoms.
The ECHR was called upon to strike a balance between freedom of expression under Article 10 of the Convention and the preservation of the personality rights of third persons under Article 8 of the same Convention.[36] In 2013, in a unanimous judgment, Delfi lost the case at the ECHR, and the matter was thereafter brought before the Grand Chamber. On 16 June 2015, the Grand Chamber upheld the decision of the Fifth Section of the ECHR, asserting that the liability imposed on Delfi was justified and proportionate because:
(1) The comments in question were outrageous and defamatory, and had been posted in response to an article published by Delfi on its professionally managed online news portal, which is of a commercial nature; and
(2) Delfi had failed to take sufficient steps to remove the offensive remarks without delay, and the modest fine of EUR 320 imposed on it was not a disproportionate sanction.[37]
The decision was criticized by digital and civil rights activists for running against Directive 2000/31/EC, which protects intermediaries from liability for user generated content, and for its effect on freedom of expression online. It also set a worrying precedent that could change the dynamics of free speech on the Internet and intermediary liability, and was condemned for the Court’s fundamental lack of understanding of the role of intermediaries.
Magyar Tartalomszolgáltatók Egyesülete (“MTE”) and Index.hu Zrt (“Index”) v. Hungary (2016)[38]
After the controversial Delfi judgment, which many considered a setback for online free speech and for the protection of intermediaries from liability for third party generated content, the European Court of Human Rights delivered another landmark judgment, this time ruling the other way.
The parties, MTE and Index, are a Hungarian self-regulatory body of Internet content providers and a news website respectively. The two had featured an opinion piece on the unethical business practices of a real estate company, which garnered a number of resentful comments from readers. In response, the real estate company sued MTE and Index for infringing its right to a good reputation. The Hungarian courts declined to apply the safe harbour principles under Directive 2000/31/EC, stating that they apply only to commercial transactions of an electronic nature, i.e., purchases made online. In their view the comments were made in a personal capacity, outside the ambit of economic or professional undertakings, and hence did not qualify for safe harbour protection.
The matter was taken to the European Court of Human Rights. In a 2016 ruling that was considered an enormous step forward for the protection of intermediaries from liability and for online free speech, the Court held that requiring intermediaries to regulate content posted on their platforms “amounts to requiring excessive and impracticable forethought capable of undermining freedom of the right to impart information on the Internet.”[39] The Court also declared that the rulings of the Hungarian courts were contrary to Article 10 of the Convention for the Protection of Human Rights and Fundamental Freedoms.[40]
Right to Be Forgotten in the EU
The GDPR came into force on May 25, 2018, repealing the 1995 Data Protection Directive. It is meant to harmonize data privacy laws across Europe, protect data privacy of EU citizens and provide a comprehensive data privacy framework for organizations that collect and process data.[41]
Article 17 of the GDPR provides for the Right to Erasure, or the Right to be Forgotten. This is a development from the Data Protection Directive (Directive 95/46/EC), which did not mention the term, although it was implicit in Articles 12 and 14. The grounds under Article 17 of the GDPR are detailed and broader than those provided in the 1995 Data Protection Directive. The data subject has the right to demand erasure of information concerning her in the following cases:
•the personal data is no longer necessary for the purposes for which it was collected or processed;
•the data subject withdraws consent;
•there has been unlawful processing of the data;
•the data subject objects on the grounds under Article 21(1) or Article 21(2) of the GDPR;[42]
•EU or national law requires erasure of the data; and
•the data was collected in relation to information society services offered to a child under Article 8(1).[43]
The Article also provides for situations in which the Right to be Forgotten will not be applicable. The grounds are:
•exercise of the right of freedom of expression and information;
•public interest and public health;
•when the processing is a legal obligation;
•for archiving purposes with respect to public interest, scientific, historical research, or statistical purposes; and
•exercise or defence of legal claims.
However, the RTBF under the GDPR is plagued with several problems, namely:
•Disproportionate Incentives: The infrastructure in place for the Right to be Forgotten is heavily tilted towards the right to privacy and away from informational rights and freedom of speech and expression. It provides unbalanced incentives to the Controller, causing Controllers to over-comply and favour delisting in order to protect themselves. Article 83(5) provides for fines as high as EUR 20,000,000 or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher (a worked example of this ceiling appears after this list). The Google Transparency Report, which provides anonymized data on Right to be Forgotten requests beginning May 29, 2014, shows that of all the URLs evaluated for removal, 44.2% had been removed as of February 2019.[44]
•Procedural Problems: Under both the Google Spain ruling and the GDPR, search engines are the initial adjudicators before whom data subjects file RTBF requests. This is similar to the intermediary liability takedown procedure, and the same difficulties arise: these questions involve a delicate balance between competing rights, and private companies should not be the entities making that determination. Publishers generally do not have the right to approach courts under the GDPR regime. This leads to a clear tilt in the system towards the data subject’s privacy rights rather than the freedom of speech and expression of the content writer or publisher.[45]
•Hosting Platforms as Controllers: While it is settled that search engines are Controllers, there is a lack of clarity on whether hosting platforms have RTBF obligations for user content, and this has not been resolved by the 2016 Regulation. It is probable that hosting platforms will process Right to be Forgotten requests to avoid liability and the risk of being brought within the definition of Controller, with all the obligations that come with it.
•Applicability of the E-commerce Directive[46] to Intermediaries: Article 2(4) of the GDPR states that the GDPR applies without prejudice to the 2000 E-commerce Directive, in particular Articles 12 to 15, which pertain to intermediary liability. Intermediaries may thus face dual liability under both data protection law and intermediary liability law wherever the two overlap.
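The Article 83(5) ceiling flagged in the first bullet above is simply the higher of two figures; a short worked example:

```python
# Worked example of the GDPR Article 83(5) maximum fine for an undertaking:
# the higher of EUR 20,000,000 or 4% of total worldwide annual turnover of
# the preceding financial year.
def max_fine_eur(worldwide_annual_turnover_eur):
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A firm with EUR 1 billion in turnover faces a ceiling of EUR 40 million,
# twice the EUR 20 million floor; below EUR 500 million the floor applies.
assert max_fine_eur(1_000_000_000) == 40_000_000
assert max_fine_eur(100_000_000) == 20_000_000
```

It is this asymmetry, a very large downside for under-removal against no comparable penalty for over-removal, that tilts Controllers toward delisting.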
EU cases on Right to Be Forgotten
Google v. Spain[47] (2014)
The landmark case of Google v. Spain before the Court of Justice of the European Union read the Right to be Forgotten into Articles 12 and 14 of the Data Protection Directive, specifically with respect to the delisting of search results by search engines, and laid down several important principles in this regard. The complainant, one Mr. Costeja Gonzalez, filed a case against Google for showing search results relating to the auction of his property for the recovery of social security debts, which had taken place more than a decade earlier and had been published in the Spanish newspaper La Vanguardia. He wanted Google to delist these links as the information was no longer relevant and harmed his reputation.
The following questions arose during the proceedings of the case:
(1) Whether search engines are ‘Processors/Controllers’ of data?
Google argued that it is neither the Processor nor the Controller of the data: not the Processor because it does not discriminate between personal data and general data in its activities, and not the Controller because it exercises no control over the data.[48] The Court rejected this reasoning. Google was held to collect, record, retrieve, organize, store, disclose and make data available to the public, which falls within the definition of processing; the fact that the data has already been published and is not altered makes no difference.[49] The Court also held that because the search engine exercises control and determines the purposes and means of the activities it undertakes during processing, it is the Controller with respect to those activities, and cannot be excluded merely because it exercises no control over the personal data on the websites of third parties. The Court further emphasized that entering a person’s name into a search engine and obtaining all information pertaining to that person enables profiling of that individual.[50] It was held to be irrelevant that publishers possess the means to block search engines from accessing their data; the duty on search engines is separate from that of the publishers of the data.[51]
(2) What are the duties of the search engine operator under the 1995 Data Protection Directive?
Google argued that, under the principle of proportionality, the publishers of the websites must take the call on whether information should be erased, as they are in the best position to make this determination and to take further action for the removal of such information. Google further contended that its fundamental right to free speech and expression, along with that of the publisher, would be negatively affected if it were asked to delist such links, and that the informational rights of Internet users would also be under threat.
The Court once again emphasized the role of search engines in the profiling of data subjects and the threat this poses to individuals’ right to privacy. It explained that the processing of data cannot be justified solely by the economic interests of the search engine; the rights of other Internet users must also be considered. The rights of the data subject and those of other Internet users must be balanced by considering factors such as the nature of the information, its sensitivity for the data subject’s private life, the role of the data subject in public life and the public interest.[52] The Court also noted that, because of the ease of replicating data on the Internet, information may spread to websites over which the court has no jurisdiction. It may therefore not be an effective remedy to mandate parallel erasure of the data by both the publisher and the search engine, or to require erasure from the publisher’s website first. There may also be situations where the data subject has the Right to be Forgotten against the search engine but not against the publisher (e.g., where the data is processed solely for journalistic purposes[53]).
(3) Scope of data subjects’ rights under the Data Protection Directive
The question referred to the court was whether the data subject can exercise his Right to be Forgotten on the grounds that the data is prejudicial or that he wishes that the data be deleted after a reasonable time.
Google submitted that only where the processing violates the Data Protection Directive, or where there are compelling legitimate grounds particular to the data subject’s situation, should the individual be allowed to exercise the Right to be Forgotten.
The Court held that data collected could be lawful initially but may, in the course of time, become irrelevant, inaccurate, inadequate or excessive with respect to the purpose for which it was collected.[54] The Court also stated that the data sought to be erased need not be prejudicial to the data subject.[55]
Google v. Equustek[56] (2017)
In 2011, Equustek Solutions Inc. filed a lawsuit in Canada against its distributor, Datalink Technologies, claiming that Datalink had illegally obtained Equustek’s trade secrets and other confidential information. Datalink had allegedly begun to pass off Equustek’s products as its own by re-labelling them, and had also started selling competing products built using Equustek’s trade secrets. In response, Equustek procured several interlocutory injunctions against Datalink. Datalink, however, disregarded the orders, moved its operations out of the jurisdiction and continued its business.
In 2012, Equustek requested Google to de-index Datalink’s websites from its search results. Google voluntarily blocked more than three hundred web pages from Google Canada but refused to do the same on an international scale.
The matter came before the British Columbia Supreme Court, which ruled that Google had to remove all of Datalink’s web domains from its global search index, essentially a global takedown order. Google appealed to the Supreme Court of Canada, contending that the order was against the right to freedom of speech and expression. In a landmark 7-2 ruling, the Supreme Court upheld the lower court’s worldwide takedown order, which required Google to delist Datalink’s websites and domains from its global search index.
The ruling received widespread criticism from civil rights organizations and Internet advocates for violating the free speech rights of Internet users, and raised the question whether a country can enforce its laws in other countries to limit speech and access to information.
Google, Inc. v. Commission nationale de l’informatique et des libertés (CNIL)[57] (2018)
Google was once again involved in a long legal wrangle, this time with the French data protection authority, the Commission nationale de l’informatique et des libertés, commonly referred to as CNIL.
In this case, CNIL had ordered Google to delist certain items from its search results. Google complied with the order and delisted the concerned articles from its European Union domains (google.fr, google.de, etc.), but the delisted results remained available on the “.com” and other non-European extensions. Subsequently, in May 2015, a formal injunction was issued against Google by the CNIL chair, ordering the search engine to extend the delisting to all “Google Search” extensions within fifteen days.[58] On its failure to comply with the injunction, Google was fined EUR 100,000.
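The gap between what Google did and what CNIL demanded is easy to see in a sketch of domain-scoped delisting (the domain list and delisted URL below are illustrative only, not Google’s actual implementation):

```python
# A minimal sketch of delisting scoped to EU country extensions: the
# delisted URL is suppressed on google.fr but still served on google.com.
EU_EXTENSIONS = {"google.fr", "google.de", "google.es"}  # illustrative subset
DELISTED = {"https://example.com/old-article"}           # hypothetical entry

def filter_results(results, serving_domain):
    """Drop delisted URLs only when serving an EU extension."""
    if serving_domain in EU_EXTENSIONS:
        return [url for url in results if url not in DELISTED]
    return results

hits = ["https://example.com/old-article", "https://example.com/other"]
print(filter_results(hits, "google.fr"))   # delisted URL suppressed
print(filter_results(hits, "google.com"))  # still visible: CNIL's objection
```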
Google appealed the order to France’s highest administrative court, the Conseil d’État, contending that a power to censor web results globally would seriously impair freedom of speech and expression and the right to access information. It also argued that French authorities have no right to enforce their orders worldwide, and that doing so would set a dangerous precedent for other countries.
The French court referred the case to Europe’s highest court, the Court of Justice of the European Union (CJEU), for answers to certain legal questions and a preliminary ruling before deciding the case itself. Arguments were heard in September 2018 and judgment is awaited.
The Court published the Advocate General’s opinion in January, which stated that de-referencing search results on a global basis would undermine freedom of speech and expression:[59]
“(T)here is a danger that the Union will prevent people in third countries from accessing information. If an authority within the Union could order a global de-referencing, a fatal signal would be sent to third countries, which could also order a de-referencing under their own laws. … There is a real risk of reducing freedom of expression to the lowest common denominator across Europe and the world.”
Google v. CNIL highlights the incompatibility between principles of territorial jurisdiction and global data flows.[60]
By Siddharth Dalmia
The Startup Sherpa
+91-9971799250
dalmiasiddharth1994@gmail.com
[1] The Copyright Modernization Act 2012 (Canada), also accessible at Appendix [.]
[2] Office of Consumer Affairs (OCA), Notice and Notice Regime, INNOVATION, SCIENCE AND ECONOMIC DEVELOPMENT CANADA (Mar 9, 2019, 1:48 PM), https://ic.gc.ca/eic/site/oca-bc.nsf/eng/ca02920.html, also accessible at Appendix [.]
[3] The South Korea Copyright Act 1957, § 103, also accessible at Appendix [.]
[4] Digital Millennium Copyright Act 1998, § 512(c), also accessible at Appendix [.]
[5] Christian Ahlert, Chris Marsden and Chester Yung, How Liberty Disappeared from Cyberspace: The Mystery Shopper Tests Internet Content Self Regulation, THE PROGRAMME IN COMPARATIVE MEDIA LAW AND POLICY, UNIVERSITY OF OXFORD, http://pcmlp.socleg.ox.ac.uk/wp-content/uploads/2014/12/liberty.pdf, also accessible at Appendix [.]
[6] Ibid.
[7] U.S. Copyright Office, Section 512 Study, COPYRIGHT.GOV, https://www.copyright.gov/policy/section512/, also accessible at Appendix [.]
[8] Adam Holland, Chris Bavitz, Jeff Hermes, Andy Sellars, Ryan Budish, Michael Lambert and Nick Decoster, Berkman Center for Internet & Society at Harvard University, Intermediary Liability in the United States, PUBLIXPHERE, https://publixphere.net/i/noc/page/OI_Case_Study_Intermediary_Liability_in_the_United_States, also accessible at Appendix [.]
[9] 665 F. Supp. 2d 961, accessible at Appendix [.]
[10] Thomas Dart, Sheriff of Cook County v. Craigslist, Inc., 665 F. Supp. 2d 961, accessible at Appendix [.]
[11] No. 07 Civ. 2103, 2010 WL 2532404 (S.D.N.Y. 2010), accessible at Appendix [.]
[12] Viacom v. YouTube, ELECTRONIC FRONTIER FOUNDATION, https://www.eff.org/cases/viacom-v-youtube, also accessible at Appendix [.]
[13] Miguel Helft, Judge Sides With Google in Viacom Suit Over Videos, NEW YORK TIMES, http://www.nytimes.com/2010/06/24/technology/24google.html?_r=0, also accessible at Appendix [.]
[14] Jonathan Stempel, Google, Viacom settle landmark YouTube lawsuit, REUTERS, http://www.reuters.com/article/us-google-viacom-lawsuit-idUSBREA2H11220140318, also accessible at Appendix [.]
[15] 17-CV-932 (VEC), also accessible at Appendix [.]
[16] Andy Greenberg, Spoofed Grindr Accounts Turned One Man’s Life Into a ‘Living Hell’, WIRED, https://www.wired.com/2017/01/grinder-lawsuit-spoofed-accounts/, also accessible at Appendix [.]
[17] Trevor Cook, Online Intermediary Liability in the European Union, 17 JOURNAL OF INTELLECTUAL PROPERTY RIGHTS 157-159 (2012), also accessible at Appendix [.]
[18] The History of the General Data Protection Regulation, EUROPEAN DATA PROTECTION SUPERVISOR, https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en
[19] A copy of the GDPR can be downloaded from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32016R0679, also accessible at Appendix [.]
[20] Hayleigh Bosher, Keeping up with the Copyright Directive, IPKITTEN, https://ipkitten.blogspot.com/2019/02/keeping-up-with-copyright-directive.html, also accessible at Appendix [.]
[21] Proposal for a Directive of the European Parliament and of the Council on Copyright in the Digital Single Market, JULIA REDA, https://juliareda.eu/wp-content/uploads/2019/02/Copyright_Final_compromise.pdf, also accessible at Appendix [.]
[22] Ibid.
[23] Matt Reynolds, What Is Article 13? The EU’s Divisive New Copyright Plan Explained, WIRED, https://www.wired.co.uk/article/what-is-article-13-article-11-european-directive-on-copyright-explained-meme-ban, also accessible at Appendix [.]
[24] Christiane Stuetzle and Patricia C. Ernst, European Union: The EU Copyright Directive Hits The Homestretch, MONDAQ (9 Mar, 2019, 2:26 PM), http://www.mondaq.com/unitedstates/x/786366/Copyright/The+EU+Copyright+Directive+Hits+The+Homestretch, also accessible at Appendix [.]
[25] This includes online intermediaries who store and give access to a large amount of copyright protected content or other protected content uploaded by their users. It specifically excludes non-profit online encyclopedias, non-profit educational and scientific repositories, open source software developing and sharing platforms, online marketplaces, B2B cloud services and cloud services for users. Kindly refer to Article 2(5) of the proposed copyright directive.
[26] Christiane Stuetzle and Patricia C. Ernst, European Union: The EU Copyright Directive Hits The Homestretch, MONDAQ, http://www.mondaq.com/unitedstates/x/786366/Copyright/The+EU+Copyright+Directive+Hits+The+Homestretch, also accessible at Appendix [.]
[27] Cory Doctorow, Artists Against Article 13: When Big Tech and Big Content Make a Meal of Creators, It Doesn’t Matter Who Gets the Bigger Piece, ELECTRONIC FRONTIER FOUNDATION (9 Mar, 2019, 2:26 PM), https://www.eff.org/deeplinks/2019/02/artists-against-article-13-when-big-tech-and-big-content-make-meal-creators-it, also accessible at Appendix [.]
[28] Ibid.
[29] Ibid.
[30] European Commission, Proposal for a Regulation of the European Parliament and of the Council on Preventing the Dissemination of Terrorist Content Online, EUROPEAN COMMISSION (9 Mar, 2019, 2:35 PM), https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preventing-terrorist-content-online-regulation-640_en.pdf, also accessible at Appendix [.]
[31] Ibid.
[32] Ibid.
[33] Ibid.
[34] Delfi v. Estonia, 64569/09, ECtHR (2015), also accessible at Appendix [.]
[35] Council of Europe, European Convention on Human Rights, EUROPEAN COURT OF HUMAN RIGHTS, https://www.echr.coe.int/Documents/Convention_ENG.pdf, also accessible at Appendix [.]
[36] Giancarlo Frosio, The European Court of Human Rights Holds Delfi Liable for Anonymous Defamation, CENTRE FOR INTERNET AND SOCIETY, STANFORD LAW SCHOOL, http://cyberlaw.stanford.edu/blog/2013/10/european-court-human-rights-holds-delfiee-liable-anonymous-defamation, also accessible at Appendix [.]
[37] HUDOC - European Court of Human Rights, HUDOC (Feb 10, 2019, 6:25 PM), http://hudoc.echr.coe.int/eng?i=001-126635, also accessible at Appendix [.]
[38] Application no. 22947/13, also accessible at Appendix [.]
[39] Daphne Keller, New Intermediary Liability Cases from the European Court of Human Rights: What Will They Mean in the Real World?, STANFORD LAW SCHOOL CENTER FOR INTERNET AND SOCIETY, http://cyberlaw.stanford.edu/blog/2016/04/new-intermediary-liability-cases-european-court-human-rights-what-will-they-mean-real, also accessible at Appendix [.]
[40] Council of Europe, European Convention on Human Rights, EUROPEAN COURT OF HUMAN RIGHTS, https://www.echr.coe.int/Documents/Convention_ENG.pdf, also accessible at Appendix [.]
[41] The European Union General Data Protection Regulation, http://www.eugdpr.org/
[42] European Commission, Article 21 - Right to Object, EUROPEAN COMMISSION (9 Mar, 2019, 2:55 PM), http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf, accessible at Appendix [.]
[43] European Commission, Article 8 - Conditions Applicable to Child’s Consent in Relation to Information Society Services, EUROPEAN COMMISSION (9 Mar, 2019, 2:58 PM), http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf, accessible at Appendix [.]
[44] Search Removals under European Privacy Law, Google Transparency Report, GOOGLE, https://transparencyreport.google.com/eu-privacy/overview?hl=en, also accessible at Appendix [.]
[45] Daphne Keller, The Right Tools: Europe’s Intermediary Liability Laws and the 2016 General Data Protection Regulation, STANFORD LAW SCHOOL CENTER FOR INTERNET AND SOCIETY, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2914684, also accessible at Appendix [.]
[46] Electronic Commerce Directive 2000/31/EC, also accessible at Appendix [.]
[47] C-131/12, also accessible at Appendix [.]
[48] Para 22 of the Google Spain v. AEPD and Mario Costeja Gonzalez decision.
[49] Ibid. at Paras 28, 29.
[50] Ibid. at Para 37.
[51] Ibid. at Paras 39, 40.
[52] Ibid. at Para 81.
[53] Ibid. at Paras 84, 85.
[54] Ibid. at Para 93.
[55] Ibid. at Para 99.
[56] 2017 SCC 34, also accessible at Appendix [.]
[57] C-507/17, also accessible at Appendix [.]
[58] Right to be delisted: the CNIL Restricted Committee imposes a €100,000 fine on Google, CNIL, https://www.cnil.fr/en/right-be-delisted-cnil-restricted-committee-imposes-eu100000-fine-google, also accessible at Appendix [.]
[59] ‘Right to be forgotten’ by Google should apply only in EU, says court opinion, THE GUARDIAN, https://www.theguardian.com/technology/2019/jan/10/right-to-be-forgotten-by-google-should-apply-only-in-eu-says-court
[60] Michele Finck, Google v CNIL: Defining the Territorial Scope of European Data Protection Law, THE GUARDIAN (Feb 7, 2019, 2:00 PM), https://www.theguardian.com/technology/2019/jan/10/right-to-be-forgotten-by-google-should-apply-only-in-eu-says-court