Shreya Singhal v Union Of India Offers Us A North Star For Intermediary Liability
Devdutta Mukhopadhyay And Sidharth Deb
1 April 2020 10:32 AM IST
It has been more than five years since the Supreme Court's landmark decision in Shreya Singhal v Union of India. In Part 1 of this three-part retrospective, we analyse the Court's decision to strike down Section 66A of the Information Technology Act, 2000 ("IT Act"), and the deficits which subsist in terms of actual enforcement. Part 2 revisits the inadequacy of the Supreme Court's approach in deliberating the constitutionality of website and application blocking under the IT Act. In Part 3, we take a look at the judgement's views on intermediary liability. The analysis in this article considers the legal basis and rationale of the Court's findings and also traces the subsequent evolution of the debate. As you will read through the course of this article, this aspect of the judgement is prescient through the lens of future internet governance and regulation in India.
Background to Intermediary Liability under the IT Act
Briefly, the Court heard arguments on the constitutionality of Section 79 of the IT Act and the Information Technology (Intermediary Guidelines) Rules, 2011 ("Intermediary Guidelines"). This framework grants intermediaries exemptions from liability subject to certain conditions. Intermediaries are defined under Section 2(1)(w) of the IT Act. The definition is broad, and the term is wide enough to encompass different service providers including social networks, internet service providers, cloud service providers, search engines and even cyber cafes.
The liability exemptions for intermediaries are conditional on: (a) the intermediary only hosting, storing or disseminating third party content; (b) the intermediary not initiating the transmission, selecting the receiver or modifying the information contained within the transmission; and (c) the intermediary observing due diligence requirements prescribed under the IT Act and the aforementioned Intermediary Guidelines.
Section 79(3)(b) of the IT Act also states that the liability exemptions for intermediaries do not apply in certain circumstances. Specifically, the provision is worded such that the liability exemption does not apply if the intermediary fails to disable public access to unlawful material upon "receiving actual knowledge", or on being notified by an appropriate government authority/agency, that its resources are being used for unlawful activities. Notably, the intermediary must also ensure that evidence of the said unlawful activities is not destroyed.
In this context, Rule 3 of the Intermediary Guidelines, 2011 imposes certain due diligence obligations on intermediaries. Rule 3(2) mandates intermediaries to publish rules and regulations, a user agreement or terms and conditions which impress upon users not to host, upload, modify, publish or transmit specific types of prohibited content. Select examples of this prohibited content include:
Content which harms minors;
Content which infringes intellectual property rights;
Impersonation of another individual;
Content which is grossly harmful, harassing, blasphemous, defamatory, obscene, pornographic, paedophilic, libellous, invasive of another's privacy, hateful, or racially, ethnically objectionable, disparaging, relating or encouraging money laundering or gambling, or otherwise unlawful in any manner whatever;
Content which threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order, or causes incitement to the commission of any cognisable offence, or prevents investigation of any offence, or is insulting any other nation;
Activities which violate any law for the time being in force.
The aforementioned thresholds for unlawful activities/content are extremely broad. The petitioners in the Shreya Singhal case were especially concerned by the conditions set out in Rule 3(4) of the Intermediary Guidelines. Rule 3(4) states that any intermediary, upon obtaining knowledge "by itself" or receiving "actual knowledge by an affected individual" about any information mentioned in Rule 3(2), must act within 36 hours to disable such prohibited information. In addition, stemming from the evidence preservation requirement mentioned in Section 79(3)(b), Rule 3(4) of the Intermediary Guidelines requires intermediaries to preserve unlawful information and related records for a minimum of 90 days to support investigation.
Arguments on Constitutionality and Court's Decision
The petitioners argued against Rules 3(2) and 3(4) on two primary grounds. First, they argued that Rule 3(4) requires intermediaries to exercise their own judgement as to the legality of certain types of content, even though the legal scheme requires them to operate as neutral platforms through which persons interact with each other over the internet. Second, the petitioners argued that Rule 3(2) should be struck down as it is incompatible with the thresholds for reasonable restrictions under Article 19(2) of the Constitution. In particular, they stated that the prohibited materials under Rule 3(2) suffer from vagueness and overbreadth along the lines of Section 66A of the IT Act. For more texture on the void for vagueness principle and the doctrine of overbreadth, please see our earlier analysis in Part 1 of this series.
The petitioners also challenged the wording of Section 79(3)(b) of the IT Act. Since intermediaries were expected to exercise their own judgement in determining the legality of certain activities, speech or content on their own platforms, there was a risk of a chilling effect. The Court noted that, as it stood, India's intermediary liability framework fell foul of the right to free speech and expression guaranteed under Article 19(1)(a) and the thresholds for reasonable restrictions on speech and expression within Article 19(2).
Justice Nariman observed that the intermediary liability framework lacked sufficient safeguards. He illustrated this by highlighting the requirement for an intermediary to apply its own mind in determining whether information should or should not be blocked. In particular, he lamented that there was no mechanism for a reasoned legal order, unlike even the website blocking framework prescribed under Section 69A of the IT Act and the Rules thereunder.
Justices Nariman and Chelameswar therefore held that Section 79(3)(b) of the IT Act and Rule 3(4) of the Intermediary Guidelines must be read down such that "actual knowledge" is only obtained when an intermediary receives either a court order or a notification from an appropriate government authority asking it to expeditiously remove or disable access to certain material. The Court further clarified that any such court order and/or government notification must strictly conform to the restrictions on speech permitted under Article 19(2). Any unlawful acts outside the scope of Article 19(2) restrictions cannot be used to prohibit content under Section 79. Disappointingly, the Court did not use this standard to strike down certain content prohibition provisions under Rule 3(2) of the Intermediary Guidelines, even though they also suffer from overbreadth and void for vagueness.
Most notably, Justice Nariman justified the Court's decision by illustrating the problem with the framework in the context of large internet platforms like Google and Facebook. He highlighted that if the legal determination were left at the doorstep of intermediaries, millions of content removal requests would inundate these platforms. This would likely translate into an environment where platforms err on the side of caution to limit their liability risks, creating an overarching chilling effect. Citing practices in countries like Argentina, the Supreme Court ostensibly brought India's notice and takedown safe harbour regime for intermediaries in line with global thresholds.
So Is There Anything The Judgement Could Have Done Better?
Of course, while this was a move in the right direction, it is pertinent to consider India's safe harbour standards for intermediary liability vis-a-vis international human rights law. Tellingly, one of the biggest weaknesses of the Shreya Singhal judgement is that it never clarified which body is the appropriate government authority under the applicable law that can forward content takedown requests to intermediaries.
In this regard, it is useful to refer to the views of both Frank La Rue and David Kaye, the previous and current UN Special Rapporteurs for the Promotion and Protection of the Right to Freedom of Opinion and Expression, respectively. For instance, in 2011 Special Rapporteur La Rue noted that any legal request to an intermediary to prevent access to certain content should only be made through an order issued by a court or a competent body which is independent of political, commercial or other unwarranted influences.
Similarly, in 2018 Special Rapporteur Kaye endorsed the Manila Principles on Intermediary Liability. Through it, he stated that in order to comply with international human rights law, countries should ensure that intermediaries are not required to restrict content unless an order has been issued by an independent and impartial judicial authority determining that the material at issue is unlawful. These thresholds clearly indicate that the decision of Justices Nariman and Chelameswar not to strike down the ability of government authorities to issue content takedown orders is one of the shortcomings of the Shreya Singhal judgement. Moreover, the judgement also fails to clarify the nature of the orders a government authority can issue under Section 79 of the IT Act.
The Relevance of Shreya Singhal Today to MeitY's Proposed Intermediary Amendments
At the same time, experts nonetheless hold the Shreya Singhal judgement's treatment of intermediary liability in high regard. For instance, Ms Daphne Keller recently lauded the judgement's emphasis on ensuring that the responsibility of determining whether certain activities or speech are unlawful is not delegated to private actors. Reasonably, she even juxtaposes the judgement's progressive leanings in this regard against the Union Ministry of Electronics and Information Technology's ("MeitY's") recent attempts at walking back this principle.
For context, in December 2018, MeitY published a draft amendment to the Intermediary Guidelines which was subject to a public consultation. Two major complaints emerging from the proposed amendments can be neatly compared against key principles outlined in the Shreya Singhal case. The first concerns the amendments' implications for informational privacy, and the second concerns general monitoring obligations and collateral censorship.
First, the intermediary liability framework was meant to apply in the context of prohibited speech and prohibited content. This is evident when one reads the relevant portions of Justice Nariman's judgement with respect to Section 79 and the Intermediary Guidelines. Therefore, the proposed requirement of traceability of the originator of unlawful content [under Rule 3(5)] intersects with the right to privacy and end-to-end encrypted communications services. We argue that this is outside the scheme of these provisions and should be dealt with under other provisions like Section 69 of the IT Act, which deals with the decryption of computer resources.
On a related note, IFF's litigation team is involved in a Transfer Petition before the Supreme Court which has attached several matters (from various High Courts including the Madras High Court) on a range of privacy related issues, including traceability and the linkage of Aadhaar with social media accounts. Notably, in those proceedings, MeitY filed an affidavit before the Supreme Court indicating that the intermediary amendments were likely to be published by January 15, 2020.
Second, the proposed amendment suggests the imposition of a mandatory obligation on intermediaries to build and deploy automated content moderation tools which can proactively identify and disable public access to unlawful content. This directly clashes with the observations of Justice Nariman in the Shreya Singhal case. Moreover, we can refer to the joint dissenting opinion in the Grand Chamber judgment of the European Court of Human Rights in Delfi AS v Estonia (Application no. 64569/09, decided on June 16, 2015). The dissenting judges observed that adopting a constructive knowledge approach, which may impose a general monitoring obligation on intermediaries to filter out unlawful content, skews incentives. Such pre-filtering obligations may lead to self-censorship: platforms would prefer developing tools which lean towards over-censoring in order to avoid risks of legal liability. They cite Jack Balkin's conception of collateral censorship and observe:
"Governments may not always be directly censoring expression, but by putting pressure and imposing liability on those who control the technological infrastructure (Internet service providers, etc.), they create an environment in which collateral or private-party censorship is the inevitable result."
This is especially evident where laws mandate that intermediaries deploy content filtering tools. There is a likelihood that firms will develop some form of software or algorithm in order to comply. However, such algorithms, when fully automated, struggle to factor in context or account for the multifaceted complexities associated with speech. A recent example which illustrates the weaknesses in such systems arose when social media platforms shifted to work-from-home arrangements and relied more heavily on AI-based automated content moderation systems, which malfunctioned and led to an over-censorship of COVID-19 related news posts. These instances highlight the need for the Government to remain cautious about imposing self-censorship obligations on intermediaries. A good north star which can remind decision makers of this imperative is the Shreya Singhal case, which outlines the legal basis for the concerns with the recently proposed amendments.
Views Are Personal Only.
You can help support work that safeguards internet freedom in India by becoming an IFF member today.