Citing data from 'Pornhub' (purportedly the world's largest pornographic site), the India Child Protection Fund (ICPF) said that 'traffic to child pornographic websites from India increased by 95% between March 24 and 26, as compared to average traffic before the lockdown'. ICPF is a recently established NGO, managed by Nobel Laureate Kailash Satyarthi's son, Bhuvan Ribhu, that aims to fund efforts to curb the exploitation of children. This disturbing trend emerged after the website announced on 24 March that it would offer free premium content to users across countries.
More worrisome news broke recently when one of the members of the 'Bois Locker Room', an Instagram group of teenagers, leaked its contents: the members had shared obscene images of girls over the internet and were allegedly planning to rape one of the girls.
In India, though viewing adult pornography in private is not an offence, seeking, browsing, downloading or exchanging child pornography is an offence punishable under the IT Act, 2000 with imprisonment of up to five years. However, Internet Service Providers (ISPs) are exempt from liability for any third-party data if they do not initiate the transmission and only provide access for communication, or host or store it temporarily. This, in fact, makes it very difficult for law enforcement agencies to monitor, detect and take down pornographic material.
Before discussing the adequacy of the initiatives taken by the Indian government and the importance of ISPs playing a proactive role in checking child sexual abuse material (CSAM), two successful models that have been in use for more than two decades are worth mentioning.
The American model of mandatory reporting by ISPs
The National Center for Missing and Exploited Children (NCMEC), created in 1984 as a private, non-profit organisation, operates as a national clearinghouse on issues related to missing and exploited children in the US. It receives funds from various sources, including the federal government. The CyberTipline, a program of the NCMEC since 1998, serves as a national reporting mechanism and clearinghouse for tips and leads about possible child sexual exploitation in the US. It receives reports via online reporting forms and telephone.
In 2008, the federal PROTECT Our Children Act designated the NCMEC's CyberTipline as the contact agency for reporting information about child pornography offences. In addition, ISPs were mandated to report the identity and location of the individual suspected of violating the law, as well as the suspected child sexual abuse images. Specifically, the suspected images are to be treated as evidence, to be later turned over to law enforcement agencies by the NCMEC. The NCMEC and the ISPs are granted limited immunity from civil and criminal liability so long as 'good faith' attempts are made to report all suspected incidents of child pornography. Also, the NCMEC may notify the ISPs to stop or block the transmission of CSAM on the Internet.
The extent of the problem can be judged from the fact that, in 2018, the CyberTipline received more than 18 million reports from US-based companies about US-hosted CSAM.
The UK's model of collaboration with self-regulated ISPs
The Internet Watch Foundation (IWF) is a non-profit organisation established by the UK's Internet industry in 1996 to ensure a safe online environment for users, with a particular focus on CSAM. Its mission includes disrupting the availability of CSAM and removing such content hosted in the UK. In addition to responding to reports of possible online CSAM, the IWF began proactive services in 2014, engaging analysts to actively search for criminal content rather than relying only on inbound reports from external sources. As a result, the IWF hotline can remove more content in a more time-efficient way.
The UK does not explicitly require ISPs to report suspected CSAM to law enforcement or to a mandated agency; however, an ISP may be held responsible for third-party content if it hosts or caches that content on its servers.
According to the IWF's latest annual report, out of a total of 2,60,426 reports received in 2019, 1,47,450 (56.62%) were found proactively by IWF analysts, 1,08,773 (41.77%) came from the public, and the rest from other agencies. Notably, about 86% of the reports found proactively by the IWF were confirmed to contain CSAM, whereas only 11% of those reported by the public were. Thus, the IWF's technical analysts conducting proactive searches proved far more effective in identifying suspected CSAM. The 1,32,676 URLs which displayed CSAM appeared across 4,956 domains traced to 58 countries, with a maximum of 71% from one country alone, the Netherlands. In addition, 54 newsgroups were confirmed as containing CSAM.
Other international efforts
The International Association of Internet Hotlines (INHOPE), formed in 1999 and based in the Netherlands, is the leading network combating online CSAM. The network has expanded from 9 to 52 hotlines (including the IWF and the NCMEC) in 48 countries that provide the public with a way to anonymously report illegal content online, with a focus on CSAM. INHOPE provides a secure IT infrastructure, ICCAM (I 'See' Child Abuse Material), hosted by INTERPOL, that facilitates the exchange of CSAM reports between hotlines and law enforcement agencies. ICCAM is a tool that applies image/video hashing (fingerprinting) and crawling technologies to reduce the number of duplicate investigations.
In 2018, 1,55,240 CSAM-related reports (carrying 3,37,588 images) were exchanged via ICCAM, an increase of almost 80% over 2017, of which 2,26,999 (67%) images were assessed as illegal. Within INHOPE's network, the Canadian Centre for Child Protection and the IWF are currently active in the proactive search of CSAM online. The NCMEC does not use ICCAM in these instances because the content is already removed from the internet.
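The deduplication idea behind ICCAM's hashing and fingerprinting can be sketched in a few lines. The sketch below is illustrative only, not ICCAM's actual implementation: production systems use robust perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, whereas the plain cryptographic hash used here matches only byte-identical copies.

```python
import hashlib

# Illustrative sketch of hash-based deduplication: each file is reduced
# to a fixed-length digest, and a previously seen digest means the
# material has already been assessed by some hotline, so no duplicate
# investigation needs to be opened.
# NOTE: SHA-256 is used here only for illustration; real CSAM tools use
# perceptual hashes that tolerate re-encoding and resizing.

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying this exact file."""
    return hashlib.sha256(data).hexdigest()

seen: dict[str, str] = {}  # digest -> earlier assessment

def triage(data: bytes, assessment: str = "pending") -> str:
    """Record a new report, or flag it as a duplicate of a known one."""
    h = fingerprint(data)
    if h in seen:
        return f"duplicate (already assessed: {seen[h]})"
    seen[h] = assessment
    return "new - queue for analyst review"

print(triage(b"report-A", "illegal"))  # new - queue for analyst review
print(triage(b"report-A"))             # duplicate (already assessed: illegal)
```

Because every hotline computes the same digest for the same file, digests (rather than the images themselves) can be exchanged between agencies, which is what makes shared infrastructure like ICCAM effective at cutting duplicate work.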
India's initiatives so far
India, at best, can be called a beginner in acting against the menace of child pornography. The law is in place, but the Supreme Court (SC) watered it down in the Shreya Singhal (2015) case, when it read down section 79(3)(b) of the IT Act to mean that an intermediary (i.e., an ISP) shall remove or disable access to illegal content only upon receiving actual knowledge of a court order or on being notified by the appropriate government. Thus, ISPs are exempt from liability for any third-party information.
In the Kamlesh Vaswani case (WP(C) 177/2013), the petitioner sought a complete ban on pornography; the case is still pending in the SC. However, the Advisory Committee constituted under section 88, which was given the brief of the case, issued orders in March 2015 directing ISPs to disable a total of nine (domain) URLs that hosted content in violation of the morality and decency clause of Article 19(2) of the Constitution of India.
'Aarambh India', a Mumbai-based NGO, took an important initiative by partnering with the IWF and launching India's first internet hotline, www.aarambhindia.org, in September 2016 to report images and videos of child abuse online. These reports are assessed by the IWF's expert analysts, and offending URLs are added to its blocking list. Till 2018, out of 1,182 reports received at the portal (one of the 29 IWF reporting portals spread across four continents), only 122 (10.3%) were found to contain CSAM.
India does not have a hotline conforming to INTERPOL's requirements and therefore is not a member of INHOPE. Nor does India have the requisite infrastructure in place to proactively monitor, search for and remove CSAM hosted on the country's servers. It is worth noting that the UK hosted about 18% of global CSAM in 1996; after the IWF's intensive proactive-search work, that figure came down to just 0.1% in 2019.
The Ministry of Home Affairs (MHA), while implementing a scheme named the 'Indian Cyber Crime Coordination Centre' (I4C), launched a national cybercrime reporting portal, www.cybercrime.gov.in, in September 2018 for filing online complaints pertaining to child pornography (CP) and rape/gang rape (RGR). This facility was created in compliance with the SC's directions in a PIL filed by 'Prajwala', a Hyderabad-based NGO that rescues and rehabilitates sex trafficking survivors. However, as not many cases of CP/RGR were reported, the portal was extended, from 30 August 2019, to all types of cybercrime, with a special focus on cybercrime against women and children.
Further, the NCRB, on behalf of the MHA, signed a memorandum of understanding (MoU) with the NCMEC in April 2019 to receive CyberTipline reports so that punitive action could be taken against those who host CSAM through Indian ISPs. These reports provide the suspect's data as well as the location where the illegal content is hosted, and have helped the police take action against some offenders.
Jairam Ramesh Committee's recommendations
The chairman of the ad hoc committee of the Rajya Sabha, Jairam Ramesh, in his report of January 2020, made wide-ranging recommendations on 'the alarming issue of pornography on social media and its effect on children and society as a whole'.
On the legislative front, the committee recommended not only widening the definition of 'child pornography', making 'cyber grooming' and 'using misleading domain names' crimes, and applying a 'code of conduct for social media platforms', but also proactive monitoring, mandatory reporting to law enforcement agencies, and the taking down or blocking of CSAM by intermediaries (i.e., ISPs).
On the technical front, the committee has, inter alia, recommended: permitting the breaking of end-to-end encryption to trace the distribution of child pornography; deploying mandatory apps on all devices sold in India to monitor children's access to pornographic content; building partnerships with industry to develop AI tools for dark-web investigations; tracing the identity of users engaged in cryptocurrency transactions to purchase child pornography online; and liaising with financial service companies to prevent online payments for such purchases.
Other recommendations are institutional and socio-educational in nature and need continued efforts to make an impact.
Way forward
According to the ninth edition (2018) of the International Centre for Missing and Exploited Children's report, Child Sexual Abuse Material: Model Legislation & Global Review, more than 30 countries (including the USA, Australia, Canada, China, France, Italy, Switzerland, South Africa and Sri Lanka) now require mandatory reporting of CSAM by ISPs. Surprisingly, India also figures in this list, even though its law exempts ISPs from such mandatory reporting.
The Optional Protocol to the UN Convention on the Rights of the Child (CRC) that addresses child sexual exploitation, i.e., on the Sale of Children, Child Prostitution and Child Pornography, encourages State Parties to establish the liability of legal persons (impliedly, ISPs in this case). Similarly, the Council of Europe's Convention on Cybercrime and its Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse also require member states to address the issue of corporate liability.
Therefore, it is the need of the hour that the IT Act be amended to incorporate the mandatory reporting of suspected CSAM by ISPs.
The 2019 annual report of the IWF revealed that almost a third of CSAM was assessed as containing self-generated imagery (created using webcams and shared online), and that there was an increasing trend of abusing commercial website brands through a 'digital pathway' technique, which makes it more difficult to locate and investigate criminal imagery embedded in disguised websites. More hidden services were identified as being hosted within proxy networks, also called the dark web. Hidden services commonly contain hundreds of links to child sexual abuse imagery hosted on image hosts and cyberlockers on the open web. It is therefore suggested that India join INHOPE and establish its own hotline to utilise INTERPOL's secure IT infrastructure and take advantage of the technology already in use.
The other option would be to collaborate with ISPs and financial companies and follow the model of self-regulation. However, if India wants to seriously address the issue of child pornography, it must establish an independent facility like the IWF or the NCMEC (with government support) to start the proactive search and removal of CSAM.
The SC cannot be expected to step in every time a serious issue of child sexual exploitation comes to society's notice. The Jairam Ramesh committee's recommendations must be given serious follow-up. Cybercrime in India is generally under-reported and a victim of burking (suppression of complaints) at the police station level. We, at all levels, need to shore up our capacity to fight cybercrime in general, and child pornography in particular, which is spreading like a moral cancer in society.
Views are personal only.
(The author is a senior IPS officer in Chhattisgarh)