What should Indian govt do to curb child sex abuse material? Experts speak

An ad-hoc committee set up by the Rajya Sabha to tackle child pornography has put forth 40 recommendations.

A few days ago, Vice President M Venkaiah Naidu expressed concern over the issue of child pornography – child sexual abuse material (CSAM) – and said that he hopes the 40 recommendations made by an ad-hoc committee to tackle CSAM are discussed in Parliament. The committee, set up by the Rajya Sabha and headed by MP Jairam Ramesh, submitted its report on the matter on January 25 to Naidu, who is also the Rajya Sabha Chairman.

The report deals with two main issues: children's access to pornographic material on social media, and the circulation of CSAM on social media. It has suggestions on several fronts – amendments to the Protection of Children from Sexual Offences (POCSO) Act and the Information Technology (IT) Act, recommendations for intermediaries (social media companies like Facebook) and Internet Service Providers (ISPs, like Airtel and Jio), as well as educational and state-level measures.

Over the last few years, there has been increasing scrutiny of CSAM on WhatsApp, Facebook, the dark web, and other online platforms. With the government seeking to deal with CSAM, TNM looked at the ad-hoc committee’s recommendations, what experts have to say about some of its pertinent points, and the steps India should be taking to tackle the issue.

Adding cyber-grooming to POCSO

A significant recommendation is that a definition of cyber-grooming be added to the POCSO Act. This would allow grooming – that is, establishing and building a relationship with a child, either in person or online, so as to facilitate online or offline sexual contact with the child – to be specifically recognised as a crime in India.

Vidya Reddy of Tulir, a Chennai-based organisation working to prevent child sexual abuse, points out, however, that under section 67B (c) of the IT Act, grooming is already criminalised.

Exemption for sexting – a double-edged sword?

Another recommendation is to include a provision to safeguard minors engaged in sexting and exchanging explicit selfies from being penalised under the POCSO Act. Experts have long pointed out that POCSO does not take into account teen sexuality – teenagers engage in sexual acts and behaviour, often without the knowledge that these can be deemed criminal.

The report notes that POCSO should be amended so that minors cannot be prosecuted for CSAM if the child takes, stores or exchanges "indecent images" of oneself with another minor; the images don’t show an act that is a serious criminal offence; and no person in the image is more than two years younger than the individual. However, not all the committee members agreed with this – some said that there shouldn’t be such an exception at all, while others felt that the exception should apply only to those between 16 and 19 years of age.
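To make the proposed conditions concrete, here is a minimal sketch of them expressed as a simple check. This is purely illustrative – the function and field names are assumptions for the example, and the exact statutory language of any amendment would govern in practice:

```python
# Illustrative sketch of the committee's proposed sexting exemption.
# The function and field names are assumptions for this example; the
# actual statutory language of any amendment would govern.

def exemption_applies(subject_age: int, ages_in_image: list[int],
                      shows_serious_offence: bool) -> bool:
    """Check the report's three conditions for the minor-sexting exemption."""
    is_minor = subject_age < 18                     # POCSO: a child is under 18
    no_serious_offence = not shows_serious_offence  # image must not depict a serious crime
    # No person in the image may be more than two years younger than the subject.
    within_two_years = all(subject_age - age <= 2 for age in ages_in_image)
    return is_minor and no_serious_offence and within_two_years

# Nitish's example (quoted further below) of a 15- and an 18-year-old couple:
# the 18-year-old is not a minor, and the 3-year gap also exceeds two years.
print(exemption_applies(18, [15, 18], False))  # False
print(exemption_applies(16, [15, 16], False))  # True
```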

Vidya supports the exception for sexting under POCSO for those below 18. However, she says the cut-off should not extend to 19-year-olds, and should remain congruent with the definition of a child as per POCSO and section 67B of the IT Act – under 18 years. Otherwise, it will cause a lot of confusion for implementing agencies. 

Nitish Chandan, a cyber-security specialist who has conducted multiple investigations into CSAM on WhatsApp and Telegram, among other applications, also favours the exception. However, he cautions that it is a double-edged sword.

“For instance, what about a couple where the individuals are 15 and 18 respectively? The two-year gap condition could prove to be arbitrary in some cases. Similarly, you could also be groomed and coerced into sharing your explicit photos by a peer of the same age,” Nitish says. “There is also the issue of revenge porn. We don’t know what unintended consequences legalising exchange of such material may result in.”

Companies unlikely to break end-to-end encryption

There is a recommendation to modify the IT (Intermediaries Guidelines) Rules, 2011 to include a provision for breaking end-to-end encryption on platforms in order to trace the originator or sender of a message if CSAM is being shared.

However, end-to-end encryption is a major selling point for companies like WhatsApp, especially with growing concerns over privacy and surveillance. Experts are not sure if a recommendation like this will materialise.

Even in the unlikely event that it does, Siddharth Pillai of Aarambh, an online portal that works on internet safety and combats CSAM, points out that there are always other encrypted apps and platforms that people can migrate to. “By regulating some platforms, you cannot solve the problem of CSAM circulation altogether,” he says.

Privacy vs child safety?

It is not as though child safety experts do not understand the concerns about privacy. What they oppose is a trade-off that picks privacy over child safety.

A coalition of child protection organisations and experts from across the world recently wrote to Facebook about their reservations against the company’s proposal to introduce end-to-end encryption on all its messaging platforms – which would then include Messenger as well. The more than 100 signatories include the Indian child protection organisations Tulir and Arpan.

Last year, Facebook said that it removed 11.6 million images of child abuse from the platform. However, experts have said multiple times that the social media giant needs to do more, and that CSAM is circulated widely on the platform.

John Carr, secretary of the UK-based Children’s Coalition on Internet Safety and a member of the Executive Board of the UK Council for Child Internet Safety, told TNM that a crucial way to limit the spread of CSAM is for social media companies to use technical tools to detect known illegal images, possibly even before they are seen by anyone other than the person who posted them. “And that person can be identified and reported to the police,” he says.

The best-known tool of this kind, John points out, is PhotoDNA, developed by Microsoft. PhotoDNA is already used by Facebook and its subsidiary WhatsApp, and by countries like Australia, the US and the UK, though its use is not mandatory. A company can scan the images on its platform and cross-reference them against the PhotoDNA database of digital fingerprints (hashes) of previously reported child exploitative imagery.
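PhotoDNA’s own algorithm and interfaces are proprietary, but the general approach – fingerprint each uploaded image and compare the fingerprint against a list drawn from previously reported images – can be sketched. Here is a minimal illustration using the open-source imagehash library as a stand-in; the blocklist contents, threshold and function names here are assumptions for the example, not PhotoDNA’s actual values:

```python
# Minimal sketch of hash-based known-image matching, in the spirit of
# PhotoDNA. PhotoDNA itself is proprietary; this stand-in uses the
# open-source perceptual hash from 'imagehash' (pip install pillow imagehash).
from PIL import Image
import imagehash

# Hypothetical blocklist: hashes of previously reported images. In a real
# deployment these would come from a vetted database, not be hard-coded.
KNOWN_HASHES = [imagehash.hex_to_hash("fedcba9876543210")]

# Hamming-distance threshold below which two hashes are treated as the same
# image; 5 is an illustrative assumption, not a PhotoDNA parameter.
MATCH_THRESHOLD = 5

def matches_known_image(path: str) -> bool:
    """Return True if the image at `path` matches a blocklisted hash."""
    h = imagehash.phash(Image.open(path))  # 64-bit perceptual hash
    return any(h - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_known_image("upload.jpg"))
```

A comparison like this can only run on content the platform can read; once messages are end-to-end encrypted, the platform never sees the image, which is the point John makes next.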

“However, the problem is none of these tools will work if you use encryption. Encryption blinds them (the companies). Why would any company intentionally want to deprive itself of the ability to detect this kind of illegal behaviour? There is no good answer to that question,” John says.

In their letter to Facebook, of which John is also a signatory, the organisations argue that concerns of privacy and data protection are valid, but that it is not a fair trade-off to put children in harm’s way “either as a result of commercial decisions or design choices.”

Protection for those investigating CSAM, storing it for reporting

The committee suggests that individuals downloading, storing or possessing CSAM for the purpose of mandatory reporting (under Section 19 of POCSO) should be given protection.

This, Vidya cautions, is a huge loophole. She gives the example of Pete Townshend, the guitarist of the English band The Who, who was arrested in 2003 for using his credit card to access a website that had CSAM. He was let off because he argued he was doing ‘research’ for his autobiography – he said that he was also a victim of child sexual abuse.

But Nitish says that this could be a good move, considering it is not always possible to rely on big companies and law enforcement in a country as big as ours. However, there must be strict rules such as a time limit on storing CSAM if it is for the purpose of reporting, exceeding which a person can be held liable. “There should be Standard Operating Procedures (SOPs) for destroying such material as well,” he says.  

The committee also recommended that NGOs and activists who want to investigate sites to find abusers of children should be allowed to do so with the approval of a nodal agency. If they are found misusing this privilege, they will be liable to strict action.

“This is one of the biggest lacunae in the report,” Vidya says. “We know that there are people who seek out jobs that specifically give them access to sexually abuse children and CSAM. This provision could give all those perpetrators the opportunity to masquerade in its guise.”

If one looks at travelling child sexual offenders (TCSOs) – people who travelled to other countries and sought jobs as NGO workers solely for the purpose of abusing children, like Richard Huckle, Ernest Macintosh and Paul Meekin, to name a few – it is clear that the fears Vidya expresses are backed by considerable empirical evidence.

On regulating intermediaries

Among the many recommendations for intermediaries is the suggestion that they be required to report CSAM, as well as information relating to missing children on their platforms, to Indian state and Central governments, and not just to foreign authorities. They could also be required to make these reports to Indian authorities on a regular monthly or quarterly basis.

Siddharth says that these moves could help hold big companies like Facebook accountable in India, in line with reporting practices elsewhere. Nitish agrees, adding, “Otherwise, many of these companies headquartered in the US just report to the American National Centre for Missing and Exploited Children (NCMEC) while Indian authorities don’t have as much access.”

Concerns about surveillance

The committee has recommended that ISPs also monitor and take down CSAM – meaning that service providers like Airtel or Jio may have to proactively identify, remove and block CSAM content and websites, and can be held liable if they don’t. This is another move that experts think is impractical.

“ISPs don’t really have access to the content that is circulated on the internet and platforms. They could have access to your search history – but that could practically amount to surveillance if there is no transparency. Further, it is very unlikely that they will be able to remove content,” Nitish says. That being said, he adds, “With proper guidelines from the Ministry of Home Affairs or the National Commission for Protection of Child Rights (NCPCR), if ISPs are given a list of keywords and websites that they can look out for, then it may be beneficial.”
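What a website list for ISPs could mean in practice is something like DNS- or URL-level filtering against a supplied blocklist. Here is a minimal sketch, assuming a hypothetical blocklist file with one domain per line; real ISP filtering happens in network infrastructure, and would need the transparency safeguards Nitish alludes to:

```python
# Minimal sketch of ISP-side filtering against a hypothetical supplied
# blocklist of domains (one per line). Purely illustrative; the file name
# and format are assumptions for this example.
from urllib.parse import urlparse

def load_blocklist(path: str) -> set[str]:
    """Read blocked domains from a file, skipping blanks and comments."""
    with open(path) as f:
        return {line.strip().lower() for line in f
                if line.strip() and not line.startswith("#")}

def is_blocked(url: str, blocklist: set[str]) -> bool:
    """Block a URL if its host, or any parent domain, is on the list."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    # Match sub.example.com as well when example.com itself is listed.
    return any(".".join(parts[i:]) in blocklist for i in range(len(parts)))

blocklist = load_blocklist("blocked_domains.txt")  # hypothetical file
print(is_blocked("https://sub.example.com/page", blocklist))
```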

Another recommendation is that the Ministry of Electronics and Information Technology mandate the development of a screen-monitoring app for devices, to be made freely available to ISPs, companies, schools and parents. However, Nitish argues that if an app with access to your screen is made mandatory in India, it could be a “privacy disaster”.

On awareness, education and a national portal

Other recommendations in the report include steps to increase international cooperation, initiatives to bolster education and awareness, and so on. However, Vidya says these should fall under the aegis of the Human Resource Development Ministry, as is usual, rather than the Women and Child Development (WCD) Ministry as the report recommends.

Further, the report suggests that “NCPCR be designated as the nodal agency to deal with the issue of child pornography.”

“The NCPCR is just an extension of the Central government right now. The issue of CSAM is a police matter; it should not be handled with soft gloves. It is a law matter, a Home Ministry matter, an HRD matter and should remain so,” Vidya asserts.

India playing a ‘catch-up game’

In the last paragraph, the report says that Prime Minister Narendra Modi should take the lead in building a global political alliance to combat CSAM on social media. However, there are existing alliances that India should be part of, Vidya says.

“India is still playing a catch-up game. We are not part of any of the international initiatives that have been tackling CSAM for many years – like the WePROTECT Global Alliance, the Virtual Global Taskforce and so on. Many countries have been doing great work in a collaborative manner for the past 8-9 years to curb CSAM,” she says.

She also points out that while the report looks at children’s access to pornography on social media, this should not be conflated with CSAM in the same breath, as the report does.

“We live in a sexualised world. It’s not just the internet – children are accessing pornography elsewhere too. The recommendations say that those providing children access to pornography should be criminalised as well. Look at the ‘item numbers’ in films. They are also meant to titillate. Are we going to criminalise those who give children access to these as well?” She adds that instead, children should be educated about the problems of the pornography industry.

Siddharth, while appreciating the committee’s multi-stakeholder approach, adds that the report says nothing about rehabilitating victims of CSAM.
