How can WhatsApp act against child abuse material in encrypted chats? A report suggests ways

A recent report by Cyberpeace Foundation was compiled after studying groups on WhatsApp and Telegram, in some of which CSAM was being shared.

After studying groups disseminating adult pornography as well as Child Sexual Abuse Material (CSAM), the Cyberpeace Foundation has released a report with suggestions on how end-to-end encrypted (E2EE) platforms like WhatsApp and Telegram can bolster reporting of CSAM on their platforms without impinging on user privacy.

While most countries and commentators “have taken an either-or approach to propose solutions, where it is either monitoring of content and invasion of user privacy [...],” the report says, its researchers Akshata Singh, Nitish Chandan, Raj Pagariya, Sachet Sahni, Shipra Sahu and Srushti Iyer have suggested reporting mechanisms to tackle CSAM on E2EE platforms while maintaining user privacy.

According to a report compiled by the US-based National Center for Missing & Exploited Children (NCMEC), India is at the top of the list with 19.87 lakh reports of CSAM. While it is sought and shared on open platforms like social media and on the dark web too, the entry of E2EE platforms has led to wider proliferation of CSAM.

E2EE is a result of the push for user privacy. However, the lack of adequate reporting mechanisms for CSAM makes these platforms fertile ground for its dissemination as well. Experts have earlier pointed out to TNM that tools do exist to check for CSAM, but E2EE renders these tools blind.

Findings

The researchers investigated pornography groups on WhatsApp and Telegram, two instant messaging apps that focus on privacy by providing E2EE, and the proliferation of CSAM on them between June 2020 and July 2020.

On WhatsApp, the researchers identified 1,299 adult pornography groups, studied 29 of them in depth, and found over 100 instances of CSAM. What’s more, none of the 15 groups reported through multiple channels was removed, and only four of the 29 reported users were banned.

On Telegram, 350 adult pornography channels were identified, 283 were studied in depth, and 23 instances of CSAM were found. Of these, 171 channels were removed after being reported through multiple avenues.

What E2EE platforms can do

WhatsApp allows you to report a group from within the chat or from the help-contact option in the application. However, there is no way to directly or specifically report CSAM on either WhatsApp or Telegram, though the platforms do explicitly prohibit usage for transmission of objectionable content including CSAM.

The report suggests that for an individual user to effectively report CSAM on WhatsApp and Telegram, the companies will have to standardise reporting and make it CSAM-specific, alongside legislative change or a regulatory policy.

The researchers recommend, firstly, that an end user be able to report CSAM directly by tagging it in the app. The platform should then create a hash value of the reported media, which can be cross-referenced against known CSAM using PhotoDNA, a tool developed by Microsoft.

PhotoDNA has a database of previously reported child exploitative imagery. A hash is a unique digital signature of an image; PhotoDNA creates these signatures and uses them to match images against previously identified illegal material. “PhotoDNA is an incredible tool to help detect, disrupt and report the distribution of child exploitation material,” its website states.
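
Conceptually, this hash-and-match step can be sketched as below. PhotoDNA’s actual algorithm is a proprietary perceptual hash, so the snippet uses a plain SHA-256 digest purely as a stand-in, and the register, function names and return messages are hypothetical illustrations of the report’s suggested flow, not any platform’s real API.

```python
import hashlib

# Hypothetical stand-in for a hash register such as PhotoDNA.
# PhotoDNA uses a proprietary perceptual hash; SHA-256 is used here only
# to illustrate matching a reported item against known hashes.
KNOWN_CSAM_HASHES: set[str] = set()  # hashes of previously verified material

def hash_image(image_bytes: bytes) -> str:
    """Compute a digest of the reported media."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_media(image_bytes: bytes) -> str:
    """Cross-reference only the reported item against the register,
    leaving the rest of the encrypted chat untouched."""
    if hash_image(image_bytes) in KNOWN_CSAM_HASHES:
        return "match: notify authorities and action the uploader"
    return "no match: inform the reporting user and offer an advanced export"
```

The point of the design, as the report frames it, is that only a fingerprint of the single reported item is compared against the register; the rest of the encrypted conversation is not exposed.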

When such a report is received, Indian law requires that child sexual abuse and CSAM be reported to the authorities, and action can be taken against the uploader. “This also furthers the principle of removing/actioning known CSAM on platforms without violating any user privacy,” the report says.

However, if no hash match is found in databases like PhotoDNA, the Cyberpeace Foundation suggests that the reporting user be informed of the same. The platform can then allow the user, through an advanced chat export feature, to upload only the objectionable media and not the whole chat, and report it again.

If this material is verified as CSAM, the authorities can be notified, and if the CSAM is new, its hash value can be added to a register like PhotoDNA. If the content is not found to be CSAM, the reporting user should be notified and can still file a complaint with the authorities directly.
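
This escalation path, for the case where the register returns no match, might look roughly like the sketch below. The verification step and all names are hypothetical; the report leaves it to platforms and authorities to decide how new material is actually reviewed.

```python
import hashlib

def handle_unmatched_report(media_bytes: bytes, is_verified_csam, hash_register: set) -> str:
    """Illustrative escalation path for a report that found no hash match.
    `is_verified_csam` stands in for whatever human/legal review the
    platform and authorities apply; it is not a real API."""
    if is_verified_csam(media_bytes):
        # New CSAM: notify the authorities and add its hash to the register
        hash_register.add(hashlib.sha256(media_bytes).hexdigest())
        return "authorities notified; new hash added to the register"
    # Not CSAM: inform the reporting user, who can still approach the authorities directly
    return "not CSAM; reporting user notified"
```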

In a statement, WhatsApp noted that they were working with the Cyberpeace Foundation to respond to CSAM. "WhatsApp has zero tolerance for this heinous abuse," a WhatsApp spokesperson said. "We appreciate our partnership with the Cyberpeace Foundation and we banned abusive users and groups they flagged for us. We are working closely with CPF to confront this scourge and are constantly improving our capabilities to prevent and respond to abuse."

A domestic hash register?

The report also recommends that India set up a hash register – either by participating in PhotoDNA, whose services are already used by several countries such as the US, Australia and the UK – or by creating its own. “This hash register for CSAM/rape videos must also be made accessible to members of the civil society so as to enable them to report content that they might come across in the course of their work relating to CSAM,” it says.

Presently, WhatsApp does use PhotoDNA, but only non-encrypted information, such as a group’s public display picture, can be run through the database, as the chats themselves are encrypted. If a user or group profile draws a match from the PhotoDNA database, WhatsApp bans the uploader and all group members.

Other suggestions to bolster CSAM reporting on E2EE platforms include setting up a national tip-line, which would be the focal point for mandatory reporting under section 19 of the Protection of Children from Sexual Offences (POCSO) Act, as well as for reporting of CSAM by intermediaries, including E2EE platforms. It could also potentially help manage the national hash register and coordinate with NGOs, civil society and relevant agencies.
