Inside WhatsApp groups with child sex abuse content: What a cyber specialist found

Nitish Chandan simply ran an internet search for apps that had been removed from app stores for having child sexual abuse material. What he found was shocking.

Two weeks ago, TechCrunch published an article on how two Israeli NGOs had found third-party apps that were allowing people to find and join WhatsApp groups sharing child sexual abuse material (CSAM) and enabling commercial sexual exploitation of children (CSEoC). As a result, at least five such apps were removed from the Google Play Store.

However, an Indian cyber security specialist has now found that these apps are still available online and, worse, are still broadcasting CSAM. Nitish Chandan, founder of The Cyber Blog India and project manager at Cyber Peace Foundation, decided to run a simple internet search for the third-party apps mentioned in the TechCrunch article that led to WhatsApp groups broadcasting CSAM. He found that while the apps had been removed from app stores, their APK files (the installable packages for Android) were readily available online.

It is noteworthy that, according to anti-harassment algorithm start-up AntiToxin, these apps were sustained by ads run through Google and Facebook’s advertising networks, TechCrunch reported. Not only had Facebook and Google been lax in monitoring the removal of these apps, but even WhatsApp moderators had not flagged and removed these groups.

To investigate further, Nitish downloaded one of these applications, ‘Unlimited Whats Groups.’

“I had no idea what was on these apps or the groups that they pointed to. What the app did offer was a button to search, and before I would turn on the investigator mode to use some catchphrases and jargon for CSAM, I thought of just typing “child.” Yes, this search led to hundreds of groups like Only Child Pornography, Cool Child Pornography Group, Gay Kids Sex Only, [names withheld] etc.,” Nitish noted.

He found that most of these groups had reached the maximum number of members and were unable to accept more participants. What's more, alarmingly, the groups were not even trying to conceal their purpose. Their profile photos, names and descriptions were all sexual, and their content included images of children, solicitations and offers like “video chat with children at Rs 500 for 10 minutes and sexual intercourse at Rs 5,000.”

The groups also lacked any sort of screening for people who wanted to join. This is unlike the CSAM forums and groups that operate on the dark web (the part of the World Wide Web that is not indexed, and hence not accessible through commonly used browsers), which can be members-only, are wary of accepting new members and make an attempt to conceal their purpose.

Nitish told TNM that while several of these groups seem to be hosted by US numbers, most descriptions are in Hindi, and a significant number of the group members are Indian, with their real names and profile photos on their WhatsApp profiles. “On numbers that are not Indian, some of their profile pictures are images of Indian boys and girls most likely on virtual numbers of the United States. The rest of the user majority is from Pakistan, the Middle East and the United States,” Nitish observed.

Even the messages sent on the groups made no attempt to veil their illegal and disturbing activities. “One group that stood out in this investigation was called [name withheld]. This was one group with the maximum number of participants who were Indian, created using a virtual number from the United States and the biggest aggregator of CSAM where people readily posted messages like “CP (child porn) Bhejo,” “CP send karo,” “Bacho ke video dikhao” (Send CP, show children’s videos) etc. And videos did follow as well,” Nitish said.  

“Proactive investigation into these groups is needed in order to bring in deterrence against consumption of CSAM content which is an offence in India,” he added.

Nitish also told TNM that while there seem to be plenty of such groups on the app, the apps are not exclusively dedicated to CSAM. They also have WhatsApp groups for hacker services, children’s gaming groups, mobile wallet cashback and so on. “These applications are essentially aggregators,” he explained.

Further, considering that many of the numbers on these groups appear to be US-based, India may face jurisdictional hurdles in taking action against these people. However, given that most of the group descriptions were in Hindi and the groups had Indian members, there is a good chance that the admins with non-Indian numbers are based in India but are using technology like VPNs to conceal their locations and IP addresses to avoid detection.

“Upon further investigation, I found that the US numbers being used are all virtual numbers. They have been purchased from legitimate websites. So, law enforcement can request data on who took these numbers. Even if they are using VPNs, the VPN service providers are likely to comply and reveal information to track the users once they know that the case is about CSAM and child exploitation,” Nitish said.

He has sent his findings along with screenshots to law enforcement officials as well as the Ministry of Home Affairs. The latter has assured action.

WhatsApp reached out to TNM on January 11 with their response on the matter. A WhatsApp spokesperson said, “WhatsApp has a zero-tolerance policy around child sexual abuse. We deploy our most advanced technology, including artificial intelligence to scan profile photos and actively ban accounts suspected of sharing this vile content. We also respond to law enforcement requests in India and around the world. Sadly, because both app stores and communications services are being misused to spread abusive content, technology companies must work together to stop it.”

Editor's note: Some group names have been withheld to avoid hampering the investigation by the Ministry of Home Affairs.

(Screenshots courtesy Nitish Chandan)

This story has been updated. 
