Madras HC suggests ‘moral education’ to tackle child porn: Why this doesn't work

The Madras High Court, while granting anticipatory bail to a man arrested for viewing child sexual abuse material, said that only ‘moral education’ and ‘Bharatiya culture’ can act as a bulwark to tackle the issue.
Child abuse representative image

The Madurai bench of the Madras High Court recently made a peculiar statement about tackling child sexual abuse material (CSAM, also referred to as child pornography). While granting anticipatory bail to a man who was arrested for viewing CSAM, the High Court said that the issue can be tackled only by inculcating the right values in people. Justice GR Swaminathan said, “The system also may not be able to prosecute every offender. Therefore, it is only through moral education, there can be a way out. It is only the Bharatiya (Indian) culture that can act as a bulwark. The menace of child pornography can be tackled only if all of us inculcate the right values.”

Differentiating between a one-time consumer of CSAM and people who possess or transmit CSAM for commercial purposes, the court pointed out that in the present case, the petitioner appears to fall in the former category and had cooperated with the investigation. The judge also noted that as soon as one steps into the digital domain, they come under the surveillance of either the state or those manning the social networking sites. In this case, the petitioner had sent the CSAM to his friend via Facebook Messenger. However, “the ‘Big Brother’ is watching us may not deter those who are determined to indulge in such acts of perversity,” Justice Swaminathan noted, and therefore stressed the need for moral education.

However, in providing a broad and oversimplified solution like ‘moral education’ to a problem as complex as CSAM and adults having a sexual interest in children, the court seems to have missed several nuances. Some studies have also pointed to a relationship between sustained possession and viewing of CSAM and the commission of actual sexual offences against children. “While the evidence that people who consume CSAM will go on to harm a child is arguably inconclusive, it is well documented that a significant number of convicted child sexual offenders were in possession of CSAM,” says Abhilash Nair, a legal expert based at Aston University, Birmingham, UK, and the author of The Regulation of Internet Pornography: Issues and Challenges.

Abhilash observes that saying moral education is the way to tackle CSAM lacks a comprehensive understanding of child sexual abuse online and offline. “If morality was to work as a primary argument, we would be living in a perfect society, where people would not be doing anything that is considered immoral. Narrowing the solution down to morality is a gross underestimation of the problem. It assumes that anyone seeking CSAM only has a morality problem.”

Possession, viewing of CSAM and causing harm to a child

While there are people who may not be seeking CSAM and view it in an isolated or momentary manner, there are those who seek out CSAM particularly because they have a sexual interest in children. A study published in 2014, based on a sample of 1,978 Swedish people between 17 and 20 years of age, found that 4.2% of them had viewed CSAM at least once. “Most theory-based variables were moderately and significantly associated with child pornography viewing and were consistent with models of sexual offending implicating both antisociality and sexual deviance,” the study found. These variables included “likely to have sex with a child aged 12–14, likely to have sex with a child 12 or less, perception of children as seductive, having friends who have watched child pornography,” among others.

Another study of 8,718 German men found that 4.1% of them had sexual fantasies involving pre-pubescent children, 3.2% reported sexual offending against pre-pubescent children and 0.1% reported a paedophilic sexual preference. Sexual fantasies involving prepubescent children were also found to be linked with committing sexual offences against children, most frequently, using CSAM.

This is not to say that everyone who views or is found to be in possession of CSAM will attempt or commit a sexual offence against a child. For instance, a study of 120 adult males convicted for offences related to indecent images of children found that 60 “had a previous contact child sexual offence (dual offenders) and 60 had no evidence of an offence against a child.” Another unpublished study from 2017 found that of the 372 men convicted of CSAM offences between 1995 and 2009, 81 had sexually abused a child offline, and 39 had tried to have online contact with a child. Of these, 20 had done both.

Experts say that regardless of whether a person who is in possession of CSAM will commit an act of sexual violence against a child, the potential threat of it is enough reason to tackle the issue.

“A majority of responders in India have no understanding of dynamics of sexual violence online or offline,” points out Vidya Reddy of Tulir – Centre for Prevention and Healing of Child Sexual Abuse. “There is a vast difference between people accessing CSAM online and sexually offending offline, with a huge common intersection as well. The court providing ‘moral education’ as a solution to this is ignorant of these aspects.” 

Identifying people who need therapeutic intervention

It is important to note that not everyone who has viewed CSAM has paedophilic disorder (a psychosexual disorder defined by having recurrent feelings of intense sexual arousal, fantasies, sexual urges or behaviours around sexual activity with a prepubescent child or children, usually under the age of 14) or will abuse children. However, according to ‘Child Molesters: A Behavioral Analysis’ by Kenneth V Lanning, a former Supervisory Special Agent at the Federal Bureau of Investigation (FBI), published by the National Center for Missing & Exploited Children (NCMEC), a “proposed sign” of having paedophilic disorder would be “use of child pornography in preference to adult pornography, for a period of six months or longer.”

Abhilash says that Justice Swaminathan correctly observed that prosecuting every single person found to be viewing or possessing CSAM is not the way out of the problem. However, understanding their consumption and use could lead to identifying people who need intervention and thus actually prevent abuse. It could be used to get treatment through psychotherapy for people who are subsequently diagnosed with paedophilia (sexual attraction to children below 10 years of age) or hebephilia (persistent sexual interest in children who are older than 11, and in their teenage years).

It is also important to note that sexual interest in children can never be justified and cannot be equated to sexual orientations like homosexuality, because of the inherent imbalance of power between an adult and a child, and the fact that the latter cannot give informed consent to sexual acts. Dr Ujjwal Nene, a sexual health consultant based in Pune, had told TNM that while the diagnosis of paedophilic disorder or hebephilic disorder is not a person’s fault, their behaviour and management of the impulses is in their control, and such persons should be encouraged to seek help.

Not all those with paedophilic disorder will sexually abuse children, and not all those who sexually abuse children have paedophilic disorder. However, by studying and understanding behaviours that lead to child sexual abuse, meaningful interventions can be made, preventing child sexual abuse rather than simply reacting to it.

Viewing CSAM is not a victim-less, contact-less offence

In the digital age, CSAM has become easily accessible and shareable, which adds to the problem. “Because of the easy accessibility, there is now a community of people who legitimise interest in CSAM and sexual interest in children. Earlier, your own self-inhibition would not have allowed you to view or seek it out. Now, that urge has become easier to rationalise,” Vidya says.

There is a school of thought that says that due to the prevalence of CSAM, people are getting desensitised to it, and over time, will seek more violent, hardcore material. “It normalises such behaviour not just for the offender, but also for the victim,” points out Abhilash. Lanning pointed this out as well – showing sexually explicit material to children, and collecting “child pornography or child erotica”, are two of the many traits to look out for in ‘preferential sex offenders’, i.e. those who systematically seek out children to sexually abuse.

One of the reasons these offenders collect CSAM is to show it to a child to lower his/her inhibitions. “A child who is reluctant to engage in sexual activity with an adult or pose for sexually explicit photographs can sometimes be convinced by viewing other children having “fun” participating in the activity,” Lanning wrote.  

Abhilash points out that in this sense, Justice Swaminathan was right to say that while viewing adult pornography in private is not an offence, the same cannot be said about viewing ‘child pornography’, even in private. “But if you are criminalising something, you need to educate people about why CSAM is unacceptable. It is not about morality alone – it is about the harm it causes to the child. It is about the possibility that someone who consumes CSAM may pose a risk to children. It is about the increase in the demand for CSAM doing harm to the society on the whole,” he says.

Both Abhilash and Vidya say that CSAM is not a contact-less or victim-less offence. “People think that by just watching CSAM, they aren’t harming a child because they aren’t the ones abusing them. However, every time you click on that media, you are abusing the child too,” Vidya says. “The child in that media is going through secondary harm and abuse every time someone views it, regardless of whether the child is aware. If someone is making, viewing, or transmitting CSAM, it has consequences for children on the whole,” Abhilash adds.

The News Minute
www.thenewsminute.com