Newsreel Asia

Social media platforms linked to the spread of child sex abuse material in India

A report by Boomlive shows how easy it is for traffickers to spread and consume such videos

Newsreel Asia Insight #282
July 15, 2024

An investigative report by Boomlive’s Decode reveals widespread sharing of child sex abuse videos in India. The report, which included undercover interactions with admins of Telegram channels that distribute such disturbing content, shows how easily these channels spread child sex abuse material (CSAM).

One particularly disconcerting aspect of this issue is the role of platforms like Instagram and Telegram, whose algorithms inadvertently suggest similar abusive content, expanding the network of potential viewers and contributors, according to the report on the investigation, which was conducted in June.

The investigation found that the majority of CSAM videos shared on Telegram are hosted on the file-sharing site Terabox, which offers monetary incentives based on viewer traffic. This amounts to a grotesque monetisation of child exploitation. The probe’s undercover interactions revealed that anyone can openly negotiate rates for channels specialising in child abuse videos.

India leads in the creation and consumption of child sex abuse material, with over 1.9 million cases reported, the report notes, adding that prices for CSAM videos start as low as 40 rupees.

The sharing of these videos breaches several domestic and international laws against child exploitation and abuse. While regulations exist, enforcement appears grossly inadequate, as suggested by the rapid re-emergence of flagged content under different guises, the report says.

Authorities need to respond to this crisis promptly.

First, there is an urgent need for more robust regulatory frameworks that impose stringent penalties on platform providers who fail to block CSAM effectively. Social media companies and file-sharing websites should be held accountable for facilitating the spread of illegal content, whether their role is passive or active.

Second, technological interventions are crucial. This includes enhanced algorithmic filters that preemptively detect and block exploitative content and more sophisticated monitoring tools that can trace and dismantle networks involved in such activities.
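To illustrate what such filtering can look like in practice, below is a minimal sketch of hash-matching, one widely used technique in which a platform compares an uploaded file’s fingerprint against a blocklist of hashes of known abusive material (hash lists of this kind are maintained by bodies such as the IWF and NCMEC). The function names and the empty hash set here are hypothetical placeholders, not any platform’s actual code.

```python
import hashlib

# Hypothetical blocklist: in practice, platforms load hash lists
# supplied by organisations such as the IWF or NCMEC.
KNOWN_CSAM_HASHES: set[str] = set()

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 fingerprint of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str) -> bool:
    """Reject the upload if its hash matches known abusive material."""
    return sha256_of_file(path) in KNOWN_CSAM_HASHES
```

A plain cryptographic hash like this is trivially defeated by re-encoding or cropping a video, which is why production systems typically rely instead on perceptual hashing (for example, Microsoft’s PhotoDNA) so that altered copies of known material still match.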

Third, public awareness and education campaigns are essential. These should aim at informing parents, educators and children themselves of the dangers posed by certain online interactions and the importance of safeguarding personal information.

Moreover, support services for victims need strengthening. Victims should not be blamed by parents, relatives, friends or society for videos depicting them.

Non-profit groups committed to curbing this menace require ample funding and support from both the government and the private sector to effectively manage and respond to the volume of cases they encounter.

Lastly, international cooperation is vital. Many of these digital platforms operate across borders, making it imperative that law enforcement agencies and governments work collaboratively to tackle these transnational crimes.

Several countries have made significant strides in combating the proliferation of CSAM online.

A good example is the United Kingdom, which has implemented measures through agencies like the Internet Watch Foundation (IWF) and the Child Exploitation and Online Protection Centre (CEOP). These organisations work collaboratively to monitor, detect and remove CSAM from the internet. They also engage in public awareness campaigns and provide educational resources to help prevent abuse.

Another example is the United States, which has extensive legal and technological frameworks in place. The National Center for Missing and Exploited Children (NCMEC) operates a CyberTipline that receives reports of suspected child sexual exploitation and forwards them to law enforcement agencies for investigation. U.S. law enforcement also works closely with tech companies to enhance the detection and reporting of abusive content.

Australia also has strong measures against CSAM, including stringent laws and dedicated bodies such as the Australian Centre to Counter Child Exploitation (ACCCE). The centre describes “sextortion” as “a form of online blackmail where someone tricks you into sending your sexual images then threatens to share them unless their demands are met.”

As citizens and members of the global community, it is our moral duty to demand and support these changes. Silence and inaction only aid the perpetrators of these heinous crimes.