Context: The Department for Science, Innovation and Technology of the British Government, along with the AI Security Institute, released the International AI Safety Report 2025.
The International AI Safety Report 2025 highlights the imminent risk posed by AI tools in generating, possessing, and disseminating child sexual abuse material (CSAM).
About Child Sexual Abuse Material (CSAM)
- Child Sexual Abuse Material (CSAM) refers to audio, video, or images depicting sexually explicit portrayals of children.
- The United Kingdom is leading a legislative effort to target AI tools capable of producing CSAM.
- A 2023 World Economic Forum (WEF) paper flagged generative AI’s ability to create lifelike images, especially of children.
- The Internet Watch Foundation (IWF) report (October 2024) highlighted the proliferation of CSAM on the open web.
- Given these developments, India must amend existing laws to address AI-driven CSAM and ensure long-term effectiveness.
Recent Developments: UK’s Pioneering Legislation
- The UK’s upcoming law introduces a tool-centric approach rather than focusing solely on the perpetrator.
- Key Provisions:
- Makes it illegal to possess, create, or distribute AI tools capable of generating CSAM.
- Outlaws the possession of 'paedophile manuals' that instruct individuals on using AI to create CSAM.
Expected Benefits
- Deterrence and Holistic Approach: By criminalizing the possession of AI tools, the law strengthens preventive mechanisms.
- Early Apprehension of Offenders: Authorities can act at the preparation stage before harm occurs.
- Reducing the Mental Health Impact on Children: Curbs the proliferation of CSAM at its source, limiting the psychological harm to victims.
- Bridging Legislative Gaps: Recognizes AI-generated CSAM even when no real child is depicted.
India’s Readiness: Existing Gaps in Legal Framework
Increasing Cybercrimes Against Children
- National Crime Records Bureau (NCRB) Report 2022: Cybercrimes against children saw a substantial increase from the previous year.
- National Cyber Crime Reporting Portal (NCRP): Under the Cyber Crime Prevention against Women and Children (CCPWC) scheme, 94 lakh child pornography cases were recorded as of April 2024.
- Collaboration with NCMEC (USA): Since 2019, India’s NCRB has received 5 lakh cyber tip-line reports from the National Centre for Missing and Exploited Children (NCMEC), USA (as of March 2024).
Legislative Shortcomings
- Existing laws do not address AI-generated CSAM.
- There is no emphasis on targeting AI tools or platforms that facilitate CSAM creation.
Existing Laws Addressing CSAM in India
A Plan for India: Strengthening the Legal Framework
- Expand the Definition of CSAM: As per the NHRC Advisory (October 2023), replace ‘child pornography’ in the POCSO Act with CSAM to make it more comprehensive.
- Define ‘Sexually Explicit’ in IT Act: Section 67B should explicitly define ‘sexually explicit’ to help identify and block CSAM in real time.
- Broaden the Definition of ‘Intermediary’ in IT Act: Include Virtual Private Networks (VPNs), Virtual Private Servers (VPS), and Cloud Services to ensure they comply with CSAM-related provisions.
- Legislative Amendments for Emerging Tech: Laws should address risks from AI, deepfake technology, and generative models producing CSAM.
- Adopt UN’s Draft Convention on Cybercrimes: India must actively support the UN Draft Convention on Countering the Use of ICT for Criminal Purposes at the UN General Assembly.
- Integrate AI-Specific Provisions in Digital India Act: The proposed Digital India Act 2023 (to replace the IT Act 2000) should incorporate AI-related CSAM provisions based on the UK model.
Conclusion
India must modernize its legal framework to address AI-driven CSAM threats effectively. The UK’s upcoming AI law offers a progressive model, shifting from an accused-centric approach to a tool-centric deterrence strategy. By adopting similar legal mechanisms, India can strengthen child protection laws, tackle AI-generated abuse, and safeguard children’s rights in the digital era.