Words by Mikaela Jago // image created by AI
Growing up in an era where technology rapidly became part of everyday life felt like navigating an entirely new landscape from scratch. The internet was a tool for everything – helping with homework, fostering friendships, opening doors to creativity, and making the world feel smaller and more connected in the best way. It was exciting, limitless, and, for the most part, safe.
Its dark side was something I was lucky enough not to see. But once I reached adulthood and began my work in the not-for-profit sector at ICMEC Australia, my eyes were opened, and what I saw was frightening and inescapable. Once you learn about the prevalence and sheer horror of child sexual exploitation – and how the online world has amplified it – there’s no going back. And frankly, rightly so. I sometimes wish I could sit every person down and make them see it too.
It reminds me of Greta Gerwig’s 2023 film Barbie, where the Barbies live in a blissfully sheltered world, oblivious to the harsh realities of patriarchy. Their perfect society is upended when Ken introduces the real-world patriarchy, forcing the Barbies to confront its destructive impact. Once they see it, they can’t ignore it, and they take action to dismantle it and restore balance to their world. Child sexual abuse is one of those unspoken, ignored crises. Once you listen to the horrific but crucial work of victim-identification officers, hear the stories of how the legal system has failed survivors, and meet those dedicating their lives and careers to fighting this crime, ignorance is no longer an option.
It can feel like an impossible fight. But that’s exactly why innovative solutions matter. Technology continues to fuel the crisis, yes – but it also holds the power to fight back. We don’t have to start from scratch. We just have to be willing to act.
Artificial Intelligence (AI) – a term that seems to stir both excitement and unease – is front and centre in this conversation. Over half of the global population views AI with apprehension. But despite this nervousness, the potential of AI in child protection is not just promising; it’s transformative.
The rise of Generative AI has ushered in new challenges in our battle against child exploitation. Offenders, always quick to exploit new technologies, are now using AI to produce Child Sexual Abuse Material (CSAM) at alarming rates and with unnerving realism. This surge in AI-generated content is overwhelming already stretched law enforcement agencies, making the distinction between real and artificially generated images of children a complex and daunting task.
The stakes couldn’t be higher: the difference between saving a child and allowing their continued abuse could hinge on our ability to keep up with these technological advancements.
Consider the numbers: In South Asia, one in four young women is married before the age of 18. Meanwhile, the prevalence of online sexual solicitation of children in East Asia and the Pacific is a disturbing 13%.
Even more chilling, nearly half of the world’s CSAM reports come from East Asia, the Pacific, and South Asia – a staggering 17,798,299 cases.
These figures are more than just statistics; they are a call for a collective and urgent response.
As technology advances, so do the tactics of those who exploit it. This escalating arms race between offenders and protectors demands that we not only innovate in our technology but also in our approach, language, and partnerships. We must understand the intricacies of AI-assisted crimes and recognise that while technology can be a double-edged sword, it is also a potent weapon for good when wielded with care and purpose.
Offenders operate without regard for borders, constantly networking, innovating, and collaborating to inflict harm. Our response must be equally nimble and unified, drawing on the strength of cross-border collaboration within the ASEAN region.
By pooling resources, sharing intelligence, and deploying AI for the greater good, we stand a better chance of dismantling these networks of exploitation. Law enforcement agencies are already seeing results: Microsoft’s PhotoDNA helps detect known CSAM through digital signatures, while advanced forensics tools recover crucial evidence from suspects’ devices. In Thailand, facial recognition technology has helped identify missing children in trafficking cases, and as the region’s adoption of blockchain technology grows, there’s a crucial opportunity to leverage blockchain analysis tools to track and disrupt exploitation networks’ financial trails across borders.
The technological arsenal includes virtual reality for investigator training and AI monitoring of online spaces for grooming behaviour. In Malaysia, this technology recently identified a perpetrator targeting children through gaming platforms. However, adoption remains uneven across ASEAN. Privacy concerns, while valid, must be balanced against child protection imperatives. The Philippines’ Anti-Online Sexual Abuse and Exploitation of Children Law offers a model for legislation that enables effective enforcement while protecting civil liberties.
Too often, innovation is confined to the corporate sector. But when it comes to combating child sexual exploitation, we don’t have time to waste.
The ASEAN region has a unique opportunity to lead by example. By embracing AI-driven strategies and robust prevention measures now, we can protect our children and set a new global standard for innovation in the fight against exploitation.
The future of our region – and indeed, our world – is in our hands.