Making a safe digital ecosystem for children, beyond regulations
The government should not rely primarily on control to protect children online
The authors are PhD candidates with research areas in public policy and AI governance. Diah is studying at Leiden University, the Netherlands. Mirah is a teaching fellow at Monash University, Australia. This article reflects the authors’ own analysis and views and does not necessarily represent those of The Reformist.

Last month, the Communications and Digital Affairs Ministry officially classified platforms like Instagram, TikTok, X, and Roblox as ‘high-risk’ for children, effectively banning under-16 users from accessing them.
The regulation is outlined in the recently enacted Government Regulation No. 17/2025 on the Governance of Electronic Systems for Child Protection, also known as ‘PP Tunas’. The ministry framed this regulation as a landmark policy, positioning Indonesia as the first non-Western country to adopt such a comprehensive approach to online child protection.
What does PP Tunas govern?
In a nutshell, PP Tunas introduces a set of obligations for electronic system providers (ESPs) to safeguard children in the digital environment. These include features requiring age verification and parental controls, stronger content moderation to remove harmful material, user reporting tools, and restrictions on the commercial use of children’s personal data.
The government argues that these measures are necessary given the scale of children’s exposure to digital technologies. Almost half of Indonesian Internet users are under 18, with estimates suggesting that many children spend up to seven hours online each day.
UNICEF data indicates that a significant proportion of Indonesian children have encountered harmful online content, including sexual material. Further, a growing body of research suggests that excessive social media use may be associated with increased risks of addiction, anxiety, stress, and other mental health challenges.
These concerns have strengthened the urgency for regulatory intervention to ensure safer online environments for children.
Global effort to protect children online
Efforts to create a child-friendly digital environment have gained increasing global attention. International frameworks from UNESCO and the OECD consistently emphasize that protecting children online requires more than regulatory intervention; it demands a holistic approach that combines governance, education, and platform accountability.
Policy developments in Australia highlight both the importance and the limits of regulation in creating a child-friendly digital environment. As one of the early movers in this space, the Australian government has introduced measures to protect children from harmful online content. However, these efforts have also attracted criticism for risking an overly restrictive approach that may undermine children’s agency and do little to address the broader ecosystem in which digital harms occur. This suggests that regulation, while necessary, cannot function as a standalone solution.
Finland offers an example of a more holistic approach, integrating digital literacy into its national curriculum from an early age. This equips children not only with technical skills but also with critical thinking abilities and awareness of online safety. Rather than relying primarily on control, the Finnish model emphasizes empowerment, preparing children to navigate digital spaces responsibly and independently. Such experiences demonstrate that building a safe digital environment requires long-term investment in education and capacity-building, not only technological safeguards.
These global experiences offer important lessons for Indonesia. The recently introduced PP Tunas represents a significant regulatory step, but it largely operates at the downstream level by focusing on controlling risks on digital platforms. Platforms are required to strengthen content moderation, governments are obliged to enhance blocking mechanisms, and parents are urged to increase supervision. While these measures are important, they do not address the full scope of the problem.
Beyond regulatory interventions
Building a comprehensive child-friendly digital environment requires upstream interventions, too. This means policy must be accompanied by:
First, systematic digital literacy initiatives across multiple levels, including children, parents, and schools. Children should be supported to grow into empowered digital citizens capable of navigating online spaces safely and critically.
This also requires equipping parents and educators with the knowledge and tools to guide children’s digital engagement, including issues related to well-being, online safety, and responsible technology use. Without sufficient understanding, protective measures are unlikely to be effective in everyday contexts.
Second, children are not merely objects that need protection, but subjects with rights in the digital environment. Digital spaces must be safe, yes. But they should also enable children to learn, express themselves, and participate. Policies that emphasize restriction too heavily risk overlooking other important dimensions of children’s digital experiences. For example, excessive limitations on access may reduce opportunities for children to develop digital literacy, foster creativity, or build positive social networks.
More importantly, the regulation raises questions about what happens when children turn 16 and suddenly gain access to previously restricted platforms. Without adequate preparation, children may enter digital spaces without the skills and resilience necessary to navigate online risks. This transition period highlights the importance of not only protecting children but also preparing them to participate meaningfully in digital society.
Third, there is a need for stronger platform accountability. Not all Indonesian children have equal access, capacity, or digital experience. Given this diversity, regulation needs to move beyond content moderation obligations toward greater algorithmic transparency and safety-by-design approaches. For instance, platforms should be encouraged or required to reduce exposure to harmful content, limit features that may foster addictive use, and provide child-friendly reporting mechanisms.
Ultimately, child-friendly digital environments cannot be achieved through a single policy instrument alone. Regulations such as PP Tunas provide an important foundation, but they are not a standalone solution. Protecting children online should not only focus on reducing risks. It should also enable meaningful and safe opportunities for children to learn, participate, and develop as empowered digital citizens. Policies, therefore, need to move beyond a purely protectionist approach toward one that strengthens children’s autonomy, resilience, and digital capabilities.
