Public Anger Grows as Malaysians Demand Digital Platforms Be Held Responsible for Online Harm

As Malaysia rolls out its new Online Safety Act, members of the public are demanding that digital platforms do more than merely comply with the law: they want tech companies to take real responsibility for the safety of children and families online rather than chasing profits and engagement.
A growing wave of public concern in Malaysia underscores a powerful message: child protection in the digital age is everyone’s business and can no longer be treated as an afterthought by tech companies or policymakers. Recent public calls for digital platforms to be more accountable come as the country begins implementing the Online Safety Act 2025, designed to force platforms to moderate harmful content and take preventive action against cyber risks.
Under the new law, digital platforms operating in Malaysia must actively address harmful material, including child sexual abuse material, cyberbullying, online scams and other digital threats, or face enforcement action. Public support for stricter enforcement reflects deep frustration with the status quo, in which children are increasingly exposed to dangerous content and online predators without effective safeguards in place.
Experts and advocacy groups have noted that responsibility for online child safety cannot be pushed solely onto families or individual users. Children's digital engagement has grown exponentially, leaving them vulnerable to grooming, exploitation and psychological harm if platforms do not build protection mechanisms in by default. Observers have stressed that effective platform accountability must go beyond age gates and warnings to embed child-centered safety features directly into digital environments.
Malaysian authorities have taken additional measures, such as automatically registering major social media and messaging providers under licensing rules that would compel compliance with online safety standards starting in 2026. Child welfare advocates and parents have said these steps are necessary but must be consistently enforced and continually improved to keep pace with evolving online risks.
UNICEF Malaysia has highlighted concerns that age restrictions and bans alone won’t fully protect children unless accompanied by broader accountability and technological safeguards. Without proactive platform responsibility and community involvement, digital spaces can continue to serve as channels for exploitation rather than safe environments for learning and connection.
This groundswell of public demand reflects a broader principle: protecting children online is not a luxury or a side objective; it must be prioritized ahead of algorithmic engagement and profit models. For Malaysian families and child advocates, the digital safety debate has become a defining test of how the country balances innovation with the fundamental rights and well‑being of its youngest users.
