Malaysian Communications Minister Fahmi Fadzil has called on TikTok to enhance its age verification mechanisms in response to growing concerns about the platform's impact on children's mental health. Following a meeting with TikTok's top management, Fahmi expressed dissatisfaction with the platform's current protective measures but gave the company an opportunity to collaborate with national authorities on improvements. The government is pushing for a robust verification system to be developed in partnership with the Malaysian Communications and Multimedia Commission and law enforcement agencies.
This initiative is part of Malaysia’s broader regulatory framework for large social media platforms. Since January, services with more than 8 million users have been required to obtain a license and comply with strict content guidelines. Platforms that fail to prevent harmful material—such as online gambling, scams, child exploitation, cyberbullying, or content related to sensitive topics like race and religion—may face significant penalties. Malaysian officials have also indicated that they plan to engage other major tech companies, including Meta and X, to address similar issues.
The move reflects a wider global trend toward strengthening online protections for minors. Countries like Australia and the UK have already introduced age restrictions and verification requirements for social media and other platforms hosting harmful content. Several European nations are collaboratively testing an age verification app to shield underage users. Malaysia’s actions underscore its commitment to balancing digital accessibility with the safety and mental well-being of young people, emphasizing the responsibility of social media companies in creating a safer online environment.