Malaysia is urging TikTok to introduce stronger age verification measures to protect children from harmful online content. Communications Minister Fahmi Fadzil said the government is concerned about the platform’s current approach and wants TikTok to work closely with local authorities to ensure better safety standards.
Fahmi revealed that officials recently met with TikTok representatives and expressed dissatisfaction with the way the company handles harmful material. He stressed that TikTok must collaborate with both the Communications Ministry and the police to create an effective system that verifies user ages before granting access.
Malaysia’s definition of harmful content includes online gambling, scams, child pornography and grooming, cyberbullying, and sensitive material involving race, religion, and royalty. Fahmi said stricter age checks would help prevent children and teens from being exposed to such risks.
Malaysia recently introduced a new law requiring social media platforms with more than eight million users to obtain a local license. This move aims to give the government greater oversight, ensuring companies comply with safety regulations. Platforms that fail to follow the rules could face fines or other penalties.
Globally, other nations are also tightening online safety rules. Since July, Britain has required pornography sites and platforms hosting harmful content to verify users’ ages, while France, Spain, Italy, Denmark, and Greece are jointly testing a template for an age verification app. Australia, meanwhile, has gone even further by banning children under 16 from using social media.
“Children must be safe online. Platforms like TikTok need to take responsibility and ensure young users are not exposed to harmful content,” Fahmi said.