Social media companies need to "get serious" about asking for ID to prevent harmful content being shown to teenagers.
Speaking to KFM, David McNamara, Managing Director at CommSec Cyber Security, said social media companies need to act fast to stop young people being exposed to suicide-related content on their platforms.
He said people often share content without understanding the "consequences and how it's hurting other people."
He made an urgent call for TikTok to change its algorithm.
He said that, like banking apps, social media companies need to start asking for ID.
His comments follow reports on Primetime that children as young as 13 are creating and viewing suicidal content on TikTok.
TikTok is refusing to mention how many videos were taken down or restricted.
McNamara said the company's response is not good enough, "by a long shot."
Meanwhile, Facebook and Instagram are being investigated over concerns they lead to addictive behaviour in children.
The European Commission will look at systems which create so-called "rabbit hole" effects.
Meta, which owns both platforms, says it has a number of online tools to protect children.