Meta, Snap, and TikTok have launched Thrive, a program aimed at curbing the spread of graphic content related to self-harm and suicide.
Thrive allows these companies to share “signals” to identify and address violating content across platforms.
Developed with the Mental Health Coalition, Thrive uses secure signal-sharing technology similar to that in the Lantern program, which combats online child abuse.
Meta has already made it more difficult to find such content on its platforms, while still allowing discussions about mental health that don't promote self-harm.
Meta reports it acts on millions of pieces of self-harm content quarterly, with about 25,000 posts restored last quarter after user appeals.
We’ve worked with the Mental Health Coalition to establish Thrive, a new program that allows tech companies to share signals about violating suicide or self-harm content and stop it from spreading across different platforms.
Between April and June this year, we took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram.
While we allow people to discuss their experiences with suicide and self-harm – as long as it’s not graphic or promotional – this year we’ve taken important steps to make this content harder to find in Search and to hide it completely from teens, even if it’s shared by someone they follow.
The International Association for Suicide Prevention lists a number of suicide hotlines by country.