Meta, Snap, and TikTok have founded a new program called Thrive to help stop the spread of graphic content depicting or encouraging self-harm and suicide. Thrive enables participating companies to share “signals” to alert each other to violating content on their platforms.

Thrive is built in conjunction with the Mental Health Coalition, a charitable organization that says it works to remove the stigma around mental health discussions. Meta says it provides the technical infrastructure behind Thrive that enables “signals to be shared securely.” It uses the same cross-platform signal-sharing tech as the Lantern program, which is designed to help fight child abuse online. Participating companies can share hashes of violating media to flag it to one another.
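Neither Meta nor the Mental Health Coalition has published Thrive’s technical details, but hash-based signal sharing between platforms generally works along the lines sketched below. This is purely illustrative: the `SignalBus` and `compute_signal_hash` names are hypothetical stand-ins, and production systems typically rely on perceptual hashes that survive re-encoding rather than the plain SHA-256 used here to keep the example self-contained.

```python
# Illustrative sketch only; not Thrive's actual API or infrastructure.
import hashlib
from dataclasses import dataclass, field


@dataclass
class SignalBus:
    """Hypothetical in-memory registry of hashes flagged by participating platforms."""
    flagged: set[str] = field(default_factory=set)

    def share(self, media_hash: str) -> None:
        # One participating company contributes the hash of violating media.
        self.flagged.add(media_hash)

    def matches(self, media_hash: str) -> bool:
        # Another platform checks newly uploaded media against the shared signals.
        return media_hash in self.flagged


def compute_signal_hash(media_bytes: bytes) -> str:
    # Real systems would use a perceptual hash; SHA-256 is used here for simplicity.
    return hashlib.sha256(media_bytes).hexdigest()


if __name__ == "__main__":
    bus = SignalBus()
    violating = b"example media payload"
    bus.share(compute_signal_hash(violating))

    upload = b"example media payload"
    if bus.matches(compute_signal_hash(upload)):
        print("Hash matches a shared signal; route the upload for review.")
```

The key point the sketch captures is that only hashes, not the media itself, move between companies, which is why Meta describes the signals as being “shared securely.”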

Meta says it has already made such content harder to find on its platforms, but it’s trying to leave room for people to discuss their own experiences with mental health, suicide, and self-harm, as long as they aren’t promoting it or providing graphic descriptions.

According to Meta’s charts, the company takes action on millions of pieces of suicide and self-harm content every quarter. Last quarter, it restored an estimated 25,000 of those posts, most of them after a user appealed.
