Some of the biggest social media companies are working together as part of a program to prevent content that features suicide or self-harm from spreading online.
Meta, Snapchat, and TikTok have all signed up to work with the Mental Health Coalition on the new Thrive program, which was announced on Thursday (September 12).
Under the scheme, participating technology companies share signals about suicide or self-harm content so that others can investigate and take action if the same content appears on their own platforms.
“We at MHC are excited to work with Thrive, a unique collaborative of the most influential social media platforms that have come together to address suicide and self-harm content.
“Meta, Snap and TikTok are some of the initial partners to join ‘the exchange’ committing to make an even greater impact and help save lives,” said Kenneth Cole, founder of The Mental Health Coalition.
When one company flags a piece of content, the others receive an alert and can then independently assess whether to take action.
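Thrive’s internals have not been published beyond this description, but the workflow described resembles a simple publish-and-alert pattern. The Python sketch below is purely illustrative: the `Signal` and `Exchange` names, their fields, and the callback interface are assumptions made for this example, not Thrive’s actual API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Signal:
    content_hash: str  # code identifying the flagged media; no user data
    label: str         # e.g. "graphic_self_harm"
    source: str        # platform that contributed the signal

class Exchange:
    """Hypothetical neutral intermediary, modeled on the article's description."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[Signal], None]] = []

    def subscribe(self, notify: Callable[[Signal], None]) -> None:
        self._subscribers.append(notify)

    def submit(self, signal: Signal) -> None:
        # Alert every participating platform; each independently
        # decides whether the content violates its own policies.
        for notify in self._subscribers:
            notify(signal)

# Usage: a platform registers a handler that queues the signal for its
# own moderation review rather than removing content automatically.
exchange = Exchange()
exchange.subscribe(
    lambda s: print(f"platform A: queuing {s.label} signal from {s.source} for review")
)
exchange.submit(Signal(content_hash="3f5a…", label="graphic_self_harm", source="platform_b"))
```

The key design point, per the announcement, is that the exchange only distributes signals; each platform assesses matches against its own rules rather than deferring to the flagging company.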
The director of Thrive, Dr. Dan Reidenberg, described the project as a “major breakthrough”:
“The integration of signal sharing, coupled with cross-industry collaboration and moderated by an independent and neutral intermediary, represents a major breakthrough in industry collaboration and public protection on the global, public health crisis of suicide and ultimately save lives.”
We’re proud to be a founding member of Thrive, a new program that allows tech companies to share signals about violating suicide or self-harm content and stop it spreading across different platforms. https://t.co/RXBc9SdGI7
— Meta Newsroom (@MetaNewsroom) September 12, 2024
How does the signal-sharing technology work to address self-harm-related content?
The technical infrastructure behind Thrive is provided by Meta and is the same technology it supplies to the Tech Coalition’s Lantern program, which also enables signal sharing.
“Participating companies will start by sharing hashes – numerical codes that correspond to violating content – of images and videos showing graphic suicide and self-harm, and of content depicting or encouraging viral suicide or self-harm challenges,” writes Meta in the announcement.
“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals.”
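Meta has not published the exact matching pipeline used here, but hash-based matching generally works along the lines sketched below. This example uses SHA-256 only to stay self-contained; production systems typically rely on perceptual hashes (Meta has open-sourced PDQ for images, for instance) so that re-encoded or slightly altered copies still match. The `KNOWN_VIOLATING_HASHES` set and the function names are hypothetical.

```python
import hashlib

# Hypothetical set of hashes contributed by partner platforms via Thrive.
# Only the codes are shared; per Meta, no account or user information.
KNOWN_VIOLATING_HASHES: set[str] = {
    "placeholder_hash_from_partner_platform",
}

def media_hash(data: bytes) -> str:
    """Reduce media bytes to a fixed-size code.

    A real pipeline would use a perceptual hash so near-duplicates match;
    SHA-256 keeps this sketch simple and reproducible.
    """
    return hashlib.sha256(data).hexdigest()

def needs_review(upload: bytes) -> bool:
    """Return True if the upload matches a shared signal and should go to moderators."""
    return media_hash(upload) in KNOWN_VIOLATING_HASHES
```

Because only content hashes cross platform boundaries, a match tells the receiving company that a known piece of violating media has appeared, without revealing who posted it elsewhere.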