On Tuesday, TikTok rolled out a slew of features and guides to support users struggling with mental health issues.
As part of these efforts, the platform has created a well-being guide, accessible from its Safety Center. It developed the guide alongside the International Association for Suicide Prevention, Crisis Text Line, Samaritans (UK), Samaritans of Singapore, and Live For Tomorrow. The guide advises users to think about why they are sharing their experiences and, among other things, encourages them to consider whether they are ready for the impact their posts could have.
The platform is also adding a search intervention feature that directs users who search for phrases like #suicide to local support resources such as the Crisis Text Line helpline. In addition, when users search for sensitive content, TikTok will blur the results, requiring users to opt in before viewing them. This is intended to shield users from potentially triggering posts.
TikTok’s move follows a Wall Street Journal report that placed Instagram, a major rival, under severe scrutiny. The report revealed leaked internal Facebook research showing that Instagram negatively impacts the mental health of its young female users.