TikTok has announced a new set of resources to support users experiencing mental health issues, particularly those related to eating disorders, self-harm, and suicide.
First, the platform is expanding its well-being guides to support people who choose to share their personal experiences on the platform.
As explained by TikTok:
“While we do not allow content that promotes, glorifies, or normalizes suicide, self-harm, or eating disorders, we do support people who choose to share their experiences to raise awareness, help others who are struggling, and find support in our community. To make this easier, we have developed new well-being guides to support people who choose to share their personal experiences, with the guidance of the International Association for Suicide Prevention, Crisis Text Line, Live For Tomorrow, Samaritans of Singapore, and Samaritans (UK).”
The new guides, now available in TikTok’s Safety Center, offer tips to help users share their experiences, as well as guidance on how to responsibly engage with others who are struggling or in need.
In addition, TikTok is also highlighting a new set of curated content from its partner organizations in the app, which provides more information on important well-being topics.
The new programming is currently available and runs until September 16th.
TikTok is also expanding its search interventions: when users search for terms related to eating disorders, they will be directed to professional support tools and resources.
“We’re adding a new Safety Center guide about eating disorders for teens, caregivers, and educators. Developed in consultation with independent experts, including the National Eating Disorders Association (NEDA), National Eating Disorders Information Center, Butterfly Foundation, and Bodywhys, this guide will provide information, support, and advice on eating disorders.”
TikTok is also adding similar interventions to suicide and self-harm searches, with links directing users to local support resources and options.
And last but not least, TikTok is updating its warning labels for sensitive content.
“Starting in September, when a user searches for terms that may surface content some people find distressing, for example ‘scary makeup’, the search results page will be covered with an opt-in viewing screen. Individuals will be able to tap ‘Show results’ to continue viewing the content.”
These opt-in screens already appear over videos that some may find graphic or distressing, and this type of content is also not recommended into anyone’s For You feed.
TikTok has gained significant traction among younger audiences over the past few years and continues to add more and more users, and with that comes an obligation for the platform to protect these more impressionable users where possible, both by shielding them from harm and by providing support.
TikTok has faced several challenges in this area. Last year, the app was temporarily banned in Italy after the death of a young girl who participated in an in-app challenge, while TikTok has also been criticized over the exploitation of young girls, with its highly attuned algorithms seemingly matching personal characteristics and content in ways that may appeal to predators.
Like Instagram, the platform’s visual nature can easily lead to mental health consequences, and as such, TikTok should do all it can to support users in need, with more resources, more in-app tools, and community connection.
There is no way to address such issues entirely, but it’s good to see TikTok adding new tools on this front.