(CNN)TikTok is rolling out new resources to support the well-being of its hundreds of millions of users, most of whom are teens and young adults.
The resources include in-app guides that address topics such as "signs of struggling," "steps to create a connection," and advice about eating and body concerns, aimed at helping people dealing with mental health issues.
"We're proud that our platform has become a place where people can share their personal experiences with mental well-being, find community and support each other, and we take very seriously our responsibility to keep TikTok a safe space for these important conversations," said Tara Wadhwa, TikTok's US director of policy, in the September 14 announcement.
TikTok has also strengthened its search interventions. Search for words or phrases such as "suicide," and you'll first be shown local support resources that offer guidance about treatment options. If you opt to view the search results anyway, you'll generally see supportive or educational content about suicide rather than potentially harmful TikToks.
"That's a great idea and really pretty much all that a platform can do, because other than that, we move into more extreme censorship of content and maybe even ... monetizing things that are intended to be helpful resources," said Mike C. Parent, a psychologist and associate professor in the department of educational psychology at the University of Texas at Austin.
"It's important to talk about suicide and eating disorders and not remove that content," he added. Some people might think talking about suicide will make teens more inclined to try it, Parent said, but that's not always the case.
TikTok's announcement comes in the wake of a Wall Street Journal story that alleged Facebook publicly downplayed Instagram's effects on teens' mental health, though Facebook's own research reportedly revealed serious negative impacts.
"While the (WSJ) story focuses on a limited set of findings and casts them in a negative light, we stand by this research," said Karina Newton, Instagram's head of public policy, in a statement. "It demonstrates our commitment to understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues. The question on many people's minds is if social media is good or bad for people. The research on this is mixed; it can be both."
Instagram has had tools -- such as a "Get Support" prompt that directs users to helplines and other tips -- intended to help people struggling with mental health issues for some time, according to a company spokesperson.
Pros and cons of the changes
Chicago-based psychologist John Duffy said via email that many of his young clients say they initially learn about depression, anxiety, attention problems and eating disorders on TikTok.
"I've seen some of these videos, some by other children and others posted by professionals, and many of them are accurate, informative and quite helpful," said Duffy, who works with teens, parents, couples and families and wrote "Parenting the New Teen in the Age of Anxiety." "I'm glad that there is some good mental health-related information available to our young people on a platform that draws them in."
"That said, these changes are not nearly enough," Duffy added. Kids often rely on TikTok content to diagnose and treat themselves, which can be dangerous without adult supervision. "It is crucial that TikTok makes it clear that their platform is not a substitute for direct mental health care." At the bottom of its "Well-Being Guide," TikTok states the guides are for informational and educational purposes only and aren't "intended to provide mental health or medical services."
Plus, the guides -- which were created with the help of expert organizations including Crisis Text Line, the International Association for Suicide Prevention, Live for Tomorrow, Samaritans of Singapore, Samaritans (UK) and the National Eating Disorders Association -- exist in TikTok's Safety Center. To access them, users can go to their profile, click the menu icon in the upper right corner, then scroll down to the "Support" section, where they can click "Safety Center."
"The general thing with user experience is that everything important should be one click or less away from the user," Parent said. "Putting it multiple clicks away does form a barrier."
TikTok already had warning labels and opt-in screens over videos with sensitive or distressing content. The company is expanding on that by applying the warnings to search results as well, for phrases such as "scary make-up."
Offering trigger warnings is considerate in certain settings, but research has shown they can be a double-edged sword: rather than processing difficult content as a step toward becoming healthier, people sometimes come to incorporate trauma more deeply into their identity, Parent said.
"It ends up maybe what, in psychology, we would call a safety behavior, which sounds nice, but it's actually unhealthy," Parent said. "It's sort of like a person who's afraid of spiders and you never ever show them a photo of spiders -- well, they're never not going to be afraid of spiders."
Advice for parents and teens
Having open communication with teens before problems arise is important, Parent said. If parents suspect their teens' social media use is harming their mental health, they shouldn't punish them, he added -- instead, they can enlist a mental health professional who can mediate those conversations in ways that might better resonate with kids.
Also, remember that problematic social media use related to issues such as body image can be more a symptom than a cause of a teen's underlying struggle, according to Parent.
"Body image concerns existed long before social media. And we now live in an era where you can go on social media and find images of people as models who you never would have seen before," Parent said. "Before, it was a bunch of White people and maybe Tyra Banks, and they were all skinny and all looked a particular way."
Parents should familiarize themselves with social media platforms, especially the mental health-related elements, and regularly talk openly with their teens about what they're witnessing, Duffy said.
For teens, being aware of the deception common on social media -- such as cosmetic filters and edits -- and curating or limiting what you consume can help.
If you live in the US and are experiencing suicidal thoughts, you can call 1-800-273-8255 to reach the National Suicide Prevention Lifeline, which provides free and confidential support 24/7 for people in suicidal crisis or distress. You can learn more about its services on its website, including its guide on what to do if you see suicidal language on social media. The same number can also connect you with someone who can advise you on how to help a person in crisis. For crisis support in Spanish, call 1-888-628-9454.