New York
CNN
—
Snapchat says it’s working to make its app even safer for teen users.
Parent company Snap said Thursday that it’s rolling out a series of new features and policies aimed at better protecting 13- to 17-year-old users, including restrictions on friend suggestions and a new system for removing age-inappropriate content. The company also released a series of YouTube videos for parents about the features and an updated website laying out its teen safety and parental control policies.
The new features come amid growing pressure on social media platforms from lawmakers, educators and parents to protect young users from inappropriate content, unwanted adult attention, illicit drug sales and other issues. A Snap executive testified alongside leaders from TikTok and YouTube in a fall 2021 Senate committee hearing about youth safety on social media, promising new tools to help parents keep their teens safe. And since then, Snapchat, like other platforms, has rolled out a number of new teen safety and parental supervision tools.
Thursday’s announcement follows last year’s launch of Snapchat’s Family Center, which gives parents more insight into who their children are talking with on the messaging app. The app’s other existing teen safety measures include prohibiting young users from having public profiles and turning off teens’ Snap Map location-sharing tool by default.
As part of Thursday’s feature rollout, Snapchat will now require 13- to 17-year-old users to have a greater number of mutual friends in common with another account before that account will show up in search results or as a friend suggestion, an effort to keep teens from adding users on the app whom they don’t know in real life. The app will also send a pop-up warning to teens if they’re about to add an account that doesn’t share any mutual Snapchat friends or phone book contacts.
“When a teen becomes friends with someone on Snapchat, we want to be confident it’s someone they know in real life, such as a friend, family member, or other trusted person,” the company said in a blog post.
Snapchat will also impose a new strike system for accounts promoting content inappropriate for teens in its Stories and Spotlight sections, where users can share content publicly on the app. If inappropriate content is reported or detected by the company, it will immediately remove the content and issue a strike against the poster’s account. If a user accrues “too many strikes over a defined period of time, their account will be disabled,” the platform says, although it doesn’t specify how many strikes would lead to a suspension.
Teen users will also start to see in-app content aimed at educating them about online risks such as catfishing and financial sextortion (when someone persuades a victim to share nude photos and then blackmails them for money) and letting them know what to do if they encounter it, including hotlines to contact for help. The PSA-style content will be featured on Snapchat’s Stories platform and in response to certain search terms or keywords.