
Snapchat officially introduces parental controls through a new ‘Family Center’ feature – TechCrunch


Snapchat is rolling out its first set of parental controls today, after announcing last October that it was developing tools to help parents better understand how their teens use the social networking app. The update follows the launch of similar parental control features on other apps popular with teens, including Instagram, TikTok and YouTube.

To use the new feature, called Family Center, parents or guardians will need to have the Snapchat app installed on their own device so they can link their account with their teen’s through an invite process.

Once configured, parents will be able to see which accounts their kids have been chatting with on the app over the past seven days, without being able to see the content of those messages. They can also view the teen’s friends list and report potential abuse to Snap’s Trust & Safety team for review. These are essentially the same features TechCrunch reported were in development earlier this year.

Parents can access the new controls from the app’s Profile Settings or by searching for “family center” or related terms using the app’s Search feature.

Snap notes that the feature is only available to parents and teens ages 13 to 18 because the app isn’t intended for younger users. The launch comes amid rising pressure on social media companies to better protect their minor users from harm, both in the United States and abroad. That pressure has led major tech companies to introduce parental controls and other safety features to comply with EU law and anticipated US regulations.

Other social networks have introduced more extensive parental controls than what is available at launch in Snapchat’s Family Center. For example, TikTok allows parents to set screen time limits, enable a more restricted mode for younger users, turn off search, set accounts to private, and limit messaging, who can see the teen’s likes, and who can comment on their posts, among other things. Instagram’s parental controls also include support for parent-set time limits.

However, Snap argues that its app doesn’t require as many parental controls because of the way it was designed in the first place.

Image credits: Snap

By default, teens have to be mutual friends before they can start communicating – which reduces the risk of them receiving unwanted messages from potential predators. Friends lists are private, and teens are not allowed to have public profiles. Additionally, teen users only show up as “Recommended Friends” or in search results when they have mutual friends with other users on the app, which also limits their visibility.

That said, parental interest in Snapchat isn’t limited to fears of unwanted contact between teens and potentially dangerous adults.

At its core, Snapchat’s disappearing messages make it easier for teens to engage in bullying, abuse, and other inappropriate behavior, such as sending sexual messages. As a result, Snap has been the subject of litigation from grieving parents whose teens died by suicide, who claim that Snap’s platform helped facilitate online bullying. That litigation has since led the company to improve its policies and limit access to its developer tools. It also cut off friend-finding apps that encourage users to share their personal information with strangers – one common avenue for child predators to reach vulnerable younger Snapchat users.

Sexting has also been the subject of litigation. Most recently, a teenage girl filed a class-action lawsuit against Snap alleging that its design did nothing to protect against the sexual exploitation of girls using its service.

Image credits: Snap

With Snapchat’s new Family Center, the company is giving parents some insight into how teens use the app – but not enough to completely prevent abuse or exploitation, because the company also advocates for maintaining teens’ privacy.

For parents, the ability to see a teen’s friends list doesn’t necessarily help them understand whether those contacts are safe. And parents don’t always know the names of all of their teen’s school friends and acquaintances, only those of their closer friends. Snap also doesn’t allow parents to block their teens from sending private photos to friends, nor does it implement a feature similar to Apple’s iMessage technology, which automatically intervenes to warn when sexually explicit images are sent in chats. (It does, however, use CSAI Match technology to remove known abusive material.)

Family Center also doesn’t offer parents control over if and how their kids can interact with the app’s Spotlight feature, a TikTok-like feed of short videos. Parents also can’t control whether their child’s live location is shared on Snap Map within the app. And parents can’t control who their teens can add as friends.

The app’s Discover section is also untouched by the parental controls.

In a congressional hearing last year, Snap was asked to defend why some of the content in its Discover section was clearly aimed at adults – like invitations to pornographic video games, articles about going to bars, or articles about pornography – items that appear out of sync with the app’s 13+ age rating. The new Family Center offers no controls over this part of the app, which includes a large amount of clickbait content.

We’ve found this section regularly features intentionally shocking medical photos and images – similar to the clickbait articles and ads you’ll see scattered across the web.

At the time of writing, a quick glance at Discover turned up various articles designed to frighten or alarm – at least three with photos of giant spiders. Another story is about a parent who murdered her children. One story focuses on Japan’s so-called suicide forest, and another is about people who have died at amusement parks. There’s also a story about a teacher caught “cheating” (its word) with a 12-year-old student – a disturbing way to frame a story about child sexual abuse. And there are plenty of photos of rare medical conditions that should probably be seen by doctors, not teenagers.

Snap says a future update will introduce “content controls” for parents and the ability for teens to notify their parents when they report an account or a piece of content to Snap’s safety team.

“While we tightly moderate and curate both our content and entertainment platforms, and do not allow unvetted content to reach a large audience on Snapchat, we know every family has differing views on what content is appropriate for their teens, and we want to give them that choice,” a Snap spokesperson said of the upcoming parental control features.

The company added that it will continue to introduce additional controls as it receives more feedback from parents and teens.


