
Children are increasingly using the safety features to block and report accounts.
Meta, the parent company of Facebook and Instagram, is launching a host of safety features aimed at protecting young people from both “direct and indirect harm,” the company said in a July 23 statement.

“Teen Accounts will now have new safety features in DMs, giving teens more information about who they’re chatting with,” the statement said, noting that teenagers will be provided “more context” about the account they’re chatting with in order to spot scammers.

Instagram Teen Accounts, introduced in 2024, require teens younger than 16 to obtain a parent's permission before changing any of the accounts' built-in protections.

According to the latest announcement, teens will see new options to view safety tips and block an account, as well as when the account was created on the platform.

An older account is typically a sign of greater trustworthiness, since malicious actors often create new accounts after platforms identify and block their previous ones.

Meta requires users to be at least 13 years old to join Facebook or Instagram. The company uses a combination of age screens and artificial intelligence to gauge whether a user meets that threshold, although it has acknowledged that the task remains complicated.

The latest protections will cover adult accounts that primarily feature children. These include parents or caretakers who regularly share photos and videos of their children, or talent managers who run accounts representing children younger than age 13.

A new combined block-and-report option has been added in DMs for flagging violating accounts, and more teens have been using the feature.

“In June alone, they blocked accounts 1 million times and reported another 1 million after seeing a Safety Notice,” the company stated, adding that one in 10 teens used the location notice on Instagram to check where the people they’re chatting with are located and if it matches what they’re claiming.

Meta has faced challenges in ensuring the protection of children who use its platforms.

In 2024, Meta won the dismissal of a lawsuit that accused the company of misleading shareholders about Meta’s ability to ensure the safety of children who use Facebook and Instagram.
In 2023, 34 states, including California and New York, sued Meta, accusing the company of harvesting data from children younger than 13 without their parents' consent, in violation of federal law, and of contributing to the youth mental health crisis through addictive features on its social media apps.

The court declined to dismiss the lawsuit, saying that “Meta’s alleged yearslong public campaign of deception as to the risks of addiction and mental harms to minors from platform use fits readily within these states’ deceptive acts and practices framework.”

The case is still ongoing, although now with 29 states participating.

Image-Blurring, Offensive Comments

Meta introduced the nudity protection feature in DMs in 2024. It blurs images detected as containing nudity and encourages people to think twice before posting or sending nude images.

Since the introduction of the feature, the company said 99 percent of people, including teens, have it turned on. In June, more than 40 percent of the blurred images stayed blurred, according to the company, while in May, 45 percent of people decided against forwarding such content after seeing the warning.

Meta said that the nudity protection will be turned on “by default” for teens younger than age 18 globally, and is also suggested for adults.

Some of the protections given to teens will be extended to adult accounts primarily featuring children.

Stricter monitoring of DMs, along with the Hidden Words feature that filters offensive comments, will be rolled out to these accounts in the coming months.

Adults who have been blocked by teens will be prevented from finding those teens' accounts, and other teens will also find it harder to locate these adults' profiles on the platform, according to the company.

Finally, Meta stated that nearly 135,000 Instagram accounts, along with an additional 500,000 linked Facebook and Instagram accounts, were removed from the platforms for leaving sexualized comments on, or requesting sexual images from, adult-managed accounts featuring children younger than 13.

According to data from Statista, there are more than 250 million Facebook users and more than 171 million Instagram users in the United States.