Meta, the parent company of Instagram, has announced a series of new safety features to protect teenagers across its platforms, including displaying information about the accounts messaging them and a one-tap option to block and report.
Meta also said it had removed hundreds of thousands of accounts that left sexually explicit comments on, or requested explicit images from, adult-managed accounts featuring children under 13: 135,000 accounts for the comments and another 500,000 for "inappropriate interactions," according to a statement on the company's blog.
The move comes as social media platforms face increasing scrutiny over their impact on the mental health and safety of young users, particularly amid concerns about grooming and sextortion schemes involving nude photos.
According to Meta, after seeing a safety notice reminding them to "be careful in private messages and report anything that makes you uncomfortable," users blocked more than 1 million accounts and filed more than 1 million reports.
Earlier this year, Meta began testing artificial intelligence (AI) technology to detect users who lie about their age on Instagram, which requires users to be at least 13 years old.
If a user is found to have misrepresented their age, the account is automatically converted to a teen account, which carries more restrictions than an adult account.
Teen accounts are private by default and accept messages only from people the user follows or has previously connected with, a policy that has been in place since 2024.
Meta is currently facing lawsuits from dozens of US states accusing the company of deliberately designing addictive features into Instagram and Facebook, thereby harming the mental health of young people.
(TTXVN/Vietnam+)
Source: https://www.vietnamplus.vn/meta-tang-cuong-bien-phap-bao-ve-thieu-nien-tren-instagram-post1051542.vnp