Meta Rolls Out Teen Instagram Accounts: Stricter Privacy, Messaging Limits, And Parental Control Features
Meta has introduced ‘Teen Accounts’ for Instagram with stronger built-in protections for under-16 users and parental supervision features. The new accounts will default to private mode, limit messaging and interactions, display time-limit reminders, and allow parents to control the settings of their child’s account and monitor who they interact with.
Meta plans to launch these accounts within 60 days in the US, UK, Canada, and Australia, later this year in the European Union, and in early 2025 in the rest of the world. It also plans to bring these protections to other Meta platforms, like Facebook, next year.
What are the new features?
- Private accounts: Under-16 users, whether currently on Instagram or newly signing up, and under-18 users who are signing up will now have private accounts by default. They will need to approve new followers, and people who do not follow them cannot view or interact with their posts.
- Messaging restrictions: Teen Instagram users will receive messages only from users they follow or are already connected to.
- Sensitive content restrictions: Meta will restrict teens from viewing ‘sensitive content’ such as content that shows people fighting, promotes cosmetic procedures, or involves suicide or self-harm.
- Anti-bullying features: These Instagram users can only be tagged or mentioned by people they follow. Meta will also enforce its anti-bullying feature, ‘Hidden Words’, to filter out offensive words and phrases from comments and DM requests.
- Time limit reminders: Users will get notifications telling them to leave the app after 60 minutes of use each day.
- Sleep mode enabled: Meta will turn on ‘Sleep mode’ between 10 PM and 7 AM, which will mute notifications overnight and send auto-replies to DMs.
Parental supervision features
Meta has also introduced features enabling parents to approve or deny their children’s requests to change settings, or to let teens manage their settings themselves. Parents will soon also be able to change these settings directly. In addition, parents will be able to:
- Gain insights into who their teens are chatting with: Parents will now be able to see who their teen has messaged in the past seven days, but they will not be able to read these messages.
- Impose daily time limits for usage: Parents can set a daily time limit for their child’s app usage, after which the teen will no longer be able to access the app for the day.
- Block from using Instagram for specific time periods: Parents can choose to block their teenage kids from using the app for set time periods.
- See topics: Parents can view the topics their teen has chosen to view.
How will Meta identify teenagers?
Meta has stated that it has taken into consideration that teen users may try to circumvent age verification requirements, that is, teenagers may lie about their age to bypass the limits. It has therefore introduced a measure wherein users will have to verify their age with an ID if they change their birth date. It is also testing two new measures to verify age.
Users can confirm their age through a method called ‘social vouching’, wherein three of the user’s followers vouch for the user’s stated age. The vouching followers must be at least 18 years old and must not be vouching for anyone else at that time.
Further, Meta has collaborated with technology company Yoti to estimate a person’s age through a video selfie. The software determines a person’s age based on their facial features. The tech giant has stated that it will delete the image immediately after age confirmation and will not use it for anything else.
Meta introduces changes following criticism for its impact on children
Meta has been criticised in multiple jurisdictions for its effect on minors. In 2021, the Wall Street Journal reported on Meta’s internal research, which showed that a significant share of teenagers (32%), particularly teenage girls, traced anxiety and mental health problems to Instagram.
Following this, 33 US states filed a lawsuit against Meta in 2023 for knowingly designing and deploying harmful features on Instagram and other social media platforms that make children and teens addicted to them. The lawsuit also alleged non-compliance with the Children’s Online Privacy Protection Act of 1998 (COPPA), claiming that Facebook and Instagram process children’s data without parental consent. The European Commission has also sought information on how Meta protects children from harmful content.
Meta’s response
In response to this, Meta began making changes to accommodate young users. It introduced its first set of safety features on Instagram that would provide ‘Take a Break’ reminders and notifications directing teens away from topics that they are spending too much time on, limit the visibility of sensitive content, restrict who can tag teens, and provide more parental controls. Earlier this year, it introduced a feature allowing teens to turn off direct messages (DMs) from those they aren’t connected with on Instagram. It also restricted teens’ content access related to suicide, self-harm, and eating disorders and prompted them to update their privacy settings.
Catering to a young teenage audience also furthers Meta’s business interests. In its Q3FY21 earnings call, Meta said that teens use its products, but that it was facing tough competition from TikTok and Snapchat. When asked how it planned to engage with younger audiences, Meta said that it aimed to innovate and create products like Reels that would attract and retain a younger demographic.
Also Read:
- 33 US States Sue Meta Over Harmful Effects On The Mental And Physical Health Of Young People
- Meta Under Fire In The US For Allegedly Allowing Underage Users To Access Its Platforms
The post Meta Rolls Out Teen Instagram Accounts: Stricter Privacy, Messaging Limits, and Parental Control Features appeared first on MEDIANAMA.