Instagram To Use AI To Detect Underage Users
Instagram is planning to release an Artificial Intelligence (AI) tool that can detect when a minor is lying about their age, Bloomberg reported. Called an ‘adult classifier,’ the system will go through a user’s profile and estimate their age based on their follower list, the content they interact with, and even ‘happy birthday’ posts made by friends. Users detected as being below 18 will automatically be moved into a more restrictive version of the app, regardless of how old they claim to be.
Meta had announced ‘teen accounts’ for Instagram in September 2024, which place greater restrictions on users. Teen accounts are private by default, limit messaging and interactions, display time limit reminders, and allow parents to control the settings of their child’s account and monitor who they interact with.
The social media giant has also announced other measures to verify users’ ages. In one, users will have to verify their age with an ID if they change their birth date. Users can also confirm their age through a method called ‘social vouching,’ wherein three of the user’s followers vouch that the user is at least 18. The followers must themselves be at least 18 years old and must not be vouching for anyone else at that time.
Meta also collaborated with tech firm Yoti to allow users to verify their age with video selfies.
Why Meta is making these changes
Meta has been facing immense regulatory and media pressure over the impact of its platforms on teen users. The New York Times reported that Meta had received over 1.1 million reports of underage users on Instagram since early 2019 but disabled only a fraction of those accounts.
The Wall Street Journal had reported in 2021 that Meta was aware that a large number of teenagers, particularly teenage girls, traced a significant amount of their anxiety and mental health problems to Instagram.
Following this, 33 US states filed a lawsuit against Meta in 2023 for knowingly designing and deploying harmful features on Instagram and other social media platforms that make children and teens addicted to them. The lawsuit also accused Meta of non-compliance with the Children’s Online Privacy Protection Act of 1998 (COPPA), alleging that Facebook and Instagram process children’s data without parental consent. Meta is also facing a lawsuit in the state of New Mexico alleging that it knowingly exposed children to the twin dangers of sexual exploitation and mental health harm. The European Commission has also sought information on how Meta protects children from harmful content.
Also Read:
- June report shows decline in social media user complaints, but Meta’s self-harm reports remain high
- New Mexico Sues Snap Over Child Safety
- New York Passes Bill Prohibiting Social Media Platforms From Showing Addictive Feeds To Children
The post Instagram To Use AI To Detect Underage Users appeared first on MEDIANAMA.