
Meta Will Use AI to Analyse Height and Bone Structure to Detect Underage Users


Meta has announced it will use artificial intelligence to analyse photos and videos on Facebook and Instagram, with the goal of detecting whether a user is under 13 years old. The system will analyse "general visual signals" such as the user's height and facial bone structure in the frame.

The company was quick to draw a distinction: this is not facial recognition. "We want to be clear: this is not facial recognition. Our AI looks at general themes and visual cues, such as height or bone structure, to estimate general age; it does not identify a specific face," the statement says.

The system does not work in isolation. It combines visual signals with analysis of text and interactions - birthdays, mentions of school, posts, comments, bios. If the algorithm concludes that the user is underage, the profile is deactivated. If the user wants to keep it, they have to go through an age-verification process.
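Meta has not published any details of how these signals are weighted or combined, so as a purely illustrative sketch, the decision flow described above might look something like this. Every function name, weight, and threshold here is hypothetical:

```python
# Illustrative sketch only: Meta's actual model, signals, weights,
# and thresholds are not public.

def estimate_is_minor(visual_score: float,
                      text_score: float,
                      interaction_score: float,
                      threshold: float = 0.5) -> bool:
    """Combine per-signal scores (each in [0, 1]) suggesting the user
    is under 13 into a single weighted estimate."""
    weights = (0.5, 0.3, 0.2)  # hypothetical weighting of the signals
    combined = (weights[0] * visual_score      # height, bone structure
                + weights[1] * text_score      # birthdays, school mentions, bio
                + weights[2] * interaction_score)  # posts, comments
    return combined >= threshold

def handle_account(visual_score: float,
                   text_score: float,
                   interaction_score: float) -> str:
    """If the combined estimate flags the user as underage, the profile
    is deactivated and keeping it requires age verification."""
    if estimate_is_minor(visual_score, text_score, interaction_score):
        return "deactivated: age verification required"
    return "active"
```

The point of the sketch is the architecture the article describes: no single signal decides; the visual estimate is fused with text and interaction analysis before the account is deactivated.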

To understand why Meta has suddenly become so concerned about protecting children, one number is enough: 375 million dollars. That is what the company has to pay under a New Mexico jury verdict for misleading consumers about the safety of the platform and endangering children. It is probably no coincidence that the new measures arrived the following week.

At the same time, Meta is expanding "Teen Accounts" to 27 EU countries and Brazil, and a similar regime is being introduced on Facebook in the US, the UK and the EU. Teen accounts have messaging limited to followers, filters for upsetting comments, and private settings by default.

The question for us in the Balkans is simple: an algorithm that uses height and bone structure to decide whether you are underage - is this technology that works, or technology that works "well enough" to avoid lawsuits? And more important still: whose servers do those photos go to? Meta claims it does not identify specific people. But the same company that watched for years as its platform damaged teenagers' mental health is now asking to be trusted on its word.