Meta has announced it will use artificial intelligence to analyse photos and videos on Facebook and Instagram, with the goal of detecting whether a user is under 13 years old. The system will analyse "general visual signals" such as height and the bone structure of the face in the frame.
The company immediately drew a distinction: this is not facial recognition. "We want to be clear: this is not facial recognition. Our AI looks at general themes and visual cues, such as height or bone structure, to estimate general age; it does not identify a specific face," the statement says.
The system does not work in isolation. It combines visual signals with analysis of text and interactions - birthdays, mentions of school, posts, comments, bios. If the algorithm concludes that the user is underage, the profile is deactivated. If the user wants to keep it, they have to go through an age-verification process.
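The decision flow described above can be sketched in a few lines. This is a purely illustrative reconstruction, not Meta's actual system: the signal names, fusion weights, threshold, and confidence gate are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    visual_age_estimate: float  # hypothetical output of an image model (height, facial proportions)
    text_age_estimate: float    # hypothetical output of a text model (birthdays, school mentions, bios)
    confidence: float           # assumed combined model confidence, 0..1

MIN_AGE = 13  # Meta's stated threshold

def review_account(s: Signals) -> str:
    """Fuse visual and text age estimates; gate the account if they point under 13."""
    # Illustrative weighted average; the real fusion method is not public.
    fused = 0.6 * s.visual_age_estimate + 0.4 * s.text_age_estimate
    if fused < MIN_AGE and s.confidence > 0.8:
        # Deactivated profile can be restored via age verification, per the article.
        return "deactivate_pending_verification"
    return "active"

print(review_account(Signals(11.0, 12.0, 0.9)))  # fused 11.4 -> deactivate_pending_verification
print(review_account(Signals(16.0, 17.0, 0.9)))  # fused 16.4 -> active
```

The point of the sketch is the architecture the article describes: no single signal decides; a low fused estimate triggers deactivation, and the burden of proof then shifts to the user.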
To understand why Meta has suddenly become so concerned about protecting children, one number is enough: 375 million dollars. That is what the company must pay under a New Mexico jury verdict for misleading consumers about the platform's safety and endangering children. It is probably no coincidence that the very next week brings new measures.
Meta is at the same time expanding "Teen Accounts" to 27 EU countries and Brazil, and a similar regime is being introduced on Facebook in the US, the UK and the EU. Teen accounts restrict messaging to followers, filter upsetting comments, and default to private settings.
The question for us in the Balkans is simple: an algorithm that uses height and bone structure to decide whether you are underage - is this technology that works, or technology that works "well enough" to avoid lawsuits? And even more important - whose servers do those photos go to? Meta claims it does not identify specific people. But the same company that watched for years as its platform destroyed teenagers' mental health is now asking to be trusted on its word.