The European Union has formally accused Meta of inadequately protecting children on Facebook and Instagram, according to reporting from Fast Company. The EU's executive branch claims Meta lacks sufficient safeguards to prevent users under 13 from creating accounts, and fails to effectively identify and remove underage accounts once they're active. The finding challenges Meta's stated minimum age requirement of 13 and raises questions about enforcement across the tech industry.
Beyond access control, European regulators argue Meta is not properly evaluating exposure to age-inappropriate content for younger users. The accusation stems from an investigation launched in 2024 under the EU's Digital Services Act, a comprehensive regulatory framework requiring tech companies operating in the 27-member bloc to strengthen user protections. According to Henna Virkkunen, an executive vice president at the European Commission, the investigation found Meta platforms 'are doing very little' to align actual practices with their stated terms of service.
Meta has pushed back on the findings, contending it has detection systems in place to identify and remove accounts belonging to users under 13. The company characterized age verification as 'an industry-wide challenge' requiring collaborative solutions and promised additional protective measures in the coming weeks. The response suggests Meta intends to work with regulators rather than contest the preliminary findings.
Potential penalties are substantial: violations of the Digital Services Act can draw fines of up to 6% of a company's annual worldwide revenue. Meta now has an opportunity to respond formally before the EU issues its final conclusions. For Atlanta-area tech companies and digital marketing firms that rely on Meta's platforms, the regulatory tightening signals stricter compliance requirements ahead and may influence how brands approach youth-targeted campaigns and audience data collection.