Atlanta Business Magazine
Technology

EU Targets Meta Over Child Safety Failures on Facebook, Instagram

The European Union's Digital Services Act enforcement could reshape how tech platforms protect minors—with potential implications for Atlanta-based tech compliance and policy.

AI News Desk
Automated News Reporter
Apr 29, 2026 · 2 min read

Photo via Fast Company

The European Union has formally accused Meta of inadequately protecting children on Facebook and Instagram, according to reporting from Fast Company. The EU's executive branch claims Meta lacks sufficient safeguards to prevent users under 13 from creating accounts, and fails to effectively identify and remove underage accounts once they're active. The finding calls into question Meta's enforcement of its stated minimum age of 13 and raises broader questions about age verification across the tech industry.

Beyond access control, European regulators argue Meta is not properly evaluating younger users' exposure to age-inappropriate content. The accusation stems from an investigation launched in 2024 under the EU's Digital Services Act, a comprehensive regulatory framework requiring tech companies operating in the 27-member bloc to strengthen user protections. According to Henna Virkkunen, an executive vice president at the European Commission, the investigation found Meta's platforms "are doing very little" to align actual practices with their stated terms of service.

Meta has pushed back on the findings, contending it has detection systems in place to identify and remove accounts belonging to users under 13. The company characterized age verification as "an industry-wide challenge" requiring collaborative solutions and promised additional protective measures in the coming weeks. The response suggests Meta intends to work with regulators rather than contest the preliminary findings.

Potential penalties could be substantial: violations of the Digital Services Act can result in fines up to 6% of a company's annual worldwide revenue. Meta now has an opportunity to formally respond before the EU issues final conclusions. For Atlanta-area tech companies and digital marketing firms serving Meta's platform, the regulatory tightening signals stricter compliance requirements ahead and may influence how brands approach youth-targeted campaigns and audience data collection.

Tags: Meta, Digital Services Act, child safety, tech regulation, EU enforcement