EU Finds Meta Violated Digital Rules On Child Safety

Post by: Sophia Matthew

The European Commission has preliminarily found that Meta Platforms may have violated European digital safety rules by failing to adequately protect children using Facebook and Instagram.

According to the Commission, Meta did not do enough to prevent children under the age of 13 from accessing its platforms, despite company policies that officially prohibit users below that age from creating accounts.

The investigation was conducted under the European Union’s Digital Services Act, commonly known as the DSA, which requires major online platforms to protect users, especially minors, from harmful or unsafe online experiences.

EU regulators said current safeguards used by Meta are weak because children can easily bypass age restrictions simply by entering false birth dates during account registration. Officials argued that the company lacks effective age verification systems capable of preventing underage access.

The Commission also criticized Meta’s reporting systems, saying they are unnecessarily difficult for users to navigate. Investigators found that users may need to click through several steps before reaching tools designed to report harmful or inappropriate content.

In addition, regulators claimed Meta’s internal risk assessment underestimated the scale of the problem. According to EU officials, available evidence suggests that between 10 and 12 per cent of children under 13 within the European Union may still be accessing Facebook and Instagram.

Henna Virkkunen, the EU’s technology chief, said online safety rules should not exist only on paper. She stated that companies must take real and effective action to protect children and younger users online.

Meta responded by saying it disagrees with the preliminary findings but acknowledged that age verification remains a challenge across the technology industry. The company also announced plans to introduce additional safety measures in the near future.

The investigation into Meta originally began in May 2024 as part of broader European efforts to increase oversight of major technology companies under the DSA framework. The law gives European regulators greater authority to investigate online platforms over issues including child safety, misinformation, harmful content, and user protection.

If the preliminary findings are confirmed after further review, Meta could face major financial penalties. Under EU law, companies found in serious violation of the DSA can be fined up to six per cent of their global annual revenue, potentially resulting in penalties worth billions of dollars.

The case highlights growing international pressure on major social media companies to improve protections for children online. Governments and regulators in several countries have recently increased scrutiny of how digital platforms handle underage users, harmful content, privacy, and mental health concerns.

The European Commission has also launched investigations into other social media platforms over child safety issues. Earlier this year, regulators opened a similar investigation into Snapchat regarding the protection of minors.

The EU’s actions reflect a wider global debate about the responsibilities of large technology companies toward younger users. Policymakers increasingly argue that platforms should be required to introduce stronger identity checks, safer content moderation systems, and easier reporting tools to reduce online risks for children.

The investigation into Meta is still ongoing, and the company will have the opportunity to formally respond to the allegations and propose corrective measures before any final decision is made by European regulators.

April 29, 2026, 4:13 p.m.
