Brussels investigates Facebook and Instagram for failing to protect the physical and mental health of young people


Oliver Thansan
15 May 2024 Wednesday 16:23

The European Commission has launched an investigation into two of Meta's social networks, Facebook and Instagram, because, according to the European Commissioner for the Internal Market, Thierry Breton, it is not convinced "that Meta has done enough to comply with the obligations" of the Digital Services Act (DSA): mitigating the risks of negative effects on the physical and mental health of young Europeans on its platforms. The Commissioner announced the opening of this proceeding, which is provided for in the legislation, in a post on X, without offering further explanation.

Facebook and Instagram are being investigated under two articles of the law. The first is Article 35, on risk mitigation, under which very large platforms "shall establish reasonable, proportionate and effective mitigation measures, adapted to the specific systemic risks identified" in a previous article. Those risks are defined very broadly, ranging from the dissemination of illegal content onwards, but since the proceeding concerns minors, the Commission's investigation is expected to focus mainly on actual or foreseeable negative effects on children's rights.

The other article of the Digital Services Act that Breton cites as the basis of the investigation into Facebook and Instagram is Article 28, which concerns the online protection of minors. It requires social networks of this kind that are accessible to minors to establish measures that are "adequate and proportionate to guarantee a high level of privacy, security and protection of minors in its service." The text also prohibits platforms used by children from presenting "advertisements on their interface based on profiling" that uses the "personal data of the recipient of the service when they know with reasonable certainty that the recipient of the service is a minor."

The DSA, which came fully into force in February this year, provides for fines of up to 6% of annual worldwide turnover in the previous financial year for breach of a single obligation. It also foresees a maximum penalty of 1% of annual worldwide revenue for providing incorrect, incomplete or misleading information, as well as periodic penalty payments of up to 5% of average daily turnover.

This is not the first investigation the European Commission has opened into risks to children under the DSA. Last February, the EU executive announced formal proceedings to assess whether TikTok may have breached the law in the areas of protection of minors, advertising transparency, researchers' access to data, and the management of risks from addictive design and harmful content.