Imagine that every day a person stood at your bedside writing down what time you get up, accompanied you to the bathroom to monitor how long you spend in it, followed you to the kitchen to note what you have for breakfast, and kept recording your movements when you go out into the street… Would that seem normal to you? “Well, this is what happens on the Internet,” says Enrique Dans, professor of innovation at IE University, by way of example. This lucrative business model brings enormous profits to social media companies, which do not leave the youngest members of the household on the sidelines.

Specifically, Facebook, Instagram, Snapchat and TikTok all profit from advertising aimed at minors, according to estimates in a study recently published by the Harvard T.H. Chan School of Public Health. In light of the findings, the study’s authors call for government regulation of social media to reduce potentially harmful advertising practices targeting children and adolescents.

In this regard, the researchers warn that the personalized algorithms these platforms use can push young users into excessive social media use. That concern has prompted some U.S. states to pass laws to curb this upward trend and to file a class-action lawsuit against Meta, owner of Facebook and Instagram, for collecting minors’ data without consent and for knowingly and deliberately designing features that get children hooked on its platforms.

According to Bryan Austin, a Harvard professor and lead author of the study, “the overwhelming financial incentives” of these companies are behind the delay in adopting “significant” protection measures. In fact, the American Academy of Pediatrics warned a few years ago that children are “especially vulnerable to the persuasive effects of advertising because of their immature critical thinking skills and impulse inhibition.”

Anabel K. Arias, digital rights expert at the Federation of Consumers and Users (CECU), points out that in the United States “there is no regulation as broad as the one the European Union (EU) enacted with the Digital Services Act, which has been mandatory for large platforms since last August.” The rule prohibits them from serving targeted advertising to minors. However, Arias maintains that “it is very difficult” to enforce when a minor registers on a social network by lying about their age. That is why consumer organizations are demanding new regulations to address this issue.

In any case, Arias admits that it is still too early to assess whether the platforms are complying with the new law and whether it is effective enough to protect minors and other users. Meanwhile, platforms are devising ways to sidestep the restrictions imposed by the new rules. Dans points to Meta’s decision to launch a paid version of Facebook and Instagram for users who do not want their data used to target them with personalized advertising. That said, the company will stop showing ads to minors even if they do not pay. “There is an attempt to protect the minor, but the standard user is left unprotected,” warns the innovation expert.