Answer to a written question – Tackling the promotion of extreme thinness on social media – E-001820/2025(ASW)

Source: European Parliament

The Commission takes the impact of social media on children very seriously, and is thus committed to swift enforcement of the Digital Services Act (DSA)[1].

In 2024, the Commission initiated proceedings against TikTok over concerns that it may have breached the DSA in areas relating to harmful effects on children, stemming notably from its recommender systems and addictive design features[2]. These proceedings are ongoing, and the Commission is carrying them out as a matter of priority.

As part of these proceedings, the Commission is closely monitoring the ‘SkinnyTok’ phenomenon. Should it find that TikTok does not comply with the DSA, the Commission can adopt a non-compliance decision and order TikTok to take the measures necessary to ensure compliance.

The Commission is currently working to finalise guidelines on the protection of minors online[3]. The draft guidelines recommend that platforms implement age assurance measures that reduce the risks of children being exposed to age-inappropriate content.

The Commission and Member States are also working towards an interim age verification solution, which is intended to be an easy-to-use and privacy-preserving age verification method that can determine whether a user is 18 or older.

This solution is expected to be released by the end of this year. It is intended to bridge the gap until the EU Digital Identity Wallet becomes available.

  • [1] https://eur-lex.europa.eu/eli/reg/2022/2065/oj/eng.
  • [2] https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926.
  • [3] https://digital-strategy.ec.europa.eu/en/library/commission-seeks-feedback-guidelines-protection-minors-online-under-digital-services-act.
Last updated: 10 July 2025