EU Pressures Tech Giants Over Child Online Safety, Considers Social Media Age Restrictions

The European Union (EU) has demanded that major digital platforms, including Snapchat and YouTube, explain how they are protecting children from harmful online content, as member states debate introducing bloc-wide restrictions on minors’ access to social media.

The EU has long maintained strict rules on what children can view online, but officials say more action is needed. Inspired by Australia’s ban on social media for users under 16, Brussels is now examining whether similar measures could be applied across the 27-member bloc. France and Spain are among the countries pushing for tighter age-based restrictions.

The EU’s primary tool for holding platforms accountable is the Digital Services Act (DSA), which requires companies to remove illegal content and safeguard users. As part of ongoing investigations under the DSA, the European Commission has requested information from Snapchat on its measures to prevent children under 13 from accessing the platform. Apple’s App Store and Google Play have also been asked to explain how they stop minors from downloading illegal or harmful apps, such as those containing gambling services or sexual content.

Specifically, the EU wants to know how Apple and Google are addressing the use of so-called “nudify apps,” which allow users to create non-consensual sexualised images, and how age ratings are enforced.

“Privacy, security and safety have to be ensured, and this is not always the case. That’s why the Commission is tightening enforcement of our rules,” said EU tech chief Henna Virkkunen ahead of a ministerial meeting in Denmark.

While a request for information does not imply any wrongdoing, it can lead to formal investigations and potential fines.

Brussels is also seeking details from Snapchat on how it prevents the sale of drugs and vapes through its platform, an issue raised by Danish Digital Minister Caroline Stage Olsen. YouTube has been asked to clarify how its recommendation system protects minors from harmful content.

These inquiries follow similar DSA investigations into Meta’s Facebook and Instagram, as well as TikTok, over concerns about their impact on children’s well-being and the addictive design of their platforms.

Separately, EU telecoms ministers are discussing ways to strengthen age verification systems and improve online safety for minors. They are expected to endorse a joint statement supporting European Commission President Ursula von der Leyen’s plan to study the creation of a bloc-wide digital age of majority.

Von der Leyen has previously expressed support for the initiative and announced the formation of an expert panel to explore possible next steps. Denmark, which currently holds the EU presidency, is leading the charge for stronger collective action. Prime Minister Mette Frederiksen recently revealed plans to ban social media for children under 15. France already requires parental consent for under-15 social media users.
